
115298.287664
Yesterday Ryan Mandelbaum, at Gizmodo, posted a decidedly tongue-in-cheek piece about whether or not the universe is a computer simulation. (The piece was filed under the category “LOL.”)
The immediate impetus for Mandelbaum’s piece was a blog post by Sabine Hossenfelder, a physicist who will likely be familiar to regulars here in the nerdosphere. …

216328.287735
Commonsense morality includes various agent-centred constraints, including ones against killing unnecessarily and breaking a promise. However, it’s not always clear whether, had an agent φ-ed, she would have violated a constraint. And sometimes the reason for this is not that we lack knowledge of the relevant facts, but that there is no fact about whether her φ-ing would have constituted a constraint-violation. What, then, is a constraint-accepting theory (that is, a theory that includes such constraints) to say about whether it would have been permissible for her to have φ-ed? In this paper, I canvass various possible approaches to answering this question and argue that teleology offers the most plausible approach—teleology being the view that every act has its deontic status in virtue of how its outcome (or prospect) ranks relative to those of its alternatives. So although, until recently, it had been thought that only deontological theories can accommodate constraints, it turns out that teleological theories not only can accommodate constraints, but can do so more plausibly than deontological theories can.

345767.287757
Non-relativistic quantum mechanics is grounded on ‘classical’ (Newtonian) space and time (NST). The mathematical description of these concepts entails that any two spatially separated objects are necessarily different, which implies that they are discernible (in classical logic, identity is defined by means of indiscernibility) — we say that the space is T2, or "Hausdorff". But quantum systems, in the most interesting cases, sometimes need to be taken as indiscernible, so that there is no way to tell which system is which, and this holds even in the case of fermions. In the NST setting, however, it seems that we can always give an identity to them, which seems contrary to the physical situation. In this paper we discuss this topic through a case study (that of two infinite potential wells) and conclude that, once the quantum case is taken into account, that is, when physics enters the discussion, even NST cannot be used to say that the systems do have identity. Keywords: identity of quantum particles, spatial identity, space and time in quantum mechanics.

396822.287773
In philosophy of statistics, Deborah Mayo and Aris Spanos have championed the following epistemic principle, which applies to frequentist tests: Severity Principle (full). Data x (produced by process G) provides good evidence for hypothesis H (just) to the extent that test T severely passes H with x. (Mayo and Spanos 2011, p. 162). They have also devised a severity score that is meant to measure the strength of the evidence by quantifying the degree of severity with which H passes the test T (Mayo and Spanos 2006, 2011; Spanos 2013). That score is a real number defined on the interval [0,1]. In this paper, I put forward a paradoxical feature of the severity score as a measure of evidence. To do this, I create a scenario where a frequentist statistician S is interested in finding out if there is a difference between the means of two normally distributed random variables. The null hypothesis (H0) states that there is no difference between the two means.

518823.287788
Last week, I wrote about a problem that arises if you wish to aggregate the credal judgments of a group of agents when one or more of those agents has incoherent credences. I focussed on the case of two agents, Adila and Benoit, who have credence functions $c_A$ and $c_B$, respectively. …
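The difficulty can be seen with a toy weighted linear pool (the agents' names are from the post; the numbers and the choice of linear pooling are my own illustration):

```python
def linear_pool(credence_fns, weights):
    """Weighted linear pooling: the group's credence in each proposition
    is the weighted average of the individuals' credences."""
    return {prop: sum(w * c[prop] for w, c in zip(weights, credence_fns))
            for prop in credence_fns[0]}

# Benoit is coherent; Adila's credences in X and its negation sum to 0.9.
c_A = {"X": 0.6, "not-X": 0.3}
c_B = {"X": 0.7, "not-X": 0.3}
pooled = linear_pool([c_A, c_B], [0.5, 0.5])

# The pooled credences sum to 0.95: linear pooling passes Adila's
# incoherence straight through to the aggregate judgment.
print(pooled, pooled["X"] + pooled["not-X"])
```

Whatever aggregation rule one favours, something must be said about how (or whether) to repair such inherited incoherence.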

522105.287805
Today’s Virtual Colloquium is “Global and Local Atheisms” by Jeanine Diller. Dr. Diller received her PhD from the University of Michigan and is currently an assistant professor in the Department of Philosophy and Program on Religious Studies of the University of Toledo in Ohio. …

569658.287826
Neither Karl Popper, Frank Knight, nor Max Weber is cited or mentioned in Friedman’s famous 1953 essay “On the methodology of positive economics” (F53). However, they play a crucial role in F53. Making their contribution explicit suggests that F53 has been seriously misread in the past. I will first show that there are several irritating statements in F53 that are, taken together, not compatible with any of the usual readings of F53. Second, I show that an alternative reading of F53 can be achieved if one takes seriously Friedman’s reference to ideal types; “ideal type” is a technical term introduced by Max Weber. Friedman was familiar with Max Weber’s work through Frank Knight, who was his teacher in Chicago. Given that in F53’s view ideal types are fundamental building blocks of economic theory, it becomes clear why both instrumentalist and realist readings of F53 are inadequate. Third, the reading of F53 in terms of ideal types gives the role of elements from Popper’s falsificationist methodology in F53 a somewhat different twist. Finally, I show that the irritating passages of F53 make good sense under the new reading, including the infamous “the more significant the theory, the more unrealistic the assumptions”.

569777.287841
I propose a new definition of identification in the limit (also called convergence to the truth), as a new success criterion that is meant to complement, rather than replace, the classic definition due to Gold (1967). The new definition is designed to explain how it is possible to have successful learning in a kind of scenario that Gold’s classic account ignores—the kind of scenario in which the entire infinite data stream to be presented incrementally to the learner is not presupposed to completely determine the correct learning target. From a purely mathematical point of view, the new definition employs a convergence concept that generalizes net convergence and sits in between pointwise convergence and uniform convergence. Two results are proved to suggest that the new definition provides a success criterion that is by no means weak: (i) between the new identification in the limit and Gold’s classic one, neither implies the other; (ii) if a learning method identifies the correct target in the limit in the new sense, any U-shaped learning involved therein has to be redundant and can be removed while maintaining the new kind of identification in the limit. I conclude that we should have (at least) two success criteria that correspond to two senses of identification in the limit: the classic one and the one proposed here. They are complementary: meeting either of the two is good; meeting both at the same time, if possible, is even better.
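For readers who have not met Gold's criterion, here is a toy sketch (my own example, illustrating the classic definition rather than the new one proposed in the paper): a learner identifies the target in the limit when its conjectures stabilize on the correct answer after finitely many data.

```python
def learn_max(stream):
    """Toy Gold-style learner.  Suppose the target to be identified is
    the largest value that ever appears in the data stream, and the
    learner conjectures the maximum seen so far after each datum.
    On any stream that eventually presents the target value, the
    conjectures change only finitely often and then stabilize on the
    target: identification in the limit in Gold's classic sense."""
    conjecture = None
    conjectures = []
    for datum in stream:
        conjecture = datum if conjecture is None else max(conjecture, datum)
        conjectures.append(conjecture)
    return conjectures

# A finite prefix of an infinite stream whose maximum is 7:
print(learn_max([3, 1, 7, 2, 7, 5]))  # conjectures stabilize on 7
```

Gold's criterion presupposes that the full data stream determines the target; the paper's new criterion is precisely for scenarios where that presupposition fails.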

627378.287857
In this work we present a dynamical approach to quantum logics. By changing the standard formalism of quantum mechanics to allow non-Hermitian operators as generators of time evolution, we address the question of how logics can evolve in time. In this way, we describe formally how a non-Boolean algebra may become a Boolean one under certain conditions. We present some simple models which illustrate this transition and develop a new quantum-logical formalism based on complex spectral resolutions, a notion that we introduce in order to cope with the temporal aspect of the logical structure of quantum theory.

627394.287871
We discuss generalized probabilistic models in which states need not obey Kolmogorov’s axioms of probability. We study the relationship between properties and probabilistic measures in this setting, and explore some possible interpretations of these measures.

627410.287886
Although the philosophy of chemistry has greatly extended its thematic scope over the last decades, the problem of the relationship between chemistry and physics still attracts great interest in the area. In particular, the main difficulties appear in the attempt to link the chemical description of atoms and molecules with the description supplied by quantum mechanics.

627426.287903
Pearl and Woodward are both well-known advocates of interventionist causation. What is less well known is the interesting relationship between their respective accounts. In this paper we discuss the different perspectives on causation that these two accounts present and show that they are two sides of the same coin. Pearl’s focus is on leveraging global network constraints to correctly identify local causal relations. The rules by which global causal structures are composed from distinct causal relations are precisely defined by the global constraints. Woodward’s focus, however, is on the use of local manipulation to identify single causal relations that then compose into global causal structures. The rules by which this composition takes place emerge as a result of local interventionist constraints (or so the claim goes). We contend that the complete picture of causality to be found between these two perspectives from the interventionist tradition must recognise both the global constraints of the sort identified by Pearl and the local constraints of the sort identified by Woodward, and the interplay between them: Pearl requires the possibility of local interventions and Woodward requires a global statistical framework within which to build composite causal structures.

677792.287919
Beauty is perfectly rational and on Sunday knows with certainty that the following events will transpire: After falling into a dreamless sleep Sunday night, she will awaken Monday morning. Later that day she will be told that it is Monday. That evening she will once again fall into a dreamless sleep, and then a fair coin will be tossed. If it lands heads, Beauty will remain asleep until Wednesday. If the coin lands tails, her memory of Monday will be erased prior to her awakening again on Tuesday with experiences subjectively indiscernible from the experiences she had on Monday.
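The protocol is easy to simulate (a sketch of my own). Note that the simulation computes only the long-run frequency of heads among awakenings, a number that both halfers and thirders accept; it does not by itself settle what Beauty's credence on awakening should be:

```python
import random

def heads_frequency_among_awakenings(trials=200_000, seed=0):
    """Run the Sleeping Beauty protocol many times and return the
    fraction of awakenings at which the coin landed heads."""
    rng = random.Random(seed)
    heads_awakenings = 0
    total_awakenings = 0
    for _ in range(trials):
        heads = rng.random() < 0.5
        if heads:
            total_awakenings += 1      # awakened on Monday only
            heads_awakenings += 1
        else:
            total_awakenings += 2      # awakened Monday and Tuesday
    return heads_awakenings / total_awakenings

print(heads_frequency_among_awakenings())  # close to 1/3
```

Half the runs are heads-runs, but tails-runs contribute two awakenings each, so heads-awakenings make up roughly a third of all awakenings; the philosophical dispute is over whether that frequency fixes Beauty's credence.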

834922.287942
There was a period in the 1970s when the admissions data for the UC–Berkeley graduate school (hereafter, BGS) exhibited some (prima facie) peculiar statistical correlations. Specifically, a strong negative correlation was observed between being female and being accepted into BGS. This negative correlation (in the overall population of BGS applicants) was (initially) a cause for some concern regarding the possibility of gender bias in the admissions process at BGS. However, closer scrutiny of the BGS admissions data from this period revealed that no individual department’s admissions data exhibited a negative correlation between being female and being admitted. In fact, every department reported a positive correlation between being female and being accepted. In other words, a correlation that appears at the level of the general population of BGS applicants is reversed in every single department of BGS. This sort of correlation reversal is known as Simpson’s Paradox. Because admissions decisions at BGS are made (autonomously) by each individual department, the lack of departmental correlations seems to rule out the gender bias hypothesis as the best (causal) explanation of the observed correlations in the data. As it happens, there was a strong positive correlation between being female and applying to a department with a (relatively) high rejection rate.
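The reversal is easy to reproduce with toy numbers (invented for illustration; these are not the actual Berkeley admissions figures):

```python
def rate(applied, admitted):
    """Admission rate for a (applied, admitted) pair."""
    return admitted / applied

# (applied, admitted) per group.  Women do better in *each* department...
depts = {
    "A": {"women": (100, 80), "men": (500, 350)},   # 0.80 vs 0.70
    "B": {"women": (400, 80), "men": (100, 10)},    # 0.20 vs 0.10
}
for d in depts.values():
    assert rate(*d["women"]) > rate(*d["men"])

# ...yet the pooled admission rate for women is lower, because most
# women applied to the high-rejection department B: Simpson's reversal.
def pooled(group):
    applied = sum(depts[k][group][0] for k in depts)
    admitted = sum(depts[k][group][1] for k in depts)
    return admitted / applied

print(pooled("women"), pooled("men"))  # 0.32 vs 0.60
```

The last comment is the whole paradox: the departmental and aggregate correlations point in opposite directions, and only the causal structure of the admissions process tells us which level is explanatorily relevant.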

916355.287958
I argue that our judgements regarding the locally causal models which are compatible with a given quantum no-go theorem implicitly depend, in part, on the context of inquiry. It follows from this that certain no-go theorems, which are particularly striking in the traditional foundational context, have no force when the context switches to a discussion of the physical systems we are capable of building with the aim of classically reproducing quantum statistics. I close with a general discussion of the possible implications of this for our understanding of the limits of classical description, and for our understanding of the fundamental aim of physical investigation.

1075430.287973
It was, I think, till recently broadly assumed among working analytic metaphysicians that metaphysics, or at least that branch of it called ontology, is concerned with issues of existence, and that one’s metaphysical position is more or less exhausted by one’s position on the question of what entities there are, or what entities exist. This likely stemmed from Quine’s well-known paper “On What There Is”, according to which what there is, by a theory’s lights, is determined by what things its quantifiers range over: “To be is to be the value of a variable,” as he succinctly put it (Quine 1948, 15). Of course, Quine’s views were never universal, but at least most metaphysicians grant Russell’s well-known arguments that one can resist positing Meinongian unreal objects by accepting his theory of descriptions. However, it would be a mistake to read Russell as nothing more than a proto-Quinean. This will no doubt already be conceded for the period of Russell’s career in which he thought there were notions of “existence” not explicable by means of the existential quantifier, or embraced a distinction between existence and mere being or subsistence (e.g., PoM §427, Papers 4, 486–89, PP 100). However, in what follows I want to argue that this is true even for mature Russell, during the period (starting roughly 1913) in which he officially held the position that all existence claims are to be understood quantificationally. In particular, while mature Russell understood “Fs exist” as expressing ⌜(∃v)Fv⌝, he would not have taken the truth of this claim necessarily to settle the metaphysical or ontological status of Fs. Russell had, running alongside his account of existence, a conception of belonging to what is, as he variously put it, “ultimate,” “fundamental”, the “bricks of the universe”, the “furniture of the world”, something “really there”.

1098549.287988
Let's suppose that Adila and Benoit are both experts, and suppose that we are interested in gleaning from their opinions about a certain proposition $X$ and its negation $\overline{X}$ a judgment of our own about $X$ and $\overline{X}$. …
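Two standard aggregation rules for this two-expert, two-proposition setup can be sketched as follows (the numbers and the choice of rules are my own illustration; the post's own proposal may differ):

```python
def linear_pool(creds, weights):
    """Group credence in a single proposition: weighted average."""
    return sum(w * p for w, p in zip(weights, creds))

def geometric_pool(creds_X, creds_notX, weights):
    """Weighted geometric pooling over X and its negation,
    renormalized so the two pooled credences sum to 1."""
    gx, gn = 1.0, 1.0
    for w, px, pn in zip(weights, creds_X, creds_notX):
        gx *= px ** w
        gn *= pn ** w
    total = gx + gn
    return gx / total, gn / total

# Two coherent experts with equal weight:
cX = [0.9, 0.6]   # credences in X
cN = [0.1, 0.4]   # credences in the negation of X
lin = linear_pool(cX, [0.5, 0.5])
geo_X, geo_N = geometric_pool(cX, cN, [0.5, 0.5])

# The two rules generally disagree: here lin = 0.75 while geo_X ~ 0.786.
print(lin, geo_X, geo_N)
```

Which rule (if either) yields the judgment we should adopt is exactly the kind of question the post is after.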

1254978.288003
According to the orthodox treatment of risk preferences in decision theory, they are to be explained in terms of the agent’s desires about concrete outcomes. The orthodoxy has been criticised both for conflating two types of attitudes and for committing agents to attitudes that do not seem rationally required. To avoid these problems, it has been suggested that an agent’s attitudes to risk should be captured by a risk function that is independent of her utility and probability functions. The main problem with that approach is that it suggests that attitudes to risk are wholly distinct from people’s (non-instrumental) desires. To overcome this problem, we develop a framework where an agent’s utility function is defined over chance propositions (i.e., propositions describing objective probability distributions) as well as ordinary (non-chance) ones, and argue that one should explain different risk attitudes in terms of different forms of the utility function over such propositions.

1312600.288018
In this paper we investigate some questions concerning the expressive power of a first-order modal language with two-dimensional operators; in particular, a language endowed with a two-dimensional semantics intended to provide a logical analysis of discourse involving a priori knowledge. We consider a natural-language sentence that cannot be formally represented in such a language. This was first conjectured in Lampert (manuscript); here we present a proof, given in the appendix. It turns out, however, that the most natural ways to repair this expressive inadequacy render moot the original philosophical motivation of formalizing a priori knowability as necessity along the diagonal.

1321117.288032
I’m surprised it’s a year already since posting my published comments on the ASA Document on P-values. Since then, there have been a slew of papers rehearsing the well-worn fallacies of tests (a tad bit more than the usual rate). …

1376546.288048
Traditional monotheism has long faced logical puzzles (omniscience, omnipotence, and more) [10, 11, 13, 14]. We present a simple but plausible ‘gappy’ framework for addressing these puzzles. By way of illustration we focus on God’s alleged stone problem. What we say about the stone problem generalizes to other familiar ‘paradoxes of omni properties’, though we leave the generalization implicit. We assume familiarity with the proposed (subclassical) logic but an appendix is offered as a brief review.

1376563.288063
In a short unpublished note, Gödel once remarked: [A]t least intuitively, if you divide a geometrical line at a point, you would expect that the two halves of the line would be mirror images of each other. Yet, this is not the case if the geometrical line is isomorphic to the real numbers. ([18, p. 3]) Because a division is exhaustive the ‘center’ point must fall either in the left half or in the right. And because a division is exclusive this point cannot be in both halves, leaving one half open in that it does not contain its boundary, and the other side closed in that it contains its boundary. How strange. Which side is the lucky one? Which side gets to have its own boundary as a part? Any attempt to answer would surely be arbitrary. That is, we feel an intuitive pull toward a certain kind of symmetry: if there is no principled difference between two objects, then there is no principled difference between their boundaries, either. When one object is open and another is closed, there should be some reason as to why. What is strange is not merely that it is possible to divide a line in such an asymmetric way, but rather that it is impossible to do so symmetrically.

1377742.288078
The original “modal interpretation” of non-relativistic quantum theory was born in the early 1970s, and at that time the phrase referred to a single interpretation. The phrase now encompasses a class of interpretations, and is better taken to refer to a general approach to the interpretation of quantum theory. We shall describe the history of modal interpretations, how the phrase has come to be used in this way, and the general program of (at least some of) those who advocate this approach.

1550175.288093
Grounding contingentism is the doctrine according to which grounds are not guaranteed to necessitate what they ground. In this paper I will argue that the most plausible version of contingentism (which I will label ‘serious contingentism’) is incompatible with the idea that the grounding relation is transitive, unless either ‘priority monism’ or ‘contrastivism’ is assumed.

1551290.288107
Is parthood a perfectly natural, or fundamental, relation? Philosophers have been hesitant to take a stand on this issue. One reason for this hesitancy is the worry that, if parthood is perfectly natural, then the perfectly natural properties and relations are not suitably “independent” of one another. (Roughly, the perfectly natural properties are not suitably independent if there are necessary connections among them.) In this paper, I argue that parthood is a perfectly natural relation. In so doing, I argue that this “independence” worry is unfounded. I conclude by noting some consequences of the naturalness of parthood.

1557509.288121
We prove that under some technical assumptions on a general, non-classical probability space, the probability space is extendible into a larger probability space that is common cause closed in the sense of containing a common cause of every correlation between elements in the space. It is argued that the philosophical significance of this common cause completability result is that it allows the defence of the Common Cause Principle against certain attempts at falsification. Some open problems concerning possible strengthenings of the common cause completability result are formulated.

1564165.288136
Let me tell you about the game Buckets of fish. This is a two-player game played with finitely many buckets in a line on the beach, each containing a finite number of fish. There is also a large supply of additional fish available nearby, fresh off the boats. …

1781835.288151
Population axiologists hope to shed light on central questions in population ethics (How many people should we want there to be? How well off should we want them to be? What if these things are in tension?) by ranking populations that differ with respect to the number of people they contain, and with respect to how well off those people are. But the enterprise of population axiology has, for thirty-five years, been overshadowed by certain paradoxes – collections of propositions that are individually truthy (each looks true, at first glance), but jointly inconsistent (they cannot all be true). Here is one of the simplest:

1876817.288166
We offer a defense of one aspect of Paul Horwich’s response to the Liar paradox—more specifically, of his move to preserve classical logic. Horwich’s response requires that the full intersubstitutivity of ‘ ‘A’ is true’ and A be abandoned. It is thus open to the objection, due to Hartry Field, that it undermines the generalization function of truth. We defend Horwich’s move by isolating the grade of intersubstitutivity required by the generalization function and by providing a new reading of the biconditionals of the form “ ‘A’ is true iff A.”

1897349.288184
In a previous post, I floated the possibility that we might use recent work in decision theory by Orri Stefánsson and Richard Bradley to solve the so-called Swamping Problem for veritism. In this post, I'll show that, in fact, this putative solution can't work. According to the Swamping Problem, I value beliefs that are both justified and true more than I value beliefs that are true but unjustified; and, we might suppose, I value beliefs that are justified but false more than I value beliefs that are both unjustified and false. …