
Causalists and Evidentialists can agree about the right course of action in an (apparent) Newcomb problem, if the causal facts are not as they initially seem. If declining $1,000 causes the Predictor to have placed $1m in the opaque box, CDT agrees with EDT that one-boxing is rational. This creates a difficulty for Causalists. We explain the problem with reference to Dummett’s work on backward causation and Lewis’s on chance and crystal balls. We show that the possibility that the causal facts might be properly judged to be nonstandard in Newcomb problems leads to a dilemma for Causalism. One horn embraces a subjectivist understanding of causation, in a sense analogous to Lewis’s own subjectivist conception of objective chance. In this case the analogy with chance reveals a terminological choice point, such that either (i) CDT is completely reconciled with EDT, or (ii) EDT takes precedence in the cases in which the two theories give different recommendations. The other horn of the dilemma rejects subjectivism, but now the analogy with chance suggests that it is simply mysterious why causation so construed should constrain rational action.

Fisher / Neyman
This continues my previous post: “Can’t take the fiducial out of Fisher…” in recognition of Fisher’s birthday, February 17. I supply a few more intriguing articles you may find enlightening to read and/or reread on a Saturday night.
Move up 20 years to the famous 1955/56 exchange between Fisher and Neyman. …

Shagrir ([2001]) and Sprevak ([2010]) explore the apparent necessity of representation for the individuation of digits (and processors) in computational systems. I will first offer a response to Sprevak’s argument that does not mention Shagrir’s original formulation, which was more complex. I then extend my initial response to cover Shagrir’s argument, thus demonstrating that it is possible to individuate digits in nonrepresentational computing mechanisms. I also consider the implications that the nonrepresentational individuation of digits would have for the broader theory of computing mechanisms.

This paper gives a definition of self-reference on the basis of the dependence relation given by Leitgeb (2005), and the dependence digraph by Beringer & Schindler (2015). Unlike the usual discussion of the self-reference of paradoxes, which centers around Yablo’s paradox and its variants, I focus on paradoxes of finitary character, which are again given by use of Leitgeb’s dependence relation. They are called ‘locally finite paradoxes’, satisfying the condition that any sentence in these paradoxes can depend on only finitely many sentences. I prove that all locally finite paradoxes are self-referential in the sense that there is a directed cycle in their dependence digraphs. This paper also studies the ‘circularity dependence’ of paradoxes, which was introduced by Hsiung (2014). I prove that the locally finite paradoxes have circularity dependence in the sense that they are paradoxical only in a digraph containing a proper cycle. The proofs of the two results are based directly on König’s infinity lemma. In contrast, this paper also shows that Yablo’s paradox and its ∀∃-unwinding variant are non-self-referential, and that neither McGee’s paradox nor the ω-cycle liar has circularity dependence.
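The graph-theoretic criterion can be made concrete. As a hypothetical sketch (the toy digraphs below are my illustrations, not the paradoxes the paper studies), being self-referential in the above sense amounts to the dependence digraph containing a directed cycle:

```python
def has_directed_cycle(digraph):
    """Return True iff the digraph contains a directed cycle.

    digraph: dict mapping each sentence (node) to the list of
    sentences it depends on (its out-neighbours).
    Uses the standard three-colour depth-first search.
    """
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {v: WHITE for v in digraph}

    def visit(v):
        colour[v] = GREY                    # v is on the current path
        for w in digraph.get(v, []):
            c = colour.get(w, WHITE)
            if c == GREY:                   # back edge: a directed cycle
                return True
            if c == WHITE and visit(w):
                return True
        colour[v] = BLACK                   # fully explored
        return False

    return any(colour[v] == WHITE and visit(v) for v in list(digraph))

# Toy examples: a liar-like sentence depending on itself,
# versus a grounded chain whose dependence terminates.
liar = {"L": ["L"]}
grounded = {"A": ["B"], "B": ["C"], "C": []}
```

On these toy digraphs, `has_directed_cycle(liar)` holds while `has_directed_cycle(grounded)` does not; in a locally finite paradox every out-neighbour list is finite, which is what licenses the appeal to König's infinity lemma.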

When scientists seek further confirmation of their results, they often attempt to duplicate the results using diverse means. To the extent that they are successful in doing so, their results are said to be ‘robust’. This article investigates the logic of such ‘robustness analysis’ (RA). The most important and challenging question an account of RA can answer is what sense of evidential diversity is involved in RAs. I argue that prevailing formal explications of such diversity are unsatisfactory. I propose a unified, explanatory account of diversity in RAs. The resulting account is, I argue, truer to actual cases of RA in science; moreover, this account affords us a helpful new foothold on the logic undergirding RAs.

Ruetsche ([2011]) claims that an abstract C*-algebra of observables will not contain all of the physically significant observables for a quantum system with infinitely many degrees of freedom. This would signal that in addition to the abstract algebra, one must use Hilbert space representations for some purposes. I argue to the contrary that there is a way to recover all of the physically significant observables by purely algebraic methods.

For simplicity, most of the literature introduces the concept of definitional equivalence only for languages with disjoint signatures. In a recent paper, Barrett and Halvorson introduce a straightforward generalization to languages with nondisjoint signatures, and they show that their generalization is not equivalent to intertranslatability in general. In this paper, we show that their generalization is not transitive and hence is not an equivalence relation. Then we introduce the Andréka–Németi generalization as one of the many equivalent formulations for languages with disjoint signatures. We show that the Andréka–Németi generalization is the smallest equivalence relation containing the Barrett–Halvorson generalization, and that it is equivalent to intertranslatability even for languages with nondisjoint signatures. Finally, we investigate which definitions of definitional equivalence remain equivalent when we generalize them to theories with nondisjoint signatures.

Symmetry plays a number of central roles in modern physics. As the physicist Philip Anderson famously remarked, “it is only slightly overstating the case to say that physics is the study of symmetry” (1972, p. 394). Here I discuss just one role of symmetry: its use as a guide to superfluous structure, with a particular eye on its application to metaphysics. What is symmetry? Generally speaking, a symmetry is an operation that leaves its object unchanged in a certain respect. Rotation by 90 degrees is a symmetry of a square piece of paper, insofar as the paper’s extension through space is the same after the rotation as before. But we will focus on symmetries of physical theories, not paper. Roughly speaking, these are operations on possible physical systems that leave some aspect of the theory unchanged. Which aspect? That depends: different symmetries preserve different aspects. But an important class of symmetries comprises those that leave the dynamical laws of the theory unchanged; these are known as dynamical symmetries.
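The square example can be checked in a few lines. This is my illustrative sketch, not the author's: rotation by 90 degrees maps the vertex set of a square onto itself, but fails to do so for a non-square rectangle, so it is a symmetry of the former and not the latter.

```python
def rot90(point):
    """Rotate a point 90 degrees counterclockwise about the origin."""
    x, y = point
    return (-y, x)

# Vertex sets standing in for the square piece of paper and a
# non-square rectangle, both centred on the origin.
square = {(1, 1), (-1, 1), (-1, -1), (1, -1)}
rectangle = {(2, 1), (-2, 1), (-2, -1), (2, -1)}

# An operation is a symmetry (in the relevant respect) when the
# object's extension through space is the same after as before.
square_preserved = {rot90(p) for p in square} == square
rectangle_preserved = {rot90(p) for p in rectangle} == rectangle
```

Here `square_preserved` holds and `rectangle_preserved` fails: the rotated rectangle occupies a different region of space, so rotation by 90 degrees is not among its symmetries.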

R.A. Fisher: February 17, 1890 – July 29, 1962
Continuing with posts in recognition of R.A. Fisher’s birthday, I post one from a couple of years ago on a topic that had previously not been discussed on this blog: Fisher’s fiducial probability. …

We all rely on this basic assumption when we try to interpret each other’s actions. For instance, if Suzie frequently asks for chocolate ice cream, we infer that she likes chocolate ice cream. Why? Because we assume she is choosing the correct way of getting what she wants. It could be that she hates chocolate ice cream, but she’s irrational, so she acts to get the things she hates; but we assume this isn’t the case. Notice that without this sort of assumption, we would have no way of identifying each other’s desires and beliefs.

Ontological arguments like those of Gödel (1995) and Pruss (2009; 2012) rely on premises that initially seem plausible, but on closer scrutiny are not. The premises have modal import that is required for the arguments but is not immediately grasped on inspection, and which ultimately undermines the simpler logical intuitions that make the premises seem plausible. Furthermore, the notion of necessity that they involve goes unspecified, and yet must go beyond standard varieties of logical necessity. This leaves us little reason to believe the premises, while their implausible existential import gives us good reason not to.

In March, I’ll be talking at Spencer Breiner’s workshop on Applied Category Theory at the National Institute of Standards and Technology. I’ll be giving a joint talk with John Foley about our work using operads to design networks. …

It is a striking fact from reverse mathematics that almost all theorems of countable and countably representable mathematics are equivalent to just one of five subsystems of second-order arithmetic. The standard view is that the significance of these equivalences lies in the set existence principles that are necessary and sufficient to prove those theorems. In this article I analyse the role of set existence principles in reverse mathematics, and argue that they are best understood as closure conditions on the powerset of the natural numbers.

This article follows on the introductory article “Direct Logic for Intelligent Applications” [Hewitt 2017a]. Strong Types enable new mathematical theorems to be proved, including the Formal Consistency of Mathematics. Also, Strong Types are extremely important in Direct Logic because they block all known paradoxes [Cantini and Bruni 2017]. Blocking known paradoxes makes Direct Logic safer for use in Intelligent Applications by preventing security holes.

As part of the week of recognizing R.A. Fisher (February 17, 1890 – July 29, 1962), I reblog a guest post by Stephen Senn from 2012/2017. The comments from 2017 lead to a troubling issue that I will bring up in the comments today. …

There is a vast literature that seeks to uncover features underlying moral judgment by eliciting reactions to hypothetical scenarios such as trolley problems. These thought experiments assume that participants accept the outcomes stipulated in the scenarios. Across seven studies (N = 968), we demonstrate that intuition overrides stipulated outcomes even when participants are explicitly told that an action will result in a particular outcome. Participants instead substitute their own estimates of the probability of outcomes for stipulated outcomes, and these probability estimates in turn influence moral judgments. Our findings demonstrate that intuitive likelihoods are one critical factor in moral judgment, one that is not suspended even in moral dilemmas that explicitly stipulate outcomes. Features thought to underlie moral reasoning, such as intention, may operate, in part, by affecting the intuitive likelihood of outcomes, and, problematically, moral differences between scenarios may be confounded with nonmoral intuitive probabilities.

There are two standard responses to the discrepancy between observed galactic rotation curves and the theoretical curves calculated on the basis of luminous matter: postulate dark matter, or modify gravity. Most physicists accept the former as part of the concordance model of cosmology; the latter encompasses a family of proposals, of which MOND is perhaps the best-known example. Don Saari, however, claims to have found a third alternative: to explain this discrepancy as a result of approximation methods which are unfaithful to the underlying Newtonian dynamics. If he is correct, eliminating the problematic approximations should allow physicists and astronomers to preserve the validity of Newtonian dynamics in galactic systems without invoking dark matter.

We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer’s self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world’s amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.

Weak supplementation says that if x is a proper part of y, then y has a proper part that doesn’t overlap x. Suppose that we are impressed by standard counterexamples to weak supplementation like the following. …

I argue for patternism, a new answer to the question of when some objects compose a whole. None of the standard principles of composition comfortably capture our natural judgments, such as that my cat exists and my table exists, but there is nothing wholly composed of them. Patternism holds, very roughly, that some things compose a whole whenever together they form a “real pattern”. Plausibly we are inclined to acknowledge the existence of my cat and my table but not of their fusion, because the first two have a kind of internal organizational coherence that their putative fusion lacks. Kolmogorov complexity theory supplies the needed rigorous sense of “internal organizational coherence”.
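Kolmogorov complexity itself is uncomputable, but compressed length is a standard computable upper-bound proxy for it. The sketch below is my illustration, not the paper's formalism: a highly regular byte string (a stand-in for a "real pattern") compresses to a small fraction of its length, while an effectively incompressible string of the same length does not.

```python
import hashlib
import zlib

def compressed_size(data: bytes) -> int:
    """zlib-compressed length: a computable stand-in for the
    (uncomputable) Kolmogorov complexity of `data`."""
    return len(zlib.compress(data, 9))

# A "real pattern": 1024 bytes with strong internal regularity.
patterned = b"ab" * 512

# 1024 bytes of deterministic but effectively incompressible data,
# built by concatenating SHA-256 digests.
noise = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(32))

# Compression ratios: near 0 for the pattern, near 1 for the noise.
ratio_patterned = compressed_size(patterned) / len(patterned)
ratio_noise = compressed_size(noise) / len(noise)
```

The idea, on this proxy, is that candidate wholes exhibiting "internal organizational coherence" sit toward the compressible end of the scale, while arbitrary juxtapositions do not.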

Comparativism is the position that the fundamental doxastic state consists in comparative beliefs (e.g., believing p to be more likely than q), with partial beliefs (e.g., believing p to degree x) being grounded in and explained by patterns amongst comparative beliefs that exist under special conditions. In this paper, I develop a version of comparativism that originates with a suggestion made by Frank Ramsey in his ‘Probability and Partial Belief’ (1929). By means of a representation theorem, I show how this ‘Ramseyan comparativism’ can be used to weaken the (unrealistically strong) conditions required for probabilistic coherence that comparativists usually rely on, while still preserving enough structure to let us retain the usual comparativists’ account of quantitative doxastic comparisons.

A number of naturalistic philosophers of mind endorse a realist attitude towards the results of Bayesian cognitive science. This realist attitude is currently unwarranted, however. It is not obvious that Bayesian models possess special epistemic virtues over alternative models of mental phenomena involving uncertainty. In particular, the Bayesian approach in cognitive science is not more simple, unifying, and rational than alternative approaches; nor is it obvious that the Bayesian approach is more empirically adequate than alternatives. It is at least premature, then, to assert that mental phenomena involving uncertainty are best explained within the Bayesian approach. Continuing with exclusive praise for Bayes would be dangerous, as it risks monopolizing the center of attention and leading to the neglect of different but promising formal approaches. Naturalistic philosophers of mind would be wise instead to correct their mistake by endorsing an agnostic, instrumentalist attitude towards Bayesian cognitive science.

People often talk about the synchronic Dutch Book argument for Probabilism and the diachronic Dutch Strategy argument for Conditionalization. But the synchronic Dutch Book argument for the Principal Principle is mentioned less. …

Now students in the Applied Category Theory 2018 school are reading about categories applied to linguistics. Read the blog article here for more:
• Jade Master and Cory Griffith, Linguistics using category theory, The n-Category Café, 6 February 2018. …

This contribution is devoted to addressing the question as to whether the methodology followed in building/assessing string theory can be considered scientific – in the same sense, say, that the methodology followed in building/assessing the Standard Model of particle physics is scientific – by focussing on the “founding” period of the theory. More precisely, its aim is to argue for a positive answer to the above question in the light of a historical analysis of the early developments of the string theoretical framework. The paper’s main claim is a simple one: there is no real change of scientific status in the way of proceeding and reasoning in fundamental physical research. Looking at the developments of quantum field theory and string theory since their very beginning, one sees the very same strategies at work both in theory building and theory assessment. Indeed, as the history of string theory clearly shows (see Cappelli et al., 2012), the methodology characterising the theoretical process leading to the string idea and its successive developments is not significantly different from the one characterising many fundamental developments in theoretical physics which have been crowned with successful empirical confirmation afterwards (sometimes after a considerable number of years, as exemplified by the story of the Higgs particle).

Three arguments against universally regular probabilities have been posed based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Having failed to refute the symmetry arguments on such technical grounds, one could deny their implicit premises, which is a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity, but that would not serve the project of accurately modelling chances.

The central question of my paper is whether there is a coherent logical theory in which truth is construed in epistemic terms and in which also some version of the law of excluded middle is defended. Brentano in his later writings has such a theory. My first question is whether his theory is consistent. I also make a comparison between Brentano’s view and that of an intuitionist at the present day, namely Per Martin-Löf. Such a comparison might provide some insight into what is essential to a theory that understands truth in epistemic terms.

The Kochen–Specker theorem is an important and subtle topic in the foundations of quantum mechanics (QM). The theorem demonstrates the impossibility of a certain type of interpretation of QM in terms of hidden variables (HV) that naturally suggests itself when one begins to consider the project of interpreting QM. We here present the theorem/argument and the foundational discussion surrounding it at different levels. The reader looking for a quick overview should read the following sections and subsections: 1, 2, 3.1, 3.2, 4, and 6. Those who read the whole entry will find proofs of some non-trivial claims in supplementary documents.

Newton’s success sharpened our understanding of the nature of space and time in the seventeenth century. Einstein’s special and general relativity improved this understanding in the twentieth century. Quantum gravity is expected to take a step further, deepening our understanding of space and time by grasping the implications that the quantum nature of the physical world has for space and time. The best way to see what happens to space and time when their quantum traits cannot be disregarded is to look at how this actually happens in a concrete theory of quantum gravity. Loop Quantum Gravity (LQG) [1–7] is among the few current theories sufficiently developed to provide a complete and clear-cut answer to this question. Here I discuss the role(s) that space and time play in LQG and the version of these notions required to make sense of a quantum gravitational world. For a detailed discussion, see the first part of the book [ ]. A brief summary of the structure of LQG is given in the Appendix, for the reader unfamiliar with this theory.

There is an ambiguity in the fundamental concept of deductive logic that went unnoticed until the middle of the 20th Century. Sorting it out has led to profound mathematical investigations with applications in complexity theory and computer science. The origins of this ambiguity and the history of its resolution deserve philosophical attention, because our understanding of logic stands to benefit from an appreciation of their details.