
Quantum entanglement is a physical resource, like energy, associated
with the peculiar nonclassical correlations that are possible between
separated quantum systems. Entanglement can be measured, transformed,
and purified. A pair of quantum systems in an entangled state can be
used as a quantum information channel to perform computational and
cryptographic tasks that are impossible for classical systems. The
general study of the information-processing capabilities of quantum
systems is the subject of quantum information theory.

R.A. Fisher: February 17, 1890 – July 29, 1962
Continuing with posts in recognition of R.A. Fisher’s birthday, I post one from a few years ago on a topic that had previously not been discussed on this blog: Fisher’s fiducial probability. …

I develop an account of naturalness (that is, approximately: lack of extreme fine-tuning) in physics which demonstrates that naturalness assumptions are not restricted to narrow cases in high-energy physics but are a ubiquitous part of how interlevel relations are derived in physics. After exploring how and to what extent we might justify such assumptions on methodological grounds or through appeal to speculative future physics, I consider the apparent failure of naturalness in cosmology and in the Standard Model. I argue that any such naturalness failure threatens to undermine the entire structure of our understanding of intertheoretic reduction, and so risks a much larger crisis in physics than is sometimes suggested; I briefly review some currently popular strategies that might avoid that crisis.

The thermal time hypothesis (TTH) is a proposed solution to the problem of time: a coarse-grained, statistical state determines a thermal dynamics according to which it is in equilibrium, and this dynamics is identified as the flow of physical time in generally covariant quantum theories. This paper raises a series of objections to the TTH as developed by Connes and Rovelli (1994). Two technical challenges concern the relationship between thermal time and proper time conjectured by the TTH and the implementation of the TTH in the classical limit. Three conceptual problems concern the flow of time in non-equilibrium states and the extent to which the TTH is background independent and gauge-invariant. While there are potentially viable strategies for addressing the two technical challenges, the three conceptual problems present a tougher hurdle for the defender of the TTH.

Bayesian inference is limited in scope because it cannot be applied in idealized contexts where none of the hypotheses under consideration is true and because it is committed to always using the likelihood as a measure of evidential favoring, even when that is inappropriate. The purpose of this paper is to study inductive inference in a very general setting where finding the truth is not necessarily the goal and where the measure of evidential favoring is not necessarily the likelihood. I use an accuracy argument to argue for probabilism and I develop a new kind of argument to argue for two general updating rules, both of which are reasonable in different contexts. One of the updating rules has standard Bayesian updating, Bissiri et al.’s (2016) general Bayesian updating, and Vassend’s (2019a) quasi-Bayesian updating as special cases. The other updating rule is novel.
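The relation between these updating rules can be sketched in code. The snippet below is an illustrative reading of general Bayesian updating in the style of Bissiri et al. (2016), where each hypothesis's posterior weight is proportional to its prior times exp(−w·loss); the hypotheses, losses, and weight w are invented for illustration, and standard Bayesian updating is recovered when the loss is the negative log-likelihood with w = 1.

```python
import math

def general_bayes_update(prior, losses, w=1.0):
    """General Bayesian updating (a sketch after Bissiri et al. 2016):
    posterior weight on each hypothesis is proportional to
    prior * exp(-w * loss). With w = 1 and loss equal to the negative
    log-likelihood, this reduces to standard Bayesian updating."""
    unnorm = [p * math.exp(-w * l) for p, l in zip(prior, losses)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Toy example: three hypotheses, uniform prior; a smaller loss
# means the hypothesis fits the data better.
prior = [1 / 3, 1 / 3, 1 / 3]
losses = [0.5, 1.0, 2.0]
posterior = general_bayes_update(prior, losses)
```

With w = 1 and losses set to negative log-likelihoods, the result is the usual normalized product of prior and likelihood.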

This paper discusses the relevance of supertask computation for the determinacy of arithmetic. Recent work in the philosophy of physics has made plausible the possibility of supertask computers, capable of running through infinitely many individual computations in a finite time. A natural thought is that, if true, this implies that arithmetical truth is determinate (at least for e.g. sentences saying that every number has a certain decidable property). In this paper we argue, via a careful analysis of putative arguments from supertask computations to determinacy, that this natural thought is mistaken: supertasks are of no help in explaining arithmetical determinacy.

Suppose we have a group of perfect Bayesian agents with the same evidence who nonetheless disagree. By definition of “perfect Bayesian agent”, the disagreement must be rooted in differences in priors between these peers. …

Gao (2017) presents a new mentalistic reformulation of the well-known measurement problem affecting the standard formulation of quantum mechanics. According to this author, it is essentially a determinate-experience problem, namely a problem about the compatibility between the linearity of the Schrödinger equation, the fundamental law of quantum theory, and the definite experiences perceived by conscious observers. In this essay I aim to clarify (i) that the well-known measurement problem is a mathematical consequence of quantum theory’s formalism, and (ii) that its mentalistic variant does not grasp the relevant causes which are responsible for this puzzling issue. The first part of this paper concludes by claiming that the “physical” formulation of the measurement problem cannot be reduced to its mentalistic version. In the second part of this work it will be shown that, contrary to the case of quantum mechanics, Bohmian mechanics and GRW theories provide clear explanations of the physical processes responsible for the definite localization of macroscopic objects and, consequently, for the well-defined perceptions of measurement outcomes by conscious observers. More precisely, the macro-objectification of the states of experimental devices is obtained exclusively in virtue of their clear ontologies and dynamical laws, without any intervention of human observers. Hence, it will be argued that in these theoretical frameworks the measurement problem and the determinate-experience problem are logically distinct issues.

De Finetti is one of the founding fathers of the subjective school of probability. He held that probabilities are subjective, coherent degrees of expectation, and he argued that none of the objective interpretations of probability make sense. While his theory has been influential in science and philosophy, it has encountered various objections. I argue that these objections overlook central aspects of de Finetti’s philosophy of probability and are largely unfounded. I propose a new interpretation of de Finetti’s theory that highlights these aspects and explains how they are an integral part of de Finetti’s instrumentalist philosophy of probability. I conclude by drawing an analogy between misconceptions about de Finetti’s philosophy of probability and common misconceptions about instrumentalism.

As part of the week of posts on R.A. Fisher (February 17, 1890 – July 29, 1962), I reblog a guest post by Stephen Senn from 2012 and 2017. See especially the comments from Feb 2017. ‘Fisher’s alternative to the alternative’
By: Stephen Senn
[2012 marked] the 50th anniversary of RA Fisher’s death. …

This article describes some recent work on ‘direct air capture’ of carbon dioxide—essentially, sucking it out of the air:
• Jon Gertner, The tiny Swiss company that thinks it can help stop climate change, New York Times Magazine, 12 February 2019. …

The logical analysis of agency and games—for an expository introduction to the field see van der Hoek and Pauly’s overview paper 2007—has boomed in the last two decades, giving rise to a plethora of different logics, in particular within the multi-agent systems field. At the heart of these logics are always representations of the possible choices (or actions) of groups of players (or agents) and their powers to force specific outcomes of the game. Some logics take the former as primitives, like STIT (the logic of seeing to it that [Belnap et al., 2001; Horty, 2001]); some take the latter, like CL (coalition logic [Pauly, 2002; Goranko et al., 2013]) and ATL (alternating-time temporal logic [Alur et al., 2002]). In these formalisms the power of players is modeled in terms of the notion of effectivity. In a strategic game, the α-effectivity of a group of players consists of those sets of outcomes of the game for which the players have some collective action which forces the outcome of the game to end up in that set, no matter what the other players do [Moulin and Peleg, 1982]. So, if a set of outcomes X belongs to the α-effectivity of a set of players J, there exists an individual action for each agent in J such that, for all actions of the other players, the outcome of the game will be contained in X. If we keep the actions of the other agents fixed, then the selection of an individual action for each agent in J corresponds to a choice of J under the assumption that the other agents stick to their choices.
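The Moulin–Peleg definition of α-effectivity can be transcribed almost verbatim: a coalition J is α-effective for a set of outcomes X when J has a joint action that forces the outcome into X no matter what the remaining players do. The sketch below checks this by brute force over a small invented matching game (the game and all names are illustrative, not taken from the literature).

```python
from itertools import product

def alpha_effective(outcome, players, coalition, actions, X):
    """Is the coalition alpha-effective for outcome set X?
    True iff there is a joint action for the coalition such that,
    for every joint action of the remaining players, the game's
    outcome lands in X."""
    others = [p for p in players if p not in coalition]
    for joint_j in product(*(actions[p] for p in coalition)):
        fixed = dict(zip(coalition, joint_j))
        if all(outcome({**fixed, **dict(zip(others, joint_o))}) in X
               for joint_o in product(*(actions[p] for p in others))):
            return True
    return False

# Toy 2-player matching game: the outcome is 'match' iff both
# players choose the same action.
players = ['row', 'col']
actions = {'row': ['a', 'b'], 'col': ['a', 'b']}
outcome = lambda prof: 'match' if prof['row'] == prof['col'] else 'mismatch'
```

In this game the grand coalition is α-effective for {'match'}, while a single player is not, since the other can always mismatch.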

In this paper we attempt to shed light on the concept of an agent’s knowledge after a non-deterministic action is executed. We start by making a comparison between notions of non-deterministic choice, and between notions of sequential composition, in settings with a dynamic and/or epistemic character; namely Propositional Dynamic Logic (PDL), Dynamic Epistemic Logic (DEL), and the more recent logic of Semi-Public Environments (SPE). These logics represent two different approaches for defining the aforementioned actions, and in order to provide unified frameworks that encompass both, we define the logics DELVO (DEL + Vision + Ontic change) and PDLVE (PDL + Vision + Epistemic operators). DELVO is given a sound and complete axiomatisation.

This note clarifies several details about the description of the measurement process in Bohmian mechanics and responds to a recent preprint by Shan Gao, wrongly claiming a contradiction in the theory.

The probability that intervals are related by a particular Allen relation is calculated relative to sample spaces Ωn given by the number n of, in one case, points, and, in the other, interval names. In both cases, worlds in the sample space are assumed equiprobable, and Allen relations are classified as short, medium and long. A useful basis for relating intervals is the set of 13 relations described in (Allen, 1983) and widely applied to temporal relations in text and beyond (Liu et al., 2018; Verhagen et al., 2009; Allen and Ferguson, 1994; Kamp and Reyle, 1993, among many others). The present work proceeds from the following question.
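As a sketch of the point-based setting, the snippet below classifies pairs of intervals with endpoints drawn from n points into Allen's 13 relations (inverses marked with an 'i' suffix) and counts how often each relation occurs when all ordered pairs are taken as equiprobable; the paper's exact sample-space construction may differ.

```python
from itertools import combinations

def allen_relation(i, j):
    """Classify the Allen (1983) relation between intervals
    i = (s1, e1) and j = (s2, e2), each with s < e. Returns one of
    13 names; inverse relations carry an 'i' suffix."""
    (s1, e1), (s2, e2) = i, j
    if e1 < s2: return 'before'
    if e1 == s2: return 'meets'
    if s1 < s2 and s2 < e1 < e2: return 'overlaps'
    if s1 == s2 and e1 < e2: return 'starts'
    if s1 > s2 and e1 < e2: return 'during'
    if s1 > s2 and e1 == e2: return 'finishes'
    if s1 == s2 and e1 == e2: return 'equal'
    # Remaining cases are the inverses of the relations above.
    return allen_relation(j, i) + 'i'

def relation_counts(n):
    """Count each Allen relation over all ordered pairs of intervals
    whose endpoints are drawn from n points (a sketch of the
    point-based sample space, pairs assumed equiprobable)."""
    intervals = list(combinations(range(n), 2))
    counts = {}
    for i in intervals:
        for j in intervals:
            r = allen_relation(i, j)
            counts[r] = counts.get(r, 0) + 1
    return counts
```

Dividing each count by the total number of pairs gives the probability of each relation relative to this sample space.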

I explore the logic of the conditional, using credence judgments to argue against Duality and in favor of Conditional Excluded Middle. I then explore how to give a theory of the conditional which validates the latter and not the former, developing a variant on Kratzer (1981)’s restrictor theory, as well as a proposal which combines Stalnaker (1968)’s theory of the conditional with the theory of epistemic modals I develop in Mandelkern 2019a. I argue that the latter approach fits naturally with a conception of conditionals as referential devices which allow us to talk about particular worlds.

Many philosophical discussions presuppose a picture of reality on which, fundamentally, there are objects which have properties and stand in relations. But if we look to how science describes the world, it might be more natural to bring (partial) functions in at the ground level. …

If you take the entries of Pascal’s triangle mod 2 and draw black for 1 and white for 0, you get a pleasing pattern:
The $(2^n - 1)$st row consists of all 1’s. If you look at the triangle consisting of the first $2^n$ rows, and take the limit as $n \to \infty$, you get a fractal called the Sierpinski gasket. …
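The pattern is easy to generate: build each row from the previous one by the Pascal recurrence, reducing mod 2 at each step. A minimal sketch:

```python
def pascal_mod2(rows):
    """Rows of Pascal's triangle mod 2, built by the recurrence
    C(n, k) = C(n-1, k-1) + C(n-1, k), reduced mod 2 at each step."""
    row = [1]
    out = []
    for _ in range(rows):
        out.append(row)
        row = [(a + b) % 2 for a, b in zip([0] + row, row + [0])]
    return out

# Draw black squares for 1, blanks for 0: the Sierpinski pattern emerges.
for r in pascal_mod2(16):
    print(''.join('#' if x else ' ' for x in r).center(31))
```

Row 15 = 2^4 − 1 comes out all 1's, while row 8 = 2^3 has 1's only at its two ends, exactly as the fractal structure predicts.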

Safet is considering the proposition $R$, which says that the handkerchief in his pocket is red. Now, suppose we take red to be a vague concept. And suppose we favour a supervaluationist semantics for propositions that involve vague concepts. …

This paper defends the use of quasi-experiments for causal estimation in economics against the widespread objection that quasi-experimental estimates lack external validity. The defence is that quasi-experimental replication of estimates can yield defeasible evidence for external validity. The paper then develops a different objection. The stable unit treatment value assumption (SUTVA), on which quasi-experiments rely, is argued to be implausible due to the influence of social interaction effects on economic outcomes. A more plausible stable marginal unit treatment value assumption (SMUTVA) is proposed, but it is demonstrated to severely limit the usefulness of quasi-experiments for economic policy evaluation.

Is the mathematical function being computed by a given physical system determined by the system’s dynamics? This question is at the heart of the indeterminacy of computation phenomenon (Fresco et al. [unpublished]). A paradigmatic example is a conventional electrical AND-gate that is often said to compute conjunction, but it can just as well be used to compute disjunction. Despite the pervasiveness of this phenomenon in physical computational systems, it has been discussed in the philosophical literature only indirectly, mostly with reference to the debate over realism about physical computation and computationalism. A welcome exception is Dewhurst’s ([2018]) recent analysis of computational individuation under the mechanistic framework. He rejects the idea of appealing to semantic properties for determining the computational identity of a physical system. But Dewhurst seems to be too quick to pay the price of giving up the notion of computational equivalence. We aim to show that the mechanist need not pay this price. The mechanistic framework can, in principle, preserve the idea of computational equivalence even between two different enough kinds of physical systems, say, electrical and hydraulic ones.
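The AND/OR example can be made concrete: one and the same physical input-output table computes conjunction under one labelling of voltage levels as truth values and disjunction under the dual labelling. The gate table and labellings below are illustrative, not taken from the paper.

```python
# A fixed physical behaviour: output is 'hi' only when both inputs are 'hi'.
gate = {('hi', 'hi'): 'hi', ('hi', 'lo'): 'lo',
        ('lo', 'hi'): 'lo', ('lo', 'lo'): 'lo'}

def interpret(physical_table, labelling):
    """Read a physical input-output table as a Boolean function via a
    labelling from voltage levels to truth values."""
    return {(labelling[a], labelling[b]): labelling[out]
            for (a, b), out in physical_table.items()}

# Same dynamics, two computational identities:
as_and = interpret(gate, {'hi': True, 'lo': False})   # hi = True  -> AND
as_or  = interpret(gate, {'hi': False, 'lo': True})   # hi = False -> OR
```

Nothing in the gate's dynamics selects one labelling over the other, which is exactly the indeterminacy at issue.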

The semantics of propositional logic can be formulated in terms of 2-player games of perfect information. In the present paper the question is posed of what a generalization of propositional logic to a 3-player setting would look like. Two formulations of such a ‘3-player propositional logic’ are given, denoted PL and PL. An overview of some metalogical properties of these logics is provided.
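For the 2-player case the paper generalizes from, the standard evaluation game can be sketched as follows: Verifier chooses at disjunctions, Falsifier at conjunctions, and negation swaps the players' roles; Verifier has a winning strategy exactly when the formula is true under the valuation. (This is the textbook 2-player game, not the paper's 3-player logics.)

```python
def verifier_wins(formula, val):
    """Two-player evaluation game for propositional logic: Verifier
    moves at 'or' (picks a disjunct), Falsifier moves at 'and' (picks
    a conjunct), and 'not' swaps the two roles. Returns True iff
    Verifier has a winning strategy, i.e. iff the formula is true."""
    op, *args = formula
    if op == 'atom':
        return val[args[0]]
    if op == 'not':                      # role swap
        return not verifier_wins(args[0], val)
    if op == 'or':                       # Verifier picks a disjunct
        return any(verifier_wins(a, val) for a in args)
    if op == 'and':                      # Falsifier picks a conjunct
        return all(verifier_wins(a, val) for a in args)
    raise ValueError(f'unknown operator: {op}')

# (p or not p): Verifier wins whatever the valuation says about p.
taut = ('or', ('atom', 'p'), ('not', ('atom', 'p')))
```

A 3-player generalization would have to decide which player moves at each connective and what winning means for the third player, which is precisely the design space the paper explores.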

4.8 All Models Are False
. . . it does not seem helpful just to say that all models are wrong. The very word model implies simplification and idealization. . . . The construction of idealized representations that capture important stable aspects of such systems is, however, a vital part of general scientific analysis. …

A month ago, I excerpted just the very start of Excursion 4 Tour I* on The Myth of the “Myth of Objectivity”. It’s a short Tour, and this continues the earlier post.
4.1 Dirty Hands: Statistical Inference Is Sullied with Discretionary Choices
If all flesh is grass, kings and cardinals are surely grass, but so is everyone else and we have not learned much about kings as opposed to peasants. …

The lesson to be learned from the paradoxical St. Petersburg game and Pascal’s Mugging is that there are situations where expected utility maximizers will needlessly end up (with high probability) poor and on death’s door, and hence we should not be expected utility maximizers. Instead, when it comes to decisionmaking, for possibilities that have very small probabilities of occurring, we should discount those probabilities down to zero, regardless of the utilities associated with those possibilities.
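The "poor with high probability" point shows up clearly in simulation: the St. Petersburg game has infinite expected payout, yet the median payout over many plays is tiny. A minimal sketch:

```python
import random

def st_petersburg(rng):
    """One play of the St. Petersburg game: flip a fair coin until
    heads; if the game ends on flip k, the payout is 2**k. The
    expected payout is sum over k of (1/2**k) * 2**k = infinity."""
    k = 1
    while rng.random() < 0.5:   # tails: keep flipping
        k += 1
    return 2 ** k

rng = random.Random(0)          # seeded for reproducibility
payouts = sorted(st_petersburg(rng) for _ in range(10_000))
median = payouts[len(payouts) // 2]
```

Despite the divergent expectation, the median payout is just 2, so an agent who pays a large fee per play for the "infinite" expected value ends up poor with high probability.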

Let’s start with a puzzle:
Puzzle. You measure the energy and frequency of some laser light trapped in a mirrored box and use quantum mechanics to compute the expected number of photons in the box. Then someone tells you that you used the wrong value of Planck’s constant in your calculation. …
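For concreteness, the relation presumably at stake in the puzzle is the Planck relation for the photon number: light of energy $E$ and frequency $\nu$ contains

```latex
n \;=\; \frac{E}{h\nu}
```

photons, so if the value of $h$ you used was too large by some factor, the photon count you computed was too small by that same factor, even though $E$ and $\nu$ were measured correctly.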

I came across an interesting letter in response to the ASA’s Statement on p-values that I hadn’t seen before. It’s by Ionides, Giessing, Ritov and Page, and it’s very much worth reading. I make some comments below. …

A prominent objection against the logicality of second-order logic is the so-called Overgeneration Argument. However, it is far from clear how this argument is to be understood. In the first part of the article, we examine the argument and locate its main source, namely, the alleged entanglement of second-order logic and mathematics. We then identify various reasons why the entanglement may be thought to be problematic. In the second part of the article, we take a metatheoretic perspective on the matter. We prove a number of results establishing that the entanglement is sensitive to the kind of semantics used for second-order logic. These results provide evidence that by moving from the standard set-theoretic semantics for second-order logic to a semantics which makes use of higher-order resources, the entanglement either disappears or may no longer be in conflict with the logicality of second-order logic.

One of the more obscure arguments for Rawls’ difference principle, dubbed ‘the Pareto argument for inequality’, has been criticised by G. A. Cohen (1995, 2008) as being inconsistent. In this paper, we examine and clarify the Pareto argument in detail and argue (1) that justification for the Pareto principles derives from rational self-interest and thus the Pareto principles ought to be understood as conditions of individual rationality, (2) that the Pareto argument is not inconsistent, contra Cohen, and (3) that the kind of bargaining model required to arrive at the particular unequal distribution that the difference principle picks out is a model that is not based on bargaining according to one’s threat advantage.

Meat Eating: In the past, Jeff didn’t eat meat, since he was concerned with the harm that meat production does to animals. But then he did some calculations, and figured that the harm that he would do to animals by eating meat for a year equals the harm to animals that he could prevent by donating $200 to animal welfare charities. And he would much rather donate $200 more to charity and eat meat than neither make the extra donation nor eat meat. Given this, Jeff now eats meat, but each year donates $200 more than he otherwise would to animal welfare charities.