
A Boltzmann Brain is a hypothesized observer that comes into existence by way of an extremely low-probability quantum or thermodynamic “fluctuation” and that is capable of conscious experience (including sensory experience and apparent memories) and at least some degree of reflection about itself and its environment. Boltzmann Brains do not have histories that are anything like the ones that we seriously consider as candidates for our own history; they did not come into existence on a large, stable planet, and their existence is not the result of any sort of evolutionary process or intelligent design. Rather, they are staggeringly improbable cosmic “accidents” that are (at least typically) massively deluded about their own predicament and history. It is uncontroversial that Boltzmann Brains are both metaphysically and physically possible, and yet that they are staggeringly unlikely to fluctuate into existence at any particular moment. Throughout the following, I will use the term “ordinary observer” to refer to an observer who is not a Boltzmann Brain. We naturally take ourselves to be ordinary observers, and I will not be arguing that we are in any way wrong to do so.

Accuracy and the Laws of Credence is required reading for anyone interested in the foundations of epistemology. It is that rare philosophical work which serves both as a stunningly clear overview of a topic and as a cutting-edge contribution to that topic. I can’t possibly address all of the interesting and philosophically rich components of Accuracy and the Laws of Credence here, so I will largely restrict my attention to pieces of Parts I, II, and III of the book, though I’ll have some more general things to say about Pettigrew’s accuracy-only approach to epistemology toward the end.

Brian Jabarian, U. Paris 1 & Paris School of Economics. How should we evaluate options when we are uncertain about the correct standard of evaluation, for instance due to conflicting normative intuitions? Such ‘normative’ uncertainty differs from ordinary ‘empirical’ uncertainty about an unknown state, and raises new challenges for decision theory and ethics. The most widely discussed proposal is to form the expected value of options, relative to correctness probabilities of competing valuations. But this metatheory overrules our beliefs about the correct risk-attitude: for instance, it fails to be risk-averse when we are certain that the correct (first-order) valuation is risk-averse. We propose an ‘impartial’ metatheory, which respects risk-attitudinal beliefs. We show how one can address empirical and normative uncertainty within a unified formal framework, and rigorously define risk attitudes of theories. Against a common impression, the classical expected-value theory is not risk-neutral, but of hybrid risk attitude: it is neutral to normative risk, not to empirical risk. We show how to define a fully risk-neutral metatheory, and a metatheory that is neutral to empirical risk, not to normative risk. We compare the various metatheories based on their formal properties, and conditionally defend the impartial metatheory.
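The hybrid risk attitude of the classical metatheory can be made concrete with a minimal Python sketch (all numbers and names below are invented for illustration, not taken from the paper): two risk-averse first-order valuations of the same empirical lottery are averaged linearly by their correctness probabilities, which is the sense in which the expected-value metatheory is neutral to normative risk while still inheriting empirical risk aversion from the theories themselves.

```python
# Hypothetical toy example: two candidate first-order valuations of the
# same 50/50 empirical lottery over outcomes 0 and 100.
lottery = [(0.5, 0.0), (0.5, 100.0)]

def certainty_equivalent(lottery, curvature):
    # Concave (power) utility with curvature < 1 => empirical risk aversion:
    # the lottery is valued below its mean outcome of 50.
    return sum(p * x**curvature for p, x in lottery) ** (1 / curvature)

v1 = certainty_equivalent(lottery, 0.5)  # theory 1's value: 25.0
v2 = certainty_equivalent(lottery, 0.3)  # theory 2 is even more risk-averse

# The classical 'expected value' metatheory averages the theories' values
# linearly by their correctness probabilities: it is neutral to normative
# risk even though both theories are averse to empirical risk.
credences = (0.6, 0.4)
meta_value = credences[0] * v1 + credences[1] * v2
```

Both theories value the lottery well below its mean of 50, yet the metatheoretic aggregation across theories is a plain linear average of their verdicts.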

According to logical inferentialists, the meanings of logical expressions are fully determined by the rules for their correct use. Two key proof-theoretic requirements on admissible logical rules, harmony and separability, directly stem from this thesis—requirements, however, that standard single-conclusion and assertion-based formalizations of classical logic provably fail to satisfy (Dummett in The logical basis of metaphysics, Harvard University Press, Cambridge, MA, 1991; Prawitz in Theoria, 43:1–40, 1977; Tennant in The taming of the true, Oxford University Press, Oxford, 1997; Humberstone and Makinson in Mind 120(480):1035–1051, 2011). On the plausible assumption that our logical practice is both single-conclusion and assertion-based, it seemingly follows that classical logic, unlike intuitionistic logic, can’t be accounted for in inferentialist terms. In this paper, I challenge orthodoxy and introduce an assertion-based and single-conclusion formalization of classical propositional logic that is both harmonious and separable. In the framework I propose, classicality emerges as a structural feature of the logic.

In 1933 the Polish logician Alfred Tarski published a paper in which he discussed the criteria that a definition of ‘true sentence’ should meet, and gave examples of several such definitions for particular formal languages. In 1956 he and his colleague Robert Vaught published a revision of one of the 1933 truth definitions, to serve as a truth definition for model-theoretic languages. This entry will simply review the definitions and make no attempt to explore the implications of Tarski’s work for semantics (natural language or programming languages) or for the philosophical study of truth. (For those implications, see the entries on truth and Alfred Tarski.)

We present a formal semantics for epistemic logic, capturing the notion of knowability relative to information (KRI). Like Dretske, we start from the platitude that what an agent can know depends on her (empirical) information. We treat operators of the form KAB (‘B is knowable on the basis of information A’) as variably strict quantifiers over worlds with a topic- or aboutness-preservation constraint. Variable strictness models the non-monotonicity of knowledge acquisition while allowing knowledge to be intrinsically stable. Aboutness-preservation models the topic-sensitivity of information, allowing us to invalidate controversial forms of epistemic closure while validating less controversial ones. Thus, unlike the standard modal framework for epistemic logic, KRI accommodates plausible approaches to the Kripke-Harman dogmatism paradox, which bear on non-monotonicity, or on topic-sensitivity. KRI also strikes a better balance between agent idealization and a non-trivial logic of knowledge ascriptions.
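Schematically, the truth clause for such an operator might be rendered as follows (this is an illustrative reconstruction, not the paper’s official definition: $f_A$ stands for the variably strict selection of worlds given information $A$, and $t(\cdot)$ for a topic assignment):

```latex
% Illustrative truth clause for K^A B ('B is knowable on the basis of A'):
% quantify over the selected A-worlds, and require topic inclusion.
w \Vdash K^A B
  \quad\text{iff}\quad
  \forall w'\,\bigl(w' \in f_A(w) \Rightarrow w' \Vdash B\bigr)
  \ \text{and}\ \ t(B) \sqsubseteq t(A)
```

The first conjunct gives the variably strict quantification over worlds; the second is the aboutness-preservation constraint that blocks the controversial closure principles.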

I’ve been talking about my new paper with Jade Master:
• John Baez and Jade Master, Open Petri nets. In Part 1 we saw the double category of open Petri nets; in Part 2 we saw the reachability semantics for open Petri nets as a double functor. …

We defend the thesis that every necessarily true proposition is always true. Since not every proposition that is always true is necessarily true, our thesis is at odds with theories of modality and time, such as those of Kit Fine and David Kaplan, which posit a fundamental symmetry between modal and tense operators. According to such theories, just as it is a contingent matter what is true at a given time, it is likewise a temporary matter what is true at a given possible world; so a proposition that is now true at all worlds, and thus necessarily true, may yet at some past or future time be false in the actual world, and thus not always true. We reconstruct and criticize several lines of argument in favor of this picture, and then argue against the picture on the grounds that it is inconsistent with certain sorts of contingency in the structure of time.

In recent work, Alfredo Roque Freire and I have realized that the axiom of well-ordered replacement is equivalent to the full replacement axiom, over Zermelo set theory with foundation. The well-ordered replacement axiom is the scheme asserting that if $I$ is well-ordered and every $i\in I$ has unique $y_i$ satisfying a property $\phi(i,y_i)$, then $\{y_i\mid i\in I\}$ is a set. …
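Spelled out as a first-order scheme (one instance for each formula $\phi$), the axiom just described reads:

```latex
% Well-ordered replacement: if I is well-ordered and \phi is functional
% on I, then the image of I under \phi is a set.
\bigl(I \text{ is well-ordered} \ \wedge\ \forall i \in I\; \exists! y\; \phi(i, y)\bigr)
  \;\rightarrow\;
  \exists z\; \forall y\, \bigl( y \in z \leftrightarrow \exists i \in I\; \phi(i, y) \bigr)
```

Full replacement is the same scheme with the well-orderedness hypothesis on $I$ dropped, which is why the equivalence over Zermelo set theory with foundation is noteworthy.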

In Reasons and Persons, Derek Parfit (1984) observed that most people are biased towards the future, at least when it comes to pain and pleasure. That is, they regard a given amount of pain as less bad when it is in the past than when it is in the future, and a given amount of pleasure as less good. While Parfit (implicitly) held that this bias is rational, it has recently come under effective attack by temporal neutralists, who have offered cases that, with plausible auxiliary assumptions, appear to be counterexamples to the rationality claim. I’m going to argue that these cases and the rationale behind them only suffice to motivate a more limited rejection of future bias, and that constrained future bias is indeed rationally permissible. My argument turns on the distinct rational implications of action-guiding and pure temporal preferences. I’ll argue that future bias is rational when it comes to the latter, even if not the former. As I’ll say, Only Action Fixes Utility: it is only when you act on the basis of assigning a utility to an outcome that you rationally commit to giving it the same value when it is past as when it is in the future.

Ibn Sīnā [hereafter: Avicenna] (980–1037 CE) is—directly
or indirectly—the most influential logician in the Arabic
tradition. His work is central in the redefinition of a family of
problems and doctrines inherited from ancient and late ancient logic,
especially Aristotle and the Peripatetic tradition. While, in general
terms, Avicenna squarely falls into a logical tradition that it is
reasonable to characterize as Aristotelian, the trove of innovations
he introduces establishes him as a genuinely new canonical figure. Every
later logician in this tradition confronts him, either as a critic or
as a follower, to the extent that, with few exceptions, Aristotle and
the Peripatetic tradition almost entirely disappear from the
scene.

Jade Master and I have nearly finished a paper on open Petri nets, and it should appear on the arXiv soon. I’m excited about this, especially because our friends at Statebox are planning to use open Petri nets in their software. …

A novel approach to quantization is shown to allow for superpositions of the cosmological constant in isotropic and homogeneous minisuperspace models. Generic solutions featuring such superpositions display unitary evolution and resolution of the classical singularity. Physically well-motivated cosmological solutions are constructed. These particular solutions exhibit characteristic features of a cosmic bounce, including universal phenomenology that can be rendered insensitive to Planck-scale physics in a natural manner.

Continuing with the discussion of E.S. Pearson in honor of his birthday:
Egon Pearson’s Neglected Contributions to Statistics
by Aris Spanos
Egon Pearson (11 August 1895 – 12 June 1980) is widely known today for his contribution in recasting Fisher’s significance testing into the Neyman-Pearson (1933) theory of hypothesis testing. …

It has been observed (e.g. Cooper (1979), Chierchia (1993), von Fintel (1994), Marti (2003)) that the interpretation of natural language variables (overt or covert) can depend on a quantifier. The standard analysis of this phenomenon is to assume a hidden structure inside the variable, part of which is semantically bound by the quantifier. In this paper I argue that the presupposition of the adverb 'again' and other similar presuppositions depend on a variable that gives rise to the same phenomenon.

Coincidence Analysis (CNA) is a configurational comparative method of causal data analysis that is related to Qualitative Comparative Analysis (QCA) but, contrary to the latter, is custom-built for analyzing causal structures with multiple outcomes. So far, however, CNA has only been capable of processing dichotomous variables, which greatly limited its scope of applicability. This paper generalizes CNA for multi-value variables as well as continuous variables whose values are interpreted as membership scores in fuzzy sets. This generalization comes with a major adaptation of CNA’s algorithmic protocol, which, in an extended series of benchmark tests, is shown to give CNA an edge over QCA not only with respect to multi-outcome structures but also with respect to the analysis of non-ideal data stemming from single-outcome structures. The inferential power of multi-value and fuzzy-set CNA is made available to end users in the newest version of the R package cna.

This will be a series of lectures on the philosophy of mathematics, given at Oxford University, Michaelmas term 2018. The lectures are mainly intended for undergraduate students preparing for exam paper 122, although all interested parties are welcome. …

Karl Popper developed a theory of deductive logic in the late 1940s. In his approach, logic is a metalinguistic theory of deducibility relations that are based on certain purely structural rules. Logical constants are then characterized in terms of deducibility relations. Characterizations of this kind are also called inferential definitions by Popper. In this paper, we expound his theory and elaborate some of his ideas and results that in some cases were only sketched by him. Our focus is on Popper’s notion of duality, his theory of modalities, and his treatment of different kinds of negation. This allows us to show how his works on logic anticipate some later developments and discussions in philosophical logic, pertaining to trivializing (tonk-like) connectives, the duality of logical constants, dual-intuitionistic logic, the (non-)conservativeness of language extensions, the existence of a bi-intuitionistic logic, the non-logicality of minimal negation, and to the problem of logicality in general.

The conception of implications as rules is interpreted in Lorenzen-style dialogical semantics. Implications-as-rules are given attack and defense principles, which are asymmetric between proponent and opponent. Whereas on the proponent’s side these principles have the usual form, on the opponent’s side implications function as database entries that can be used by the proponent to defend assertions independently of their logical form. The resulting system, which also comprises a principle of cut, is equivalent to the sequent-style system for implications-as-rules. It is argued that the asymmetries arising in the dialogical setting are not deficiencies but reflect the pre-logical (‘structural’) character of the notion of rule.

Atomic systems, that is, sets of rules containing only atomic formulas, play an important role in proof-theoretic notions of logical validity. We consider a view of atomic systems as definitions that allows us to discuss a proposal made by Prawitz (2016). The implementation of this view in the base case of an inductive definition of validity leads to the problem that derivability of atomic formulas in an atomic system does not coincide with the validity of these formulas. This is due to the fact that, on the definitional view of atomic systems, there are not just production rules, but both introduction and elimination rules for atoms, which may even generate non-normalizable atomic derivations. This shows that the way atomic systems are handled is a fundamental issue of proof-theoretic semantics.

The BHK interpretation of logical constants is analyzed in terms of a systematic account given by Prawitz, resulting in a reformulation of the BHK interpretation in which the assertability of atomic propositions is determined by Post systems. It is shown that the reformulated BHK interpretation renders more propositions assertable than are provable in intuitionistic propositional logic. Mints’ law is examined as an example of such a proposition. Intuitionistic propositional logic would thus have to be considered incomplete. We conclude with a discussion on the adequacy of the BHK interpretation of implication.

The inversion principle expresses a relationship between left and right introduction rules for logical constants. Hallnäs and Schroeder-Heister [2] presented the principle of definitional reflection as a means of capturing the idea embodied in the inversion principle. Using the principle of definitional reflection, we show for minimal propositional logic that the left introduction rules are admissible when the right introduction rules are given as the definition of logical constants, and vice versa. Keywords: proof theory, inversion principle, admissibility, logical rules.

We propose a compositional Bayesian semantics that interprets declarative sentences in a natural language by assigning them probability conditions. These are conditional probabilities that estimate the likelihood that a competent speaker would endorse an assertion, given certain hypotheses. Our semantics is implemented in a functional programming language. It estimates the marginal probability of a sentence through Markov Chain Monte Carlo (MCMC) sampling of objects in vector space models satisfying specified hypotheses. We apply our semantics to examples with several predicates and generalised quantifiers, including higher-order quantifiers. It captures the vagueness of predication (both gradable and non-gradable), without positing a precise boundary for classifier application. We present a basic account of semantic learning based on our semantic system. We compare our proposal to other current theories of probabilistic semantics, and we show that it offers several important advantages over these accounts.
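The general idea can be illustrated with a schematic Python sketch (this is not the paper’s implementation, which is written in a functional programming language and uses MCMC over richer vector-space models; here plain Monte Carlo sampling and all concrete numbers are invented): the probability of a vague sentence like ‘x is tall’ is modelled as the chance a competent speaker would endorse it, via a graded classifier with no sharp boundary.

```python
import math
import random

random.seed(0)

# Graded (sigmoid) endorsement of 'x is tall' over a 1-D "vector space"
# of heights -- no precise boundary for the predicate.  The midpoint and
# slope parameters are hypothetical.
def p_endorse_tall(height_cm, midpoint=180.0, slope=0.15):
    return 1.0 / (1.0 + math.exp(-slope * (height_cm - midpoint)))

# Marginal probability of 'x is tall' for a random individual, estimated
# by plain Monte Carlo sampling over an assumed population of heights.
def marginal_prob(n_samples=10_000):
    total = 0.0
    for _ in range(n_samples):
        height = random.gauss(172.0, 8.0)  # assumed height distribution
        total += p_endorse_tall(height)
    return total / n_samples

estimate = marginal_prob()
```

The estimate lies strictly between 0 and 1: the sentence is neither definitely endorsed nor definitely rejected, which is the intended treatment of vague predication.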

In this paper we show how to formalise false-belief tasks like the Sally-Anne task and the second-order chocolate task in Dynamic Epistemic Logic (DEL). False-belief tasks are used to test the strength of the Theory of Mind (ToM) of humans, that is, a human’s ability to attribute mental states to other agents. Having a ToM is known to be essential to human social intelligence, and hence likely to be essential to the social intelligence of artificial agents as well. It is therefore important to find ways of implementing a ToM in artificial agents, and to show that such agents can then solve false-belief tasks. In this paper, the approach is to use DEL as a formal framework for representing ToM, and to use reasoning in DEL to solve false-belief tasks. In addition to formalising several false-belief tasks in DEL, the paper introduces some extensions of DEL itself: edge-conditioned event models and observability propositions. These extensions are introduced to provide better formalisations of the false-belief tasks, but are expected to have independent future interest.

The title well represents this paper’s goals. I shall discuss certain basic issues pertaining to subjective probability and, in particular, the point at which the concept of natural predicates is necessary within the probabilistic framework. Hempel’s well-known puzzle of the ravens serves as a starting point and as a concrete example. I begin by describing in §2 four solutions that have been proposed. Two of these represent fundamental approaches that concern me most: the probabilistic standard solution and what I refer to as the natural-predicates solution. The first is essentially due to various investigators, among them Hempel himself. The second has been proposed by Quine in his ‘Natural kinds’; it represents a general line rather than a single precise solution. Underlying it is some classification of properties (or, to remain safely on the linguistic level, of predicates) which derives from epistemic or pragmatic factors and is, at least prima facie, irreducible to distinctions in terms of logical structure. Goodman’s concept of entrenchment belongs here as well (his paradox is taken up in §3 and §5). Of the other two, the one referred to as a “nearly-all” solution is based on interpreting ‘all’ (in ‘all ravens are black’) as nearly all. An analysis shows that the valid part of this argument is reducible to the standard probabilistic solution. The remaining solution is based on a modal interpretation; it is shown to belong to the natural-predicates brand. Another modality argument turns out, upon analysis, to be false.

E.S. Pearson: 11 Aug 1895 – 12 June 1980. Today is Egon Pearson’s birthday. In honor of his birthday, I am posting “Statistical Concepts in Their Relation to Reality” (Pearson 1955). I’ve posted it several times over the years, but always find a new gem or two, despite its being so short. …

When Aristotle argues at Metaphysics Z.17, 1041b11–33 that a whole which is not a heap contains ‘something else’, i.e. the form, besides the elements, it is not clear whether or not the form is a proper part of the whole. I defend the claim that, within the context of the relevant passage, the form is not a proper part, since the whole is divided into elements, not into elements and the form. Different divisions determine different senses of ‘part’, and thus the form is not a part in the same sense as the elements are parts. I object to Koslicki’s (2006) interpretation, according to which the form is a proper part alongside the elements in a single sense of ‘part’, although she insists that the form and the elements belong to different categories. I argue that Koslicki’s reading involves a category mistake, i.e. the conjunction of items that do not belong to the same category (Goldwater 2018). Since for Aristotle parthood presupposes some kind of similarity of parts, the conjunction of form and elements requires treating these items as somehow belonging to the same category, e.g. ‘being’, but no such category exists.

According to an increasingly popular epistemological view, people need outright beliefs in addition to credences to simplify their reasoning. Outright beliefs simplify reasoning by allowing thinkers to ignore small error probabilities. What is outright believed can change between contexts. It has been claimed that thinkers manage shifts in their outright beliefs and credences across contexts by an updating procedure resembling conditionalization, which I call pseudo-conditionalization (PC). But conditionalization is notoriously complicated. The claim that thinkers manage their beliefs via PC is thus in tension with the view that the function of beliefs is to simplify our reasoning. I propose to resolve this puzzle by rejecting the view that thinkers employ PC. Based on this solution, I furthermore argue for a descriptive and a normative claim. The descriptive claim is that the available strategies for managing beliefs and credences across contexts that are compatible with the simplifying function of outright beliefs can generate synchronic and diachronic incoherence in a thinker’s attitudes. Moreover, I argue that the view of outright belief as a simplifying heuristic is incompatible with the view that there are ideal norms of coherence or consistency governing outright beliefs that are too complicated for human thinkers to comply with.

This thesis is an account of research undertaken between February 2007 and August 2011 at the Centre for Time, Department of Philosophy, School of Philosophical and Historical Inquiry, University of Sydney, Australia. Except where acknowledged in the customary manner, the material presented in this thesis is, to the best of my knowledge, original and has not been submitted in whole or part for a degree in any university.

The quantum query complexity of approximate counting was one of the first topics studied in quantum algorithms. Given a nonempty finite set S ⊆ [N] (here and throughout, [N] = {1, . . . , N}), suppose we want to estimate its cardinality, |S|, to within some multiplicative accuracy ε. This is a fundamental task in theoretical computer science, used as a subroutine for countless other tasks. As is standard in quantum algorithms, we work in the so-called black-box model (see [10]), where we assume only that we’re given a membership oracle for S: an oracle that, for any i ∈ [N], tells us whether i ∈ S. We can, however, query the oracle in quantum superposition. How many queries must a quantum computer make, as a function of both N and |S|, to solve this problem with high probability?
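For contrast, the naive classical approach to the same black-box problem looks as follows (a Python sketch of the classical baseline, not the quantum algorithm; the particular N, S, and query budget are made up for illustration):

```python
import random

random.seed(1)

# Classical baseline for approximate counting: estimate |S| for S ⊆ [N]
# using only membership queries, by sampling indices uniformly at random
# and scaling the observed hit frequency by N.
def estimate_cardinality(member, N, n_queries):
    hits = sum(member(random.randrange(1, N + 1)) for _ in range(n_queries))
    return N * hits / n_queries

N = 10_000
S = set(range(1, 2_501))       # hidden set with |S| = 2500
oracle = lambda i: i in S      # the membership oracle

est = estimate_cardinality(oracle, N, n_queries=5_000)
```

Classically, reaching multiplicative accuracy ε this way takes on the order of N/(ε²|S|) queries, since the hit probability per sample is |S|/N; the question posed above is how much better a quantum computer, which can query the oracle in superposition, can do.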