
A trivalent theory of indicative conditionals automatically enforces Stalnaker’s thesis: the equation between probabilities of conditionals and conditional probabilities. This result holds because the trivalent semantics requires, for principled reasons, a modification of the ratio definition of conditional probability in order to accommodate the possibility of undefinedness. I analyze precisely how this modification allows the trivalent semantics to avoid a number of well-known triviality results, in the process clarifying why these results hold for many bivalent theories. I suggest that the slew of triviality results published in the last 40-odd years need not be viewed as an argument against Stalnaker’s thesis: it can be construed instead as an argument for abandoning the bivalent requirement that conditionals somehow be assigned a truth-value in worlds in which their antecedents are false.
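For reference, the two pieces of machinery at issue can be stated schematically; these are the standard bivalent formulations, not the paper's trivalent modification:

```latex
% Stalnaker's thesis: the probability of an indicative conditional
% equals the corresponding conditional probability.
\[
  P(A \rightarrow C) \;=\; P(C \mid A)
\]
% The classical ratio definition of conditional probability, which the
% trivalent semantics modifies to accommodate undefinedness.
\[
  P(C \mid A) \;=\; \frac{P(A \wedge C)}{P(A)},
  \qquad \text{provided } P(A) > 0
\]
```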

In a series of papers, Colbeck and Renner (2011, 2015a,b) claim to have shown that the quantum state provides a complete description for the prediction of future measurement outcomes. In this paper I argue that thus far no satisfactory proof has been presented to support this claim. Building on the earlier work of Leifer (2014), Landsman (2015) and Leegwater (2016), I present and prove two results that only partially support this claim. I then discuss the arguments by Colbeck, Renner and Leegwater concerning how these results are supposed to generalize to the full claim. This argument turns out to hinge on the implicit use of an assumption concerning the way unitary evolution is to be represented in any possible completion of quantum mechanics. I argue that this assumption is unsatisfactory and that possible attempts to validate it based on measurement theory also do not succeed.

Following Reichenbach, it is widely held that in making a direct inference, one should base one’s conclusion on a relevant frequency statement concerning the most specific reference class for which one is able to make a warranted and relatively precise-valued frequency judgment. In cases where one has accurate and precise-valued frequency information for two relevant reference classes, R1 and R2, but lacks accurate and precise-valued frequency information concerning their intersection, R1 ∩ R2, it is widely held, again following Reichenbach, that no inference may be drawn. In contradiction to Reichenbach and the common wisdom, I argue for the view that it is often possible to draw a reasonable, informative conclusion in such circumstances. As a basis for drawing such a conclusion, I show that one is generally in a position to formulate a reasonable direct inference for a reference class that is more specific than either R1 or R2.
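The Reichenbachian worry can be made vivid with a toy computation (the populations and the attribute here are invented for illustration): two populations can agree exactly on the attribute's frequency within R1 and within R2 while differing maximally on its frequency within R1 ∩ R2, which is why, lacking intersection data, the orthodox rule licenses no inference.

```python
# Toy illustration (hypothetical data): identical R1 and R2 frequencies,
# maximally different frequencies in the intersection R1 ∩ R2.

def freq(pop, in_class):
    """Relative frequency of the attribute among members of a class."""
    members = [has_attr for (r1, r2, has_attr) in pop if in_class(r1, r2)]
    return sum(members) / len(members)

def make_pop(n_r1_only, k1, n_both, kb, n_r2_only, k2):
    """Build a population of (in_R1, in_R2, has_attribute) triples,
    with k1/kb/k2 attribute-bearers in each of the three cells."""
    return ([(True, False, i < k1) for i in range(n_r1_only)]
            + [(True, True, i < kb) for i in range(n_both)]
            + [(False, True, i < k2) for i in range(n_r2_only)])

# Population A: everyone in the intersection has the attribute.
pop_a = make_pop(20, 8, 10, 10, 20, 8)
# Population B: no one in the intersection has the attribute.
pop_b = make_pop(20, 18, 10, 0, 20, 18)

for pop in (pop_a, pop_b):
    print(freq(pop, lambda r1, r2: r1),         # freq within R1: 0.6 in both
          freq(pop, lambda r1, r2: r2),         # freq within R2: 0.6 in both
          freq(pop, lambda r1, r2: r1 and r2))  # R1 ∩ R2: 1.0 vs 0.0
```

The frequencies within R1 and R2 alone thus place no constraint at all on the intersection frequency, which is the gap the paper's more-specific reference classes are meant to bridge.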

In our representations of the world, especially in physics, (mathematical) infinities play a crucial role. The continuum of the real numbers, \(\Re\), as a representation of time or of one-dimensional space is surely the best known example and, by extension, the \(n\)-fold cartesian product, \(\Re^{n}\), for \(n\)-dimensional space. However, these same infinities also cause problems. One just has to think about Zeno’s paradoxes or the present-day continuation of that discussion, namely the discussion about supertasks, to see the difficulties (see the entry on supertasks in this encyclopedia for a full treatment).

Traditional oppositions are at least two-dimensional in the sense that they are built upon a famous two-dimensional object, the square of opposition, and its extensions, such as Blanché’s hexagon. Instead of two-dimensional objects, this article proposes a construction that deals with oppositions on a one-dimensional line segment.

The study of iterated belief change has principally focused on revision, with the other main operator of AGM belief change theory, namely contraction, receiving comparatively little attention. In this paper we show how principles of iterated revision can be carried over to iterated contraction by generalising a principle known as the ‘Harper Identity’. The Harper Identity provides a recipe for defining the belief set resulting from contraction by a sentence A in terms of (i) the initial belief set and (ii) the belief set resulting from revision by ¬A.
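In the usual AGM notation, writing K − A for contraction and K ∗ A for revision, the Harper Identity combines exactly the ingredients (i) and (ii):

```latex
% Harper Identity: contraction by A is the intersection of the initial
% belief set K with the result of revising K by the negation of A.
\[
  K - A \;=\; K \cap (K \ast \neg A)
\]
```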

In ‘Essence and Modality’, Kit Fine (1994) proposes that for a proposition to be metaphysically necessary is for it to be true in virtue of the nature of all objects. Call this view Fine’s Thesis. This paper is a study of Fine’s Thesis in the context of Fine’s logic of essence (LE). Fine himself has offered his most elaborate defence of the thesis in the context of LE. His defence rests on the widely shared assumption that metaphysical necessity obeys the laws of the modal logic S5. In order to get S5 for metaphysical necessity, he assumes a controversial principle about the nature of all objects. I will show that the addition of this principle to his original system E5 leads to inconsistency with an independently plausible principle about essence. In response, I develop a theory that avoids this inconsistency while allowing us to maintain S5 for metaphysical necessity. However, I conclude that our investigation of Fine’s Thesis in the context of LE motivates the revisionary conclusion that metaphysical necessity obeys the principles of the modal logic S4, but not those of S5. I argue that this constitutes a distinctively essentialist challenge to the received view that the logic of metaphysical necessity is S5.
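For orientation, the characteristic axioms separating the two candidate logics, in their standard formulations (these are the textbook schemas, not Fine's own axiomatization of LE): S4 extends T with (4), while S5 additionally validates (5).

```latex
% (4): what is necessary is necessarily necessary.
% (5): what is possible is necessarily possible.
\[
  \text{(4)}\quad \Box\varphi \rightarrow \Box\Box\varphi
  \qquad\qquad
  \text{(5)}\quad \Diamond\varphi \rightarrow \Box\Diamond\varphi
\]
```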

According to Jens Høyrup, propositions 1 to 10 of book 2 of Euclid’s Elements function as a critique of previous non-rigorous procedures of Old Babylonian mathematics. Høyrup’s remarks on his notion of critique are scattered throughout his works. Here, we take them into account to give an integrated presentation of the notion of critique, one that also seeks to reveal features left implicit in Høyrup’s account.

Running verification tasks in database-driven systems requires solving quantifier elimination problems of a new kind. These quantifier elimination problems are related to the notion of a cover introduced in ESOP 2008 by Gulwani and Musuvathi. In this paper, we show how covers are closely related to model completions, a well-known topic in model theory. We also investigate the computation of covers within the Superposition Calculus, by adopting a constrained version of the calculus, equipped with appropriate settings and reduction strategies. In addition, we show that cover computations are computationally tractable for the fragment of the language used in applications to database-driven verification. This observation is confirmed by analyzing the preliminary results obtained using the MCMT tool on the verification of data-aware process benchmarks. These benchmarks can be found in the latest version of the tool distribution.

Fitch’s Paradox shows that if every truth is knowable, then every truth is known. Standard diagnoses identify the factivity/negative infallibility of the knowledge operator and Moorean contradictions as the root source of the result. This paper generalises Fitch’s result to show that such diagnoses are mistaken. In place of factivity/negative infallibility, the weaker assumption of any ‘level-bridging principle’ suffices. A consequence is that the result holds for some logics in which the “Moorean contradiction” commonly thought to underlie the result is in fact consistent. This generalised result improves on the current understanding of Fitch’s result and widens the range of modalities of philosophical interest to which the result might be fruitfully applied. Along the way, we also consider a semantic explanation for Fitch’s result which answers a challenge raised by Kvanvig (2006).
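The result being generalised runs, in its familiar form, as follows (this is the standard textbook derivation, with factivity playing the role the paper argues is dispensable):

```latex
\[
\begin{array}{lll}
1. & (p \wedge \neg K p) \rightarrow \Diamond K(p \wedge \neg K p)
   & \text{knowability, instantiated}\\
2. & K(p \wedge \neg K p) \rightarrow (K p \wedge K \neg K p)
   & \text{knowledge distributes over } \wedge\\
3. & K \neg K p \rightarrow \neg K p
   & \text{factivity}\\
4. & \neg K(p \wedge \neg K p)
   & \text{from 2, 3: the supposed knowledge is contradictory}\\
5. & \neg \Diamond K(p \wedge \neg K p)
   & \text{from 4, by necessitation and modal reasoning}\\
6. & p \rightarrow K p
   & \text{from 1, 5}
\end{array}
\]
```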

The aim of this paper is to investigate counterfactual logic and its implications for the modal status of mathematical claims. It is most directly a response to an ambitious program by Yli-Vakkuri and Hawthorne (2018), who seek to establish that mathematics is committed to its own necessity. I claim that their argument fails to establish this result for two reasons. First, the system of counterfactual logic they develop is provably equivalent to appending the Deduction Theorem to a T modal logic. It is neither new nor surprising that the combination of T with the Deduction Theorem results in necessitation; this has been widely known since the formalization of modal logic in the 1960s. Indeed, it is precisely for this reason that the Deduction Theorem is almost universally rejected in modal contexts. Absent a reason to accept the Deduction Theorem in this case, we remain without a compelling argument for the necessity of mathematics. Second, their assumptions force our hand on controversial debates within counterfactual logic. In particular, they license counterfactual strengthening, that is, the inference from ‘If A were true then C would be true’ to ‘If A and B were true then C would be true’, which many reject. Many philosophers are thus unable to avail themselves of this result.
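The strengthening inference at issue, in the usual box-arrow notation (the match counterexample below is a standard illustration from the counterfactuals literature, not the authors' own):

```latex
% Counterfactual strengthening (also called strengthening the antecedent):
\[
  \frac{A \mathbin{\Box\!\!\rightarrow} C}
       {(A \wedge B) \mathbin{\Box\!\!\rightarrow} C}
\]
```

On the standard counterexample pattern, ‘If I struck this match, it would light’ can be true while ‘If I struck this match and it were soaked in water, it would light’ is false, which is why similarity-based semantics for counterfactuals invalidate the inference.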

All standard epistemic logics legitimate something akin to the principle of closure, according to which knowledge is closed under competent deductive inference. And yet the principle of closure, particularly in its multiple-premise guise, has a somewhat ambivalent status within epistemology. One might think that serious concerns about closure point us away from epistemic logic altogether—away from the very idea that the knowledge relation could be fruitfully treated as a kind of modal operator. This, however, need not be so. The abandonment of closure may yet leave in place plenty of formal structure amenable to systematic logical treatment. In this paper we describe a family of weak epistemic logics in which closure fails, and describe two alternative semantic frameworks in which these logics can be modelled. One of these—which we term plurality semantics—is relatively unfamiliar. We explore under what conditions plurality frames validate certain much-discussed principles of epistemic logic. It turns out that plurality frames can be interpreted in a very natural way in light of one motivation for rejecting closure, adding to the significance of our technical work. The second framework that we employ—neighbourhood semantics—is much better known. But we show that it too can be interpreted in a way that comports with a certain motivation for rejecting closure.
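Schematically, the single-premise and multiple-premise closure principles under discussion can be rendered as follows (standard formulations, not the paper's official statements):

```latex
% Single-premise closure (closure under known implication):
\[
  \bigl(K\varphi \wedge K(\varphi \rightarrow \psi)\bigr) \rightarrow K\psi
\]
% Multiple-premise closure: knowledge of each premise carries over to a
% competently deduced consequence.
\[
  (K\varphi_1 \wedge \cdots \wedge K\varphi_n) \rightarrow K\psi,
  \qquad\text{where } \varphi_1, \ldots, \varphi_n \vDash \psi
\]
```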

Any philosophy of mathematics deserving the name “logicism” must hold that mathematical truths are in some sense logical truths. Today, a typical characterization of a logical truth is one that remains true under all (re)interpretations of its non-logical vocabulary. Put a bit crudely, this means that something can be a logical truth only if all other statements of the same form are also true. “Fa ⊃ (Rab ⊃ Fa)” can be a logical truth because not only it, but all propositions of the form “p ⊃ (q ⊃ p)”, are true. It does not matter what “F”, “R”, “a” and “b” mean, or what specific features the objects meant have. Applying this conception of a logical truth in the context of logicism seems to present an obstacle. “Five is prime”, at least on the surface, is a simple subject-predicate assertion, and obviously not all subject-predicate assertions are true. How, then, could this be a logical truth? Similarly, “7 > 5” asserts a binary relation, but obviously not all binary relations hold. In what follows, I shall call this the logical form problem for logicism.

In a recent paper, Brian Rabern suggests a semantics for languages with two kinds of modality: standard Kripkean metaphysical modality as well as epistemic modality. This semantics presents an alternative to two-dimensionalism, which has been developed over the last few decades. Both Rabern’s semantics and two-dimensionalism are subject to a puzzle that Chalmers and Rabern (Analysis, 74(2), 210–224, 2014) call the nesting problem. I will investigate how Rabern’s semantics answers this puzzle.

In a series of papers, Colbeck and Renner (2011, 2015a,b) claim to have shown that the quantum state provides a complete description for the prediction of future measurement outcomes. In this paper I argue that thus far no satisfactory proof has been presented to support this claim. Building on the earlier work of Landsman (2015) and Leegwater (2016), I highlight the implicit use of an assumption concerning the way unitary evolution is to be represented in any possible completion of quantum mechanics. I show that this assumption is quite crucial to the proof of the claim and argue that it is unwarranted. I further discuss a possible validation for a restricted version of this assumption that is based on considerations of measurement processes. I argue that this restricted version is also unsatisfactory.

We compare the notions of genericity and arbitrariness on the basis of the realist import of the method of forcing. We argue that Cohen’s Theorem, similarly to Cantor’s Theorem, can be considered a metatheoretical argument in favor of the existence of uncountable collections. We then discuss the effects of this metatheoretical perspective on Skolem’s Paradox. We conclude by discussing how the connection between arbitrariness and genericity can offer arguments in favor of Forcing Axioms.

We present a construction of a truth class (an interpretation of a compositional truth predicate) in an arbitrary countable recursively saturated model of firstorder arithmetic. The construction is fully classical in that it employs nothing more than the classical techniques of formal proof theory.

Quasi-truth (a.k.a. pragmatic truth or partial truth) is typically advanced as a framework accounting for incompleteness and uncertainty in the actual practices of science. It is also said to be useful for accommodating cases of inconsistency in science without leading to triviality. In this paper, we argue that these developments do not deliver all that is promised. We examine the most prominent account of quasi-truth available in the literature, advanced by da Costa and collaborators in many places, and argue that it cannot legitimately account for incompleteness in science: we shall claim that it conflates paraconsistency and paracompleteness. It also cannot account for inconsistencies, because no direct contradiction of the form α ∧ ¬α can be quasi-true, according to the framework. Finally, we advance an alternative interpretation of the formalism in terms of dealing with distinct contexts in which incompatible information is handled. This does not save the original program, but it seems to make better sense of the formalism.

We consider systems of rational agents who act in pursuit of their individual and collective objectives. We study the reasoning of an agent, or of an external observer, about the consequences of the expected choices of action of the other agents based on their objectives, in order to assess the reasoner’s ability to achieve his own objective. To formalize such reasoning we introduce new modal operators of conditional strategic reasoning and use them to extend Coalition Logic in order to capture variations of conditional strategic reasoning. We provide formal semantics for the new conditional strategic operators, introduce the matching notion of bisimulation for each of them, and briefly discuss and compare their expressiveness.

In the last post I set out a puzzling passage from Lewis. That was the first part of his account of “common knowledge”. If we could get over the sticking point I highlighted, we’d find the rest of the argument would show us how individuals confronted with a special kind of state of affairs A—a “basis for common knowledge that Z”—would end up having reason to believe that Z, reason to believe that all others have reason to believe Z, reason to believe that all others have reason to believe that all others have reason to believe Z, and so on for ever. …

We introduce Arbitrary Public Announcement Logic with Memory (APALM), obtained by adding to the models a ‘memory’ of the initial states, representing the information before any communication took place (“the prior”), and adding to the syntax operators that can access this memory. We show that APALM is recursively axiomatizable (in contrast to the original Arbitrary Public Announcement Logic, for which the corresponding question is still open). We present a complete recursive axiomatization that uses a natural finitary rule, and we study this logic’s expressivity and the appropriate notion of bisimulation.

We present a dynamic logic for inductive learning from partial observations by a “rational” learner that obeys the AGM postulates for belief revision. We apply our logic to an example, showing how various concrete properties can be learnt with certainty or inductively by such an AGM learner. We present a sound and complete axiomatization, based on a combination of relational and neighborhood versions of the canonical model method.

Subset space semantics for public announcement logic in the spirit of the effort modality have been proposed by Wang and Ågotnes [18] and by Bjorndahl [6]. They propose to model the public announcement modality by shrinking the epistemic range with respect to which a postcondition of the announcement is evaluated, instead of by restricting the model to the set of worlds satisfying the announcement. Thus we get an “elegant, model-internal mechanism for interpreting public announcements” [6, p.12]. In this work, we extend Bjorndahl’s logic PALint of public announcement, which is modelled on topological spaces using subset space semantics and adding the interior operator, with an arbitrary announcement modality, and we provide topological subset space semantics for the corresponding arbitrary announcement logic APALint, and demonstrate completeness of the logic by proving that it is equal in expressivity to the logic without arbitrary announcements, employing techniques from [2, 13].

We extend the ‘topologic’ framework [13] with dynamic modalities for ‘topological public announcements’ in the style of Bjorndahl [5]. We give a complete axiomatization for this “Dynamic TopoLogic”, which is in a sense simpler than the standard axioms of topologic. Our completeness proof is also more direct (making use of a standard canonical model construction). Moreover, we study the relations between this extension and other known logical formalisms, showing in particular that it is coexpressive with the simpler (and older) logic of interior and global modality [10, 4, 14, 1]. This immediately provides an easy decidability proof (both for topologic and for our extension).

Here is a potential problem for Aquinas’ Five Ways. Each of them proves the existence of a very special being. But do they each prove the existence of the same being? After giving the Five Ways in Summa Theologica I, Aquinas goes on to argue that the being he proved the existence of has the attributes that are needed for it to be the God of Western monotheism. …

Games between two players, of the kind where one player wins and one
loses, became a familiar tool in many branches of logic during the
second half of the twentieth century. Important examples are semantic
games used to define truth, back-and-forth games used to compare
structures, and dialogue games to express (and perhaps explain) formal
proofs.

We investigate the issue of aggregativity in fair division problems from the perspective of cooperative game theory and Broomean theories of fairness. Paseau and Saunders (Utilitas 27:460–469, 2015) proved that no non-trivial theory of fairness can be aggregative and conclude that theories of fairness are therefore problematic, or at least incomplete. We observe that there are theories of fairness, particularly those that are based on cooperative game theory, that do not face the problem of non-aggregativity. We use this observation to argue that the universal claim that no non-trivial theory of fairness can guarantee aggregativity is false. Paseau and Saunders’s mistaken assertion can be understood as arising from a neglect of the (cooperative) games approach to fair division. Our treatment has two further payoffs: for one, we give an accessible introduction to the (cooperative) games approach to fair division, whose significance has hitherto not been appreciated by philosophers working on fairness. For another, our discussion explores the issue of aggregativity in fair division problems in a comprehensive fashion.
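As a minimal illustration of the cooperative-games approach the abstract refers to, here is a sketch of one standard solution concept from that literature, the Shapley value, computed on a hypothetical three-player division problem (the game and its numbers are invented for illustration, not taken from the paper):

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value: each player's marginal contribution to the growing
    coalition, averaged over all orders in which players can join."""
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            totals[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: totals[p] / len(orders) for p in players}

# Hypothetical game: the grand coalition is worth 100; any coalition
# containing player 'a' can secure 60 on its own; others get nothing.
def v(coalition):
    if coalition == frozenset("abc"):
        return 100.0
    return 60.0 if "a" in coalition else 0.0

print(shapley("abc", v))
```

Here the Shapley value gives player 'a' 220/3 ≈ 73.3 and players 'b' and 'c' 40/3 ≈ 13.3 each, an efficient division (the shares sum to 100) that rewards 'a' for its outside option.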

Assume that propositions—things that are or determine functions from possible worlds to truth-values—are the objects of the attitudes, the possessors of modal properties like being possible or necessary, the things we assert by uttering sentences in contexts, and perhaps more. Suppose we have a compositional semantics that assigns semantic values relative to contexts to the well-formed expressions of a natural language. What is the relation between the proposition expressed by a sentence in a context and the semantic value assigned to the sentence in that context? Surely the simplest view, and so the one we should prefer other things being equal, is that the relation is identity: the proposition expressed by a sentence in a context just is the semantic value of that sentence in that context. We’ll call this the classical picture. Lewis [1980] claimed that the proper semantics for certain expressions precluded identifying the compositional semantic values of sentences relative to contexts with propositions as the classical picture does.

We examine general decision problems with loss functions that are bounded below. We allow the loss function to assume the value ∞. No other assumptions are made about the action space, the types of data available, the types of non-randomized decision rules allowed, or the parameter space. By allowing prior distributions and the randomizations in randomized rules to be finitely additive, we prove very general complete class and minimax theorems. Specifically, under the sole assumption that the loss function is bounded below, we show that every decision problem has a minimal complete class and all admissible rules are Bayes rules. We also show that every decision problem has a minimax rule and a least-favorable distribution, and that every minimax rule is Bayes with respect to the least-favorable distribution. Some special care is required to deal properly with infinite-valued risk functions and integrals taking infinite values.
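The minimax/least-favorable-prior relationship the theorems generalize can be seen in miniature (the loss matrix below is invented for illustration): in a no-data problem with two states and two actions, a randomized rule plays a1 with probability p, the minimax p is found by grid search, and it turns out to be Bayes against the prior that equalizes the Bayes risks of the two pure actions.

```python
# Hypothetical 2-state, 2-action, no-data decision problem.
LOSS = {("theta0", "a0"): 0.0, ("theta0", "a1"): 4.0,
        ("theta1", "a0"): 3.0, ("theta1", "a1"): 0.0}

def risk(theta, p):
    """Expected loss in state theta under 'play a1 with probability p'."""
    return p * LOSS[(theta, "a1")] + (1 - p) * LOSS[(theta, "a0")]

# Grid search for the randomization minimizing worst-case risk.
grid = [i / 10000 for i in range(10001)]
p_star = min(grid, key=lambda p: max(risk("theta0", p), risk("theta1", p)))
minimax_risk = max(risk("theta0", p_star), risk("theta1", p_star))

# The least-favorable prior equalizes the Bayes risks of the pure actions;
# here that is pi(theta0) = 3/7, and its Bayes risk matches the minimax risk.
pi0 = 3 / 7
bayes_risk = min(pi0 * LOSS[("theta0", a)] + (1 - pi0) * LOSS[("theta1", a)]
                 for a in ("a0", "a1"))
print(p_star, minimax_risk, bayes_risk)  # p* ≈ 3/7, both risks ≈ 12/7
```

In this toy case the minimax rule equalizes the risk across states (4p = 3(1 − p) at p = 3/7), and its worst-case risk 12/7 coincides with the Bayes risk under the least-favorable prior, exactly the pattern the paper's finitely additive machinery extends to fully general problems.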

This paper develops and defends a truthmaker semantics for relevant logic. Relevant logics were developed first by way of proof theory (Church 1951; Moh 1950; Orlov 1928), followed by various formal semantics (Dunn 1966; Fine 1974; Routley and Meyer 1972a,b; Urquhart 1972a,b), leaving open the issue of a suitable philosophical interpretation. Various attempts followed (e.g. Mares 1996; 2004; Restall 1995). (Restall (2004) gives a good history of relevant logics.) My approach, by contrast, is to begin with a philosophical idea and develop it in various directions, so as to build a technically adequate relevant semantics.