
Problems about the existence of converses for nonsymmetric relations go back to Russell 1903. These resurfaced in Fine 2000 and were recently rehearsed in MacBride 2014. In this paper, I focus on one problem that is described in all three works. I show how object theory (Zalta 1983, 1993; Bueno, Menzel, & Zalta 2014; Menzel & Zalta 2014) provides a solution to this problem.

Dynamic Belief Update (DBU) is a model checking problem in Dynamic Epistemic Logic (DEL) concerning the effect of applying a number of epistemic actions to an initial epistemic model. It can also be considered as a plan verification problem in epistemic planning. The problem is known to be PSPACE-hard. To better understand the source of complexity of the problem, previous research has investigated the complexity of 128 parameterized versions of the problem, with parameters such as the number of agents and the size of actions. The complexity of many parameter combinations has been determined, but previous research left a few combinations as open problems. In this paper, we solve most of the remaining open problems by proving all of them to be fixed-parameter intractable. Only two parameter combinations are still left as open problems for future research.

We claim that the various sharpenings in a supervaluationist analysis are best understood as possible worlds in a Kripke structure. It’s not just that supervaluationism wishes to assert ¬(∀n)(if a man with n hairs on his head is bald then so is a man with n + 1 hairs on his head) while refusing to assert (∃n)(a man with n hairs on his head is bald but a man with n + 1 hairs on his head is not) and that this refusal can be accomplished by a constructive logic (tho’ it can)—the point is that the obvious Kripke semantics for this endeavour has as its possible worlds precisely the sharpenings that supervaluationism postulates. Indeed the sharpenings do nothing else. The fit is too exact to be coincidence.
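The combination of assertions described above can be checked mechanically in a toy Kripke model whose worlds are exactly the sharpenings, plus an unsharpened root below them all. Everything concrete here is an invented illustration (the cutoffs 3–7, the domain 0–9, the rule that the root verifies only the definite cases); the forcing clauses are the standard intuitionistic ones, flattened using the fact that each sharpening refines only itself.

```python
# Toy Kripke model for the supervaluationist picture sketched above:
# worlds = the sharpenings (cutoffs k in 3..7) plus an unsharpened 'root'.
# All specific numbers are illustrative assumptions, not from the paper.

CUTOFFS = range(3, 8)              # sharpening k: bald(n) iff n <= k
WORLDS = ['root'] + list(CUTOFFS)  # 'root' = the unsharpened state

def above(w):
    """Worlds accessible from w in the refinement order (reflexive)."""
    return WORLDS if w == 'root' else [w]

def bald(w, n):
    """Persistent atomic valuation: at the root only definite cases hold."""
    return n <= 2 if w == 'root' else n <= w

def forces_neg(w, phi):
    """w forces ¬phi: no refinement of w forces phi."""
    return all(not phi(v) for v in above(w))

def forces_tolerance(w):
    """w forces (∀n)(bald(n) → bald(n+1))."""
    return all(bald(v, n + 1) or not bald(v, n)
               for v in above(w) for n in range(9))

def forces_counterexample(w):
    """w forces (∃n)(bald(n) ∧ ¬bald(n+1)): needs a witness at w itself."""
    return any(bald(w, n) and forces_neg(w, lambda v, m=n: bald(v, m + 1))
               for n in range(9))

# The root asserts the negated tolerance principle...
print(forces_neg('root', forces_tolerance))   # True
# ...while refusing to assert the existential counterexample:
print(forces_counterexample('root'))          # False
```

Each individual sharpening does force the existential (with its own cutoff as witness), which is exactly why the root forces the negated universal without forcing the existential itself.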

Thesis presented to the Graduate Program in Philosophy (Programa de Pós-graduação em Filosofia) of the Department of Philosophy, Faculdade de Filosofia, Letras e Ciências Humanas, Universidade de São Paulo.

This article introduces a Probabilistic Logic of Communication and Change, which captures in a unified framework subjective probability, arbitrary levels of mutual knowledge, and a mechanism for multi-agent Bayesian updates that can model complex social-epistemic scenarios, such as informational cascades. We show soundness, completeness and decidability of our logic, and apply it to a concrete example of a cascade.

Over the summer, I got interested in the problem of the priors again. Which credence functions is it rational to adopt at the beginning of your epistemic life? Which credence functions is it rational to have before you gather any evidence? …

Writing in 1948, Turing felt compelled to confront a “religious belief” that “any attempt” to construct intelligent machines was “a sort of Promethean irreverence.” And yet he has been associated by his own biographer Andrew Hodges with the image of “a Frankenstein — the proud irresponsibility of pure science, concentrated in a single person.” A reader of an 1865 version of Samuel Butler’s Darwin among the machines, Turing challenged the conventional wisdom about what machines really were or could be and prophesied a future pervaded by intelligent machines, which may be seen as a dystopia or as a utopia. The question is thus posed: what future did Turing actually envision and propose for machines? I will formulate and study the problem of identifying Turing’s specific Promethean ambition for intelligent machines. I shall suggest that Turing’s primary aim was the development of mechanistic explanations of the human mind-brain. But his secondary aim, conveyed in irony and wit, was the delivery of a social criticism of gender, race, nation and species chauvinisms. Turing’s association with Mary Shelley’s Frankenstein will be discouraged. Rather, his third aim was to send a precautionary message about the possibility of machines outstripping us in intellectual power in the future.

Bayesian epistemology has struggled with the problem of regularity: how to deal with events that in classical probability have zero probability. While the cases most discussed in the literature, such as infinite sequences of coin tosses or continuous spinners, do not actually come up in scientific practice, there are cases that do come up in science. I shall argue that these cases can be resolved without leaving the realm of classical probability, by choosing a probability measure that preserves “enough” regularity. This approach also provides a resolution to the McGrew, McGrew and Vestrup normalization problem for the fine-tuning argument.

Conservatism in choice under uncertainty means that a status quo is abandoned in favor of some alternative only if it is dominated. The standard model of conservative choice, introduced by Bewley (2002), posits multiple decision criteria and calls the status quo dominated when all criteria agree that the alternative is better than the status quo. We consider the case when multiple criteria are used to evaluate the status quo and the alternative, but cannot be used to rank them. The alternative is chosen only if it is preferable to the status quo even when the first is evaluated according to the worst-case scenario, and the second according to the best-case scenario. The resulting model is one of obvious dominance, or twofold conservatism.
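The worst-case/best-case rule described above can be sketched in a few lines. This is a minimal illustration, not the paper's formal model: the utilities are invented, and "preferable" is read as strict preference.

```python
# Sketch of the twofold-conservative rule: abandon the status quo only if
# the alternative's WORST case across the criteria strictly beats the
# status quo's BEST case. All utility numbers below are invented.

def twofold_conservative_switch(criteria, status_quo, alternative):
    """criteria: list of utility functions; True iff we abandon status quo."""
    worst_alt = min(u(alternative) for u in criteria)
    best_sq = max(u(status_quo) for u in criteria)
    return worst_alt > best_sq

# Two hypothetical criteria that disagree about how good each option is:
u1 = {'sq': 2, 'alt': 5}.get
u2 = {'sq': 3, 'alt': 4}.get
print(twofold_conservative_switch([u1, u2], 'sq', 'alt'))  # True: 4 > 3

# If even one criterion rates the status quo above the alternative's
# worst case, conservatism keeps the status quo:
u3 = {'sq': 2, 'alt': 5}.get
u4 = {'sq': 6, 'alt': 4}.get
print(twofold_conservative_switch([u3, u4], 'sq', 'alt'))  # False: 4 < 6
```

Note the contrast with Bewley-style unanimity, which would only require every criterion to rank the alternative above the status quo; the twofold rule is stricter because it never pairs up the criteria.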

Epistemic decision theory produces arguments with both normative and mathematical premises. I begin by arguing that philosophers should care about whether the mathematical premises (1) are true, (2) are strong, and (3) admit simple proofs. I then discuss a theorem that Briggs and Pettigrew (2020) use as a premise in a novel accuracy-dominance argument for conditionalization. I argue that the theorem and its proof can be improved in a number of ways. First, I present a counterexample that shows that one of the theorem’s claims is false. As a result of this, Briggs and Pettigrew’s argument for conditionalization is unsound. I go on to explore how a sound accuracy-dominance argument for conditionalization might be recovered. In the course of doing this, I prove two new theorems that correct and strengthen the result reported by Briggs and Pettigrew. I show how my results can be combined with various normative premises to produce sound arguments for conditionalization. I also show that my results can be used to support normative conclusions that are stronger than the one that Briggs and Pettigrew’s argument supports. Finally, I show that Briggs and Pettigrew’s proofs can be simplified considerably.

This is not the five-minute argument version for Randall; it’s rather the full half-hour version for Alice and Adam. Let’s abbreviate this proposition to ‘SCU’. That was Alice Vidrine’s suggestion. It’s pronounced scum or screw or perhaps skew. SCU crops up naturally in the analysis of NF from a category-theoretic perspective. Consider the (conjectural) category of small sets and small maps, where ‘small’ means ‘strongly cantorian’, and a small map is a map whose every fibre is strongly cantorian. For this gadget to be a category we need the composition of two small maps to be small, and this is equivalent to SCU.

This chapter traces the historical and conceptual development of the idea of the continuum and the allied concept of real number. Particular attention is paid to the idea of the infinitesimal, which played a key role in the development of the calculus during the 17th and 18th centuries, and which has undergone a revival in the later 20th century.

I discuss relative ignorance of an agent with respect to the knowledge or ignorance of other agents. It turns out, not surprisingly, that even the two-agent case is quite complex and generates a rich variety of naturally arising non-equivalent operators of relative ignorance. In this paper I explore these in a more systematic way and put together several simple, though technically laborious, observations about their interrelations. For the technical proofs of these I employ the software tool MOLTAP, which implements, inter alia, tableaux for the underlying multi-agent epistemic logic.

A set of players delegate playing a game to a set of representatives, one for each player. We imagine that each player trusts their respective representative’s strategic abilities. Thus, we might imagine that per default, the original players would simply instruct the representatives to play the original game as best as they can. In this paper, we ask: are there safe Pareto improvements on this default way of giving instructions? That is, we imagine that the original players can coordinate to tell their representatives to only consider some subset of the available strategies and to assign utilities to outcomes differently than the original players. Then can the original players do this in such a way that the payoff is guaranteed to be weakly higher than under the default instructions for all the original players? In particular, can they Pareto-improve without probabilistic assumptions about how the representatives play games? In this paper, we give some examples of safe Pareto improvements. We further prove that the notion of safe Pareto improvements is closely related to a notion of outcome correspondence between games. We also show that under some specific assumptions about how the representatives play games, finding safe Pareto improvements is NP-complete.
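One deliberately crude sufficient condition for a safe Pareto improvement is that every possible outcome of the restricted game weakly Pareto-dominates every possible outcome of the original game; no probabilistic assumptions about the representatives are then needed. The paper's actual notion is more general (via outcome correspondence between games); the payoff matrix below is invented purely to illustrate the idea.

```python
# Toy illustration of a safe Pareto improvement (SPI) via a crude
# sufficient condition: every outcome reachable after restricting the
# representatives weakly dominates every outcome of the original game.
# Payoffs are invented for this sketch.

# Original 2x2 game: payoffs[(row, col)] = (payoff to player 1, to player 2)
payoffs = {
    ('A', 'A'): (2, 2),
    ('A', 'B'): (0, 0),
    ('B', 'A'): (0, 0),
    ('B', 'B'): (1, 1),
}

def is_safe_pareto_improvement(restricted_rows, restricted_cols):
    restricted = [payoffs[(r, c)]
                  for r in restricted_rows for c in restricted_cols]
    original = list(payoffs.values())
    return all(ro[i] >= oo[i]
               for ro in restricted for oo in original for i in (0, 1))

# Telling both representatives "only consider A" guarantees (2, 2),
# weakly better for both players than any outcome of the default game:
print(is_safe_pareto_improvement(['A'], ['A']))  # True
# Restricting both to B only guarantees (1, 1), which can lose to (2, 2):
print(is_safe_pareto_improvement(['B'], ['B']))  # False
```

The interesting cases in the paper are precisely those where no such trivial restriction exists and the relabeling of utilities does real work.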

This paper considers the metagame of delegation. SPIs are a proposed way of playing these games. However, throughout most of this paper, we do not analyze the metagame directly as a game using the typical tools of game theory. We here fill that gap and in particular prove Theorem 4.1, which shows that SPIs are played in Nash equilibria of the metagame, assuming sufficiently strong contracting abilities. As noted, this result is essential. However, since it is mostly an application of existing ideas from the literature on program equilibrium, we have left a detailed treatment out of the main text.

Modal formulae express monadic second-order properties on Kripke frames, but in many important cases these have first-order equivalents. Computing such equivalents is important for both logical and computational reasons. On the other hand, canonicity of modal formulae is important, too, because it implies frame completeness of logics axiomatized with canonical formulae.
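A standard textbook instance of such a first-order equivalent (not taken from this paper) is the correspondence between the transitivity axiom 4 and transitivity of the accessibility relation:

```latex
% Frame validity of the 4-axiom, a monadic second-order condition
% (quantifying over all valuations of p), reduces to a first-order one:
\[
\mathfrak{F} \Vdash \Diamond\Diamond p \to \Diamond p
\quad\Longleftrightarrow\quad
\mathfrak{F} \models \forall x\,\forall y\,\forall z\,
  \bigl(Rxy \wedge Ryz \to Rxz\bigr)
\]
```

Algorithms such as SQEMA mechanize the passage from the left-hand side to the right-hand side for large syntactically defined classes of formulae.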

A (point-valued) solution for cooperative games with transferable utility, or simply TU-games, assigns a payoff vector to every TU-game. In this paper we discuss two classes of equal surplus sharing solutions. The first class consists of all convex combinations of the equal division solution (which allocates the worth of the ‘grand coalition’ consisting of all players equally over all players) and the center-of-gravity of the imputation set value (which first assigns every player its singleton worth and then allocates the remainder of the worth of the grand coalition, N, equally over all players). The second class is the dual class consisting of all convex combinations of the equal division solution and the egalitarian non-separable contribution value (which first assigns every player its contribution to the ‘grand coalition’ and then allocates the remainder equally over all players). We provide characterizations of the two classes of solutions using either population solidarity or a reduced game consistency in addition to other standard properties.
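The verbal descriptions above translate directly into formulas, sketched here on an invented 3-player game. The only assumption beyond the abstract is that the "contribution" in the second class is the standard separable contribution v(N) − v(N∖{i}).

```python
# Sketch of the solutions named above for a small TU-game: equal division
# (ED), center-of-gravity of the imputation set (CIS), egalitarian
# non-separable contribution (ENSC), and their convex combinations.
# The 3-player game v is invented for illustration.

players = (1, 2, 3)
v = {frozenset(s): w for s, w in {
    (): 0, (1,): 1, (2,): 2, (3,): 3,
    (1, 2): 4, (1, 3): 8, (2, 3): 6, (1, 2, 3): 12,
}.items()}
N = frozenset(players)
n = len(players)

def ED(i):
    """Equal division of the grand coalition's worth."""
    return v[N] / n

def CIS(i):
    """Singleton worth plus an equal share of the remainder."""
    remainder = v[N] - sum(v[frozenset({j})] for j in players)
    return v[frozenset({i})] + remainder / n

def ENSC(i):
    """Separable contribution v(N) - v(N\\{i}) plus equal remainder share."""
    sc = {j: v[N] - v[N - {j}] for j in players}
    return sc[i] + (v[N] - sum(sc.values())) / n

def convex(alpha, sol_a, sol_b, i):
    """A member of the class spanned by two solutions."""
    return alpha * sol_a(i) + (1 - alpha) * sol_b(i)

print([ED(i) for i in players])                   # [4.0, 4.0, 4.0]
print([CIS(i) for i in players])                  # [3.0, 4.0, 5.0]
print([ENSC(i) for i in players])                 # [4.0, 2.0, 6.0]
print([convex(0.5, ED, CIS, i) for i in players]) # [3.5, 4.0, 4.5]
```

Note that each solution distributes v(N) = 12 in full; the two classes differ only in whether a player's baseline claim is its singleton worth or its separable contribution.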

We apply and extend the theory and methods of algorithmic correspondence theory for modal logics, developed over the past 20 years, to the language L_R of relevance logics with respect to their standard Routley–Meyer relational semantics. We develop the non-deterministic algorithmic procedure PEARL for computing first-order equivalents of formulae of the language L_R in terms of that semantics.

This paper presents a semantic analysis of indirect speech reports. The analysis aims to explain a combination of two phenomena. Firstly, there are true utterances of sentences of the form α said that φ which are used to report an utterance u of a sentence wherein φ’s content is not u’s content. This implies that in uttering a single sentence, one can say several things. Secondly, when the complements of these reports (and indeed, these reports themselves) are placed in conjunctions, the conjunctions are typically infelicitous. I argue that this combination of phenomena can be explained if speech reports report (perhaps contingent) parts of the contents of the sentences reported.

Complexity is heterogeneous, involving nonlinearity, self-organisation, diversity, and adaptive behaviour, among other things. It is therefore obviously worth asking whether purported measures of complexity measure aggregate phenomena or individual aspects of complexity, and if so which. This paper uses a recently developed rigorous framework for understanding complexity to answer this question about measurement. The approach is twofold: find measures of individual aspects of complexity on the one hand, and explain measures of complexity on the other. We illustrate the conceptual framework of complexity science and how it links the foundations to the practised science with examples from different scientific fields and of various aspects of complexity. Furthermore, we analyse a selection of purported measures of complexity that have found wide application and explain why and how they measure aspects of complexity. This work gives the reader a tool to take any existing measure of complexity and analyse it, and to take any feature of complexity and find the right measure for it.

Given some background logical premises – that truthmaking is factive and distributes over conjunction – it logically follows that every truth has a truthmaker. The reasoning is familiar from Fitch’s paradox (Church 2009; Salerno 2009). Suppose for reductio that A is a truthmakerless truth (A ∧ ¬T A). Then, by assumption, it’s possible for this fact (A ∧ ¬T A) itself to have a truthmaker (◇T (A ∧ ¬T A)). This quickly entails a possible contradiction: that A both has and lacks a truthmaker (◇(T A ∧ ¬T A)). But no contradiction is possible and so, by reductio, every truth has a truthmaker (A → T A).
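The reasoning just rehearsed can be laid out as a short derivation, writing T for "has a truthmaker" and using the two background premises named above, factivity (Tφ → φ) and distribution over conjunction (T(φ ∧ ψ) → Tφ ∧ Tψ):

```latex
\begin{align*}
&1.\ A \wedge \neg T A
    && \text{assumption (for reductio): a truthmakerless truth}\\
&2.\ \Diamond\, T(A \wedge \neg T A)
    && \text{possibly, the fact in 1 itself has a truthmaker}\\
&3.\ \Diamond\, (T A \wedge T \neg T A)
    && \text{2, distribution over conjunction}\\
&4.\ \Diamond\, (T A \wedge \neg T A)
    && \text{3, factivity applied to the right conjunct}\\
&5.\ \bot
    && \text{4, since no contradiction is possible}\\
&6.\ A \to T A
    && \text{1--5, reductio}
\end{align*}
```

The structural parallel to Fitch's knowability argument is visible in step 2: the possibility premise plays the role that "all truths are knowable" plays in Fitch's proof.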

The main result of the present paper is that the definition of negation has to be referred to the totality of a theory and ultimately to what is defined as the organization of a scientific theory; in other words, the definition of negation is structural in kind, rather than objective or subjective in kind. The paper starts by remarking that the ancient Greek word for truth was “aletheia”, which is a double negation, i.e.


We study probabilistic logic under the viewpoint of the coherence principle of de Finetti. In detail, we explore how probabilistic reasoning under coherence is related to model-theoretic probabilistic reasoning and to default reasoning in System P. In particular, we show that the notions of g-coherence and of g-coherent entailment can be expressed by combining notions in model-theoretic probabilistic logic with concepts from default reasoning. Moreover, we show that probabilistic reasoning under coherence is a generalization of default reasoning in System P. That is, we provide a new probabilistic semantics for System P, which neither uses infinitesimal probabilities nor atomic bound (or big-stepped) probabilities. These results also provide new algorithms for probabilistic reasoning under coherence and for default reasoning in System P, and they give new insight into default reasoning with conditional objects.

We determine the modal logic of fixed-point models of truth and their axiomatizations by Solomon Feferman via Solovay-style completeness results. Given a fixed-point model M, or an axiomatization S thereof, we find a modal logic M such that a modal sentence ϕ is a theorem of M if and only if the sentence obtained from ϕ by translating the modal operator as the truth predicate is true in M, or a theorem of S, under all such translations. To this end, we introduce a novel version of possible worlds semantics featuring both classical and nonclassical worlds and establish the completeness of a family of noncongruent modal logics whose internal logic is nonclassical with respect to this semantics.

In Part I of this paper (Ketland in Logica Universalis 14:357–381, 2020), I assumed we begin with a (relational) signature P = {Pi} and the corresponding language LP, and introduced the following notions: a definition system dΦ for a set of new predicate symbols Qi, given by a set Φ = {φi} of defining LP-formulas (these definitions have the form: ∀x(Qi(x) ↔ φi)); a corresponding translation function τΦ : LQ → LP; the corresponding definitional image operator DΦ, applicable to LP-structures and LP-theories; and the notion of definitional equivalence itself: for structures, A + dΦ ≡ B + dΘ; for theories, T1 + dΦ ≡ T2 + dΘ. Some results relating these notions were given, ending with two characterizations for definitional equivalence. In this second part, we explain the notion of a representation basis. Suppose a set Φ = {φi} of LP-formulas is given, and Θ = {θi} is a set of LQ-formulas. Then the original set Φ is called a representation basis for an LP-structure A with inverse Θ iff an inverse explicit definition ∀x(Pi(x) ↔ θi) is true in A + dΦ, for each Pi. Similarly, the set Φ is called a representation basis for an LP-theory T with inverse Θ iff each explicit definition ∀x(Pi(x) ↔ θi) is provable in T + dΦ.

This article provides a computational example of a mathematical explanation within science, concerning computational equivalence of programs. In addition, it outlines the logical structure of the reasoning involved in explanations in applied mathematics. It concludes with a challenge that the nominalist provide a nominalistic explanation for the computational equivalence of certain programs.
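The paper's own example programs are not reproduced here, but the general kind of fact at issue can be illustrated generically: two syntactically very different programs whose computational equivalence is explained by a piece of mathematics, namely the identity 1 + 2 + … + n = n(n+1)/2.

```python
# A generic illustration (not the paper's example) of a mathematically
# explained computational equivalence: an iterative program and a
# closed-form program that provably compute the same function.

def sum_by_loop(n):
    """Sum 1 + 2 + ... + n by iteration."""
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

def sum_by_formula(n):
    """Sum 1 + 2 + ... + n via Gauss's identity n(n+1)/2."""
    return n * (n + 1) // 2

# The equivalence holds for every input, and the *explanation* of why it
# holds is the arithmetic identity, not any fact about the two programs'
# executions:
print(all(sum_by_loop(n) == sum_by_formula(n) for n in range(100)))  # True
```

The nominalist challenge is then to account for such equivalences without appealing to the mathematical identity itself.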

Sometimes structures or theories are formulated with different sets of primitives and yet are definitionally equivalent. In a sense, the transformations between such equivalent formulations are rather like basis transformations in linear algebra or coordinate transformations in geometry. Here an analogous idea is investigated. Let a relational signature P = {Pi}i∈IP be given. For a set Φ = {φi}i∈IΦ of LP-formulas, we introduce a corresponding set Q = {Qi}i∈IΦ of new relation symbols and a set of explicit definitions of the Qi in terms of the φi. This is called a definition system, denoted dΦ. A definition system dΦ determines a translation function τΦ : LQ → LP. Any LP-structure A can be uniquely definitionally expanded to a model of dΦ, called A + dΦ.
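A tiny concrete instance may help fix ideas. Everything specific below is invented for illustration: a one-relation signature P = {<}, a four-element structure, and a single defining formula for one new symbol Q.

```python
# A small concrete instance of a definition system d_Phi: signature
# P = {<}, one new unary symbol Q defined by
#   forall x ( Q(x) <-> exists y ( y < x ) ),
# the definitional expansion A + d_Phi, and a check that the translation
# tau_Phi preserves truth. Structure and formula are invented.

domain = {0, 1, 2, 3}
less_than = {(x, y) for x in domain for y in domain if x < y}  # the P-structure A

def phi(x):
    """The defining L_P-formula: exists y ( y < x ), evaluated in A."""
    return any((y, x) in less_than for y in domain)

# The unique definitional expansion A + d_Phi interprets Q by phi:
Q = {x for x in domain if phi(x)}
print(sorted(Q))  # [1, 2, 3]

# tau_Phi rewrites L_Q-formulas into L_P by unfolding the definition:
# the L_Q-sentence "exists x (not Q(x))" becomes
# "exists x (not exists y (y < x))", and the two agree:
holds_in_expansion = any(x not in Q for x in domain)   # evaluated in A + d_Phi
holds_translation = any(not phi(x) for x in domain)    # evaluated in A
print(holds_in_expansion == holds_translation)         # True
```

The agreement of the two checks at the end is the finite shadow of the general fact that τΦ is truth-preserving between A + dΦ and A.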

Humans are imperfect reasoners. In particular, humans are imperfect mathematical reasoners. They are fallible, with a nonzero probability of making a mistake in any step of their reasoning. This means that there is a nonzero probability that any conclusion that they come to is mistaken. This is true no matter how convinced they are of that conclusion. Even brilliant mathematicians behave in this way; Poincaré wrote that he was “absolutely incapable of adding without mistakes” (1910, p. 323).

Relational semantics for nonclassical logics lead straightforwardly to topological representation theorems of their algebras. Ortholattices and De Morgan lattices are reducts of the algebras of various nonclassical logics. We define three new classes of topological spaces so that the lattice categories and the corresponding categories of topological spaces turn out to be dually isomorphic. A key feature of all these topological spaces is that they are ordered relational or ordered product topologies.