
Angell’s logic of analytic containment AC has been shown to be characterized by a 9-valued matrix NC by Ferguson, and by a 16-valued matrix by Fine. We show that the former is the image of a surjective homomorphism from the latter, i.e., an epimorphic image. The epimorphism was found with the help of MUltlog, which also provides a tableau calculus for NC extended by quantifiers that generalize conjunction and disjunction.
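To illustrate the algebraic notion at work, here is a minimal sketch of verifying that a map between two finite logical matrices is a surjective homomorphism. The algebras and the map are hypothetical toy examples (a four-element algebra of bit pairs projected onto the two-element Boolean algebra), not Fine's 16-valued or Ferguson's 9-valued matrix.

```python
from itertools import product

# Toy carriers: bit pairs with componentwise "and", and the Booleans.
big = list(product([0, 1], repeat=2))   # carrier of the larger algebra
small = [0, 1]                          # carrier of the smaller algebra

def conj_big(a, b):
    """Conjunction in the larger algebra (componentwise)."""
    return (a[0] & b[0], a[1] & b[1])

def conj_small(a, b):
    """Conjunction in the smaller algebra."""
    return a & b

# Candidate epimorphism: first projection.
h = {a: a[0] for a in big}

# Surjectivity: every element of the small carrier is hit.
assert set(h.values()) == set(small)

# Homomorphism condition: h(a ∧ b) = h(a) ∧ h(b) for all a, b.
assert all(h[conj_big(a, b)] == conj_small(h[a], h[b])
           for a in big for b in big)
print("h is a surjective homomorphism")
```

For real matrices one would check every operation of the similarity type this way; a brute-force search over all candidate maps is how a tool can discover such an epimorphism.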

Some find it plausible that a sufficiently long duration of torture is worse than any duration of mild headaches. Similarly, it has been claimed that a million humans living great lives is better than any number of wormlike creatures feeling a few seconds of pleasure each. Some have related bad things to good things along the same lines. For example, one may hold that a future in which a sufficient number of beings experience a lifetime of torture is bad, regardless of what else that future contains, while minor bad things, such as slight unpleasantness, can always be counterbalanced by enough good things. Among the most common objections to such ideas are sequence arguments. But sequence arguments are usually formulated in classical logic. One might therefore wonder if they work if we instead adopt many-valued logic. I show that, in a common many-valued logical framework, the answer depends on which versions of transitivity are used as premises. We get valid sequence arguments if we grant any of several strong forms of transitivity of ‘is at least as bad as’ and a notion of completeness. Other, weaker forms of transitivity lead to invalid sequence arguments. The plausibility of the premises is largely set aside here, but I tentatively note that almost all of the forms of transitivity that result in valid sequence arguments seem intuitively problematic. Still, a few moderately strong forms of transitivity that might be acceptable result in valid sequence arguments, although weaker statements of the initial value claims avoid these arguments at least to some extent.

On the basis of a wide range of historical examples, various features of axioms are discussed in relation to their use in mathematical practice. A very general framework for this discussion is provided, and it is argued that axioms can play many roles in mathematics and that viewing them as self-evident truths does not do justice to the ways in which mathematicians employ axioms. Possible origins of axioms and criteria for choosing axioms are also examined. The distinctions introduced aim at clarifying discussions in philosophy of mathematics and contributing towards a more refined view of mathematical practice.

The design of good notation is a cause that was dear to Charles Babbage’s heart throughout his career. He was convinced of the “immense power of signs” (1864, 364), both to rigorously express complex ideas and to facilitate the discovery of new ones. As a young man, he promoted the Leibnizian notation for the calculus in England, and later he developed a Mechanical Notation for designing his computational engines. In addition, he reflected on the principles that underlie the design of good mathematical notations. In this paper, we discuss these reflections, which can be found somewhat scattered in Babbage’s writings, for the first time in a systematic way. Babbage’s desiderata for mathematical notations are presented as ten guidelines pertinent to notational design and its application to both individual symbols and complex expressions. To illustrate the applicability of these guidelines in non-mathematical domains, some aspects of his Mechanical Notation are also discussed.

Mathematical pluralism can take one of three forms: (1) every consistent mathematical theory is about its own domain of individuals and relations; (2) every mathematical theory, consistent or inconsistent, is about its own (possibly uninteresting) domain of individuals and relations; and (3) many of the principal philosophies of mathematics are based upon some insight or truth about the nature of mathematics that can be preserved. (1) includes the multiverse approach to set theory. (2) helps us to understand the significance of the distinguished nonlogical individual and relation terms of even inconsistent theories. (3) is a metaphilosophical form of mathematical pluralism and hasn’t been discussed in the literature. In what follows, I show how the analysis of theoretical mathematics in object theory exhibits all three forms of mathematical pluralism.

The paper explores Hermann Weyl’s turn to intuitionism through a philosophical prism of normative framework transitions. It focuses on three central themes that occupied Weyl’s thought: the notion of the continuum, logical existence, and the necessity of intuitionism, constructivism, and formalism to adequately address the foundational crisis of mathematics. The analysis of these themes reveals Weyl’s continuous endeavor to deal with such fundamental problems and suggests a view that provides a different perspective concerning Weyl’s wavering foundational positions. Building on a philosophical model of scientific framework transitions and the special role that normative indecision or ambivalence plays in the process, the paper examines Weyl’s motives for considering such a radical shift in the first place. It concludes by showing that Weyl’s shifting stances should be regarded as symptoms of a deep, convoluted intrapersonal process of self-deliberation induced by exposure to external criticism.

Multiplayer mean-payoff games are a natural formalism to model concurrent and multi-agent systems with self-interested players. Players in such a game traverse a graph, while trying to maximise a mean-payoff function that depends on the plays so generated. As with all games, the equilibria that could arise may have undesirable properties. However, as system designers, we typically wish to ensure that equilibria in such systems correspond to desirable system behaviours, for example, satisfying certain safety or liveness properties. One natural way to do this would be to specify such desirable properties using temporal logic. Unfortunately, the use of temporal logic specifications causes game theoretic verification problems to have very high computational complexity. To this end, we consider ω-regular specifications, which offer a concise and intuitive way of specifying desirable behaviours of a system. The main results of this work are characterisation and complexity bounds for the problem of determining if there are equilibria that satisfy a given ω-regular specification in a multiplayer mean-payoff game in a number of computationally relevant game-theoretic settings.
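The mean-payoff objective mentioned above can be made concrete with a small sketch. For an ultimately periodic play (a finite prefix followed by a cycle repeated forever), the limit-average payoff is simply the average edge weight on the cycle; the prefix does not affect the limit. The function name and play encoding here are illustrative, not from the paper.

```python
def mean_payoff(prefix, cycle):
    """Limit-average payoff of the play prefix · cycleᵚ.

    prefix, cycle: lists of numeric edge weights; the finite prefix
    vanishes in the limit, so only the cycle average matters.
    """
    return sum(cycle) / len(cycle)

# A play that earns 0 twice, then loops forever through weights 3 and 1:
# its mean payoff is (3 + 1) / 2 = 2.0, regardless of the prefix.
assert mean_payoff([0, 0], [3, 1]) == 2.0
```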

By means of any one of the relations just listed, the other three relations can be defined. Here, following Tarski, we shall take both identity and as primitive. Accordingly, the other three relations may be defined as follows.

Landauer’s principle is, roughly, the principle that logically irreversible operations cannot be performed without dissipation of energy, with a specified lower bound on that dissipation. Though widely accepted in the literature on the thermodynamics of computation, it has been the subject of considerable dispute in the philosophical literature. Proofs of the principle have been questioned on the grounds of insufficient generality and on the grounds of the assumption, used in the proofs, of the availability of reversible processes at the microscale. The relevance of the principle, should it be true, has also been questioned, as it has been argued that microscale fluctuations entail dissipation that always greatly exceeds the Landauer bound. In this article Landauer’s principle is treated within statistical mechanics, and a proof of the principle is given that neither relies on neglect of fluctuations nor assumes the availability of thermodynamically reversible processes. In addition, it is argued that microscale fluctuations are no obstacle to approximating thermodynamic reversibility, in the appropriate sense, as closely as one would like.
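The "specified lower bound" referred to above is the standard Landauer bound on the energy dissipated in erasing one bit:

```latex
E_{\text{diss}} \;\ge\; k_B T \ln 2
```

where $k_B$ is Boltzmann's constant and $T$ is the temperature of the environment; at room temperature ($T \approx 300\,\mathrm{K}$) this comes to roughly $3 \times 10^{-21}$ joules per bit erased.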

This study takes a careful inferentialist look at Graham Priest’s Logic of Paradox (LP). I conclude that it is sorely in need of a proof-system that could furnish formal proofs that would regiment faithfully the ‘naïve logical’ reasoning that could be undertaken by a rational thinker within LP (if indeed such reasoning could ever take place).

This is in part a reply to a recent work of Vidal-Rosset, which expresses various mistaken beliefs about Core Logic. Rebutting these leads us further to identify, and argue against, some mistaken core beliefs about logic. In his recent work titled “Why Intuitionistic Relevant Logic Cannot Be a Core Logic,” Joseph Vidal-Rosset [18] raises some objections to that logical system. Here these objections are rebutted (in Section ) by showing that they rest on some mistaken beliefs about the system. But because these mistaken beliefs derive from some mistaken core beliefs about logic tout court, some space will also be devoted (in Section 2) to identifying and refuting the latter. The reader needs to be alerted, at the outset, to the fact that the system IR of intuitionistic relevant logic was renamed Core Logic. The reasons for this will be explained below. Vidal-Rosset, however, uses the phrase “core logic” as a common noun, without explaining the criteria one ought to apply in order to tell whether a given system is a core logic.

Our regimentation of Goodman and Myhill’s proof of Excluded Middle revealed among its premises a form of Choice and an instance of Separation. Here we revisit Zermelo’s requirement that the separating property be definite. The instance that Goodman and Myhill used is not constructively warranted. It is that principle, and not Choice alone, that precipitates Excluded Middle.

We explore the problems that confront any attempt to explain or explicate exactly what a primitive logical rule of inference is, or consists in. We arrive at a proposed solution that places a surprisingly heavy load on the prospect of being able to understand and deal with specifications of rules that are essentially self-referring. That is, any rule ñ is to be understood via a specification that involves, embedded within it, reference to rule ñ itself. Just how we arrive at this position is explained by reference to familiar rules as well as less familiar ones with unusual features. An inquiry of this kind is surprisingly absent from the foundations of inferentialism: the view that meanings of expressions (especially logical ones) are to be characterized by the rules of inference that govern them.

The one-page 1978 informal proof of Goodman and Myhill is regimented in a weak constructive set theory in free logic. The decidability of identities in general (a=b∨¬a=b) is derived; then, of sentences in general (ψ∨¬ψ). Martin-Löf’s and Bell’s receptions of the latter result are discussed. Regimentation reveals the form of Choice used in deriving Excluded Middle. It also reveals an abstraction principle that the proof employs. It will be argued that the Goodman–Myhill result does not provide the constructive set theorist with a dispositive reason for not adopting (full) Choice.

This paper clarifies, revises, and extends the account of the transmission of truthmakers by core proofs that was set out in chap. 9 of Tennant (2017). Brauer provided two kinds of example making clear the need for this. Unlike Brouwer’s counterexamples to excluded middle, the examples of Brauer that we are dealing with here establish the need for appeals to excluded middle when applying, to the problem of truthmaker transmission, the already classical metalinguistic theory of model-relative evaluations.

In the inferential semantics presented in Tennant [2010] and Tennant [forthcoming], simple inferences determine the truth value of a sentence in a model. They allow one to define coinductively the notions ‘V is a verification of ϕ in the model M’ and ‘F is a falsification of ϕ in the model M’. Such evaluations explicate the different ways that a sentence is true, or false, in M. They explicate the structure involved in ϕ’s being true under a given interpretation M. These evaluations employ facts relevantly to determine truth-value. They can be infinitary if the domain is infinite. Verifications and falsifications are relevantly from, or relative to, a set of literals expressing some of the atomic information in the model.

The problem of theory choice and model selection is hard but still important when useful truths are underdetermined, perhaps not by all kinds of data but by the kinds of data we can have access to ethically or practicably—even if we have an infinity of such data. This article addresses a crucial instance of that problem: the problem of inferring causal structures from non-experimental, non-temporal data without assuming the so-called causal faithfulness condition or the like. A new account of epistemic evaluation is developed to solve that problem and justify a standard practice of causal inference in data science.

A recurring narrative in the literature on conditionals is that the empirical facts about negated ifs provide compelling evidence for the principle of Conditional Excluded Middle and sit uncomfortably with a large family of analyses of conditionals as universal quantifiers over possible worlds. I show that both parts of the narrative are in need of a rewrite. I do so by articulating an innovative conditional analysis in a bilateral semantic setting that takes inspiration from the Ramsey test for conditionals but distinguishes the classical Ramseyan question of what it takes to accept a conditional from the one of what it takes to reject a conditional. The resulting framework disentangles the empirical facts about negated conditionals from the validity of Conditional Excluded Middle but also shows how the principle can live happily in a strict analysis of conditionals, and in fact how it can coexist with other non-classical principles such as Simplification of Disjunctive Antecedents without negative side effects.

Invariance is one of several dimensions of causal relationships within the interventionist account. The more invariant a relationship between two variables, the more the relationship should be considered paradigmatically causal. In this paper, I propose two formal measures to estimate invariance, illustrated by a simple example. I then discuss the notion of invariance for causal relationships between non-nominal (i.e., ordinal and quantitative) variables, for which information theory, and hence the formalism proposed here, is not well suited. Finally, I propose how invariance could be qualified for such variables.
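As a hedged illustration of the kind of information-theoretic quantity such a project might use (this is not one of the paper's actual measures), one natural proxy for invariance is the conditional mutual information I(Y; Z | X): how much a background context Z still changes the effect Y once the cause X is fixed. A value of zero means the X–Y relationship holds invariantly across contexts.

```python
from collections import Counter
from math import log2

def cond_mutual_info(triples):
    """Estimate I(Y; Z | X) in bits from a list of (x, y, z) samples."""
    n = len(triples)
    pxyz = Counter(triples)
    pxy = Counter((x, y) for x, y, z in triples)
    pxz = Counter((x, z) for x, y, z in triples)
    px = Counter(x for x, y, z in triples)
    total = 0.0
    for (x, y, z), c in pxyz.items():
        # p(x,y,z) * log[ p(x) p(x,y,z) / (p(x,y) p(x,z)) ], counts cancel n
        total += (c / n) * log2((c * px[x]) / (pxy[(x, y)] * pxz[(x, z)]))
    return total

# Y copies X regardless of context Z: fully invariant, so I(Y; Z | X) = 0.
data = [(x, x, z) for x in (0, 1) for z in (0, 1) for _ in range(5)]
assert abs(cond_mutual_info(data)) < 1e-9
```

A context-sensitive relationship (Y tracking Z even with X fixed) would instead yield a strictly positive value.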

In this paper I examine the phenomenon of informal proofs as found in mathematical practice and the difficulties these face concerning rigour and correctness. I focus on one particular type of response, which I call derivationist, which seeks to explain these in terms of underlying formal derivations. I proceed to set out five desiderata that the derivationist approach should aim to satisfy. With particular emphasis on Azzouni’s derivation-indicator account, I raise a dilemma for the type of link that must be posited from informal proofs to formal derivations: that it must either be agent-independent or else agent-dependent. I show that derivationist theories want to take the first horn, but that considerations of proof identity, uniqueness and informal content determining formal structure are serious obstacles in that direction. I further argue that the other horn is incompatible with the original motivations of the derivationists. Thus I conclude that the desiderata for a derivationist theory cannot be satisfied.

One often hears of a radical transformation in our conception of logic ushered in by the work of Gottlob Frege, a transformation away from thinking of logic in terms of the theory of syllogism and towards the axiomatic propositional and predicate calculi, later pursued by Russell, Tarski, Heyting, Łukasiewicz, and others. Under this transformation, the assorted patterns of syllogistic inference, which for centuries had been thought of as primitive inference schemes from which the cogency of rational discourse could be evaluated, became objects of study and analysis from a more fundamental and more fine-grained vantage point.

Since Sun-Joo Shin’s groundbreaking study (2002), Peirce’s existential graphs have attracted much attention as a way of writing logic that seems profoundly different from our usual logical calculi. In particular, Shin argued that existential graphs enjoy a distinctive property that marks them out as “diagrammatic”: they are “multiply readable,” in the sense that there are several different, equally legitimate ways to translate one and the same graph into a standard logical language. Stenning (2000) and Bellucci and Pietarinen (2016) have retorted that similar phenomena of multiple readability can arise for sentential notations as well. Focusing on the simplest kinds of existential graphs, called alpha graphs (AGs), this paper argues that multiple readability does point to important features of AGs, but that both Shin and her critics have misdiagnosed its source.

The present paper is a sequel to [16]. A class of implicative expansions of Kleene’s 3-valued logic functionally including Łukasiewicz’s logic Ł3 is defined. Several properties of this class and/or some of its subclasses are investigated. Properties contemplated include functional completeness for the 3-element set of truth-values, presence of natural conditionals, the variable-sharing property (vsp) and vsp-related properties.

In this paper I shall consider two related avenues of argument that have been used to make the case for the inconsistency of mathematics: firstly, Gödel’s paradox, which leads to a contradiction within mathematics, and, secondly, the incompatibility of completeness and consistency established by Gödel’s incompleteness theorems. By bringing in considerations from the philosophy of mathematical practice on informal proofs, I suggest that we should add to the two axes of completeness and consistency a third axis of formality and informality. I use this perspective to respond to the arguments for the inconsistency of mathematics made by Beall and Priest, presenting problems with the assumptions needed concerning formalisation, the unity of informal mathematics and the relation between the formal and informal.

A standard concern in scientific practice is to prevent erroneous estimates of causal effects produced by "confounding" of the association of an experimental treatment and an outcome variable by other variables, usually called "covariates." Recently, Fuller (2021) has argued that one strategy for avoiding confounding ("balancing" of covariates between treated and untreated groups) is unnecessary and insufficient for correctly estimating a causal relation between variables in randomized controlled trials (RCTs). His example is a fully deterministic system in which all causes of an outcome variable, except possibly the treatment variable, are known, a circumstance far different from that of most RCTs. The joint distribution of the measured variables is not available in his causal story, which is just made up without regard to any principle connecting causal relations and joint probabilities. Fuller remarks that epidemiologists (one at least) say they are not concerned about balancing but that some philosophers are. He does not explain why epidemiologists or others think balancing is not of concern for causal estimation. His discussion suggests that a philosophical exposition of well-understood aspects of confounding and its attempted remedies is needed in settings that are statistically and epistemologically more realistic.

In terms of generating discussion, few articles in the philosophy of physics can parallel Earman and Norton’s (1987) article on the “hole argument” in the General Theory of Relativity. In short, by the 1970s, spacetime substantivalism had come into vogue, but Earman and Norton argued that a substantivalist must be committed to a pernicious form of indeterminism. Their argument seems to cleverly exploit the diffeomorphism freedom of GTR, a mathematical subtlety that had tripped up even Einstein himself. Here we attempt to answer the following question: what is the mathematical fact or facts on which the hole argument is supposed to be based? We identify two mathematical claims that might be relevant. The first of these mathematical claims is trivially true — as pointed out by Weatherall (2018) — and so does not underwrite any metaphysically interesting conclusions. While we agree with Weatherall’s point, we suggest that others may have confused the trivial mathematical fact he identifies with another mathematical claim, which, if true, would have profound consequences for the interpretation of GTR. We prove here that this second mathematical claim is false, and we conclude that there is no basis for the hole argument.

● Here are some fundamental philosophical questions with mathematical answers:
○ (1) Is there a (recursive) algorithm for deciding whether an arbitrary sentence in the language of first-order arithmetic is true?
○ (2) Is there an algorithm for deciding whether an arbitrary sentence in the language of first-order arithmetic is a theorem of Peano or Robinson Arithmetic?
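For reference (the passage poses the questions without stating their answers), both questions have well-known negative answers, which can be recorded as follows:

```latex
\begin{itemize}
  \item[(1)] No. By Tarski's undefinability theorem, the set of true
    first-order arithmetical sentences is not arithmetically definable,
    so a fortiori there is no decision algorithm for it.
  \item[(2)] No. By Church's theorem, theoremhood in Robinson arithmetic
    (and hence in Peano arithmetic) is undecidable, although the set of
    theorems is recursively enumerable.
\end{itemize}
```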

This paper proposes a way of doing type theory informally, assuming a cubical style of reasoning. It can thus be viewed as a first step toward a cubical alternative to the program of informalization of type theory carried out in the homotopy type theory book for dependent type theory augmented with axioms for univalence and higher inductive types. We adopt a cartesian cubical type theory proposed by Angiuli, Brunerie, Coquand, Favonia, Harper, and Licata as the implicit foundation, confining our presentation to elementary results such as function extensionality, the derivation of weak connections and path induction, the groupoid structure of types, and the Eckmann–Hilton duality.

In this paper, we aim to explore connections between a Carnapian semantics of theoretical terms and an eliminative structuralist approach in the philosophy of mathematics. Specifically, we will interpret the language of Peano arithmetic by applying the modal semantics of theoretical terms introduced in Andreas (Synthese 174(3):367–383, 2010). We will thereby show that the application to Peano arithmetic yields a formal semantics of universal structuralism, i.e., the view that ordinary mathematical statements in arithmetic express general claims about all admissible interpretations of the Peano axioms. Moreover, we compare this application with the modal structuralism by Hellman (Mathematics without numbers: towards a modal-structural interpretation).

In a series of papers Ladyman and Presnell raise an interesting challenge of providing a pre-mathematical justification for homotopy type theory. In response, they propose what they claim to be an informal semantics for homotopy type theory where types and terms are regarded as mathematical concepts. The aim of this paper is to raise some issues which need to be resolved for the successful development of their types-as-concepts interpretation.