1.
    As usually presented, octagons of opposition are rather complex objects and can be difficult to assimilate at a glance. We show how, under suitable conditions that are satisfied by most historical examples, different display conventions can simplify the diagrams, making them easier for readers to grasp without loss of information. Moreover, those conditions help reveal the conceptual structure behind the visual display.
    Found 1 day, 6 hours ago on David Makinson's site
  2.
    We extend a result by Gallow concerning the impossibility of following two epistemic masters, so that it covers a larger class of pooling methods. We also investigate a few ways of avoiding the issue, such as using nonconvex pooling methods, employing the notion of imperfect trust, or moving to higher-order probability spaces. Along the way we point out a conceptual issue with the conditions used by Gallow: whenever two experts are considered, whether we can trust one of them is decided by the features of the other!
    Found 2 days, 17 hours ago on PhilSci Archive
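    To make the notion of a pooling method concrete, here is a minimal Python sketch (an editorial illustration, not taken from the paper) of convex linear pooling, the simplest member of the class at issue; the propositions and the weight are invented for the example.

        def linear_pool(p1, p2, w):
            # Pointwise convex combination w*p1 + (1 - w)*p2 of two credence
            # functions given as dicts over the same propositions, with 0 <= w <= 1.
            return {a: w * p1[a] + (1 - w) * p2[a] for a in p1}

        expert1 = {"rain": 0.8, "no rain": 0.2}
        expert2 = {"rain": 0.3, "no rain": 0.7}
        print(linear_pool(expert1, expert2, 0.5))  # {'rain': 0.55, 'no rain': 0.45}

    Nonconvex pooling methods, one of the escape routes mentioned above, are roughly those that do not take this weighted-average form.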
  3.
    The simulation hypothesis has recently excited renewed interest, especially in the physics and philosophy communities. However, the hypothesis specifically concerns computers that simulate physical universes, which means that to properly investigate it we need to couple computer science theory with physics. Here I do this by exploiting the physical Church-Turing thesis. This allows me to introduce a preliminary investigation of some of the computer science theoretic aspects of the simulation hypothesis. In particular, building on Kleene’s second recursion theorem, I prove that it is mathematically possible for us to be in a simulation that is being run on a computer by us. In such a case, there would be two identical instances of us; the question of which of those is “really us” is meaningless. I also show how Rice’s theorem provides some interesting impossibility results concerning simulation and self-simulation; briefly describe the philosophical implications of fully homomorphic encryption for (self-)simulation; briefly investigate the graphical structure of universes simulating universes simulating universes, among other issues. I end by describing some of the possible avenues for future research that this preliminary investigation reveals.
    Found 1 week ago on PhilSci Archive
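    The self-simulation result above rests on Kleene’s second recursion theorem, which guarantees that a program can be constructed with access to its own description. A minimal Python illustration of that fixed-point trick (editorial, not from the paper; it merely rebuilds its own source text rather than running any physics simulation):

        # The three lines below reconstruct their own source text in `me`; a program
        # holding `me` could, in principle, feed it to an interpreter and thereby
        # run a copy of itself -- the schematic core of self-simulation.
        template = 'template = %r\nme = template %% template\nprint(len(me), "characters of my own source")'
        me = template % template
        print(len(me), "characters of my own source")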
  4.
    Within the context of general relativity, Leibnizian metaphysics seems to demand that worlds are “maximal” with respect to a variety of spacetime properties (Geroch 1970; Earman 1995). Here, we explore maximal worlds with respect to the “Heraclitus” asymmetry property, which demands that no two spacetime events have the same structure (Manchak and Barrett 2023). First, we show that Heraclitus-maximal worlds exist and that every Heraclitus world is contained in some Heraclitus-maximal world. This amounts to a type of compatibility between the Leibnizian and Heraclitian demands. Next, we consider the notion of “observationally indistinguishable” worlds (Glymour 1972, 1977; Malament 1977). We know that, modulo modest assumptions, any world is observationally indistinguishable from some other (non-isomorphic) world (Manchak 2009). But here we show a way out of this general epistemic predicament: if attention is restricted to Heraclitus-maximal worlds, then worlds are observationally indistinguishable if and only if they are isomorphic. Finally, we show a sense in which cosmic underdetermination can still arise for individual observers even if the Leibnizian and Heraclitian demands are met.
    Found 1 week, 1 day ago on PhilSci Archive
  5.
    This paper examines different kinds of definite descriptions denoting purely contingent, necessary, or impossible objects. The discourse about contingent/impossible/necessary objects can be organised in terms of the questions it is rational to ask and answer relative to the modal profile of the entity in question. There are also limits on what it is rational to know about entities with this or that modal profile. We will also examine epistemic modalities; they are the kind of necessity and possibility that is determined by epistemic constraints related to knowledge or rationality. Definite descriptions denote so-called offices, roles, or things to be. We explicate these α-offices as partial functions from possible worlds to chronologies of objects of type α, where α is mostly the type of individuals. Our starting point is Prior’s distinction between a ‘weak’ and a ‘strong’ definite article ‘the’. In both cases, the definite description refers to at most one object; yet, in the case of the weak ‘the’, the referred object can change over time, while in the case of the strong ‘the’, the object referred to by the definite description is the same forever, once the office has been occupied. The main result we present is how to obtain Wh-knowledge about who or what plays a given role presented by a hyper-office, i.e. a procedure producing an office. Another, no less important, result concerns the epistemic necessity of the impossibility of knowing who or what occupies the impossible office presented by a hyper-office.
    Found 1 week, 3 days ago on PhilSci Archive
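    As a rough computational gloss on the apparatus described above (an editorial Python sketch, not the paper’s type-theoretic formalism; the worlds, times, and individuals are invented), an office can be modelled as a partial function from possible worlds to chronologies, i.e. time-indexed occupants. The example also illustrates the weak ‘the’: the occupant may change over time.

        from typing import Callable, Optional

        World, Time, Individual = str, int, str
        Chronology = Callable[[Time], Optional[Individual]]  # partial: may return None
        Office = Callable[[World], Optional[Chronology]]     # partial: may return None

        def toy_office(w: World) -> Optional[Chronology]:
            # A toy office: who occupies it depends on the world and on the time.
            history = {("w1", 2000): "alice", ("w1", 2010): "bob"}
            return lambda t: history.get((w, t))

        occupant = toy_office("w1")
        print(occupant(2000), occupant(2010), occupant(1990))  # alice bob None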
  6.
    First, we suggest and discuss second-order versions of properties for solutions for TU games used to characterize the Banzhaf value, in particular, of standardness for two-player games, of the dummy player property, and of 2-efficiency. Then, we provide a number of characterizations of the Banzhaf value invoking the following properties: (i) [second-order standardness for two-player games or the second-order dummy player property] and 2-efficiency, (ii) standardness for one-player games, standardness for two-player games, and second-order 2-efficiency, (iii) standardness for one-player games, [second-order standardness for two-player games or the second-order dummy player property], and second-order 2-efficiency. These characterizations also work within the classes of simple games, of superadditive games, and of simple superadditive games.
    Found 1 week, 3 days ago on André Casajus's site
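    For readers unfamiliar with the solution concept being characterized above, the following Python sketch (an editorial illustration, not from the paper) computes the standard raw Banzhaf value as a player’s average marginal contribution over all coalitions not containing that player; the three-player majority game is an invented example.

        from itertools import chain, combinations

        def banzhaf_value(players, v):
            # v maps frozensets of players to worths (a TU game with v(empty set) = 0).
            n = len(players)
            values = {}
            for i in players:
                others = [p for p in players if p != i]
                coalitions = chain.from_iterable(
                    combinations(others, k) for k in range(len(others) + 1))
                total = sum(v(frozenset(S) | {i}) - v(frozenset(S)) for S in coalitions)
                values[i] = total / 2 ** (n - 1)  # average over the 2^(n-1) coalitions
            return values

        # Three-player simple majority game: a coalition wins iff it has at least two members.
        majority = lambda S: 1.0 if len(S) >= 2 else 0.0
        print(banzhaf_value([1, 2, 3], majority))  # {1: 0.5, 2: 0.5, 3: 0.5}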
  7.
    Reconstructions of quantum theory are a novel research program in theoretical physics which aims to uncover the unique physical features of quantum theory via axiomatization. I focus on Hardy’s “Quantum Theory from Five Reasonable Axioms” (2001), arguing that reconstructions represent a modern usage of axiomatization with significant points of continuity with von Neumann’s axiomatizations in quantum mechanics. In particular, I show that Hardy and von Neumann share a similar methodological ordering, have a common operational framing, and insist on the empirical basis of axioms. In the reconstruction programme, interesting points of discontinuity with historical axiomatizations include the stipulation of a generalized space of theories represented by a framework and the stipulation of analytic machinery at two levels of generality (first by establishing a generalized mathematical framework and then by positing specific formulations of axioms). In light of the reconstruction programme, I show that we should understand axiomatization attempts as context-dependent, where the context is contingent upon the goals of inquiry and the maturity of both the mathematical formalism and the theoretical underpinnings within the area of inquiry. Drawing on Mitsch’s (2022) account of axiomatization, I conclude that reconstructions are best understood as provisional, practical representations of quantum theory that are well suited for theory development and exploration. However, I propose my context-dependent re-framing of axiomatization as a means of enriching Mitsch’s account.
    Found 2 weeks, 1 day ago on PhilSci Archive
  8.
    A drawback of the standard modal ontological proof is that it assumes that it is possible that there is something godlike. Kurt Gödel’s ontological proof seeks to establish this possibility with the help of certain axiological principles. But the axiological principles he relies on are not very plausible. And the same goes for other Gödelian ontological proofs in the literature. In this paper, I put forward a Gödelian ontological proof that only relies on plausible axiological principles. And I adapt the proof for both constant and variable domains. Nevertheless, the proof still needs the axiom that being godlike is positive in the sense of being a “purely good”-making property.
    Found 3 weeks ago on Johan E. Gustafsson's site
  9.
    This paper investigates the conditions under which diagonal sentences can be taken to constitute paradigmatic cases of self-reference. We put forward well-motivated constraints on the diagonal operator and the coding apparatus which separate paradigmatic self-referential sentences, for instance obtained via Gödel’s diagonalization method, from accidental diagonal sentences. In particular, we show that these constraints successfully exclude refutable Henkin sentences, as constructed by Kreisel.
    Found 3 weeks, 2 days ago on Volker Halbach's site
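    The diagonalization method scrutinized above is standardly packaged as the diagonal lemma; for orientation (in notation that need not match the paper’s):

        \[
        \text{for every formula } \varphi(x) \text{ there is a sentence } \delta \text{ such that } T \vdash \delta \leftrightarrow \varphi(\ulcorner \delta \urcorner),
        \]

    where $T$ is any theory interpreting a modest amount of arithmetic and $\ulcorner \delta \urcorner$ is the numeral of $\delta$’s code. A Henkin sentence is the special case in which $\varphi(x)$ is a provability predicate $\mathrm{Pr}_T(x)$, so that $\delta$ “says of itself” that it is provable.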
  10.
    We introduce and analyze a new axiomatic theory CD of truth. The primitive truth predicate can be applied to sentences containing the truth predicate. The theory is thoroughly classical in the sense that CD is not only formulated in classical logic, but that the axiomatized notion of truth itself is classical: The truth predicate commutes with all quantifiers and connectives, and thus the theory proves that there are no truth value gaps or gluts. To avoid inconsistency, the instances of the T-schema are restricted to determinate sentences. Determinateness is introduced as a further primitive predicate and axiomatized. The semantics and proof theory of CD are analyzed.
    Found 3 weeks, 2 days ago on Volker Halbach's site
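    To give a flavour of what it means for the truth predicate to commute with the connectives, here are two schematic axioms of that kind together with a restricted T-schema (illustrative notation only; the exact axioms of CD may differ):

        \[
        T\ulcorner \neg\varphi \urcorner \leftrightarrow \neg T\ulcorner \varphi \urcorner,
        \qquad
        T\ulcorner \varphi \wedge \psi \urcorner \leftrightarrow \bigl(T\ulcorner \varphi \urcorner \wedge T\ulcorner \psi \urcorner\bigr),
        \qquad
        D\ulcorner \varphi \urcorner \rightarrow \bigl(T\ulcorner \varphi \urcorner \leftrightarrow \varphi\bigr),
        \]

    where $D$ is the primitive determinateness predicate.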
  11.
    Following Post's program, we will propose a linguistic and empirical interpretation of Gödel's incompleteness theorem and of the related unsolvability theorems by Church and Turing. All these theorems use Cantor's diagonal argument in order to find limitations in finitary systems, such as human language, which can make “infinite use of finite means”. The linguistic version of the incompleteness theorem says that every Turing complete language is Gödel incomplete. We conclude that the incompleteness and unsolvability theorems reveal limitations of our finitary tool, namely our complete language.
    Found 4 weeks, 2 days ago on PhilPapers
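    The Cantor-style diagonal construction invoked above fits in a few lines of Python (an editorial toy, not from the paper): given any enumeration of binary sequences, flipping the diagonal yields a sequence the enumeration misses.

        def diagonal(enum):
            # enum(n) is the n-th listed binary sequence, itself a function k -> 0 or 1.
            # The returned sequence differs from enum(n) at position n, so it is unlisted.
            return lambda n: 1 - enum(n)(n)

        # Example enumeration: row(n)(k) = 1 for k < n, else 0 (the "staircase" sequences).
        row = lambda n: (lambda k: 1 if k < n else 0)
        d = diagonal(row)
        print([d(n) for n in range(8)])  # [1, 1, 1, 1, 1, 1, 1, 1] -- not among the rows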
  12.
    We describe a linear time algorithm that determines all “two-vertex bottlenecks” in a directed graph, that is, all pairs of vertices whose removal disconnects two given vertices s and t. There may be quadratically many two-vertex bottlenecks, but a compressed representation allows them all to be determined in linear time. Applications include the determination of Dual Implication Points (DIPs) in the CDCL solver conflict graph, as discussed in Buss, Chung, Ganesh, and Oliveras [preprint, 2024]. The algorithm for finding all DIPs is an algorithm for Menger’s Theorem on a directed graph that not only verifies that two given vertices are not 3-connected but also finds all possible separating vertex pairs.
    Found 1 month ago on Samuel R. Buss's site
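    As a point of reference for the problem solved above (emphatically not the paper’s linear-time algorithm), the definition of a two-vertex bottleneck can be turned into brute force directly: test every pair of internal vertices and check whether removing it disconnects s from t. The Python sketch below is an editorial illustration with an invented graph.

        from collections import deque
        from itertools import combinations

        def reachable(adj, s, t, removed):
            # BFS from s to t in the directed graph adj, skipping vertices in `removed`.
            seen, queue = {s}, deque([s])
            while queue:
                u = queue.popleft()
                if u == t:
                    return True
                for w in adj.get(u, ()):
                    if w not in removed and w not in seen:
                        seen.add(w)
                        queue.append(w)
            return False

        def two_vertex_bottlenecks(adj, s, t):
            # All pairs of vertices (other than s and t) whose removal disconnects s from t.
            inner = [v for v in adj if v not in (s, t)]
            return [pair for pair in combinations(inner, 2)
                    if not reachable(adj, s, t, set(pair))]

        # Example: two internally disjoint s-t paths, s->a->b->t and s->c->d->t.
        adj = {'s': ['a', 'c'], 'a': ['b'], 'b': ['t'], 'c': ['d'], 'd': ['t'], 't': []}
        print(two_vertex_bottlenecks(adj, 's', 't'))
        # [('a', 'c'), ('a', 'd'), ('b', 'c'), ('b', 'd')]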
  13.
    This paper is a discussion note on Isaacs et al. 2022, who have claimed to offer a new motivation for imprecise probabilities, based on the mathematical phenomenon of nonmeasurability. In this note, I clarify some consequences of their proposal. In particular, I show that if their proposal is applied to a bounded 3-dimensional space, then they have to reject at least one of the following:
    • If A is at most as probable as B and B is at most as probable as C, then A is at most as probable as C.
    Found 1 month ago on PhilPapers
  14.
    Rumfitt has given two arguments that in unilateralist verificationist theories of meaning, truth collapses into correct assertibility. In the present paper I give similar arguments that show that in unilateral falsificationist theories of meaning, falsehood collapses into correct deniability. According to bilateralism, meanings are determined by assertion and denial conditions, so the question arises whether it succumbs to similar arguments. I show that this is not the case. The final section considers the question whether a principle central to Rumfitt’s first argument, ‘It is assertible that A if and only if it is assertible that it is assertible that A’, is one that bilateralists can reject and concludes that they cannot. It follows that the logic of assertibility and deniability, according to a result by Williamson, is the little known modal logic K4 studied by Sobociński. The paper ends with a plaidoyer for bilateralists to adopt this logic.
    Found 1 month ago on Nils Kürbis's site
  15.
    Discussion of the Aristotelian syllogism over the last sixty years has arguably centered on the question whether syllogisms are inferences or implications. But the significance of this debate at times has been taken to concern whether the syllogistic is a logic or a theory, and how it ought to be represented by modern systems.
    Found 1 month, 1 week ago on PhilPapers
  16.
    The Turing test for machine thought has an interrogator communicate (by typing) with a human and a machine both of which try to convince the interrogator that they are human. The interrogator then guesses which is human. …
    Found 1 month, 1 week ago on Alexander Pruss's Blog
  17.
    I propose a revision of Cantor’s account of set size that understands comparisons of set size fundamentally in terms of surjections rather than injections. This revised account is equivalent to Cantor’s account if the Axiom of Choice is true, but its consequences differ from those of Cantor’s if the Axiom of Choice is false. I argue that the revised account is an intuitive generalization of Cantor’s account, blocks paradoxes—most notably, that a set can be partitioned into a set that is bigger than it—that can arise from Cantor’s account if the Axiom of Choice is false, illuminates the debate over whether the Axiom of Choice is true, is a mathematically fruitful alternative to Cantor’s account, and sheds philosophical light on one of the oldest unsolved problems in set theory.
    Found 1 month, 1 week ago on PhilSci Archive
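    For orientation, the two comparison relations at issue can be set side by side (illustrative notation, not necessarily the paper’s):

        \[
        A \preceq_{\mathrm{inj}} B \;\iff\; \text{there is an injection } f\colon A \to B,
        \qquad
        A \preceq_{\mathrm{surj}} B \;\iff\; A = \emptyset \ \text{or there is a surjection } g\colon B \to A.
        \]

    In ZF, $A \preceq_{\mathrm{inj}} B$ implies $A \preceq_{\mathrm{surj}} B$; with the Axiom of Choice the two relations coincide, and without it the converse implication can fail.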
  18.
    In classical first-order logic (FOL), let T be a theory with an unspecified (arbitrary) constant c, where the symbol c does not occur in any of the axioms of T. Let psi(x) be a formula in the language of T that does not contain the symbol c. In a well-known result due to Shoenfield (the “theorem on constants”), it is proven that if psi(c) is provable in T, then so is psi(x), where x is the only free variable in psi(x). In the proof of this result, Shoenfield starts with the hypothesis that P is a valid proof of psi(c) in T, and then replaces each occurrence of c in P by a variable to obtain a valid proof of psi(x) in T, the argument being that no axiom of T is violated by this replacement. In this paper, we demonstrate that the theorem on constants leads to a meta-inconsistency in FOL (i.e., a logical inconsistency in the metatheory of T in which Shoenfield’s proof is executed), the root cause of which is the existence of arbitrary constants. In previous papers, the author has proposed a finitistic paraconsistent logic (NAFL) in which it is provable that arbitrary constants do not exist. The nonclassical reasons for this nonexistence are briefly examined and shown to be relevant to the above example.
    Found 1 month, 1 week ago on PhilSci Archive
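    For reference, the theorem on constants under discussion is the following standard statement (notation may differ from the paper’s):

        \[
        \text{if } c \text{ occurs neither in the axioms of } T \text{ nor in } \psi(x), \text{ and } T \vdash \psi(c), \text{ then } T \vdash \psi(x) \text{ and hence } T \vdash \forall x\, \psi(x).
        \]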
  19.
    How can the Biblical God be the Lord and King who, being typically unseen and even self-veiled at times, authoritatively leads people for divine purposes? This article’s main thesis is that the answer is in divine moral leading via human moral experience of God (of a kind to be clarified). The Hebrew Bible speaks of God as ‘king,’ including for a time prior to the Jewish human monarchy. Ancient Judaism, as Martin Buber has observed, acknowledged direct and indirect forms of divine rule and thus of theocracy. This article explores the importance of divine rule as divine direct leading, particularly in moral matters, without reliance on indirect theocracy supervised by humans. It thus considers a role for God as Über-King superior to any human king, maintaining a direct moral theocracy without a need for indirect theocracy. The divine goal, in this perspective, is a universal commonwealth in righteousness, while allowing for variation in political structure. The article identifies the importance in the Hebrew Bible of letting God be God as an Über-King who, although self-veiled at times, leads willing people directly and thereby rules over them uncoercively. It also clarifies a purpose for divine self-veiling neglected by Buber and many others, and it offers a morally sensitive test for unveiled authenticity in divine moral leading.
    Found 1 month, 2 weeks ago on PhilPapers
  20.
    This is an attempt to axiomatise the natural laws. Note especially axiom 4, which is expressed in third order predicate logic, and which permits a solution to the problem of causation in nature without stating that “everything has a cause”. The undefined term “difference” constitutes the basic element, and each difference is postulated to have an exact position and a discrete cause. The set of causes belonging to a natural set of dimensions is defined as a law. This means that a natural law is determined by the discrete causes tied to a natural set of dimensions. A law is said to be “defined” at a point if a difference there has a cause. Given that there is a point for which the law is not defined, it is shown that a difference is caused that connects two points in two separate sets of dimensions.
    Found 1 month, 2 weeks ago on PhilPapers
  21.
    Ontology and theology cannot be combined if ontology excludes non-physical causes. This paper examines some possibilities for combining ontology with theology insofar as non-physical causes are permitted. The paper builds on metaphysical findings showing that separate ontological domains can interact causally, if indirectly, via interfaces. As interfaces are not universes, a first universe is allowed to be caused by an interface without violating the principle of causal closure of any universe. Formal theology can therefore be based on the assumption that the (first) universe is caused by God, if God is defined as the first cause. Given this, formal theology and science can have the same ontological base.
    Found 1 month, 2 weeks ago on PhilPapers
  22.
    This paper develops the idea that valid arguments are equivalent to true conditionals by combining Kripke’s theory of truth with the evidential account of conditionals offered by Crupi and Iacona. As will be shown, in a first-order language that contains a naïve truth predicate and a suitable conditional, one can define a validity predicate in accordance with the thesis that the inference from a conjunction of premises to a conclusion is valid when the corresponding conditional is true. The validity predicate so defined significantly increases our expressive resources and provides a coherent formal treatment of paradoxical arguments.
    Found 1 month, 3 weeks ago on PhilPapers
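    Schematically, the thesis that the paper formalizes can be displayed as follows (my notation, for orientation only):

        \[
        \mathrm{Val}\bigl(\ulcorner \varphi_1 \wedge \dots \wedge \varphi_n \urcorner, \ulcorner \psi \urcorner\bigr) \;\leftrightarrow\; \mathrm{Tr}\bigl(\ulcorner (\varphi_1 \wedge \dots \wedge \varphi_n) \rightarrow \psi \urcorner\bigr),
        \]

    where $\mathrm{Tr}$ is the naïve truth predicate and $\rightarrow$ is the evidential conditional of Crupi and Iacona.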
  23.
    Bilateral proof systems, which provide rules for both affirming and denying sentences, have been prominent in the development of proof-theoretic semantics for classical logic in recent years. However, such systems provide a substantial amount of freedom in the formulation of the rules, and, as a result, a number of different sets of rules have been put forward as definitive of the meanings of the classical connectives. In this paper, I argue that a single general schema for bilateral proof rules has a reasonable claim to inferentially articulating the core meaning of all of the classical connectives. I propose this schema in the context of a bilateral sequent calculus in which each connective is given exactly two rules: a rule for affirmation and a rule for denial. Positive and negative rules for all of the classical connectives are given by a single rule schema, harmony between these positive and negative rules is established at the schematic level by a pair of elimination theorems, and the truth-conditions for all of the classical connectives are read off at once from the schema itself.
    Found 1 month, 3 weeks ago on PhilPapers
  24.
    In the longstanding foundational debate over whether to require that probability is countably additive, in addition to being finitely additive, those who resist the added condition raise two concerns that we take up in this paper. (1) Existence: Settings where no countably additive probability exists though finitely additive probabilities do. (2) Complete Additivity: Where reasons for countable additivity don’t stop there. Those reasons entail complete additivity—the (measurable) union of probability 0 sets has probability 0, regardless of the cardinality of that union. Then probability distributions are discrete, not continuous. We use Easwaran’s (Easwaran, Thought 2:53–61, 2013) advocacy of the Comparative principle to illustrate these two concerns. Easwaran supports countable additivity, both for numerical probabilities and for finer, qualitative probabilities, by defending a condition he calls the Comparative principle [C].
    Found 1 month, 3 weeks ago on Teddy Seidenfeld's site
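    The three additivity conditions contrasted above, in increasing order of strength (stated here for orientation; in the first two clauses the sets are pairwise disjoint and measurable):

        \[
        \textit{finite:}\quad P(A_1 \cup A_2) = P(A_1) + P(A_2),
        \qquad
        \textit{countable:}\quad P\Bigl(\bigcup_{n=1}^{\infty} A_n\Bigr) = \sum_{n=1}^{\infty} P(A_n),
        \]
        \[
        \textit{complete:}\quad P\Bigl(\bigcup_{i \in I} N_i\Bigr) = 0 \ \text{ whenever every } P(N_i) = 0 \text{ and the union is measurable, for an index set } I \text{ of arbitrary cardinality.}
        \]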
  25.
    Some people want to be able to compare the sizes of infinite sets while preserving the "Euclidean" proper subset principle that holds for finite sets:
    - If A is a proper subset of B, then A < B.
    We also want to make sure that our comparison agrees with how we compare finite sets:
    - If A and B are finite, then A ≤ B if and only if A has no more elements than B. …
    Found 1 month, 3 weeks ago on Alexander Pruss's Blog
  26.
    This essay examines the philosophical significance of Ω-logic in Zermelo-Fraenkel set theory with choice (ZFC). The categorical duality between coalgebra and algebra permits Boolean-valued algebraic models of ZFC to be interpreted as coalgebras. The hyperintensional profile of Ω-logical validity can then be countenanced within a coalgebraic logic. I argue that the philosophical significance of the foregoing is two-fold. First, because the epistemic, modal, and hyperintensional profiles of Ω-logical validity correspond to those of second-order logical consequence, Ω-logical validity is genuinely logical. Second, the foregoing provides a hyperintensional account of the interpretation of mathematical vocabulary.
    Found 1 month, 3 weeks ago on PhilSci Archive
  27.
    Fragmentalism allows incompatible facts to constitute reality in an absolute manner, provided that they fail to obtain together. In recent years, the view has been extensively discussed, with a focus on its formalisation in model-theoretic terms. This paper focuses on three formalisations: Lipman’s approach, the subvaluationist interpretation, and a novel view that has been so far overlooked. The aim of the paper is to explore the application of these formalisations to the alethic modal case. This logical exploration will allow us to study (i) cases of metaphysical incompatibility between modal facts and (ii) cases of modal dialetheias. In turn, this will enrich our understanding of the role of impossibility in the fragmentalist framework.
    Found 1 month, 3 weeks ago on PhilPapers
  28.
    Citation: Ellerman, D. A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics.
    Found 2 months ago on PhilSci Archive
  29.
    In this paper, we discuss J. Michael Dunn’s foundational work on the semantics for First Degree Entailment logic (FDE), also known as Belnap–Dunn logic (or Sanjaya–Belnap–Smiley–Dunn Four-valued Logic, as suggested by Dunn himself). More specifically, by building on the framework due to Dunn, we sketch a broad picture towards a systematic understanding of contra-classicality. Our focus will be on a simple propositional language with negation, conjunction, and disjunction, and we will systematically explore variants of FDE, K3, and LP by tweaking the falsity condition for negation.
    Found 2 months ago on Hitoshi Omori's site
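    For readers new to the Dunn semantics mentioned above, it helps to have the standard relational truth and falsity conditions in view, since the variants explored in the paper are obtained by tweaking the falsity condition for negation (standard clauses, stated for orientation):

        \[
        1 \in v(\neg A) \iff 0 \in v(A), \qquad 0 \in v(\neg A) \iff 1 \in v(A),
        \]
        \[
        1 \in v(A \wedge B) \iff 1 \in v(A) \text{ and } 1 \in v(B), \qquad
        0 \in v(A \wedge B) \iff 0 \in v(A) \text{ or } 0 \in v(B),
        \]

    where $v$ relates each formula to a subset of $\{1, 0\}$: possibly both values (a glut), possibly neither (a gap).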
  30.
    In this paper, we apply a Herzberger-style semantics to deal with the question: is the de Finetti conditional a conditional? The question is pressing, in view of the inferential behavior of the de Finetti conditional: it allows for inferences that seem quite unexpected for a conditional. The semantics we advance here for the de Finetti conditional is simply the classical semantics for the material conditional, with a further dimension whose understanding depends on the kind of application one has in mind. We discuss such possible applications and how they cover ground already advanced in the literature.
    Found 2 months ago on Hitoshi Omori's site
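    For orientation, here is the three-valued table usually associated with the de Finetti conditional (1 = true, 0 = false, u = void/undetermined); the semantics sketched above keeps the classical material-conditional dimension and adds a further one on top of it:

        \[
        \begin{array}{c|ccc}
        A \Rightarrow B & B = 1 & B = u & B = 0 \\ \hline
        A = 1 & 1 & u & 0 \\
        A = u & u & u & u \\
        A = 0 & u & u & u
        \end{array}
        \]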