1.
    I propose an approach to liar and Curry paradoxes inspired by the work of Roger Swyneshed in his treatise on insolubles (1330-1335). The keystone of the account is the idea that liar sentences and their ilk are false (and only false) and that the so-called “capture” direction of the T-schema should be restricted. The proposed account retains what I take to be the attractive features of Swyneshed’s approach without leading to some worrying consequences Swyneshed accepts. The approach and the resulting logic (called “Swynish Logic”) are non-classical, but are consistent and compatible with many elements of the classical picture including modus ponens, modus tollens, and double-negation elimination and introduction. It is also compatible with bivalence and contravalence. My approach to these paradoxes is also immune to an important kind of revenge challenge that plagues some of its rivals.
    Found 4 hours, 13 minutes ago on PhilPapers
  2.
    Throughout the history of automated reasoning, mathematics has been viewed as a prototypical domain of application. It is therefore surprising that the technology has had almost no impact on mathematics to date and plays almost no role in the subject today. This article presents an optimistic view that the situation is about to change. It describes some recent developments in the Lean programming language and proof assistant that support this optimism, and it reflects on the role that automated reasoning can and should play in mathematics in the years to come.
    Found 19 hours, 54 minutes ago on Jeremy Avigad's site
  3.
    Suppose that we have n objects α1, ..., αn, and we want to define something like numerical values (at least hyperreal ones, if we can’t have real ones) on the basis of comparisons of value. Here is one interesting way to proceed. …
    Found 1 day, 13 hours ago on Alexander Pruss's Blog
  4.
    We show that knowledge satisfies interpersonal independence, meaning that a non-trivial sentence describing one agent’s knowledge cannot be equivalent to a sentence describing another agent’s knowledge. The same property of interpersonal independence holds, mutatis mutandis, for belief. In the case of knowledge, interpersonal independence is implied by the fact that there are no non-trivial sentences that are common knowledge in every model of knowledge. In the case of belief, interpersonal independence follows from a strong interpersonal independence that knowledge does not have. Specifically, there is no sentence describing the beliefs of one person that implies a sentence describing the beliefs of another person.
    Found 4 days, 7 hours ago on PhilSci Archive
  5.
    Davide Grossi (Artificial Intelligence, Bernoulli Institute, University of Groningen; ILLC/ACLE, University of Amsterdam, The Netherlands; d.grossi@rug.nl). … its application varies in complexity and depends, in particular, on whether relevant past decisions agree, or exist at all. The contribution of this paper is a formal treatment of types of hardness of case-based decisions. The typology of hardness is defined in terms of the arguments for and against the issue to be decided, and their kind of validity (conclusive, presumptive, coherent, incoherent). We apply the typology of hardness to Berman and Hafner’s research on the dynamics of case-based reasoning and show formally how the hardness of decisions varies with time.
    Found 4 days, 18 hours ago on Davide Grossi's site
  6.
    This paper examines the logic of conditional obligation, which originates in the works of Hansson, Lewis, and others. Some weakened forms of transitivity of the betterness relation are studied: quasi-transitivity, Suzumura consistency, acyclicity, and the interval order condition. The first three do not change the logic: the axiomatic system is the same whether or not they are introduced. This holds true under a rule of interpretation in terms of maximality and strong maximality. The interval order condition gives rise to a new axiom, which varies with the rule of interpretation. With the rule of maximality, one obtains the principle known as disjunctive rationality. With the rule of strong maximality, one obtains the Spohn axiom (also known as the principle of rational monotony, or Lewis’s axiom CV). A completeness theorem further substantiates these observations. For the interval order condition, this yields the finite model property and decidability of the calculus.
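The weakened transitivity conditions mentioned above are easy to state computationally. As a minimal sketch (my own illustration, not from the paper), here is a check of quasi-transitivity, the requirement that the strict part of the betterness relation be transitive:

```python
def strict_part(R):
    # strict part of R: x is strictly better than y iff xRy and not yRx
    return {(x, y) for (x, y) in R if (y, x) not in R}

def is_transitive(S):
    return all((x, z) in S
               for (x, y) in S
               for (y2, z) in S if y2 == y)

def is_quasi_transitive(R):
    # quasi-transitivity: the strict part of R is transitive
    return is_transitive(strict_part(R))

# betterness with ties: a ~ b, b ~ c, but a strictly better than c;
# R itself is not transitive, yet it is quasi-transitive
R = {('a', 'b'), ('b', 'a'), ('b', 'c'), ('c', 'b'), ('a', 'c')}
```

The example relation shows why quasi-transitivity is strictly weaker than transitivity: ties need not chain, only strict preferences do.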
    Found 4 days, 21 hours ago on X. Parent's site
  7.
    Transitivity, Simplification, and Contraposition are intuitively compelling. Although Antecedent Strengthening may seem less attractive at first, close attention to the full range of data reveals that it too has considerable appeal. An adequate theory of conditionals should account for these facts. The strict theory of conditionals does so by validating the four inferences. It says that natural language conditionals are necessitated material conditionals: A → B is true if and only if A ⊃ B is true throughout a set of accessible worlds. As a result, it validates many classical inferences, including Transitivity, Simplification, Contraposition, and Antecedent Strengthening. In what follows I will refer to these as the strict inferences.
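The strict truth condition can be model-checked directly. A minimal sketch (my own toy model, not from the paper): a strict conditional holds at a world just in case the material conditional holds at every accessible world.

```python
def strict_cond(A, B, access, w):
    # A strict-implies B at w iff (A materially implies B) at every
    # world accessible from w
    return all((not A(v)) or B(v) for v in access[w])

# toy model: three worlds, all accessible from w0
worlds = ['w0', 'w1', 'w2']
access = {'w0': worlds}
rain = lambda v: v in ('w1', 'w2')   # "it rains" true at w1, w2
wet  = lambda v: v in ('w1', 'w2')   # "the ground is wet" true at w1, w2
assert strict_cond(rain, wet, access, 'w0')
```

Since the same fixed set of accessible worlds is quantified over for premises and conclusion, inferences like Transitivity and Antecedent Strengthening go through automatically on this semantics.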
    Found 5 days, 12 hours ago on PhilPapers
  8.
    Let serious propositional contingentism (SPC) be the package of views which consists in (i) the thesis that propositions expressed by sentences featuring terms depend, for their existence, on the existence of the referents of those terms, (ii) serious actualism—the view that it is impossible for an object to exemplify a property and not exist—and (iii) contingentism—the view that it is at least possible that some thing might not have been something. SPC is popular and compelling. But what should we say about possible worlds, if we accept SPC? Here, I first show that a natural view of possible worlds, well represented in the literature, is inadequate in conjunction with SPC. Though I note various alternative ways of thinking about possible worlds in response to this first problem, I then outline a second, more general problem—a master argument—which shows that any account of possible worlds meeting very minimal requirements will be inconsistent with compelling claims about mere possibilia which the serious propositional contingentist should accept.
    Found 5 days, 12 hours ago on PhilPapers
  9.
    As usually presented, octagons of opposition are rather complex objects and can be difficult to assimilate at a glance. We show how, under suitable conditions that are satisfied by most historical examples, different display conventions can simplify the diagrams, making them easier for readers to grasp without the loss of information. Moreover, those conditions help reveal the conceptual structure behind the visual display.
    Found 6 days, 19 hours ago on David Makinson's site
  10.
    We extend a result by Gallow concerning the impossibility of following two epistemic masters, so that it covers a larger class of pooling methods. We also investigate a few ways of avoiding the issue, such as using nonconvex pooling methods, employing the notion of imperfect trust or moving to higher-order probability spaces. Along the way we suggest a conceptual issue with the conditions used by Gallow: whenever two experts are considered, whether we can trust one of them is decided by the features of the other!
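As a point of reference for the pooling methods at issue (my own sketch; the paper's setting is more general), linear pooling is the standard convex method: a weighted average of the experts' probability assignments.

```python
def linear_pool(p, q, w):
    # convex (linear) pooling of two experts' credences over the same events
    assert 0 <= w <= 1
    return {e: w * p[e] + (1 - w) * q[e] for e in p}

expert1 = {'rain': 0.8, 'no_rain': 0.2}
expert2 = {'rain': 0.4, 'no_rain': 0.6}
pooled = linear_pool(expert1, expert2, 0.5)  # rain pooled to about 0.6
```

Nonconvex pooling methods, one of the escape routes the abstract mentions, are exactly those whose output need not lie on the line segment between the two expert credences.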
    Found 1 week, 1 day ago on PhilSci Archive
  11.
    The simulation hypothesis has recently excited renewed interest, especially in the physics and philosophy communities. However, the hypothesis specifically concerns computers that simulate physical universes, which means that to properly investigate it we need to couple computer science theory with physics. Here I do this by exploiting the physical Church-Turing thesis. This allows me to introduce a preliminary investigation of some of the computer science theoretic aspects of the simulation hypothesis. In particular, building on Kleene’s second recursion theorem, I prove that it is mathematically possible for us to be in a simulation that is being run on a computer by us. In such a case, there would be two identical instances of us; the question of which of those is “really us” is meaningless. I also show how Rice’s theorem provides some interesting impossibility results concerning simulation and self-simulation; briefly describe the philosophical implications of fully homomorphic encryption for (self-)simulation; briefly investigate the graphical structure of universes simulating universes simulating universes, among other issues. I end by describing some of the possible avenues for future research that this preliminary investigation reveals.
    Found 1 week, 5 days ago on PhilSci Archive
  12.
    Within the context of general relativity, Leibnizian metaphysics seems to demand that worlds are “maximal” with respect to a variety of space-time properties (Geroch 1970; Earman 1995). Here, we explore worlds that are maximal with respect to the “Heraclitus” asymmetry property, which demands that no pair of spacetime events have the same structure (Manchak and Barrett 2023). First, we show that Heraclitus-maximal worlds exist and that every Heraclitus world is contained in some Heraclitus-maximal world. This amounts to a type of compatibility between the Leibnizian and Heraclitian demands. Next, we consider the notion of “observationally indistinguishable” worlds (Glymour 1972, 1977; Malament 1977). We know that, modulo modest assumptions, any world is observationally indistinguishable from some other (non-isomorphic) world (Manchak 2009). But here we show a way out of this general epistemic predicament: if attention is restricted to Heraclitus-maximal worlds, then worlds are observationally indistinguishable if and only if they are isomorphic. Finally, we show a sense in which cosmic underdetermination can still arise for individual observers even if the Leibnizian and Heraclitian demands are met.
    Found 2 weeks ago on PhilSci Archive
  13.
    This paper examines different kinds of definite descriptions denoting purely contingent, necessary, or impossible objects. The discourse about contingent/impossible/necessary objects can be organised in terms of rational questions to ask and answer relative to the modal profile of the entity in question. There are also limits on what it is rational to know about entities with this or that modal profile. We will also examine epistemic modalities; they are the kind of necessity and possibility that is determined by epistemic constraints related to knowledge or rationality. Definite descriptions denote so-called offices, roles, or things to be. We explicate these α-offices as partial functions from possible worlds to chronologies of objects of type α, where α is mostly the type of individuals. Our starting point is Prior’s distinction between a ‘weak’ and a ‘strong’ definite article ‘the’. In both cases, the definite description refers to at most one object; yet in the case of the weak ‘the’, the referred object can change over time, while in the case of the strong ‘the’, the object referred to by the definite description is the same forever, once the office has been occupied. The main result we present is a way to obtain Wh-knowledge about who or what plays a given role presented by a hyper-office, i.e. a procedure producing an office. Another no less important result concerns the epistemic necessity of the impossibility of knowing who or what occupies the impossible office presented by a hyper-office.
    Found 2 weeks, 2 days ago on PhilSci Archive
  14.
    First, we suggest and discuss second-order versions of properties for solutions for TU games used to characterize the Banzhaf value, in particular, of standardness for two-player games, of the dummy player property, and of 2-efficiency. Then, we provide a number of characterizations of the Banzhaf value invoking the following properties: (i) [second-order standardness for two-player games or the second-order dummy player property] and 2-efficiency, (ii) standardness for one-player games, standardness for two-player games, and second-order 2-efficiency, (iii) standardness for one-player games, [second-order standardness for two-player games or the second-order dummy player property], and second-order 2-efficiency. These characterizations also work within the classes of simple games, of superadditive games, and of simple superadditive games.
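For readers unfamiliar with the solution concept itself, the (raw) Banzhaf value averages a player's marginal contributions over all coalitions of the remaining players. A brute-force sketch (my own, illustrating the value rather than the paper's second-order characterizations):

```python
from itertools import combinations

def banzhaf(players, v):
    # raw Banzhaf value: average marginal contribution of player i
    # over all 2^(n-1) coalitions of the remaining players
    n = len(players)
    vals = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                coalition = frozenset(S)
                total += v(coalition | {i}) - v(coalition)
        vals[i] = total / 2 ** (n - 1)
    return vals

# three-player simple majority game: a coalition wins iff it has >= 2 members
majority = lambda S: 1 if len(S) >= 2 else 0
print(banzhaf([1, 2, 3], majority))  # each player gets 0.5
```

In the majority game each player is pivotal in exactly two of the four coalitions of the others, which is where the 0.5 comes from; simple games like this one are also the restricted class for which the paper's characterizations are shown to work.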
    Found 2 weeks, 2 days ago on André Casajus's site
  15.
    Reconstructions of quantum theory are a novel research program in theoretical physics which aims to uncover the unique physical features of quantum theory via axiomatization. I focus on Hardy’s “Quantum Theory from Five Reasonable Axioms” (2001), arguing that reconstructions represent a modern usage of axiomatization with significant points of continuity to von Neumann’s axiomatizations in quantum mechanics. In particular, I show that Hardy and von Neumann share a similar methodological ordering, have a common operational framing, and insist on the empirical basis of axioms. In the reconstruction programme, interesting points of discontinuity with historical axiomatizations include the stipulation of a generalized space of theories represented by a framework and the stipulation of analytic machinery at two levels of generality (first by establishing a generalized mathematical framework and then by positing specific formulations of axioms). In light of the reconstruction programme, I show that we should understand axiomatization attempts as context-dependent, where the context is contingent upon the goals of inquiry and the maturity of both the mathematical formalism and the theoretical underpinnings within the area of inquiry. Drawing on Mitsch’s (2022) account of axiomatization, I conclude that reconstructions are best understood as provisional, practical representations of quantum theory that are well suited for theory development and exploration. However, I propose my context-dependent re-framing of axiomatization as a means of enriching Mitsch’s account.
    Found 3 weeks ago on PhilSci Archive
  16.
    A drawback of the standard modal ontological proof is that it assumes that it is possible that there is something godlike. Kurt Gödel’s ontological proof seeks to establish this possibility with the help of certain axiological principles. But the axiological principles he relies on are not very plausible. And the same goes for other Gödelian ontological proofs in the literature. In this paper, I put forward a Gödelian ontological proof that relies only on plausible axiological principles. And I adapt the proof for both constant and variable domains. Nevertheless, the proof still needs the axiom that being godlike is positive in the sense of being a “purely good”-making property.
    Found 3 weeks, 6 days ago on Johan E. Gustafsson's site
  17.
    This paper investigates the conditions under which diagonal sentences can be taken to constitute paradigmatic cases of self-reference. We put forward well-motivated constraints on the diagonal operator and the coding apparatus which separate paradigmatic self-referential sentences, for instance obtained via Gödel’s diagonalization method, from accidental diagonal sentences. In particular, we show that these constraints successfully exclude refutable Henkin sentences, as constructed by Kreisel.
    Found 4 weeks, 1 day ago on Volker Halbach's site
  18.
    We introduce and analyze a new axiomatic theory CD of truth. The primitive truth predicate can be applied to sentences containing the truth predicate. The theory is thoroughly classical in the sense that CD is not only formulated in classical logic, but that the axiomatized notion of truth itself is classical: The truth predicate commutes with all quantifiers and connectives, and thus the theory proves that there are no truth value gaps or gluts. To avoid inconsistency, the instances of the T-schema are restricted to determinate sentences. Determinateness is introduced as a further primitive predicate and axiomatized. The semantics and proof theory of CD are analyzed.
    Found 4 weeks, 1 day ago on Volker Halbach's site
  19.
    Following Post’s program, we propose a linguistic and empirical interpretation of Gödel’s incompleteness theorem and of the related unsolvability theorems of Church and Turing. All these theorems use Cantor’s diagonal argument to find limitations in finitary systems, such as human language, which can make “infinite use of finite means”. The linguistic version of the incompleteness theorem says that every Turing complete language is Gödel incomplete. We conclude that the incompleteness and unsolvability theorems find limitations in our finitary tool, which is our complete language.
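The diagonal argument the abstract appeals to can be run mechanically. A minimal sketch (mine, not the paper's construction): given any enumeration of 0/1 sequences, flipping the diagonal yields a sequence that differs from every enumerated one.

```python
def diagonalize(enum):
    # Cantor's trick: the anti-diagonal differs from enum(n) at position n
    return lambda n: 1 - enum(n)(n)

# toy enumeration: the n-th sequence marks the multiples of n+1
enum = lambda n: (lambda k: 1 if k % (n + 1) == 0 else 0)
g = diagonalize(enum)

# g escapes the enumeration: it disagrees with each enum(n) at n
assert all(g(n) != enum(n)(n) for n in range(100))
```

Gödel's and Turing's theorems apply the same schema with "enumeration" replaced by provable sentences and computable functions, respectively.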
    Found 1 month ago on PhilPapers
  20.
    We describe a linear time algorithm that determines all “two-vertex bottlenecks” in a directed graph. This gives all pairs of vertices that disconnect two given nodes s and t in a directed graph. There may be quadratically many two-vertex bottlenecks, but a compressed representation allows them to all be determined in linear time. Applications include the determination of Dual Implication Points (DIPs) in the CDCL solver conflict graph, as discussed in Buss, Chung, Ganesh, and Oliveras [preprint, 2024]. The algorithm for finding all DIPs is an algorithm for Menger’s Theorem on a directed graph that not only verifies that two given vertices are not 3-connected but also finds all possible separating vertex pairs.
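For concreteness, here is what a "two-vertex bottleneck" is, via a brute-force quadratic check (my illustration only; the paper's contribution is doing this in linear time with a compressed representation):

```python
from itertools import combinations

def reaches(adj, s, t, removed):
    # depth-first search from s to t, skipping removed vertices
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for w in adj.get(u, []):
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    return False

def two_vertex_bottlenecks(adj, s, t):
    # all pairs {u, v} of internal vertices whose removal disconnects s from t
    verts = set(adj) | {w for ws in adj.values() for w in ws}
    inner = sorted(verts - {s, t})
    return [set(p) for p in combinations(inner, 2)
            if not reaches(adj, s, t, set(p))]

# s has two internally disjoint paths to t; removing {a, b} cuts them both
G = {'s': ['a', 'b'], 'a': ['t'], 'b': ['t']}
print(two_vertex_bottlenecks(G, 's', 't'))  # [{'a', 'b'}]
```

Since there can be quadratically many such pairs, any linear-time algorithm must, as the abstract notes, output them in compressed form rather than pair by pair.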
    Found 1 month ago on Samuel R. Buss's site
  21.
    This paper is a discussion note on Isaacs et al. 2022, who have claimed to offer a new motivation for imprecise probabilities, based on the mathematical phenomenon of nonmeasurability. In this note, I clarify some consequences of their proposal. In particular, I show that if their proposal is applied to a bounded 3-dimensional space, then they have to reject at least one of the following:
    • If A is at most as probable as B and B is at most as probable as C, then A is at most as probable as C.
    Found 1 month, 1 week ago on PhilPapers
  22.
    Rumfitt has given two arguments that in unilateralist verificationist theories of meaning, truth collapses into correct assertibility. In the present paper I give similar arguments showing that in unilateralist falsificationist theories of meaning, falsehood collapses into correct deniability. According to bilateralism, meanings are determined by assertion and denial conditions, so the question arises whether it succumbs to similar arguments. I show that this is not the case. The final section considers the question whether a principle central to Rumfitt’s first argument, ‘It is assertible that A if and only if it is assertible that it is assertible that A’, is one that bilateralists can reject, and concludes that they cannot. It follows that the logic of assertibility and deniability, according to a result by Williamson, is the little-known modal logic K4 studied by Sobociński. The paper ends with a plaidoyer for bilateralists to adopt this logic.
    Found 1 month, 1 week ago on Nils Kürbis's site
  23.
    Discussion of the Aristotelian syllogism over the last sixty years has arguably centered on the question whether syllogisms are inferences or implications. But the significance of this debate at times has been taken to concern whether the syllogistic is a logic or a theory, and how it ought to be represented by modern systems.
    Found 1 month, 2 weeks ago on PhilPapers
  24.
    The Turing test for machine thought has an interrogator communicate (by typing) with a human and a machine both of which try to convince the interrogator that they are human. The interrogator then guesses which is human. …
    Found 1 month, 2 weeks ago on Alexander Pruss's Blog
  25.
    I propose a revision of Cantor’s account of set size that understands comparisons of set size fundamentally in terms of surjections rather than injections. This revised account is equivalent to Cantor’s account if the Axiom of Choice is true, but its consequences differ from those of Cantor’s if the Axiom of Choice is false. I argue that the revised account is an intuitive generalization of Cantor’s account, blocks paradoxes—most notably, that a set can be partitioned into a set that is bigger than it—that can arise from Cantor’s account if the Axiom of Choice is false, illuminates the debate over whether the Axiom of Choice is true, is a mathematically fruitful alternative to Cantor’s account, and sheds philosophical light on one of the oldest unsolved problems in set theory.
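In the finite case, where the Axiom of Choice is not at issue, the injection-based and surjection-based criteria provably agree, which one can verify by brute force (my own sketch; the paper's interest is of course the infinite case without Choice):

```python
from itertools import product

def injects(A, B):
    # Cantor's criterion for |A| <= |B|: some function A -> B is injective
    return any(len(set(f)) == len(A) for f in product(B, repeat=len(A)))

def surjected_onto(A, B):
    # the revised criterion for |A| <= |B|: some function B -> A is onto A
    return any(set(f) >= set(A) for f in product(A, repeat=len(B)))

A, B = [1, 2], [1, 2, 3]
assert injects(A, B) and surjected_onto(A, B)
assert not injects(B, A) and not surjected_onto(B, A)
```

With Choice, every surjection B → A can be split by an injection A → B, which is why the two criteria coincide globally; without Choice that splitting can fail, and the two notions of size come apart.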
    Found 1 month, 2 weeks ago on PhilSci Archive
  26.
    In classical first-order logic (FOL), let T be a theory with an unspecified (arbitrary) constant c, where the symbol c does not occur in any of the axioms of T. Let psi(x) be a formula in the language of T that does not contain the symbol c. In a well-known result due to Shoenfield (the “theorem on constants”), it is proven that if psi(c) is provable in T, then so is psi(x), where x is the only free variable in psi(x). In the proof of this result, Shoenfield starts with the hypothesis that P is a valid proof of psi(c) in T, and then replaces each occurrence of c in P by a variable to obtain a valid proof of psi(x) in T, the argument being that no axiom of T is violated by this replacement. In this paper, we demonstrate that the theorem on constants leads to a meta-inconsistency in FOL (i.e., a logical inconsistency in the metatheory of T in which Shoenfield’s proof is executed), the root cause of which is the existence of arbitrary constants. In previous papers, the author has proposed a finitistic paraconsistent logic (NAFL) in which it is provable that arbitrary constants do not exist. The nonclassical reasons for this nonexistence are briefly examined and shown to be relevant to the above example.
    Found 1 month, 2 weeks ago on PhilSci Archive
  27.
    How can the Biblical God be the Lord and King who, being typically unseen and even self-veiled at times, authoritatively leads people for divine purposes? This article’s main thesis is that the answer is in divine moral leading via human moral experience of God (of a kind to be clarified). The Hebrew Bible speaks of God as ‘king,’ including for a time prior to the Jewish human monarchy. Ancient Judaism, as Martin Buber has observed, acknowledged direct and indirect forms of divine rule and thus of theocracy. This article explores the importance of divine rule as divine direct leading, particularly in moral matters, without reliance on indirect theocracy supervised by humans. It thus considers a role for God as Über-King superior to any human king, maintaining a direct moral theocracy without a need for indirect theocracy. The divine goal, in this perspective, is a universal commonwealth in righteousness, while allowing for variation in political structure. The article identifies the importance in the Hebrew Bible of letting God be God as an Über-King who, although self-veiled at times, leads willing people directly and thereby rules over them uncoercively. It also clarifies a purpose for divine self-veiling neglected by Buber and many others, and it offers a morally sensitive test for unveiled authenticity in divine moral leading.
    Found 1 month, 3 weeks ago on PhilPapers
  28.
    This is an attempt to axiomatise the natural laws. Note especially axiom 4, which is expressed in third-order predicate logic and which permits a solution to the problem of causation in nature without stating that “everything has a cause”. The undefined term “difference” constitutes the basic element, and each difference is postulated to have an exact position and a discrete cause. The set of causes belonging to a natural set of dimensions is defined as a law. This means that a natural law is determined by the discrete causes tied to a natural set of dimensions. A law is said to be “defined” at a point if a difference there has a cause. Given that there is a point for which the law is not defined, it is shown that a difference is caused that connects two points in two separate sets of dimensions.
    Found 1 month, 3 weeks ago on PhilPapers
  29.
    Ontology and theology cannot be combined if ontology excludes non-physical causes. This paper examines some possibilities for combining ontology with theology insofar as non-physical causes are permitted. The paper builds on metaphysical findings showing that separate ontological domains can interact causally, indirectly, via interfaces. As interfaces are not universes, a first universe is allowed to be caused by an interface without violating the principle of causal closure of any universe. Formal theology can therefore be based on the assumption that the (first) universe is caused by God, if God is defined as the first cause. Given this, formal theology and science can have the same ontological base.
    Found 1 month, 3 weeks ago on PhilPapers
  30.
    This paper develops the idea that valid arguments are equivalent to true conditionals by combining Kripke’s theory of truth with the evidential account of conditionals offered by Crupi and Iacona. As will be shown, in a first-order language that contains a naïve truth predicate and a suitable conditional, one can define a validity predicate in accordance with the thesis that the inference from a conjunction of premises to a conclusion is valid when the corresponding conditional is true. The validity predicate so defined significantly increases our expressive resources and provides a coherent formal treatment of paradoxical arguments.
    Found 1 month, 4 weeks ago on PhilPapers