1.
    In a series of recent papers, I presented a puzzle and theory of definition. I did not, however, indicate how the theory resolves the puzzle. This was an oversight on my part, and one I hope to correct here. My aim is to provide that resolution: to demonstrate that my theory can consistently embrace the principles I prove to be jointly inconsistent. To the best of my knowledge, this theory is the only one capable of this embrace—which marks yet another advantage it has over its competitors.
    Found 4 days, 7 hours ago on PhilPapers
  2.
    A metainference is usually understood as a pair consisting of a collection of inferences, called premises, and a single inference, called the conclusion. In the last few years, much attention has been paid to the study of metainferences—and, in particular, to the question of which metainferences of a given logic are valid. So far, however, this study has been conducted in a rather impoverished language. Our usual sequent calculi have no way to represent, e.g., negations, disjunctions or conjunctions of inferences. In this paper we tackle this expressive issue. We assume some background sentential language as given and define what we call an inferential language, that is, a language whose atomic formulas are inferences. We provide a model-theoretic characterization of validity for this language—relative to some given characterization of validity for the background sentential language—and a proof-theoretic analysis of validity. We argue that our novel language has fruitful philosophical applications. Lastly, we generalize some of our definitions and results to arbitrary metainferential levels.
    Found 4 days, 18 hours ago on Buenos Aires Logic Group
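    A minimal sketch of the (local) notion of metainferential validity the entry describes, assuming a classical two-valued background language and, for brevity, atomic formulas only; the encoding and names are mine, not the paper's:

    ```python
    from itertools import product

    ATOMS = ['p', 'q', 'r']

    def valuations():
        for bits in product([False, True], repeat=len(ATOMS)):
            yield dict(zip(ATOMS, bits))

    def satisfies(v, inference):
        # a valuation satisfies an inference (premises, conclusion) iff it
        # makes some premise false or the conclusion true
        premises, conclusion = inference
        return not all(v[a] for a in premises) or v[conclusion]

    def valid_metainference(premises, conclusion):
        # every valuation satisfying all premise-inferences satisfies the conclusion
        return all(satisfies(v, conclusion)
                   for v in valuations()
                   if all(satisfies(v, i) for i in premises))

    # meta-level transitivity (a form of Cut): from p |- q and q |- r, infer p |- r
    print(valid_metainference([(['p'], 'q'), (['q'], 'r')], (['p'], 'r')))  # True
    ```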
  3.
    Peirce’s diagrammatic system of Existential Graphs (EGα) is a logical proof system corresponding to the Propositional Calculus (PL). Most known proofs of soundness and completeness for EGα depend upon a translation of Peirce’s diagrammatic syntax into that of a suitable Frege-style system. In this paper, drawing upon standard results but using the native diagrammatic notational framework of the graphs, we present a purely syntactic proof of soundness, and hence consistency, for EGα, along with two separate completeness proofs that are constructive in the sense that, in each case, we provide an algorithm to construct an EGα formal proof, starting from the empty Sheet of Assertion, of any expression that is in fact a tautology according to the standard semantics of the system.
    Found 6 days, 22 hours ago on PhilSci Archive
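    For readers unfamiliar with the diagrammatic syntax, a hedged toy encoding of EGα's semantics (my stand-in, not the paper's machinery): juxtaposition on an area reads as conjunction, a cut negates its enclosed area, and the empty sheet is true.

    ```python
    from itertools import product

    # an area is a list of items; an item is an atom (a string) or ('cut', area)
    def eval_area(items, v):
        result = True
        for it in items:
            if isinstance(it, str):
                result = result and v[it]          # juxtaposition = conjunction
            else:
                result = result and not eval_area(it[1], v)  # a cut negates
        return result

    # Peirce's "scroll": a cut enclosing P together with a cut enclosing Q
    scroll = [('cut', ['P', ('cut', ['Q'])])]
    for p, q in product([False, True], repeat=2):
        print(p, q, eval_area(scroll, {'P': p, 'Q': q}))
    ```

    The scroll evaluates exactly like the material conditional P → Q, which is why implications are so natural to draw in the system.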
  4.
    The paper investigates, from a proof-theoretic perspective, various noncontractive logical systems circumventing logical and semantic paradoxes. Until recently, such systems only displayed additive quantifiers (Grišin, Cantini). Systems with multiplicative quantifiers were also proposed in the 2010s (Zardini), but they turned out to be inconsistent with the naive rules for truth or comprehension. We start by presenting a first-order system for disquotational truth with additive quantifiers and compare it with Grišin set theory. We then analyze the reasons behind the inconsistency phenomenon affecting multiplicative quantifiers: after interpreting the exponentials in affine logic as vacuous quantifiers, we show how such a logic can be simulated within a truth-free fragment of a system with multiplicative quantifiers. Finally, we prove that the logic of these multiplicative quantifiers (but without disquotational truth) is consistent, by showing that an infinitary version of the cut rule can be eliminated. This paves the way to a syntactic approach to the proof theory of infinitary logic with infinite sequents.
    Found 1 week, 1 day ago on Carlo Nicolai's site
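    To see why dropping contraction blocks the semantic paradoxes, recall the standard Curry-style derivation (a textbook sketch, not this paper's analysis), where c is a sentence equivalent to T(c) → ⊥ and T is a naive truth predicate:

    ```latex
    \begin{array}{ll}
    1. & T(c) \rightarrow (T(c) \rightarrow \bot) \quad \text{(unfolding } c\text{)}\\
    2. & T(c) \rightarrow \bot \quad \text{(1, contraction)}\\
    3. & c \quad \text{(2 is just } c \text{ folded back)}\\
    4. & T(c) \quad \text{(naive truth applied to 3)}\\
    5. & \bot \quad \text{(modus ponens, 2 and 4)}
    \end{array}
    ```

    Step 2 is the only place contraction is used; noncontractive systems in the Grišin tradition stop the derivation exactly there.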
  5.
    We present completeness results for inference in Bayesian networks with respect to two different parameterizations, namely the number of variables and the topological vertex separation number. For this we introduce the parameterized complexity classes W[1]PP and XLPP, which relate to W[1] and XNLP respectively as PP does to NP. The second parameter is intended as a natural translation of the notion of pathwidth to the case of directed acyclic graphs, and as such it is a stronger parameter than the more commonly considered treewidth. Based on a recent conjecture, the completeness results for this parameter suggest that deterministic algorithms for inference require exponential space in terms of pathwidth and by extension treewidth. These results are intended to contribute towards a more precise understanding of the parameterized complexity of Bayesian inference and thus of its required computational resources in terms of both time and space. Keywords: Bayesian networks; inference; parameterized complexity theory.
    Found 1 week, 1 day ago on Johan Kwisthout's site
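    As a reminder of what "inference" means here, a tiny sketch of posterior computation by full enumeration in a toy network (made-up CPTs and names; the exponential cost in the number of variables is exactly what the complexity analysis is about):

    ```python
    from itertools import product

    # toy DAG: Rain -> Sprinkler, and (Rain, Sprinkler) -> WetGrass
    def joint(r, s, w):
        pr = 0.2 if r else 0.8
        ps = (0.01 if s else 0.99) if r else (0.40 if s else 0.60)
        pw_true = 0.99 if (r and s) else 0.90 if r else 0.80 if s else 0.05
        return pr * ps * (pw_true if w else 1 - pw_true)

    # P(Rain = true | WetGrass = true), summing out Sprinkler
    num = sum(joint(True, s, True) for s in (False, True))
    den = sum(joint(r, s, True) for r, s in product((False, True), repeat=2))
    print(num / den)  # ~0.39
    ```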
  6.
    According to the standard analysis of degree questions (see, among others, Rullmann 1995 and Beck and Rullmann 1997), a degree question’s LF contains a variable that ranges over individual degrees and is bound by the degree-question operator how. In contrast with this, we claim that the variable bound by the degree-question operator how does not range over individual degrees but over intervals of degrees, by analogy with Schwarzschild and Wilkinson’s (2002) proposal regarding the semantics of comparative clauses. Not only does the interval-based semantics predict the existence of certain readings that are not predicted under the standard view, it is also able, together with other natural assumptions, to account for the sensitivity of degree questions to negative islands, as well as for the fact, uncovered by Fox and Hackl (2007), that negative islands can be obviated by some properly placed modals. Like Fox and Hackl (2007), we characterize negative island effects as arising from the fact that the relevant question, due to its meaning alone, can never have a maximally informative answer. Contrary to Fox and Hackl (2007), however, we do not need to assume that scales are universally dense, nor that the notion of maximal informativity responsible for negative islands is blind to contextual parameters.
    Found 1 week, 3 days ago on Benjamin Spector's site
  7.
    Marton (2019) argues that it follows from the standard antirealist theory of truth, which states that truth and possible knowledge are equivalent, that knowing possibilities is equivalent to the possibility of knowing, whereas these notions should be distinct. Moreover, he argues that the usual strategies for dealing with the Church-Fitch paradox of knowability either cannot handle his modal-epistemic collapse result or can do so only at a high price. Against this, I argue that Marton’s paper does not present any seriously novel challenge to antirealism not already found in the Church-Fitch result. Furthermore, Edgington’s (1985) reformulated antirealist theory of truth can deal with his modal-epistemic collapse argument at no cost.
    Found 2 weeks ago on PhilPapers
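    For orientation, the Church-Fitch result presupposed here runs roughly as follows (a standard reconstruction, not Marton's own formulation), with K for "it is known that" and KT the knowability thesis:

    ```latex
    \begin{array}{ll}
    \text{(KT)} & \forall q\,(q \rightarrow \Diamond K q)\\
    (1) & p \wedge \neg K p \quad \text{(suppose some truth is unknown)}\\
    (2) & \Diamond K (p \wedge \neg K p) \quad \text{(KT applied to 1)}\\
    (3) & K(p \wedge \neg K p) \rightarrow (K p \wedge K \neg K p) \quad \text{(K distributes over conjunction)}\\
    (4) & K \neg K p \rightarrow \neg K p \quad \text{(factivity of K)}\\
    (5) & \neg\Diamond K (p \wedge \neg K p) \quad \text{(3, 4: such knowledge would be contradictory)}\\
    (6) & \bot \quad \text{(2, 5)}
    \end{array}
    ```

    Discharging the supposition yields that every truth is in fact known, the collapse that both this entry and the Schlöder entry below are responding to.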
  8.
    Scoring rules measure the accuracy or epistemic utility of a credence assignment. A significant literature uses plausible conditions on scoring rules on finite sample spaces to argue both for probabilism—the doctrine that credences ought to satisfy the axioms of probability—and for the optimality of Bayesian update as a response to evidence. I prove a number of formal results regarding scoring rules on infinite sample spaces that impact the extension of these arguments to infinite sample spaces. A common condition in the arguments for probabilism and Bayesian update is strict propriety: that according to each probabilistic credence, the expected accuracy of any other credence is worse. Much of the discussion needs to divide depending on whether we require finite or countable additivity of our probabilities. I show that in a number of natural infinite finitely additive cases, there simply do not exist strictly proper scoring rules, and the prospects for arguments for probabilism and Bayesian update are limited. In many natural infinite countably additive cases, on the other hand, there do exist strictly proper scoring rules that are continuous on the probabilities and that support arguments for Bayesian update, but they do not support arguments for probabilism. There may be more hope for accuracy-based arguments if we drop the assumption that scores are extended-real-valued. I sketch a framework for scoring rules whose values are nets of extended reals, and show the existence of strictly proper net-valued scoring rules in all infinite cases, both for finitely and for countably additive probabilities. These can be used in an argument for Bayesian update, but it is not at present known what can be said about probabilism in this case.
    Found 2 weeks ago on PhilSci Archive
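    A hedged finite-space illustration of strict propriety, using the Brier (quadratic) score; the numbers are mine, and nothing here touches the paper's infinite cases:

    ```python
    import random

    def brier(credence, truth):
        # quadratic score: squared distance from the truth's indicator vector
        return sum((c - (1.0 if i == truth else 0.0)) ** 2
                   for i, c in enumerate(credence))

    p = [0.7, 0.2, 0.1]  # a probabilistic credence on a 3-point space
    random.seed(0)
    for _ in range(1000):
        q = [random.random() for _ in range(3)]
        s = sum(q)
        q = [x / s for x in q]  # a rival probabilistic credence
        exp_p = sum(p[i] * brier(p, i) for i in range(3))
        exp_q = sum(p[i] * brier(q, i) for i in range(3))
        assert exp_p <= exp_q + 1e-12  # p expects itself to score best
    print("sampled check passed: Brier is proper on this finite space")
    ```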
  9.
    Antirealists who hold the knowability thesis, namely that all truths are knowable, have been put on the defensive by the Church-Fitch paradox of knowability. Rejecting the non-factivity of the concept of knowability used in that paradox, Edgington has adopted a factive notion of knowability, according to which only actual truths are knowable, and has used this new notion to reformulate the knowability thesis. The result has been argued to be immune to the Church-Fitch paradox, but it has encountered several other triviality objections. In a forthcoming paper, Schlöder defends the general approach taken by Edgington, but amends it to save it in turn from the triviality objections. In this paper I argue, first, that Schlöder’s justification for the factivity of his version of the concept of knowability is vulnerable to criticism, but I also offer an improved justification in the same spirit as his. To the extent that some philosophers are right that our intuitive concept of knowability is factive, it is important to explore factive concepts of knowability that are made formally precise. I subsequently argue that Schlöder’s version of the knowability thesis overgenerates knowledge: it leads to attributions of knowledge where there is ignorance. This fits a general pattern for the research programme initiated by Edgington. The paper also contains preliminary investigations into the internal and logical structure of lines of inquiry, which raise interesting research questions.
    Found 2 weeks ago on PhilPapers
  10.
    One of the main criticisms of the theory of collections of indiscernible objects is that once we quantify over one of them, we are quantifying over all of them, since they cannot be discerned from one another. This is what we call the collapse of quantifiers: ‘There exists an x such that P’ would entail ‘All x are P’. In this paper we argue that there are situations (quantum theory is the sample case) where we do refer to a certain quantum entity, saying that it has a certain property, without thereby attributing that property to all the other indistinguishable entities. Mathematically, within the realm of the theory of quasi-sets Q, we can give sense to this claim. We show that the above-mentioned ‘collapse of quantifiers’ depends on the interpretation of the quantifiers and on the mathematical background in which they range. In this way, we hope to strengthen the idea that quantification over indiscernibles, in particular in the quantum domain, does not conform to quantification in the standard sense of classical logic. Keywords: quantification, quantum logic, indiscernibility, identity, indiscernible objects.
    Found 2 weeks, 4 days ago on PhilSci Archive
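    A toy way to see the worry, using classical sets as a stand-in (my illustration, not the quasi-set formalism): if the items are absolutely indiscernible, every expressible property must be invariant under all permutations of the domain, and then it holds of all items or of none.

    ```python
    from itertools import chain, combinations, permutations

    D = [0, 1, 2]  # three "indiscernible" items: every permutation is a symmetry

    def invariant(S):
        # an extension is expressible only if it is fixed by every symmetry
        return all({dict(zip(D, p))[x] for x in S} == S for p in permutations(D))

    subsets = chain.from_iterable(combinations(D, k) for k in range(len(D) + 1))
    print([set(S) for S in subsets if invariant(set(S))])  # [set(), {0, 1, 2}]
    ```

    Quasi-set theory blocks the collapse by changing what quantifying over such items means, rather than by discerning them.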
  11.
    Tables are widely used for storing, retrieving, communicating, and processing information, but in the literature on the study of representations they are still somewhat neglected. The strong structural constraints on tables allow for a clear identification of their characteristic features and of the roles these play in the use of tables as representational and cognitive tools. After introducing the syntactic, spatial, and semantic features of tables, we give an account of how these affect our perception and cognition, on the basis of fundamental principles of Gestalt psychology. We then discuss the ways in which these features support the use of tables for providing global access to information, retrieving information, and visualizing relational structure and patterns. The latter is particularly important, because it shows how tables can contribute to the generation of new knowledge. In addition, tables provide efficient means for manipulating information, both in general and in structured notations. In sum, tables are powerful and efficient representational tools.
    Found 3 weeks, 3 days ago on Dirk Schlimm's site
  12.
    Some years ago, Charles Petzold published his The Annotated Turing which, as its subtitle tells us, provides a guided tour through Alan Turing’s epoch-making 1936 paper. I was prompted at the time to wonder about putting together a similar book, with an English version of Gödel’s 1931 paper interspersed with explanatory comments and asides. …
    Found 3 weeks, 4 days ago on Peter Smith's blog
  13.
    The purpose of this paper is to show that the mathematics of quantum mechanics (QM) is the mathematics of set partitions (which specify indefiniteness and definiteness) linearized to vector spaces, particularly Hilbert spaces. The key analytical concepts are definiteness versus indefiniteness, distinctions versus indistinctions, and distinguishability versus indistinguishability. The key machinery for moving from indefinite to more definite states is the partition join operation at the set level, which prefigures, at the quantum level, projective measurement as well as the formation of maximally definite state descriptions by Dirac’s Complete Sets of Commuting Operators (CSCOs). The mathematics of partitions is first developed in the context of sets and then linearized to vector spaces, where it is shown to provide the mathematical framework for quantum mechanics. This development is measured quantitatively by logical entropy at the set level and by quantum logical entropy at the quantum level. This follow-the-math approach supports the Literal Interpretation of QM, advocated by Abner Shimony among others, which sees a reality of objective indefiniteness quite different from the common-sense and classical view of reality as being “definite all the way down.” Keywords: partitions, direct-sum decompositions, partition join, objective indefiniteness, definite-all-the-way-down.
    Found 3 weeks, 4 days ago on PhilSci Archive
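    The two key gadgets are easy to state concretely; here is a hedged set-level sketch (my encoding) of the partition join and of logical entropy:

    ```python
    def join(p, q):
        # blocks of the join are the nonempty pairwise intersections of blocks;
        # the join refines both partitions, i.e. it is more "definite"
        return [b & c for b in p for c in q if b & c]

    def logical_entropy(p):
        # probability that two independent uniform draws land in different blocks
        n = sum(len(b) for b in p)
        return 1 - sum((len(b) / n) ** 2 for b in p)

    pi    = [{0, 1, 2}, {3, 4, 5}]
    sigma = [{0, 3}, {1, 4}, {2, 5}]
    print(join(pi, sigma))                   # six singletons: maximally definite
    print(logical_entropy(pi))               # 0.5
    print(logical_entropy(join(pi, sigma)))  # 5/6: refining raises the entropy
    ```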
  14.
    Meyer and Mortensen’s Alien Intruder Theorem includes the extraordinary observation that the rationals can be extended to a model of the relevant arithmetic R#, thereby serving as integers themselves. Although the mysteriousness of this observation is acknowledged, little is done to explain why such rationals-as-integers exist or how they operate. In this paper, we show that Meyer and Mortensen’s models can be identified with a class of ultraproducts of finite models of R#, providing insights into some of the more mysterious phenomena of the rational models.
    Found 3 weeks, 6 days ago on PhilPapers
  15.
    In this thesis, I develop and investigate various novel semantic frameworks for deontic logic. Deontic logic concerns the logical aspects of normative reasoning. In particular, it concerns reasoning about what is required, allowed and forbidden. I focus on two main issues: free-choice reasoning and the role of norms in deontic logic. Free-choice reasoning concerns permissions and obligations that offer choices between different actions. Such permissions and obligations are typically expressed by a disjunctive clause in the scope of a deontic operator. For instance, the sentence "Jane may take an apple or a pear" intuitively offers Jane a choice between two permitted courses of action: she may take an apple, and she may take a pear. In the first part of the thesis, I develop semantic frameworks for deontic logic that account for free-choice reasoning. I show that the resulting logics avoid problems that arise for other logical accounts of free-choice reasoning. The main technical contributions are completeness results for axiomatizations of the different logics.
    Found 4 weeks ago on PhilPapers
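    The core difficulty with free-choice permission, which the thesis's frameworks are built to avoid, can be stated in two lines (a standard observation, not the author's specific system), with P for permission:

    ```latex
    \begin{align*}
    \text{(FC)}\quad & P(\varphi \vee \psi) \rightarrow (P\varphi \wedge P\psi)\\
    \text{but}\quad  & P\varphi \rightarrow P(\varphi \vee \psi) \quad \text{(standard deontic logic)}\\
    \text{hence}\quad& P\varphi \rightarrow P\psi \quad \text{for arbitrary } \psi .
    \end{align*}
    ```

    So "Jane may take an apple" would license "Jane may do anything whatsoever"; an adequate free-choice semantics must break one of the two steps.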
  16.
    Over the decades of Bob Meyer’s prodigious career as philosopher and logician, a topic to which he reliably—if intermittently—returned is relevant arithmetic. Fragmented across a series of abstracts, technical reports, and journal articles, Meyer outlined a research program in nonclassical mathematics that rivals that of the intuitionists in its maturity, depth, and perspicacity.
    Found 4 weeks, 1 day ago on PhilPapers
  17.
    We assess Meyer’s formalization of arithmetic in his [21], based on the strong relevant logic R, and compare it with arithmetic based on a suitable logic of meaning containment, which was developed in Brady [7]. We argue in favour of the latter, as it better captures the key logical concepts of meaning and truth in arithmetic. We also contrast the two approaches to classical recapture, again favouring our approach in [7]. We then consider our previous development of Peano arithmetic, including primitive recursive functions, finally extending this work to general recursion.
    Found 4 weeks, 1 day ago on PhilPapers
  18.
    In this paper we introduce a novel way of building arithmetics whose background logic is R. The purpose of doing so is to point in the direction of a novel family of systems that could be candidates for being the infamous R#½ that Meyer suggested we look for.
    Found 4 weeks, 1 day ago on PhilPapers
  19.
    The bibliography appearing below collects the publications in which Meyer’s investigations into relevant arithmetic saw print. Each bibliographic item is accompanied by a short description of the text or other remarks. We include papers on relevant arithmetic coauthored by Meyer, but omit both Meyer’s work on relevant logic and the work published independently by his collaborators.
    Found 4 weeks, 1 day ago on PhilPapers
  20.
    The system R# of first-order relevant arithmetic was introduced in [12] as the result of adding the (first-order version of the) Peano postulates to the relevant predicate calculus RQ. The following model was exhibited to show the system non-trivial (thus partially circumventing Gödel’s Second Theorem). We pick as our domain D of objects the integers mod 2, with +, ·, 0 interpreted in the obvious way; on this plan, the successor operation is evidently interpreted so that 0′ = 1 and 1′ = 0. As our collection V of truth-values we pick the set 3 = {T, N, F}, with the sentential connectives &, ∨, ∼, → defined on the (classical) subset 2 = {T, F} in the usual classical way. To complete the definition of the connectives on 3, we define …
    Found 4 weeks, 1 day ago on PhilPapers
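    The excerpt is cut off just before the three-valued tables. A hedged reconstruction: the completion standardly associated with Meyer's proof is the logic RM3, with N the middle value and T, N designated; the encoding below is mine.

    ```python
    # Hedged sketch: identifying the completed tables with three-valued RM3
    # (my assumption; the abstract breaks off before stating them).
    T, N, F = 2, 1, 0          # truth values, ordered F < N < T
    DESIGNATED = {T, N}

    def neg(a): return 2 - a
    def conj(a, b): return min(a, b)
    def disj(a, b): return max(a, b)
    def imp(a, b):             # RM3 implication
        return disj(neg(a), b) if a <= b else F

    def eq(m, n):              # atomic equations over the integers mod 2
        return T if m % 2 == n % 2 else F

    # 0 = 1+1 is designated, 0 = 1 is not; since theorems take designated
    # values in the model, 0 = 1 is unprovable and the system is non-trivial.
    print(eq(0, 1 + 1) in DESIGNATED, eq(0, 1) in DESIGNATED)  # True False
    ```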
  21.
    We discuss a well-known puzzle about the lexicalization of logical operators in natural language, in particular connectives and quantifiers. Of the many logically possible operators, only a few appear in the lexicon of natural languages: the connectives in English, for example, are conjunction and, disjunction or, and negated disjunction nor; the lexical quantifiers are all, some and no. The logically possible nand (negated conjunction) and nall (negated universal) are not expressed by lexical entries in English, nor in any natural language. Moreover, the lexicalized operators are all upward or downward monotone, an observation known as the Monotonicity Universal. We propose a logical explanation of the lexical gaps and of the Monotonicity Universal, based on the dynamic behaviour of connectives and quantifiers. We define update potentials for logical operators as procedures to modify the context, under the assumption that an update by φ depends on the logical form of φ and on the speech act performed: assertion or rejection. We conjecture that the adequacy of update potentials determines the limits of lexicalizability for logical operators in natural language. Finally, we show that on this framework the …
    Found 1 month ago on Luca Incurvati's site
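    The Monotonicity Universal can be checked by brute force on a small domain; a sketch (my encoding), testing right-argument monotonicity of the determiners mentioned, with the unlexicalized nall included for contrast:

    ```python
    from itertools import chain, combinations

    D = list(range(4))
    SUBSETS = [set(c) for c in
               chain.from_iterable(combinations(D, k) for k in range(len(D) + 1))]

    DET = {
        'all':  lambda A, B: A <= B,
        'some': lambda A, B: bool(A & B),
        'no':   lambda A, B: not (A & B),
        'nall': lambda A, B: not (A <= B),   # negated universal: unlexicalized
    }

    def up(det):    # upward monotone in the second argument
        return all(det(A, B2) for A in SUBSETS for B in SUBSETS
                   for B2 in SUBSETS if B <= B2 and det(A, B))

    def down(det):  # downward monotone in the second argument
        return all(det(A, B2) for A in SUBSETS for B in SUBSETS
                   for B2 in SUBSETS if B2 <= B and det(A, B))

    for name, det in DET.items():
        print(name, 'up' if up(det) else 'down' if down(det) else 'non-monotone')
    ```

    Note that nall passes the monotonicity test too, which is why monotonicity alone cannot explain the lexical gaps; hence the appeal to update potentials.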
  22.
    When are two formal theories of broadly logical concepts, such as truth, equivalent? The paper investigates a case study, involving two well-known variants of Kripke–Feferman truth. The first, KF + CONS, features a consistent but partial truth predicate. The second, KF + COMP, features an inconsistent but complete truth predicate. It is known that the two truth predicates are dual to each other. We show that this duality reveals a much stricter correspondence between the two theories: they are intertranslatable. Intertranslatability, under natural assumptions, coincides with definitional equivalence, and is arguably the strictest notion of theoretical equivalence different from logical equivalence. The case of KF + CONS and KF + COMP raises a puzzle: the two theories can be proved to be strictly related, yet they appear to embody remarkably different conceptions of truth. We discuss the significance of the result for the broader debate on formal criteria of conceptual reducibility for theories of truth.
    Found 1 month ago on Carlo Nicolai's site
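    The duality driving the result can be displayed in one line (a rough sketch of mine; the paper's translation is more careful): sending Tφ to ¬T¬φ swaps consistency and completeness,

    ```latex
    \begin{align*}
    \tau(T\varphi) &:= \neg T\neg\varphi\\
    \tau\bigl(\underbrace{\neg(T\varphi \wedge T\neg\varphi)}_{\mathrm{CONS}}\bigr)
      &= \neg(\neg T\neg\varphi \wedge \neg T\varphi)
      \;\equiv\; \underbrace{T\varphi \vee T\neg\varphi}_{\mathrm{COMP}},
    \end{align*}
    ```

    where the last step uses the equivalence of Tφ and T¬¬φ, which KF proves.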
  23.
    The paper investigates the impact of weakened forms of transitivity of the betterness relation on the logic of conditional obligation, originating from the work of Hansson, Lewis, and others. These weakened forms of transitivity come from the rational choice literature and include: quasi-transitivity, Suzumura consistency, acyclicity, and the interval order condition. The first observation is that plain transitivity, quasi-transitivity, acyclicity and Suzumura consistency make less difference to the logic of ○(−/−) than one would have thought: the axiomatic system remains the same whether or not these conditions are introduced. The second observation is that, unlike the others, the interval order condition corresponds to a new axiom, known as the principle of disjunctive rationality. These two observations are substantiated further through the establishment of completeness (or representation) theorems.
    Found 1 month ago on X. Parent's site
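    The weakened conditions come with crisp finite tests. A sketch using the standard rational-choice definitions (my encoding), where R is weak betterness and P its strict (asymmetric) part:

    ```python
    def strict(R):
        return {(a, b) for (a, b) in R if (b, a) not in R}

    def tc(R):  # transitive closure
        C = set(R)
        while True:
            new = {(a, d) for (a, b) in C for (c, d) in C if b == c} - C
            if not new:
                return C
            C |= new

    def transitive(R):       return tc(R) == set(R)
    def quasi_transitive(R): return tc(strict(R)) == strict(R)
    def acyclic(R):          return all((a, a) not in tc(strict(R))
                                        for a, _ in strict(R))
    def suzumura(R):         return all((b, a) not in strict(R)
                                        for (a, b) in tc(R))
    def interval_order(P):   # Ferrers condition on a strict relation P
        return all((a, d) in P or (c, b) in P
                   for (a, b) in P for (c, d) in P)

    # a tie-laden betterness relation: a ~ b, b ~ c, yet a strictly better than c
    R = {(x, x) for x in 'abc'} | {('a','b'), ('b','a'), ('b','c'), ('c','b'), ('a','c')}
    print(transitive(R), quasi_transitive(R), suzumura(R), acyclic(R))
    # False True False True: quasi-transitive and acyclic, but neither
    # transitive nor Suzumura consistent
    ```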
  24.
    Since the publication of Allais (1953), presenting the famous “Allais paradox,” the Expected Utility (EU) theory of von Neumann–Morgenstern (vNM) (1947) has been the subject of numerous generalizations designed to cope with the difficulties it raised (and with other difficulties as well). It is beyond the scope of this paper to provide even the briefest survey of these generalizations, some of the most recent of which are Kahneman and Tversky (1979), Machina (1982), Quiggin (1982), Yaari (1987), Segal (1984), Chew (1984), Fishburn (1985, among others), and Dekel (1986) (see Machina (1987) for a survey of the literature). This paper suggests yet another generalization of EU theory which may explain the Allais paradox. Our purpose is to provide an axiomatically based model which satisfies the following requirements: (a) It is not too general, i.e., not more general than necessary in order to explain the Allais paradox and its variations. The model should not allow violations of EU theory that are not supported by empirical results (or by intuitive reasoning), as those supporting the Allais paradox are.
    Found 1 month ago on Itzhak Gilboa's site
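    For readers who want the paradox itself on the table, a quick numerical check (standard lotteries, my encoding) that no vNM utility function rationalizes the typical choices:

    ```python
    import random

    # Allais lotteries as probabilities over the prizes ($5M, $1M, $0)
    A = (0.00, 1.00, 0.00)   # $1M for sure
    B = (0.10, 0.89, 0.01)
    C = (0.00, 0.11, 0.89)
    D = (0.10, 0.00, 0.90)

    def eu(lottery, u):
        return sum(p * x for p, x in zip(lottery, u))

    random.seed(0)
    for _ in range(1000):
        u = sorted((random.random() for _ in range(3)), reverse=True)
        # the common 0.89 consequence cancels, so EU(A)-EU(B) == EU(C)-EU(D)
        assert abs((eu(A, u) - eu(B, u)) - (eu(C, u) - eu(D, u))) < 1e-9
    print("A over B iff C over D under EU; the modal pattern (A and D) violates it")
    ```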
  25.
    Some duality problems in expected utility theory, raised by the introduction of non-additive probabilities, are examined. These problems do not arise if the probability measure is symmetric, i.e., has the property of complementary additivity. Additional mild properties of coherence of conditional probabilities imply full additivity of the unconditional measure.
    Found 1 month ago on Itzhak Gilboa's site
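    A concrete instance of the property at issue (my example): a capacity on a three-point space that is complementary additive, i.e. v(A) + v(Aᶜ) = 1 for every A, yet not additive.

    ```python
    from itertools import chain, combinations

    omega = {1, 2, 3}
    def v(A):  # a symmetric capacity: its value depends only on |A|
        return {0: 0.0, 1: 0.2, 2: 0.8, 3: 1.0}[len(A)]

    for A in chain.from_iterable(combinations(omega, k) for k in range(4)):
        assert abs(v(set(A)) + v(omega - set(A)) - 1.0) < 1e-12  # complementary additivity
    print(v({1}) + v({2}), "!=", v({1, 2}))  # 0.4 != 0.8: not additive
    ```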
  26.
    Edmund Landau’s conjecture states that the set P_{n²+1} of primes of the form n² + 1 is infinite. Landau’s conjecture implies the following unproven statement Φ: card(P_{n²+1}) < ω ⇒ P_{n²+1} ⊆ [2, (((24!)!)!)!]. We heuristically justify the statement Φ. This justification does not yield the finiteness/infiniteness of P_{n²+1}. We present a new heuristic argument for the infiniteness of P_{n²+1}, which is not based on the statement Φ. The distinction between algorithms whose existence is provable in ZFC and constructively defined algorithms which are currently known inspires statements and open problems on decidable sets X ⊆ N that contain informal notions and refer to the current knowledge on X.
    Found 1 month ago on PhilSci Archive
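    For concreteness, the set in question begins 2, 5, 17, 37, 101, ...; a quick enumeration sketch (mine):

    ```python
    def is_prime(m):
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True

    # primes of the form n^2 + 1 with n <= 100; Landau's conjecture says this
    # list never stops growing as the bound on n increases
    print([n * n + 1 for n in range(1, 101) if is_prime(n * n + 1)][:10])
    # [2, 5, 17, 37, 101, 197, 257, 401, 577, 677]
    ```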
  27.
    We introduce Arbitrary Public Announcement Logic with Memory (APALM), obtained by adding to the models a ‘memory’ of the initial states, representing the information before any communication took place (“the prior”), and adding to the syntax operators that can access this memory. We show that APALM is recursively axiomatizable (in contrast to the original Arbitrary Public Announcement Logic, for which the corresponding question is still open). We present a complete recursive axiomatization, which includes a natural finitary rule, and study this logic’s expressivity and the appropriate notion of bisimulation. We then examine Group Announcement Logic with Memory (GALM), the extension of APALM obtained by adding to its syntax group announcement operators, and provide a complete finitary axiomatization (again in contrast to the original Group Announcement Logic, for which the only known axiomatization is infinitary). We also show that, in the memory-enhanced context, there is a natural reduction of the so-called coalition announcement modality to group announcements (in contrast to the memory-free case, where this natural translation was shown to be invalid).
    Found 1 month ago on Aybüke Özgün's site
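    A toy rendering (hypothetical encoding, not the paper's semantics) of the one structural novelty: the memory of the initial states, which announcements never erase.

    ```python
    class Model:
        def __init__(self, worlds, val):
            self.initial = frozenset(worlds)  # the remembered "prior" domain
            self.current = set(worlds)        # shrinks with announcements
            self.val = val                    # val[world] = set of true atoms

        def announce(self, phi):
            # public announcement: keep the current worlds where phi holds,
            # while the memory of the initial states stays untouched
            self.current = {w for w in self.current if phi(self, w)}

    m = Model({1, 2, 3}, {1: {'p'}, 2: {'p', 'q'}, 3: set()})
    m.announce(lambda m, w: 'p' in m.val[w])
    print(m.current, m.initial)  # {1, 2} frozenset({1, 2, 3})
    ```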
  28.
    Consider a fair spinner that uniformly chooses an angle between 0° and 360°. Intuitively, I’ve just fully described a probabilistic situation. In classical probability theory, there is indeed a very natural model of this: Lebesgue probability measure on the unit circle. …
    Found 1 month, 2 weeks ago on Alexander Pruss's Blog
  29.
    I describe two approaches to modelling the universe, the one having its origin in topos theory and differential geometry, the other in set theory. The first is synthetic differential geometry. Traditionally, there have been two methods of deriving the theorems of geometry: the analytic and the synthetic. While the analytical method is based on the introduction of numerical coordinates, and so on the theory of real numbers, the idea behind the synthetic approach is to furnish the subject of geometry with a purely geometric foundation in which the theorems are then deduced by purely logical means from an initial body of postulates.
    Found 1 month, 2 weeks ago on John Bell's site
  30.
    Let’s suppose you think that there are no uncountable sets. Have you adopted a restrictive position? It is certainly tempting to say yes—you’ve prohibited the existence of certain kinds of large set. This paper challenges that intuition. I argue that there are considerations, based on a formal notion of restrictiveness, which suggest that it is instead restrictive to hold that there are uncountable sets.
    Found 1 month, 2 weeks ago on PhilPapers