1.
    Last week, I explained how you can give an accuracy dominance argument for Probabilism without assuming that your inaccuracy measures are additive -- that is, without assuming that the inaccuracy of a whole credence function is obtained by adding up the inaccuracy of all the individual credences that it assigns. …
    Found 1 day, 4 hours ago on M-Phi
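    A minimal numerical sketch of the dominance phenomenon at issue (my toy numbers, using the additive Brier score rather than the post's non-additive measures): a non-probabilistic credence function over a partition is accuracy-dominated by a probabilistic one at every world.
```python
# Illustrative check: a non-probabilistic credence function over {A, not-A}
# is Brier-dominated by a probabilistic one at every world.
# (Toy numbers; the post's point is that such arguments can be given without
# assuming additivity -- this sketch simply uses the additive Brier score.)

def brier(credences, world):
    """Sum of squared distances from the truth values at a world."""
    return sum((c - t) ** 2 for c, t in zip(credences, world))

c = (0.3, 0.3)   # non-probabilistic: credences in A and not-A sum to 0.6
p = (0.5, 0.5)   # a probabilistic credence function over the same partition

for world in [(1, 0), (0, 1)]:          # A true / A false
    print(world, brier(c, world), brier(p, world))
# p has strictly lower Brier inaccuracy than c at both worlds,
# so c is accuracy-dominated.
```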
  2.
    In a recent paper, Barrio, Tajer and Rosenblatt establish a correspondence between metainferences holding in the strict-tolerant logic of transparent truth ST and inferences holding in the logic of paradox LP. They argue that LP is ST's external logic and they question whether ST's solution to the semantic paradoxes is fundamentally different from LP's. Here we establish that, by parity of reasoning, ST can be related to LP's dual logic K3. We clarify the distinction between internal and external logic and argue that while ST's nonclassicality can be granted, its self-dual character does not tie it to LP more closely than to K3.
    Found 3 days, 10 hours ago on David Ripley's site
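    For readers who want the background definitions (standard folklore, not results from the paper): over Strong Kleene valuations with values in {0, 1/2, 1}, the three consequence relations can be stated as follows.
```latex
% Standard Strong Kleene background (folklore, not the paper's own results):
% valuations v assign values in {0, 1/2, 1}; the three logics differ only in
% how validity is defined.
\Gamma \vDash_{\mathrm{K3}} \varphi \iff \forall v:\ (\forall\gamma\in\Gamma\ v(\gamma)=1) \Rightarrow v(\varphi)=1
\Gamma \vDash_{\mathrm{LP}} \varphi \iff \forall v:\ (\forall\gamma\in\Gamma\ v(\gamma)\ge\tfrac12) \Rightarrow v(\varphi)\ge\tfrac12
\Gamma \vDash_{\mathrm{ST}} \varphi \iff \forall v:\ (\forall\gamma\in\Gamma\ v(\gamma)=1) \Rightarrow v(\varphi)\ge\tfrac12
```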
  3.
    This paper presents a novel typed term calculus and a reduction relation for it, and proves that the reduction relation is strongly normalizing—that there are no infinite reduction sequences. The calculus is similar to the simply-typed lambda calculus with an empty type, but with a twist. The simply-typed lambda calculus with an empty type bears a close relation to the →, ⊥ fragment of intuitionistic logic ([Howard; Scherer, 2017; Sørensen and Urzyczyn, 2006]); the calculus presented here bears a similar relation to the →, ¬ fragment of a logic known as core logic. Because of this connection, I'll call the calculus core type theory.
    Found 3 days, 14 hours ago on David Ripley's site
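    For reference, here is the standard term assignment for the →, ⊥ fragment of intuitionistic logic that the abstract compares against (the familiar system, not core type theory itself).
```latex
% Standard term assignment for the ->, bottom fragment of intuitionistic logic
% (the comparison system named in the abstract, not core type theory itself).
\frac{\Gamma,\, x : A \vdash t : B}{\Gamma \vdash \lambda x.\, t : A \to B}
\qquad
\frac{\Gamma \vdash t : A \to B \quad \Gamma \vdash u : A}{\Gamma \vdash t\, u : B}
\qquad
\frac{\Gamma \vdash t : \bot}{\Gamma \vdash \mathsf{abort}(t) : A}
```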
  4.
    For a PDF of this post, see here. One of the central arguments in accuracy-first epistemology -- the one that gets the project off the ground, I think -- is the accuracy-dominance argument for Probabilism. …
    Found 4 days, 13 hours ago on M-Phi
  5.
    Consumption decisions are partly influenced by values and ideologies. Consumers care about global warming as well as about child labor and fair trade. Incorporating values into the consumer's utility function will often violate monotonicity, in case consumption hurts cherished values in a way that isn't offset by the hedonic benefits of material consumption. We distinguish between intrinsic and instrumental values, and argue that the former tend to introduce discontinuities near zero. For example, a vegetarian's preferences would be discontinuous near zero amount of animal meat. We axiomatize a utility representation that captures such preferences and discuss the measurability of the degree to which consumers care about such values.
    Found 1 week ago on Itzhak Gilboa's site
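    A toy illustration of the kind of discontinuity described (my invented utility function, not the paper's axiomatized representation): any strictly positive amount of meat triggers a fixed values penalty, so utility jumps at zero.
```python
# Toy illustration (not the paper's axiomatized representation): a utility
# function over (meat, other consumption) that is discontinuous at meat = 0,
# as in the vegetarian example -- any strictly positive amount of meat incurs
# a fixed "values" penalty, so utility jumps as meat -> 0+.

def utility(meat, other, penalty=10.0):
    hedonic = other + 0.1 * meat           # ordinary monotone hedonic part
    values = -penalty if meat > 0 else 0.0  # intrinsic-value penalty, discontinuous at 0
    return hedonic + values

print(utility(0.0, 5.0))    # 5.0
print(utility(1e-6, 5.0))   # ~ -5.0: a tiny positive amount already violates the value
```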
  6.
    The relation between causal structure and cointegration and long-run weak exogeneity is explored using some ideas drawn from the literature on graphical causal modeling. It is assumed that the fundamental source of trending behavior is transmitted from exogenous (and typically latent) trending variables to a set of causally ordered variables that would not themselves display nonstationary behavior if the nonstationary exogenous causes were absent. The possibility of inferring the long-run causal structure among a set of time-series variables from an exhaustive examination of weak exogeneity in irreducibly cointegrated subsets of variables is explored and illustrated.
    Found 1 week ago on Kevin D. Hoover's site
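    A minimal simulation of the mechanism described (my own illustration, not the paper's model): a latent random-walk trend drives two causally ordered variables, which are nonstationary only because of that trend and therefore cointegrate.
```python
# Minimal simulation (my illustration, not the paper's model): a latent
# random-walk trend Z drives X, and X drives Y.  X and Y are nonstationary
# only because of the exogenous trend; the linear combination Y - 2*X removes
# the trend and is stationary, i.e. X and Y cointegrate.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
Z = np.cumsum(rng.normal(size=T))     # latent nonstationary cause
X = 1.0 * Z + rng.normal(size=T)      # X inherits the trend from Z
Y = 2.0 * X + rng.normal(size=T)      # Y inherits it via X

# Y wanders far from zero; the cointegrating residual Y - 2X does not.
print(np.std(Y), np.std(Y - 2.0 * X))
```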
  7.
    In linguistics, the dominant approach to the semantics of plurals appeals to mereology. However, this approach has received strong criticisms from philosophical logicians who subscribe to an alternative framework based on plural logic. In the first part of the article, we offer a precise characterization of the mereological approach and the semantic background in which the debate can be meaningfully reconstructed. In the second part, we deal with the criticisms and assess their logical, linguistic, and philosophical significance. We identify four main objections and show how each can be addressed. Finally, we compare the strengths and shortcomings of the mereological approach and plural logic. Our conclusion is that the former remains a viable and well-motivated framework for the analysis of plurals.
    Found 1 week ago on David Nicolas's site
  8.
    Let L be a sentential (object) language containing atoms 'A', 'B', . . . , and two logical connectives '&' and '→'. In addition to these two logical connectives, L will also contain another binary connective '⇒', which is intended to be interpreted as the English indicative. In the meta-language for L, we will have two meta-linguistic operations: '⊢' and '⊨'. '⊢' is a binary relation between individual sentences in L. It will be interpreted as "single premise entailment" (or "single premise deducibility in L"). '⊨' is a monadic predicate on sentences of L. It will be interpreted as "logical truth of the logic of L" (or "theorem of the logic of L"). We will not presuppose anything about the relationship between '⊢' and '⊨'. Rather, we will state explicitly all assumptions about these meta-theoretic relations that will be required for Gibbard's Theorem. Below, I report a new version of Gibbardian Collapse. First, two preliminary remarks: (a) the "if. . . then" and "and" I'm using in the meta-meta-language of L to state the assumptions of the theorem are assumed to be classical, and (b) these assumptions are all schematic (i.e., they are to be interpreted as allowing any instances that can be formed from sentences of L). We begin with eight (8) background assumptions, which are purely formal renditions of some of Gibbard's presuppositions in his collapse argument. Think of this as a (very weak) background logic for ⟨⇒, &⟩.
    Found 1 week, 3 days ago on Branden Fitelson's site
  9.
    This paper introduces the logic of evidence and truth LETF as an extension of the Belnap-Dunn four-valued logic FDE. LETF is a slightly modified version of the logic LETJ, presented in Carnielli and Rodrigues (2017). While LETJ is equipped only with a classicality operator ○, LETF is equipped with a non-classicality operator ● as well, dual to ○. Both LETF and LETJ are logics of formal inconsistency and undeterminedness in which the operator ○ recovers classical logic for propositions in its scope. Evidence is a notion weaker than truth in the sense that there may be evidence for a proposition α even if α is not true. Like LETJ, LETF is able to express preservation of evidence and preservation of truth. The primary aim of this paper is to propose a probabilistic semantics for LETF where statements P(α) and P(○α) express, respectively, the amount of evidence available for α and the degree to which the evidence for α is expected to behave classically – or non-classically for P(●α). A probabilistic scenario is paracomplete when P(α) + P(¬α) < 1, and paraconsistent when P(α) + P(¬α) > 1, and in both cases, P(○α) < 1. If P(○α) = 1, or P(●α) = 0, classical probability is recovered for α. The proposition ○α ∨ ●α, a theorem of LETF, partitions what we call the information space, and thus allows us to obtain some new versions of known results of standard probability theory.
    Found 1 week, 5 days ago on PhilSci Archive
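    A small check of the probabilistic conditions quoted in the abstract, with invented numbers.
```python
# Illustrative numbers only: classify toy probability assignments using the
# conditions quoted in the abstract for the probabilistic semantics of LETF.
def classify(p_a, p_not_a):
    total = p_a + p_not_a
    if total < 1:
        return "paracomplete scenario: P(a) + P(~a) < 1"
    if total > 1:
        return "paraconsistent scenario: P(a) + P(~a) > 1"
    return "classical-looking scenario: P(a) + P(~a) = 1"

print(classify(0.4, 0.3))   # evidence gaps
print(classify(0.7, 0.6))   # conflicting evidence
print(classify(0.7, 0.3))   # classical probability recovered
```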
  10.
    Here's a neat little result about Bregman divergences that I just happened upon. It might help to prove some more decomposition theorems along the lines of this classic result by Morris DeGroot and Stephen Fienberg and, more recently, this paper by my colleagues in the computer science department at Bristol. …
    Found 2 weeks, 2 days ago on M-Phi
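    For reference, the standard definition of a Bregman divergence (background only, not the new result from the post).
```latex
% Standard definition of the Bregman divergence generated by a strictly
% convex, differentiable function F (background, not the post's new result).
D_F(x, y) \;=\; F(x) - F(y) - \langle \nabla F(y),\, x - y \rangle
% Example: F(x) = \lVert x \rVert^2 gives D_F(x, y) = \lVert x - y \rVert^2,
% the squared Euclidean distance.
```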
  11.
    The Busy Beaver function, with its incomprehensibly rapid growth, has captivated generations of computer scientists, mathematicians, and hobbyists. In this survey, I offer a personal view of the BB function 58 years after its introduction, emphasizing lesser-known insights, recent progress, and especially favorite open problems. Examples of such problems include: when does the BB function first exceed the Ackermann function? Is the value of BB(20) independent of set theory? Can we prove that BB(n+1) > 2^BB(n) for large enough n? Given BB(n), how many advice bits are needed to compute BB(n+1)? Do all Busy Beavers halt on all inputs, not just the 0 input? Is it decidable whether BB(n) is even or odd?
    Found 2 weeks, 5 days ago on Scott Aaronson's site
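    To make the growth questions concrete, here is a tiny sketch using the commonly cited maximum-shift values of BB for n ≤ 4 and the long-standing lower bound for n = 5; note that the inequality BB(n+1) > 2^BB(n) is asked only for large enough n, and indeed mostly fails at these tiny values.
```python
# Commonly cited Busy Beaver (max-shift) values for n <= 4 and a well-known
# lower bound for n = 5; the survey asks whether BB(n+1) > 2**BB(n) holds for
# all sufficiently large n.
bb = {1: 1, 2: 6, 3: 21, 4: 107, 5: 47_176_870}   # bb[5] is a lower bound

for n in range(1, 5):
    lhs, rhs = bb[n + 1], 2 ** bb[n]
    print(f"BB({n+1}) = {lhs:>10}  vs  2^BB({n}) = {rhs}")
# For these very small n the inequality mostly fails, which is why the open
# problem is stated only for large enough n.
```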
  12.
    The received concepts of axiomatic theory and axiomatic method, which stem from David Hilbert, need a systematic revision in view of more recent mathematical and scientific axiomatic practices, which do not fully follow in Hilbert’s steps and re-establish some older historical patterns of axiomatic thinking in unexpected new forms. In this work I motivate, formulate and justify such a revised concept of axiomatic theory, which for a variety of reasons I call constructive, and then argue that it can better serve as a formal representational tool in mathematics and science than the received concept.
    Found 2 weeks, 5 days ago on PhilSci Archive
  13.
    Landauer's principle is, roughly, the principle that there is an entropic cost associated with implementation of logically irreversible operations. Though widely accepted in the literature on the thermodynamics of computation, it has been the subject of considerable dispute in the philosophical literature. Both the cogency of proofs of the principle and its relevance, should it be true, have been questioned. In particular, it has been argued that microscale fluctuations entail dissipation that always greatly exceeds the Landauer bound. In this article Landauer's principle is treated within statistical mechanics, and a proof is given that neither relies on neglect of fluctuations nor assumes the availability of thermodynamically reversible processes. In addition, it is argued that microscale fluctuations are no obstacle to approximating thermodynamic reversibility as closely as one would like.
    Found 2 weeks, 5 days ago on PhilSci Archive
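    For orientation, the standard Landauer bound of kT ln 2 per erased bit (a background fact, not this paper's derivation) evaluates to roughly 2.9 × 10^-21 J at room temperature.
```python
# Standard Landauer bound k*T*ln(2) per erased bit (background fact, not the
# paper's derivation), evaluated at room temperature.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
landauer_bound = k_B * T * math.log(2)
print(f"{landauer_bound:.3e} J per bit")   # ~2.87e-21 J
```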
  14.
    The Busy Beaver Frontier: A life that was all covid, cancellations, and Trump, all desperate rearguard defense of the beleaguered ideals of the Enlightenment, would hardly be worth living. …
    Found 2 weeks, 5 days ago on Scott Aaronson's blog
  15.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: (i) applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of the sub-language increases; (ii) selecting a probability function on the language as a whole whose entropy on finite sublanguages of size n is not dominated by that of any other probability function for sufficiently large n. The entropy-limit conjecture says that, where these two approaches yield determinate probabilities, the two methods yield the same probabilities. If this conjecture is found to be true, it would provide a boost to the project of seeking a single canonical inductive logic—a project which faltered when Carnap’s attempts in this direction succeeded only in determining a continuum of inductive methods. The truth of the conjecture would also boost the project of providing a canonical characterisation of normal or default models of first-order theories.
    Found 2 weeks, 6 days ago on Jon Williamson's site
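    A minimal numerical sketch of the finite-domain case the abstract starts from (the domain and constraint are invented for the example): the maximum-entropy distribution subject to a mean constraint is exponential-family, found here by bisection on the Lagrange multiplier.
```python
# Maximum entropy on a finite domain subject to a mean constraint
# (illustration of the finite-domain principle the abstract mentions;
# the domain {1..6} and target mean 4.5 are invented for the example).
import numpy as np

xs = np.arange(1, 7)
target_mean = 4.5

def mean_for(lam):
    w = np.exp(lam * xs)
    p = w / w.sum()
    return p @ xs

lo, hi = -10.0, 10.0
for _ in range(100):                  # bisection on the Lagrange multiplier
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_for(mid) < target_mean else (lo, mid)

p = np.exp(lo * xs)
p /= p.sum()
print(p.round(4), (p @ xs).round(3))  # maxent distribution with mean ~4.5
```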
  16.
    This paper discusses the scope and significance of the so-called triviality result stated by Allan Gibbard for indicative conditionals, showing that if a conditional operator satisfies the Law of Import-Export, is supraclassical, and is stronger than the material conditional, then it must collapse to the material conditional. Gibbard's result is taken to pose a dilemma for a truth-functional account of indicative conditionals: give up Import-Export, or embrace the two-valued analysis. We show that this dilemma can be averted in trivalent logics of the conditional based on Reichenbach and de Finetti's idea that a conditional with a false antecedent is undefined. Import-Export and truth-functionality hold without triviality in such logics. We unravel some implicit assumptions in Gibbard's proof, and discuss a recent generalization of Gibbard's result due to Branden Fitelson.
    Found 3 weeks, 1 day ago on Lorenzo Rossi's site
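    Schematically, the result described in the abstract has the following shape (my rendering, with ⇒ for the indicative and ⊃ for the material conditional).
```latex
% Schematic statement of the collapse result as described in the abstract
% (my rendering; \Rightarrow is the indicative, \supset the material conditional).
\text{Import--Export:}\quad A \Rightarrow (B \Rightarrow C) \ \dashv\vdash\ (A \wedge B) \Rightarrow C
\text{Supraclassicality:}\quad \text{if } A \vDash_{\mathrm{CL}} C \text{, then } \vdash A \Rightarrow C
\text{Stronger than material:}\quad A \Rightarrow C \ \vdash\ A \supset C
\text{Collapse (conclusion):}\quad A \Rightarrow C \ \dashv\vdash\ A \supset C
```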
  17.
    Let ⪅ be a qualitative probability comparison for some collection F of subsets of a space Ω. Say that A ≈ B iff A ⪅ B and B ⪅ A, and that A < B provided that A ⪅ B but not B ⪅ A. Minimally suppose that ⪅ is a partial preorder (i.e., transitive and reflexive). …
    Found 3 weeks, 4 days ago on Alexander Pruss's Blog
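    A minimal sketch of the derived relations exactly as defined in the excerpt; the concrete comparison used to exercise them is invented.
```python
# Derived relations from a qualitative comparison "A is at most as likely as B",
# exactly as defined in the excerpt; the toy comparison below is invented.
def equally_likely(leq, a, b):
    return leq(a, b) and leq(b, a)          # A ~ B

def strictly_less_likely(leq, a, b):
    return leq(a, b) and not leq(b, a)      # A < B

# Toy comparison on events (sets of die outcomes), ordered by number of
# outcomes, just to exercise the definitions.
leq = lambda a, b: len(a) <= len(b)
A, B = frozenset({1, 2}), frozenset({3, 4, 5})
print(equally_likely(leq, A, B), strictly_less_likely(leq, A, B))  # False True
```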
  18.
    Questions concerning the proof-theoretic strength of classical versus non-classical theories of truth have received some attention recently. A particularly convenient case study concerns classical and nonclassical axiomatizations of fixed-point semantics. It is known that nonclassical axiomatizations in four- or three-valued logics are substantially weaker than their classical counterparts. In this paper we consider the addition of a suitable conditional to First-Degree Entailment – a logic recently studied by Hannes Leitgeb under the label 'HYPE'. We show in particular that, by formulating the theory PKF over HYPE, one obtains a theory that is sound with respect to fixed-point models, while being proof-theoretically on a par with its classical counterpart KF. Moreover, we establish that also its schematic extension – in the sense of Feferman – is as strong as the schematic extension of KF, thus matching the strength of predicative analysis.
    Found 3 weeks, 4 days ago on Carlo Nicolai's site
  19.
    Here's a PDF of this blogpost. In yesterday's post, I discussed Leonid Hurwicz's Criterion of Realism. This is a decision rule for situations in which your evidence is so sparse and your uncertainty so great that you cannot assign probabilities to the possible states of the world. …
    Found 3 weeks, 6 days ago on M-Phi
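    For reference, the usual textbook form of Hurwicz's criterion (the post's own setup may differ): each act is scored by a weighted mix of its best and worst outcomes, with α the optimism weight.
```python
# Standard textbook form of the Hurwicz criterion (the blog post's own setup
# may differ): score each act by a weighted mix of its best and worst outcomes.
def hurwicz_score(outcomes, alpha):
    return alpha * max(outcomes) + (1 - alpha) * min(outcomes)

acts = {"act1": [0, 10], "act2": [4, 5], "act3": [2, 7]}   # invented payoffs
alpha = 0.3                                                # optimism weight
best = max(acts, key=lambda a: hurwicz_score(acts[a], alpha))
print({a: hurwicz_score(v, alpha) for a, v in acts.items()}, best)  # act2 wins
```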
  20.
    The purpose of this paper is to show that the dual notions of elements & distinctions are the basic analytical concepts needed to unpack and analyze morphisms, duality, and universal constructions in Sets, the category of sets and functions. The analysis extends directly to other concrete categories (groups, rings, vector spaces, etc.) where the objects are sets with a certain type of structure and the morphisms are functions that preserve that structure. Then the elements & distinctions-based definitions can be abstracted in a purely arrow-theoretic way for abstract category theory. In short, the language of elements & distinctions is the conceptual language in which the category of sets is written, and abstract category theory gives the abstract arrows version of those definitions.
    Found 4 weeks ago on David Ellerman's site
  21.
    Recently, attention has returned to the now-famous 1932 thought experiment in which John von Neumann establishes the form of the quantum mechanical von Neumann entropy S_VN = −Tr(ρ ln ρ), supposedly by arguing for its correspondence with the phenomenological thermodynamic entropy S_TD. Hemmo and Shenker (2006) reconstruct von Neumann's thought experiment and argue that it fails to establish this desired correspondence. Prunkl (2019) and Chua (2019) challenge Hemmo and Shenker's result in turn. This paper aims to provide a new foundation for the current debate by revisiting the original text (von Neumann (1996, 2018)). A thorough exegesis of von Neumann's cyclical gas transformation is put forth, along with a reconstruction of two additional thought experiments from the text. This closer look reveals that von Neumann's goal is not to establish a link between S_VN and S_TD, as is assumed throughout the current debate, but rather to establish a correspondence between S_VN and the Gibbs statistical mechanical entropy S_G. On these grounds I argue that the existing literature misunderstands and misrepresents his goals. A revised understanding is required before the success of von Neumann's reversible gas transformation can be definitively granted or denied.
    Found 4 weeks, 1 day ago on PhilSci Archive
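    A tiny numerical illustration of the quantity at issue, S_VN = −Tr(ρ ln ρ) (background formula only, nothing specific to the thought experiment): a pure state has entropy 0 and a maximally mixed qubit has entropy ln 2.
```python
# von Neumann entropy S = -Tr(rho ln rho) for two simple density matrices
# (background illustration of the formula discussed in the abstract).
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # 0 * log 0 = 0 by convention
    return float(-(evals * np.log(evals)).sum())

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: entropy 0
mixed = np.eye(2) / 2                       # maximally mixed qubit: entropy ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed), np.log(2))
```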
  22.
    In 1951, Leonid Hurwicz, a Polish-American economist who would go on to share the Nobel prize for his work on mechanism design, published a series of short notes as part of the Cowles Commission Discussion Paper series, where he introduced a new decision rule for choice in the face of massive uncertainty. …
    Found 1 month ago on M-Phi
  23.
    This is a critical exploration of the relation between two common assumptions in anti-computationalist critiques of Artificial Intelligence: The first assumption is that at least some cognitive abilities are specifically human and non-computational in nature, whereas the second assumption is that there are principled limitations to what machine-based computation can accomplish with respect to simulating or replicating these abilities. Against the view that these putative differences between computation in humans and machines are closely related, this essay argues that the boundaries of the domains of human cognition and machine computation might be independently defined, distinct in extension and variable in relation. The argument rests on the conceptual distinction between intensional and extensional equivalence in the philosophy of computing and on an inquiry into the scope and nature of human invention in mathematics, and their respective bearing on theories of computation.
    Found 1 month ago on PhilSci Archive
  24.
    Patrick Suppes' maxim "to axiomatize a theory is to define a set-theoretical predicate" is usually taken as entailing that the formula that defines the predicate needs to be transportable in the sense of Bourbaki. We argue that this holds for theories, where we need to cope with all structures (the models) satisfying the predicate. For instance, in axiomatizing the theory of groups, we need to grasp all groups. But we may be interested in capturing not all structures of a species, but just some of them. In this case, the formula that defines the predicate doesn't need to be transportable. The study of this question has led us to a careful consideration of Bourbaki's definition of transportability, which is usually not found in the literature. In this paper we discuss this topic with examples, recall the notion of transportable formulas, and show that we can have significant set-theoretical predicates for classes of structures defined by non-transportable formulas as well.
    Found 1 month ago on PhilSci Archive
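    As an illustration of Suppes' maxim for the example the abstract itself mentions, here is one way to write the set-theoretical predicate "is a group" (my rendering, not the paper's).
```latex
% Illustration of Suppes' maxim for the example named in the abstract (groups):
% a set-theoretical predicate "is a group" (my rendering, not the paper's).
\mathrm{Grp}(\mathfrak{A}) \iff \exists S\, \exists{\circ}\, \big[\, \mathfrak{A} = \langle S, \circ\rangle
  \wedge \circ : S \times S \to S
  \wedge \forall x, y, z \in S\ (x \circ (y \circ z) = (x \circ y) \circ z)
  \wedge \exists e \in S\ \forall x \in S\ \big(e \circ x = x \circ e = x
  \wedge \exists y \in S\ (x \circ y = y \circ x = e)\big) \,\big]
```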
  25.
    In the causal modelling literature, it is well known that "ill-defined" variables may give rise to "ambiguous manipulations" (Spirtes and Scheines, 2004). Here, we illustrate how ill-defined variables may also induce mistakes in causal inference when standard causal search methods are applied (Spirtes et al.; Pearl, 2009). To address the problem, we introduce a representation framework, which exploits an independent component representation of the data, and demonstrate its potential for detecting ill-defined variables and avoiding mistaken causal inferences.
    Found 1 month ago on PhilSci Archive
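    A generic sketch of an independent-component decomposition with scikit-learn's FastICA (standard usage only; the paper's own representation framework is not reproduced here).
```python
# Generic independent-component decomposition of mixed signals with FastICA
# (standard scikit-learn usage; the paper's own framework is not reproduced here).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
s1 = np.sign(np.sin(np.linspace(0, 20, 500)))   # two independent sources
s2 = rng.uniform(-1, 1, 500)
S = np.c_[s1, s2]
X = S @ np.array([[1.0, 0.5], [0.4, 1.0]])      # observed mixtures

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)                # estimated independent components
print(recovered.shape)
```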
  26.
    It is often loosely said that Ramsey (1931) and de Finetti (1937) proved that if your credences are inconsistent, then you will be willing to accept a Dutch Book, a wager portfolio that is sure to result in a loss. Of course, their theorems are true, but the claim about acceptance of Dutch Books assumes a particular method of calculating expected utilities given the inconsistent credences. I will argue that there are better ways of calculating expected utilities given a potentially inconsistent credence assignment, and that for a large class of credences—a class that includes many inconsistent examples—these ways are immune to Dutch Books and single-shot domination failures. The crucial move is to replace Finite Additivity with Monotonicity (if A ⊆ B, then P(A) ≤ P(B)) and then calculate expected utilities for positive U via the formula ∫₀^∞ P(U > y) dy. This shows that Dutch Book arguments for probabilism, the thesis that one's credences should be consistent, do not establish their conclusion. Finally, I will consider a modified argument based on multi-step domination failure that does better, but nonetheless is not as compelling as the Dutch Book arguments appeared to be.
    Found 1 month ago on Alexander R. Pruss's site
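    A small numerical sketch of the expectation formula quoted in the abstract, applied to a monotone but non-additive toy credence assignment; the particular numbers are invented.
```python
# Expected utility of a positive U computed as the integral of P(U > y) dy
# (the formula quoted in the abstract), approximated numerically for a
# monotone but non-additive toy credence assignment over three states.
states = ["s1", "s2", "s3"]
utility = {"s1": 1.0, "s2": 2.0, "s3": 5.0}

# Invented monotone credence (non-additive: cr of the whole space is less
# than cr({s1}) + cr({s2, s3})).
credence = {
    frozenset(): 0.0,
    frozenset({"s1"}): 0.3, frozenset({"s2"}): 0.3, frozenset({"s3"}): 0.5,
    frozenset({"s1", "s2"}): 0.6, frozenset({"s1", "s3"}): 0.8,
    frozenset({"s2", "s3"}): 0.8, frozenset({"s1", "s2", "s3"}): 1.0,
}

def expected_utility(dy=0.001, y_max=5.0):
    total, y = 0.0, 0.0
    while y < y_max:
        event = frozenset(s for s in states if utility[s] > y)   # the event {U > y}
        total += credence[event] * dy
        y += dy
    return total

print(round(expected_utility(), 2))   # ~3.3
```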
  27.
    The paper is about 'absolute logic': an approach to logic that differs from standard first-order logic and other known approaches. It is a new approach created by the author, proposed as a general and unifying approach to logic and a faithful model of the human mathematical deductive process. In first-order logic there are two different concepts, term and formula; in place of these two concepts, our approach has just one notion of expression. The set-builder notation is included as an expression-building pattern. In our system we can easily express second-order, third-order and any-order conditions. The meaning of a sentence depends solely on the meaning of the symbols it contains; it does not depend on external 'structures'. Our deductive system is based on a very simple definition of proof and provides a good model of the human mathematical deductive process. The soundness and consistency of the system are proved. We also discuss how our system relates to the best-known types of paradoxes, and no specific vulnerability to paradoxes emerges from the discussion. The paper provides both the theoretical material and a fully documented example of deduction.
    Found 1 month ago on PhilSci Archive
  28.
    In the article, an argument is given that Euclidean geometry is a priori in the same way that numbers are a priori: the result of modelling, not the world, but our activities in the world. Keywords: symmetry, Euclidean geometry, axioms, Weyl's axioms, philosophy of geometry. Until the appearance of non-Euclidean geometries, Euclidean geometry and numbers had an equal status in mathematics. Indeed, until then, mathematics was described as the science of numbers and space. Whether it was thought that mathematical objects belong to a special world of ideas (Plato), or that they are ultimate abstractions drawn from the real world (Aristotle), or that they are a priori forms of our rational cognition (Kant), mathematical truths were considered, because of the clarity of their subject matter, a priori objective truths that are not subject to experimental verification. Descartes in Meditations (1641) writes: "I counted as the most certain the truths which I conceived clearly as regards figures, numbers, and other matters which pertain to arithmetic and geometry, and, in general, to pure and abstract mathematics." Even Hume considered mathematics to be a non-empirical science that deals not with facts but with relations of ideas.
    Found 1 month ago on PhilPapers
  29.
    We explore some possibilities for developing epistemic logic using truthmaker semantics. We identify three possible targets of analysis for the epistemic logician. We then list some candidate epistemic principles and review the arguments that render some controversial. We then present the classic Hintikkan approach to epistemic logic and note - as per the ‘problem of logical omniscience’ - that it validates all of the aforementioned principles, controversial or otherwise. We then lay out a truthmaker framework in the style of Kit Fine and present six different ways of extending this semantics with a conditional knowledge operator, drawing on notions of implication and content that are prominent in Fine’s work. We demonstrate that different logics are thereby generated, bearing on the aforementioned epistemic principles. Finally, we offer preliminary observations about the prospects for each logic.
    Found 1 month, 1 week ago on Aybüke Özgün's site
  30.
    One of the central questions of Bayesian epistemology concerns how you should update your credences in response to new evidence you obtain. The proposal I want to discuss here belongs to an approach that consists of two steps. …
    Found 1 month, 1 week ago on M-Phi
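    For background, the standard conditionalization rule that this literature takes as its baseline (the post's own two-step proposal is not shown here).
```latex
% Standard Bayesian conditionalization, the baseline update rule of this
% literature (background only; the post's two-step proposal is not shown).
P_{\mathrm{new}}(H) \;=\; P_{\mathrm{old}}(H \mid E) \;=\; \frac{P_{\mathrm{old}}(E \mid H)\, P_{\mathrm{old}}(H)}{P_{\mathrm{old}}(E)}
```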