1.
    A trivalent theory of indicative conditionals automatically enforces Stalnaker’s thesis: the equation between probabilities of conditionals and conditional probabilities. This result holds because the trivalent semantics requires, for principled reasons, a modification of the ratio definition of conditional probability in order to accommodate the possibility of undefinedness. I analyze precisely how this modification allows the trivalent semantics to avoid a number of well-known triviality results, in the process clarifying why these results hold for many bivalent theories. I suggest that the slew of triviality results published in the last 40-odd years need not be viewed as an argument against Stalnaker’s thesis: it can be construed instead as an argument for abandoning the bivalent requirement that conditionals somehow be assigned a truth-value in worlds in which their antecedents are false.
    Found 1 day, 9 hours ago on Daniel Lassiter's site
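    A minimal sketch of the mechanism at work, assuming the standard de Finetti trivalent table (the conditional is true where \(A \wedge C\), false where \(A \wedge \neg C\), undefined where \(\neg A\)); this is my gloss, not necessarily the paper's exact formulation. Renormalizing the ratio by the probability of definedness gives

\[
P(A \rightarrow C) = \frac{P(A \wedge C)}{P(A \wedge C) + P(A \wedge \neg C)} = \frac{P(A \wedge C)}{P(A)} = P(C \mid A),
\]

    which is exactly Stalnaker's thesis.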
  2.
    Historians have recently rehabilitated Einstein’s “physical strategy” for General Relativity (GR). Independently, particle physicists re-derived Einstein’s equations as those of a massless spin-2 field. But why not a light massive spin-2 field, as Neumann and Seeliger had done with Newton’s theory? Massive gravities are bimetric, supporting conventionalism over geometric empiricism. Nonuniqueness lets field equations explain geometry but not vice versa. Massive gravity would have blocked Schlick’s critique of Kant’s synthetic a priori. Only in 1970 did massive spin-2 gravity come to seem unstable or empirically falsified. GR was vindicated, but later and on better grounds. Recently, however, dark energy and theoretical progress have made massive spin-2 gravity potentially viable again.
    Found 1 day, 13 hours ago on PhilSci Archive
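    For context, a standard physics gloss (mine, not the abstract's): giving the graviton a mass \(m\) replaces the Newtonian potential with a Yukawa falloff, which is in effect the Neumann-Seeliger modification mentioned above,

\[
V_{\text{Newton}}(r) = -\frac{GM}{r} \quad\longrightarrow\quad V_{\text{massive}}(r) = -\frac{GM\,e^{-mr}}{r},
\]

    with the Newtonian form recovered as \(m \to 0\). For spin-2 gravity the corresponding \(m \to 0\) limit is subtle (the van Dam-Veltman-Zakharov discontinuity), which is part of why massive gravity looked falsified around 1970.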
  3.
    In a series of papers, Colbeck and Renner (2011, 2015a,b) claim to have shown that the quantum state provides a complete description for the prediction of future measurement outcomes. In this paper I argue that no satisfactory proof of this claim has thus far been presented. Building on the earlier work of Leifer (2014), Landsman (2015) and Leegwater (2016), I present and prove two results that only partially support it. I then discuss the arguments by Colbeck, Renner and Leegwater concerning how these results are supposed to generalize to the full claim. This argument turns out to hinge on the implicit use of an assumption about how unitary evolution is to be represented in any possible completion of quantum mechanics. I argue that this assumption is unsatisfactory and that possible attempts to validate it on the basis of measurement theory do not succeed either.
    Found 1 day, 13 hours ago on PhilSci Archive
  4.
    Semantic universals are properties of meaning shared by the languages of the world. We offer an explanation of the presence of such universals by measuring simplicity in terms of ease of learning, showing that expressions satisfying universals are simpler than those that do not according to this criterion. We measure ease of learning using tools from machine learning and analyze universals in a domain of function words (quantifiers) and content words (color terms). Our results provide strong evidence that semantic universals across both function and content words reflect simplicity as measured by ease of learning.
    Found 1 day, 18 hours ago on Jakub Szymanik's site
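    A toy sketch of the shape of the experiment, under loud assumptions: the paper measures ease of learning with machine-learning models; here a bare logistic regression stands in, and the two quantifiers are illustrative stand-ins of mine. The same learner masters a universal-satisfying quantifier (monotone 'most') far more easily than an artificial non-monotone one ('an even number'):

```python
# Toy sketch (not the paper's code): quantifiers as functions from scene
# statistics to truth values, learned by one and the same simple model.
import numpy as np

rng = np.random.default_rng(0)

def scenes(n):
    # Each scene: counts (|A and B|, |A minus B|), each drawn from 0..20.
    return rng.integers(0, 21, size=(n, 2)).astype(float)

most = lambda s: (s[:, 0] > s[:, 1]).astype(float)      # monotone quantifier
even = lambda s: (s[:, 0] % 2 == 0).astype(float)       # non-monotone quantifier

def ease_of_learning(quantifier, epochs=300, lr=0.5):
    X = scenes(2000)
    y = quantifier(X)
    Xb = np.hstack([X / 20.0, np.ones((len(X), 1))])    # scaled inputs + bias
    w = np.zeros(3)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))             # logistic regression
        w -= lr * Xb.T @ (p - y) / len(y)               # gradient step
    return ((p > 0.5) == y).mean()                      # training accuracy

print("monotone 'most':", ease_of_learning(most))       # close to 1.0
print("non-monotone 'even':", ease_of_learning(even))   # near chance (0.5)
```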
  5.
    Following Reichenbach, it is widely held that in making a direct inference, one should base one’s conclusion on a relevant frequency statement concerning the most specific reference class for which one is able to make a warranted and relatively precise-valued frequency judgment. In cases where one has accurate and precise-valued frequency information for two relevant reference classes, R1 and R2, but lacks accurate and precise-valued frequency information concerning their intersection, R1 ∩ R2, it is widely held, following Reichenbach, that no inference may be drawn. Against Reichenbach and the common wisdom, I argue that it is often possible to draw a reasonably informative conclusion in such circumstances. As a basis for drawing such a conclusion, I show that one is generally in a position to formulate a reasonable direct inference for a reference class that is more specific than either R1 or R2.
    Found 2 days, 6 hours ago on PhilSci Archive
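    A toy illustration (mine, not the paper's) of why Reichenbach's rule goes silent here: precise frequencies for R1 and R2 leave the frequency in R1 ∩ R2 badly underdetermined. Two populations with identical marginal frequencies can disagree sharply about the intersection:

```python
# Two finite populations with identical marginal frequencies but
# different intersection frequencies. Groups: (members, members with A).
def freq(groups):
    n = sum(g[0] for g in groups)
    k = sum(g[1] for g in groups)
    return k / n

for name, (both, r1_only, r2_only) in {
    "population A": ((10, 10), (40, 30), (40, 30)),
    "population B": ((10, 6),  (40, 34), (40, 34)),
}.items():
    print(name,
          " freq(A|R1) =", freq([both, r1_only]),      # 0.8 in both populations
          " freq(A|R2) =", freq([both, r2_only]),      # 0.8 in both populations
          " freq(A|R1&R2) =", freq([both]))            # 1.0 vs 0.6
```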
  6.
    In our representations of the world, especially in physics, (mathematical) infinities play a crucial role. The continuum of the real numbers, \(\Re\), as a representation of time or of one-dimensional space is surely the best known example and, by extension, the \(n\)-fold cartesian product, \(\Re^{n}\), for \(n\)-dimensional space. However, these same infinities also cause problems. One just has to think about Zeno’s paradoxes or the present-day continuation of that discussion, namely the discussion about supertasks, to see the difficulties (see the entry on supertasks in this encyclopedia for a full treatment).
    Found 2 days, 17 hours ago on Wes Morriston's site
  7.
    The philosopher wrote: The big move in the statistics wars these days is to fight irreplication by making it harder to reject, and find evidence against, a null hypothesis. Mayo is referring to, among other things, the proposal to “redefine statistical significance” as p less than 0.005. …
    Found 3 days, 10 hours ago on D. G. Mayo's blog
  8.
    We study dynamic multi-agent systems (DMASs). These are multi-agent systems with explicitly dynamic features, where agents can join and leave the system during its evolution. We propose a general conceptual framework for modelling such DMASs and argue that it can adequately capture a variety of important and representative cases. We then present a concrete modelling framework for a large class of DMASs, composed in a modular way from agents specified by means of automata-based representations. We develop generic algorithms implementing the dynamic behaviour, namely the addition and removal of agents in such systems. Lastly, we state and discuss several formal verification tasks that are specific to DMASs and propose general algorithmic solutions for the class of automata-representable DMASs.
    Found 3 days, 11 hours ago on Valentin Goranko's site
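    A deliberately minimal sketch, with hypothetical names (the paper's framework is automata-based and far richer), of the two generic dynamic operations the abstract mentions, addition and removal of agents during a run:

```python
# Hypothetical toy model of a dynamic multi-agent system: agents as
# finite-state transition tables; the system supports join/leave mid-run.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    state: str
    delta: dict = field(default_factory=dict)   # (state, input) -> state

    def step(self, symbol):
        self.state = self.delta.get((self.state, symbol), self.state)

@dataclass
class DMAS:
    agents: dict = field(default_factory=dict)

    def join(self, agent):            # dynamic addition of an agent
        self.agents[agent.name] = agent

    def leave(self, name):            # dynamic removal of an agent
        self.agents.pop(name, None)

    def step(self, symbol):           # one synchronous evolution step
        for a in self.agents.values():
            a.step(symbol)

sys = DMAS()
sys.join(Agent("a1", "idle", {("idle", "go"): "busy"}))
sys.step("go")
sys.join(Agent("a2", "idle"))         # joins mid-run
sys.leave("a1")                       # leaves mid-run
print({n: a.state for n, a in sys.agents.items()})
```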
  9.
    The “Cosmological Constant Problem” (CCP) has historically been understood as describing a conflict between cosmological observations in the framework of general relativity (GR) and theoretical predictions from quantum field theory (QFT), which a future theory of quantum gravity ought to resolve. I argue that this view of the CCP is best understood in terms of a bet about future physics made on the basis of particular interpretational choices in GR and QFT respectively. Crucially, each of these choices must be taken as itself grounded in the success of the respective theory for this bet to be justified.
    Found 3 days, 14 hours ago on PhilSci Archive
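    The magnitude of the conflict, on the textbook way of putting it (standard numbers, not from the abstract): the vacuum energy density inferred from cosmology and a naive QFT estimate with a Planck-scale cutoff differ by roughly 120 orders of magnitude,

\[
\rho_\Lambda^{\text{obs}} \sim 10^{-47}\,\text{GeV}^4, \qquad \rho_{\text{vac}}^{\text{QFT}} \sim M_{\text{Pl}}^4 \sim 10^{76}\,\text{GeV}^4, \qquad \frac{\rho_{\text{vac}}^{\text{QFT}}}{\rho_\Lambda^{\text{obs}}} \sim 10^{123}.
\]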
  10.
    It is widely recognized that the process used to make observations often has a significant effect on how hypotheses should be evaluated in light of those observations. Arthur Stanley Eddington (1939, Ch. II) provides a classic example. You’re at a lake and are interested in the size of the fish it contains. You know, from testimony, that at least some of the fish in the lake are big (i.e., at least 10 inches long), but beyond that you’re in the dark. You devise a plan of attack: get a net and use it to draw a sample of fish from the lake. You carry out your plan and observe: O: 100% of the fish in the net are big.
    Found 4 days, 16 hours ago on Philosopher's Imprint
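    The point of the example, in a hedged numerical sketch (mine, not Eddington's or the paper's): the likelihood of O depends on the net as much as on the lake. If the mesh lets every small fish escape, O is guaranteed however few big fish there are; if the net samples fish at random (idealized as independent draws), O is strong evidence that most fish are big:

```python
# Probability of O ("all 100 netted fish are big") under two observation
# processes, for various true proportions of big fish in the lake.
for p in (0.2, 0.5, 0.9, 1.0):
    random_net = p ** 100          # net catches fish of any size
    coarse_net = 1.0               # mesh lets every small fish escape
    print(f"share big = {p:4.2f}:  P(O | random net) = {random_net:.3g},"
          f"  P(O | coarse net) = {coarse_net}")
```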
  11.
    Paul Bernays Lectures. Last week, I had the honor of giving the annual Paul Bernays Lectures at ETH Zürich. My opening line: “as I look at the list of previous Bernays Lecturers—many of them Nobel physics laureates, Fields Medalists, etc.—I think to myself, how badly did you have to screw up this year in order to end up with me?” Paul Bernays was the primary assistant to David Hilbert, before Bernays (being Jewish by birth) was forced out of Göttingen by the Nazis in 1933. …
    Found 5 days, 16 hours ago on Scott Aaronson's blog
  12.
    Standard decision theory has trouble handling cases involving acts without finite expected values. This paper has two aims. First, building on earlier work by Colyvan (2008), Easwaran (2014), and Lauwers & Vallentyne (2016), it develops a proposal for dealing with such cases, Difference Minimizing Theory. Difference Minimizing Theory provides satisfactory verdicts in a broader range of cases than its predecessors. And it vindicates two highly plausible principles of standard decision theory, Stochastic Equivalence and Stochastic Dominance. The second aim is to assess some recent arguments against Stochastic Equivalence and Stochastic Dominance. If successful, these arguments refute Difference Minimizing Theory. This paper contends that these arguments are not successful.
    Found 6 days, 12 hours ago on PhilPapers
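    A toy computation (my illustration, not the paper's theory) of the kind of case at issue: two St. Petersburg-style gambles both have infinite expected value, so expectations cannot rank them; their state-by-state difference, however, has a perfectly well-behaved expectation, and comparing differences favours the dominating gamble, in line with Stochastic Dominance:

```python
# St. Petersburg gamble f: win 2**k with probability 2**-k (k = 1, 2, ...).
# Gamble g pays f + 1 in every state. Truncated expectations of both
# diverge, but E[g - f] = 1 exactly, so a difference-based comparison
# can rank the gambles where expected value cannot.
def truncated_ev(payoff, n_terms):
    return sum(payoff(k) * 2.0**-k for k in range(1, n_terms + 1))

f = lambda k: 2.0**k
g = lambda k: 2.0**k + 1

for n in (10, 20, 40):
    print(f"n={n:3d}  E_n[f]={truncated_ev(f, n):8.2f}"
          f"  E_n[g]={truncated_ev(g, n):8.2f}"
          f"  E_n[g - f]={truncated_ev(lambda k: g(k) - f(k), n):.4f}")
```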
  13.
    Traditional oppositions are at least two-dimensional, in the sense that they are built on a famous two-dimensional object called the square of oppositions and on its extensions, such as Blanché’s hexagon. Instead of two-dimensional objects, this article proposes a construction that deals with oppositions on a one-dimensional line segment.
    Found 1 week ago on PhilSci Archive
  14.
    The study of iterated belief change has principally focused on revision, with the other main operator of AGM belief change theory, namely contraction, receiving comparatively little attention. In this paper we show how principles of iterated revision can be carried over to iterated contraction by generalising a principle known as the ‘Harper Identity’. The Harper Identity provides a recipe for defining the belief set resulting from contraction by a sentence A in terms of (i) the initial belief set and (ii) the belief set resulting from revision by ¬A.
    Found 1 week, 2 days ago on Jake Chandler's site
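    For reference, the Harper Identity itself, in standard AGM notation (with \(\div\) for contraction and \(\ast\) for revision); the paper's generalisation to the iterated case is not reproduced here:

\[
K \div A \;=\; K \cap (K \ast \neg A).
\]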
  15.
    In ‘Essence and Modality’, Kit Fine (1994) proposes that for a proposition to be metaphysically necessary is for it to be true in virtue of the nature of all objects. Call this view Fine’s Thesis. This paper is a study of Fine’s Thesis in the context of Fine’s logic of essence (LE). Fine himself has offered his most elaborate defence of the thesis in the context of LE. His defence rests on the widely shared assumption that metaphysical necessity obeys the laws of the modal logic S5. In order to get S5 for metaphysical necessity, he assumes a controversial principle about the nature of all objects. I will show that the addition of this principle to his original system E5 leads to inconsistency with an independently plausible principle about essence. In response, I develop a theory that avoids this inconsistency while allowing us to maintain S5 for metaphysical necessity. However, I conclude that our investigation of Fine’s Thesis in the context of LE motivates the revisionary conclusion that metaphysical necessity obeys the principles of the modal logic S4, but not those of S5. I argue that this constitutes a distinctively essentialist challenge to the received view that the logic of metaphysical necessity is S5.
    Found 1 week, 2 days ago on Andreas Ditter's site
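    For orientation, the standard axioms at stake (my gloss, not the paper's notation): the revisionary conclusion is that metaphysical necessity validates the S4 axiom but not the characteristic S5 axiom,

\[
\text{(4)}\;\; \Box A \rightarrow \Box\Box A, \qquad \text{(5)}\;\; \Diamond A \rightarrow \Box\Diamond A.
\]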
  16.
    According to Jens Høyrup, propositions 1 to 10 of Book 2 of Euclid’s Elements function as a critique of previous non-rigorous procedures of Old Babylonian mathematics. Høyrup’s remarks on his notion of critique are scattered throughout his works. Here, we take them into account to give an integrated presentation of the notion of critique, one that also seeks to make explicit features left implicit in Høyrup’s account.
    Found 1 week, 3 days ago on PhilSci Archive
  17.
    It is generally assumed that relations of necessity cannot be known by induction on experience. In this paper, I propose a notion of situated possibilities, weaker than nomic possibilities, that is compatible with an inductivist epistemology for modalities. I show that, assuming this notion, not only can relations of necessity be known by induction on our experience, but such relations cannot be any more underdetermined by experience than universal regularities are. This means that anyone who believes in a universal regularity is equally warranted in believing in the corresponding relation of necessity.
    Found 1 week, 3 days ago on PhilSci Archive
  18.
    There are two notions in the philosophy of probability that are often used interchangeably: that of subjective probabilities and that of epistemic probabilities. This paper suggests they should be kept apart. Specifically, it suggests that the distinction between subjective and objective probabilities refers to what probabilities are, while the distinction between epistemic and ontic probabilities refers to what probabilities are about. After arguing that there are bona fide examples of subjective ontic probabilities and of epistemic objective probabilities, I propose a systematic way of drawing these distinctions in order to take this into account. In doing so, I modify Lewis’s notion of chances, and extend his Principal Principle in what I argue is a very natural way (which in fact makes chances fundamentally conditional). I conclude with some remarks on time symmetry, on the quantum state, and with some more general remarks about how this proposal fits into an overall Humean (but not quite neo-Humean) framework.
    Found 1 week, 3 days ago on PhilSci Archive
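    For reference, Lewis's Principal Principle in its usual form (standard statement, not the paper's extended version): where \(\mathrm{ch}_t\) is the chance function at time \(t\) and \(E\) is any admissible evidence,

\[
Cr(A \mid \mathrm{ch}_t(A) = x \wedge E) = x.
\]

    The paper's proposal, as the abstract says, extends this in a way that makes chances fundamentally conditional.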
  19.
    In statistics, there are two main paradigms: classical and Bayesian statistics. The purpose of this paper is to investigate the extent to which classicists and Bayesians can (in some suitable sense of the word) agree. My conclusion is that, in certain situations, they can’t. The upshot is that, if we assume the classicist isn’t allowed to have a higher degree of belief (credence) in a null hypothesis after he has rejected it than before, then in certain situations he must either have trivial or incoherent credences to begin with or fail to update his credences by conditionalization.
    Found 1 week, 3 days ago on PhilSci Archive
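    A standard numerical illustration of the kind of tension the abstract targets (the Jeffreys-Lindley effect; my example, not the paper's): a result that just barely rejects H0 at the 5% level can, for large n, raise a Bayesian's credence in H0 well above its prior of one half:

```python
# Normal mean with known unit variance. H0: theta = 0 vs H1: theta ~ N(0, 1).
# Observe a sample mean sitting exactly at the two-sided 5% rejection
# boundary (z = 1.96). Bayes factor BF01 = m0(xbar) / m1(xbar).
from math import sqrt, exp, pi

def normal_pdf(x, var):
    return exp(-x * x / (2 * var)) / sqrt(2 * pi * var)

sigma2 = 1.0    # known sampling variance
tau2 = 1.0      # prior variance of theta under H1
for n in (10, 100, 10_000, 1_000_000):
    xbar = 1.96 * sqrt(sigma2 / n)            # classically: reject H0
    bf01 = normal_pdf(xbar, sigma2 / n) / normal_pdf(xbar, tau2 + sigma2 / n)
    post = bf01 / (1 + bf01)                  # posterior of H0 from prior 1/2
    print(f"n={n:>9,}  BF01={bf01:10.2f}  P(H0 | data)={post:.3f}")
```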
  20.
    Must a theory of quantum gravity have some truth to it if it can recover general relativity in some limit of the theory? This paper answers this question in the negative by indicating that general relativity is multiply realizable in quantum gravity. The argument is inspired by spacetime functionalism – multiple realizability being a central tenet of functionalism – and proceeds via three case studies: induced gravity, thermodynamic gravity, and entanglement gravity. In these, general relativity in the form of the Einstein field equations can be recovered from elements that are either manifestly multiply realizable or at least of the generic nature that is suggestive of functions. If general relativity, as argued here, can inherit this multiple realizability, then a theory of quantum gravity can recover general relativity while being completely wrong about the posited microstructure. As a consequence, the recovery of general relativity cannot serve as the ultimate arbiter deciding which theory of quantum gravity is worthy of pursuit, even though it is of course not irrelevant to the pursuit of quantum gravity. Thus, the recovery of general relativity in string theory, for instance, does not guarantee that the stringy account of the world is on the right track, despite sentiments to the contrary among string theorists.
    Found 1 week, 3 days ago on PhilSci Archive
  21.
    This paper develops a number of quantum mechanical characterisations of the Stern-Gerlach experiment and discusses areas of vagueness in their formulation. Philosophers criticise quantum mechanics for unacceptable vagueness in connection with the measurement problem, but the formulation problems identified by this paper go beyond that locus of philosophical criticism. It concludes with an open question: are some areas of vagueness in quantum mechanics more acceptable philosophically than others and, if so, why?
    Found 1 week, 3 days ago on PhilSci Archive
  22.
    Debates in philosophy of probability over the nature and ontology of objective chance by and large remain inconclusive. No reductive account of chance has ultimately prospered. This article proposes a change of focus towards the functions and roles that chance plays in our cognitive practices. Its starting philosophical point is pluralism about objective probability. The complex nexus of chance is the interlinked set of roles in modelling practice of i) parametrized probabilistic dispositions (“propensities”); ii) distribution functions (“probabilities”); and iii) statistical finite data (“frequencies”). It is argued that the modelling literature contains sophisticated applications of the chance nexus to both deterministic and indeterministic phenomena. These applications may be described as lying on a spectrum between what I call ‘pure probabilistic’ and ‘pure stochastic’ models. The former may be found in the tradition of the method of arbitrary functions; the latter in present-day techniques for stochastic modelling in the complex sciences, as well as some orthodox approaches to quantum mechanics. These modelling practices provide positive arguments for the irreducible complexity of the chance nexus.
    Found 1 week, 3 days ago on PhilSci Archive
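    A small simulation of the 'pure probabilistic' end of that spectrum, the method of arbitrary functions (a standard textbook example, my code): a deterministic spinner maps initial speed to one of two outcomes; as the dynamics become more sensitive, very different smooth input distributions all yield outcome frequencies near 1/2:

```python
# Method of arbitrary functions: outcome = parity of floor(k * speed).
# As the sensitivity k grows, the frequency of one outcome approaches
# 1/2 for very different (smooth) initial-speed distributions.
import numpy as np

rng = np.random.default_rng(1)
initial_speeds = {
    "uniform(0, 1)": rng.uniform(0, 1, 100_000),
    "beta(2, 5)":    rng.beta(2, 5, 100_000),
    "lognormal":     rng.lognormal(0, 0.3, 100_000),
}

for k in (2, 10, 100, 1000):
    freqs = {name: round((np.floor(k * v) % 2 == 0).mean(), 3)
             for name, v in initial_speeds.items()}
    print(f"k={k:5d}", freqs)
```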
  23.
    Running verification tasks in database-driven systems requires solving quantifier elimination problems of a new kind. These quantifier elimination problems are related to the notion of a cover introduced at ESOP 2008 by Gulwani and Musuvathi. In this paper, we show how covers are closely related to model completions, a well-known topic in model theory. We also investigate the computation of covers within the Superposition Calculus, adopting a constrained version of the calculus equipped with appropriate settings and reduction strategies. In addition, we show that cover computations are computationally tractable for the fragment of the language used in applications to database-driven verification. This observation is confirmed by the preliminary results obtained using the MCMT tool on the verification of data-aware process benchmarks, which can be found in the latest version of the tool distribution.
    Found 1 week, 5 days ago on Silvio Ghilardi's site
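    For orientation, the ESOP 2008 notion the paper builds on, stated roughly and in my own symbols (entailment is modulo the background theory): a cover of \(\exists \underline{e}\,\phi(\underline{e},\underline{y})\) is its strongest quantifier-free consequence in the variables \(\underline{y}\), i.e. a quantifier-free \(\psi(\underline{y})\) such that

\[
\exists \underline{e}\,\phi \models \psi, \qquad\text{and}\qquad \psi \models \chi \;\text{ for every quantifier-free } \chi(\underline{y}) \text{ with } \exists \underline{e}\,\phi \models \chi.
\]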
  24.
    Fitch’s Paradox shows that if every truth is knowable, then every truth is known. Standard diagnoses identify the factivity/negative infallibility of the knowledge operator and Moorean contradictions as the root source of the result. This paper generalises Fitch’s result to show that such diagnoses are mistaken. In place of factivity/negative infallibility, the weaker assumption of any ‘level-bridging principle’ suffices. A consequence is that the result holds for some logics in which the “Moorean contradiction” commonly thought to underlie the result is in fact consistent. This generalised result improves on the current understanding of Fitch’s result and widens the range of modalities of philosophical interest to which the result might be fruitfully applied. Along the way, we also consider a semantic explanation for Fitch’s result which answers a challenge raised by Kvanvig (2006).
    Found 2 weeks ago on PhilPapers
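    The classical derivation being generalised (textbook version, using factivity \(Kp \rightarrow p\) and distribution of \(K\) over conjunction): instantiate knowability, \(\forall p\,(p \rightarrow \Diamond K p)\), with the Moorean sentence \(p \wedge \neg Kp\). Then

\[
K(p \wedge \neg Kp) \vdash Kp \wedge K\neg Kp \vdash Kp \wedge \neg Kp,
\]

    so \(K(p \wedge \neg Kp)\) is impossible, hence not possibly known; knowability then yields that \(p \wedge \neg Kp\) is never true, i.e. every truth is known. The paper's point is that factivity can be weakened to any 'level-bridging principle' and the argument still goes through.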
  25.
    The aim of this paper is to investigate counterfactual logic and its implications for the modal status of mathematical claims. It is most directly a response to an ambitious program by Yli-Vakkuri and Hawthorne (2018), who seek to establish that mathematics is committed to its own necessity. I claim that their argument fails to establish this result for two reasons. First, the system of counterfactual logic they develop is provably equivalent to appending the Deduction Theorem to a T modal logic. It is neither new nor surprising that the combination of T with the Deduction Theorem results in necessitation; this has been widely known since the formalization of modal logic in the 1960s. Indeed, it is precisely for this reason that the Deduction Theorem is almost universally rejected in modal contexts. Absent a reason to accept the Deduction Theorem in this case, we remain without a compelling argument for the necessity of mathematics. Second, their assumptions force our hand on controversial debates within counterfactual logic. In particular, they license counterfactual strengthening—the inference from ‘If A were true then C would be true’ to ‘If A and B were true then C would be true’—which many reject. Many philosophers are thus unable to avail themselves of this result.
    Found 2 weeks ago on PhilPapers
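    The licensed inference, with the stock counterexample (standard material, not the paper's): counterfactual strengthening is

\[
A \mathrel{\Box\!\!\rightarrow} C \;\vdash\; (A \wedge B) \mathrel{\Box\!\!\rightarrow} C.
\]

    'If I had struck the match, it would have lit' does not plausibly entail 'If I had struck the match and it had been soaked, it would have lit', which is why strengthening fails on Lewis-Stalnaker semantics and why many reject it.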
  26.
    [warning: it’s proving hard to avoid typos in the formulas here. I’ve caught as many as I can, but please exercise charity in reading the various subscripts]. In the Lewisian setting I’ve been examining in the last series of posts, I’ve been using the following definition of indicates-to-x (I use the same notation as in previous posts, but add a w-subscript to distinguish it from an alternative I will shortly introduce): The arrow on the right is the counterfactual conditional, and the intended interpretation of the B-operator is “has a reason to believe”. …
    Found 2 weeks, 1 day ago on Robbie Williams's blog
  27.
    In evolutionary games, equilibrium concepts adapted from classical game theory—typically, refinements of the Nash equilibrium—are employed to identify the probable outcomes of evolutionary processes. Over the years, various negative results have been produced demonstrating limitations of each proposed refinement. These negative results rely on an undefined notion of evolutionary significance. We propose an explicit and novel definition of evolutionary significance in line with what is assumed in these results. This definition enables a comprehensive analysis of the limitations of the proposed equilibrium concepts. Taken together, the results show that even under favorable assumptions as to the underlying dynamics and stability concept—the replicator dynamics and asymptotic stability—every equilibrium concept makes errors of either omission or commission, and typically both.
    Found 2 weeks, 2 days ago on PhilSci Archive
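    A minimal sketch of the baseline dynamics the results assume (the standard replicator equation; my code, with a hypothetical payoff matrix):

```python
# Replicator dynamics: x_i' = x_i * ((A x)_i - x^T A x),
# Euler-integrated for a hypothetical 3-strategy game
# (a rock-paper-scissors-style payoff matrix, chosen for illustration).
import numpy as np

A = np.array([[ 0.0, -1.0,  1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])   # payoff matrix (assumption)

x = np.array([0.5, 0.3, 0.2])        # initial population shares
dt = 0.01
for step in range(5001):
    fitness = A @ x                          # per-strategy fitness
    x = x + dt * x * (fitness - x @ fitness) # replicator update
    if step % 1000 == 0:
        print(step, np.round(x, 3))
```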
  28.
    All standard epistemic logics legitimate something akin to the principle of closure, according to which knowledge is closed under competent deductive inference. And yet the principle of closure, particularly in its multiple premise guise, has a somewhat ambivalent status within epistemology. One might think that serious concerns about closure point us away from epistemic logic altogether—away from the very idea that the knowledge relation could be fruitfully treated as a kind of modal operator. This, however, need not be so. The abandonment of closure may yet leave in place plenty of formal structure amenable to systematic logical treatment. In this paper we describe a family of weak epistemic logics in which closure fails, and describe two alternative semantic frameworks in which these logics can be modelled. One of these—which we term plurality semantics—is relatively unfamiliar. We explore under what conditions plurality frames validate certain much-discussed principles of epistemic logic. It turns out that plurality frames can be interpreted in a very natural way in light of one motivation for rejecting closure, adding to the significance of our technical work. The second framework that we employ—neighbourhood semantics—is much better known. But we show that it too can be interpreted in a way that comports with a certain motivation for rejecting closure.
    Found 2 weeks, 3 days ago on Martin Smith's site
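    The principle at issue, in its single- and multiple-premise forms (standard formulations, not quoted from the paper):

\[
\text{(SPC)}\;\; (Kp \wedge K(p \rightarrow q)) \rightarrow Kq, \qquad \text{(MPC)}\;\; (Kp_1 \wedge \dots \wedge Kp_n \wedge K((p_1 \wedge \dots \wedge p_n) \rightarrow q)) \rightarrow Kq.
\]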
  29.
    Any philosophy of mathematics deserving the name “logicism” must hold that mathematical truths are in some sense logical truths. Today, a typical characterization of a logical truth is one that remains true under all (re)interpretations of its non-logical vocabulary. Put a bit crudely, this means that something can be a logical truth only if all other statements of the same form are also true. “Fa ⊃ (Rab ⊃ Fa)” can be a logical truth because not only it, but all propositions of the form “p ⊃ (q ⊃ p)”, are true. It does not matter what “F”, “R”, “a” and “b” mean, or what specific features the objects meant have. Applying this conception of a logical truth in the context of logicism seems to present an obstacle. “Five is prime”, at least on the surface, is a simple subject-predicate assertion, and obviously, not all subject-predicate assertions are true. How, then, could this be a logical truth? Similarly, “7 > 5” asserts a binary relation, but obviously not all binary relations hold. In what follows, I shall call this the logical form problem for logicism.
    Found 2 weeks, 3 days ago on Kevin Klement's site
  30.
    Probability is the most important concept in modern science, especially as nobody has the slightest notion what it means. —Bertrand Russell, 1929 Lecture (cited in Bell 1945, 587)
    ‘The Democrats will probably win the next election.’ ‘The coin is just as likely to land heads as tails.’ ‘There’s a 30% chance of rain tomorrow.’ ‘The probability that a radium atom decays in one year is roughly 0.0004.’ One regularly reads and hears probabilistic claims like these. But what do they mean? This may be understood as a metaphysical question about what kinds of things are probabilities, or more generally as a question about what makes probability statements true or false.
    Found 2 weeks, 3 days ago on Stanford Encyclopedia of Philosophy