
There is a familiar philosophical position – sometimes called the doctrine of the open future – according to which future contingents (claims about underdetermined aspects of the future) systematically fail to be true. For instance: supposing that there are ways things could develop from here in which Trump is impeached, and in which he is not, it is not now true that Trump will be impeached, and not now true that Trump will not be impeached. For well over 2000 years, however, open futurists have been accused of denying certain logical laws – bivalence, excluded middle, or both – for entirely ad hoc reasons, most notably, that their denials are required for the preservation of something we hold dear. In a recent paper, however, I sought to argue that this deeply entrenched narrative ought to be overturned. My thought was this: given a popular, plausible approach to the semantics of future contingents, we can reduce the question of their status to the Russell/Strawson debate concerning presupposition failure, definite descriptions, and bivalence. In that case, we will see that open futurists in fact needn’t deny bivalence (Russell), or, if they do, they will do so for perfectly general (Strawsonian) reasons – reasons for which we all must deny bivalence. Of course, the metaphysical objections to the open futurist’s model of the future will remain just as they were. However, the millennia-old “semantic” or “logical” objections to the doctrine would be answered.

Computer simulations of an epistemic landscape model, modified to include an explicit representation of a centralised funding body, show that the method of funding allocation has significant effects on the communal trade-off between exploration and exploitation, with consequences for the community’s ability to generate significant truths. The results show this effect is contextual and depends on the size of the landscape being explored, with funding that includes explicit random allocation performing significantly better than peer review on large landscapes. The paper proposes a way of incorporating external institutional factors in formal social epistemology, and offers a way of bringing such investigations to bear on current research policy questions.
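The paper’s model is not reproduced here, but its basic shape can be sketched in a few lines of Python. Every detail below – the one-dimensional landscape, the local hill-climbing, the particular “peer review” and “lottery” funding rules – is a hypothetical stand-in for the paper’s actual model, not a reconstruction of it:

```python
import random

def landscape(x, peaks):
    # Epistemic significance of approach x: height under the nearest peak.
    return max(h * max(0.0, 1 - abs(x - c) / w) for (c, h, w) in peaks)

def run(allocate, n_agents=20, steps=200, size=100.0, seed=0):
    rng = random.Random(seed)
    # A rugged landscape: triangular peaks at random locations.
    peaks = [(rng.uniform(0, size), rng.random(), rng.uniform(1, 10))
             for _ in range(10)]
    pos = [rng.uniform(0, size) for _ in range(n_agents)]
    best = 0.0
    for _ in range(steps):
        scores = [landscape(x, peaks) for x in pos]
        for i in allocate(scores, rng):       # indices of funded agents
            # Funded agents search locally, keeping improvements.
            cand = min(max(pos[i] + rng.uniform(-2, 2), 0.0), size)
            if landscape(cand, peaks) >= scores[i]:
                pos[i] = cand
        best = max(best, max(scores))
    return best  # most significant truth found by the community

def peer_review(scores, rng, k=5):
    # Fund the k agents with the best current results (exploitation).
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:k]

def lottery(scores, rng, k=5):
    # Fund k agents uniformly at random (exploration).
    return rng.sample(range(len(scores)), k)

print(run(peer_review), run(lottery))
```

Comparing the two `print`ed values across landscape sizes is the kind of experiment the abstract describes, though the real model’s dynamics differ.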

In this paper I investigate whether certain substructural theories are able to dodge paradox while at the same time containing what might be viewed as a naive validity predicate. To this end I introduce the requirement of internalization: roughly, that an adequate theory of validity should prove that its own metarules are validity-preserving. The main point of the paper is that substructural theories fail this requirement in various ways.

11 August 1895 – 12 June 1980
Continuing with my Egon Pearson posts in honor of his birthday, I reblog a post by Aris Spanos: “Egon Pearson’s Neglected Contributions to Statistics”. Egon Pearson (11 August 1895 – 12 June 1980) is widely known today for his contribution in recasting Fisher’s significance testing into the Neyman-Pearson (1933) theory of hypothesis testing. …

It’s been a long time since I’ve blogged about the Complex Adaptive System Composition and Design Environment or CASCADE project run by John Paschkewitz. For a reminder, read these:
• Complex adaptive system design (part 1), Azimuth, 2 October 2016. …

As Harvey Brown emphasizes in his book Physical Relativity, inertial motion in general relativity is best understood as a theorem, and not a postulate. Here I discuss the status of the “conservation condition”, which states that the energy-momentum tensor associated with non-interacting matter is covariantly divergence-free, in connection with such theorems.

The spectrum argument purports to show that the better-than relation is not transitive, and consequently that orthodox value theory is built on dubious foundations. The argument works by constructing a sequence of increasingly less painful but more drawn-out experiences, such that each experience in the spectrum is worse than the previous one, yet the final experience is better than the experience with which the spectrum began. Hence the betterness relation admits cycles, threatening either transitivity or asymmetry of the relation. This paper examines recent attempts to block the spectrum argument, using the idea that it is a mistake to affirm that every experience in the spectrum is worse than its predecessor: an alternative hypothesis is that adjacent experiences may be incommensurable in value, or that due to vagueness in the underlying concepts, it is indeterminate which is better. While these attempts formally succeed as responses to the spectrum argument, they have additional, as yet unacknowledged costs that are significant. In order to effectively block the argument in its most typical form, in which the first element is radically inferior to the last, it is necessary to suppose that the incommensurability (or indeterminacy) is particularly acute: what might be called radical incommensurability (radical indeterminacy). We explain these costs, and draw some general lessons about the plausibility of the available options for those who wish to save orthodox axiology from the spectrum argument.

The need for expressing temporal constraints in conceptual models is well-known, but it is unclear which representation is preferred and what would be easier to understand by modellers. We assessed five different modes of representing temporal constraints, namely the formal semantics, Description Logics notation, a coding-style notation, temporal EER diagrams, and (pseudo-)natural language sentences. The same information was presented to 15 participants in an experimental evaluation. Principally, it showed that 1) there was a clear preference for diagrams and natural language versus a dislike for other representations; 2) diagrams were preferred for simple constraints, but the natural language rendering was preferred for more complex temporal constraints; and 3) a multimodal modelling tool will be needed for the data analysis stage to be effective.

In this paper I discuss the delayed choice quantum eraser experiment by giving a straightforward account in standard quantum mechanics. At first glance, the experiment suggests that measurements on one part of an entangled photon pair (the idler) can be employed to control whether the measurement outcome of the other part of the photon pair (the signal) produces interference fringes at a screen after being sent through a double slit. Significantly, the choice whether there is interference or not can be made long after the signal photon encounters the screen. The results of the experiment have been alleged to invoke some sort of ‘backwards in time influences’. I argue that in the standard collapse interpretation the issue can be eliminated by taking into account the collapse of the overall entangled state due to the signal photon. Likewise, in the de Broglie-Bohm picture the particle’s trajectories can be given a well-defined description at any instant of time during the experiment. Thus, there is no need to resort to any kind of ‘backwards in time influence’. As a matter of fact, the delayed choice quantum eraser experiment turns out to resemble a Bell-type measurement, and so there really is no mystery.

E.S. Pearson (11 August 1895 – 12 June 1980)
This is a belated birthday post for E.S. Pearson (11 August 1895 – 12 June 1980). It’s basically a post from 2012 which concerns an issue of interpretation (long-run performance vs probativeness) that’s badly confused these days. …

There’s a new paper on the arXiv that claims to solve a hard problem:
• Norbert Blum, A solution of the P versus NP problem.

Most papers that claim to solve hard math problems are wrong: that’s why these problems are considered hard. …

We owe to Frege in Begriffsschrift our modern practice of taking unrestricted quantification (in one sense) as basic. I mean, he taught us how to rephrase restricted quantifications by using unrestricted quantifiers plus connectives in the now familiar way, so that e.g. …

In this chapter, I will discuss what it takes for a dynamical collapse theory to provide a reasonable description of the actual world. I will start with discussions of what is required, in general, of the ontology of a physical theory, and then apply these requirements to the quantum case. One issue of interest is whether a collapse theory can be a quantum state monist theory, adding nothing to the quantum state and changing only its dynamics. Although this was one of the motivations for advancing such theories, its viability has been questioned, and it has been argued that, in order to provide an account of the world, a collapse theory must supplement the quantum state with additional ontology, making such theories more like hidden-variables theories than would first appear. I will make a case for quantum state monism as an adequate ontology, and, indeed, the only sensible ontology for collapse theories. This will involve taking dynamical variables to possess, not sharp values, as in classical physics, but distributions of values.

I discuss a game-theoretic model in which scientists compete to finish the intermediate stages of some research project. Banerjee et al. (2014) have previously shown that if the credit awarded for intermediate results is proportional to their difficulty, then the strategy profile in which scientists share each intermediate stage as soon as they complete it is a Nash equilibrium. I show that the equilibrium is both unique and strict. Thus rational credit-maximizing scientists have an incentive to share their intermediate results, as long as this is sufficiently rewarded.

Persistence judgments are ordinary judgments about whether an object survives a change, or perishes. For instance, if a house fire only superficially damages the kitchen, people judge that the house survived. But if the fire burnt the house to the ground instead, people judge that the house did not survive but was instead destroyed. We are interested in what drives these judgments, in part because objects are so central to our conception of the world, and our persistence judgments get to the very heart of the folk notion of an object.

In models for paraconsistent logics, the semantic values of sentences and their negations are less tightly connected than in classical logic. In “American Plan” logics for negation, truth and falsity are, to some degree, independent. The truth of ∼p is given by the falsity of p, and the falsity of ∼p is given by the truth of p. Since truth and falsity are only loosely connected, p and ∼p can both hold, or both fail to hold. In “Australian Plan” logics for negation, negation is treated rather like a modal operator, where the truth of ∼p in a situation amounts to p failing in certain other situations. Since those situations can be different from this one, p and ∼p might both hold here, or might both fail here.
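A toy encoding makes the American Plan picture concrete. The four-valued representation below, in the style of first-degree entailment (values as subsets of {told-true, told-false}), is an illustrative sketch rather than any particular published semantics:

```python
# Four semantic values: which of 'told true' (t) and 'told false' (f) apply.
BOTH    = frozenset('tf')   # glut: true and false
TRUE    = frozenset('t')
FALSE   = frozenset('f')
NEITHER = frozenset()       # gap: neither true nor false

def neg(v):
    # American Plan negation: ~p is true iff p is false, false iff p is true.
    out = set()
    if 'f' in v:
        out.add('t')
    if 't' in v:
        out.add('f')
    return frozenset(out)

def holds(v):
    # A sentence holds when it is (at least) true.
    return 't' in v

# A glut: p and ~p both hold.
print(holds(BOTH), holds(neg(BOTH)))        # True True
# A gap: p and ~p both fail to hold.
print(holds(NEITHER), holds(neg(NEITHER)))  # False False
```

Since truth and falsity are tracked independently, nothing in `neg` forces exactly one of `p` and `~p` to hold, which is the loosening the American Plan exploits.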

Last week a team of 72 scientists released the preprint of an article attempting to address one aspect of the reproducibility crisis, the crisis of conscience in which scientists are increasingly skeptical about the rigor of our current methods of conducting scientific research. …

Suppose that I am throwing a perfectly sharp dart uniformly randomly at a continuous target. The chance that I will hit the center is zero. What if I throw an infinite number of independent darts at the target? …
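For a countably infinite sequence of darts, one standard step shows why the puzzle needs more than countable additivity. Writing \(A_n\) for the event that the \(n\)-th dart hits the center:

```latex
P\!\left(\bigcup_{n=1}^{\infty} A_n\right)
  \;\le\; \sum_{n=1}^{\infty} P(A_n)
  \;=\; \sum_{n=1}^{\infty} 0
  \;=\; 0
```

So, by countable subadditivity, almost surely no dart in a countably infinite sequence hits the center; the genuinely puzzling cases arise with uncountably many darts, where this argument no longer applies.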

The claim of inflationary cosmology to explain certain observable facts, which the Friedmann-Robertson-Walker models of ‘Big Bang’ cosmology were forced to assume, has already been the subject of significant philosophical analysis. However, the principal empirical claim of inflationary cosmology, that it can predict the scale-invariant power spectrum of density perturbations, as detected in measurements of the cosmic microwave background radiation, has hitherto been taken at face value by philosophers. The purpose of this paper is to expound the theory of density perturbations used by inflationary cosmology, to assess whether inflation really does predict a scale-invariant spectrum, and to identify the assumptions necessary for such a derivation. The first section of the paper explains what a scale-invariant power spectrum is, and the requirements placed on a cosmological theory of such density perturbations. The second section explains and analyses the concept of the Hubble horizon, and its behaviour within an inflationary spacetime. The third section expounds the inflationary derivation of scale-invariance, and scrutinises the assumptions within that derivation. The fourth section analyses the explanatory role of ‘horizon-crossing’ within the inflationary scenario.

In the context of superintelligent AI systems, the term “oracle” has two meanings. One refers to modular systems queried for domain-specific tasks. Another usage, referring to a class of systems which may be useful for addressing the value alignment and AI control problems, is a superintelligent AI system that only answers questions. The aim of this manuscript is to survey contemporary research problems related to oracles which align with long-term research goals of AI safety. We examine existing question answering systems and argue that their high degree of architectural heterogeneity makes them poor candidates for rigorous analysis as oracles. On the other hand, we identify computer algebra systems (CASs) as being primitive examples of domain-specific oracles for mathematics and argue that efforts to integrate computer algebra systems with theorem provers, systems which have largely been developed independently of one another, provide a concrete set of problems related to the notion of provable safety that has emerged in the AI safety community. We review approaches to interfacing CASs with theorem provers, describe well-defined architectural deficiencies that have been identified with CASs, and suggest possible lines of research and practical software projects for scientists interested in AI safety.

We give a precise semantics for a proposed revised version of the Knowledge Interchange Format. We show that quantification over relations is possible in a first-order logic, but sequence variables take the language beyond first-order.

We report on progress and an unsolved problem in our attempt to obtain a clear rationale for relevance logic via semantic decomposition trees. Suitable decomposition rules, constrained by a natural parity condition, generate a set of directly acceptable formulae that contains all axioms of the well-known system R, is closed under substitution and conjunction, satisfies the letter-sharing condition, but is not closed under detachment. To extend it, a natural recursion is built into the procedure for constructing decomposition trees. The resulting set of acceptable formulae has many attractive features, but it remains an open question whether it continues to satisfy the crucial letter-sharing condition.

J. D. Hamkins and Ø. Linnebo, “The modal logic of set-theoretic potentialism and the potentialist maximality principles.” (manuscript in preparation)
@ARTICLE{HamkinsLinnebo:Modallogicofsettheoreticpotentialism,
  author = {Joel David Hamkins and {\O}ystein Linnebo},
  title = {The modal logic of set-theoretic potentialism and the potentialist maximality principles},
  note = {manuscript in preparation},
  eprint = {1708.01644},
  archivePrefix = {arXiv},
  primaryClass = {math.LO},
  url = {http://jdh.hamkins.org/settheoreticpotentialism},
}
Abstract. …

The standard propositional account of necessary and sufficient conditions in many introductory logic textbooks is based on the material conditional. Some examples include (Barker-Plummer, Barwise, and Etchemendy 2011: 181-182), (Churchill 1986: 391-392), (Forbes 1994: 20-25), (Gabbay 2002: 68), (Haight 1999: 187-189), (Halverson 1984: 285-286), (Hardegree 2011: 129), (Layman 2002: 250-251), (Leblanc and Wisdom 1976: 16-18), (Salmon 1984: 47-48), (P. Smith 2003: 132), (Suppes 1957: 8-10) and (Watson and Arp 2015: 149). In the appendix, pertinent excerpts from some of these resources are provided. In general, the typical exposition goes along the following lines (again, cf. the appendix):
• “A is sufficient for B” is best rendered as “if A, then B”, or symbolically, (A ⊃ B).
• “A is necessary for B” is best rendered as “if not A, then not B”, or symbolically, (¬A ⊃ ¬B). This is equivalent to (B ⊃ A).
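These renderings can be checked by truth table. A small Python sketch (the helper name `implies` is ours) verifying that (¬A ⊃ ¬B) and (B ⊃ A) agree on every valuation:

```python
from itertools import product

def implies(p, q):
    # Material conditional: p ⊃ q is false only when p is true and q is false.
    return (not p) or q

for a, b in product([True, False], repeat=2):
    # "A is necessary for B": ¬A ⊃ ¬B, equivalent by contraposition to B ⊃ A.
    assert implies(not a, not b) == implies(b, a)

print("(¬A ⊃ ¬B) and (B ⊃ A) agree on all four valuations")
```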

A central proposition of this book is that there are no universal rules for inductive inference. The chapters so far have sought to argue for this proposition and to illustrate it by showing how several popular accounts of inductive inference fail to provide universally applicable rules. Many in an influential segment of the philosophy of science community will judge these efforts to be mistaken and futile. In their view, the problem has been solved, finally and irrevocably.

We propose an investigation of the ways in which speakers’ subjective perspectives are likely to affect the meaning of gradable adjectives like tall or heavy. We present the results of a study showing that people tend to use themselves as a yardstick when ascribing these adjectives to human figures of variable measurements: subjects’ height and weight requirements for applying tall and heavy are found to be positively correlated with their personal measurements. We draw more general lessons regarding the definition of subjectivity and the ways in which a standard of comparison and a significant deviation from that standard are specified.

Recent ideas about epistemic modals and indicative conditionals in formal semantics have significant overlap with ideas in modal logic and dynamic epistemic logic. The purpose of this paper is to show how greater interaction between formal semantics and dynamic epistemic logic in this area can be of mutual benefit. In one direction, we show how concepts and tools from modal logic and dynamic epistemic logic can be used to give a simple, complete axiomatization of Yalcin’s [16] semantic consequence relation for a language with epistemic modals and indicative conditionals. In the other direction, the formal semantics for indicative conditionals due to Kolodny and MacFarlane [9] gives rise to a new dynamic operator that is very natural from the point of view of dynamic epistemic logic, allowing succinct expression of dependence (as in dependence logic) or supervenience statements. We prove decidability for the logic with epistemic modals and Kolodny and MacFarlane’s indicative conditional via a full and faithful computable translation from their logic to the modal logic K45.

In normative political theory, it is widely accepted that democratic decision making cannot be reduced to voting alone, but that it requires reasoned and well-informed discussion by those involved in and/or subject to the decisions in question, under conditions of equality and respect. In short, democracy requires deliberation (e.g., Cohen 1989; Gutmann and Thompson 1996; Dryzek 2000; Fishkin 2009; Mansbridge et al. 2010). In formal political theory, by contrast, the study of democracy has focused less on deliberation, and more on the aggregation of individual preferences or opinions into collective decisions – social choices – typically through voting (e.g., Arrow 1951/1963; Riker 1982; Austen-Smith and Banks 2000, 2005; Mueller 2003). While the literature on deliberation has an optimistic flavour, the literature on social choice is more mixed. It is centred around several paradoxes and impossibility results showing that collective decision making cannot generally satisfy certain plausible desiderata. Any democratic aggregation rule that we use in practice seems, at best, a compromise.
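The flavour of these impossibility results is already visible in Condorcet’s classic voting paradox, where pairwise majority voting cycles. A minimal sketch, using the standard three-voter textbook profile:

```python
# Three voters with cyclic preference orders over alternatives A, B, C.
profiles = [('A', 'B', 'C'),   # voter 1: A > B > C
            ('B', 'C', 'A'),   # voter 2: B > C > A
            ('C', 'A', 'B')]   # voter 3: C > A > B

def majority_prefers(x, y):
    # x beats y if a strict majority of voters rank x above y.
    votes = sum(1 for order in profiles if order.index(x) < order.index(y))
    return votes > len(profiles) / 2

print(majority_prefers('A', 'B'),   # True
      majority_prefers('B', 'C'),   # True
      majority_prefers('C', 'A'))   # True: a majority cycle
```

Each pairwise contest is won 2-1, yet the collective “preference” A > B > C > A is cyclic, so majority voting here yields no coherent social ranking at all.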

In 1986 David Gauthier proposed an arbitration scheme for two-player cardinal bargaining games based on interpersonal comparisons of players’ relative concessions. In Gauthier’s original arbitration scheme, players’ relative concessions are defined in terms of Raiffa-normalized cardinal utility gains, and so it cannot be directly applied to ordinal bargaining problems. In this paper I propose a relative benefit equilibrating bargaining solution (RBEBS) for two- and n-player ordinal and quasiconvex ordinal bargaining problems with finite sets of feasible basic agreements based on the measure of players’ ordinal relative individual advantage gains. I provide an axiomatic characterization of this bargaining solution and discuss the conceptual relationship between RBEBS and the ordinal egalitarian bargaining solution (OEBS) proposed by Conley and Wilkie (2012). I show the relationship between the measurement procedure for ordinal relative individual advantage gains and the measurement procedure for players’ ordinal relative concessions, and argue that the proposed arbitration scheme for ordinal games can be interpreted as an ordinal version of Gauthier’s arbitration scheme.

Algebra is a branch of mathematics sibling to geometry, analysis
(calculus), number theory, combinatorics, etc. Although algebra has
its roots in numerical domains such as the reals and the complex
numbers, in its full generality it differs from its siblings in
serving no specific mathematical domain. Whereas geometry treats
spatial entities, analysis continuous variation, number theory integer
arithmetic, and combinatorics discrete structures, algebra is equally
applicable to all these and other mathematical domains. Elementary algebra, in use for centuries and taught in
secondary school, is the arithmetic of indefinite quantities or
variables \(x, y,\ldots\).
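This “arithmetic of indefinite quantities” can itself be mechanized. As a small illustration (the representation below, a polynomial in one variable \(x\) as a mapping from exponents to coefficients, is an expository choice, not part of the entry), multiplying \(x + 1\) by itself recovers \(x^2 + 2x + 1\):

```python
def poly_mul(p, q):
    # Multiply polynomials in an indefinite x, each given as {exponent: coefficient}.
    out = {}
    for i, a in p.items():
        for j, b in q.items():
            out[i + j] = out.get(i + j, 0) + a * b
    return out

x_plus_1 = {1: 1, 0: 1}              # x + 1
print(poly_mul(x_plus_1, x_plus_1))  # {2: 1, 1: 2, 0: 1}, i.e. x^2 + 2x + 1
```

The computation never assigns \(x\) a numerical value, which is exactly the sense in which elementary algebra operates on indefinite quantities rather than on members of any specific numerical domain.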