
What would I say is the most important takeaway from last week’s NISS “statistics debate” if you’re using (or contemplating using) Bayes factors (BFs), of the sort Jim Berger recommends, as replacements for P-values? …

In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of nonlocality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) superdeterminism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.

The propensity nature of evolutionary fitness has long been appreciated and is nowadays amply discussed (Abrams, 2009, 2012; Ariew and Ernst, 2009; Ariew and Lewontin, 2004; Beatty and Finsen, 1989; Brandon, 1978; Drouet and Merlin, 2015; Mills and Beatty, 1979; Millstein, 2003, 2016; Pence and Ramsey, 2013; Sober, 1984, 2001, 2013, 2019; Walsh, 2010; Walsh, Ariew, and Matthen, 2016; etc.). The discussion has, however, on occasion followed long-standing conflations in the philosophy of probability between propensities, probabilities, and frequencies. In this article, I apply a more recent conception of propensities in modelling practice (the ‘complex nexus of chance’, CNC) to some key issues regarding whether and how fitness is explanatory, and how it ought to be represented mathematically. The ensuing complex nexus of fitness (CNF) emphasises the distinction between biological propensities and the probability distributions over offspring numbers that they give rise to, and how critical it is to distinguish the possession conditions of the underlying dispositional (physical and biological) properties from those of their probabilistic manifestations.

To a first approximation, epistemic utility theory is an application of standard decision theoretic tools to the study of epistemic rationality. The strategy consists in identifying a particular class of decision problems—epistemic decision problems—and using the recommendations that our decision theory makes for them in order to motivate principles of epistemic rationality. The resulting principles will of course be a function of, among other things, what we take epistemic decision problems to be and of what specific brand of decision theory we rely on.1 But regardless of the details, epistemic utility theory inherits from the decision theoretic framework a distinction between axiological notions—of epistemic value or epistemic utility—and deontological notions—like epistemic rationality or epistemic permissibility.
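As a toy illustration of an epistemic decision problem (my own sketch, not from the paper; the numbers are illustrative), credences can be scored for inaccuracy with the Brier score, and probabilistically incoherent credences turn out to be accuracy-dominated by coherent ones at every world:

```python
def brier_penalty(credence, truth_value):
    """Squared distance of a credence from the truth value (0 or 1); lower is better."""
    return (credence - truth_value) ** 2

def total_penalty(cred_A, cred_notA, A_is_true):
    """Total inaccuracy of credences in A and in not-A at a given world."""
    return (brier_penalty(cred_A, 1 if A_is_true else 0)
            + brier_penalty(cred_notA, 0 if A_is_true else 1))

# Incoherent credences: c(A) + c(not-A) = 1.2
incoherent = [total_penalty(0.6, 0.6, w) for w in (True, False)]
# A coherent alternative: c(A) + c(not-A) = 1
coherent = [total_penalty(0.5, 0.5, w) for w in (True, False)]

# The coherent credences are strictly more accurate however the world turns out:
assert all(c < i for c, i in zip(coherent, incoherent))
```

Here the Brier score plays the role of an epistemic utility function, and "minimize inaccuracy" plays the role of the decision-theoretic recommendation; different scoring rules or decision rules would yield different principles of epistemic rationality.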

David Hilbert was promoting formalized mathematics, in which every real number, with its infinite series of digits, is a completed individual object. On the other side, the Dutch mathematician Luitzen Egbertus Jan Brouwer was defending the view that each point on the line should be represented as a never-ending process that develops in time, a view known as intuitionistic mathematics (Box 1).

Do scientific theories limit human knowledge? In other words, are there physical variables that are hidden, by their very essence, forever? We argue for negative answers and illustrate our point with chaotic classical dynamical systems. We emphasize parallels with quantum theory and conclude that the common real numbers are, de facto, the hidden variables of classical physics. Consequently, real numbers should not be considered as “physically real” and classical mechanics, like quantum physics, is indeterministic.

Propensities are presented as a generalization of classical determinism. They describe a physical reality intermediary between Laplacian determinism and pure randomness, such as in quantum mechanics. They are characterized by the fact that their values are determined by the collection of all actual properties. It is argued that they do not satisfy Kolmogorov axioms; other axioms are proposed.

We show that under plausible levels of background risk, no theory of choice under risk—such as expected utility theory, prospect theory, or rank-dependent utility—can simultaneously satisfy the following three economic postulates: (i) Decision makers are risk-averse over small gambles, (ii) they respect stochastic dominance, and (iii) they account for background risk.
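As a hedged back-of-the-envelope illustration of why these postulates pull against each other (my own sketch, not the paper's construction; the utility function and numbers are assumptions), consider an expected-utility agent with CARA utility. Such an agent can be risk-averse over small gambles at every wealth level, but only at the price of rejecting gambles with an enormous upside, and because CARA factors out additive independent risk, background risk leaves the choice untouched:

```python
import math

def expected_utility(u, wealth, gamble):
    """gamble: list of (probability, payoff) pairs."""
    return sum(p * u(wealth + x) for p, x in gamble)

def rejects(u, wealth, gamble):
    """The agent turns the gamble down iff it lowers expected utility."""
    return expected_utility(u, wealth, gamble) < u(wealth)

a = 0.001                              # assumed CARA coefficient
u = lambda w: -math.exp(-a * w)        # CARA utility: choices are wealth-independent

small = [(0.5, -100), (0.5, 110)]      # mild risk aversion over a small gamble...
huge = [(0.5, -1000), (0.5, 10**9)]    # ...forces rejecting a billion-dollar upside
assert rejects(u, 30_000, small)
assert rejects(u, 30_000, huge)

# Additive, independent background risk leaves a CARA agent's rejection unchanged:
background = [(0.5, -2000), (0.5, 2000)]
with_bg = [(pb * pg, b + g) for pb, b in background for pg, g in small]
assert (expected_utility(u, 30_000, with_bg)
        < expected_utility(u, 30_000, background))
```

CARA is the special case immune to additive background risk; the impossibility result in the abstract concerns what happens once postulates (i)–(iii) must hold jointly across a broader class of theories.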

How should governments decide between alternative taxation schemes, environmental protection regulations, infrastructure plans, climate change policies, healthcare systems, and other policies? One kind of consideration that should bear on such decisions is their effects on people’s well-being. The most rigorous methodology for evaluating such effects is the “social welfare function” (SWF) approach originating in the work of Abram Bergson and Paul Samuelson and further developed by Kenneth Arrow, Amartya Sen, and other economists.

In this paper, I motivate the addition of an actuality operator to relevant logics. Straightforward ways of doing this are in tension with standard motivations for relevant logics, but I show how to add the operator in a way that permits one to maintain the intuitions behind relevant logics. I close by exploring some of the philosophical consequences of the addition.

One approach to knowledge, termed the relevant alternatives theory, stipulates that a belief amounts to knowledge if one can eliminate all relevant alternatives to the belief in the epistemic situation. This paper uses causal graphical models to formalize the relevant alternatives approach to knowledge. On this theory, an epistemic situation is encoded through the causal relationships between propositions, which determine which alternatives are relevant and irrelevant. This formalization entails that statistical evidence is not sufficient for knowledge, provides a simple way to incorporate epistemic contextualism, and can rule out many Gettier cases from knowledge. The interpretation in terms of causal models offers more precise predictions for the relevant alternatives theory, strengthening the case for it as a theory of knowledge.

I am indebted to my own teachers, Prof. Peter Koepke and Prof. Stefan Geschke, who taught me everything in these notes. Prof. Geschke’s scriptum for Einführung in die Logik und Modelltheorie (Bonn, Summer 2010) provided an invaluable basis for the compilation of these notes.

The vocabulary of human languages has been argued to support efficient communication by optimizing the tradeoff between complexity and informativeness (Kemp & Regier 2012). The argument has been based on cross-linguistic analyses of vocabulary in semantic domains of content words such as kinship, color, and number terms. The present work extends this analysis to a category of function words: indefinite pronouns (e.g. someone, anyone, no-one, cf. Haspelmath 2001). We build on previous work to establish the meaning space and featural makeup for indefinite pronouns, and show that indefinite pronoun systems across languages optimize the complexity/informativeness tradeoff. This demonstrates that pressures for efficient communication shape both content and function word categories, thus tying in with the conclusions of recent work on quantifiers by Steinert-Threlkeld (2019). Furthermore, we argue that the tradeoff may explain some of the universal properties of indefinite pronouns, thus reducing the explanatory load for linguistic theories.

How did I respond to those 7 burning questions at last week’s (“P-Value”) Statistics Debate? Here’s a fairly close transcript of my (a) general answer and (b) final remark for each question, without the in-between responses to Jim and David. …

We define mereologically invariant composition as the relation between a whole object and its parts when the object retains the same parts during a time interval. We argue that mereologically invariant composition is identity between a whole and its parts taken collectively. Our reason is that parts and wholes are equivalent measurements of a portion of reality at different scales in the precise sense employed by measurement theory. The purpose of these scales is the numerical representation of primitive relations between quantities of being. To show this, we prove representation and uniqueness theorems for composition. Thus, mereologically invariant composition is trans-scalar identity.

The TL;DR summary of what follows is that we should quantify the conventionality of a regularity (David-Lewis-style) as follows:
A regularity R in the behaviour of population P in a recurring situation S is a convention of depth x, breadth y, and degree z when there is a recurring situation T that refines S, and in each instance of T there is a subpopulation K of P, such that it’s true and common knowledge among K in that instance that:

(A) BEHAVIOUR CONDITION: everyone in K conforms to R;
(B) EXPECTATION CONDITION: everyone in K expects everyone else in K to conform to R;
(C) SPECIAL PREFERENCE CONDITION: everyone in K prefers that they conform to R conditionally on everyone else in K conforming to R;

where x (depth) is the fraction of S-situations which are T, y (breadth) is the fraction of all Ps involved who are Ks in this instance, and z is the degree to which (A)–(C) obtaining resembles a coordination equilibrium that solves a coordination problem among the Ks. …
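The quantitative parts of this definition are straightforward to operationalize; here is a minimal sketch (the data structures and example names are hypothetical, not from the post) of depth and breadth as simple fractions:

```python
def depth(s_instances, is_T):
    """Fraction of recorded S-situations that are instances of the refining situation T."""
    return sum(1 for s in s_instances if is_T(s)) / len(s_instances)

def breadth(participants, conformers):
    """Fraction of the involved members of P who belong to the subpopulation K."""
    return len(conformers & participants) / len(participants)

# Toy example: 4 of 5 recorded situations are T; 3 of 4 participants are in K.
situations = ["s1", "s2", "s3", "s4", "s5"]
assert depth(situations, lambda s: s != "s5") == 0.8
assert breadth({"ann", "bob", "cat", "dan"}, {"ann", "bob", "cat"}) == 0.75
```

The degree z, a resemblance between (A)–(C) and a coordination equilibrium, is left abstract here, since the post characterizes it as a graded resemblance rather than a frequency.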

Sharing, downloading, and reusing software is commonplace, and some of it is carried out legally, as with open source software. When it is not legal, it is unclear how many infringements have taken place: does an infringement count for the artefact as a whole or for each source file of a computer program? To answer this question, it must first be established whether a computer program should be considered an integral whole, a collection, or a mere set of distinct files, and why. We argue that a program is a functional whole, availing of, and combining, arguments from mereology, granularity, modularity, unity, and function to substantiate the claim. The argumentation and answer contribute to the ontology of software artefacts, may assist industry in litigation cases, and demonstrate that the notion of unifying relation is operationalisable.

Noether’s first theorem does not establish a one-way explanatory arrow from symmetries to conservation laws, but such an arrow is widely assumed in discussions of the theorem in the physics and philosophy literature. It is argued here that there are pragmatic reasons for privileging symmetries, even if they do not strictly justify explanatory priority. To this end, some practical factors are adduced as to why Noether’s direct theorem seems to be better known and more exploited than its converse, with special attention being given to the sometimes overlooked nature of Noether’s converse result and to its strengthened version due to Luis Martinez Alonso in 1979 and Peter Olver in 1986.

It is usual to identify initial conditions of classical dynamical systems with mathematical real numbers. However, almost all real numbers contain an infinite amount of information. I argue that a finite volume of space can’t contain more than a finite amount of information, hence that the mathematical real numbers are not physically relevant. Moreover, a better terminology for the so-called real numbers is “random numbers”, as their series of bits are truly random. I propose an alternative classical mechanics, which is empirically equivalent to classical mechanics, but uses only finite-information numbers. This alternative classical mechanics is nondeterministic, despite the use of deterministic equations, in a way similar to quantum theory. Interestingly, both alternative classical mechanics and quantum theories can be supplemented by additional variables in such a way that the supplemented theory is deterministic. Most physicists straightforwardly supplement classical theory with real numbers to which they attribute physical existence, while rejecting Bohmian mechanics as supplemented quantum theory, arguing that Bohmian positions have no physical reality.
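A toy sketch of the idea (my own illustration, not the paper's formalism): represent an initial condition by finitely many determined bits, drawing fresh random bits only when the dynamics needs them. For the doubling map x ↦ 2x mod 1, whose coarse-grained orbit (left or right half of the unit interval at step n) simply reads off bit n of x, the deterministic equation then produces genuinely undetermined outcomes:

```python
import random

class FiniteInfoNumber:
    """A binary expansion whose bits beyond those already actualized
    are drawn at random on demand, and stay fixed thereafter."""

    def __init__(self, determined_bits, rng):
        self.bits = list(determined_bits)
        self.rng = rng

    def bit(self, k):
        while len(self.bits) <= k:
            self.bits.append(self.rng.randrange(2))  # fresh bit: pure chance
        return self.bits[k]

# Step n of the doubling map lands in the right half of [0, 1) iff bit n is 1,
# so the coarse-grained orbit just reads off successive bits.
x = FiniteInfoNumber([1, 0, 1], random.Random())
assert [x.bit(k) for k in range(3)] == [1, 0, 1]  # determined part: predictable
later = x.bit(10)          # undetermined in advance, actualized when first needed...
assert x.bit(10) == later  # ...and stable from then on
```

Supplementing the theory "with real numbers" corresponds to fixing all bits in advance, which restores determinism, exactly the parallel the abstract draws with Bohmian mechanics.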

Management and use of robots, and of Artificial Moral Agents (AMAs) more broadly, may involve contexts where the machines are expected to make moral decisions. The design of an AMA is typically compartmentalised between AI researchers and engineers on the one hand and philosophers on the other. As a result, current AMAs incorporate either no normative ethical theory or at most one as their basis. This is problematic because it narrows down an AMA’s functional ability and versatility: it results in moral outcomes that only some people agree with, and that possibly run counter to cultural norms, thereby undermining an AMA’s ability to be moral in a human sense. We aim to address this by taking a first step toward normed behaviour. We propose a three-layered model for general normative ethical theories, therewith enabling the representation of multiple normative theories, and of users’ specific instances thereof.

Multiple ontology languages have been developed over the years, which brings to the fore two key components: how to select the appropriate language for the task at hand, and language design itself. This engineering step entails examining the ontological ‘commitments’ embedded in the language, which, in turn, demands insight into what the effects of philosophical viewpoints may be on the design of a representation language. But what sorts of commitments should one be able to choose from that have an underlying philosophical point of view, and which philosophical stances have a knock-on effect on the specification or selection of an ontology language? In this paper, we provide a first step towards answering these questions. We identify and analyse ontological commitments embedded in logics, or that could be, and show that they have been taken in well-known ontology languages. This contributes to reflecting on the language as enabler of, or inhibitor to, formally characterising an ontology or an ontological investigation, as well as to the design of new ontology languages following the proposed design process.

(1700 words; 8 minute read.) What rational polarization looks like. It’s September 21, 2020. Justice Ruth Bader Ginsburg has just died. Republicans are moving to fill her seat; Democrats are crying foul. Fox News publishes an op-ed by Ted Cruz arguing that the Senate has a duty to fill her seat before the election. …

What is possible, according to the empiricist conception, is what our evidence positively allows; and what is necessary is what it compels. These notions, along with logical possibility, are the only defensible notions of possibility and necessity. In so far as nomic and metaphysical possibility are defensible, they fall within empirical possibility. These empirical conceptions are incompatible with traditional possible world semantics. Empirically necessary propositions cannot be defined as those true in all possible worlds. There can be empirical possibilities without empirical necessities. The duality of possibility and necessity can be degenerate and can even be falsified.

Many philosophers think that common sense knowledge survives sophisticated philosophical proofs against it. It’s much more certain that things move than it is that the premises of Zeno’s counterarguments are true. What goes for Zeno’s arguments against motion arguably goes for philosophical arguments against causation, time, tables, human beings, knowledge, and more.

The title of the 2001 Bakerian Lecture by David Sherrington, a renowned physicist and co-author of the Sherrington–Kirkpatrick model, was “On magnets, microchips, memories and markets: the statistical physics of complex systems.” That the Sherrington–Kirkpatrick model of spin glasses (i.e. disordered magnets) should find applications in fields as distant as statistical physics, computer science, neural network theory, and financial markets is both outstanding and commonplace. Indeed, it serves as an epitome of contemporary modeling practice, where the same functional forms and equations, and the same mathematical and computational methods, are transferred and recycled across disciplinary boundaries. While philosophers of science have only recently started to address the interdisciplinary dynamics of such modeling practice, it is far from a new phenomenon. The transfer of theoretical and formal tools from one area of physics to another, and from physics to other disciplines such as economics and biology, has marked many scientific breakthroughs in the 19th and early 20th centuries. More recently, engineering has had an increasing interdisciplinary influence on many fields, attested by, for example, the emergence of synthetic biology.

In a series of papers over the past twenty years, and in a new book, Igor Douven has argued that Bayesians are too quick to reject versions of inference to the best explanation that cannot be accommodated within their framework. In this paper, I survey Douven’s worries and attempt to answer them using a series of pragmatic and purely epistemic arguments that I take to show that Bayes’ Rule really is the only correct way to respond to your evidence.

We describe the translation invariant stationary states (TIS) of the one-dimensional facilitated asymmetric exclusion process in continuous time, in which a particle at site i ∈ Z jumps to site i + 1 (respectively i − 1) with rate p (resp. 1 − p), provided that site i − 1 (resp. i + 1) is occupied and site i + 1 (resp. i − 1) is empty. All TIS states with density ρ ≤ 1/2 are supported on trapped configurations in which no two adjacent sites are occupied; we prove that if in this case the initial state is Bernoulli then the final state is independent of p. This independence also holds for the system on a finite ring. For ρ > 1/2 there is only one TIS. It is the infinite volume limit of the probability distribution that gives uniform weight to all configurations in which no two holes are adjacent, and is isomorphic to the Gibbs measure for hard core particles with nearest neighbor exclusion.
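A minimal Monte Carlo sketch of the facilitated dynamics on a finite ring (a discrete-event caricature of the continuous-time process; ring size, density, p, and seed are all illustrative choices, not from the paper) shows the absorption into trapped configurations at density ρ < 1/2:

```python
import random

def movable(config, i):
    """Can the particle at site i make a facilitated jump in either direction?"""
    n = len(config)
    left, right = config[(i - 1) % n], config[(i + 1) % n]
    return config[i] == 1 and ((left, right) in ((1, 0), (0, 1)))

def step(config, p, rng):
    """Attempt one jump at a uniformly chosen site; return True if a move occurred."""
    n = len(config)
    i = rng.randrange(n)
    if config[i] == 1:
        d = 1 if rng.random() < p else -1  # right with prob. p, left with 1 - p
        # Facilitated rule: the site behind must be occupied, the target empty.
        if config[(i - d) % n] == 1 and config[(i + d) % n] == 0:
            config[i], config[(i + d) % n] = 0, 1
            return True
    return False

rng = random.Random(42)
n, particles = 20, 8                       # density 0.4 < 1/2
config = [1] * particles + [0] * (n - particles)
rng.shuffle(config)

attempts = 0
while any(movable(config, i) for i in range(n)) and attempts < 200_000:
    step(config, p=0.7, rng=rng)
    attempts += 1

# Absorbing (trapped) state: no two adjacent occupied sites anywhere on the ring.
assert all(not (config[i] and config[(i + 1) % n]) for i in range(n))
```

Note that on the ring a non-trapped configuration always admits some move (an adjacent occupied pair next to a hole can fire), so the absorbing states are exactly the trapped ones; the abstract's stronger claim, that the absorbed distribution from Bernoulli initial data is independent of p, could be probed by comparing statistics of final configurations across values of p.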

A recent claim by Meehan, that quantum mechanics has a new “control problem” which puts limits on our ability to prepare quantum states and revises our understanding of the no-cloning theorem, is examined. We identify flaws in Meehan’s analysis and argue that no such problem exists.

Bayesian personalism models learning from experience as the updating of an agent’s credence function on the information the agent acquires. The standard updating rules are hamstrung for zero-probability events. The maneuvers that have been proposed to handle this problem are examined and found wanting: they offer only temporary relief but no satisfying and stable long-term resolution. They do suggest a strategy for avoiding the problem altogether, but the price to be paid is a very crabbed account of learning from experience. I outline what Bayesians would need to do in order to come to grips with the problem rather than seeking to avoid it. Furthermore, I emphasize that an adequate treatment of the issues must work not only for classical probability but also for quantum probability, the latter of which is rarely discussed in the philosophical literature in the same breath with the updating problem. Since it is not obvious how the maneuvers applied to updating classical probability can be made to work for updating quantum probability, a rethinking of the problem may be required. At the same time, I indicate that in some special cases quantum probability theory has a self-contained solution to the problem of updating on zero-probability events, requiring no additional technical devices or rationality constraints.

In this paper, I use interventionist causal models to identify some novel Newcomb problems, and subsequently use these problems to refine existing interventionist treatments of causal decision theory. The new Newcomb problems that stir up trouble for existing interventionist treatments involve so-called “exotic choice” — i.e., decision-making contexts where the agent has evidence about the outcome of her choice. I argue that when choice is exotic, the interventionist can adequately capture causal-decision-theoretic reasoning by introducing a new interventionist approach to updating on exotic evidence. But I also argue that this new updating procedure is principled only if the interventionist trades in the typical interventionist conception of choice for an alternative Ramseyan conception. I end by arguing that the guide to exotic choice developed here may be useful in some everyday contexts, despite its name.