-
We develop a theory of policy advice that focuses on the relationship between the competence of the advisor (e.g., an expert bureaucracy) and the quality of advice that the leader may expect. We describe important tensions between these features that are present in a wide class of substantively important circumstances. These tensions point to a trade-off between receiving advice more often and receiving more informative advice. The optimal resolution of this trade-off for the leader sometimes induces her to prefer advisors of limited competence – a preference that, we show, is robust under different informational assumptions. We consider how institutional tools available to leaders affect preferences for advisor competence and the quality of advice they may expect to receive in equilibrium.
-
There are two main strands of argument regarding the value-free ideal (VFI): desirability and achievability (Reiss and Sprenger 2020). In this essay, I will argue for what I will call a compatibilist account of upholding the VFI, focusing on its desirability even if the VFI is unachievable. First, I will explain what the VFI is. Second, I will show that striving to uphold the VFI (desirability) is compatible with the rejection of its achievability. Third, I will demonstrate that the main arguments against the VFI do not refute its desirability. Finally, I will provide arguments for why it is desirable to strive to uphold the VFI even if the VFI is unachievable and show what role it can play in scientific inquiry. There is no single definition of the VFI, yet the most common way to interpret it is that non-epistemic values ought not to influence scientific reasoning (Brown 2024, 2). Non-epistemic values are understood as certain ethical, social, cultural or political considerations. Therefore, it is the role of epistemic values, such as accuracy, consistency, empirical adequacy and simplicity, to be part of and to ensure proper scientific reasoning.
-
Statistics play an essential role in an extremely wide range of human reasoning. From theorizing in the physical and social sciences to determining evidential standards in legal contexts, statistical methods are ubiquitous, and various questions about their application inevitably arise. As tools for making inferences that go beyond a given set of data, they are inherently a means of reasoning ampliatively, so it is unsurprising that philosophers interested in the notions of evidence and inductive inference have sought to use statistical frameworks to further our understanding of these topics. However, the field of statistics has long been the subject of heated philosophical controversy. Given that a central goal for philosophers of science is to help resolve problems about evidence and inference in scientific practice, it is important that they be involved in current debates in statistics and data science. The purpose of this topical collection is to promote such philosophical interaction. We present a cross-section of work on these subjects, written by scholars from a variety of fields, in order to explore issues in the philosophy of statistics from different perspectives.
-
Robert W. Batterman’s A Middle Way: A Non-Fundamental Approach to Many-Body Physics is an extraordinarily insightful book, far-reaching in its scope and significance, interdisciplinary in character due to connections made between physics, materials science and engineering, and biology, and groundbreaking in the sense that it reflects on important scientific domains that are mostly absent from current literature. The book presents a hydrodynamic methodology, which Batterman explains is pervasive in science, for studying many-body systems as diverse as gases, fluids, and composite materials like wood, steel, and bone. Following Batterman, I will call said methodology the middle-out strategy. Batterman’s main thesis is that the middle-out strategy is superior to alternatives, solves an important autonomy problem, and, consequently, implies that certain mesoscale structures (explained below) ought to be considered natural kinds. In what follows, I unpack and flesh out these claims, starting with a discussion of the levels of reality and its representation. Afterward, I briefly outline the contents of the book’s chapters and then identify issues that seem to me to merit further clarification.
-
I take a pragmatist perspective on quantum theory. This is not a view of the world described by quantum theory. In this view quantum theory itself does not describe the physical world (nor our observations, experiences or opinions of it). Instead, the theory offers reliable advice—on when to expect an event of one kind or another, and on how strongly to expect each possible outcome of that event. The event’s actual outcome is a perspectival fact—a fact relative to a physical context of assessment. Measurement outcomes and quantum states are both perspectival. By noticing that each must be relativized to an appropriate physical context one can resolve the measurement problem and the problem of nonlocal action. But if the outcome of a quantum measurement is not an absolute fact, then why should the statistics of such outcomes give us any objective reason to accept quantum theory? One can describe extensions of the scenario of Wigner’s friend in which a statement expressing the outcome of a quantum measurement would be true relative to one such context but not relative to another. However, physical conditions in our world prevent us from realizing such scenarios. Since the outcome of every actual quantum measurement is certified at what is essentially a single context of assessment, the outcome relative to that context is an objective fact in the only sense that matters for science. We should accept quantum theory because the statistics these outcomes display are just those it leads us to expect.
-
Extrapolating causal effects is becoming an increasingly important kind of inference in Evidence-Based Policy, development economics, and microeconometrics more generally. While several strategies have been proposed to aid with extrapolation, the existing methodological literature has left our understanding of what extrapolation consists of, and of what constitutes successful extrapolation, underdeveloped. This paper addresses this gap by offering a novel account of successful extrapolation. Building on existing contributions concerning the challenges involved in extrapolation, this more nuanced and comprehensive account seeks to provide tools that facilitate the scrutiny of specific extrapolative inferences and of general strategies for extrapolation. Offering such resources is especially important in view of the increasing amount of real-world decision-making in policy, development, and beyond that involves extrapolation.
-
I dispute the conventional claim that the second law of thermodynamics is saved from a "Maxwell's Demon" by the entropy cost of information erasure, and show that instead it is measurement that incurs the entropy cost. Thus Brillouin, who identified measurement as savior of the second law, was essentially correct, and putative refutations of his view, such as Bennett's claim to measure without entropy cost, are seen to fail when the applicable physics is taken into account. I argue that the tradition of attributing the defeat of Maxwell's Demon to erasure rather than to measurement arose from unphysical classical idealizations that do not hold for real gas molecules, as well as a physically ungrounded recasting of physical thermodynamical processes into computational and information-theoretic conceptualizations. I argue that the fundamental principle that saves the second law is the quantum uncertainty principle applying to the need to localize physical states to precise values of observables in order to effect the desired disequilibria aimed at violating the second law. I obtain the specific entropy cost for localizing a molecule in the Szilard engine, which coincides with the quantity attributed to Landauer's principle. I also note that an experiment characterized as upholding an entropy cost of erasure in a "quantum Maxwell's Demon" actually demonstrates an entropy cost of measurement.
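For orientation only, and not as a substitute for the paper's derivation: the quantity usually attributed to Landauer's principle, and the natural benchmark for the cost of localizing the molecule to one half of the Szilard box, is the textbook k_B ln 2 per binary localization:

```latex
% Halving the accessible volume of a one-molecule ideal gas at temperature T
% (standard Szilard-engine bookkeeping, stated here only for reference):
\Delta S = k_B \ln\frac{V}{V/2} = k_B \ln 2,
\qquad
W_{\min} = T\,\Delta S = k_B T \ln 2 .
```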
-
We draw a distinction between the traditional reference class problem which describes an obstruction to estimating a single individual probability—which we re-term the individual reference class problem—and what we call the reference class problem at scale, which can result when using tools from statistics and machine learning to systematically make predictions about many individual probabilities simultaneously. We argue that scale actually helps to mitigate the reference class problem, and purely statistical tools can be used to efficiently minimize the reference class problem at scale, even though they cannot be used to solve the individual reference class problem.
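A purely illustrative sketch of the contrast (my toy setup, not the authors' analysis; the features, model, and error measure below are hypothetical choices): any single individual's probability estimate depends on which reference class we consult, and different classes give conflicting answers, but a model fitted to all individuals at once can be scored by its average error across the whole population, a criterion that only exists at scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical population: each individual's true probability depends on two binary features.
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)
true_p = 1 / (1 + np.exp(-(-1.0 + 1.5 * x1 + 0.8 * x2)))   # true individual probabilities
y = rng.random(n) < true_p                                   # observed binary outcomes

# "Individual reference class" style estimates: condition on one feature only.
# The two choices give conflicting answers for the same individual.
est_by_x1 = np.array([y[x1 == v].mean() for v in (0, 1)])[x1]
est_by_x2 = np.array([y[x2 == v].mean() for v in (0, 1)])[x2]

# A statistical model fitted at scale: logistic regression on both features
# (plain gradient descent, to keep the sketch dependency-free).
X = np.column_stack([np.ones(n), x1, x2]).astype(float)
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n
model_p = 1 / (1 + np.exp(-X @ w))

# At scale we can compare average (squared) error against the true probabilities.
for name, est in [("class on x1", est_by_x1), ("class on x2", est_by_x2), ("model", model_p)]:
    print(f"{name:12s} mean squared error vs true p: {np.mean((est - true_p) ** 2):.4f}")
```

The reference-class estimates are each correct for their own class; the point of the toy comparison is that only the aggregate, at-scale evaluation gives the statistical tools something determinate to minimize.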
-
Modal Empiricism in philosophy of science proposes to understand the possibility of modal knowledge from experience by replacing talk of possible worlds with talk of possible situations, which are coarse-grained, bounded and relative to background conditions. This allows for an induction towards objective necessity, assuming that actual situations are representative of possible ones. The main limitation of this epistemology is that it does not account for probabilistic knowledge. In this paper, we propose to extend Modal Empiricism to the probabilistic case, thus providing an inductivist epistemology for probabilistic knowledge. The key idea is that extreme probabilities, close to 1 and 0, serve as proxies for testing mild probabilities, using a principle of model combination.
-
In operational quantum mechanics two measurements are called operationally equivalent if they yield the same distribution of outcomes in every quantum state and hence are represented by the same operator. In this paper, I will show that the ontological models for quantum mechanics and, more generally, for any operational theory sensitively depend on which measurement we choose from the class of operationally equivalent measurements, or, more precisely, on which of the chosen measurements can be performed simultaneously. To this end, I will first take three examples – a classical theory, the EPR-Bell scenario, and the Popescu-Rohrlich box; then realize each example by two operationally equivalent but different operational theories – one with a trivial and another with a non-trivial compatibility structure; and finally show that the ontological models for the different theories will be different with respect to their causal structure, contextuality, and fine-tuning.
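For concreteness, since the Popescu-Rohrlich box is one of the three examples, here is a minimal numerical description of the standard box (the textbook definition, not the paper's two operational realizations of it or their ontological models):

```python
import itertools

# Popescu-Rohrlich box: P(a, b | x, y) = 1/2 if a XOR b == x AND y, else 0.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# No-signalling: Alice's marginal P(a | x, y) does not depend on Bob's setting y (and symmetrically).
for a, x in itertools.product((0, 1), repeat=2):
    marginals = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
    assert marginals[0] == marginals[1] == 0.5

# Correlator E(x, y) = sum_{a,b} (-1)^(a+b) P(a, b | x, y).
def corr(x, y):
    return sum((-1) ** (a + b) * pr_box(a, b, x, y) for a in (0, 1) for b in (0, 1))

chsh = corr(0, 0) + corr(0, 1) + corr(1, 0) - corr(1, 1)
print("CHSH =", chsh)   # 4.0, the algebraic maximum, beyond the quantum Tsirelson bound of 2*sqrt(2)
```

The assertions verify no-signalling; the printed CHSH value of 4 is the box's defining super-quantum feature.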
-
QBism explicitly takes the subjective view: probabilities of events are defined solely by past experiences, i.e. the record of observations. As shown by the authors (Fuchs et al., 2013), this “... removes the paradoxes, conundra, and pseudo-problems that have plagued quantum foundations for the past nine decades”. It is criticised for its lack of ontology and for its anthropocentric nature. However, if Everett's (1957) formulation is taken at face value, exactly the features of QBism are the result, and the ontology is inherent. The anthropocentric nature of the solution is simply an indication that the quantum state is relative, as is central to Everett. Problems of measurement and locality do not arise.
-
In Part 1 the properties of QBism are shown to be natural consequences of taking quantum mechanics at face value, as does Everett in his Relative State Formulation (1957). In Part 2 supporting evidence is presented. Parmenides' notion (Palmer, 2012) that the physical world is static and unchanging is vividly confirmed in the new physics. This means that the time evolution of the physical world perceived by observers only occurs at the level of appearances, as noted by Davies (2002). In order to generate this appearance of time evolution, a moving frame of reference is required: this is the only possible explanation of the enactment of the dynamics of physics in a static universe.
-
Despite the simplicity of Weyl's solution to the paradox of the passage of time in the static block universe, virtually no interest is shown in this approach, although, as shown in Part 2, the problem of the Now could be taken as evidence that his solution is correct. A moving frame of reference is required to explain the experience of the enactment of any of the dynamics of physics, and the experiencing consciousness supervenes on this phenomenon. Given that the logic involved is straightforward, it seems that the reasons all this has been ignored may be less so. Here it is suggested, based on Davies' (2006) research, that this might well involve a horror of even the possibility of deity and mysticism being dignified by discussion, let alone endorsement. The objective here is to demonstrate that this approach does validate certain archetypal myths of the great spiritual traditions, while at the same time fully supporting and reinforcing the objective basis of the science of physics. The myths are exploded to reveal simply scientific principles and a complete absence of gods or mystical phenomena; indeed, such things are categorically ruled out. The scientific principles illustrated by the third logical type, which have languished unexamined, turn out to be powerful knowledge that serves only to reinforce and emphasise how deeply flawed were the key principles of the religious preoccupations which our culture had to relinquish in order to move forward.
-
The localization problem in relativistic quantum theory has persisted for more than seven decades, yet it is largely unknown and continues to perplex even those well-versed in the subject. At the heart of this problem lies a fundamental conflict between localizability and relativistic causality, which can also be construed as part of the broader dichotomy between measurement and unitary dynamics. This article provides a historical review of the localization problem in one-particle relativistic quantum mechanics, clarifying some persistent misconceptions in the literature, and underscoring the antinomy between causal dynamics and localized observables.
-
While emergentism enjoys some good fortune in contemporary philosophy, attempts at elucidating the history of this view are rare. Among such attempts, by far the most influential is certainly McLaughlin's landmark paper “The Rise and Fall of British Emergentism” (1992). While McLaughlin's analysis of the recent history of emergentism is insightful and instructive in its own ways, in the present paper we offer reasons to be suspicious of some of its central claims. In particular, we advance evidence that rebuts McLaughlin's contention that British Emergentism fell in the 1920s–1930s not because of philosophical criticism but rather because of an alleged empirical inconsistency with fledgling quantum mechanics.
-
High speed store required: 947 words
No. of bits in a word: 64
Is the program overlaid? No
No. of magnetic tapes required: None
What other peripherals are used? Card Reader; Line Printer
No. of cards in combined program and test deck: 112
Card punching code: EBCDIC
Keywords: Atomic, Molecular, Nuclear, Rotation Matrix, Rotation Group, Representation, Euler Angle, Symmetry, Helicity, Correlation
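The program listing itself is not part of this summary. Purely as a hedged, modern illustration of the kind of quantity the keywords point to (rotation-group representations parameterized by Euler angles), the sketch below evaluates the standard Wigner small d-matrix element from its explicit factorial sum; it is not the catalogued program and makes no claim about that program's interface or conventions.

```python
from math import factorial, cos, sin

def wigner_small_d(j, mp, m, beta):
    """Wigner small d-matrix element d^j_{m'm}(beta) via the common factorial-sum formula.

    j, mp, m may be integers or half-integers (j+m, j-m, j+mp, j-mp must be integers).
    Sign conventions for d-matrices vary between references; this follows the usual sum formula.
    """
    def fact(x):
        return factorial(int(round(x)))

    pref = (fact(j + mp) * fact(j - mp) * fact(j + m) * fact(j - m)) ** 0.5
    total = 0.0
    s = max(0, int(round(m - mp)))                      # smallest s keeping all factorial arguments >= 0
    s_max = min(int(round(j + m)), int(round(j - mp)))  # largest such s
    while s <= s_max:
        num = (-1) ** int(round(mp - m + s))
        den = fact(j + m - s) * fact(s) * fact(mp - m + s) * fact(j - mp - s)
        total += num / den * cos(beta / 2) ** int(round(2 * j + m - mp - 2 * s)) \
                           * sin(beta / 2) ** int(round(mp - m + 2 * s))
        s += 1
    return pref * total

# Quick checks against closed forms: d^1_{0,0}(beta) = cos(beta), d^{1/2}_{1/2,1/2}(beta) = cos(beta/2).
print(wigner_small_d(1, 0, 0, 0.7), cos(0.7))
print(wigner_small_d(0.5, 0.5, 0.5, 0.7), cos(0.35))
```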
-
I gave a talk on March 8 at an AI, Systems, and Society Conference at the Emory Center for Ethics. The organizer, Alex Tolbert (who had been a student at Virginia Tech), suggested I speak about controversies in statistics, especially P-hacking in statistical significance testing. …
-
A neglected but challenging argument developed by Peter Geach, John Haldane, and Stephen Rothman purports to show that reproduction cannot be explained by natural selection and is irreducibly teleological. Meanwhile, the most plausible definitions of life include reproduction as a constitutive feature. The implication of combining these ideas is that life cannot be explained by natural selection and is irreducibly teleological. This does not entail that life cannot be explained in evolutionary terms of some kind, but it does lend support to the controversial view of Jerry Fodor and Thomas Nagel that evolutionists need to look beyond the constraints of Neo-Darwinism.
-
Where does the Born Rule come from? We ask: “What is the simplest extension of probability theory in which the Born Rule appears?” This is answered by introducing “superposition events” in addition to the usual discrete events. Two-dimensional matrices (e.g., incidence matrices and density matrices) are needed to mathematically represent the differences between the two types of events. Then it is shown that the incidence and density matrices for superposition events are the (outer) products of a vector and its transpose, whose components foreshadow the “amplitudes” of quantum mechanics. The squares of the components of those “amplitude” vectors yield the probabilities of the outcomes. That is how probability amplitudes and the Born Rule arise in the minimal extension of probability theory to include superposition events. This naturally extends to the full Born Rule in the Hilbert spaces over the complex numbers of quantum mechanics. It would perhaps be satisfying if probability amplitudes and the Born Rule arose only as the result of deep results in quantum mechanics (e.g., Gleason's Theorem). But both arise in a simple extension of probability theory to include “superposition events” – which should not be too surprising, since superposition is the key non-classical concept in quantum mechanics.
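On one reading of the construction just described (a hedged toy rendering in code, not the paper's own formalism), the density matrix of a “superposition event” is the normalized outer product of a nonnegative “amplitude” vector with itself, and the squared components along the diagonal are exactly the outcome probabilities:

```python
import numpy as np

# Toy "superposition event" over three outcomes with unnormalized weights (hypothetical numbers).
weights = np.array([1.0, 2.0, 1.0])
amplitude = np.sqrt(weights / weights.sum())       # real, nonnegative "amplitude" vector

rho = np.outer(amplitude, amplitude)               # density matrix = outer product of the vector with itself
probs_born = np.diag(rho)                          # Tr(rho P_i) with P_i = |i><i| is just the diagonal

print(probs_born)                                  # [0.25 0.5 0.25] -- squares of the amplitude components
print(amplitude ** 2)                              # same numbers: the squared-component (Born) rule
print(np.isclose(np.trace(rho), 1.0), np.allclose(rho, rho @ rho))   # unit trace, rank-one projector
```

The complex Hilbert-space case is of course where the paper's real claim lies; the sketch only shows the squared-component rule already appearing at the level of real, nonnegative weights.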
-
One way of defining life is via a real definition, which gives the essence of life. Another approach is an operational definition, which shows how living things can be tested or measured in a way that is distinctive of the biological. Although I give a real definition elsewhere, in this paper I provide an operational definition, echoing Canguilhem’s dictum that life is what is capable of making mistakes. Biological mistakes are central to the behaviour of organisms, their parts and sub-systems, and the collections to which they belong. I provide an informal definition of a biological mistake. I contrast mistakes with mere failures and malfunctions. Although closely related phenomena, each is distinct. After giving some brief examples of mistake-making and how it can be tested, I reply to some objections to the very idea of a biological mistake.
-
The theoretical developments that led to supersymmetry – first global and then local – over a period of about six years (1970/71-1976) emerged from a confluence of physical insights and mathematical methods drawn from diverse, and sometimes independent, research directions. Despite these varied origins, a common thread united them all: the pursuit of a unity in physics, grounded in the central role of symmetry, where “symmetry” is understood in terms of group theory, representation theory, algebra, and differential geometry.
-
Scientific fields frequently need to exchange data to advance their own inquiries. Data unification is the process of stabilizing these forms of interfield data exchange. I present an account of the epistemic structure of data unification, drawing on case studies from model-based cognitive neuroscience (MBCN). MBCN is distinctive because it shows that modeling practices play an essential role in mediating these data exchanges. Models often serve as interfield evidential integrators, and models built for this purpose have their own representational and inferential functions. This form of data unification should be seen as autonomous from other forms, particularly explanatory unification.
-
Scientists do not merely choose to accept fully formed theories; they also have to decide which models to work on before they are fully developed and tested. Since decisive empirical evidence in favour of a model will not yet have been gathered, other criteria must play determining roles. I examine the case of modern high-energy physics, where the experimental context that once favoured the pursuit of beautiful, simple, and general theories now favours the pursuit of models that are ad hoc, narrow in scope, and complex; in short, ugly models. The lack of new discoveries since the Higgs boson, together with the unlikeliness of a new higher-energy collider, has left searches for new physics conceptually and empirically wide open. Physicists must make use of the experiment at hand while also creatively exploring alternatives that have not yet been tried. This encourages the pursuit of models that have at least one of two key features: i) they take radically novel approaches, or ii) they are easily testable. I present three models – neutralino dark matter, the relaxion, and repulsive gravity – and show that even if they do not exhibit traditional epistemic virtues, they are nonetheless pursuitworthy. I argue that experimental context strongly determines pursuitworthiness and I lay out the conditions under which experiment encourages the pursuit of ugly models.
-
Probabilities play an essential role in the prediction and explanation of events and thus feature prominently in well-confirmed scientific theories. However, such probabilities are frequently described as subjective, epistemic, or both. This prompts a well-known puzzle: how could scientific posits that predict and explain human-independent events essentially involve agents or knowers? I argue that the puzzle can be resolved by acknowledging that although such probabilities are non-fundamental, they may still be ontic and objective. To this end I describe dynamical mechanisms that are responsible for the convergence of probability distributions for chaotic systems, and apply an account of emergence developed elsewhere. I suggest that this analysis will generalise and claim that, consequently, a great many of the probabilities in science should be characterised in the same terms. Along the way I’ll defend a particular definition of chaos that suits the emergence analysis.
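As a hedged, self-contained illustration of the kind of dynamical mechanism at issue (chaotic stretching and folding washing out differences between initial probability distributions; the choice of map and of distance measure below is mine, not the author's), one can watch two very different ensembles of initial conditions converge to a common distribution under the fully chaotic logistic map:

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic(x):
    """Fully chaotic logistic map x -> 4 x (1 - x) on [0, 1]."""
    return 4.0 * x * (1.0 - x)

# Two deliberately different initial probability distributions over the state space.
ensemble_a = rng.beta(8, 2, 100_000)           # mass concentrated near 1
ensemble_b = rng.uniform(0.40, 0.45, 100_000)  # mass concentrated in a narrow band

bins = np.linspace(0.0, 1.0, 51)

def tv_distance(u, v):
    """Total-variation-style distance between the two ensembles' histograms."""
    pu, _ = np.histogram(u, bins=bins, density=True)
    pv, _ = np.histogram(v, bins=bins, density=True)
    return 0.5 * np.sum(np.abs(pu - pv)) * (bins[1] - bins[0])

for step in range(16):
    if step % 3 == 0:
        print(f"iteration {step:2d}: distance between distributions = {tv_distance(ensemble_a, ensemble_b):.3f}")
    ensemble_a = logistic(ensemble_a)
    ensemble_b = logistic(ensemble_b)
```

The printed distance shrinks over the iterations: after a handful of steps both ensembles approximate the map's invariant distribution, regardless of how they started.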
-
Suppose we observe many emeralds which are all green. This observation usually provides good evidence that all emeralds are green. However, the emeralds we have observed are also all grue, which means that they are either green and already observed or blue and not yet observed. We usually do not think that our observation provides good evidence that all emeralds are grue. Why? I argue that if we are in the best case for inductive reasoning, we have reason to assign low probability to the hypothesis that all emeralds are grue before seeing any evidence. My argument appeals to random sampling and the observation-independence of green, understood as probabilistic independence of whether emeralds are green and when they are observed.
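To make the independence condition concrete (a standard formalization in my own notation, not the paper's), write Obs(x) for “x has already been observed”: observation-independence of green says that being green is probabilistically independent of being observed, whereas grue is defined in terms of observation, so no such independence can hold for it:

```latex
% Observation-independence of green, for a randomly sampled emerald x:
P\bigl(\mathrm{Green}(x)\mid \mathrm{Obs}(x)\bigr) = P\bigl(\mathrm{Green}(x)\bigr),
% whereas, by definition,
\mathrm{Grue}(x) \iff \bigl(\mathrm{Green}(x)\wedge \mathrm{Obs}(x)\bigr)\vee\bigl(\mathrm{Blue}(x)\wedge\neg\,\mathrm{Obs}(x)\bigr),
% so whether x is grue is, by construction, not independent of whether x has been observed.
```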
-
While there has been much discussion of whether AI systems could function as moral agents or acquire sentience, there has been very little discussion of whether AI systems could have free will. I sketch a framework for thinking about this question, inspired by Daniel Dennett’s work. I argue that, to determine whether an AI system has free will, we should not look for some mysterious property, expect its underlying algorithms to be indeterministic, or ask whether the system is unpredictable. Rather, we should simply ask whether we have good explanatory reasons to view the system as an intentional agent, with the capacity for choice between alternative possibilities and control over the resulting actions. If the answer is “yes”, then the system counts as having free will in a pragmatic and diagnostically useful sense.
-
trices. The main aim is to construct a system of Nmatrices by replacing standard sets with quasets. Since QST is a conservative extension of ZFA (Zermelo-Fraenkel set theory with atoms), it is possible to obtain generalized Nmatrices (Q-Nmatrices). Since the original formulation of QST is not completely adequate for the developments we advance here, some possible amendments to the theory are also considered. One of the most interesting traits of such an extension is the existence of complementary quasets which admit elements with undetermined membership. Such elements can be interpreted as quantum systems in superposed states. We also present a relationship between QST and the theory of rough sets (RST), which grants the existence of models for QST formed by rough sets. Some consequences of the given formalism for the relation of logical consequence are also analysed.
-
This paper concerns the question of which collections of general relativistic spacetimes are deterministic relative to which definitions. We begin by considering a series of three definitions of increasing strength due to Belot (1995). The strongest of these definitions is particularly interesting for spacetime theories because it involves an asymmetry condition called “rigidity” that has been studied previously in a different context (Geroch 1969; Halvorson and Manchak 2022; Dewar 2024). We go on to explore other (stronger) asymmetry conditions that give rise to other (stronger) forms of determinism. We introduce a number of definitions of this type and clarify the relationships between them and the three considered by Belot. We then show that there are collections of general relativistic spacetimes that satisfy much stronger forms of determinism than previously known. We also highlight a number of open questions.
-
Determinism is the thesis that the past determines the future, but efforts to define it precisely have exposed deep methodological disagreements. Standard possible-worlds formulations of determinism presuppose an "agreement" relation between worlds, but this relation can be understood in multiple ways, none of which is particularly clear. We critically examine the proliferation of definitions of determinism in the recent literature, arguing that these definitions fail to deliver clear verdicts about actual scientific theories. We advocate a return to a formal approach, in the logical tradition of Carnap, that treats determinism as a property of scientific theories, rather than an elusive metaphysical doctrine. We highlight two key distinctions: (1) the difference between qualitative and "full" determinism, as emphasized in recent discussions of physics and metaphysics, and (2) the distinction between weak and strong formal conditions on the uniqueness of world extensions. We argue that defining determinism in terms of metaphysical notions such as haecceities is unhelpful, whereas rigorous formal criteria such as Belot's D1 and D3 offer a tractable and scientifically relevant account. By clarifying what it means for a theory to be deterministic, we set the stage for a fruitful interaction between physics and metaphysics.
-
The idea that the universe is governed by laws of nature has precursors from ancient times, but the view that it is a primary - or even the primary - aim of science to discover these laws only became established during the 16th and 17th centuries, when it replaced the then prevalent Aristotelian conception of science. The most prominent promoters and developers of the new view were Galileo, Descartes, and Newton. Descartes, in Le Monde, dreamed of an elegant mathematical theory that specified laws describing the motions of matter, and Newton in his Principia went a long way towards realizing this dream.