The modal properties of the principle of the causal closure of the physical have traditionally been taken to prevent anything outside the physical universe from affecting it, and vice versa. This idea has been shown to be relative to the definition of the principle (Gamper 2017). A traditional definition prevents one universe from affecting any other, but under a modified definition, e.g. (ibid.), the causal closure of the physical is consistent with the possibility of one universe affecting another. Gamper (2017) proved this modal property by implementing interfaces between universes. Interfaces are thus possible, but are they realistic? To answer this question, I propose a two-step process whose second step is scientific research. The first step, however, is to bridge the gap between the principles, or basic assumptions, and science with a consistent theoretical framework that accommodates the modal properties of an ontology matching the basic assumptions.
Robert Batterman and others have argued that certain idealizing explanations have an asymptotic form: they account for a state of affairs or behavior by showing that it emerges “in the limit”. Asymptotic idealizations are interesting in many ways, but is there anything special about them as idealizations? To understand their role in science, must we augment our philosophical theories of idealization? This paper uses simple examples of asymptotic idealization in population genetics to argue for an affirmative answer and proposes a general schema for asymptotic idealization, drawing on insights from Batterman’s treatment and from John Norton’s subsequent critique.
This chapter is about number. Without it, modern economic life would be impossible, science would never have developed, and the complex technology that surrounds us would not exist. Though the full range of human numerical abilities is vast, the positive integers are arguably foundational to the rest of numerical cognition, and they will be our focus here. Many theorists have noted that although animals can represent quantity in some respects, they are unable to represent precise integer values. There has been much speculation about why this is so, but a common answer is that it is because animals lack another characteristic feature of human minds—natural language. In this chapter, we examine the question of whether there is an essential connection between language and number, while looking more broadly at some of the potential innate precursors to the acquisition of the positive integers. A full treatment of this topic would require an extensive review of the empirical literature, something we do not have space for. Instead, we intend to concentrate on the theoretical question of how language may figure in an account of the ontogeny of the positive integers.
Game-theoretic approaches to social norms have flourished in recent years, and on first inspection theorists seem to agree on the broad lines that such accounts should follow. By contrast, this paper aims to show that the two main interpretations of social norms are at odds on at least one aspect of social norms, and that both fail to account for another aspect.
According to orthodox (Kolmogorovian) probability theory, conditional probabilities are by definition certain ratios of unconditional probabilities. As a result, orthodox conditional probabilities are regarded as undefined whenever their antecedents have zero unconditional probability. This has important ramifications for the notion of probabilistic independence.
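Stated explicitly, the orthodox ratio definition referred to above (in standard Kolmogorovian notation, not notation taken from the paper itself) is:

```latex
P(A \mid B) \;=\; \frac{P(A \cap B)}{P(B)} \quad \text{(undefined when } P(B) = 0\text{)},
\qquad
A \perp B \;\iff\; P(A \mid B) = P(A).
```

When $P(B) = 0$, the conditional formulation of independence on the right becomes unavailable, even though the product formulation $P(A \cap B) = P(A)\,P(B)$ is then trivially satisfied; this is the kind of ramification for probabilistic independence the abstract has in mind.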
According to Ian Hacking, some human kinds are subject to a peculiar type of classificatory instability: individuals change in reaction to being classified, which in turn leads to a revision of our understanding of the kind. Hacking’s claim that these ‘human interactive kinds’ cannot be natural kinds has been vehemently criticised on the grounds that similar patterns of instability occur in paradigmatic examples of natural kinds. I argue that the dialectic of the extant debate misses the core conceptual problem of human interactive kinds. The problem is not that these kinds are particularly unstable but that they are ‘capricious’: their members behave in wayward, unexpected ways that defeat existing theoretical understanding. The reason, I argue, is that human interactive kinds are often ‘hybrid kinds’ consisting of a base kind and an associated status, which makes the mechanisms that support patterns of change and stability systematically difficult to understand and predict.
The previous two chapters have sought to show that the probability calculus cannot serve as a universally applicable logic of inductive inference. We may well wonder whether there might be some other calculus of inductive inference that can be applied universally. It would, perhaps, arise through a weakening of the probability calculus. The principal source of difficulty addressed in those chapters was the additivity of the probability calculus. Such a weakening seems possible as far as additivity is concerned. Something like it is achieved with the Shafer-Dempster theory of belief functions. However, there is a second, lingering problem. Bayesian analyses require prior probabilities. As we shall see below, these prior probabilities are never benign. They always make a difference to the final result.
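The non-additive character of the Shafer-Dempster theory can be seen in a small worked example. The sketch below (an illustration of Dempster's rule of combination, not code from the text) assigns mass to *sets* of hypotheses, so that the beliefs committed to the individual hypotheses need not sum to one; the remainder is left uncommitted.

```python
# Dempster's rule of combination: combine two mass functions, each a dict
# mapping frozensets of hypotheses to mass. Mass on intersecting sets is
# multiplied and pooled; mass on disjoint sets is "conflict" and is
# renormalized away.
from itertools import product

def combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to contradictory evidence
    # Renormalize by the non-conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two (hypothetical) pieces of evidence about the hypotheses {rain, sun}.
# Note that m1 leaves 0.4 of its mass uncommitted on the whole frame.
m1 = {frozenset({'rain'}): 0.6, frozenset({'rain', 'sun'}): 0.4}
m2 = {frozenset({'sun'}): 0.3, frozenset({'rain', 'sun'}): 0.7}
m = combine(m1, m2)
```

In the result, the masses on {rain} and {sun} alone sum to less than one, with the rest remaining on the full frame {rain, sun}: exactly the weakening of additivity the text mentions.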
Erich Lehmann 20 November 1917 – 12 September 2009
Erich Lehmann was born 100 years ago today! (20 November 1917 – 12 September 2009). Lehmann was Neyman’s first student at Berkeley (Ph.D. 1942), and his framing of Neyman-Pearson (NP) methods has had an enormous influence on the way we typically view them. …
I argue that the function attributed to episodic memory by Mahr & Csibra (that is, grounding one’s claims to epistemic authority over past events) fails to support the essentially autonoetic character of such memories. I suggest, in contrast, that episodic event-memories are sometimes purely first-order, sometimes autonoetic, depending on relevance in the context.
This article develops an account of local epistemic practices on the basis of case studies from ethnobiology. I argue that current debates about objectivity often stand in the way of a more adequate understanding of local knowledge and ethnobiological practices in general. While local knowledge about the biological world often meets criteria for objectivity in philosophy of science, general debates about the objectivity of local knowledge can also obscure its unique epistemic features. In modification of Ian Hacking’s suggestion to discuss “ground level questions” instead of objectivity, I propose an account that focuses on both epistemic virtues and vices of local epistemic practices.
In Tom Stoppard’s Rosencrantz and Guildenstern Are Dead, the title characters are betting on coin throws. Rosencrantz has a standing bet on heads, and he keeps winning, pocketing coin after coin. We soon learn that this has been going on for some time, and that no fewer than 76 consecutive heads have been thrown, and counting — a situation which is making Guildenstern increasingly uneasy. The coins don’t appear to be double-headed or weighted or anything like that — just ordinary coins — leading Guildenstern to consider several unsettling explanations: that he is subconsciously willing the coins to land heads in order to cleanse himself of some repressed sin, that they are both trapped reliving the same moment in time over and over again, that the coins are being controlled by some menacing supernatural force. He then proposes a fourth hypothesis, which suggests a change of heart: that nothing surprising is happening at all and no special explanation is needed. He says, “… each individual coin spun individually is as likely to come down heads as tails and therefore should cause no surprise each individual time it does.” In the end, 92 heads have been thrown without a single tail when the characters are interrupted.
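Guildenstern's fourth hypothesis rests on a piece of arithmetic worth making explicit: each fair, independent toss lands heads with probability 1/2, so a run of n heads has probability (1/2)^n. A quick check of the two runs in the play:

```python
# Probability of n consecutive heads from fair, independent coin tosses.
for n in (76, 92):
    p = 0.5 ** n
    print(f"P({n} consecutive heads) = {p:.3e}")
# The runs in the play come out around 1.3e-23 (n=76) and 2.0e-28 (n=92):
# astronomically unlikely, even though each individual toss, as Guildenstern
# says, "should cause no surprise".
```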
The Problem of Old Evidence is a perennial issue for Bayesian confirmation theory. Garber (1983) famously argues that the problem can be solved by conditionalizing on the proposition that a hypothesis deductively implies the existence of the old evidence. In recent work, Hartmann and Fitelson (2015) and Sprenger (2015) aim for similar, but more general, solutions to the Problem of Old Evidence. These solutions are more general because they allow the explanatory relationship between a new hypothesis and old evidence to be inductive, rather than deductive. In this paper, I argue that these solutions are either unsound or under-motivated, depending on the case of inductive explanation that we have in mind. This lends support to the broader claim that Garber-style Bayesian confirmation cannot capture the sense in which new hypotheses that do not deductively imply old evidence nevertheless seem to be confirmed via old evidence.
Hawking’s area theorem is a fundamental result in black hole theory that is universally associated with the null energy condition. That this condition can be weakened is illustrated by the formulation of a strengthened version of the theorem based on an energy condition that allows for violations of the null energy condition. This result tightens the conventional wisdom that quantum field theoretic violations of the null energy condition account for why the conclusion of the area theorem can be bypassed in the semi-classical context. Shown here is that violations of the null energy condition, though necessary, are not sufficient to violate the conclusion of the area theorem. As an added benefit, the specific form of the energy condition used here suggests that the area non-decrease behavior described by the area theorem is a quasi-local effect that depends, in large measure, on the energetic character of the relevant fields in the vicinity of the event horizon.
This paper considers states on the Weyl algebra of the canonical commutation relations over the phase space R2n. We show that a state is regular iff its classical limit is a countably additive Borel probability measure on R2n. It follows that one can “reduce” the state space of the Weyl algebra by altering the collection of quantum mechanical observables so that all states are ones whose classical limit is physical.
One part of the problem of anomaly is this. If a well-established scientific theory seems to predict something contrary to what we observe, we tend to stick to the theory, with barely a change in credence, while being dubious of the auxiliary hypotheses. …
Review of “Inadequate Equilibria,” by Eliezer Yudkowsky
Inadequate Equilibria: Where and How Civilizations Get Stuck is a little gem of a book: wise, funny, and best of all useful (and just made available for free on the web). …
The Karp-Lipton Advice Column
Today, Shtetl-Optimized is extremely lucky to have the special guest blogger poly: the ‘adviser’ in the computational complexity class P/poly (P with polynomial-sized advice string), defined by Richard Karp and Richard Lipton in 1982. …
Paul Humphreys, Emergence (OUP, 2016)
Yesterday we saw, via an example from social psychology, that diachronic approaches to emergence can avoid some of the major problems of synchronic approaches. That motivating example is not wholly convincing as an example of transformational emergence. …
Network analysis needs tools to infer distributions over graphs of arbitrary size from a single graph. Assuming the distribution is generated by a continuous latent space model which obeys certain natural symmetry and smoothness properties, we establish three levels of consistency for non-parametric maximum likelihood inference as the number of nodes grows: (i) the estimated locations of all nodes converge in probability on their true locations; (ii) the distribution over locations in the latent space converges on the true distribution; and (iii) the distribution over graphs of arbitrary size converges.
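The generative setup described above can be sketched concretely. The model below is illustrative (not the authors' own specification): node locations are drawn i.i.d. from a distribution on the latent space, and each edge appears independently with probability given by a smooth, symmetric link function of the two endpoints' locations.

```python
# A toy continuous latent-space graph model on [0, 1]: nearby nodes are
# more likely to be connected. The link function here is an assumed,
# illustrative choice, not one from the paper.
import math
import random

def sample_graph(n, link=lambda u, v: math.exp(-3.0 * abs(u - v)), seed=0):
    rng = random.Random(seed)
    locs = [rng.random() for _ in range(n)]        # latent node locations
    adj = [[0] * n for _ in range(n)]              # adjacency matrix
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < link(locs[i], locs[j]):
                adj[i][j] = adj[j][i] = 1          # undirected edge
    return locs, adj

locs, adj = sample_graph(20)
```

The consistency results in the abstract concern the inverse problem: recovering the locations, the location distribution, and hence the distribution over graphs of arbitrary size, from a single observed adjacency matrix as n grows.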
Molecular biologists exploit information conveyed by mechanistic models for experimental purposes. In this contribution, I make sense of this aspect of biological practice by developing Keller’s idea of the distinction between ‘models of’ and ‘models for’. ‘Models of (phenomena)’ should be understood as models representing phenomena and they are valuable if they explain phenomena. ‘Models for (manipulating phenomena)’ suggest new types of material manipulations and they are important not because of their explanatory force, but because of the interventionist strategies they afford. This is a distinction between aspects of the same model; in molecular biology, models may be treated either as ‘models of’ or as ‘models for’. By analyzing the discovery and characterization of restriction-modification systems and their exploitation for DNA cloning and mapping, I identify the differences between treating a model as a ‘model of’ or as a ‘model for’. These lie in a cognitive disposition of the modeler towards the model. A modeler will look at a model as a ‘model of’ if he/she is interested in its explanatory force, or as a ‘model for’ if the interest is in the material manipulations it can possibly afford.
The problem of the direction of the electromagnetic arrow of time is perhaps the most perplexing of the major unsolved problems of contemporary physics, because the usual tools of theoretical physics cannot be used to investigate it. Even the clues provided by the CP violation of the K₂ meson, which have led to a profound insight into the dominance of matter over antimatter in the universe, have not shed any light on the problem of the origin of the electromagnetic arrow of time.
There are various equivalent formulations of the Church-Turing thesis. A common one is that every effective computation can be carried out by a Turing machine. The Church-Turing thesis is often misunderstood, particularly in recent writing in the philosophy of mind.
Anthropic reasoning based on the apparent fine-tuning of physical parameters – their possession of values in an apparently tiny range that allows life – has been reinvigorated with the realization that string theory, far from determining the parameter values at issue, has models of great diversity. From this fact, many are convinced that the fine-tuning evidence is best explained by the observed universe’s selection as life-permitting (from many real sub-universes, most of which do not allow the creation of life). Others see the universe as purposeful and perhaps designed. However, all fine-tuning arguments presuppose a governing conception of laws of nature. This paper argues that a David Lewis-style best-system account of scientific law disarms the anthropic argument. Indeed, in light of the fact that even rejected scientific theories are in many cases fine-tuned, anthropic reasoning may point toward a deflationary metaphysics rather than the extravagant designer-or-multiverse alternative.
After a brief presentation of Feynman diagrams, we criticize the idea that Feynman diagrams can be considered to be pictures or depictions of actual physical processes. We then show that the best interpretation of the role they play in quantum field theory and quantum electrodynamics is captured by Hughes' Denotation, Demonstration and Interpretation (DDI) account of models, where “models” are to be interpreted as inferential, non-representational devices constructed in given social contexts by the community of physicists.
The project of naturalistic metaphysics appears straightforward. Start with one’s best scientific theories and infer one’s metaphysical commitments from what these theories say exist, the sort of ideological frameworks they employ. Yet, as many have noted, naturalism poses challenges for metaphysics as it is typically practiced. In particular, once scientific theories themselves offer verdicts about the sort of things that exist, the properties they have, and the spatiotemporal structures they occupy, what more is there for metaphysicians to contribute than simply repeating what is already known? Even if the work is straightforward, in becoming naturalistic, metaphysics seems to promote its own obsolescence. The goal of this paper is to evaluate one influential response to this concern, one that has been appealing to many contemporary metaphysicians who are naturalists. This is to argue that although it might appear that metaphysics and science are aimed at a common set of questions about the sorts of entities the world contains and what they are like, this appearance is misleading. Metaphysicians rather address a distinctive subject matter, a subject matter more fundamental than that of science.
[Editor's Note: The following new entry by David Vander Laan replaces the former entry on this topic by the previous authors.] In the philosophy of religion, creation is the action by which God brings an object into existence, while conservation is the action by which God maintains the existence of an object over time. The major monotheisms unambiguously affirm that God both created the world and conserves it. It is less clear, however, whether creation and conservation are to be conceived as distinct kinds of actions. The question has its roots in medieval and early modern characterizations of divine action, and it has received renewed attention in recent decades.
Joseph Halpern and Judea Pearl draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation.
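The structural-equation machinery behind such analyses is easy to illustrate. The sketch below is a minimal, hypothetical forest-fire example (a standard toy case, not Halpern and Pearl's own code): endogenous variables are determined by equations over their parents, and an intervention do(X = x) replaces X's equation with the constant x.

```python
# A minimal acyclic structural-equation model. `equations` maps each
# endogenous variable to a function of the current variable assignment;
# `do` overrides a variable's equation with a fixed value (an intervention).
def evaluate(equations, exogenous, do=None):
    do = do or {}
    values = {**exogenous, **do}             # interventions override context
    for _ in range(len(equations)):          # enough passes for an acyclic model
        for var, eq in equations.items():
            if var not in do:
                values[var] = eq(values)
    return values

# Lightning (L) or a dropped match (M) each suffice for a forest fire (FF).
equations = {'FF': lambda v: v['L'] or v['M']}
ctx = {'L': 1, 'M': 1}

actual = evaluate(equations, ctx)                        # fire occurs
counterfactual = evaluate(equations, ctx, do={'L': 0})   # still occurs: M suffices
```

Actual-cause analyses in this style work by checking how such counterfactual evaluations behave under interventions on candidate causes, possibly while holding other variables fixed.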
As Feynman (1982) observed, “we always have had a great deal of difficulty in understanding the world view that quantum mechanics represents” (471). Among the perplexing aspects of quantum mechanics is its seeming, on a wide variety of presently live realist interpretations (including but not limited to the so-called ‘orthodox’ interpretation), to violate the classical supposition of ‘value definiteness’, according to which the properties—a.k.a. ‘observables’—of a given particle or system have precise values at all times. Indeed, value indefiniteness lies at the heart of what is supposed to be distinctive about quantum phenomena, as per the following classic cases:
In this paper we compare two different notions of ‘power’, both of which attempt to provide a realist understanding of quantum mechanics grounded on the potential mode of existence. For this purpose we will begin by introducing two different notions of potentiality present already within Aristotelian metaphysics, namely, irrational potentiality and rational potentiality. After discussing the role played by potentiality within classical and quantum mechanics, we will address the notion of causal power, which is directly related to irrational potentiality and has been adopted by many interpretations of QM. We will then present the notion of immanent power, which relates to rational potentiality, and argue that this new concept presents important advantages regarding the possibilities it provides for understanding the theory of quanta in a novel manner. We end our paper with a comparison between both notions of ‘power’, stressing some radical differences between them.
This paper introduces and examines the prospects of recent research on a holographic relation between entanglement and spacetime pioneered by Mark van Raamsdonk and collaborators. Their thesis is that entanglement in a holographic quantum state is crucial for connectivity in its spacetime dual. Utilizing this relation, the paper develops a thought experiment that promises to probe the nature of spacetime by monitoring the behavior of a spacetime when all entanglement is removed between local degrees of freedom in its dual quantum state. The thought experiment suggests a picture of spacetime as consisting of robust nodes that are connected by non-robust bulk spacetime that is sensitive to changes in entanglement in the dual quantum state. However, rather than pursuing the thought experiment in further detail, the credibility of the relation between spacetime and entanglement in this zero entanglement limit is questioned. The energy of a quantum system generally increases when all entanglement is removed between subsystems, and so does the energy of its spacetime dual. If a system is subdivided into an infinite number of subsystems and all entanglement between them is removed, then the energy of the quantum system and the energy of its spacetime dual are at risk of diverging. While this is a prima facie worry for the thought experiment, it does not constitute a conclusive refutation.