-
The AdS/CFT correspondence posits a holographic equivalence between a gravitational theory in Anti-de Sitter (AdS) spacetime and a conformal field theory (CFT) on its boundary, linked by gauge-invariant quantities like field strengths Fµν and fluxes Φ. This paper examines that link, drawing on my prior analysis of the Aharonov-Bohm (AB) effect, where such quantities exhibit nonlocality, discontinuity, and incompleteness. I demonstrate that gauge potentials Aµ in the Lorenz gauge—not their invariant derivatives—mediate the AB effect’s local, continuous dynamics, a reality extending to gravitational fields gµν as substantival entities. In AdS/CFT, the CFT’s reduction of bulk Aµ and gµν to gauge-invariant imprints fails to reflect this ontology, a flaw so fundamental that it excludes exact gauge/gravity duality—neither standard mappings nor reformulations suffice. A new mathematical proof formalizes this: the bulk’s diffeomorphism freedom cannot correspond to the boundary’s gauge freedoms, Abelian or non-Abelian, under this reality. This critique spans the gauge/gravity paradigm broadly, from AdS/CFT to holographic QCD, where symmetry invisibility obscures bulk physics. While duality’s successes in black hole thermodynamics and strongly coupled systems highlight its utility, I suggest these reflect approximations within specific regimes, not a full equivalence. I propose a shift toward a framework prioritizing the roles of Aµ and gµν, with gravitational AB effects in AdS as a testing ground. This work seeks to enrich holography’s dialogue, advancing a potential-centric view for quantum gravity.
-
The article sets out to clarify a number of confusions that exist in connection with the Born-Oppenheimer approximation (BOA). It is generally claimed that chemistry cannot be reduced to quantum mechanics because of the nature of this commonly used approximation in quantum chemistry, which is popularly believed to require a ‘clamping’ of the nuclei. It is also claimed that the notion of molecular structure, which is so central to chemistry, cannot be recovered from the quantum mechanical description of molecules and that it must be imposed by hand through the BOA. Such an alleged failure of reduction is then taken to open the door to concepts such as emergence and downward causation.
-
The quantum measurement problem is one of the most profound challenges in modern physics, questioning how and why the wavefunction collapses during measurement to produce a single observable outcome. In this paper, we propose a novel solution through a logical framework called Aethic reasoning, which reinterprets the ontology of time and information in quantum mechanics.
-
Inconsistencies! What do they mean? Can we support them? With this paper, we hope to contribute to the claim that we can tolerate inconsistencies in certain situations even without considering any logic that may enable us to do that, say some paraconsistent logic. We argue that in many cases where we apply reason, we work in domains where inconsistencies appear, and even so, we neither get them out (but ‘support’ them) nor modify the underlying logic (such as classical logic) to avoid logical troubles. To make things more precise, we distinguish between inconsistency, anomaly, and contradiction. Our thesis is that we can reason sensibly with classical logic even in the presence of inconsistencies once (as we explain) we either ‘do not go there’ or make things so that the inconsistent sentences cannot be joined to arrive at a contradiction. Some sample cases are given to motivate the discussion.
-
This paper explores the theme of human limitedness and the virtues in David McPherson’s The Virtues of Limits. I survey some of the main themes of his discussion—including kinds of human limits and the idea of “limiting-virtues”—and indicate salient themes in Buddhist and classical Chinese philosophical traditions. I then suggest that McPherson is too quick to dismiss forms of moral quietism and that his discussion of our limitedness rests on a latent pessimism worthy of further articulation.
-
Empiricists following Poincaré have argued that spacetime geometry can be freely chosen by convention, while adjusting unobservable structure so as to maintain empirical adequacy. In this article, I first strengthen a no-go result of Weatherall and Manchak against the conventionality of geometry, and then argue that any remaining conventionality arises from scientific incompleteness. To illustrate, I discuss a new kind of conventionality that is available in the presence of higher spatial dimensions, and illustrate how the incompleteness in such models can be resolved by introducing new physical theories like Kaluza-Klein theory. Conventional choices of this kind may provide a fruitful starting point in the search for new science, but if successful would eliminate the conventionalist alternatives.
-
Mexican existentialism grows out of the encounter, engagement, and
appropriation with French and German existentialist philosophies in
Mexico mid-way through the twentieth-century. Key players in this
tradition were José Gaos (1900–1969), Antonio Caso
(1883–1946), and, especially, el grupo Hiperión
(the Hyperion Group). Members of Hyperion, but particularly Emilio
Uranga (1921–1988), Leopoldo Zea (1912–2004), Jorge
Portilla (1918–1963), and Luis Villoro (1922–2014),
focused their efforts on existential reinterpretations of that
which is Mexican (“lo mexicano” or Mexicanness), a
focus that lends this tradition its historical and conceptual
uniqueness and importance.
-
A nested interferometer experiment by Danan et al. (2013) is discussed and some ontological implications explored, primarily in the context of time-symmetric interpretations of quantum theory. It is pointed out that photons are supported by all components of their wavefunctions, not selectively truncated "first order" portions of them, and that figures representing both a gap in the photon's path and signals from the cut-off path are incorrect. It is also noted that the Transactional Formulation (traditionally known as the Transactional Interpretation) readily accounts for the observed phenomena.
-
I argue that we need to distinguish between three concepts of actual causation: total, path-changing, and contributing actual causation. I provide two lines of argument in support of this account. First, I address three thought experiments that have been troublesome for unified accounts of actual causation, and I show that my account provides a better explanation of corresponding causal intuitions. Second, I provide a functional argument: if we assume that a key purpose of causal concepts is to guide agency, we are better off making a distinction between three concepts of actual causation.
-
There are two main strands of arguments regarding the value-free ideal (VFI): desirability and achievability (Reiss and Sprenger 2020). In this essay, I will argue for what I will call a compatibilist account of upholding the VFI, focusing on its desirability even if the VFI is unachievable. First, I will explain what the VFI is. Second, I will show that striving to uphold the VFI (desirability) is compatible with the rejection of its achievability. Third, I will demonstrate that the main arguments against the VFI do not refute its desirability. Finally, I will provide arguments on why it is desirable to strive to uphold the VFI even if the VFI is unachievable, and show what role it can play in scientific inquiry. There is no single definition of the VFI, yet the most common way to interpret it is that non-epistemic values ought not to influence scientific reasoning (Brown 2024, 2). Non-epistemic values are understood as certain ethical, social, cultural or political considerations. Therefore, it is the role of epistemic values, such as accuracy, consistency, empirical adequacy and simplicity, to be part of and to ensure proper scientific reasoning.
-
There is an overwhelming abundance of works in AI Ethics. This growth is chaotic because of its suddenness, its volume, and its multidisciplinary nature. This makes it difficult to keep track of debates and to systematically characterize the goals, research questions, methods, and expertise required of AI ethicists. In this article, I show that the relation between ‘AI’ and ‘ethics’ can be characterized in at least three ways, which correspond to three well-represented kinds of AI ethics: ethics and AI; ethics in AI; ethics of AI. I elucidate the features of these three kinds of AI ethics, characterize their research questions, and identify the kind of expertise that each kind needs. I also show how certain criticisms of AI ethics are misplaced, being made from the point of view of one kind of AI ethics against another kind with different goals. All in all, this work sheds light on the nature of AI ethics and sets the grounds for more informed discussions about the scope, methods, and training of AI ethicists.
-
Critical theory arose as a response to perceived inadequacies in
Marxist theory, and perceived changes in modern capitalism. Critical
theorists emphasized the ability of capitalism to shape the thought
and experience of individuals: it distorts how modern society and its
products appear to us, and how we think about them. So, aesthetic
experience – like all other experience – is moulded to and
compromised by capitalism. For critical theory, if we seek to
understand aesthetics we need to acknowledge this distorting
effect. Critical theorists ask us to pay attention to how art, and aesthetic
experience, suffer under capitalism, and become part of the way in
which capitalism prevents the formation of a better life.
-
Dehumanization is widely thought to occur when someone is treated or
regarded as less than human. However, there is an ongoing debate about
how to develop this basic characterization. Proponents of the
harms-based approach focus on the idea that to dehumanize someone
is to treat them in a way that harms their humanity; whereas
proponents of the psychological approach focus on the idea
that to dehumanize someone is to think of them as less than
human. Other theorists adopt a pluralistic view that combines elements
of both approaches. In addition to explaining different views on what it means to
dehumanize someone, this article focuses on related issues, such as
how to resolve the so-called “paradox of dehumanization”;
the causes and consequences of dehumanization; the sorts of contexts
in which dehumanization typically occurs; and the relation between
dehumanization and objectification.
-
Robert W. Batterman’s A Middle Way: A Non-Fundamental Approach to Many-Body Physics is an extraordinarily insightful book, far-reaching in its scope and significance, interdisciplinary in character due to connections made between physics, materials science and engineering, and biology, and groundbreaking in the sense that it reflects on important scientific domains that are mostly absent from current literature. The book presents a hydrodynamic methodology, which Batterman explains is pervasive in science, for studying many-body systems as diverse as gases, fluids, and composite materials like wood, steel, and bone. Following Batterman, I will call said methodology the middle-out strategy. Batterman’s main thesis is that the middle-out strategy is superior to alternatives, solves an important autonomy problem, and, consequently, implies that certain mesoscale structures (explained below) ought to be considered natural kinds. In what follows, I unpack and flesh out these claims, starting with a discussion of the levels of reality and their representation. Afterward, I briefly outline the contents of the book’s chapters and then identify issues that seem to me to merit further clarification.
-
Political disagreement tends to display a “radical” nature that is partly related to the fact that political beliefs and judgments are generally firmly held. This makes people unlikely to revise and compromise on them. …
-
QBism explicitly takes the subjective view: probabilities of events are defined solely by past experiences, i.e. the record of observations. As shown by the authors (Fuchs et al., 2013), this: “... removes the paradoxes, conundra, and pseudo-problems that have plagued quantum foundations for the past nine decades”. It is criticised for its lack of ontology and anthropocentric nature. However, if Everett's (1957) formulation is taken at face value, exactly the features of QBism are the result, and the ontology is inherent. The anthropocentric nature of the solution is simply an indication that the quantum state is relative, as is central to Everett. Problems of measurement and locality do not arise.
-
In Part 1 the properties of QBism are shown to be natural consequences of taking quantum mechanics at face value, as does Everett in his Relative State Formulation (1957). In Part 2 supporting evidence is presented. Parmenides' (Palmer, 2012) notion that the physical world is static and unchanging is vividly confirmed in the new physics. This means the time evolution of the physical world perceived by observers only occurs at the level of appearances as noted by Davies (2002). In order to generate this appearance of time evolution, a moving frame of reference is required: this is the only possible explanation of the enactment of the dynamics of physics in a static universe.
-
Despite the simplicity of Weyl's solution to the paradox of the passage of time in the static block universe, virtually no interest is shown in this approach, although, as shown in Part 2, the problem of the Now could be taken as evidence that his solution is correct. A moving frame of reference is required to explain the experience of the enactment of any of the dynamics of physics, and the experiencing consciousness supervenes on this phenomenon. Given that the logic involved is straightforward, it seems that the reasons all this has been ignored may be less so. Here it is suggested, based on Davies' (2006) research, that this might well involve a horror of even the possibility of deity and mysticism being dignified by discussion, let alone endorsement. The objective here is to demonstrate that this approach does validate certain archetypal myths of the great spiritual traditions, while at the same time fully supporting and reinforcing the objective basis of the science of physics. The myths are exploded to reveal simply scientific principles, and a complete absence of gods or mystical phenomena; indeed, such things are categorically ruled out. The scientific principles illustrated by the third logical type, which have languished unexamined, turn out to be powerful knowledge which serves only to reinforce and emphasise how deeply flawed were the key principles of the religious preoccupations which our culture had to relinquish in order to move forward.
-
The localization problem in relativistic quantum theory has persisted for more than seven decades, yet it is largely unknown and continues to perplex even those well-versed in the subject. At the heart of this problem lies a fundamental conflict between localizability and relativistic causality, which can also be construed as part of the broader dichotomy between measurement and unitary dynamics. This article provides a historical review of the localization problem in one-particle relativistic quantum mechanics, clarifying some persistent misconceptions in the literature, and underscoring the antinomy between causal dynamics and localized observables.
-
While emergentism enjoys some good fortune in contemporary philosophy, attempts at elucidating the history of this view are rare. Among such attempts, by far the most influential is certainly McLaughlin’s landmark paper “The Rise and Fall of British Emergentism” (1992). While McLaughlin’s analysis of the recent history of emergentism is insightful and instructive in its own ways, in the present paper we offer reasons to be suspicious of some of its central claims. In particular, we advance evidence that rebuts McLaughlin’s contention that British Emergentism did not fall in the 1920–1930s because of philosophical criticism but rather because of an alleged empirical inconsistency with fledgling quantum mechanics.
-
Comparative philosophy of religion is a subfield of both philosophy of
religion and comparative philosophy. Philosophy of religion engages
with philosophical questions related to religious belief and practice,
including questions concerning the concept of religion itself. Comparative philosophy compares concepts, theories, and arguments from
diverse philosophical traditions. The term “comparative
philosophy of religion” can refer to the comparative
philosophical study of different religions or of different
philosophies of religion. It can thus be either a first-order
philosophical discipline—investigating matters to do with
religion—or a second-order philosophical discipline,
investigating matters to do with philosophical inquiry into religion.
-
High speed store required: 947 words.
No. of bits in a word: 64.
Is the program overlaid? No.
No. of magnetic tapes required: None.
What other peripherals are used? Card Reader; Line Printer.
No. of cards in combined program and test deck: 112.
Card punching code: EBCDIC.
Keywords: Atomic, Molecular, Nuclear, Rotation Matrix, Rotation Group, Representation, Euler Angle, Symmetry, Helicity, Correlation.
-
A neglected but challenging argument developed by Peter Geach, John Haldane, and Stephen Rothman purports to show that reproduction cannot be explained by natural selection and is irreducibly teleological. Meanwhile, the most plausible definitions of life include reproduction as a constitutive feature. The implication of combining these ideas is that life cannot be explained by natural selection and is irreducibly teleological. This does not entail that life cannot be explained in evolutionary terms of some kind, but it does lend support to the controversial view of Jerry Fodor and Thomas Nagel that evolutionists need to look beyond the constraints of Neo-Darwinism.
-
Where does the Born Rule come from? We ask: what is the simplest extension of probability theory in which the Born rule appears? This is answered by introducing “superposition events” in addition to the usual discrete events. Two-dimensional matrices (e.g., incidence matrices and density matrices) are needed to represent mathematically the differences between the two types of events. It is then shown that the incidence and density matrices for superposition events are the (outer) products of a vector and its transpose, whose components foreshadow the “amplitudes” of quantum mechanics. The squares of the components of those “amplitude” vectors yield the probabilities of the outcomes. That is how probability amplitudes and the Born Rule arise in the minimal extension of probability theory to include superposition events. This extends naturally to the full Born Rule in the Hilbert spaces over the complex numbers of quantum mechanics. It would perhaps be satisfying if probability amplitudes and the Born Rule arose only as the result of deep results in quantum mechanics (e.g., Gleason’s Theorem). But both arise in a simple extension of probability theory to include “superposition events”, which should not be too surprising, since superposition is the key non-classical concept in quantum mechanics.
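The outer-product construction described in this abstract can be checked numerically. The sketch below is illustrative only (the two-outcome amplitude vector is a hypothetical example, not data from the paper): it forms the density matrix as the outer product of an amplitude vector with its transpose and reads the outcome probabilities off the diagonal, i.e. the squares of the amplitudes.

```python
import math

# Hypothetical "superposition event" over two outcomes: an amplitude
# vector whose squared components sum to 1 (an equal superposition).
amplitudes = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# Density matrix for the superposition event: the (outer) product of
# the amplitude vector and its transpose.
density = [[a * b for b in amplitudes] for a in amplitudes]

# Born rule in this minimal setting: outcome probabilities are the
# diagonal entries of the density matrix, i.e. the squared amplitudes.
probabilities = [density[i][i] for i in range(len(amplitudes))]

print(probabilities)  # each outcome has probability 1/2
```

For an unequal superposition, e.g. amplitudes (√0.8, √0.2), the same diagonal read-off yields probabilities 0.8 and 0.2, illustrating that the squares of the components always recover a normalized probability distribution.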
-
Edith Landmann-Kalischer (1877–1951) is the author of several
significant studies on topics in the philosophy of art, aesthetics,
value, mind, and knowledge in the first half of the twentieth century. Influenced by Franz Brentano, Georg Simmel, Carl Stumpf, and Stefan
George, her studies were initiated at a time when the academic, often
tendentious borders between psychology and philosophy, like those
between aesthetics and art history, were still being drawn. While
clearly also influenced by Edmund Husserl, she takes his phenomenology
to task for its idealism and, in her view, its unfounded isolation
from the sciences, especially psychology.
-
I've been exploring in this newsletter recently how people's growing inability to understand and control the institutions that shape their lives affects their political views (see here or here for instance). …
-
Brian Leftow’s 2022 book, Anselm’s Argument: Divine Necessity is an impressively thorough discussion of Anselmian modal metaphysics, centred around what he takes to be Anselm’s strongest “argument from perfection” (Leftow’s preferred term for an Ontological Argument). This is not the famous argument from Proslogion 2, nor even the modal argument that some have claimed to find in Proslogion 3, but rather, an argument from Anselm’s Reply to Gaunilo, expressed in the following quotation: “If … something than which no greater can be thought … existed, neither actually nor in the mind could it not exist. Otherwise it would not be something than which no greater can be thought. But whatever can be thought to exist and does not exist, if it existed, would be able actually or in the mind not to exist. For this reason, if it can be thought, it cannot not exist.” (p. 66) Before turning to this argument, Leftow offers an extended and closely-argued case for understanding Anselm’s modality in terms of absolute necessity and possibility, with a metaphysical foundation on powers as argued for at length (575 pages) in his 2012 book God and Necessity. After presenting this interpretation in Chapter 1, Leftow’s second chapter discusses various theological applications (such as the fixity of the past, God’s veracity, and immortality), addressing them in a way that both expounds and defends what he takes to be Anselm’s approach. Then in Chapter 3 Leftow addresses certain problems, for both his philosophical and interpretative claims, while Chapter 4 spells out the key Anselmian argument, together with Leftow’s suggested improvements. Chapter 5 explains how the argument depends on Brouwer’s system of modal logic, and defends this while also endorsing the more standard and comprehensive system S5.
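For readers unfamiliar with the modal systems named here, the following standard characterization (a textbook fact of modal logic, not a quotation from Leftow) may help fix ideas:

```latex
% Brouwer's system B adds to the basic system T the axiom
%   (B): whatever is true is necessarily possible,
\[ p \rightarrow \Box \Diamond p \]
% while the stronger system S5, which Leftow also endorses, validates
\[ \Diamond p \rightarrow \Box \Diamond p \]
% Semantically, (B) corresponds to a symmetric accessibility relation;
% S5 corresponds to an equivalence relation.
```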
-
In a recent TLS, I wrote about the spoils of pessimism—whether we should be quietists, retreating from the world, or activists who fight for it—but my real subject was despair. I did not get to write about the best book on despair I’ve read: Christian Wiman’s prose-poetic Zero at the Bone. …
-
Let us say that a being is omnisubjective if it has a perfect first-person grasp of all subjective states (including belief states). The question of whether God is omnisubjective raises a nest of thorny issues in the philosophy of language, philosophy of mind, and metaphysics, at least if there are irreducibly subjective states. There are notorious difficulties analyzing the core traditional divine attributes—omniscience, omnipotence, and omnibenevolence—but those difficulties are notorious partly because we seem to have a decent pre-theoretic grasp of what it means for something to be all knowing, powerful, and good, and so it is surprising, frustrating, and perplexing that it is so difficult to provide a satisfactory analysis of those notions.
-
The theoretical developments that led to supersymmetry – first global and then local – over a period of about six years (1970/71-1976) emerged from a confluence of physical insights and mathematical methods drawn from diverse, and sometimes independent, research directions. Despite these varied origins, a common thread united them all: the pursuit of a unity in physics, grounded in the central role of symmetry, where “symmetry” is understood in terms of group theory, representation theory, algebra, and differential geometry.