-
Chronogeometry is often conceived as a necessary condition for spatiotemporality, yet many theories of quantum gravity (QG) seem to challenge it. Applications of noncommutative geometry (NCG) to QG propose that spacetime exhibits noncommutative features at or beyond the Planck scale, thereby replacing relativistic symmetries with their deformations, known as quantum groups. This leads to an algebraic formulation of noncommutative structure that postulates a minimal length scale and deforms relativistic (commutative) physics, raising questions about whether noncommutative theories preserve spatiotemporal content, and specifically, chronogeometry. I argue that noncommutative approaches can satisfy an appropriate definition of chronogeometry, thus attaining physical significance within QG. In particular, I contend that noncommutativity is compatible with chronogeometricity, using κ-Minkowski spacetime as a case study in NCG. In this algebraic setting, physical interpretation hinges on two crucial elements: a representation of the noncommutative algebra and a corresponding set of observers. I show how this framework enables the algebra to encode localisation procedures for events in noncommutative spacetime, relative to a noncommutative reference frame, with frame transformations governed by the quantum group structure. By enriching the theory with noncommutative reference frames, NCG can satisfy the necessary representational principles to support chronogeometric content.
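For readers unfamiliar with the model: the defining relations of κ-Minkowski spacetime are standard in the NCG literature (added here for orientation, not taken from the abstract). Time fails to commute with the spatial coordinates, with the deformation controlled by the scale κ, usually identified with the Planck scale:

\[
[\hat{x}^0, \hat{x}^j] = \frac{i}{\kappa}\,\hat{x}^j, \qquad [\hat{x}^j, \hat{x}^k] = 0, \qquad j, k = 1, 2, 3.
\]

Ordinary (commutative) Minkowski spacetime is recovered in the limit κ → ∞.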
-
Many of the theories found in contemporary high-energy physics are
gauge theories. The theory of the electromagnetic force is a gauge
theory, as are the theories of the weak and strong nuclear forces. Philosophers disagree about which other theories are gauge theories,
but they generally agree that gauge theories present distinctive
puzzles concerning mathematical representation. Philosophical
discussion of gauge theories has focused on these puzzles alongside
the metaphysical and epistemological consequences of the fact that
gauge theories feature centrally in theories of the fundamental
physical forces.
-
The concept of preference spans numerous research fields, resulting in
diverse perspectives on the topic. Preference logic specifically
focuses on reasoning about preferences when comparing objects,
situations, actions, and more, by examining their formal properties. This entry surveys major developments in preference logic to date. Section 2
provides a historical overview, beginning with foundational work by
Halldén and von Wright, who emphasized the syntactic aspects of
preference. In
Section 3,
early semantic contributions by Rescher and Van Dalen are introduced. The consideration of preference relations over possible worlds
naturally gives rise to modal preference logic where preference
lifting enables comparisons across sets of possible worlds.
-
I’ve explained a cool way to treat bound states of the hydrogen atom as wavefunctions on a sphere in 4-dimensional space. But so far I’ve been neglecting the electron’s spin. Now let’s throw that in too! …
-
It is uncontroversial that humanistic thought and scientific inquiry have been entangled throughout a very long arc of intellectual history. Beyond this, however, significant challenges await anyone hoping to understand, let alone articulate, the nature of these entanglements. Since ‘science’ and ‘humanism’ are labels that are commonly applied to traditions of theorizing and practice that predate the 18th- and 19th-century introduction and use of these terms in their modern senses, respectively, and since both of these traditions have evolved and speciated a great deal from antiquity to the present, any attempt to untangle the many complex relationships between them amounts to a formidable task.
-
We ask how and why mathematical physics may be seen as a rigorous discipline. Starting with Newton but drawing on a philosophical tradition ranging from Aristotle to (late) Wittgenstein, we argue that, as in mathematics, rigour ultimately comes from rules. These include logical rules of inference as well as definitions that give a precise meaning to physical concepts such as space and time by providing rules governing their use in models of the theories in which they are defined. In particular, so-called implicit definitions characterize “indefinables” whose traditionally assumed familiarity through “intuition” or “acquaintance”, from Aristotle down to Russell, blasts any hope of both rigour and innovation. Given the basic physical concepts, one may subsequently define derived concepts (like black holes or determinism). Definitions are seen as a priori meaning-constitutive conventions that are neither necessary à la Kant nor arbitrary à la Carnap, as they originate in empirical science as well as in the autonomous development of mathematics and physics. As such, definitions are best seen as hypothetical.
-
According to the stochastic-quantum correspondence, a quantum system can be understood as a stochastic process unfolding in an old-fashioned configuration space based on ordinary notions of probability and ‘indivisible’ stochastic laws, which are a non-Markovian generalization of the laws that describe a textbook stochastic process. The Hilbert spaces of quantum theory and their ingredients, including wave functions, can then be relegated to secondary roles as convenient mathematical appurtenances. In addition to providing an arguably more transparent way to understand and modify quantum theory, this indivisible-stochastic formulation may lead to new possible applications of the theory. This paper initiates a deeper investigation into the conceptual foundations and structure of the stochastic-quantum correspondence, with a particular focus on novel forms of gauge invariance, dynamical symmetries, and Hilbert-space dilations.
-
Critics of ambivalence see it as something of inherent disvalue: a sign of poorly functioning agency. This chapter challenges that assumption, outlining the potential benefits of ambivalence for well-functioning agency by the criteria of rationality, agential effectiveness, autonomy, and authenticity. Furthermore, by exploring the interplay between philosophical debates on ambivalence and psychological research on suicide, the chapter shows how insights from each field can inform the other. For example, fostering ambivalence, rather than eliminating it, can sometimes support more effective suicide interventions, and ambivalence alone should not be treated as a marker of deficient agency, nor as a justification for paternalistic measures.
-
On the all-false view of the open future (AFOF), future contingent claims are all false. The standard way to define “Will p” is to say that p is true in all possible futures. But defining a possible future is difficult. …
-
The literature on values in science contains countless claims to the effect that a particular type of scientific choice is or is not value-laden. This chapter exposes an ambiguity in the notion of a value-laden choice. In the first half, I distinguish four ways a choice can be said to be value-laden. In the second half, I illustrate the usefulness of this taxonomy by assessing arguments about whether the value-ladenness of science is inevitable. I focus on the “randomizer reply,” which claims that, in principle, scientists could always avoid value-laden choices by flipping a coin.
-
Accuracy plays an important role in the deployment of machine learning algorithms. But accuracy is not the only epistemic property that matters. For instance, it is well known that algorithms may perform accurately during their training phase but experience a significant drop in performance when deployed in real-world conditions. To address this gap, researchers have turned to the concept of algorithmic robustness. Roughly, robustness refers to an algorithm’s ability to maintain its performance across a range of real-world and hypothetical conditions. In this paper, we develop a rigorous account of algorithmic robustness grounded in Robert Nozick’s counterfactual sensitivity and adherence conditions for knowledge. By bridging insights from epistemology and machine learning, we offer a novel conceptualization of robustness that captures key instances of algorithmic brittleness while advancing discussions on reliable AI deployment. We also show how a sensitivity-based account of robustness provides notable advantages over related approaches to algorithmic brittleness, including causal and safety-based ones.
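As a rough illustration of the property at issue (my own sketch under simplifying assumptions, not the paper's formalism; the model API and the perturbations are hypothetical), a minimal robustness check probes whether performance holds up across counterfactual deployment conditions rather than only on held-out training-distribution data:

    import numpy as np

    def accuracy(model, X, y):
        # Fraction of correct predictions.
        return float(np.mean(model.predict(X) == y))

    def robustness_check(model, X, y, perturbations, tolerance=0.05):
        # Adherence-flavoured check loosely in the spirit of Nozick's
        # conditions: across nearby counterfactual conditions, accuracy
        # should stay within `tolerance` of the baseline.
        baseline = accuracy(model, X, y)
        results = {name: accuracy(model, f(X), y) for name, f in perturbations.items()}
        robust = all(baseline - acc <= tolerance for acc in results.values())
        return baseline, results, robust

    # Example counterfactual conditions: additive sensor noise and a
    # constant calibration shift (both hypothetical).
    perturbations = {
        "gaussian_noise": lambda X: X + np.random.normal(0.0, 0.1, X.shape),
        "calibration_shift": lambda X: X + 0.2,
    }

On this sketch a model counts as brittle if any perturbed condition drags its accuracy more than `tolerance` below baseline; the sensitivity half of the analysis (performance would degrade were conditions relevantly abnormal) would need a further counterfactual test.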
-
Why are quantum correlations so puzzling? A standard answer is that they seem to require either nonlocal influences or conspiratorial coincidences. This suggests that by embracing nonlocal influences we can avoid conspiratorial fine-tuning. But that’s not entirely true. Recent work, leveraging the framework of graphical causal models, shows that even with nonlocal influences, a kind of fine-tuning is needed to recover quantum correlations. This fine-tuning arises because the world has to be just so as to disable the use of nonlocal influences to signal, as required by the no-signaling theorem. This places an extra burden on theories that posit nonlocal influences, such as Bohmian mechanics, of explaining why such influences are inaccessible to causal control. I argue that Everettian Quantum Mechanics suffers no such burden. Not only does it not posit nonlocal influences, it operates outside the causal models framework that was presupposed in raising the fine-tuning worry. Specifically, it represents subsystems with density matrices instead of random variables. This allows it to sidestep all the results (including EPR and Bell) that put quantum correlations in tension with causal models. However, this doesn’t mean one must abandon causal reasoning altogether in a quantum world. After all, quantum systems can clearly stand in causal relations. When decoherence is rampant and there’s no controlled entanglement, Everettian Quantum Mechanics licenses our continued use of standard causal models. When controlled entanglement is present—such as in Bell-type experiments—we can employ recently proposed quantum causal models that are consistent with Everettian Quantum Mechanics. We never need invoke any kind of nonlocal influence or any kind of fine-tuning.
-
Feynman diagrams are used to calculate scattering amplitudes in quantum field theory, where they simplify the derivation of individual terms in the corresponding perturbation series. Because the diagrams are considered mathematical tools of an approximative character, the received view in the philosophy of physics denies that individual diagrams can represent physical processes. A different story, however, can be observed in physics practice. From education to high-profile research publications, Feynman diagrams are used in connection with particle phenomena without any reference to perturbative calculations. In the first part of the paper, I argue that this illuminates an additional use of Feynman diagrams that is not calculatory but representational. It is not a possible translation into mathematical terms that prompts this practice but rather the epistemic insights into the target phenomenon that the diagrams provide. Based on this practical use, I push back against the received view. In the second part of the paper, I conceptualize the representational use of Feynman diagrams as models that provide modal understanding of their associated target phenomena. The set of Feynman diagrams corresponding to an interaction is taken as a possibility space whose dependency relations can be analysed, allowing an agent to grasp possible target behaviour and thereby gain understanding. By clearly separating the diagrams from perturbative calculations in their use as models, the concerns that hinder a representational reading can be resolved.
-
We take a fresh look at Daniel Dennett’s naturalist legacy in philosophy, focusing on his rethinking of philosophical methods. Critics sometimes mistake Dennett for promoting a crude naturalism or dismissing philosophical tools like first-person intuition. We present his approach as more methodologically radical, blending science and philosophy in a way that treats inquiry as an evolving process. Concepts and intuitions are tested and adjusted in light of empirical findings and broader epistemic aims. For Dennett, science isn’t a limitation on philosophy, but a tool that sharpens it, with empirical data helping to refine our understanding of concepts and philosophical phenomena alike. By exploring Dennett’s methodological contributions, we underscore the ongoing importance of his naturalist perspective in today’s philosophical landscape.
-
In this paper, we argue that a perceiver’s contributions to perception can substantially affect what objects are represented in perceptual experience. To capture the scalar nature of these perceiver-contingent contributions, we introduce three grades of subject-dependency in object perception. The first grade, “weak subject-dependency,” concerns attentional changes to perceptual content, as when a perceiver turns their head, plugs their ears, or primes their attention to a particular cue. The second grade, “moderate subject-dependency,” concerns changes in the contingent features of perceptual objects due to action-orientation, location, and agential interest. For instance, being to the right or left of an object will cause the object to have a corresponding locative feature, but that feature is non-essential to the object in question. Finally, the third grade, “strong subject-dependency,” concerns generating perceptual objects whose existence depends upon their perceivers’ sensory contributions to perception. In this final grade of subject-dependency, the adaptive perceptual system shapes diverse representations of sensory information by contributing necessary features to perceptual objects. To exemplify this nonstandard form of object perception, we offer evidence from the future-directed anticipation of perceptual experts, and from the feature binding of synesthetes. We conclude that strongly subject-dependent perceptual objects are more than mere material objects: they are a necessary combination of material objects with the contributions of a perceiving subject.
-
Pedants complain that the word “literally” is more often misused than used correctly. “This post will literally blow your mind! Your brain will literally explode!” “Literally?” they exclaim. “Then I had better stop reading.”
But the pedants are not pedantic enough. …
-
In Part 4 we saw that the classical Kepler problem—the problem of a single classical particle in an inverse square force—has symmetry under the group \(\mathrm{SO}(4)\) of rotations of 4-dimensional space. Since the Lie algebra of this group is

\[
\mathfrak{so}(4) \cong \mathfrak{so}(3) \oplus \mathfrak{so}(3),
\]

we must have conserved quantities \(\vec{A}\) and \(\vec{B}\) corresponding to these two copies of \(\mathfrak{so}(3)\). The physical meaning of these quantities is a bit obscure until we form linear combinations

\[
\vec{L} = \vec{A} + \vec{B}, \qquad \vec{M} = \vec{A} - \vec{B}.
\]

Then \(\vec{L}\) is the angular momentum of the particle, while \(\vec{M}\) is a subtler conserved quantity: it’s the eccentricity vector of the particle divided by \(\sqrt{-2E}\), where the energy \(E\) is negative for bound states (that is, elliptical orbits).

The advantage of working with \(\vec{A}\) and \(\vec{B}\) is that these quantities have very nice Poisson brackets:

\[
\{A_i, A_j\} = \epsilon_{ijk} A_k, \qquad \{B_i, B_j\} = \epsilon_{ijk} B_k, \qquad \{A_i, B_j\} = 0.
\]

This says they generate two commuting \(\mathfrak{so}(3)\) symmetries. …
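A quick check, filling in a step the excerpt leaves implicit (these brackets follow directly from the linear combinations above):

\[
\{L_i, L_j\} = \epsilon_{ijk} L_k, \qquad \{L_i, M_j\} = \epsilon_{ijk} M_k, \qquad \{M_i, M_j\} = \epsilon_{ijk} L_k.
\]

The \(\vec{M}\) brackets close on \(\vec{L}\) rather than on \(\vec{M}\), so \(\vec{L}\) and \(\vec{M}\) mix the two \(\mathfrak{so}(3)\) factors, whereas \(\vec{A}\) and \(\vec{B}\) each close on themselves and commute with one another.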
-
1. Should the state get out of the marriage business? Would it be better, if “personal relationships are regulated, the vulnerable are protected, and justice is furthered, all without the state recognition of marriage or any similar alternative”? …
-
We revisit Einstein’s 1927 thought experiment on electron diffraction, using a single-electron source and an opaque hemispheric detector array, now achievable with modern sensors (~0.1 ns). In this fully enclosed system, where no signals escape the hemisphere, we provide a direct empirical comparison of the Many-Worlds Interpretation (MWI) and the Branched Hilbert Subspace Interpretation (BHSI). Both maintain unitarity without invoking wavefunction collapse, as in the Copenhagen Interpretation (CI), but differ ontologically: MWI proposes irreversible global branching into parallel worlds, while BHSI describes local, potentially reversible branching into decohered subspaces. In this setup, all quantum events (branching, engagement, disengagement, and relocation) occur entirely within the local system, and the Born rule, naturally emerging through branch weights, can be observed in detector statistics. To explore branching dynamics more thoroughly, we suggest an enhanced dual-layer experimental setup with an inner transparent detector. Because the electron’s transit time between layers (~0.12 ns) is shorter than the average response times of the inner sensors (~1 ns), this allows a crucial test of measurement timing and potential anomalies (“delayed” or “uncommitted” choice?). Our analysis challenges the notion that unitarity necessitates parallel worlds, instead advocating for a simpler view: local, unitary branching without collapse or global splitting.
-
This paper introduces "pseudo-consciousness" as a novel framework for understanding and classifying advanced artificial intelligence (AI) systems that exhibit sophisticated cognitive behaviors without possessing subjective awareness or sentience.
-
Tatiana Ehrenfest-Afanassjewa was an important physicist, mathematician, and educator in 20th century Europe. While some of her work has recently undergone reevaluation, little has been said regarding her groundbreaking work on dimensional analysis. This, in part, reflects an unfortunate dismissal of her interventions in such foundational debates by her contemporaries. In spite of this, her work on the generalized theory of homogeneous equations provides a mathematically sound foundation for dimensional analysis and has found some appreciation and development. It remains to provide a historical account of Ehrenfest-Afanassjewa’s use of the theory of homogeneous functions to ground (and limit) dimensional analysis. I take as a central focus Ehrenfest-Afanassjewa’s contributions to a debate on the foundations of dimensional analysis started by the physicist Richard Tolman in 1914. I go on to suggest an interpretation of the more thoroughgoing intervention Ehrenfest-Afanassjewa makes in 1926 based on this earlier context, especially her limited rehabilitation of a “theory of similitude” in contradistinction to dimensional analysis. I show that Ehrenfest-Afanassjewa made foundational contributions to the mathematical foundations and methodology of dimensional analysis, to our conception of the relation between constants and laws, and to our understanding of the quantitative nature of physics, all of which remain of value.
-
What is health? This book addresses this fundamental question by narrowing the focus to contemporary medicine, specifically Western biomedicine or mainstream medicine. This chapter and the next one introduce the strategy: to understand what health is, we need to analyze health concepts. The health concepts we will discuss and evaluate throughout the book are the statements found in regulatory documents of the medical and healthcare community, or the operational definitions found in research protocols and scientific articles. We will see throughout the book that each concept of health is a theoretical tool designed to serve specific goals.
-
Perspectivist positions have been proposed in physics, notably in order to address the interpretive difficulties of quantum mechanics. Recently, some versions of perspectivism have also been proposed in general philosophy of science to account for the plurality of scientific practice. Both kinds of views share the rejection of what they metaphorically call the “view from nowhere”. However, beyond this superficial similarity, they are very different: while quantum perspectivism entertains a concrete notion of perspective associated with individual agents or systems or concrete contexts, perspectival realism adopts a more abstract notion associated with explanatory aims or conceptual schemes. The aim of this paper is to clarify what is at stake with perspectivism in general. The general notion of a perspective, and the various attitudes one can entertain towards perspectives, are characterised using the concepts of harmless contradiction and cross-perspectival accessibility. A taxonomy of positions ranging from absolutism to relativism is proposed on this basis. Then the framework is applied to quantum perspectivism and perspectival realism to show its fruitfulness. Finally, I argue that abstract versions of perspectivism are bound to be metaphysically weaker than concrete versions.
-
Causal Set Theory (CST) is a promising approach to fundamental physics that seems to treat causation as a basic posit. But in exactly what sense is CST causal? We argue that if the growth dynamics is interpreted as a physical process, then CST employs relations of actual causation between causal set elements, whereby elements bring one another into existence. This is important, as it provides a better sense of how CST works, highlights important differences from general relativity—where relations between spacetime points are typically seen as cases of mere causal connectibility rather than actual causation of the relevant type—and points toward a specific understanding of the emergence of spacetime within CST.
-
The photon is typically regarded as a unitary object that is both particle-discrete and wave-continuous. This is a paradoxical position and we live with it by making dualism a fundamental feature of radiation. It is argued here that the photon is not unitary; rather it has two identities, one supporting discrete behavior and the other supporting continuous (wave) behavior. There is photon kinetic energy that is always discrete/localized on arrival; it never splits (on half-silvered mirrors) or diffracts (in pinholes or slits). Then there is the photon's probability wavefront that is continuous and diffractable. Acknowledging that the photon has two identities explains the photon's dual nature. And wave-particle duality is central to quantum mechanics. Understanding it leads to new insights into the photon's constant velocity and its entanglement with another photon.
-
The idea of using lattice methods to provide a mathematically well-defined formulation of realistic effective quantum field theories (QFTs) and clarify their physical content has gained traction in recent decades. In this paper, I argue that this strategy faces a two-sided obstacle: realistic lattice QFTs are (i) too different from their effective continuum counterparts, even at low energies, to serve as their foundational proxies and (ii) far from reproducing all of their empirical and explanatory successes, and so cannot replace them altogether. I briefly conclude with some lessons for the foundations of QFT.
-
Thought experiments (TEs) are indispensable conceptual tools in scientific research, particularly in the study of quantum gravity. Many scholars argue that the epistemic significance of TEs hinges on the proper and ineliminable use of imagination. However, there is disagreement regarding the specific nature of the imagination involved. A valuable perspective on this debate is provided by a TE proposed by Matvei Bronstein in 1936 to support a quantum theory of gravity. His contribution serves as a notable example of a destructive TE, aiming to highlight an internal inconsistency within a unified theory of both quantum mechanics and general relativity. In this paper, I reconstruct Bronstein’s TE in the context of recent discussions on the relationship between TEs and imagination. I argue that this case study challenges existing epistemological frameworks for understanding TEs. I contend that Bronstein’s TE requires a new form of imagination, which I term operational imagination, to reach its intended conclusion. I conclude that operational imagination can be integrated into simulative model-based accounts of TEs.
-
Process jargon is widespread in the physical sciences. Beginning with the work of Wesley Salmon, several accounts in philosophy of science have attempted to provide a definition of “process” compatible with scientists’ understanding of causation and explanation. The proposed characterisation ties processes to properties of the spacetime they inhabit, namely continuity and genuine causality. Recent developments in theories of quantum gravity challenge the validity of process ontologies at the fundamental scale. In particular, this paper examines how minimal-length arguments in the literature call the traditional definition of process into question. Appealing to process realism does not help the processualist against these arguments. I conclude that certain theories of quantum gravity prevent a processual representation of the intended phenomena at the fundamental scale because they predict a violation of either the spatiotemporal specification or the causality conditions. In the end, the processualist faces a dilemma: either weaken the accepted definition of process without falling into substance ontologies, or hope that the problematic theories of quantum gravity will be disconfirmed.
-
Machine learning is rapidly transforming how society and humans are quantified. Shared amongst some machine learning applications in the social and human sciences is the tendency to conflate concepts with their operationalization through particular tests or measurements. Existing scholarship reduces these equations of concept and operationalization to disciplinary naivety or negligence. This paper takes a close look at equations of concept and operationalization in machine learning predictions of poverty metrics. It develops two arguments. First, I demonstrate that conflations of concept and operationalization in machine learning poverty prediction cannot be reduced to naivety or negligence but can serve a strategic function. Second, I propose to understand this function in the context of philosophical and historical research on operationalism in the social sciences.
-
The term ‘spontaneous’ appears in various contexts in modern physics, but it also has a long history in natural philosophy. Its Greek analogue, to automaton, is studied by Aristotle, and the Latin phrase sponte sua is used extensively by Lucretius. Peirce also introduces spontaneity in the context of his tychism. In this thesis we give a historical overview of these uses of spontaneity and compare them to spontaneity in thermodynamics and quantum mechanics. We examine its relation to quantum measurement. We argue that in the Copenhagen interpretation no quantum event can be said to be truly spontaneous, but that true spontaneity does exist in spontaneous collapse theories. Finally, we investigate the relation of spontaneity to randomness and indeterminism.