1.
    - Grant Sanderson, of 3blue1brown, has put up a phenomenal YouTube video explaining Grover’s algorithm, and dispelling the fundamental misconception about quantum computing, that QC works simply by “trying all the possibilities in parallel.” Let me not futz around: this video explains, in 36 minutes, what I’ve tried to explain over and over on this blog for 20 years … and it does it better. …
    Found 1 hour, 34 minutes ago on Scott Aaronson's blog
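    A minimal classical simulation makes the video's central point concrete: Grover's algorithm finds a marked item after roughly (π/4)·√N rounds of amplitude amplification, not by examining all N possibilities at once. The sketch below is illustrative only; the database size, the single marked index, and the use of a plain NumPy state vector are my choices, not anything taken from the post or the video.

```python
# Minimal state-vector simulation of Grover search (illustrative sketch).
# For N = 1024 the near-optimal number of iterations is about (pi/4)*sqrt(N) ~ 25,
# far fewer than the ~N/2 queries a classical random search would need.
import numpy as np

n = 10                      # number of qubits
N = 2 ** n                  # size of the search space
marked = 123                # index of the single marked item (arbitrary choice)

amps = np.full(N, 1 / np.sqrt(N))          # uniform superposition over all items
iters = round(np.pi / 4 * np.sqrt(N))      # near-optimal iteration count

for _ in range(iters):
    amps[marked] *= -1                     # oracle: flip the sign of the marked amplitude
    amps = 2 * amps.mean() - amps          # diffusion: inversion about the mean

print(iters, amps[marked] ** 2)            # ~25 iterations, success probability ~0.999
```

    Running this, the success probability climbs to essentially 1 after 25 iterations for N = 1024, which is the quadratic speedup at issue; measuring the initial uniform superposition directly would succeed only with probability 1/N.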
  2.
    Common ground is the information that the participants in a conversation treat as background information for the purposes of their interaction. We review two traditions of research on common ground: The formal tradition, consisting mainly of theoretical linguists and philosophers of language, has developed increasingly sophisticated formal models of common ground in order to generate predictions about an expanding range of empirical phenomena. Meanwhile, the psycholinguistic tradition has focused on a narrower range of phenomena while developing more realistic theories of the psychological mechanisms that allow us to select and represent common ground. After summarizing these two traditions, we consider several reasons why they should be re-integrated, and argue that the best way to bring them back together would be to adopt a cognitive-pluralist approach, whereby language users have access to a variety of mechanisms for managing background information, which are more or less available and efficient depending on the communicative situation and the kind of information mentally represented, as well as the cognitive demands of each mechanism.
    Found 11 hours, 54 minutes ago on Daniel W. Harris's site
  3.
    Hyperscanning has been increasingly used to quantify the quality of social relationships by tracking the neural correlates of interpersonal interactions. This paper critically examines the use of hyperscanning to track the neural correlates of psychotherapeutic change, e.g., the patient-therapist relationship. First, we motivate our project by diagnosing a lack of complex models at the mesoscale in this domain and, consequently, a polarization of the analysis at the micro and macroscales. Looking for the causes of this issue, we highlight the epistemic blindspots of current methodologies that prioritize neural synchrony as a marker of therapeutic success. Drawing on empirical studies and theoretical frameworks, we identify an asymmetry between the neural and behavioral conceptual toolkits, with the latter remaining underdeveloped. We argue that this imbalance stems from two key issues: the underdetermined qualitative interpretation of brain data and the neglect of strong reciprocity in neuroscientific second-person paradigms. In light of our critical analysis, we suggest that further research could address the complexity of reciprocal, dynamic interactions in therapeutic contexts. Specifically, drawing on enactivism, we highlight that the autonomy of interactions is one of the factors that undermines the synchrony paradigm. This approach emphasizes the co-construction of meaning and shared experiences through embodied, reciprocal interactions, offering a more integrative understanding of therapeutic change that moves beyond static neural measures to account for the emergent and dynamic nature of social cognition.
    Found 17 hours, 14 minutes ago on PhilSci Archive
  4.
    A number of authors (Morgan, 1999; Boumans, 2005; Morrison, 2009; Massimi and Bhimji, 2015; Parker, 2017) have argued that models can be quite literally thought of as measuring instruments. I here challenge this view by reconstructing three arguments from the literature and rebutting them. Further, I argue that models should be seen as cognitive rather than measuring instruments, and that the distinction is important for understanding scientific change: Both yield two distinct sources of insight that mutually depend on each other, and should not be equated. In particular, we may perform the exact same actions in the laboratory but conceive of them entirely differently by virtue of the models we endorse at different points in time.
    Found 17 hours, 15 minutes ago on PhilSci Archive
  5.
    The gravitational Aharonov-Bohm (AB) effect, where quantum particles acquire phase shifts in curvature-free regions due to a gauge-fixed metric perturbation hµν, highlights the intriguing gauge dependence of spacetime. This study explores whether Loop Quantum Gravity (LQG), which views spacetime as emerging from SU(2)- and diffeomorphism-invariant spin networks, can accommodate this effect. The AB effect suggests that LQG should incorporate gauge dependence at the quantum level, which appears challenging within its relational, gauge-invariant framework. Potential modifications to LQG, such as introducing gauge-fixing constraints or effective fields, may require assumptions aligned with substantivalism, potentially diverging from its emergent paradigm. These results invite a thoughtful reconsideration of spacetime’s ontological status, encouraging a dialogue between relational and substantivalist perspectives in quantum gravity.
    Found 17 hours, 16 minutes ago on PhilSci Archive
  6.
    In a recent reply to my criticisms (Found. Phys. 55:5, 2025), Carcassi, Oldofredi, and Aidala (COA) admitted that their no-go result for ψ-ontic models is based on the implicit assumption that all states are equally distinguishable, but insisted that this assumption is a part of the ψ-ontic models defined by Harrigan and Spekkens, thus maintaining their result’s validity. In this note, I refute their argument again, emphasizing that the ontological models framework (OMF) does not entail this assumption. I clarify the distinction between ontological distinctness and experimental distinguishability, showing that the latter depends on dynamics absent from OMF, and address COA’s broader claims about quantum statistical mechanics and Bohmian mechanics.
    Found 17 hours, 16 minutes ago on PhilSci Archive
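    For readers coming to this exchange cold, the Harrigan–Spekkens definition at stake fits in one line. The sketch below is standard background in my notation, not the note's argument or COA's.

```latex
% Ontological models framework (standard background): each preparation of a
% quantum state \psi is assigned a probability distribution \mu_\psi over
% ontic states \lambda. The model is \psi-ontic when distinct states never
% share an ontic state:
\[
\psi \neq \phi \;\Longrightarrow\; \mu_\psi(\lambda)\,\mu_\phi(\lambda) = 0
\quad \text{for almost all } \lambda .
\]
% This is a claim of ontological distinctness; on its own it says nothing about
% whether any feasible measurement can distinguish the two preparations, which
% is the gap the note insists on.
```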
  7.
    Measuring diversity in microbial ecology and microbiome studies is fraught with challenges, rendering the assessment of its “real-world” value nearly impossible. The instability of taxonomic classification, difficulty in isolating individuals, and reliance on DNA-based methods and statistical tools all contribute to the complexity of measuring diversity reliably. This manuscript explores the underlying philosophical issues, relating them to the measurement problem in philosophy. I argue that traditional philosophical accounts of measurement, including representational, operationalist, and realist approaches, are insufficient to address these issues. Instead, I examine these challenges through the lens of a model-based perspective on measurement, which can remain agnostic about entities and property ontologies, clarify the role of assumptions in diversity measurement, and provide solutions for justifying measurement procedures. This work emphasizes the importance of calibration and clearly defining measurement purposes, providing avenues for scientists to improve their measurement procedures. Ultimately, I contribute to a deeper understanding of the challenges and opportunities in measuring microbial diversity by bridging the gap between philosophy and scientific practice.
    Found 17 hours, 16 minutes ago on PhilSci Archive
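    To make the classification-dependence worry tangible, here is a toy computation of two common diversity indices on the same counts under two taxonomic groupings. The counts, the groupings, and the choice of Shannon and Simpson indices are all invented for illustration; none of it comes from the paper.

```python
# Toy illustration (my own numbers): identical community counts give different
# Shannon and Simpson diversity values depending on whether taxa are split or lumped.
import math

def shannon(counts):
    """Shannon index H = -sum(p_i * ln p_i)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson(counts):
    """Gini-Simpson index 1 - sum(p_i^2)."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

fine_grained = [40, 30, 20, 10]   # four taxa kept separate
lumped = [70, 30]                 # the same reads merged into two higher-level taxa

print(shannon(fine_grained), simpson(fine_grained))   # ~1.28, ~0.70
print(shannon(lumped), simpson(lumped))               # ~0.61, ~0.42
```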
  8.
    There has been considerable discussion in the philosophical literature of the past decade or so of a view that has come to be known as “wave function realism,” which I will abbreviate as WFR. The basic claim of this view is that quantum theory gives us motivation to think that quantum wave functions should be thought of as fields on a space of very high (or perhaps infinite) dimension, and that this space is in some important sense more fundamental than familiar three-dimensional space or four-dimensional spacetime. Note that this is much stronger than the mere claim that quantum states represent something physically real, a claim that I myself have defended (Myrvold 2020a, 2020b).
    Found 17 hours, 17 minutes ago on PhilSci Archive
  9.
    Philosophers of physics, when engaged in matters they regard as fundamental, tend to focus their analyses on fictitious systems that are wholly isolated from their environments. When pressed, they retreat to the fiction of treating the universe as a whole as an object of scientific study. This is nothing at all like the way science is practiced. Even if a system can be insulated in such a way that its interactions with its surroundings are negligible, one only ever explicitly models a minute fraction of the degrees of freedom of the system, and, as the degrees of freedom explicitly considered interact with those that are not, those degrees of freedom that are treated in the model constitute what is, in effect, an open system.
    Found 17 hours, 17 minutes ago on PhilSci Archive
  10.
    This paper proposes a theory-neutral formal framework designed to accommodate data that implicates consciousness in anomalous observer-linked phenomena, including structured accounts sometimes interpreted as involving alleged non-human intelligence. Motivated by growing empirical reports in which observer phenomenology appears coupled to system behavior, the paper introduces an explanatory workspace that expands the standard quantum state space to include a phenomenal dimension.
    Found 2 days, 5 hours ago on PhilSci Archive
  11.
    I introduce Inevitable Actualization (IA), an ontological modality: if (1) the universe’s future time involves an unbounded sequence of causal trials (H) and (2) a state S has a non-zero physical probability P_n > 0 in trial n such that the sum ∑_{n=1}^∞ P_n diverges, then S is guaranteed to occur with probability one. IA is developed through a rigorous measure-theoretic foundation, probabilistic modeling with dependence (under standard mixing conditions) and absorbing-state exceptions, contrasting IA with classical modalities and modern multiverse theories. Positioned as a distinct third category alongside necessity and contingency, IA rests its unique grounding on temporal structure and probability. I address objections (Boltzmann brains, the measure problem, and identity duplication) and illustrate IA’s implications for ethics, cosmology, and personal identity, acknowledging formal challenges.
    Found 2 days, 5 hours ago on PhilSci Archive
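    The divergence condition in the abstract echoes a standard measure-theoretic fact, stated below for the independent case only; the paper itself works with dependent trials under mixing conditions, so take this as orienting background rather than the paper's own theorem.

```latex
% Second Borel--Cantelli lemma (standard, independent-trials version): if the
% trials are independent and the per-trial probabilities P_n of the state S satisfy
\[
P_n > 0 \quad\text{for all } n
\qquad\text{and}\qquad
\sum_{n=1}^{\infty} P_n = \infty ,
\]
% then with probability one S occurs in infinitely many trials, and hence occurs
% at least once.
```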
  12.
    A seminal controversy in statistical inference is whether error probabilities associated with an inference method are evidentially relevant once the data are in hand. Frequentist error statisticians say yes; Bayesians say no. …
    Found 3 days, 10 hours ago on D. G. Mayo's blog
  13.
    Novel tools have allowed researchers to intervene into circuits at the mesoscale. The results of these interventions are often explained by appeal to functions. How are functions ascribed to circuit parts experimentally? I identify two kinds of function ascription practices in circuit interventions. Analysis of these practices shows us that function ascriptions are challenging due to a lack of interventive control and insufficient constraints on the class of candidate functions to discriminate in practice. One kind of function ascription practice—subtractive analysis—fares better at addressing these challenges.
    Found 4 days, 5 hours ago on PhilSci Archive
  14.
    The interpretation of quantum measurements presents a fundamental challenge in quantum mechanics, with concepts such as the Copenhagen Interpretation (CI), Many-Worlds Interpretation (MWI), and Bohmian Mechanics (BM) offering distinct perspectives. We propose the Branched Hilbert Subspace Interpretation (BHSI), which describes measurement as branching the local Hilbert space of a system into parallel subspaces. We formalize the mathematical framework of BHSI using branching and the engaging and disengaging unitary operators to relationally and causally update the states of observers. Unlike the MWI, BHSI avoids the ontological proliferation of worlds and copies of observers, realizing the Born rule based on branch weights. Unlike the CI, BHSI retains the essential features of the MWI: unitary evolution and no wavefunction collapse. Unlike the BM, BHSI does not depend on a nonlocal structure, which may conflict with relativity. We apply BHSI to examples such as the double-slit experiment, the Bell test, Wigner and his friend, and the black hole information paradox. In addition, we explore whether recohering branches can be achieved in BHSI. Compared to the CI and MWI, BHSI provides a minimalist, unitarity-preserving, collapse-free, and probabilistically inherent alternative interpretation of quantum measurements.
    Found 5 days, 13 hours ago on PhilSci Archive
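    As orienting background for the talk of branch weights, the display below is the generic unitary premeasurement that any no-collapse reading starts from; it is not BHSI's own formalism (the paper's "engaging" and "disengaging" operators are its novel ingredient and are not reproduced here).

```latex
% Generic unitary premeasurement of a qubit by an apparatus/observer register:
\[
\big(\alpha\,|0\rangle + \beta\,|1\rangle\big)\otimes|R_{\mathrm{ready}}\rangle
\;\longmapsto\;
\alpha\,|0\rangle|R_0\rangle + \beta\,|1\rangle|R_1\rangle ,
\]
% with branch weights |\alpha|^2 and |\beta|^2. Any account that "realizes the
% Born rule based on branch weights" must recover these numbers as the outcome
% probabilities.
```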
  15.
    Recent results have shown that singularities can be avoided from the general relativistic standpoint in Lorentzian-Euclidean black holes by means of the transition from a Lorentzian to a Euclidean region where time loses its physical meaning and becomes imaginary. This dynamical mechanism, dubbed “atemporality”, prevents the emergence of black hole singularities and the violation of conservation laws. In this paper, the notion of atemporality together with a detailed discussion of its implications is presented from a philosophical perspective. The main result consists in showing that atemporality is naturally related to conservation laws.
    Found 5 days, 13 hours ago on PhilSci Archive
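    At the level of the line element, the Lorentzian-to-Euclidean transition the abstract invokes is the familiar imaginary-time substitution; the display below is generic background for how "time becomes imaginary", not the specific Lorentzian-Euclidean black hole geometry studied in the paper (the function f(r) is a placeholder of mine).

```latex
% Imaginary-time substitution t -> -i*tau turns a Lorentzian line element into a
% Euclidean one, so no timelike direction survives in the new region:
\[
ds^2 = -f(r)\,dt^2 + \dots
\;\xrightarrow{\;t\,\to\,-i\tau\;}\;
ds^2 = f(r)\,d\tau^2 + \dots
\]
```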
  16.
    This chapter addresses the development of tests for consciousness (C-tests), defined as any protocol or methodology devised to detect specific properties that, if present, would justify higher credence in the belief that the system under test is phenomenally conscious. Though inherently defeasible, C-tests are vital for reducing epistemic uncertainty, balancing ethical and practical considerations regarding the attribution of consciousness to systems like patients with disorders of consciousness, non-human animals, and artificial systems. In this chapter, we first present a taxonomy of currently available C-tests, describing how they rely on specific neural and/or psychological properties to reduce uncertainty about the presence of consciousness in various target systems. Second, we clarify the notion of phenomenal consciousness as the target of C-tests, delineating the limits of C-tests in being able to capture it. Third, we address the question of whether a well-established theory of consciousness and/or pre-theoretical intuitions are necessary for validation of C-tests. Fourth, we evaluate several inferential strategies to justify extrapolations of consciousness from consensus to non-consensus cases. Finally, we conclude by describing the iterative natural kind approach as a multidimensional method that integrates multiple tests with weighted evidence. This model would provide probabilistic assessments of consciousness across different populations, offering a more reliable framework for addressing non-consensus cases and providing a valuable aid for practical decision-making.
    Found 5 days, 13 hours ago on PhilSci Archive
  17.
    This paper aims to offer an alternative account for understanding scientific models based on metaphors. To accomplish this, we analyze Darwin’s use of metaphors, such as the notions of a powerful Being and the Struggle for Existence, in order to represent part of the process taking place in natural selection. The proposal emerges from two provocative issues. First, that the use of metaphors in philosophical and scientific literature is a form of approach that together with other “linguistic tropes in science dies hard” (Bailer-Jones 2002a; Keller 2002, p. 117). Second, that there are still unsolved problems in the literature on scientific models and in debates that use metaphors in science as the main epistemological approach.
    Found 5 days, 13 hours ago on PhilSci Archive
  18.
    Marletto and Vedral [Phys. Rev. Lett. 125, 040401 (2020)] propose that the Aharonov-Bohm (AB) phase is locally mediated by entanglement between a charged particle and the quantized electromagnetic field, asserting gauge independence for non-closed paths. Using quantum electrodynamics (QED), we critically analyze their model and demonstrate that the AB phase arises from the interaction with the vector potential A, not from entanglement, which is merely a byproduct of the QED framework. We show that their field-based energy formulation, intended to reflect local electromagnetic interactions, is mathematically flawed due to an incorrect prefactor and involves fields inside the solenoid, failing to support local mediation of the phase. Its equivalence to qv · A holds only in the Coulomb gauge, undermining their claim of a gauge-independent local mechanism. Furthermore, we confirm that the AB phase is gauge-dependent for non-closed paths, contradicting their assertion. Our analysis reaffirms the semi-classical interpretation, where the AB phase is driven by the vector potential A, with entanglement playing no causal role in its generation.
    Found 6 days, 5 hours ago on PhilSci Archive
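    The gauge-dependence claim for open paths follows already from the textbook form of the AB phase; the display below is that standard argument in my notation, not the QED analysis the paper carries out.

```latex
% Aharonov--Bohm phase along a path \gamma, and its behavior under a gauge
% transformation A -> A + grad(chi):
\[
\varphi_{AB} = \frac{q}{\hbar}\int_{\gamma}\mathbf{A}\cdot d\boldsymbol{\ell},
\qquad
\mathbf{A}\to\mathbf{A}+\nabla\chi
\;\Longrightarrow\;
\varphi_{AB}\to\varphi_{AB}+\frac{q}{\hbar}\big[\chi(\mathbf{x}_f)-\chi(\mathbf{x}_i)\big].
\]
% The extra term vanishes only for a closed path with single-valued \chi, where
% the phase reduces to the gauge-invariant flux value q\Phi/\hbar.
```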
  19.
    This paper reconsiders the metaphysical implications of Einstein algebras, prompted by Chen’s (2024) recent objections to Rosenstock et al.’s (2015) conclusion. Rosenstock et al.’s duality theorem relating smooth manifolds and smooth algebras supports a piece of conventional wisdom which states that the Einstein algebra formalism is not more “relationalist” than the standard manifold formalism. Nevertheless, as Chen points out, smooth algebras are different from the relevant algebraic structure of an Einstein algebra. It is therefore questionable whether Rosenstock et al.’s duality theorem can support the conventional wisdom. After revisiting John Earman’s classic works on the program of Leibniz algebras, I formalize the program in category theory and propose a new formal criterion for determining whether an algebraic formalism is more “relationalist” than the standard manifold formalism. Based on the new formal criterion, I show that the conventional wisdom is still true, though supported by a new technical result. I also show that Rosenstock et al.’s (2015) insight can be recast as a corollary of the new result. Finally, I provide a justification of the new formal criterion with a discussion of Sikorski algebras and differential spaces. The paper therefore provides a new perspective for formally investigating the metaphysical implications of an algebraic formalism for the theory of space and time.
    Found 6 days, 5 hours ago on PhilSci Archive
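    For readers who want the shape of the duality being contested: as I understand the cited result, it concerns the contravariant functor sending a manifold to its algebra of smooth real-valued functions. This is a gloss in my notation, not the paper's new criterion.

```latex
% The functor at issue sends a smooth manifold to its smooth-function algebra
% and a smooth map to precomposition (pullback):
\[
M \;\longmapsto\; C^{\infty}(M),
\qquad
(f : M \to N) \;\longmapsto\; \big(f^{*} : C^{\infty}(N) \to C^{\infty}(M)\big).
\]
% Rosenstock et al.'s theorem, as reported in the abstract, is that this yields a
% duality (a contravariant equivalence) between smooth manifolds and the
% corresponding category of smooth algebras.
```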
  20.
    This paper critically examines Ian Hacking’s account of looping effects and human kinds, focusing on three related arguments defended by Hacking: (1) the looping effects of human science classifications render their objects of classification inherently unstable, (2) looping effects preclude the possibility of generating stable projectable inferences (i.e., reliable predictions) based on human kind terms, and (3) looping effects can demarcate human science classifications from natural science classifications. Contra Hacking, I argue that: (1) some objects of human science classifications (viz., biological kinds) remain stable despite the feedback generated by their classifications, (2) human science classifications that individuate biological kinds yield stable projectable inferences, and (3) looping effects are a problematic criterion for distinguishing human science classifications from natural science classifications.
    Found 6 days, 5 hours ago on PhilSci Archive
  21.
    This paper aims to resolve the incompatibility between two extant gauge-invariant accounts of the Abelian Higgs mechanism: the first account uses global gauge symmetry breaking, and the second eliminates spontaneous symmetry breaking entirely. We resolve this incompatibility by using the constrained Hamiltonian formalism in symplectic geometry. First we argue that, unlike their local counterparts, global gauge symmetries are physical. The symmetries that are spontaneously broken by the Higgs mechanism are then the global ones. Second, we explain how the dressing field method singles out the Coulomb gauge as a preferred gauge for a gauge-invariant account of the Abelian Higgs mechanism. Based on the existence of this group of global gauge symmetries that are physical, we resolve the incompatibility between the two accounts by arguing that the correct way to carry out the second method is to eliminate only the redundant gauge symmetries, i.e. those local gauge symmetries which are not global. We extend our analysis to quantum field theory, where we show that the Abelian Higgs mechanism can be understood as spontaneous global U(1) symmetry breaking in the C*-algebraic sense.
    Found 6 days, 5 hours ago on PhilSci Archive
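    For orientation, the model under discussion is the textbook Abelian Higgs Lagrangian; the display below is standard background showing the local-versus-global U(1) distinction the paper trades on, not the paper's constrained-Hamiltonian or dressing-field analysis, and the sign conventions are my own.

```latex
% Abelian Higgs model: a complex scalar coupled to a U(1) gauge field,
\[
\mathcal{L} = -\tfrac{1}{4}F_{\mu\nu}F^{\mu\nu}
+ |D_{\mu}\phi|^{2}
- \lambda\big(|\phi|^{2}-\tfrac{v^{2}}{2}\big)^{2},
\qquad D_{\mu} = \partial_{\mu} + i q A_{\mu},
\]
% invariant under local transformations \phi -> e^{i\alpha(x)} \phi,
% A_\mu -> A_\mu - (1/q) \partial_\mu \alpha(x). The global transformations are
% the special case of constant \alpha, which leave A_\mu untouched; these are
% the symmetries the paper argues are physical and spontaneously broken.
```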
  22.
    Some authors maintain that we can use causal Bayes nets to infer whether X → Y or X ← Y by consulting a probability distribution defined over some exogenous source of variation for X or Y. We raise a problem for this approach. Specifically, we point out that there are cases where an exogenous cause of X (Ex) has no probabilistic influence on Y no matter the direction of causation — namely, cases where Ex → X → Y and Ex → X ← Y are probabilistically indistinguishable. We then assess the philosophical significance of this problem and discuss some potential solutions.
    Found 6 days, 20 hours ago on Reuben Stern's site
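    One way such a case can arise is easy to exhibit by brute force. The toy joint distribution below is my own construction for illustration, not the paper's example: Ex is probabilistically relevant to X and X to Y, yet Ex and Y come out independent, so the very same joint is Markov with respect to both Ex → X → Y and Ex → X ← Y.

```python
# Toy construction (mine, not the paper's): Ex and U are independent fair coins,
# X = Ex + 2U encodes both, and Y = U. Then X depends on Ex (X mod 2 = Ex) and
# Y depends on X (Y = X // 2), but Ex carries no probabilistic information about
# Y, exactly as it wouldn't if X were a collider (Ex -> X <- Y).
from itertools import product
from collections import defaultdict

joint = defaultdict(float)
for ex, u in product([0, 1], repeat=2):
    x, y = ex + 2 * u, u
    joint[(ex, x, y)] += 0.25              # Ex and U are independent fair coins

def marginal(keep):
    """Marginalize the joint onto the variable positions listed in `keep`."""
    out = defaultdict(float)
    for (ex, x, y), p in joint.items():
        out[tuple((ex, x, y)[i] for i in keep)] += p
    return out

p_ex, p_y, p_ex_y = marginal([0]), marginal([2]), marginal([0, 2])
independent = all(abs(p_ex_y[(e, y)] - p_ex[(e,)] * p_y[(y,)]) < 1e-12
                  for e, y in product([0, 1], repeat=2))
print(independent)                         # True: Ex is independent of Y
```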
  23.
    The paper argues for a non-disjunctivist account of reference in episodic memory. Our account provides a uniform theory of reference for episodic memories that are rooted in veridical and non-veridical experiences. It is independent of the particular mechanisms that subserve the respective source experiences. We reject both relationalist and intentionalist analyses of memory and build our approach on Werning and Liefke’s theory of referential parasitism and Werning’s theory of trace minimalism. The motivation for our non-disjunctivist account is the assumption that perceptual and non-perceptual memories with an episodic character share a uniform underlying causal mechanism and thus make up one and the same natural kind.
    Found 1 week ago on Markus Werning's site
  24.
    Canonical is a solver for type inhabitation in dependent type theory, that is, the problem of producing a term of a given type. We present a Lean tactic which invokes Canonical to generate proof terms and synthesize programs. The tactic supports higher-order and dependently-typed goals, structural recursion over indexed inductive types, and definitional equality. Canonical finds proofs for 84% of Natural Number Game problems in 51 seconds total.
    Found 1 week ago on Jeremy Avigad's site
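    A sketch of what invoking the tool might look like on a Natural Number Game-style goal. The import and tactic names (`Canonical`, `canonical`) are assumptions read off the abstract's description of the Lean interface, and whether this particular goal is within the solver's reach is likewise my assumption.

```lean
-- Sketch only: assumes the package is imported as `Canonical` and exposes a
-- `canonical` tactic, per the abstract's description; names may differ.
import Canonical

-- A Natural Number Game-style goal. `n + 0 = n` holds by the defining equations
-- of `Nat.add`, the kind of definitional equality the solver is said to handle.
example (n : Nat) : n + 0 = n := by
  canonical
```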
  25.
    Visual illusions provide a means of investigating the rules and principles through which approximate number representations are formed. Here, we investigated the developmental trajectory of an important numerical illusion – the connectedness illusion, wherein connecting pairs of items with thin lines reduces perceived number without altering continuous attributes of the collections. We found that children as young as 5 years of age showed susceptibility to the illusion and that the magnitude of the effect increased into adulthood. Moreover, individuals with greater numerical acuity exhibited stronger connectedness illusions after controlling for age. Overall, these results suggest that the approximate number system expects to enumerate over bounded wholes and that doing so is a signature of its optimal functioning.
    Found 1 week, 1 day ago on Sam Clarke's page
  26.
    Common sense tells us that biological systems are goal-directed, and yet the concept remains philosophically problematic. We propose a novel characterization of goal-directed activities as a basis for hypothesizing about and investigating explanatory mechanisms. We focus on survival goals such as providing adequate nutrition to body tissues, highlighting two key features—normativity and action. These are closely linked inasmuch as goal-directed actions must meet normative requirements such as that they occur when required and not at other times. We illustrate how goal-directed actions are initiated and terminated not by environmental features and goals themselves, but by markers for them. For example, timely blood clotting is the essential response to injury, but platelet activation, required for clotting, is initiated not by the injury itself but by a short sequence of amino acids (GPO) that provides a reliable marker for it. We then make the case that the operation of markers is a prerequisite for common biological phenomena such as mistake-proneness and mimicry. We go on to identify properties of markers in general, including those that are genetically determined and those that can be acquired through associative learning. Both provide the basis for matching actions to changing environments and hence adaptive goal-directedness. We describe how goal-directed activities such as bird nest construction and birdsong learning, completed in anticipation of actions in the environment, have to be evaluated and practiced against a standard of correctness. This characterization of goal-directedness is sufficiently detailed to provide a basis for the scientific study of mechanisms.
    Found 1 week, 1 day ago on David S. Oderberg's site
  27.
    In this paper, we investigate the treatment of the direction of time in Bohmian mechanics. We show how Bohmian mechanics can account for the direction of time in different ways. In particular, we argue that Bohmian mechanics can be employed to accommodate reductionism, because there is always an asymmetry in the initial conditions when forward and backward evolutions of the configuration of matter are compared. It can also be employed to accommodate primitivism and relationalism due to the fact that Bohmian mechanics is a first-order theory that recognizes only position as a primitive physical magnitude. We show how this fact can be employed to support a primitive direction of time by assuming Leibnizian relationalism, which reduces the direction of time to change in the configuration of matter, with that change being directed as a primitive matter of fact.
    Found 1 week, 4 days ago on PhilSci Archive
  28.
    The capacity for purposeful choice among genuine alternatives—commonly termed free will—presents a profound challenge to a scientific worldview often perceived as deterministic. Understanding how seemingly goal-directed actions, observed across the spectrum of life from bacteria navigating chemical gradients (chemotaxis) to humans deliberating complex decisions, can arise from underlying physical and chemical processes is a central question in both philosophy and science. This paper explores the possibility of naturalizing free will by conceptualizing it as emergent autonomy: a capacity rooted in the unique organization of life itself, an organization that unfolds dynamically in real, lived time (Mascolo & Kallio, 2019; Moore, 2023). Foundational work by thinkers like Kauffman & Clayton (2006) on emergence and organization provides crucial groundwork for such an approach.
    Found 1 week, 4 days ago on PhilSci Archive
  29.
    The nineteenth-century distinction between the nomothetic and the idiographic approach to scientific inquiry can provide valuable insight into the epistemic challenges faced in contemporary earth modelling. However, as it stands, the nomothetic-idiographic dichotomy does not fully encompass the range of modelling commitments and trade-offs that geoscientists need to navigate in their practice. Adopting a historical epistemology perspective, I propose to further spell out this dichotomy as a set of modelling decisions concerning historicity, model complexity, scale, and closure. Then, I suggest that, to address the challenges posed by these decisions, a pluralist stance towards the cognitive aims of earth modelling should be endorsed, especially beyond predictive aims.
    Found 1 week, 5 days ago on PhilSci Archive
  30.
    This is an introduction to a collection of articles on the conceptual history of epigenesis, from Aristotle to Harvey, Cavendish, Kant and Erasmus Darwin, moving into nineteenth-century biology with Wolff, Blumenbach and His, and onto the twentieth century and current issues, with Waddington and epigenetics. The purpose of the topical collection is to emphasize how epigenesis marks the point of intersection of a theory of biological development and a (philosophical) theory of active matter. We also wish to show that the concept of epigenesis existed prior to biological theorization and that it continues to permeate thinking about development in recent biological debates.
    Found 1 week, 5 days ago on PhilSci Archive