1.
    The paper proposes and studies new classical, type-free theories of truth and determinateness with unprecedented features. The theories are fully compositional, strongly classical (namely, their internal and external logics are both classical), and feature a defined determinateness predicate satisfying desirable and widely agreed principles. The theories capture a conception of truth and determinateness according to which the generalizing power associated with the classicality and full compositionality of truth is combined with the identification of a natural class of sentences – the determinate ones – for which clear-cut semantic rules are available. Our theories can also be seen as the classical closures of Kripke-Feferman truth: their ω-models, which we precisely pin down, result from including in the extension of the truth predicate the sentences that are satisfied by a Kripkean closed-off fixed-point model. The theories are compared with recent theories proposed by Fujimoto and Halbach, which feature a primitive determinateness predicate. In the paper we show that our theories entail all principles of Fujimoto and Halbach’s theories and are proof-theoretically equivalent to Fujimoto and Halbach’s CD. We also establish some negative results on Fujimoto and Halbach’s theories: these results show that, unlike what happens in our theories, the primitive determinateness predicate prevents one from establishing clear and unrestricted semantic rules for the language with type-free truth.
    Found 9 hours, 31 minutes ago on Carlo Nicolai's site
  2.
    In contemporary philosophy of physics, there has recently been a renewed interest in the theory of geometric objects—a programme developed originally by geometers such as Schouten, Veblen, and others in the 1920s and 30s. However, as yet, there has been little-to-no systematic investigation into the history of the geometric object concept. I discuss the early development of the geometric object concept, and show that geometers working on the programme in the 1920s and early 1930s had a more expansive conception of geometric objects than that found in later presentations: unlike the modern conception of geometric objects, theirs included embedded submanifolds such as points, curves, and hypersurfaces. I reconstruct and critically evaluate their arguments for this more expansive geometric object concept, and also locate and assess the transition to the more restrictive modern geometric object concept.
    Found 10 hours, 3 minutes ago on PhilSci Archive
  3.
    Chronogeometry is often conceived as a necessary condition for spatiotemporality, yet many theories of quantum gravity (QG) seem to challenge it. Applications of noncommutative geometry (NCG) to QG propose that spacetime exhibits noncommutative features at or beyond the Planck scale, thereby replacing relativistic symmetries with their deformations, known as quantum groups. This leads to an algebraic formulation of noncommutative structure that postulates a minimal length scale and deforms relativistic (commutative) physics, raising questions about whether noncommutative theories preserve spatiotemporal content, and specifically, chronogeometry. I argue that noncommutative approaches can satisfy an appropriate definition of chronogeometry, thus attaining physical significance within QG. In particular, I contend that noncommutativity is compatible with chronogeometricity, using κ-Minkowski spacetime as a case study in NCG. In this algebraic setting, physical interpretation hinges on two crucial elements: a representation of the noncommutative algebra and a corresponding set of observers. I show how this framework enables the algebra to encode localisation procedures for events in noncommutative spacetime, relative to a noncommutative reference frame, with frame transformations governed by the quantum group structure. By enriching the theory with noncommutative reference frames, NCG can satisfy the necessary representational principles to support chronogeometric content.
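    For readers unfamiliar with κ-Minkowski spacetime, its defining commutation relations in the most common presentation (conventions vary by author; this gloss is mine, not drawn from the paper) are
      \[ [x^0, x^j] = \frac{i}{\kappa}\, x^j, \qquad [x^i, x^j] = 0, \qquad i, j = 1, 2, 3, \]
    where κ is a deformation parameter with dimensions of mass (an inverse length in natural units), often identified with the Planck scale; the relativistic symmetries deform correspondingly into the κ-Poincaré quantum group.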
    Found 10 hours, 3 minutes ago on PhilSci Archive
  4.
    Many of the theories found in contemporary high-energy physics are gauge theories. The theory of the electromagnetic force is a gauge theory, as are the theories of the weak and strong nuclear forces. Philosophers disagree about which other theories are gauge theories, but they generally agree that gauge theories present distinctive puzzles concerning mathematical representation. Philosophical discussion of gauge theories has focused on these puzzles alongside the metaphysical and epistemological consequences of the fact that gauge theories feature centrally in theories of the fundamental physical forces.
    Found 19 hours, 29 minutes ago on Stanford Encyclopedia of Philosophy
  5.
    The concept of preference spans numerous research fields, resulting in diverse perspectives on the topic. Preference logic specifically focuses on reasoning about preferences when comparing objects, situations, actions, and more, by examining their formal properties. This entry surveys major developments in preference logic to date. Section 2 provides a historical overview, beginning with foundational work by Halldén and von Wright, who emphasized the syntactic aspects of preference. In Section 3, early semantic contributions by Rescher and Van Dalen are introduced. The consideration of preference relations over possible worlds naturally gives rise to modal preference logic where preference lifting enables comparisons across sets of possible worlds.
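    To make the idea of preference lifting concrete (a standard textbook construction, not necessarily the formulation used in the entry): given a preference order ⪯ on possible worlds, one common lifting compares sets of worlds X and Y by
      \[ X \preceq_{\forall\exists} Y \quad \iff \quad \forall x \in X\; \exists y \in Y \;\; x \preceq y, \]
    with the other liftings (∃∃, ∀∀, ∃∀) defined analogously; modal preference logic then reasons about such lifted comparisons using modalities over the underlying order.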
    Found 19 hours, 29 minutes ago on Stanford Encyclopedia of Philosophy
  6.
    I’ve explained a cool way to treat bound states of the hydrogen atom as wavefunctions on a sphere in 4-dimensional space. But so far I’ve been neglecting the electron’s spin. Now let’s throw that in too! …
    Found 22 hours, 8 minutes ago on Azimuth
  7.
    We ask how and why mathematical physics may be seen as a rigorous discipline. Starting with Newton but drawing on a philosophical tradition ranging from Aristotle to (late) Wittgenstein, we argue that, as in mathematics, rigour ultimately comes from rules. These include logical rules of inference as well as definitions that give a precise meaning to physical concepts such as space and time by providing rules governing their use in models of the theories in which they are defined. In particular, so-called implicit definitions characterize “indefinables” whose traditionally assumed familiarity through “intuition” or “acquaintance” from Aristotle down to Russell blasts any hope of both rigour and innovation. Given the basic physical concepts, one may subsequently define derived concepts (like black holes or determinism). Definitions are seen as a priori meaning-constitutive conventions that are neither necessary à la Kant nor arbitrary à la Carnap, as they originate in empirical science as well as in the autonomous development of mathematics and physics. As such, definitions are best seen as hypothetical.
    Found 1 day, 2 hours ago on PhilSci Archive
  8.
    According to the stochastic-quantum correspondence, a quantum system can be understood as a stochastic process unfolding in an old-fashioned configuration space based on ordinary notions of probability and ‘indivisible’ stochastic laws, which are a non-Markovian generalization of the laws that describe a textbook stochastic process. The Hilbert spaces of quantum theory and their ingredients, including wave functions, can then be relegated to secondary roles as convenient mathematical appurtenances. In addition to providing an arguably more transparent way to understand and modify quantum theory, this indivisible-stochastic formulation may lead to new possible applications of the theory. This paper initiates a deeper investigation into the conceptual foundations and structure of the stochastic-quantum correspondence, with a particular focus on novel forms of gauge invariance, dynamical symmetries, and Hilbert-space dilations.
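    As a minimal numerical sketch of the bridge this correspondence is usually presented with (my illustration, assuming the standard presentation in which a unitary induces a unistochastic transition matrix; this is not code from the paper):

      import numpy as np

      # Sketch: a unitary U on an N-dimensional Hilbert space induces a transition
      # matrix Gamma with Gamma[i, j] = |U[i, j]|**2. Each column of Gamma sums to 1,
      # so Gamma maps probability distributions over configurations to probability
      # distributions -- the ordinary-probability side of the correspondence.
      # "Indivisibility" means Gamma(t) need not factor into stochastic matrices for
      # intermediate times, unlike a textbook Markov process.

      rng = np.random.default_rng(0)
      N = 4

      # Build a random unitary via QR decomposition of a complex Gaussian matrix.
      A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
      Q, R = np.linalg.qr(A)
      U = Q * (np.diag(R) / np.abs(np.diag(R)))   # phase fix keeps U unitary

      Gamma = np.abs(U) ** 2                      # unistochastic transition matrix

      assert np.allclose(Gamma.sum(axis=0), 1.0)  # columns are probability vectors

      p0 = np.full(N, 1.0 / N)                    # uniform initial distribution
      p_t = Gamma @ p0                            # distribution after the transition
      print(p_t, p_t.sum())                       # still a probability distribution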
    Found 1 day, 2 hours ago on PhilSci Archive
  9.
    Accuracy plays an important role in the deployment of machine learning algorithms. But accuracy is not the only epistemic property that matters. For instance, it is well-known that algorithms may perform accurately during their training phase but experience a significant drop in performance when deployed in real-world conditions. To address this gap, people have turned to the concept of algorithmic robustness. Roughly, robustness refers to an algorithm’s ability to maintain its performance across a range of real-world and hypothetical conditions. In this paper, we develop a rigorous account of algorithmic robustness grounded in Robert Nozick’s counterfactual sensitivity and adherence conditions for knowledge. By bridging insights from epistemology and machine learning, we offer a novel conceptualization of robustness that captures key instances of algorithmic brittleness while advancing discussions on reliable AI deployment. We also show how a sensitivity-based account of robustness provides notable advantages over related approaches to algorithmic brittleness, including causal and safety-based ones.
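    For orientation, the epistemological template being adapted is Nozick's pair of tracking conditions for "S knows that p" (stated here in their standard counterfactual form; the transposition to algorithmic performance is the paper's contribution and is not reproduced here):
      \[ \text{(sensitivity)} \quad \neg p \mathrel{\Box\!\!\rightarrow} \neg B_S\, p, \qquad \text{(adherence)} \quad p \mathrel{\Box\!\!\rightarrow} B_S\, p, \]
    together with the requirements that p is true and that S believes p, where the box-arrow is the counterfactual conditional and B_S is belief by S.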
    Found 2 days, 9 hours ago on Jens Christian Bjerring's site
  10.
    Why are quantum correlations so puzzling? A standard answer is that they seem to require either nonlocal influences or conspiratorial coincidences. This suggests that by embracing nonlocal influences we can avoid conspiratorial fine-tuning. But that’s not entirely true. Recent work, leveraging the framework of graphical causal models, shows that even with nonlocal influences, a kind of fine-tuning is needed to recover quantum correlations. This fine-tuning arises because the world has to be just so as to disable the use of nonlocal influences to signal, as required by the no-signaling theorem. This places an extra burden on theories that posit nonlocal influences, such as Bohmian mechanics, of explaining why such influences are inaccessible to causal control. I argue that Everettian Quantum Mechanics suffers no such burden. Not only does it not posit nonlocal influences, it operates outside the causal models framework that was presupposed in raising the fine-tuning worry. Specifically, it represents subsystems with density matrices instead of random variables. This allows it to sidestep all the results (including EPR and Bell) that put quantum correlations in tension with causal models. However, this doesn’t mean one must abandon causal reasoning altogether in a quantum world. After all, quantum systems can clearly stand in causal relations. When decoherence is rampant and there’s no controlled entanglement, Everettian Quantum Mechanics licenses our continued use of standard causal models. When controlled entanglement is present—such as in Bell-type experiments—we can employ recently proposed quantum causal models that are consistent with Everettian Quantum Mechanics. We never need invoke any kind of nonlocal influence or any kind of fine-tuning.
    Found 3 days, 2 hours ago on PhilSci Archive
  11.
    Feynman diagrams are used to calculate scattering amplitudes in quantum field theory, where they simplify the derivation of individual terms in the corresponding perturbation series. Because the diagrams are considered mathematical tools with an approximative character, the received view in the philosophy of physics denies that individual diagrams can represent physical processes. A different story, however, can be observed in physics practice. From education to high-profile research publications, Feynman diagrams are used in connection with particle phenomena without any reference to perturbative calculations. In the first part of the paper, I argue that this illuminates an additional use of Feynman diagrams that is not calculatory but representational. It is not a possible translation into mathematical terms that prompts this practice but rather the epistemic insights into the target phenomenon that the diagrams provide. Based on this practical use, I intend to push back against the received view. In the second part of the paper, I conceptualize the representative use of Feynman diagrams as models that provide modal understanding of their associated target phenomena. The set of Feynman diagrams corresponding to an interaction is taken as a possibility space whose dependency relations can be analysed, allowing an agent to grasp possible target behaviour, leading to understanding. In clearly separating the diagrams from perturbative calculations for their use as a model, the concerns that hinder a representative reading can be resolved.
    Found 3 days, 2 hours ago on PhilSci Archive
  12.
    In Part 4 we saw that the classical Kepler problem—the problem of a single classical particle in an inverse square force—has symmetry under the group of rotations of 4-dimensional space, SO(4). Since the Lie algebra of this group is so(3) ⊕ so(3), we must have two conserved quantities corresponding to these two copies of so(3). The physical meaning of these quantities is a bit obscure until we form linear combinations of them. Then one combination is the angular momentum of the particle, while the other is a subtler conserved quantity: it’s the eccentricity vector of the particle divided by √(−2H), where the energy H is negative for bound states (that is, elliptical orbits). The advantage of working with the two so(3) quantities is that they have very nice Poisson brackets, which say that they generate two commuting symmetries. …
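    In standard notation for the bound Kepler problem (unit mass and coupling; this fills in the symbols the excerpt alludes to and is not the post's own notation), the quantities in question are
      \[ \vec L = \vec q \times \vec p, \qquad \vec M = \frac{\vec e}{\sqrt{-2H}}, \qquad \vec e = \vec p \times \vec L - \frac{\vec q}{|\vec q|}, \qquad H < 0, \]
    with Poisson brackets
      \[ \{L_a, L_b\} = \epsilon_{abc} L_c, \quad \{L_a, M_b\} = \epsilon_{abc} M_c, \quad \{M_a, M_b\} = \epsilon_{abc} L_c, \]
    so that \( \vec J_\pm = \tfrac12(\vec L \pm \vec M) \) satisfy \( \{J^\pm_a, J^\pm_b\} = \epsilon_{abc} J^\pm_c \) and \( \{J^+_a, J^-_b\} = 0 \): two commuting copies of so(3), i.e. so(4).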
    Found 3 days, 22 hours ago on Azimuth
  13.
    We revisit Einstein’s 1927 thought experiment on electron diffraction, using a single-electron source and an opaque hemispheric detector array, now achievable with modern sensors (~0.1 ns). In this fully enclosed system, where no signals escape the hemisphere, we provide a direct empirical comparison of the Many-Worlds Interpretation (MWI) and the Branched Hilbert Subspace Interpretation (BHSI). Both maintain unitarity without invoking wavefunction collapse, as in the Copenhagen Interpretation (CI), but differ ontologically: MWI proposes irreversible global branching into parallel worlds, while BHSI describes local, potentially reversible branching into decohered subspaces. In this setup, all quantum events (branching, engagement, disengagement, and relocation) occur entirely within the local system, and the Born rule, naturally emerging through branch weights, can be observed in detector statistics. To explore branching dynamics more thoroughly, we suggest an enhanced dual-layer experimental setup with an inner transparent detector. Because the electron’s transit time between layers (~0.12 ns) is shorter than the average response times of the inner sensors (~1 ns), this allows a crucial test of measurement timing and potential anomalies (“delayed” or “uncommitted” choice?). Our analysis challenges the notion that unitarity necessitates parallel worlds, instead advocating for a simpler view: local, unitary branching without collapse or global splitting.
    Found 4 days, 10 hours ago on PhilSci Archive
  14.
    Tatiana Ehrenfest-Afanassjewa was an important physicist, mathematician, and educator in 20th century Europe. While some of her work has recently undergone reevaluation, little has been said regarding her groundbreaking work on dimensional analysis. This, in part, reflects an unfortunate dismissal of her interventions in such foundational debates by her contemporaries. In spite of this, her work on the generalized theory of homogeneous equations provides a mathematically sound foundation for dimensional analysis and has found some appreciation and development. It remains to provide a historical account of Ehrenfest-Afanassjewa’s use of the theory of homogeneous functions to ground (and limit) dimensional analysis. We take as a central focus Ehrenfest-Afanassjewa’s contributions to a debate on the foundations of dimensional analysis started by physicist Richard Tolman in 1914. I go on to suggest an interpretation of the more thoroughgoing intervention Ehrenfest-Afanassjewa makes in 1926 based on this earlier context, especially her limited rehabilitation of a “theory of similitude” in contradistinction to dimensional analysis. It is shown that Ehrenfest-Afanassjewa has made foundational contributions to the mathematical foundations and methodology of dimensional analysis, our conception of the relation between constants and laws, and our understanding of the quantitative nature of physics, which remain of value.
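    For orientation, a standard formulation of the underlying notion (not necessarily Ehrenfest-Afanassjewa's own): a function f is generalized-homogeneous when rescaling its arguments by powers of a common factor rescales its value,
      \[ f(\lambda^{a_1} x_1, \ldots, \lambda^{a_n} x_n) = \lambda^{a_0} f(x_1, \ldots, x_n) \quad \text{for all } \lambda > 0. \]
    On the picture the abstract describes, dimensional analysis is grounded in, and limited by, the requirement that physical laws be expressible by equations homogeneous in this generalized sense under changes of units.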
    Found 4 days, 10 hours ago on PhilSci Archive
  15.
    The photon is typically regarded as a unitary object that is both particle-discrete and wave-continuous. This is a paradoxical position and we live with it by making dualism a fundamental feature of radiation. It is argued here that the photon is not unitary; rather it has two identities, one supporting discrete behavior and the other supporting continuous (wave) behavior. There is photon kinetic energy that is always discrete/localized on arrival; it never splits (on half-silvered mirrors) or diffracts (in pinholes or slits). Then there is the photon’s probability wavefront that is continuous and diffractable. Acknowledging that the photon has two identities explains the photon’s dual nature. And wave-particle duality is central to quantum mechanics. Understanding it leads to new insights into the photon’s constant velocity and its entanglement with another photon.
    Found 5 days, 18 hours ago on PhilSci Archive
  16.
    The idea of using lattice methods to provide a mathematically well-defined formulation of realistic effective quantum field theories (QFTs) and clarify their physical content has gained traction in the last decades. In this paper, I argue that this strategy faces a two-sided obstacle: realistic lattice QFTs are (i) too different from their effective continuum counterparts even at low energies to serve as their foundational proxies and (ii) far from reproducing all of their empirical and explanatory successes to replace them altogether. I briefly conclude with some lessons for the foundations of QFT.
    Found 5 days, 18 hours ago on PhilSci Archive
  17.
    Machine learning is rapidly transforming how society and humans are quantified. Shared amongst some machine learning applications in the social and human sciences is the tendency to conflate concepts with their operationalization through particular tests or measurements. Existing scholarship reduces these equations of concept and operationalization to disciplinary naivety or negligence. This paper takes a close look at equations of concept and operationalization in machine learning predictions of poverty metrics. It develops two arguments. First, I demonstrate that conflations of concept and operationalization in machine learning poverty prediction cannot be reduced to naivety or negligence but can serve a strategic function. Second, I propose to understand this function in the context of philosophical and historical research on operationalism in the social sciences.
    Found 5 days, 18 hours ago on PhilSci Archive
  18.
    In Part 4 we saw how the classical Kepler problem is connected to a particle moving on the 3-sphere, and how this illuminates the secret symmetry of the Kepler problem. There are various ways to quantize the Kepler problem and obtain a description of the hydrogen atom’s bound states as wavefunctions on the 3-sphere. …
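    For reference, the structure any such quantization has to reproduce (standard facts about hydrogen, not an excerpt from the post): in atomic units the bound-state energies and degeneracies, ignoring spin, are
      \[ E_n = -\frac{1}{2n^2}, \qquad \dim \mathcal{H}_n = \sum_{\ell=0}^{n-1} (2\ell + 1) = n^2, \qquad n = 1, 2, 3, \ldots, \]
    and n² is exactly the dimension of the space of degree-(n−1) harmonic polynomials on R⁴ restricted to the 3-sphere, which is how the wavefunctions-on-the-3-sphere picture accounts for the degeneracy.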
    Found 6 days, 22 hours ago on Azimuth
  19.
    The history of logic is a topical area of research within contemporary logical knowledge. Such investigations, first, contribute to forming a general picture of the evolution of logic, to understanding the changes its subject matter has undergone as a science and as an academic discipline, as well as the changes in the paradigmatic principles of its historical development, in the fundamental rules for constructing logical theories, and in the toolkit of the latter. Second, historico-logical studies make it possible to reveal how logical conceptions have influenced other scientific disciplines, above all philosophy and mathematics. Third, historico-logical analysis allows one to consider a given author's logical position in a broad historico-philosophical context and to show how philosophical ideas have influenced the development of logical knowledge. Fourth, research in the history of logic helps to consider it in a broad historical and cultural context, and to elucidate the mutual influence of various logical views, particular cultural traditions, and the features of historical epochs.
    Found 1 week ago on Heinrich Wansing's site
  20.
    We present a logic which deals with connexive exclusion. Exclusion (also called “co-implication”) is considered to be a propositional connective dual to the connective of implication. Similarly to implication, exclusion turns out to be non-connexive in both classical and intuitionistic logics, in the sense that it does not satisfy certain principles that express such connexivity. We formulate these principles for connexive exclusion, which are in some sense dual to the well-known Aristotle’s and Boethius’ theses for connexive implication. A logical system in a language containing exclusion and negation can be called a logic of connexive exclusion if and only if it obeys these principles, and, in addition, the connective of exclusion in it is asymmetric, thus being different from a simple mutual incompatibility of propositions. We will develop a certain approach to such a logic of connexive exclusion based on a semantic justification of the connective in question. Our paradigm logic of connexive implication will be the connexive logic C, and exactly like this logic the logic of connexive exclusion turns out to be contradictory though not trivial.
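    For context, the implication principles being dualized are the standard connexive theses (stated here for orientation; the dual principles for exclusion are the paper's own and are not reproduced here):
      \[ \text{Aristotle's theses:} \quad \neg(A \to \neg A), \quad \neg(\neg A \to A); \qquad \text{Boethius' theses:} \quad (A \to B) \to \neg(A \to \neg B), \quad (A \to \neg B) \to \neg(A \to B). \]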
    Found 1 week ago on Heinrich Wansing's site
  21.
    The Kepler problem is the study of a particle moving in an attractive inverse square force. In classical mechanics, this problem shows up when you study the motion of a planet around the Sun in the Solar System. …
    Found 1 week, 2 days ago on Azimuth
  22.
    I review the works of Gärdenfors (1990) and Scorzato (2013) and show that their combination provides an elegant solution of Goodman’s new riddle of induction. The solution is based on two main ideas: (1) clarifying what is expected from a solution: understanding that philosophy of science is a science itself, with the same limitations and strengths as other scientific disciplines; (2) understanding that the concept of complexity of a model’s assumptions and the concept of direct measurements must be characterized together. Although both measurements and complexity have been the subject of a vast literature, within the philosophy of science, essentially no other attempt has been made to combine them. The widespread expectation, among modern philosophers, that Goodman’s new riddle cannot be solved is clearly not defensible without serious exploration of such a natural approach. A clarification of this riddle has always been very important, but it has become even more crucial in the age of AI.
    Found 1 week, 2 days ago on PhilSci Archive
  23.
    Lévy’s Upward Theorem says that the conditional expectation of an integrable random variable converges with probability one to its true value with increasing information. In this paper, we use methods from effective probability theory to characterise the probability one set along which convergence to the truth occurs, and the rate at which the convergence occurs. We work within the setting of computable probability measures defined on computable Polish spaces and introduce a new general theory of effective disintegrations. We use this machinery to prove our main results, which (1) identify the points along which certain classes of effective random variables converge to the truth in terms of certain classes of algorithmically random points, and which further (2) identify when computable rates of convergence exist. Our convergence results significantly generalize earlier results within a unifying novel abstract framework, and there are no precursors of our results on computable rates of convergence. Finally, we make a case for the importance of our work for the foundations of Bayesian probability theory.
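    In its textbook form (stated here for orientation; the paper works with effective refinements of it), Lévy's upward theorem says:
      \[ \text{if } X \in L^1(\Omega, \mathcal{F}, P) \text{ and } \mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots \text{ with } \mathcal{F}_\infty = \sigma\Big(\bigcup_n \mathcal{F}_n\Big), \text{ then } \mathbb{E}[X \mid \mathcal{F}_n] \to \mathbb{E}[X \mid \mathcal{F}_\infty] \text{ a.s. and in } L^1. \]
    When X is F_∞-measurable the limit is X itself, which is the "convergence to the true value" referred to above.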
    Found 1 week, 3 days ago on Simon M. Huttegger's site
  24.
    There are four well-known models of fundamental objective probabilistic reality: classical probability, comparative probability, non-Archimedean probability, and primitive conditional probability. I offer two desiderata for an account of fundamental objective probability, comprehensiveness and non-superfluity. It is plausible that classical probabilities lack comprehensiveness by not capturing some intuitively correct probability comparisons, such as that it is less likely that 0 = 1 than that a dart randomly thrown at a target will hit the exact center, even though both classically have probability zero. We thus want a comparison between probabilities with a higher resolution than we get from classical probabilities. Comparative and non-Archimedean probabilities have a hope of providing such a comparison, but for known reasons do not appear to satisfy our desiderata. The last approach to this problem is to employ primitive conditional probabilities, such as Popper functions, and then argue that P(0 = 1 | 0 = 1 or hit center) = 0 < 1 = P(hit center | 0 = 1 or hit center). But now we have a technical question: How can we reconstruct a probability comparison, ideally satisfying the standard axioms of comparative probability, from a primitive conditional probability? I will prove that, given some plausible assumptions, it is impossible to perform this task: conditional probabilities just do not carry enough information to define a satisfactory comparative probability. The result is that none of the models satisfies our two desiderata. We end by briefly considering three paths forward.
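    The natural candidate construction suggested by the abstract's own example (my rendering; whether it, or any refinement of it, can satisfy the comparative-probability axioms is precisely the question the paper addresses) is
      \[ A \preceq B \quad \text{iff} \quad P(A \mid A \cup B) \leq P(B \mid A \cup B). \]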
    Found 1 week, 4 days ago on PhilSci Archive
  25.
    Weatherall and Manchak (2014) show that Reichenbachean universal effects, constrained to a rank-2 tensor field representation in the geodesic equation, always exist in non-relativistic gravity but not in relativistic spacetimes. Thus general relativity is less susceptible to underdetermination than its Newtonian predecessor. Durr and Ben-Menahem (2022) argue these assumptions are exploitable as loopholes, effectively establishing a (rich) no-go theorem. I disambiguate between two targets of the proof, which have previously been conflated: the existence claim of at least one alternative geometry to a given one and Reichenbach’s (in)famous "theorem theta", which amounts to a universality claim that any geometry can function as an alternative to any other. I show there is no (rich) no-go theorem to save theorem theta. I illustrate this by explicitly breaking one of the assumptions and generalising the proof to torsionful spacetimes. Finally, I suggest a programmatic attitude: rather than undermining the proof one can use it to systematically and rigorously articulate stronger propositions to be proved, thereby systematically exploring the space of alternative spacetime theories.
    Found 1 week, 4 days ago on PhilSci Archive
  26.
    Baroque questions of set-theoretic foundations are widely assumed to be irrelevant to physics. In this article, we challenge this assumption. We show that even such fundamental questions as whether a theory is deterministic — whether it fixes a unique future given the present — depend on set-theoretic axiom candidates over which there is philosophical disagreement.
    Found 1 week, 5 days ago on PhilSci Archive
  27.
    This is probably an old thing that has been discussed to death, but I only now noticed it. Suppose an open future view on which future contingents cannot have truth value. What happens to entailments? …
    Found 1 week, 6 days ago on Alexander Pruss's Blog
  28.
    While correct as far as it goes, this standard picture can encourage an overly sharp distinction between scientific activities and ethical deliberation. Far from entering only at the policy-making stage, ethical judgments often shape scientific research itself. This is most obvious in the choice of research questions. The choice of what to study ultimately affects what knowledge can be brought to bear in real-world decisions, including consequences for which (and whose) decisions can be made with the benefit of scientific insight.
    Found 1 week, 6 days ago on Wendy S. Parker's site
  29.
    The Pusey-Barrett-Rudolph (PBR) theorem proves that the joint wave function ψ1 ⊗ ψ2 of a composite quantum system is ψ-ontic, representing the system’s physical reality. We present a minimalist proof showing that this result, combined with the tensor product structure assigning ψ1 to subsystem 1 and ψ2 to subsystem 2, directly implies that ψ1 and ψ2 are ψ-ontic for their respective subsystems. This establishes ψ-ontology for single quantum systems without requiring preparation independence or other assumptions. Our proof challenges the widely held view that joint ψ-onticity permits subsystem ψ-epistemicity via correlations, providing a simpler, more direct understanding of the wave function’s ontological status in quantum mechanics.
    Found 2 weeks ago on PhilSci Archive
  30.
    I’ve been working on a math project involving the periodic table of elements and the Kepler problem—that is, the problem of a particle moving in an inverse square force law. I started it in 2021, but I just finished. …
    Found 2 weeks, 1 day ago on Azimuth