1. 2956012.150154
    Today I want to make a little digression into the quaternions. We won’t need this for anything later—it’s just for fun. But it’s quite beautiful. We saw in Part 8 that if we take the spin of the electron into account, we can think of bound states of the hydrogen atom as spinor-valued functions on the 3-sphere. …
    Found 1 month ago on Azimuth
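The unit quaternions form the 3-sphere mentioned in this excerpt because the Hamilton product preserves the quaternion norm. A minimal numerical sketch of that fact (not taken from the post itself):

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# two unit quaternions, i.e. two points on the 3-sphere
p = np.array([0.5, 0.5, 0.5, 0.5])
q = np.array([np.cos(0.3), np.sin(0.3), 0.0, 0.0])

r = quat_mul(p, q)
print(np.linalg.norm(r))  # ≈ 1.0: the product stays on the 3-sphere
```

Since |pq| = |p||q|, the unit quaternions are closed under multiplication, which is what makes the 3-sphere a group.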
  2. 3011310.150236
    Learning not to be so embarrassed by my ignorance and failures. Reminder: everyone is welcome here, but paid subscriptions are what enable me to devote the necessary time to researching and writing this newsletter, including pieces like this one on Katie Johnson, the woman who alleged Trump sexually assaulted her at the age of thirteen at a party of Jeffrey Epstein’s. …
    Found 1 month ago on More to Hate
  3. 3136354.150246
    The famous Catholic pilgrimage site at Lourdes, France, until fairly recently displayed hundreds of discarded crutches as testament to miraculous cures. It has, though, never displayed a wooden leg. Hence the Wooden Leg Problem (WLP) for believers in miracles: if God can cure paralysis, why does He seem never to have given an amputee back their lost limb? The WLP is a severe challenge for believers in miracles and must be confronted head-on. Yet there does not appear to be any systematic analysis of the problem, at least as formulated here, in the literature on miracles or philosophy of religion generally. I discuss ten possible solutions to the WLP on behalf of the believer in miracles. Although some are stronger than others, all but the final one seem too weak to solve the problem. It is the final one – the ‘how do you know?’ solution – that I endorse and examine in some depth. This solution, I argue, shows that the WLP does not move the epistemological dial when it comes to belief or disbelief in miracles.
    Found 1 month ago on David S. Oderberg's site
  4. 3140999.15026
    Subsumption theodicies aim to subsume apparent cases of natural evil under the category of moral evil, claiming that apparently natural evils result from the actions or omissions of free creatures. Subsumption theodicies include Fall theodicies, according to which nature was corrupted by the sins of the first humans (Aquinas 1993, Dembski 2009), demonic-action theodicies, according to which apparently natural evils are caused by the actions of fallen angels (Lewis 1944, Plantinga 1974, Johnston 2023), and simulation theodicies, according to which our universe is a computer simulation, with its apparent natural evils caused by the free actions of simulators in the next universe up (Dainton 2020, Crummett 2021).
    Found 1 month ago on Brian Cutter's site
  5. 3142719.15027
    This paper introduces the physics and philosophy of strange metals, which are characterized by unusual electrical and thermal properties that deviate from conventional metallic behaviour. The anomalous strange-metal behaviour discussed here appears in the normal state of a copper-oxide high-temperature superconductor, and it cannot be described using standard condensed-matter physics. Currently, it can only be described through a holographic dual, viz. a four-dimensional black hole in anti-de Sitter spacetime. We dedicate this paper to the memory of Umut Gürsoy, who tragically passed away on 24 April 2025, during the last stages of the paper’s completion.
    Found 1 month ago on PhilSci Archive
  6. 3142743.150278
    We employ a pragmatist model of inquiry to explain how measurement in physics can solve the problem of usefulness. In spite of the fact that a variety of resources, including theory, simulation, heuristics, rules of thumb, and practical considerations contribute to the context of a specific measurement inquiry, the measurement inquiry process partially decontextualizes its results, making them useful for other inquiries. This measurement inquiry process involves a process of transformation of data we call “entheorization,” which happens in conjunction with the evaluation of uncertainty of measurement results. These uncertainty estimates then serve to define the sensitivity of the result to the aims of subsequent inquiries. On this approach, the epistemology of measurement requires treating measurement procedure, uncertainty estimation, and sensitivity to targets of inquiry as equally fundamental to understanding how measurement yields knowledge. To help understand how the abstract elements of our epistemological model of experimental inquiries are applicable to concrete episodes of measurement, we use the example of the W-boson mass measurement at the Large Hadron Collider to illustrate our arguments.
    Found 1 month ago on PhilSci Archive
  7. 3142765.150284
    This article uses especially Sartre’s existential philosophy (also drawing from Scheler, Husserl, and Descartes) to investigate pathogenetic issues in psychopathology from a first-person perspective. Psychosis is a “total experience” that points to orientating changes in subjectivity, supported by evidence regarding self-disorders in the schizophrenia spectrum. This article proposes that schizophrenia is essentially characterized (and distinguished) by specific structural alterations of (inter)subjectivity around the relationship between self and Other, which all its seemingly disparate signs and symptoms eventually point to. Two reciprocal distortions are present in psychotic schizophrenia patients: (A) an encroaching and substantialized Other, and (B) a self transformed into being-for-the-Other. Under the altered conditions of (A & B), delusional mood is the presence but inaccessibility of the Other; a delusional perception is an eruption or surfacing of objectification of self by Other; a delusion is an experience of the Other, which fulfills certainty, incorrigibility, and potentially falsehood.
    Found 1 month ago on PhilSci Archive
  8. 3146492.150291
    [This continues an earlier essay, Book Review: Paradise Lost by John Milton.] 1. Adam asks Raphael, and you would too, admit it, about the sex lives of angels:
    Love not the heavenly spirits, and how their love
    Express they, by looks only, or do they mix
    Irradiance, virtual or immediate touch? …
    Found 1 month ago on Mostly Aesthetics
  9. 3146493.150297
    PEA Soup is pleased to announce the forthcoming discussion from Free & Equal, on Elise Sugarman’s “Supposed Corpses and Correspondence” with a précis from Gabriel Mendlow. The discussion will take place from August 6th to 8th. …
    Found 1 month ago on PEA Soup
  10. 3201990.150306
    Sorry for the long blog-hiatus! I was completely occupied for weeks, teaching an intensive course on theoretical computer science to 11-year-olds (!), at a math camp in St. Louis that was also attended by my 8-year-old son. …
    Found 1 month ago on Scott Aaronson's blog
  11. 3211993.150312
    Now comes the really new stuff. I want to explain how the hydrogen atom is in a certain sense equivalent to a massless spin-½ particle in the ‘Einstein universe’. This is the universe Einstein believed in before Hubble said the universe was expanding! …
    Found 1 month ago on Azimuth
  12. 3231700.150318
    For predictive processing, perception is tied to the upshot of probabilistic inference, which makes perception internal, affording only indirect access to the world external to the perceiver. The metaphysical implications of predictive processing however remain unresolved, which is a significant gap given the major influence of this framework across philosophy and other fields of research. Here, I present what I believe is a consistent metaphysical package of commitments for predictive processing. My starting point is a suitable challenge to predictive processing presented by Tobias Schlicht, who argues that the framework is committed to Kantian transcendental idealism, and marshals several lines of argument that this commitment undermines predictive processing’s aspirations to completeness, realism, and naturalism. I first trace Hermann von Helmholtz’s nuanced reaction to Kant, which sets out the preconditions for perception in a manner prescient of the notion of self-evidencing central to contemporary predictive processing. This position enables a fundamental structural realism, rather than idealism, which blocks Schlicht’s line of argument, allowing plausible versions of completeness, realism and naturalism. Schlicht’s challenge is nevertheless valuable because addressing it, in the specific context of Helmholtz’s response to Kant, helps bring to light the compelling structural realism at the heart of self-evidencing.
    Found 1 month ago on Jakob Hohwy's site
  13. 3256725.150325
    …philosophical logic may also interest themselves in the logical appendices, one of which presents modal logic as a subsystem of the logic of counterfactuals. Last but not least, the work also includes an afterword that is both a severe reprimand to the analytic community for a certain sloppiness and an exhortation to all colleagues to apply more rigor and patience in addressing metaphysical issues. People familiar with Williamson’s work will not be surprised by the careful and detailed (sometimes a bit technical) argumentation, which demands careful attention from the reader. As expected, this is a most relevant contribution to an increasingly popular topic by one of today’s leading analytic philosophers.
    Found 1 month ago on Clas Weber's site
  14. 3258117.150331
    Cognitive neuroscientists typically posit representations which relate to various aspects of the world, which philosophers call representational content. Anti-realists about representational content argue that contents play no role in neuroscientific explanations of cognitive capacities. In this paper, I defend realism against an anti-realist argument due to Frances Egan, who argues that for content to be explanatory it must be both essential and naturalistic. I introduce a case study from cognitive neuroscience in which content is both essential and naturalistic, meeting Egan’s challenge. I then spell out some general principles for identifying studies in which content plays an explanatory role.
    Found 1 month ago on PhilSci Archive
  15. 3258163.150338
    Representations appear to play a central role in cognitive science. Capacities such as face recognition are thought to be enabled by internal states or structures representing external items. However, despite the ubiquity of representational terminology in cognitive science, there is no explicit scientific theory outlining what makes an internal state a representation of an external item. Nonetheless, many philosophers hope to uncover an implicit theory in the scientific literature. This is the project of the current thesis. However, all such projects face an obstacle in the form of Frances Egan’s argument that content plays no role in scientific theorising. I respond that, in some limited regions of cognitive science, content is crucial for explanation. The unifying idea is that closer attention to the application of information theory in those regions of cognitive neuroscience enables us to uncover an implicit theory of content. I examine the conditions which must be met for the cognitive system to be modelled using information theory, presenting some constraints on how we apply the mathematical framework. For example, information theory requires identifying probability distributions over measurable outcomes, which leads us to focus specifically on neural representation. I then argue that functions are required to make tractable measures of information, since they serve to narrow the range of possible contents to those potentially explanatory of a cognitive capacity. However, unlike many other teleosemanticists, I argue that we need to use a non-etiological form of function. I consider whether non-etiological functions allow for misrepresentation, and conclude that they do. Finally, I introduce what I argue is the implicit theory of content in cognitive neuroscience: maxMI. The content of a representation is that item in the environment with which the representation shares maximal mutual information.
    Found 1 month ago on PhilSci Archive
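The maxMI proposal summarized above can be sketched numerically: estimate the mutual information between a neural response and each candidate environmental variable, and take the maximizer as the content. The scenario (`face_present`, `lighting`) and the plug-in MI estimator are illustrative assumptions, not the thesis's own example:

```python
import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

rng = np.random.default_rng(0)
face_present = rng.integers(0, 2, 1000)  # hypothetical environmental variable A
lighting = rng.integers(0, 2, 1000)      # hypothetical environmental variable B

# a "neural response" that tracks face_present, with 10% noise
flip = rng.random(1000) < 0.1
response = np.where(flip, 1 - face_present, face_present)

# maxMI: content is the candidate sharing maximal mutual information
candidates = {"face_present": face_present, "lighting": lighting}
content = max(candidates, key=lambda k: mutual_information(response, candidates[k]))
print(content)
```

With enough samples the noisy-tracking variable dominates the independent one, so the argmax picks `face_present`.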
  16. 3292098.150344
    The term algorithmic fairness is used to assess whether machine learning algorithms operate fairly. To get a sense of when algorithmic fairness is at issue, imagine a data scientist is provided with data about past instances of some phenomenon: successful employees, inmates who when released from prison go on to reoffend, loan recipients who repay their loans, people who click on an advertisement, etc., and is tasked with developing an algorithm that will predict other instances of these phenomena. While an algorithm can be successful or unsuccessful at its task to varying degrees, it is unclear what makes such an algorithm fair or unfair.
    Found 1 month, 1 week ago on Stanford Encyclopedia of Philosophy
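One common formalization of the question raised above is demographic parity (not named in the excerpt, but standard in the algorithmic-fairness literature): positive-prediction rates should not differ across demographic groups. A minimal sketch:

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between two groups."""
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    rate_a = y_pred[group == 0].mean()
    rate_b = y_pred[group == 1].mean()
    return abs(rate_a - rate_b)

# toy loan-approval predictions for two demographic groups
y_pred = [1, 1, 0, 1, 0, 1, 0, 0, 0, 0]
group  = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
gap = demographic_parity_gap(y_pred, group)
print(gap)  # 0.4: group 0 is approved 60% of the time, group 1 only 20%
```

A gap of zero is only one candidate criterion; much of the debate concerns which such criteria, if any, capture fairness.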
  17. 3301963.15035
    …would be like if the theory were true. This concerns (i) what possibilities it represents, (ii) the internal structure of those possibilities and their interrelations, and, to some extent, (iii) how those possibilities differ from what’s come before. By providing an interpretive foil that one can amplify or amend, it aspires to shape the research agenda in the foundations of general relativity for established philosophers of physics, graduate students searching for work in these topics, and other interested academics. This title is also available as Open Access on Cambridge Core.
    Found 1 month, 1 week ago on Samuel C. Fletcher's site
  18. 3315849.150359
    Hilbert-space techniques are widely used not only for quantum theory, but also for classical physics. Two important examples are the Koopman-von Neumann (KvN) formulation and the method of “classical” wave functions. As this paper explains, these two approaches are conceptually distinct. In particular, the method of classical wave functions was not due to Bernard Koopman and John von Neumann, but was developed independently by a number of later researchers, perhaps first by Mario Schönberg, with key contributions from Angelo Loinger, Giacomo Della Riccia, Norbert Wiener, and E. C. George Sudarshan. The primary goals of this paper are to explain these two approaches, describe the relevant history in detail, and give credit where credit is due.
    Found 1 month, 1 week ago on PhilSci Archive
  19. 3315871.150366
    Why does quantum theory need the complex numbers? With a view toward answering this question, this paper argues that the usual Hilbert-space formalism is a special case of the general method of Markovian embeddings. This paper then describes the ‘indivisible interpretation’ of quantum theory, according to which a quantum system can be regarded as an ‘indivisible’ stochastic process unfolding in an old-fashioned configuration space, with wave functions and other exotic Hilbert-space ingredients demoted from having an ontological status. The complex numbers end up being necessary to ensure that the Hilbert-space formalism is indeed a Markovian embedding.
    Found 1 month, 1 week ago on PhilSci Archive
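One generic piece of the Hilbert-space/stochastic-process connection can be illustrated directly: via the Born rule, any unitary induces a column-stochastic transition matrix with entries |U_ji|². This sketch shows only that standard fact, not the paper's specific Markovian-embedding argument:

```python
import numpy as np

# a complex unitary on a two-point configuration space
U = np.array([[1, 1],
              [1j, -1j]]) / np.sqrt(2)
assert np.allclose(U @ U.conj().T, np.eye(2))  # unitarity

# Born rule: transition probabilities p(j|i) = |U_ji|^2
P = np.abs(U) ** 2

# each column (the transitions out of one configuration) sums to 1,
# so the unitary dynamics induces a legitimate stochastic process
print(P.sum(axis=0))  # [1. 1.]
```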
  20. 3315893.150374
    Scientific realism is the philosophical stance that science tracks truth, in particular in its depiction of the world’s ontology. Ontologically, this involves a commitment to the existence of entities posited by our best scientific theories; metaontologically, it includes the claim that the theoretical framework itself is true. In this article, we examine wave function realism as a case study within this broader methodological debate. Wave function realism holds that the wave function, as described by quantum mechanics, corresponds to a real physical entity. We focus on a recent formulation of this view that commits to the ontology of the wave function while deliberately avoiding the metaontological question of the framework’s truth. Instead, the view is defended on pragmatic, non-truth-conducive grounds. This, we argue, raises tensions for the purported realism of wave function realism and its compatibility with scientific realism more broadly.
    Found 1 month, 1 week ago on PhilSci Archive
  21. 3315929.15038
    There is a tendency in the philosophy of science to present the scientist as a ghostly being that just has degrees of belief in various descriptive statements, which are adjusted according to some rules of rational thinking (e.g. Bayes’ theorem) … We need a more serious understanding of scientists as agents, not as passive receivers of information or algorithmic processors of propositions …
    Found 1 month, 1 week ago on PhilSci Archive
  22. 3330885.150387
    Did you know that Lawvere did classified work on arms control in the 1960s, back when he was writing his thesis? Did you know that the French government offered him a job in military intelligence? The following paper should be interesting to applied category theorists—for a couple of different reasons:
    • Bill Lawvere, The category of probabilistic mappings with applications to stochastic processes, statistics, and pattern recognition, Spring 1962, featuring Lawvere’s abstract and author commentary from 2020, reformatted for Lawvere Archive Posthumous Publications by Tobias Fritz, July 14, 2025. …
    Found 1 month, 1 week ago on Azimuth
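On finite sets, probabilistic mappings in Lawvere's sense can be represented as stochastic matrices (each row a probability distribution over outputs), with composition given by ordinary matrix multiplication. A minimal sketch of that closure property:

```python
import numpy as np

# a probabilistic mapping from a 2-element set to a 3-element set:
# each row is a probability distribution over outputs
f = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3]])

# a probabilistic mapping from the 3-element set to a 2-element set
g = np.array([[0.9, 0.1],
              [0.4, 0.6],
              [0.2, 0.8]])

# composition in the category is matrix multiplication,
# and the composite is again a probabilistic mapping
h = f @ g
print(h.sum(axis=1))  # rows still sum to 1
```

Associativity of matrix multiplication and the identity matrix as the identity map are what make this a category.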
  23. 3330886.150403
    As you may recall, Matthew Adelstein uses r-K selection theory to argue that the average bug’s life is not worth living. Quick version: Humans have a few offspring, who typically receive immense parental investment. …
    Found 1 month, 1 week ago on Bet On It
  24. 3351009.15041
    Picture a playground on a sunny day, bustling with excited children. One falls and scratches her knee. Cries of distress draw the concern of a new friend. A few breaths later, she’s back on her feet with a big grin, ready for the next adventure. …
    Found 1 month, 1 week ago on Good Thoughts
  25. 3368106.150416
    Social authorities claim that we are obliged to obey their commands and they also claim the right to enforce them should we refuse. Many liberals (amongst others) insist that these claims hold water only when those subject to such an authority have agreed to obey it. Thus, according to classical liberals, people are subject to the authority of the state only if they have (in some sense) consented to its rule. Grounds for scepticism about a consent-based theory of political authority are no less familiar. Though ‘consent’ can mean different things, it is often observed that there is no form of consent which could both (a) validate political authority and (b) plausibly be attributed to most of the population of either past or present states.
    Found 1 month, 1 week ago on David Owens's site
  26. 3370024.150422
    PEA Soup is pleased to introduce the July Ethics article discussion on “Gender, Gender Expression, and the Dilemma of the Body” by Katie Zhou (MIT). The précis is from Cressida Heyes (University of Alberta). …
    Found 1 month, 1 week ago on PEA Soup
  27. 3407867.15043
    Aristotle had a famous argument that time had no beginning or end. In the case of beginnings, this argument caused immense philosophical suffering in the Middle Ages, since combined with the idea that time requires change it implies that the universe was eternal, contrary to the Jewish, Muslim and Christian doctrine that God created the universe a finite amount of time ago. …
    Found 1 month, 1 week ago on Alexander Pruss's Blog
  28. 3423815.150437
    There are two parts of Aristotle’s theory that are hard to fit together. First, we have Aristotle’s view of future contingents, on which:
    - It is neither true nor false that tomorrow there will be a sea battle
    but, of course:
    - It is true that tomorrow there will be a sea battle or no sea battle. …
    Found 1 month, 1 week ago on Alexander Pruss's Blog
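The tension can be made concrete in three-valued logic: if the sea-battle sentence is undetermined, the strong Kleene disjunction of it with its negation is also undetermined, while a supervaluational treatment (quantifying over classical sharpenings) makes the disjunction true, as Aristotle wants. A minimal sketch, with 'U' for undetermined:

```python
T, F, U = 'T', 'F', 'U'

def k_not(a):
    return {T: F, F: T, U: U}[a]

def k_or(a, b):
    # strong Kleene disjunction: true if either disjunct is true,
    # false only if both are false, undetermined otherwise
    if T in (a, b):
        return T
    if (a, b) == (F, F):
        return F
    return U

sea_battle = U
print(k_or(sea_battle, k_not(sea_battle)))  # 'U': Kleene leaves the disjunction open

def supervaluate(formula):
    # super-true iff true on every classical sharpening of the open atom
    vals = {formula(v) for v in (T, F)}
    return T if vals == {T} else F if vals == {F} else U

print(supervaluate(lambda v: k_or(v, k_not(v))))  # 'T': excluded middle is super-true
```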
  29. 3429346.150444
    The paper proposes and studies new classical, type-free theories of truth and determinateness with unprecedented features. The theories are fully compositional, strongly classical (namely, their internal and external logics are both classical), and feature a defined determinateness predicate satisfying desirable and widely agreed principles. The theories capture a conception of truth and determinateness according to which the generalizing power associated with the classicality and full compositionality of truth is combined with the identification of a natural class of sentences – the determinate ones – for which clear-cut semantic rules are available. Our theories can also be seen as the classical closures of Kripke-Feferman truth: their ω-models, which we precisely pin down, result from including in the extension of the truth predicate the sentences that are satisfied by a Kripkean closed-off fixed point model. We compare our theories to recent theories proposed by Fujimoto and Halbach, which feature a primitive determinateness predicate. In the paper we show that our theories entail all principles of Fujimoto and Halbach’s theories, and are proof-theoretically equivalent to Fujimoto and Halbach’s CD. We also establish some negative results on Fujimoto and Halbach’s theories: such results show that, unlike what happens in our theories, the primitive determinateness predicate prevents one from establishing clear and unrestricted semantic rules for the language with type-free truth.
    Found 1 month, 1 week ago on Carlo Nicolai's site
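The Kripkean fixed-point construction mentioned above can be illustrated on a toy three-sentence language: iterate the jump operator from an empty extension and anti-extension until nothing changes. This shows only a minimal strong-Kleene-style fixed point with simplified evaluation clauses, not the paper's classical closures:

```python
# a toy language: an ordinary truth ('atom'), a sentence ascribing truth
# to it ('T'), and a liar sentence denying its own truth ('notT')
sentences = {
    'A': ('atom', True),   # e.g. "0 = 0"
    'B': ('T', 'A'),       # "'A' is true"
    'L': ('notT', 'L'),    # "'L' is not true" (the liar)
}

def jump(ext, anti):
    """One application of the Kripke jump to (extension, anti-extension)."""
    new_ext, new_anti = set(ext), set(anti)
    for name, (kind, arg) in sentences.items():
        if kind == 'atom':
            (new_ext if arg else new_anti).add(name)
        elif kind == 'T':
            if arg in ext: new_ext.add(name)
            if arg in anti: new_anti.add(name)
        elif kind == 'notT':
            if arg in anti: new_ext.add(name)
            if arg in ext: new_anti.add(name)
    return new_ext, new_anti

ext, anti = set(), set()
while (jumped := jump(ext, anti)) != (ext, anti):
    ext, anti = jumped

print(sorted(ext), sorted(anti))  # ['A', 'B'] []: the liar lands in neither set
```

Closing off the fixed point (throwing the undetermined sentences into the anti-extension) is the move that yields the ω-models the abstract describes.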
  30. 3431238.15045
    In contemporary philosophy of physics, there has recently been a renewed interest in the theory of geometric objects—a programme developed originally by geometers such as Schouten, Veblen, and others in the 1920s and 30s. However, as yet, there has been little-to-no systematic investigation into the history of the geometric object concept. I discuss the early development of the geometric object concept, and show that geometers working on the programme in the 1920s and early 1930s had a more expansive conception of geometric objects than that which is found in later presentations—which, unlike the modern conception of geometric objects, included embedded submanifolds such as points, curves, and hypersurfaces. I reconstruct and critically evaluate their arguments for this more expansive geometric object concept, and also locate and assess the transition to the more restrictive modern geometric object concept.
    Found 1 month, 1 week ago on PhilSci Archive