-
32756.197777
We evaluate the roles that general relativistic assumptions play in simulations used in recent black-hole observations, including those of LIGO-Virgo and the Event Horizon Telescope. In both experiments simulations play an ampliative role, enabling the extraction of more information from the data than would otherwise be possible. This comes at the cost of theory-ladenness. We discuss the issue of inferential circularity, which arises in some applications; classify some of the epistemic strategies used to reduce the extent of theory-ladenness; and discuss ways in which these strategies are model independent.
-
32780.197847
The extraterrestrial hypothesis (ETH), the hypothesis that an extraterrestrial civilization (ETC) is active on Earth today, is taboo in academia, but the assumptions behind this taboo are faulty. Advances in biology have rendered the notion that complex life is rare in our Galaxy improbable. The objection that no ETC would come to Earth to hide from us does not consider all possible alien motives or means. For an advanced ETC, the convergent instrumental goals of all rational agents – self-preservation and the acquisition of resources – would support the objectives of removing existential threats and gathering strategic and non-strategic information.
-
63001.197855
Concept formation has recently become a widely discussed topic in philosophy under the headings of “conceptual engineering”, “conceptual ethics”, and “ameliorative analysis”. Much of this work has been inspired either by the method of explication or by ameliorative projects. In the former case, concept formation is usually seen as a tool of the sciences, of formal disciplines, and of philosophy. In the latter case, concept formation is seen as a tool in the service of social progress. While recent philosophical discussions of concept formation have addressed natural sciences such as physics as well as various life sciences, there has so far been little direct engagement with the social sciences. Addressing this shortcoming is important because many debates about socially relevant concepts such as power, gender, democracy, risk, justice, or rationality may best be understood as exercises in conceptual engineering. This topical collection addresses the nature and structure of concept formation in the natural, the life, and the social sciences alike, both as a process taking place within science and as an activity that aims at a broader impact in society. This helps us to understand how concept formation proceeds not only in the natural sciences but also in disciplines such as psychology, cognitive science, political science, sociology, and economics.
-
63018.197861
Did you ever submit a grant proposal to a funding agency? Then, you have likely encountered the request to specify your research method. Anecdotal evidence suggests that philosophers often address this unpopular request by mentioning reflective equilibrium (RE), the method proposed by Goodman (1983 [1954]) and baptized by John Rawls in his “A Theory of Justice” (1971). Appeal to RE has indeed become a standard move in ethics (see, e.g., Daniels, 1996; Swanton, 1992; van der Burg & van Willigenburg, 1998; DePaul, 2011; Mikhail, 2011; Beauchamp & Childress, ). The method has also been referred to in many other branches of philosophy, e.g., in methodological discussions about logic (e.g., Goodman, 1983; Resnik, 1985, , 1997; Brun, 2014; Peregrin & Svoboda, 2017) and theories of rationality (e.g., Cohen, 1981; Stein, 1996). Some philosophers have gone as far as to argue that RE is unavoidable in ethics (Scanlon, 2003) or simply the philosophical method (Lewis, , p. x; Keefe, 2000, ch. 2). The popularity of RE indicates that its key idea resonates well with the inclinations of many philosophers: You start with your initial views or commitments on a theme and try to systematize them in terms of a theory or a few principles. Discrepancies between theory and commitments trigger a characteristic back and forth between the commitments and the theories, in which commitments and theories are adjusted to each other until an equilibrium state is reached.
-
119859.197867
Nietzsche’s first book was entitled The Birth of Tragedy out of the Spirit of Music (1872), and one of his very last works was called The Case of Wagner: A Musician’s Problem (1888). As this simple fact indicates, reflection on art (and especially, on music and drama) is an abiding and central feature of Nietzsche’s thought. Indeed, very nearly all of his works address aesthetic questions at least in passing. Some of these questions are familiar from the philosophical tradition: e.g., how should we explain the effect tragedy has on us? What is the relation of aesthetic value to other kinds of value?
-
126067.197872
Benacerraf famously argued that no set theoretic reduction can capture the natural numbers. While one might conclude from this that the natural numbers are some kind of sui generis entities, Benacerraf instead opts for a structuralist view on which different things can play the role of different numbers. …
-
136044.197878
Recently, Dardashti et al. (Stud Hist Philos Sci Part B Stud Hist Philos Mod Phys 67:1–11, 2019) proposed a Bayesian model for establishing Hawking radiation by analogical inference. In this paper we investigate whether their model would work as a general model for analogical inference. We study how it performs when varying the believed degree of similarity between the source and the target system. We show that there are circumstances in which the degree of confirmation for the hypothesis about the target system obtained by collecting evidence from the source system goes down when increasing the believed degree of similarity between the two systems. We then develop an alternative model in which the direction of the variation of the degree of confirmation always coincides with the direction of the believed degree of similarity. Finally, we argue that the two models capture different types of analogical inference.
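To make the kind of dependence at issue concrete, here is a hedged toy sketch (my own illustration, not Dardashti et al.'s model; the parameter names and values are invented): source-system evidence E bears on a target-system hypothesis H only insofar as the two systems are believed, with probability s, to share the relevant mechanism.

```python
# Toy Bayesian sketch of analogical inference (illustrative only).
# Assumption: with probability s the source shares the mechanism responsible
# for H, so E bears on H; with probability 1 - s it does not, so E is
# uninformative about H.

def posterior_H_given_E(prior_H, p_E_if_shared_and_H, p_E_if_shared_and_notH,
                        p_E_if_not_shared, s):
    # Likelihoods marginalised over whether the mechanism is shared.
    p_E_given_H = s * p_E_if_shared_and_H + (1 - s) * p_E_if_not_shared
    p_E_given_notH = s * p_E_if_shared_and_notH + (1 - s) * p_E_if_not_shared
    p_E = prior_H * p_E_given_H + (1 - prior_H) * p_E_given_notH
    return prior_H * p_E_given_H / p_E

if __name__ == "__main__":
    for s in (0.2, 0.5, 0.8):
        post = posterior_H_given_E(0.5, 0.9, 0.1, 0.5, s)
        print(f"believed similarity {s:.1f} -> P(H|E) = {post:.3f}")
```

In this simple parameterization the posterior rises with s; the abstract's point is that, in the authors' model, there are circumstances in which confirmation instead falls as the believed degree of similarity increases.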
-
139429.197894
1. Strong and weak notions of erasure are distinguished according to whether the single erasure procedure does or does not leave the environment in the same state independently of the pre-erasure state. 2. Purely thermodynamic considerations show that strong erasure cannot be dissipationless. 3. The main source of entropy creation in erasure processes at molecular scales is the entropy that must be created to suppress thermal fluctuations (“noise”). 4. A phase space analysis recovers no minimum entropy cost for weak erasure and a positive minimum entropy cost for strong erasure. 5. An information entropy term has been attributed mistakenly to pre-erasure states in the Gibbs formalism through the neglect of an additive constant in the “–k sum p log p” Gibbs entropy formula.
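For reference, a minimal rendering of the formula mentioned in point 5 (notation mine, not quoted from the paper): the Gibbs entropy is fixed only up to an additive constant, and on this account it is the neglect of that constant that misattributes an information-entropy term to pre-erasure states.

```latex
% Gibbs entropy, defined only up to an additive constant C:
S_{\mathrm{Gibbs}} = -k \sum_{i} p_i \ln p_i + C
```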
-
148048.1979
Hadfield-Menell et al. (2017) propose the Off-Switch Game, a model of Human-AI cooperation in which AI agents always defer to humans because they are uncertain about our preferences. I explain two reasons why AI agents might not defer. First, AI agents might not value learning. Second, even if AI agents value learning, they might not be certain to learn our actual preferences.
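As a hedged sketch of the incentive structure at issue (my own toy rendering, not Hadfield-Menell et al.'s formal game; names and parameters are illustrative): a robot uncertain about the human's utility u for its proposed action compares acting outright, switching itself off, and deferring to a rational human who allows the action only when u > 0.

```python
# Toy off-switch-style comparison: deferring to a rational human yields
# E[max(u, 0)], which weakly dominates both acting (E[u]) and switching off (0).

import random

def expected_values(u_samples):
    act = sum(u_samples) / len(u_samples)                          # act without asking
    off = 0.0                                                      # switch itself off
    defer = sum(max(u, 0.0) for u in u_samples) / len(u_samples)   # let the human decide
    return act, off, defer

if __name__ == "__main__":
    random.seed(0)
    u_samples = [random.gauss(0.1, 1.0) for _ in range(100_000)]   # robot's uncertainty over u
    act, off, defer = expected_values(u_samples)
    print(f"act={act:.3f}  off={off:.3f}  defer={defer:.3f}")
```

The sketch only exhibits the baseline deference result; the essay's two objections concern the conditions under which that baseline fails to hold.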
-
148067.197906
The changes that quantum states undergo during measurement are both probabilistic and nonlocal. These two characteristics complement one another to ensure compatibility with relativity and maintain conservation laws. Nonlocal entanglement relations provide a means to enforce conservation laws in a probabilistic theory, while the probabilistic nature of nonlocal effects prevents the superluminal transmission of information. In order to explain these measurement-induced changes in terms of fundamental physical processes it is necessary to take these two key characteristics into account. One way to do this is to modify the Schrödinger equation by adding stochastic, nonlinear terms. A number of such proposals have been made over the past few decades. A recently proposed equation based on the assumption that wave function collapse is induced by a sequence of correlating interactions of the kind that constitute measurements has been shown to maintain strict adherence to conservation laws in individual instances, and has also eliminated the need to introduce any new, ad hoc physical constants. In this work it is shown that the stochastic modification to the Schrödinger equation is Lorentz invariant. It is further argued that the additional spacetime structure that it requires provides a way to implement the assumption that spacelike-separated operators (and measurements) commute, and that this assumption of local commutativity should be regarded as a third postulate of relativity.
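For orientation, here is the generic shape of such stochastic, nonlinear modifications (a schematic textbook form, not the specific equation proposed in the paper): the state is driven by the Hamiltonian plus noise terms built from collapse operators A_k and Wiener increments dW_k.

```latex
% Schematic norm-preserving stochastic Schrödinger equation (Itô form);
% A_k assumed self-adjoint, coupling constants absorbed into the A_k.
d|\psi\rangle = \Big[ -\tfrac{i}{\hbar} H \, dt
  + \sum_k \big( A_k - \langle A_k \rangle \big)\, dW_k
  - \tfrac{1}{2} \sum_k \big( A_k - \langle A_k \rangle \big)^2 dt \Big] |\psi\rangle
```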
-
148086.197912
This essay is a two-step reflection on the question ‘Which events (can be said to) occur in quantum phenomena?’ The first step regiments the ontological category of statistical phenomena and studies the adequacy of probabilistic event models as descriptions thereof. Guided by the conviction that quantum phenomena are to be circumscribed within this same ontological category, the second step highlights the peculiarities of probabilistic event models of some non-relativistic quantum phenomena, and thereby of what appear to be some plausible answers to our initial question. The reflection ends in an aporetic state, as is by now usual in encounters between ontology and the quantum.
-
189115.197919
The inference pattern known as disjunctive syllogism (DS) appears as a derived rule in Gentzen’s natural deduction calculi NI and NK. This is a paradoxical feature of Gentzen’s calculi in so far as DS is sometimes thought of as appearing intuitively more elementary than the rules ∨E, ¬E, and EFQ that figure in its derivation. For this reason, many contemporary presentations of natural deduction depart from Gentzen and include DS as a primitive rule. However, such departures violate the spirit of natural deduction, according to which primitive rules are meant to relationally define logical connectives via universal properties (§2). This situation raises the question: Can disjunction be relationally defined with DS instead of with Gentzen’s ∨I and ∨E rules? We answer this question in the affirmative and explore the duality between Gentzen’s definition and our own (§3). We argue further that the two universal characterizations, rather than provide competing relational definitions of a single disjunction operator, disambiguate natural language’s “or” (§4). Finally, this disambiguation is shown to correspond exactly with the additive and multiplicative disjunctions of linear logic (§5). The hope is that this analysis sheds new light on the latter connective, so often deemed mysterious in writing about linear logic.
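For concreteness, here is the derivation the abstract alludes to, rendered as a hedged sketch in Lean 4 (theorem name mine): disjunctive syllogism obtained from ∨-elimination together with ¬-elimination/EFQ (packaged in Lean as `absurd`).

```lean
-- Disjunctive syllogism as a derived rule: from p ∨ q and ¬p, infer q.
theorem disjunctive_syllogism {p q : Prop} (hpq : p ∨ q) (hnp : ¬p) : q :=
  Or.elim hpq
    (fun hp => absurd hp hnp)  -- left disjunct: ¬-elimination then EFQ yields q
    (fun hq => hq)             -- right disjunct: q is immediate
```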
-
203796.197924
Neil Mehta has written a fantastic book. A Pluralist Theory of Perception develops a novel theory of perception that illuminates the metaphysical structure, epistemic significance, and semantic role of perceptual consciousness. By and large, I found the core tenets of Mehta’s theory to be highly plausible and successfully defended. I could quibble with some parts (e.g., his claim that our conscious awareness of sensory qualities is non-representational). But I suspect our disagreements are largely verbal, and where they are non-verbal, they are minor. Instead of focusing on disagreements, in this commentary I wish to explore the metaphysical ramifications of Mehta’s theory with respect to the mind-body problem. Mehta has a great deal to say about the metaphysics of perception. Much of it seems to me to be in tension with physicalism. But throughout the book he remains officially neutral on the truth of physicalism, “in reflection of [his] genuine uncertainty” (ibid: 100). I will try to show that Mehta’s commitments lead almost inexorably to dualism (or, at least, away from physicalism) by giving three arguments against physicalism that centrally rely on premises to which Mehta is committed.
-
205747.197933
If the philosophy of mathematics wants to be rigorous, the concept of infinity must stop being equivocal (both potential and actual), as it currently is. The conception of infinity as actual is responsible for all the paradoxes that compromise the very foundation of mathematics; it is also the basis of Cantor's argument for the non-countability of R and for the existence of infinite cardinals of different magnitudes. Here we present a proof that all infinite sets (in a potential sense) are countable and that there are no infinite cardinals.
-
263416.19794
The philosophical literature on mathematical structuralism and its history has focused on the emergence of structuralism in the 19th century. Yet modern abstractionist accounts cannot provide a historical account of the abstraction process. This paper will examine the role of relations in the history of mathematics, focusing on three main epochs in which relational abstraction is most prominent: ancient Greece, the 17th century, and the 19th century, in order to provide a philosophical account of the abstraction of structures. Though these structures emerged in the 19th century with definitional axioms, the need for such axioms in the abstraction process comes about, as this paper will show, only after a series of relational abstractions without a suitable basis.
-
316992.197946
This paper argues for a unified account of semantic and pragmatic infelicity. It is argued that an utterance is infelicitous when it communicates an inconsistent set of propositions, given the context. In cases of semantic infelicity the relevant utterance expresses a set of inconsistent propositions, whereas pragmatic infelicity is a matter of the utterance conflicting with contextual expectations or assumptions. We spell out this view within the standard framework according to which a central aim of communication is to update a body of information shared among the participants. We show that this account explains different kinds of infelicity for both declarative and non-declarative utterances. Further, the account is seen to make correct predictions for a range of cases involving irony, joking, and related non-assertoric utterances.
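As a schematic rendering of the core claim (notation mine, not the authors'): where c is the context set and p_1, …, p_n are the propositions an utterance communicates in that context, infelicity arises when they cannot all be accommodated.

```latex
% Infelicity condition (schematic): no world in the context survives the update.
c \cap p_1 \cap \cdots \cap p_n = \varnothing
```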
-
317609.197952
My guess is that most of you have never read Friedrich Nietzsche’s Thus Spoke Zarathustra. While it offers very few actual arguments, it’s some of my all-time favorite poetry. To sell you on it, here is perhaps my favorite chapter, “The Preachers of Death.”
By the way, I know scholars disfavor this translation. …
-
321100.197957
In theory, replication experiments purport to independently validate claims from previous research or to provide some diagnostic evidence about their truth value. In practice, this value of replication experiments is often taken for granted. Our research shows that in replication experiments, practice often does not live up to theory. Most replication experiments involve confounding factors, and their results are not uniquely determined by the treatment of interest and hence are uninterpretable. These results can be driven by the true data generating mechanism, limitations of the original experimental design, discrepancies between the original and the replication experiment, distinct limitations of the replication experiment, or combinations of any of these factors. Here we introduce the notion of a minimum viable experiment to replicate, which defines experimental conditions that always yield interpretable replication results and is thus replication-ready. We believe that most reported experiments are not replication-ready, and that before striving to replicate a given result we need theoretical precision in, or systematic exploration of, the experimental space to discover empirical regularities.
-
321118.197963
The paper examines critically some recently published views by Ramsey on the contrast between ab initio and parametrized theories. I argue that, all things being equal, ab initio calculations are indeed regarded more highly in the physics and chemistry communities. A case study on density functional approaches in theoretical chemistry is presented in order to re-examine the question of ab initio and parametrized approaches in a contemporary context.
-
354442.197971
The optimism bias is a cognitive bias where individuals overestimate the likelihood of good outcomes and underestimate the likelihood of bad outcomes. Associated with improved quality of life, optimism bias is considered to be adaptive and is a promising avenue of research for mental health interventions in conditions where individuals lack optimism, such as major depressive disorder. Here we lay the groundwork for future research on optimism as an intervention by introducing a domain-general formal model of optimism bias, which can be applied in different task settings. Employing the active inference framework, we propose a model of the optimism bias as a high-precision likelihood biased towards positive outcomes. First, we simulate how optimism may be lost during development by exposure to negative events. We then ground our model in the empirical literature by showing how the developmentally acquired differences in optimism are expressed in a belief updating task typically used to assess optimism bias. Finally, we show how optimism affects action in a modified two-armed bandit task. Our model and the simulations it affords provide a computational basis for understanding how optimism bias may emerge, how it may be expressed in standard tasks used to assess optimism, and how it affects agents’ decision-making and actions; in combination, this provides a basis for future research on optimism as a mental health intervention.
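One way to make the key mechanism concrete is the following hedged sketch (my own simplification, not the authors' active-inference model; the likelihood and precision values are illustrative): treating "good news" as a high-precision observation and "bad news" as a low-precision one yields the asymmetric belief updating characteristic of the optimism bias.

```python
import numpy as np

# Likelihood P(observation | hidden state); rows: obs (pos, neg), columns: state (good, bad).
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])

def precision_weighted_update(prior, obs_idx, gamma):
    # Raise the likelihood to a precision exponent before applying Bayes' rule:
    # gamma > 1 amplifies the observation's influence, gamma < 1 attenuates it.
    weighted = A[obs_idx, :] ** gamma
    post = weighted * prior
    return post / post.sum()

if __name__ == "__main__":
    gamma = {"pos": 2.0, "neg": 0.3}   # optimistic agent: trusts good news, discounts bad news
    belief = np.array([0.5, 0.5])      # prior over (good, bad)
    for obs in ["neg", "pos", "neg"]:
        belief = precision_weighted_update(belief, 0 if obs == "pos" else 1, gamma[obs])
        print(f"{obs} -> P(good) = {belief[0]:.3f}")
```

With the "pos" precision exceeding the "neg" precision, the agent's belief that things are good erodes only slowly under bad news, which is the qualitative signature probed by belief-updating tasks of the kind the abstract mentions.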
-
378823.197977
In the foundations of quantum mechanics (QM), one important distinction is that drawn by Harrigan and Spekkens (2010), between ‘ψ-ontic’ and ‘ψ-epistemic’ approaches. Here, recall, is how they put the distinction: We call a hidden variable model ψ-ontic if every complete physical state or ontic state in the theory is consistent with only one pure quantum state; we call it ψ-epistemic if there exist ontic states that are consistent with more than one pure quantum state. (Harrigan and Spekkens 2010, p. 126) Famously, ψ-epistemic approaches are at risk of falling prey to the no-go theorem of Pusey et al. (2012) (the ‘PBR theorem’). That being said, there are other approaches to QM which might be described (if only loosely) as ‘epistemic’, which (at least prima facie) reject the ontological models framework in which the PBR theorem is situated, and (prima facie, ipso facto) manage to evade it. These approaches include many of the ‘epistemic-pragmatist’ approaches which are the subject of the article under review here.
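In the standard ontological-models notation (notation assumed here, not quoted from the review): each pure state ψ prepares a distribution μ_ψ over ontic states λ, and the distinction amounts to whether those distributions can overlap.

```latex
% ψ-ontic: distinct pure states have non-overlapping ontic supports.
\psi \neq \phi \;\Longrightarrow\; \mu_{\psi}(\lambda)\,\mu_{\phi}(\lambda) = 0 \quad \text{for all } \lambda
% ψ-epistemic: some pair of distinct pure states has overlapping supports.
```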
-
378869.197982
The Aharonov-Bohm (AB) effect highlights the fundamental role of electromagnetic potentials in quantum mechanics. While extensively studied in the static case, the impact of a time-varying magnetic flux on the electron’s phase shift remains an open and debated question. In this paper, we derive the AB phase shift for a time-dependent magnetic vector potential and show that it is proportional to the time average of enclosed magnetic flux. Our analysis reveals that the AB phase is continuously accumulated as the electron traverses its path, challenging the conventional view that it emerges instantaneously at the point of interference. This generalized AB effect may provide deeper insight into the role of gauge-dependent potentials in quantum mechanics and also suggest novel experimental tests using alternating or pulsed magnetic flux.
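Schematically, and with notation assumed rather than taken from the paper, the familiar static result and the time-averaged generalization described in the abstract can be written as:

```latex
% Static Aharonov--Bohm phase for charge q enclosing flux \Phi:
\Delta\varphi_{\mathrm{AB}} = \frac{q\,\Phi}{\hbar}
% Time-dependent case (the abstract's claim, schematic): the phase tracks the
% flux averaged over the traversal time T.
\Delta\varphi_{\mathrm{AB}} \;\propto\; \frac{q}{\hbar}\,\overline{\Phi},
\qquad \overline{\Phi} = \frac{1}{T}\int_{0}^{T} \Phi(t)\,dt
```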
-
378904.197987
This article concerns various foundational aspects of the periodic system of the elements. These issues include the dual nature of the concept of an “element” to include element as a “basic substance” and as a “simple substance.” We will discuss the question of whether there is an optimal form of the periodic table, including whether the left-step table fulfils this role. We will also discuss the derivation or explanation of the [n + ℓ, n] or Madelung rule for electron-shell filling and whether indeed it is important to attempt to derive this rule from first principles. In particular, we examine the views of two chemists, Henry Bent and Eugen Schwarz, who have independently addressed many of these issues.
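For readers unfamiliar with the rule: the Madelung ordering fills subshells in order of increasing n + ℓ, breaking ties by smaller n. A hedged sketch (illustrative code, not from the article) that generates the ordering:

```python
# Generate the Madelung (n + l, n) filling order by sorting subshells
# by n + l, breaking ties by n.
L_LABELS = "spdfg"

def madelung_order(max_n=5):
    subshells = [(n, l) for n in range(1, max_n + 1) for l in range(0, n)]
    subshells.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))
    return [f"{n}{L_LABELS[l]}" for n, l in subshells]

print(" ".join(madelung_order()))
# -> 1s 2s 2p 3s 3p 4s 3d 4p 5s 4d 5p ...
```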
-
380944.197993
Here’s an option that is underexplored: theistic Humeanism. There are two paths to it. The path from orthodoxy: Start with a standard theistic concurrentism: whenever we have a creaturely cause C with effect E, E only eventuates because God concurs, i.e., God cooperates with the creaturely causal relation. …
-
380945.197998
It now seems the switch of Cancel Culture has only two settings:
- everything is cancellable—including giving intellectual arguments against specific DEI policies, or teaching students about a Chinese filler word (“ne-ge”) that sounds a little like the N-word, or else
- nothing is cancellable—not even tweeting “normalize Indian hate” and “I was racist before it was cool,” shortly before getting empowered to remake the US federal government. …
-
422917.198009
Picking up where I left off in a 2023 post, I will (finally!) return to Gardiner and Zaharos’s discussion of sensitivity in epistemology and its connection to my notion of severity. But before turning to Parts II (and III), I’d better reblog Part I. …
-
426829.198016
In an ingenious and provocative paper, “Individualism, Type Specimens, and the Scrutability of Species Membership”, Alex Levine argues that “species membership, by which I mean the relation that connects a given organism, o, with the species S of which it is part, is a fundamentally contingent matter” (2001, 333). He finds this contingency in conflict with the role of “type specimens” in biology. He points out that “naming a species requires collecting and preserving one, or at most a very few specimens of the species in question” (327). David Hull has the following view of this practice: The sole function of the type specimen is to be the name bearer for its species. No matter in which species the type specimen is placed, its name goes with it. (Hull 1982, 484) Levine takes Hull’s view, together with the “rigid designation” theory of reference, to entail that any organism selected as the type specimen for a species is necessarily a member of that species. This generates the conflict that Levine sums up neatly as follows: “qua organism, the type specimen belongs to its respective species contingently, while qua type specimen, it belongs necessarily”; he finds this “paradoxical” (Levine 2001, 334).
-
427144.198022
Until the ‘70s, the received view in the theory of reference was that the referent of a term was identified by certain descriptions that competent speakers associated with the term; for example, the referent of the proper name ‘Aristotle’ was determined by its association with a description like ‘the pupil of Plato and teacher of Alexander the Great’; the reference of the natural kind term ‘tiger’, by a description like ‘large feline with yellow and black stripes and a white belly’. But then came the revolution in the theory of reference, stemming particularly from the works of Kripke (1980) and Putnam (1975). It was argued that this “Description Theory” was fundamentally wrong for many terms, including ‘Aristotle’ and ‘tiger’. “Ignorance and error” arguments were particularly influential. People are often too ignorant to supply descriptions that would uniquely identify the referents of their terms. Most of us refer successfully with ‘elm’, but could not come close to describing those trees well enough to distinguish them from other trees like beeches. Speakers can also associate erroneous descriptions with a term; some who use ‘Einstein’ to refer successfully to the famous physicist wrongly think he invented the atomic bomb.