-
34447.027303
Empiricists following Poincaré have argued that spacetime geometry can be freely chosen by convention, while adjusting unobservable structure so as to maintain empirical adequacy. In this article, I first strengthen a no-go result of Weatherall and Manchak against the conventionality of geometry, and then argue that any remaining conventionality arises from scientific incompleteness. To illustrate, I discuss a new kind of conventionality that becomes available in the presence of higher spatial dimensions, and show how the incompleteness in such models can be resolved by introducing new physical theories such as Kaluza-Klein theory. Conventional choices of this kind may provide a fruitful starting point in the search for new science, but, if successful, would eliminate the conventionalist alternatives.
-
89278.027368
Free choice sequences play a key role in the Brouwerian continuum. Using recent modal analysis of potential infinity, we can make sense of free choice sequences as potentially infinite sequences of natural numbers without adopting Brouwer’s distinctive idealistic metaphysics. This provides classicists with a means to make sense of intuitionistic ideas from their own classical perspective. I develop a modal-potentialist theory of real numbers that suffices to capture the most distinctive features of intuitionistic analysis, such as Brouwer’s continuity theorem, the existence of a sequence that is monotone, bounded, and non-convergent, and the inability to decompose the continuum non-trivially.
-
119551.027392
We draw on value theory in social psychology to conceptualize the range of motives that can influence researchers’ attitudes, decisions, and actions. To conceptualize academic research values, we integrate theoretical insights from the literature on personal, work, and scientific work values, as well as the outcome of interviews and a survey among 255 participants about values relating to academic research. Finally, we propose a total of 246 academic research value items spread over 11 dimensions and 34 sub-themes. We relate our conceptualization and proposed items to existing work and provide recommendations for future scale development. Gaining a better understanding of researchers’ different values can improve careers in science, attract a more diverse range of people to enter science, and elucidate some of the mechanisms that lead to both exemplary and questionable scientific practices.
-
380437.027432
Hannah Rubin, Mike D. Schneider, Remco Heesen, Alejandro Bortolus, Emelda E. Chukwu, Chad L. Hewitt, Ricardo Kaufer, Veli Mitova, Anne Schwenkenbecher, Evangelina Schwindt, Temitope O. Sogbanmu, Helena Slanickova, Katie Woolaston
Knowledge brokers, usually conceptualized as passive intermediaries between scientists and policymakers in evidence-based policymaking, are understudied in philosophy of science. Here, we challenge that usual conceptualization. As agents in their own right, knowledge brokers have their own goals and incentives, which complicate the effects of their presence at the science-policy interface. We illustrate this in an agent-based model and suggest several avenues for further exploration of the role of knowledge brokers in evidence-based policy.
-
438122.027441
One Approach to the Necessary Conditions of Free Will: Logical Paradox and the Essential Unpredictability of Physical Agents. Even today, there is no precise definition of free will – only mere hypotheses and intuitions. This is why this paper will approach the question of free will from a negative perspective, depicting a scenario in which free will seemingly exists. Subsequently, it will attempt to refute this scenario (as a necessary condition for free will). The absence of free will might seem absolute if scientific determinism holds true. Therefore, the goal of the study is to present a logical argument (paradox) that demonstrates the impossibility of an omniscient predictor (P), i.e., scientific determinism, highlighting its inherent self-contradiction. This paradox reveals that the prediction (P = C) of a physical agent (P) by itself is objectively impossible. In other words, even a fully deterministic agent in a deterministic universe cannot predict its own future state, not even in a Platonic sense.
-
438139.02745
A nested interferometer experiment by Danan et al (2013) is discussed and some ontological implications explored, primarily in the context of time-symmetric interpretations of quantum theory. It is pointed out that photons are supported by all components of their wavefunctions, not selectively truncated "first order" portions of them, and that figures representing both a gap in the photon's path and signals from the cut-off path are incorrect. It is also noted that the Transactional Formulation (traditionally known as the Transactional Interpretation) readily accounts for the observed phenomena.
-
438231.027458
Quantum mechanics with a fundamental density matrix has been proposed and discussed recently. Moreover, it has been conjectured that in this theory the universe is not in a pure state but in a mixed state. In this paper, I argue that this mixed-state conjecture has two main problems: the redundancy problem and the underdetermination problem, neither of which arises in quantum mechanics with a definite initial wave function of the universe.
-
505890.027465
This is a bit of a shaggy dog story, but I think it’s fun, and there’s a moral about the nature of mathematical research.

Act 1
Once I was interested in the McGee graph, nicely animated here by Mamuka Jibladze:
This is the unique (3,7)-cage, meaning the smallest graph in which each vertex has 3 neighbors and the shortest cycle has length 7. …
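For readers who want to poke at the graph themselves, here is a minimal sketch (not from the post) that builds the McGee graph from its standard LCF notation [12, 7, -7]^8 with networkx and checks the two properties just mentioned; the girth helper is an ordinary breadth-first search written for this illustration.

```python
import networkx as nx
from collections import deque

# Sketch (not from the post): build the McGee graph from its standard
# LCF notation [12, 7, -7]^8 and verify the (3,7)-cage properties above.
G = nx.LCF_graph(24, [12, 7, -7], 8)

assert G.number_of_nodes() == 24
assert all(deg == 3 for _, deg in G.degree())  # every vertex has 3 neighbors

def girth(graph):
    """Length of the shortest cycle, found by BFS from every vertex."""
    best = float("inf")
    for source in graph:
        dist, parent = {source: 0}, {source: None}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v], parent[v] = dist[u] + 1, u
                    queue.append(v)
                elif parent[u] != v:  # non-tree edge closes a cycle
                    best = min(best, dist[u] + dist[v] + 1)
    return best

assert girth(G) == 7  # shortest cycle has length 7
print(G)  # Graph with 24 nodes and 36 edges
```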
-
534013.027472
These days, any quantum computing post I write ought to begin with the disclaimer that the armies of Sauron are triumphing around the globe, this is the darkest time for humanity most of us have ever known, and nothing else matters by comparison. …
-
610141.027479
We develop a theory of policy advice that focuses on the relationship between the competence of the advisor (e.g., an expert bureaucracy) and the quality of advice that the leader may expect. We describe important tensions between these features present in a wide class of substantively important circumstances. These tensions point to the presence of a trade-off between receiving advice more often and receiving more informative advice. The optimal realization of this trade-off for the leader sometimes induces her to prefer advisors of limited competence – a preference that, we show, is robust under different informational assumptions. We consider how institutional tools available to leaders affect preferences for advisor competence and the quality of advice they may expect to receive in equilibrium.
-
611228.027488
There are two main strands of arguments regarding the value-free ideal (VFI): desirability and achievability (Reiss and Sprenger 2020). In this essay, I will argue for what I will call a compatibilist account of upholding the VFI, focusing on its desirability even if the VFI is unachievable. First, I will explain what the VFI is. Second, I will show that striving to uphold the VFI (desirability) is compatible with the rejection of its achievability. Third, I will demonstrate that the main arguments against the VFI do not refute its desirability. Finally, I will provide arguments for why it is desirable to strive to uphold the VFI even if it is unachievable, and show what role it can play in scientific inquiry.

There is no single definition of the VFI, yet the most common way to interpret it is that non-epistemic values ought not to influence scientific reasoning (Brown 2024, 2). Non-epistemic values are understood as certain ethical, social, cultural or political considerations. Therefore, it is the role of epistemic values, such as accuracy, consistency, empirical adequacy, and simplicity, to be part of, and to ensure, proper scientific reasoning.
-
611245.027495
There is an overwhelming abundance of work in AI Ethics. This growth is chaotic because of how sudden it is, its volume, and its multidisciplinary nature. This makes it difficult to keep track of debates, and to systematically characterize the goals, research questions, methods, and expertise required by AI ethicists. In this article, I show that the relation between ‘AI’ and ‘ethics’ can be characterized in at least three ways, which correspond to three well-represented kinds of AI ethics: ethics and AI; ethics in AI; ethics of AI. I elucidate the features of these three kinds of AI Ethics, characterize their research questions, and identify the kind of expertise that each kind needs. I also show how certain criticisms of AI ethics are misplaced, because they are made from the point of view of one kind of AI ethics against another kind with different goals. All in all, this work sheds light on the nature of AI ethics, and sets the grounds for more informed discussions about the scope, methods, and training of AI ethicists.
-
697310.027504
Prioritarianism is generally understood as a kind of moral axiology. An axiology provides an account of what makes items, in this case outcomes, good or bad, better or worse. A moral axiology focuses on moral value: on what makes outcomes morally good or bad, morally better or worse. Prioritarianism, specifically, posits that the moral-betterness ranking of outcomes gives extra weight (“priority”) to well-being gains and losses affecting those at lower levels of well-being. It differs from utilitarianism, which is indifferent to the well-being levels of those affected by gains and losses.[1]

Although it is possible to construe prioritarianism as a non-axiological moral view, this entry follows the prevailing approach and trains its attention on axiological prioritarianism.
-
712826.027511
Statistics play an essential role in an extremely wide range of human reasoning. From theorizing in the physical and social sciences to determining evidential standards in legal contexts, statistical methods are ubiquitous, and thus various questions about their application inevitably arise. As tools for making inferences that go beyond a given set of data, they are inherently a means of reasoning ampliatively, and so it is unsurprising that philosophers interested in the notions of evidence and inductive inference have been concerned to utilize statistical frameworks to further our understanding of these topics. However, the field of statistics has long been the subject of heated philosophical controversy. Given that a central goal for philosophers of science is to help resolve problems about evidence and inference in scientific practice, it is important that they be involved in current debates in statistics and data science. The purpose of this topical collection is to promote such philosophical interaction. We present a cross-section of these subjects, written by scholars from a variety of fields in order to explore issues in philosophy of statistics from different perspectives.
-
726613.027519
I take a pragmatist perspective on quantum theory. This is not a view of the world described by quantum theory. In this view quantum theory itself does not describe the physical world (nor our observations, experiences or opinions of it). Instead, the theory offers reliable advice—on when to expect an event of one kind or another, and on how strongly to expect each possible outcome of that event. The event’s actual outcome is a perspectival fact—a fact relative to a physical context of assessment. Measurement outcomes and quantum states are both perspectival. By noticing that each must be relativized to an appropriate physical context one can resolve the measurement problem and the problem of nonlocal action. But if the outcome of a quantum measurement is not an absolute fact, then why should the statistics of such outcomes give us any objective reason to accept quantum theory? One can describe extensions of the scenario of Wigner’s friend in which a statement expressing the outcome of a quantum measurement would be true relative to one such context but not relative to another. However, physical conditions in our world prevent us from realizing such scenarios. Since the outcome of every actual quantum measurement is certified at what is essentially a single context of assessment, the outcome relative to that context is an objective fact in the only sense that matters for science. We should accept quantum theory because the statistics these outcomes display are just those it leads us to expect.
-
726631.027527
Extrapolating causal effects is becoming an increasingly important kind of inference in Evidence-Based Policy, development economics, and microeconometrics more generally. While several strategies have been proposed to aid with extrapolation, the existing methodological literature has left our understanding of what extrapolation consists of and what constitutes successful extrapolation underdeveloped. This paper addresses this gap in understanding by offering a novel account of successful extrapolation. Building on existing contributions pertaining to the challenges involved in extrapolation, this more nuanced and comprehensive account seeks to provide tools that facilitate the scrutiny of specific extrapolative inferences and general strategies for extrapolation. Offering such resources is especially important in view of the increasing amount of real-world decision-making in policy, development, and beyond that involves extrapolation.
-
726692.027536
I dispute the conventional claim that the second law of thermodynamics is saved from a "Maxwell's Demon" by the entropy cost of information erasure, and show that instead it is measurement that incurs the entropy cost. Thus Brillouin, who identified measurement as savior of the second law, was essentially correct, and putative refutations of his view, such as Bennett's claim to measure without entropy cost, are seen to fail when the applicable physics is taken into account. I argue that the tradition of attributing the defeat of Maxwell's Demon to erasure rather than to measurement arose from unphysical classical idealizations that do not hold for real gas molecules, as well as a physically ungrounded recasting of physical thermodynamical processes into computational and information-theoretic conceptualizations. I argue that the fundamental principle that saves the second law is the quantum uncertainty principle applying to the need to localize physical states to precise values of observables in order to effect the desired disequilibria aimed at violating the second law. I obtain the specific entropy cost for localizing a molecule in the Szilard engine, which coincides with the quantity attributed to Landauer's principle. I also note that an experiment characterized as upholding an entropy cost of erasure in a "quantum Maxwell's Demon" actually demonstrates an entropy cost of measurement.
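For orientation, here is the standard back-of-the-envelope accounting (my gloss, not the paper's derivation) of the quantity the abstract refers to at the end: localizing the single molecule of a Szilard engine to one half of its container halves the accessible volume, and the second law then demands a compensating entropy cost elsewhere equal to the familiar Landauer figure.

```latex
% Standard Szilard-engine bookkeeping (my gloss, not the paper's derivation):
% localizing the one-molecule gas to half its volume V lowers the gas entropy,
% so at least k_B ln 2 of entropy must be generated elsewhere, i.e. at least
% k_B T ln 2 of heat dissipated at temperature T, the same quantity usually
% attributed to Landauer's principle for erasing one bit.
\[
  \Delta S_{\text{gas}} = k_B \ln\!\frac{V/2}{V} = -\,k_B \ln 2,
  \qquad
  Q_{\text{dissipated}} \;\ge\; T\,\lvert\Delta S_{\text{gas}}\rvert \;=\; k_B T \ln 2 .
\]
```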
-
957369.027543
We draw a distinction between the traditional reference class problem which describes an obstruction to estimating a single individual probability—which we re-term the individual reference class problem—and what we call the reference class problem at scale, which can result when using tools from statistics and machine learning to systematically make predictions about many individual probabilities simultaneously. We argue that scale actually helps to mitigate the reference class problem, and purely statistical tools can be used to efficiently minimize the reference class problem at scale, even though they cannot be used to solve the individual reference class problem.
-
957385.027551
Modal Empiricism in philosophy of science proposes to understand the possibility of modal knowledge from experience by replacing talk of possible worlds with talk of possible situations, which are coarse-grained, bounded and relative to background conditions. This allows for an induction towards objective necessity, assuming that actual situations are representative of possible ones. The main limitation of this epistemology is that it does not account for probabilistic knowledge. In this paper, we propose to extend Modal Empiricism to the probabilistic case, thus providing an inductivist epistemology for probabilistic knowledge. The key idea is that extreme probabilities, close to 1 and 0, serve as proxies for testing mild probabilities, using a principle of model combination.
-
957405.027559
In operational quantum mechanics two measurements are called operationally equivalent if they yield the same distribution of outcomes in every quantum state and hence are represented by the same operator. In this paper, I will show that the ontological models for quantum mechanics and, more generally, for any operational theory sensitively depend on which measurement we choose from the class of operationally equivalent measurements, or more precisely, on which of the chosen measurements can be performed simultaneously. To this end, I will first take three examples—a classical theory, the EPR-Bell scenario and the Popescu-Rohrlich box; then realize each example by two operationally equivalent but different operational theories—one with a trivial and another with a non-trivial compatibility structure; and finally show that the ontological models for the different theories will be different with respect to their causal structure, contextuality, and fine-tuning.
-
1072712.027567
QBism explicitly takes the subjective view: probabilities of events are defined solely by past experiences, i.e. the record of observations. As shown by the authors (Fuchs et al, 2013), this: “... removes the paradoxes, conundra, and pseudo-problems that have plagued quantum foundations for the past nine decades”. It is criticised for its lack of ontology and anthropocentric nature. However, if Everett's (1957) formulation is taken at face value, exactly the features of QBism are the result, and the ontology is inherent. The anthropocentric nature of the solution is simply an indication that the quantum state is relative, as is central to Everett. Problems of measurement and locality do not arise.
-
1072765.027574
The localization problem in relativistic quantum theory has persisted for more than seven decades, yet it is largely unknown and continues to perplex even those well-versed in the subject. At the heart of this problem lies a fundamental conflict between localizability and relativistic causality, which can also be construed as part of the broader dichotomy between measurement and unitary dynamics. This article provides a historical review of the localization problem in one-particle relativistic quantum mechanics, clarifying some persistent misconceptions in the literature, and underscoring the antinomy between causal dynamics and localized observables.
-
1112889.027581
High speed store required: 947 words
No. of bits in a word: 64
Is the program overlaid? No
No. of magnetic tapes required: None
What other peripherals are used? Card Reader; Line Printer
No. of cards in combined program and test deck: 112
Card punching code: EBCDIC
Keywords: Atomic, Molecular, Nuclear, Rotation Matrix, Rotation Group, Representation, Euler Angle, Symmetry, Helicity, Correlation.
-
1148546.027588
I gave a talk on March 8 at an AI, Systems, and Society Conference at the Emory Center for Ethics. The organizer, Alex Tolbert (who had been a student at Virginia Tech), suggested I speak about controversies in statistics, especially P-hacking in statistical significance testing. …
-
1223244.027597
Where does the Born Rule come from? We ask: “What is the simplest extension of probability theory where the Born rule appears”? This is answered by introducing “superposition events” in addition to the usual discrete events. Two-dimensional matrices (e.g., incidence matrices and density matrices) are needed to mathematically represent the differences between the two types of events. Then it is shown that those incidence and density matrices for superposition events are the (outer) products of a vector and its transpose whose components foreshadow the “amplitudes” of quantum mechanics. The squares of the components of those “amplitude” vectors yield the probabilities of the outcomes. That is how probability amplitudes and the Born Rule arise in the minimal extension of probability theory to include superposition events. This naturally extends to the full Born Rule in the Hilbert spaces over the complex numbers of quantum mechanics. It would perhaps be satisfying if probability amplitudes and the Born Rule only arose as the result of deep results in quantum mechanics (e.g., Gleason’s Theorem). But both arise in a simple extension of probability theory to include “superposition events”–which should not be too surprising since superposition is the key non-classical concept in quantum mechanics.
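As a purely illustrative numerical gloss (my example, not the paper's), the construction can be mimicked in a few lines: take an "amplitude" vector for a two-outcome superposition event, form the outer product of the vector with its transpose to obtain the density-matrix analogue, and read the outcome probabilities off as the squares of the components.

```python
import numpy as np

# Illustrative sketch (my example, not the paper's): a two-outcome
# "superposition event" whose amplitude-vector components are the square
# roots of the outcome probabilities.
amplitude = np.array([np.sqrt(0.5), np.sqrt(0.5)])

# The density-matrix analogue of the superposition event is the outer
# product of the amplitude vector with its transpose.
rho = np.outer(amplitude, amplitude)
print(rho)  # [[0.5 0.5], [0.5 0.5]]

# Born-rule reading: squaring the amplitude components (equivalently,
# taking the diagonal of rho) returns the outcome probabilities.
probabilities = amplitude ** 2
assert np.allclose(probabilities, np.diag(rho))
assert np.isclose(probabilities.sum(), 1.0)
print(probabilities)  # [0.5 0.5]
```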
-
1532958.027604
Brian Leftow’s 2022 book, Anselm’s Argument: Divine Necessity, is an impressively thorough discussion of Anselmian modal metaphysics, centred around what he takes to be Anselm’s strongest “argument from perfection” (Leftow’s preferred term for an Ontological Argument). This is not the famous argument from Proslogion 2, nor even the modal argument that some have claimed to find in Proslogion 3, but rather, an argument from Anselm’s Reply to Gaunilo, expressed in the following quotation: “If … something than which no greater can be thought … existed, neither actually nor in the mind could it not exist. Otherwise it would not be something than which no greater can be thought. But whatever can be thought to exist and does not exist, if it existed, would be able actually or in the mind not to exist. For this reason, if it can be thought, it cannot not exist.” (p. 66). Before turning to this argument, Leftow offers an extended and closely-argued case for understanding Anselm’s modality in terms of absolute necessity and possibility, with a metaphysical foundation on powers as argued for at length (575 pages) in his 2012 book God and Necessity. After presenting this interpretation in Chapter 1, Leftow’s second chapter discusses various theological applications (such as the fixity of the past, God’s veracity, and immortality), addressing them in a way that both expounds and defends what he takes to be Anselm’s approach. Then in Chapter 3 Leftow addresses certain problems, for both his philosophical and interpretative claims, while Chapter 4 spells out the key Anselmian argument, together with Leftow’s suggested improvements. Chapter 5 explains how the argument depends on Brouwer’s system of modal logic, and defends this while also endorsing the more standard and comprehensive system S5.
-
1649280.027611
The theoretical developments that led to supersymmetry – first global and then local – over a period of about six years (1970/71-1976) emerged from a confluence of physical insights and mathematical methods drawn from diverse, and sometimes independent, research directions. Despite these varied origins, a common thread united them all: the pursuit of a unity in physics, grounded in the central role of symmetry, where “symmetry” is understood in terms of group theory, representation theory, algebra, and differential geometry.
-
1701977.027618
According to classical utilitarianism, well-being consists in pleasure or happiness, the good consists in the sum of well-being, and moral rightness consists in maximizing the good. Leibniz was perhaps the first to formulate this doctrine. Bentham made it widely known. For a long time, however, the second, summing part lacked any clear foundation. John Stuart Mill, Henry Sidgwick, and Richard Hare all gave arguments for utilitarianism, but they took this summing part for granted. It was John Harsanyi who finally presented compelling arguments for this controversial part of the utilitarian doctrine.