-
22781.167324
Jamin Asay has recently argued that deflationists about the concept of truth cannot satisfactorily account for our alethic desires, i.e. those of our desires that pertain to the truth of our beliefs. In this brief reply, I show how deflationists can draw on well-established psychological findings on framing effects to explain how the concept of truth behaves within the scope of our alethic desires.
-
58434.167543
Generally speaking, when we estimate a fixed population parameter from the observation of a sample (i.e., not the whole population), we know that different samples would have produced different estimates. The magnitude of this difference is called the sampling error. When an estimator is consistent and unbiased, the sampling error can be reduced to an arbitrarily small difference by increasing the sample size of a study. This is a desirable property, since it implies that the estimates over all possible samples will be closer to the truth and less variable. Consequently, when a test statistic is based on a consistent and unbiased estimator, it is always desirable (although not always possible) to increase the sample size, and therefore the power of the test, in order to reduce the sampling error.
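As a toy illustration of how sampling error shrinks with sample size, the following Python sketch (the function name and parameter values are my own, purely illustrative) repeatedly draws samples from a known population and measures the spread of the resulting sample-mean estimates:

```python
import random
import statistics

def sampling_error_spread(population_mean, population_sd, n, n_replications=2000, seed=0):
    """Standard deviation of the sample-mean estimator across repeated samples of size n."""
    rng = random.Random(seed)
    estimates = [
        statistics.fmean(rng.gauss(population_mean, population_sd) for _ in range(n))
        for _ in range(n_replications)
    ]
    return statistics.stdev(estimates)

# The spread of the estimates (the sampling error) shrinks as n grows.
spread_small = sampling_error_spread(10.0, 2.0, n=25)
spread_large = sampling_error_spread(10.0, 2.0, n=400)
```

For the sample mean, the spread falls roughly as 1/sqrt(n), so quadrupling the sample size roughly halves the sampling error.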
-
58469.167636
A familiar fact about theories in mathematical physics is that the laws of a theory (in the form of field equations/equations of motion) exhibit symmetries that are not exhibited by typical solutions. The proponents of the symmetry-to-reality inference (StRI) suffer from a peculiar psychological malady; namely, they are morally offended by this fact. Rather than getting therapy for this malady they want to promote it to a philosophical doctrine that governs how we are to understand the way theories in mathematical physics represent the world. The physical processes described by these theories do, they claim, share the symmetries of the laws of the theory; the contrary appearance, they claim, is due to the fact that these theories employ variables that give redundant descriptions of the same underlying physical state. One would think that the proponents of this view would be eager to show how the false appearance given by the theory can be corrected by reformulating the theory in an empirically equivalent form using variables that remove the redundancy and present solutions that all share the symmetries of the laws. Rarely, if ever, is this attempted, and the proponents of the StRI content themselves with special pleadings: indistinguishability arguments, anti-haecceitism considerations, flying the flag of spacetime relationism, etc. No amount of special pleading can cover up the fact that the commitment to the StRI does not do justice to the ways theories in mathematical physics represent the world, as illustrated here in some detail for the case of source-free Maxwell theory. I propose confining the application of the StRI to gauge variables, as distinguished by the fact that taking different values of these variables to imply a difference in the physical state would entail that the field equations/equations of motion do not have a good Cauchy problem. This means of identifying gauge freedom is not without its problems. But discussing these problems is more productive than trying to treat the StRI malady.
-
63140.167742
Conceptual limitations restrict our epistemic options. One cannot believe, disbelieve, or doubt what one cannot grasp. I show how such restrictions could lead to epistemic dilemmas: situations in which each of one’s options violates some epistemic requirement. To help the dilemmist distinguish their view of these cases from alternative non-dilemmic analyses, I propose to treat puzzlement as a kind of epistemic residue, appropriate only when one has violated an epistemic requirement. As moral dilemmists have appealed to unavoidable guilt as a sign of a moral dilemma, so too can the epistemic dilemmist appeal to unavoidable puzzlement as a sign of an epistemic dilemma. I conclude by considering why, on the dilemmist’s view, it sometimes makes sense for inquirers to seek out puzzlement.
-
113791.167785
In my dissertation, I defended a causal power account of modality on
which something is possible just in case either it’s actual or something
can bring about a causal chain leading to its being actual. …
-
120926.167815
This article presents a theodicy based on a revision of the popular concept of God’s benevolence. If we follow the Protestant tradition in assuming that God is the exclusive source of virtue, the benevolence of God has to be radically different from the benevolence of a human being. A benevolent and almighty God who wishes to reward virtue and punish evil would design a world order similar to that in the allegory of the long spoons. Divine punishment is unforgiving, merciless, individually non-retributive, holistically retributive, and quantitatively unpredictable. All suffering is divine punishment. Several popular arguments from evil, including those from animal suffering, victims of evil deeds, natural disasters, and children’s diseases, can be resolved within this framework.
-
120955.167841
In the concluding section of Book One of the Treatise, Hume confronts radical scepticism about the standards of correct reasoning. According to naturalistic interpretations, Hume resolves this scepticism by appealing to certain psychological facts. A common criticism of this interpretation is that the alleged naturalistic epistemic norm seems to be merely Hume’s report of his own psychology, and it remains unclear why this seemingly mere psychological description can provide a principled reason to overcome his scepticism. In this paper, I argue that Hume’s discussions of the “indirect passions” and social identity provide a constitutivist ground for the naturalistic epistemic standards in the “Conclusion”: being the object of the indirect passions constitutes what kind of person one is, and being that kind of person (a philosopher, in Hume’s case) gives one a non-optional reason to pursue certain kinds of reasoning.
-
174085.167868
Most existing proposals to explain the temporal asymmetries we see around us are sited within an approach to physics based on time evolution, and thus they typically put the asymmetry in at the beginning of time in the form of a special initial state. But there may be other possibilities for explaining temporal asymmetries if we don’t presuppose the time evolution paradigm. In this article, we explore one such possibility, based on Kent’s ‘final-measurement’ interpretation of quantum mechanics. We argue that this approach potentially has the resources to explain the electromagnetic asymmetry, the thermodynamic asymmetry, the coarse-graining asymmetry, the fork asymmetry, the record asymmetry, and the cosmological asymmetry, and that the explanations it offers may potentially be better than explanations appealing to a special initial state. Our hope is that this example will encourage further exploration of novel approaches to temporal asymmetry outside of the time evolution paradigm.
-
196303.167895
In his classic monograph, Social Choice and Individual Values, Arrow introduced the notion of a decisive coalition of voters as part of his mathematical framework for social choice theory. The subsequent literature on Arrow’s Impossibility Theorem has shown the importance for social choice theory of reasoning about coalitions of voters with different grades of decisiveness. The goal of this paper is a fine-grained analysis of reasoning about decisive coalitions, formalizing how the concept of a decisive coalition gives rise to a social choice theoretic language and logic all of its own. We show that given Arrow’s axioms of the Independence of Irrelevant Alternatives and Universal Domain, rationality postulates for social preference correspond to strong axioms about decisive coalitions. We demonstrate this correspondence with results of a kind familiar in economics—representation theorems—as well as results of a kind coming from mathematical logic—completeness theorems. We present a complete logic for reasoning about decisive coalitions, along with formal proofs of Arrow’s and Wilson’s theorems. In addition, we prove the correctness of an algorithm for calculating, given any social rationality postulate of a certain form in the language of binary preference, the corresponding axiom in the language of decisive coalitions. These results suggest for social choice theory new perspectives and tools from logic.
-
196384.167927
We describe the planning problem within the framework of dynamic epistemic logic (DEL), considering the tree of sequences of events as the underlying structure. In general, the DEL planning problem is computationally difficult to solve. On the other hand, a great deal of fruitful technical work has led to deep insights into the way DEL works, and these insights can be exploited in special cases. We present a few properties that lead to considerable simplifications of the DEL planning problem and apply them in a toy example.
-
196563.168141
We propose six axioms concerning when one candidate should defeat another in a democratic election involving two or more candidates. Five of the axioms are widely satisfied by known voting procedures. The sixth axiom is a weakening of Kenneth Arrow’s famous condition of the Independence of Irrelevant Alternatives (IIA). We call this weakening Coherent IIA. We prove that the five axioms plus Coherent IIA single out a method of determining defeats studied in our recent work: Split Cycle. In particular, Split Cycle provides the most resolute definition of defeat among methods satisfying the six axioms for democratic defeat. In addition, we analyze how Split Cycle escapes Arrow’s impossibility theorem and related impossibility results.
-
196659.168198
Much of the theoretical work on strategic voting makes strong assumptions about what voters know about the voting situation. A strategizing voter is typically assumed to know how other voters will vote and to know the rules of the voting method. A growing body of literature explores strategic voting when there is uncertainty about how others will vote. In this paper, we study strategic voting when there is uncertainty about the voting method. We introduce three notions of manipulability for a set of voting methods: sure, safe, and expected manipulability. With the help of a computer program, we identify voting scenarios in which uncertainty about the voting method may reduce or even eliminate a voter’s incentive to misrepresent her preferences. Thus, it may be in the interest of an election designer who wishes to reduce strategic voting to leave voters uncertain about which of several reasonable voting methods will be used to determine the winners of an election.
-
196960.168233
A number of rules for resolving majority cycles in elections have been proposed in the literature. Recently, Holliday and Pacuit (Journal of Theoretical Politics 33 (2021) 475-524) axiomatically characterized one such cycle-resolving rule, dubbed Split Cycle: in each majority cycle, discard the majority preferences with the smallest majority margin. They showed that any rule satisfying five standard axioms, plus a weakening of Arrow’s Independence of Irrelevant Alternatives (IIA) called Coherent IIA, is refined by Split Cycle. In this paper, we go further and show that Split Cycle is the only rule satisfying the axioms of Holliday and Pacuit together with two additional axioms: Coherent Defeat and Positive Involvement in Defeat. Coherent Defeat states that any majority preference not occurring in a cycle is retained, while Positive Involvement in Defeat is closely related to the well-known axiom of Positive Involvement (as in J. Pérez, Social Choice and Welfare 18 (2001) 601-616). We characterize Split Cycle not only as a collective choice rule but also as a social choice correspondence, over both profiles of linear ballots and profiles of ballots allowing ties.
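The cycle-resolving rule just described can be sketched in a few lines of Python. This is a minimal reading of the defeat relation as stated in the abstract, not the authors' own code, and the function names are mine: a defeats b just in case a's margin over b is positive and no majority path from b back to a is at least as strong (a path's strength being its smallest margin), which amounts to discarding the smallest-margin edges of each majority cycle.

```python
from itertools import product

def split_cycle_defeats(margin):
    """Given a margin matrix margin[a][b] = (# voters preferring a to b) minus
    (# voters preferring b to a), return the set of Split Cycle defeat pairs
    (a, b): margin[a][b] > 0 and no majority path from b back to a has
    strength (minimum margin along the path) >= margin[a][b]."""
    n = len(margin)
    # Widest-path (max-min) strengths over positive-margin edges, Floyd-Warshall style.
    strength = [[margin[i][j] if margin[i][j] > 0 else 0 for j in range(n)]
                for i in range(n)]
    for k, i, j in product(range(n), repeat=3):  # k is the outermost loop
        if i != j:
            strength[i][j] = max(strength[i][j], min(strength[i][k], strength[k][j]))
    return {(a, b) for a in range(n) for b in range(n)
            if margin[a][b] > 0 and margin[a][b] > strength[b][a]}

def split_cycle_winners(margin):
    """Candidates not defeated by anyone."""
    defeats = split_cycle_defeats(margin)
    return [b for b in range(len(margin))
            if not any((a, b) in defeats for a in range(len(margin)))]

# Majority cycle a -> b -> c -> a with margins 1, 3, 5:
# Split Cycle discards the smallest-margin edge a -> b.
margins = [[0, 1, -5],
           [-1, 0, 3],
           [5, -3, 0]]
```

On this toy profile only the edge with margin 1 is discarded, so candidate b (index 1) is the unique undefeated candidate.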
-
197000.168262
In this paper, we introduce a doxastic logic with expressions that are intended to represent definite descriptions for propositions. Using these definite descriptions, we can formalize sentences such as:
• Ann believes that the strangest proposition that Bob believes is that neutrinos travel at twice the speed of light.
• Ann believes that the strangest proposition that Bob believes is false.
The first sentence is represented as Ba(γ is ϕ), where γ stands for “the strangest proposition that Bob believes” and ϕ stands for “that neutrinos travel at twice the speed of light”. The second sentence has both de re and de dicto readings, which are distinguished in our logic. We motivate our logical system with a novel analysis of the Brandenburger-Keisler paradox. Our analysis of this paradox uncovers an interesting connection between it and the Kaplan-Montague Knower paradox.
-
197246.16829
In 1976, Robert Aumann proved a fascinating result [2]. Suppose that two agents have the same prior probability and update their probability of an event E with private information by conditioning. Aumann showed that if the posterior probabilities of E are common knowledge, then the agents must assign the same posterior to E. This is true even if the agents receive different information. In other words, if agents have the same prior probability and update by conditioning, then the agents cannot “agree to disagree” about their posterior probabilities. This seminal result has been generalized in many ways [5, 9, 36, 24, 33, 29, 3, 13] and is still the subject of much discussion in Economics [16, 28, 25, 7, 35], Logic [11, 12, 17] and Philosophy [20, 24, 8, 10].
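Aumann's result can be made concrete in the standard partition model: an agent's information is a partition of the state space, posteriors are obtained by conditioning the common prior on the agent's cell, and an event is common knowledge at a state iff it contains that state's cell of the meet (the finest common coarsening) of the two partitions. The sketch below (function names and the example are mine, purely illustrative) computes the meet, finds meet cells on which both agents' posteriors are constant (hence common knowledge), and exhibits the forced agreement:

```python
from fractions import Fraction

def meet(partition1, partition2, states):
    """Finest common coarsening ("meet") of two partitions, whose cells are the
    events of common knowledge. States linked through overlapping cells merge."""
    parent = {s: s for s in states}
    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]  # path halving
            s = parent[s]
        return s
    for cell in list(partition1) + list(partition2):
        first, *rest = sorted(cell)
        for s in rest:
            parent[find(s)] = find(first)
    cells = {}
    for s in states:
        cells.setdefault(find(s), set()).add(s)
    return [frozenset(c) for c in cells.values()]

def posterior(prior, info_cell, event):
    """P(event | info_cell) under the common prior."""
    return sum(prior[s] for s in info_cell & event) / sum(prior[s] for s in info_cell)

def common_knowledge_posteriors(prior, partition1, partition2, event):
    """For each meet cell on which both agents' posteriors of `event` are
    constant (i.e., common knowledge), return (cell, posterior1, posterior2).
    Aumann's theorem says the two posteriors must coincide on each such cell."""
    states = set(prior)
    results = []
    for m in meet(partition1, partition2, states):
        post1 = {posterior(prior, c, event) for c in partition1 if c <= m}
        post2 = {posterior(prior, c, event) for c in partition2 if c <= m}
        if len(post1) == 1 and len(post2) == 1:
            results.append((m, post1.pop(), post2.pop()))
    return results

# Example: uniform prior on four states; the agents receive different private
# information, yet both posteriors of E = {1, 4} are common knowledge.
prior = {s: Fraction(1, 4) for s in (1, 2, 3, 4)}
p1 = [frozenset({1, 2}), frozenset({3, 4})]
p2 = [frozenset({1, 3}), frozenset({2, 4})]
results = common_knowledge_posteriors(prior, p1, p2, frozenset({1, 4}))
```

In the example the agents condition on different cells, but each assigns E posterior 1/2 in every cell, so both posteriors are common knowledge and, as the theorem requires, they agree.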
-
197396.168321
We propose a Condorcet consistent voting method that we call Split Cycle. Split Cycle belongs to the small family of known voting methods satisfying the anti-vote-splitting criterion of independence of clones. In this family, only Split Cycle satisfies a new criterion we call immunity to spoilers, which concerns adding candidates to elections, as well as the known criteria of positive involvement and negative involvement, which concern adding voters to elections. Thus, in contrast to other clone-independent methods, Split Cycle mitigates both “spoiler effects” and “strong no show paradoxes.”
-
197447.168349
There is a long tradition of fruitful interaction between logic and social choice theory. In recent years, much of this interaction has focused on computer-aided methods such as SAT solving and interactive theorem proving. In this paper, we report on the development of a framework for formalizing voting theory in the Lean theorem prover, which we have applied to verify properties of a recently studied voting method. While previous applications of interactive theorem proving to social choice (using Isabelle/HOL and Mizar) have focused on the verification of impossibility theorems, we aim to cover a variety of results ranging from impossibility theorems to the verification of properties of specific voting methods (e.g., Condorcet consistency, independence of clones, etc.). In order to formalize voting theoretic axioms concerning adding or removing candidates and voters, we work in a variable-election setting whose formalization makes use of dependent types in Lean.
-
226367.168376
Experiments have led some philosophers to conclude that the reference determination of natural kind terms is neither simply descriptive nor simply causal-historical. Various theories have been aired to account for this, including ambiguity, hybrid, and different-idiolects theories. Devitt and Porter (2021) hypothesized that some terms are covered by one theory, some by another, with a place for all the proposed theories. The present paper tests the hypotheses that the term ‘Rio de Janeiro Myrtle’ is simply causal-historical but the term ‘rice’ is hybrid. For, whereas the former term is of scientific but little practical interest, the latter is not: rice is a significant part of the human diet. So, we predicted there would be two factors in the reference determination of ‘rice’: a superficial-descriptive one and a deep-causal one. Our experiments confirmed these hypotheses using the methods of elicited production and truth-value judgments. We take our results to support the hybrid theory of ‘rice’ rather than the ambiguity or different-idiolects theory. We were not testing ‘myrtle’, but, surprisingly, our results implied that ‘myrtle’ is partly descriptive and so like ‘rice’ but unlike ‘Rio de Janeiro Myrtle’. A follow-up experiment confirmed these puzzling results. More investigation is needed.
-
226376.168428
Expressions of thought by members of a linguistic community are typically, to some extent, governed by the rules and principles (briefly, rules) of a shared language. Georges Rey and John Collins (“RC”), in “Laws and Luck in Language: Problems with Devitt’s Conventional, Common-Sense Linguistics” (Chapter 5, this volume), helpfully describe “a matter of ‘luck’” as something “accidental relative to a certain principled system” (p. xx). So, relative to the system of rules that constitute a language, an expression of thought that is not governed by the rules is a matter of linguistic “luck”: the linguistic rules are luck-reducing mechanisms. I am mainly concerned with the extent of that luck: To what extent are the semantic and syntactic properties of the linguistic tokens in expressions of thought, in utterances, not governed by the rules of the language and hence “lucky”? I am also concerned with the source of those rules. Are they innate or learned? Insofar as they are learned, they are accidental relative to innate human nature; they are a matter of luck. And insofar as the rules are thus lucky, so too are the expressions governed by them. So the concern is with two distinct sorts of linguistic luck: linguistic tokens that are accidental relative to linguistic rules are “r-lucky”; and linguistic rules, and hence the expressions governed by them, that are accidental relative to human nature are “i-lucky”.
-
226380.168466
I never seem to tire of this action-theoretic case. You need to send
a nerve signal to your arm muscles because there is a machine that
detects these signals and dispenses food, and you’re hungry. So you
raise your arm. …
-
226413.168495
Significant work in the philosophy of history has focused on the writing of historiographical narratives, isolated from the rest of what historians do. Taking my cue from the philosophy of science in practice, I suggest that understanding historical narratives as embedded within historical practice more generally is fruitful. I illustrate this by bringing a particular instance of historical practice, Natalie Lawrence’s explanation of the sad fate of Winston the Platypus, into dialogue with some of Louis Mink’s arguments in favour of anti-realism about historical events. Attending to how historians seek out and utilize archival resources puts serious pressure on these arguments, motivates realist positions, and re-focuses the philosophy of history towards making sense of historiography as a part of the diversity of historians’ interests.
-
226423.168523
In this paper I argue that the extent to which a human trait is genetically caused can causally depend upon whether the trait is categorized within human genetics as genetically caused. This makes the kind genetically caused trait an interactive kind. I demonstrate that this thesis is both conceptually coherent and empirically plausible. I outline the core rationale of this thesis and demonstrate its conceptual coherence by drawing upon Waters’ (2007) analysis of genetic causation. I add empirical plausibility to the thesis by describing a hypothetical but empirically plausible mechanism by which the fact that obesity is categorized as genetically caused within human genetics increases the extent to which obesity is in fact genetically caused.
-
231786.16858
We examine evaluations of the contributions of Matrix Mechanics and Max Born to the formulation of quantum mechanics from Heisenberg's Helgoland paper of 1925 to Born's Nobel Prize of 1954. We point out that the process of evaluation is continuing in the light of recent interpretations of the theory that deemphasize the importance of the wave function.
-
231824.168597
This paper argues against a particular version of the inference from the success of a scientific theory to the claim that the theory must be approximately true to some extent. The kind of success at issue is comparative, where one theory is more empirically successful than its rival if that theory predicts phenomena that are inexplicable or anomalous according to its rival. A theory that exhibits this kind of comparative success can be seen as thereby achieving empirical progress over its rival. David Harker has developed a form of selective scientific realism based on the idea that this kind of success is evidence for the approximate truth of the parts of theories responsible for such success. Counterexamples to Harker’s position are cases in which a theory is more successful than its rival in virtue of containing parts that are not even approximately true. In order to identify some counterexamples to Harker’s position, this paper considers four historical cases that Greg Frost-Arnold has recently used to motivate a novel historical challenge to realism called the Problem of Misleading Evidence. This paper argues that these four cases are counterexamples to Harker’s position, and that they provide a strong reason to doubt his position and the kind of success-to-truth inference that he defends.
-
231841.168613
I critically examine the assumption that the theoretical structure that varies under theoretical symmetries is redundant and should be eliminated from a metaphysical picture of the universe, following a “symmetry to reality” inference. I do so by analysing the status of coordinate change symmetries taking a pragmatic approach. I argue that coordinate systems function as indexical devices, and play an important pragmatic role for representing concrete physical systems. I examine the implications of considering this pragmatic role seriously, taking what I call a perspectivist stance. My conclusion is that under a perspectivist stance, all symmetries (including local gauge symmetries) potentially have a direct empirical status: they point to dynamical aspects that are invariant under changes of operationalisation, and they constitute a guide not to reality, but to nomology and kinship.
-
231853.168628
The representational theory of measurement provides a collection of results that specify the conditions under which an attribute admits of numerical representation. The original architects of the theory interpreted the formalism operationally and explicitly acknowledged that some aspects of their representations are conventional. There have been a number of recent efforts to reinterpret the formalism to arrive at a more metaphysically robust account of physical quantities. In this paper we argue that the conventional elements of the representations afforded by the representational theory of measurement require careful scrutiny as one moves toward such an interpretation. To illustrate why, we show that there is a sense in which the very number system in which one represents a physical quantity such as mass or length is conventional. We argue that this result does not undermine the project of reinterpreting the representational theory of measurement for metaphysical purposes in general, but it does undermine a certain class of inferences about the nature of physical quantities that some have been tempted to draw.
-
231886.168645
This paper describes an alternative to currently dominant philosophical approaches to the metaphysics of causation. It is motivated by the gap that currently exists between metaphysical accounts and recent epistemological research on causal reasoning and methods for discovering causal relationships. Our approach aims at characterizing structural features of the actual world that support, and are exploited by, successful strategies for causal reasoning and discovery. We call these features the “worldly infrastructure” of causation. We identify several elements of this worldly infrastructure, sketch an account of their physical bases, and explain how they contribute to the possibility of successful causal reasoning.
-
231919.168661
There has been a great buzz surrounding Daniel Jafferis et al.’s latest Nature paper, “Traversable wormhole dynamics on a quantum processor”. The paper discusses an experiment in which Google’s Sycamore quantum processor is used to simulate a sparse N = 7 SYK model with 5 terms (a learned Hamiltonian). The paper shows that the learned Hamiltonian preserves the key gravitational characteristics of an N = 10 SYK model with 210 terms and is sufficient to produce traversable wormhole behavior. I examine the experiment and discuss some philosophical challenges it raises, in memory of Ian Hacking. Recently, Norman Yao and two graduate students discovered multiple flaws in Jafferis et al.’s learned Hamiltonian and uploaded a comment on the Nature paper. As expected, Jafferis and his team found a simple way to clarify the misunderstanding: a physical justification that allowed them to avoid the problem. In this paper, I elucidate the main arguments Yao and his students raised and the way Jafferis et al. found to save their learned Hamiltonian. I end with a philosophical comment on this recent development in the context of the learned Hamiltonian.
-
231975.168676
We argue that the temporal asymmetry of influence is not merely the result of thermodynamics: it is a consequence of the fact that the modal structure of the universe must admit only processes which cannot give rise to contradictions. We appeal to the process matrix formalism developed in the field of quantum foundations to characterise processes which are compatible with local free will whilst ruling out contradictions, and argue that this gives rise to ‘consistent chaining’ requirements that explain the temporal asymmetry of influence. We compare this view to the perspectival account of causation advocated by Price and Ramsey.
-
232007.168712
Enlightenment science and medicine achieved different levels of accuracy and precision as both sought rationality, empiricism, and the spread of human happiness. Animal magnetism was a unique medical phenomenon of late Enlightenment Paris that established trust with the public and had a significant cultural impact. Yet the method was not reproducible, involved limited rule-following, and lacked a coherent theoretical framework. Lessons from this phenomenon include the first use of the placebo and the first design and implementation of careful, controlled experimentation in a medical context. Another lesson is the power of demonstration and spectacle in communicating medical innovations. Indeed, the underpinnings of science (the reproducibility of a scientific experiment and the rules, know-how, and theories it assures) can be conveyed to the public in just this way.