
15583.121566
A number of arguments purport to show that quantum field theory cannot be given an interpretation in terms of localizable particles. We show, in light of such arguments, that the classical ℏ → 0 limit can aid our understanding of the particle content of quantum field theories. In particular, we demonstrate that for the massive Klein-Gordon field, the classical limits of number operators can be understood to encode local information about particles in the corresponding classical field theory.

15666.121795
To make sense of large data sets, we often look for patterns in how data points are “shaped” in the space of possible measurement outcomes. The emerging field of topological data analysis (TDA) offers a toolkit for formalizing the process of identifying such shapes. This paper aims to discover why and how the resulting analysis should be understood as reflecting significant features of the systems that generated the data. I argue that a particular feature of TDA—its functoriality—is what enables TDA to translate visual intuitions about structure in data into precise, computationally tractable descriptions of real-world systems.

71013.121835
This paper generalises Enelow's (J Polit 43(4):1062–1089, 1981) and Lehtinen's (Theory Decis 63(1):1–40, 2007b) model of strategic voting under amendment agendas by allowing any number of alternatives and any voting order. The generalisation enables studying utilitarian efficiencies in an incomplete-information model with a large number of alternatives. Furthermore, it allows for studying how strategic voting affects path-dependence. Strategic voting increases utilitarian efficiency even when there are more than three alternatives. The existence of a Condorcet winner does not guarantee path-independence if the voters engage in strategic voting under incomplete information. A criterion for evaluating path-dependence, the degree of path-dependence, is proposed, and the generalised model is used to study how strategic voting affects it. When there is a Condorcet winner, strategic voting inevitably increases the degree of path-dependence, but when there is no Condorcet winner, strategic voting decreases path-dependence. Computer simulations show, however, that on average it increases the degree of path-dependence.
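The path-dependence at issue is easy to exhibit already in the sincere-voting baseline. The following is a minimal, hypothetical sketch (not the paper's incomplete-information model; all names are illustrative): three sincere voters whose rankings form a Condorcet cycle, run through an amendment agenda, so that the winner depends entirely on the voting order.

```python
from itertools import permutations

# Three sincere voters whose rankings form a Condorcet cycle:
# a beats b, b beats c, and c beats a, each by a 2-1 majority.
profiles = [['a', 'b', 'c'], ['b', 'c', 'a'], ['c', 'a', 'b']]

def majority_winner(x, y):
    """Sincere pairwise majority vote between alternatives x and y."""
    votes_x = sum(1 for ranking in profiles if ranking.index(x) < ranking.index(y))
    return x if votes_x * 2 > len(profiles) else y

def amendment_outcome(agenda):
    """Amendment procedure: each round's winner meets the next alternative."""
    winner = agenda[0]
    for challenger in agenda[1:]:
        winner = majority_winner(winner, challenger)
    return winner

# Every agenda order selects a different winner: maximal path-dependence.
outcomes = {agenda: amendment_outcome(agenda) for agenda in permutations('abc')}
print(outcomes)
```

With a Condorcet winner and sincere voting, by contrast, the amendment procedure selects it under every agenda, so the sincere baseline is path-independent in exactly the case the abstract says strategic voting disrupts.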

71185.121864
The most common argument against the use of rational choice models outside economics is that they make unrealistic assumptions about individual behavior. We argue that whether the falsity of assumptions matters in a given model depends on which factors are explanatorily relevant. Since the explanatory factors may vary from application to application, effective criticism of economic model building should be based on model-specific arguments showing how the result really depends on the false assumptions. However, some modeling results in imperialistic applications are relatively robust with respect to unrealistic assumptions.

71278.121891
Political science and economic science . . . make use of the same language, the same mode of abstraction, the same instruments of thought and the same method of reasoning. (Black 1998, 354) Proponents as well as opponents of economics imperialism agree that imperialism is a matter of unification: providing a unified framework for social scientific analysis. Uskali Mäki distinguishes between derivational and ontological unification and argues that the latter should serve as a constraint on the former. We explore whether, in the case of rational-choice political science, self-interested behavior can be seen as a common causal element and solution concepts as the common derivational element, and whether the former constrains the use of the latter. We find that this is not the case. Instead, what is common to economics and rational-choice political science is a set of research heuristics and a focus on institutions with similar structures and forms of organization.

71321.121918
This paper examines the welfare consequences of strategic voting under the Borda rule in a comparison of utilitarian efficiencies in simulated voting games under two behavioural assumptions: expected utility-maximising behaviour and sincere behaviour. Utilitarian efficiency is higher in the former than in the latter. Strategic voting increases utilitarian efficiency particularly if the distribution of preference intensities correlates with voter types. The Borda rule is shown to have two advantages: strategic voting is beneficial even if some but not all voter types engage in strategic behaviour, and even if the voters’ information is based on unreliable signals.

71506.121944
This paper reconsiders the discussion on ordinal utilities versus preference intensities in voting theory. It is shown by way of an example that arguments concerning observability and risk attitudes that have been presented in favour of Arrow’s Independence of Irrelevant Alternatives (IIA), and against utilitarian evaluation, fail due to strategic voting. The failure of these two arguments is then used to justify utilitarian evaluation of outcomes in voting. Given a utilitarian viewpoint, it is then argued that strategy-proofness is not normatively acceptable. Social choice theory is criticised not just by showing that some of its most important conditions are not normatively acceptable, but also by showing that the very idea of imposing conditions on social choice functions under the assumption of sincere behaviour does not make much sense, because satisfying a condition does not guarantee that a voting rule actually has the properties that the condition confers on it under sincere behaviour. IIA, the binary intensity IIA, and monotonicity are used as illustrations of this phenomenon.

73367.121971
The distinguishability between pairs of quantum states, as measured by quantum fidelity, is formulated on phase space. The fidelity is physically interpreted as the probability that the pair are mistaken for each other upon a measurement. The mathematical representation is based on the concept of symplectic capacity in symplectic topology. The fidelity is the absolute square of the complex-valued overlap between the symplectic capacities of the pair of states. The symplectic capacity for a given state, onto any conjugate plane of degrees of freedom, is postulated to be bounded from below by the Gromov width h/2. This generalizes the Gibbs-Liouville theorem in classical mechanics, which states that the volume of a region of phase space is invariant under the Hamiltonian flow of the system, by constraining the shape of the flow. It is shown that for closed Hamiltonian systems, the Schrödinger equation is the mathematical representation of the conservation of fidelity.

73429.121997
The measurement problem is addressed from the viewpoint that it is the distinguishability between the state preparation and its quantum ensemble, i.e. the set of states with which it has a nonzero overlap, that is at the heart of the difference between classical and quantum measurements. The measure for the degree of distinguishability between pairs of quantum states, i.e. the quantum fidelity, is for this purpose generalized, by the application of the superposition principle, to the setting where there exists an arbitrary-dimensional quantum ensemble.

74555.122023
Many years ago, I was climbing Sgùrr na Banachdich with my friend Alex. It's a mountain in the Black Cuillin, a horseshoe of summits that surround Loch Coruisk at the southern end of the Isle of Skye. …

133091.122051
Models of decision-making under uncertainty gain much of their power from the specification of states so as to resolve all uncertainty. However, this specification can undermine the presumed observability of preferences on which axiomatic theories of decision-making are based. We introduce the notion of a contingency. Contingencies need not resolve all uncertainty, but preferences over functions from contingencies to outcomes are (at least in principle) observable. In sufficiently simple situations, states and contingencies coincide. In more challenging situations, the analyst must choose between sacrificing observability in order to harness the power of states that resolve all uncertainty, or preserving observability by working with contingencies.

133134.122101
In Part I of this paper, we identified and compared various schemes for trivalent truth conditions for indicative conditionals, most notably the proposals by de Finetti (1936) and Reichenbach (1935, 1944) on the one hand, and by Cooper (Inquiry, 11, 295–320, 1968) and Cantwell (Notre Dame Journal of Formal Logic, 49, 245–260) on the other. Here we provide the proof theory for the resulting logics DF/TT and CC/TT, using tableau calculi and sequent calculi, and proving soundness and completeness results. Then we turn to the algebraic semantics, where both logics have substantive limitations: DF/TT allows for algebraic completeness, but not for the construction of a canonical model, while CC/TT fails the construction of a Lindenbaum-Tarski algebra. With these results in mind, we draw up the balance and sketch future research projects.

133158.122133
The notion of grounding is usually conceived as an objective and explanatory relation. It connects two relata if one—the ground—determines or explains the other—the consequence. In the contemporary literature on grounding, much effort has been devoted to logically characterizing the formal aspects of grounding, but a hard problem remains: defining suitable grounding principles for universal and existential formulae. Indeed, several grounding principles for quantified formulae have been proposed, but all of them are exposed to paradoxes in some very natural contexts of application. We introduce in this paper a first-order formal system that captures the notion of grounding and avoids the paradoxes in a novel and non-trivial way. The system we present formally develops Bolzano’s ideas on grounding by employing Hilbert’s ε-terms and an adapted version of Fine’s theory of arbitrary objects.

133267.122163
Cheap talk has often been thought incapable of supporting the emergence of cooperation because costless signals, easily faked, are unlikely to be reliable (Zahavi and Zahavi, 1997). I show how, in a social network model of cheap talk with reinforcement learning, cheap talk does enable the emergence of cooperation, provided that individuals also temporally discount the past. This establishes one mechanism that suffices for moving a population of initially uncooperative individuals to a state of mutually beneficial cooperation even in the absence of formal institutions.
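The learning mechanism named here (reinforcement learning with temporal discounting of the past) can be illustrated with a generic Roth-Erev-style rule. This is only a hedged sketch of that mechanism, not the paper's network model; the function names, actions, and parameter values are all illustrative.

```python
import random

def choose(weights, rng=random):
    """Sample an action with probability proportional to its accumulated weight."""
    total = sum(weights.values())
    r = rng.uniform(0, total)
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return action  # fallback for floating-point edge cases

def reinforce(weights, action, payoff, discount=0.9):
    """Discount all past weights, then reward the chosen action.
    Discounting makes recent experience dominate, which is what lets a
    population escape a convention entrenched by early uncooperative play."""
    for a in weights:
        weights[a] *= discount
    weights[action] += payoff

# One update step: a cooperative success quickly outweighs the discounted past.
weights = {'cooperate': 1.0, 'defect': 1.0}
reinforce(weights, 'cooperate', payoff=2.0, discount=0.5)
print(weights)  # {'cooperate': 2.5, 'defect': 0.5}
```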

133271.122191
This paper examines two questions about scientists’ search for knowledge. First, which search strategies generate discoveries effectively? Second, is it advantageous to diversify search strategies? We argue, pace Weisberg and Muldoon (2009), that on the first question a search strategy that deliberately seeks novel research approaches need not be optimal. On the second question, we argue that they have not shown that there are epistemic reasons for the division of cognitive labor, and we identify the errors that led to their conclusions. Furthermore, we generalize the epistemic landscape model, showing that one should be skeptical about the benefits of social learning in epistemically complex environments.

133310.122218
As many Western countries emerged from initial periods of lockdown in spring 2020, they had brought COVID-19 infection rates down significantly. This was followed, however, by more drastic second and third waves of viral spread, which many of these same countries are struggling to bring under control, even with the implementation of further periods of lockdown. Could this have been prevented by policymakers? We revisit two strategies that were the focus of much discussion during the early stages of the pandemic, and which were implemented in several Western countries, albeit in a weakened form. These strategies both proceed by targeting certain segments of the population, while allowing others to go about their lives unhindered. The first suggests selectively isolating those who would most likely suffer severe adverse effects if infected – in particular the elderly. The second involves identifying and quarantining those who are likely to be infected through a contact tracing app that would centrally store users’ information. We suggest that both strategies showed promise in preventing the need for further lockdowns, albeit in a significantly more stringent form than anything that was implemented in Western countries. We then proceed to an ethical evaluation of these more stringent policies. We contend that selective isolation strategies face severe ethical problems due to their discriminatory nature, while the ethical issues with a more aggressive contact tracing regime can be mitigated. This analysis has implications for how to respond effectively and ethically to future pandemics, and perhaps contains lessons on how to successfully emerge from our current predicament.

133319.122249
In the 1920s, Ackermann and von Neumann, in pursuit of Hilbert’s Programme, were working on consistency proofs for arithmetical systems. One proposed method of giving such proofs is Hilbert’s epsilon-substitution method. There was, however, a second approach which was not reflected in the publications of the Hilbert school in the 1920s, and which is a direct precursor of Hilbert’s first epsilon theorem and a certain ‘general consistency result’ due to Bernays. An analysis of the form of this so-called ‘failed proof’ sheds further light on an interpretation of Hilbert’s Programme as an instrumentalist enterprise with the aim of showing that whenever a ‘real’ proposition can be proved by ‘ideal’ means, it can also be proved by ‘real’, finitary means.

188797.122275
On the basis of a coherently applied physicalist ontology, I will argue that there is nothing conceptual in logic and mathematics. What we usually call “mathematical concepts”—from the most exotic ones to the most “evident” ones—are just names tagged to various elements of mathematical formalism. In fact they have nothing to do with concepts, as they have nothing to do with the actual things; they can be completely ignored by both philosophy and physics.

188936.122301
In this note we provide a concise report on the complexity of the causal ordering problem, originally introduced by Simon to reason about causal dependencies implicit in systems of mathematical equations. We show that Simon’s classical algorithm to infer causal ordering is NP-hard—an intractability previously conjectured but never proven. We then present a detailed account based on Nayak’s suggested algorithmic solution (the best available), which is dominated by computing transitive closure—bounded in time by O(V·S), where the input system structure comprises a set E of equations over a set V of variables and S is the number of variable appearances (the density). We also comment on the potential of causal ordering for emerging applications in large-scale hypothesis management and analytics. Keywords: causal ordering, causal reasoning, structural equations, hypothesis management.
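Since the dominant cost in the approach described is the transitive-closure computation, a minimal illustration may help fix ideas. This is a generic sketch using Warshall's classical O(V³) algorithm, not Nayak's algorithm itself; the dependency graph is a made-up example.

```python
def transitive_closure(n, edges):
    """Warshall's algorithm: reachability matrix for a directed graph
    on vertices 0..n-1, given as a set of (u, v) edges."""
    reach = [[False] * n for _ in range(n)]
    for u, v in edges:
        reach[u][v] = True
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return reach

# A dependency chain 0 -> 1 -> 2: the closure adds the derived edge 0 -> 2,
# the kind of indirect causal dependency a causal ordering must expose.
closure = transitive_closure(3, {(0, 1), (1, 2)})
print(closure[0][2])  # True
```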

195451.122327
Economic policy evaluations require social welfare functions for variable-size populations. Two important candidates are critical-level generalized utilitarianism (CLGU) and rank-discounted critical-level generalized utilitarianism, which was recently characterized by Asheim and Zuber (2014) (AZ). AZ introduce a novel axiom, existence of egalitarian equivalence (EEE). First, we show that, under some uncontroversial criteria for a plausible social welfare relation, EEE suffices to rule out the Repugnant Conclusion of population ethics (without AZ’s other novel axioms). Second, we provide a new characterization of CLGU: AZ’s set of axioms is equivalent to CLGU when EEE is replaced by the axiom of same-number independence.

371621.122352
This paper presents challenge cases for prominent pragmatic responses to the proviso problem. The proviso problem (Geurts 1996, 1999) is the problem for many theories of presupposition of explaining why sentences predicted to semantically presuppose ψ ⊃ P seem in certain uses to commit the speaker to an unconditional presupposition P — for instance, why a use of (1) would typically commit the speaker not merely to (1a), but to the logically stronger (1b).

381475.122378
Every beginning real analysis student learns the classic Heine-Borel theorem: the interval [0, 1] is compact. The standard proof involves techniques such as constructing a sequence and appealing to the completeness of the reals (which some may find unsatisfying). In this article, we present a different perspective by showing how the Heine-Borel theorem can be derived from a few fundamental results in mathematical logic. In particular, we put an ultrametric on the space of infinite binary sequences. Compactness of this space can be established from Brouwer’s fan theorem. This result can be derived either from Kőnig’s infinity lemma or from Gödel’s compactness theorem in model theory. The Heine-Borel theorem is an immediate corollary. This illustrates an interesting connection between the fundamental yet different notions of compactness in analysis and compactness in logic.
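The metric in question can be written down directly: two binary sequences are close when they agree on a long initial segment. A small sketch on finite prefixes (standing in for infinite sequences; the names are illustrative) also lets one check the strong triangle inequality d(x, z) ≤ max(d(x, y), d(y, z)) that makes this an ultrametric.

```python
from itertools import product

def ultrametric(x, y):
    """d(x, y) = 2^(-n), where n is the index of the first bit on which
    x and y disagree; d = 0 if they agree everywhere."""
    for n, (a, b) in enumerate(zip(x, y)):
        if a != b:
            return 2.0 ** (-n)
    return 0.0

# Verify the strong (ultrametric) triangle inequality on all length-4 prefixes.
prefixes = list(product([0, 1], repeat=4))
assert all(
    ultrametric(x, z) <= max(ultrametric(x, y), ultrametric(y, z))
    for x in prefixes for y in prefixes for z in prefixes
)
print(ultrametric((0, 1, 1, 0), (0, 1, 0, 0)))  # 0.25: first disagreement at index 2
```

Open balls in this metric are exactly the cylinder sets that fix a finite prefix, which is what makes the finitely-branching-tree arguments (fan theorem, Kőnig's lemma) applicable.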

381514.122404
For the uninitiated, the dense nature of mathematical language can act as an obscuring force. With this essay we aim to bring two classical results of discrete mathematics into the light. To this end we analyze winning strategies in a certain class of solitaire games. The gains are nonstandard proofs of the results of Kőnig [3] and Vizing [7]. For the standard treatment of these results, see [6]. (For a dense and obscure version of the nonstandard proofs presented here, see [4].) First, let’s introduce the games.

381553.12243
In Rabern and Rabern (2008) we presented a two question solution to ‘the hardest logic puzzle ever’ (as presented in Boolos (1996)), which relied on self-referential questions. In this note we respond to several worries related to this solution. We clarify our claim that some yes-no questions cannot be answered by the gods and thus that asking such questions of the gods will result in head explosion. We argue that the inclusion of exploding head possibilities is neither cheating nor ad hoc but is instead forced upon us by principles related to Tarski’s theorem. We also respond to concerns that have been raised about our use of self-referential questions in support of the two question solution. In particular, we address the worry that there is a revenge problem lurking, which is analogous to revenge problems that arise for purported solutions to the liar paradox. And we make some further observations about the relationship between self-referential questions, truth-telling gods and the semantic paradoxes. In the appendix we give a two question solution to the modified puzzle (where Random randomly answers ‘ja’ or ‘da’).

381601.122458
The semantic paradoxes are often associated with self-reference or referential circularity. However, Yablo has shown in [2] that there are infinitary versions of the paradoxes that do not involve this form of circularity. It remains an open question what relations of reference between collections of sentences afford the structure necessary for paradoxicality. In [1] we laid the groundwork for a general investigation into the nature of reference structures that support the semantic paradoxes. The remaining task is to classify the so-called dangerous directed graphs. In appendix A of [1], we sketched a reformulation of the problem in terms of fixed points of certain functions. Here we expand on this reformulation, removing all syntactic considerations to get a purely mathematical problem. It is definitely possible that the problem’s solution depends on the axioms of set theory we choose—this would be an interesting outcome.

386170.122485
On an influential line of thinking tracing back to Ramsey, conditionals are closely linked to the attitude of supposition. When applied to counterfactuals, this view suggests a subjunctive version of the so-called Ramsey test: the probability of a counterfactual If A, would B ought to be equivalent to the probability of B, under the subjunctive supposition that A. I present a collapse result for any view that endorses the subjunctive version of the Ramsey test. Starting from plausible assumptions, the result shows that one’s rational credence in a would-counterfactual and in the corresponding might-counterfactual have to be identical.

444297.122511
Should a scientist rely on methodological triangulation? Heesen et al. (Synthese 196(8):3067–3081, 2019) recently provided a convincing affirmative answer. However, their approach requires belief gambles if the evidence is discordant. We instead propose epistemically modest triangulation (EMT), according to which one should withhold judgement in such cases. We show that for a scientist in a methodologically diffident situation the expected utility of EMT is greater than that of Heesen et al.’s (2019) triangulation or that of using a single method. We also show that EMT is more appropriate for increasing epistemic trust in science. In short: triangulate, but do not gamble with evidence.

557946.122537
It is intuitive to say that persons have infinite value, and recently Rasmussen and Bailey have given some cool arguments for this thesis. But what does it mean to say that humans have infinite value? …

560366.122564
I investigate the extent to which perspectival realism (PR) agrees with frequentist statistical methodology and philosophy, with an emphasis on J. Neyman’s views. Based on the example of the stopping rule problem I argue that PR can naturally be associated with frequentist statistics. Then I analyze Neyman’s conception of statistical inference to conclude that PR and Neyman’s conception are incongruent. Additionally, I show that Neyman’s philosophy is internally inconsistent. I conclude that Neyman’s frequentism weakens the philosophical validity and universality of PR as analyzed from the point of view of statistical methodology.

560431.122595
Penelope Maddy’s Second Philosophy is one of the most well-known approaches in recent philosophy of mathematics. She applies her second-philosophical method to analyze mathematical methodology by reconstructing historical cases in a setting of means-ends relations. However, outside of Maddy’s own work, this kind of methodological analysis has not yet been extensively used and analyzed. In the present work, we will make a first step in this direction. We develop a general framework that allows us to clarify the procedure and aims of the Second Philosopher’s investigation into set-theoretic methodology; provides a platform to analyze the Second Philosopher’s methods themselves; and can be applied to further questions in the philosophy of set theory.