
We discuss economic environments in which individual choice sets are fixed and the level of a certain parameter that systematically biases the preferences of all agents is determined endogenously to achieve equilibrium. Our equilibrium concept, Biased Preferences Equilibrium, is reminiscent of competitive equilibrium: agents’ choice sets and preferences are independent of the behavior of other agents; the combined choices have to satisfy overall feasibility constraints; and the endogenous adjustment of the equilibrating preference parameter is analogous to the equilibrating adjustment of prices. The concept is applied to a number of economic examples.
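The price-like role of the bias parameter can be sketched in a toy model of my own devising (none of the names or numbers below come from the paper): each agent picks good A exactly when a private taste plus a common bias parameter is positive, and the bias is adjusted, tâtonnement-style, until aggregate demand for A meets a feasibility constraint.

```python
# Hypothetical toy illustration: agent i chooses good A iff v_i + t > 0,
# where v_i is a private taste and t is the common bias parameter.
def demand_for_A(tastes, t):
    return sum(1 for v in tastes if v + t > 0)

def equilibrium_bias(tastes, supply_of_A, lo=-10.0, hi=10.0, tol=1e-9):
    # Bisect on the bias parameter t, analogous to price adjustment:
    # raise t if too few agents choose A, lower it if too many.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if demand_for_A(tastes, mid) < supply_of_A:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

tastes = [-2.0, -1.0, 0.5, 1.5, 3.0]
t_star = equilibrium_bias(tastes, supply_of_A=3)
print(t_star)
```

In this sketch the bias settles at the threshold where exactly the feasible number of agents prefer A, just as a market-clearing price leaves demand equal to supply.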

Within ordinary (unitary) quantum mechanics there exist global protocols that allow one to verify that no definite event (an outcome to which a probability can be associated) occurs: states that start in a coherent superposition over possible outcomes always remain a superposition. We show that, when one takes into account fundamental errors in measuring lengths and time intervals, which have been put forward as a consequence of combining quantum-mechanical and general-relativistic arguments, there are instances in which such global protocols no longer allow one to distinguish whether the state is in a superposition or not. All predictions become identical to those that would hold if one of the outcomes had occurred, with a probability determined by the state. We use this as a criterion to define events, as put forward in the Montevideo Interpretation of Quantum Mechanics. We analyze in detail the occurrence of events in the paradigmatic case of a particle in a superposition of two different locations. We argue that our approach provides a consistent (C), single-world (S) picture of the universe, thus allowing an economical way out of the limitations imposed by a recent theorem of Frauchiger and Renner showing that a self-consistent single-world description of the universe is incompatible with quantum theory. In fact, the main observation of this paper may be stated as follows: if quantum mechanics is extended to include gravitational effects, yielding a theory QG, then QG, S, and C are jointly satisfied.

The CPT theorem states that any causal, Lorentz-invariant, thermodynamically well-behaved quantum field theory must also be invariant under a reflection symmetry that reverses the direction of time (T), flips spatial parity (P), and conjugates charge (C). Although its physical basis remains obscure, CPT symmetry appears to be necessary in order to unify quantum mechanics with relativity. This paper attempts to decipher the physical reasoning behind proofs of the CPT theorem in algebraic quantum field theory. Ultimately, CPT symmetry is linked to a systematic reversal of the C*-algebraic Lie product that encodes the generating relationship between observables and symmetries. In any physically reasonable relativistic quantum field theory it is always possible to systematically reverse this generating relationship while preserving the dynamics, spectra, and localization properties of physical systems. Rather than the product of three separate reflections, CPT symmetry is revealed to be a single global reflection of the theory’s state space.

In this paper, I give a counterexample to a claim made in Norton (2008) that empirically equivalent theories can often be regarded as theoretically equivalent by treating one as having surplus structure, thereby overcoming the problem of underdetermination of theory choice. The case I present is that of Lorentz's ether theory and Einstein's theory of special relativity. I argue that Norton's suggestion that surplus structure is present in Lorentz's theory in the form of the ether state of rest is based on a misunderstanding of the role that the ether plays in Lorentz's theory, and that in general, consideration of the conceptual framework in which a theory is embedded is vital to understanding the relationship between different theories.

I review the philosophical literature on the question of when two physical theories are equivalent. This includes a discussion of empirical equivalence, which is often taken to be necessary, and sometimes taken to be sufficient, for theoretical equivalence; and “interpretational” equivalence, which is the idea that two theories are equivalent just in case they have the same interpretation. It also includes a discussion of several formal notions of equivalence that have been considered in the recent philosophical literature, including (generalized) definitional equivalence and categorical equivalence. The article concludes with a brief discussion of the relationship between equivalence and duality.

Recent philosophical work has praised the reward structure of science, while recent empirical work has shown that many scientific results may not be reproducible. I argue that the reward structure of science incentivizes scientists to focus on speed and impact at the expense of the reproducibility of their work, thus contributing to the so-called reproducibility crisis. I use a rational choice model to identify a set of sufficient conditions for this problem to arise, and I argue that these conditions plausibly apply to a wide range of research situations. Currently proposed solutions will not fully address this problem. Philosophical commentators should temper their optimism about the reward structure of science.

Scientific models need to be investigated if they are to provide valuable information about the systems they represent. Surprisingly, the epistemological question of what enables this investigation has hardly been investigated. Even authors who consider the inferential role of models as central, like Hughes (1997) or Bueno and Colyvan (2011), content themselves with claiming that models contain mathematical resources that provide inferential power. We claim that these notions require further analysis and argue that mathematical formalisms contribute to this inferential role. We characterize formalisms, illustrate how they extend our mathematical resources, and highlight how distinct formalisms offer various inferential affordances.

This survey article discusses two basic issues that semantic theories of questions face. The first is how to conceptualise and formally represent the semantic content of questions. This issue arises in particular because the standard truth-conditional notion of meaning, which has been fruitful in the analysis of declarative statements, is not applicable to questions. The second issue is how questions, when embedded in a declarative statement (e.g., in ‘Bill wonders who called’), contribute to the truth-conditional content of that statement. Several ways in which these issues have been addressed in the literature are discussed and compared.

We analyze the flow into inflation for generic “single-clock” systems by combining an effective field theory approach with a dynamical-systems analysis. In this approach, we construct an expansion for the potential-like term in the effective action as a function of time, rather than specifying a particular functional dependence on a scalar field. We may then identify fixed points in the effective phase space for such systems, order by order, as various constraints are placed on the Mth time derivative of the potential-like function. For relatively simple systems, we find significant probability for the background spacetime to flow into an inflationary state, and for inflation to persist for at least 60 e-folds. Moreover, for systems that are compatible with single-scalar-field realizations, we find a single, universal functional form for the effective potential, V(φ), which is similar to the well-studied potential for power-law inflation. We discuss the compatibility of such dynamical systems with observational constraints.
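For orientation only (this equation is my gloss on the standard literature, not taken from the abstract), the well-studied power-law inflation potential that the universal form is said to resemble is the exponential

```latex
V(\phi) \;=\; V_0 \, \exp\!\left(-\sqrt{\tfrac{2}{p}}\,\frac{\phi}{M_{\mathrm{Pl}}}\right),
```

which in a single-scalar-field setting drives expansion with scale factor \(a(t)\propto t^{p}\), inflationary for \(p>1\).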

Tour guides in your travels jot down Mementos and Keepsakes from each Tour[i]. Their scribblings, which may at times include details and at other times just a word or two, may be modified through the Tour and in response to questions from travelers (so please check back). …

We investigate how epistemic injustice can manifest itself in mathematical practices, approaching the question as both a social-epistemological and a virtue-theoretic investigation of those practices. We delineate the concept both positively (we show that a certain type of folk theorem can be a source of epistemic injustice in mathematics) and negatively, by exploring cases where the obstacles to participation in a mathematical practice do not amount to epistemic injustice. Having explored what epistemic injustice in mathematics can amount to, we use the concept to highlight a potential danger of intellectual enculturation.

Ontological pluralism is the view that there are different ways to exist. It is a position with deep roots in the history of philosophy. For example, Aristotle seemed to endorse it when he said that ‘there are many senses in which a thing may be said to “be”’. Although the view fell out of favour, there has recently been a resurgence of interest, sparked by defences from Kris McDaniel [2009, 2010a, 2010b] and Jason Turner [2010, 2012]. Indeed, while the position may still have relatively few adherents in quite these terms, the influential Fregean approach to higher-order quantification, according to which such quantification is over ‘concepts’ rather than objects, would seem to be an instance of it. In contemporary presentations, the view is stated in terms of fundamental languages: languages whose expressions ‘carve nature at the joints’, or whose meanings are natural in the sense of Lewis [1983, 1986]. Thus stated, it is the claim that such languages have more than one type of quantification, ranging over different domains: for example, ∃a ranging over abstract objects and ∃c ranging over concrete ones.

Kripke [1975] gives a formal theory of truth based on Kleene’s strong evaluation scheme. It is probably the most important and influential such theory yet given, at least since Tarski’s. However, it has been argued that this theory has a problem with generalized quantifiers such as All(φ, ψ), i.e. all φs are ψs, or Most(φ, ψ). Specifically, it has been argued that such quantifiers preclude the existence of just the sort of language that Kripke aims to deliver: one that contains its own truth predicate. In this paper I solve the problem by showing how Kleene’s strong scheme, and Kripke’s theory based on it, can be extended in a natural way to accommodate the full range of generalized quantifiers.
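To make the strong Kleene scheme concrete, here is a small illustrative encoding of my own (not the paper’s formal construction), using `None` for the undefined truth value and evaluating a generalized quantifier All(φ, ψ) over a finite domain:

```python
# Strong Kleene connectives over {True, False, None}, where None marks
# an undefined (gappy) truth value.
def sk_not(a):
    return None if a is None else (not a)

def sk_and(a, b):
    if a is False or b is False:   # one definite False settles conjunction
        return False
    if a is None or b is None:     # otherwise any gap leaves it undefined
        return None
    return True

def sk_all(domain, phi, psi):
    # All(phi, psi): false as soon as some element is definitely phi and
    # definitely not psi; undefined if an unresolved gap could still matter.
    counter = [sk_and(phi(x), sk_not(psi(x))) for x in domain]
    if any(c is True for c in counter):
        return False
    if any(c is None for c in counter):
        return None
    return True
```

On this scheme a single definite counterexample makes All(φ, ψ) false regardless of gaps elsewhere, reflecting the monotonicity that Kripke’s fixed-point construction exploits.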

Computational complexity theory, the theory of tractability and intractability, is defined in terms of limit behavior. A typical question of computational complexity theory has the form: as the size of the input increases, how do the running time and memory requirements of the algorithm change? Computational complexity theory thus, among other things, investigates the scalability of computational problems and algorithms, i.e. it measures the rate at which the required computational resources increase as a problem grows (see, e.g., Arora and Barak, 2009). The implicit assumption here is that the size of the problem is unbounded: models can be of any finite size, formulas can contain any number of distinct variables, and so on.
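The limit-behavior framing can be illustrated with a throwaway sketch (mine, not from the text): count the basic steps of two search strategies as the input size doubles, rather than timing any single fixed input.

```python
def linear_search_steps(n):
    # Worst case for unsorted search: scan all n elements.
    return n

def binary_search_steps(n):
    # Worst case for sorted search: halve the range until one element remains.
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

for n in (1_000, 2_000, 4_000):
    print(n, linear_search_steps(n), binary_search_steps(n))
```

Doubling n doubles the linear count but adds only one halving step; this contrast in growth rates, O(n) versus O(log n), is exactly the kind of statement complexity theory makes precise in the limit.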

We observe that verbs like ‘wonder’ do not just imply that their subject does not know the answer to the embedded question, but a stronger form of ignorance, which we call distributive ignorance. This is not predicted by existing work on the semantics of ‘wonder’, and we argue that it cannot be straightforwardly derived as a pragmatic inference either. We consider two possible semantic accounts, and conclude in favor of one on which the lexical semantics of ‘wonder’ involves exhaustification with respect to structural alternatives as well as subdomain alternatives of its complement.

Despite the intuitive conflict between deterministic laws of nature and objective chances, philosophers have attempted to develop accounts which allow for the compatibility of determinism and chance. I offer an explicit argument for why this compatibility is not possible and also criticize the various notions of deterministic chance supplied by the compatibilists. Many of them are strongly motivated by the existence of objective probabilities in scientific theories with deterministic laws, the most salient of which is classical statistical mechanics. I show that there is no interpretational difficulty here: statistical mechanics is either an indeterministic theory or else its probabilities are not chances—just as incompatibilism demands.

A plausible constraint on normative reasons to act is that it must make sense to use them as premises in deliberation. I argue that a central sort of deliberation—what Bratman calls partial planning—is questiondirected: it is over, and aims to resolve, deliberative questions. Whether it makes sense to use some consideration as a premise in deliberation in a case of partial planning can vary with the deliberative question at issue. I argue that the best explanation for this is that reasons are contrastive, or relativized to deliberative questions.

KK is the thesis that if you can know p, you can know that you can know p. Though it’s unpopular, a flurry of considerations have recently emerged in its favor. Here we add fuel to the fire: standard resources allow us to show that any failure of KK will lead to the knowability and assertability of abominable indicative conditionals of the form ‘If I don’t know it, p’. Such conditionals are manifestly not assertable, a fact that KK defenders can easily explain. We survey a variety of KK-denying responses and find them wanting. Those who object to the knowability of such conditionals must either (i) deny the possibility of harmony between knowledge and belief, or (ii) deny well-supported connections between conditional and unconditional attitudes. Meanwhile, those who grant knowability owe us an explanation of such conditionals’ unassertability, yet no successful explanations are on offer. Upshot: we have new evidence for KK.

After a brief introduction to the issues that plague the realization of a theory of quantum gravity, I suggest that the main one concerns a quantization of the principle of relative simultaneity. This leads me to a distinction between time and space, to a further degree than that present in the canonical approach to general relativity. With this distinction, superpositions are only meaningful as interference between alternative paths in the relational configuration space of the entire Universe. But the full use of relationalism brings us to a timeless picture of Nature, as it does in the canonical approach (which culminates in the Wheeler-DeWitt equation). After a discussion of Parmenides and the Eleatics’ rejection of time, I show that there is a middle ground between their view of absolute timelessness and a view of physics taking place in timeless configuration space. In this middle ground, even though change does not fundamentally exist, the illusion of change can be recovered in a way not permitted by Parmenides. It is recovered through a particular density distribution over configuration space which gives rise to ‘records’. Incidentally, this distribution seems to have the potential to dissolve further aspects of the measurement problem that can still be argued to haunt the application of decoherence to Many Worlds. I end with a discussion indicating that the conflict between the conclusions of this paper and our view of the continuity of the self may still intuitively bother us. Nonetheless, those conclusions should be no more challenging to our intuition than Derek Parfit’s thought experiments on the same subject.

Dawid, DeGroot, and Mortera showed, a quarter century ago, that any agent who regards a fellow agent as a peer (in particular, who defers to the fellow agent’s prior credences in the same way that she defers to her own) and who updates by splitting the difference is prone, on pain of triviality, to diachronic incoherence. On the other hand, one may show that there are special scenarios in which Bayesian updating approximates difference splitting, so it remains an important question whether difference splitting is a viable (approximate) response to “generic” peer update. We look at arguments by two teams of philosophers (Fitelson & Jehle and Nissan-Rozen & Spectre) against difference splitting.
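The tension can be seen in a small numerical sketch of my own (the numbers are illustrative and not from the paper): linearly splitting the difference between two peers’ credences does not commute with Bayesian conditionalization.

```python
# Two exhaustive hypotheses H1, H2 and evidence E with fixed likelihoods.
def update(prior_h1, like_e_h1=0.8, like_e_h2=0.2):
    # Bayes' rule: P(H1|E) = P(H1)P(E|H1) / [P(H1)P(E|H1) + P(H2)P(E|H2)].
    joint_h1 = prior_h1 * like_e_h1
    joint_h2 = (1 - prior_h1) * like_e_h2
    return joint_h1 / (joint_h1 + joint_h2)

a, b = 0.9, 0.1                          # two peers' prior credences in H1
pool_then_update = update((a + b) / 2)   # split the difference, then learn E
update_then_pool = (update(a) + update(b)) / 2   # learn E, then split
print(pool_then_update, update_then_pool)
```

Pooling first and then updating yields 0.8, while updating each peer first and then pooling yields roughly 0.64: an agent committed to both operations ends up with credences that depend on the order in which she applies them, which is the kind of diachronic trouble at issue.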

Putnam (1963) construed the aim of Carnap’s program of inductive logic as the specification of a “universal learning machine,” and presented a diagonal proof against the very possibility of such a thing. Yet the ideas of Solomonoff (1964) and Levin (1970) lead to a mathematical foundation of precisely those aspects of Carnap’s program that Putnam took issue with, and in particular resurrect the notion of a universal mechanical rule for induction. In this paper, I take up the question of whether the Solomonoff-Levin proposal succeeds in this respect. I lay out the general strategy for evading Putnam’s argument, leading to a broader discussion of the outer limits of mechanized induction, and I argue that this strategy ultimately still succumbs to diagonalization, reinforcing Putnam’s impossibility claim.
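Putnam’s diagonal move can be miniaturized in a schematic sketch (my own encoding, not Putnam’s or the paper’s): against any total computable next-bit predictor, define the sequence that does the opposite of the prediction at every step.

```python
# Given any total computable predictor of the next bit (a function from a
# finite prefix to 0 or 1), build a sequence that falsifies it every time.
def diagonal_sequence(predictor, length):
    seq = []
    for _ in range(length):
        seq.append(1 - predictor(seq))   # do the opposite of the forecast
    return seq

# An example predictor: "repeat the previous bit, default 0".
pred = lambda s: s[-1] if s else 0
print(diagonal_sequence(pred, 5))
```

No computable predictor can learn its own diagonal sequence; as the paper discusses, proposals in the Solomonoff-Levin tradition evade this only by relaxing computability requirements, which is where the higher-level diagonalization gets its grip.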

This paper provides and motivates a unified compositional semantics for indicative and counterfactual conditionals that have probability operators in their consequents: ‘probable’, ‘likely’, ‘more likely than not’, ‘50% chance’, and so on. The theory combines a Kratzerian syntax for conditionals with a semantics based on causal Bayes nets, and develops an algorithm for interpreting probabilistic indicatives and counterfactuals on which the key difference between them involves the choice between two operations on causal models: conditioning and intervention. The interaction between probability operators and the two kinds of conditionals sheds new light on the vexed question of what real-world information is retained in counterfactual scenarios and what is discarded, and helps to explain why indicatives are not sensitive to details of causal structure while counterfactuals are. The second part of the paper develops a new empirical challenge for the revision procedure, which emerges from the interaction of counterfactuals and probability operators. The straightforward approach to revision developed in the first part is shown to be inadequate for certain scenarios in which real-world information that is discarded in the counterfactual scenario nevertheless informs the values of other variables. I describe two extensions of the semantics that resolve this puzzle: one based on a more complicated definition of counterfactual probabilities, and one that relies on the related formalism of structural causal models.
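The conditioning/intervention contrast at the heart of the account can be sketched on a toy causal model of my own (a common cause C with two effects A and B; none of the numbers are from the paper):

```python
# Toy causal Bayes net: C -> A and C -> B, with no arrow between A and B.
P_C = 0.5                         # P(C = 1)
def p_effect(c):
    # P(A=1 | C=c) = P(B=1 | C=c): effects track the common cause noisily.
    return 0.9 if c else 0.1

def p_b_given_a1():
    # Conditioning: observing A=1 is evidence about C, hence about B.
    num = sum((P_C if c else 1 - P_C) * p_effect(c) * p_effect(c)
              for c in (0, 1))
    den = sum((P_C if c else 1 - P_C) * p_effect(c) for c in (0, 1))
    return num / den

def p_b_do_a1():
    # Intervention: setting A=1 cuts the arrow from C into A, so B keeps
    # its baseline distribution.
    return sum((P_C if c else 1 - P_C) * p_effect(c) for c in (0, 1))

print(p_b_given_a1(), p_b_do_a1())
```

Observing A = 1 raises the probability of B (to 0.82 here), while intervening to set A = 1 leaves B at its baseline 0.5: the same asymmetry the semantics uses to separate indicatives (conditioning) from counterfactuals (intervention).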

StatSci/PhilSci Museum
Where you are in the Journey* We’ll move from the philosophical ground floor to connecting themes from other levels, from Popperian falsification to significance tests, and from Popper’s demarcation to current-day problems of pseudoscience and irreplication. …

Peer review is one of the linchpins of the social organization of science. Whether as a grant proposal, manuscript, or conference abstract, just about every piece of scientific work passes through peer review, often multiple times. Yet philosophers of science have paid surprisingly little attention to peer review (exceptions include Zollman 2009 and Avin, forthcoming).

In contemporary accounts, counterfactual modality and “backwards” inference are thought to be incompatible. In a counterfactual A > B, the inference from A (and supporting facts) to B cannot be one that runs against the tide of causal dependency, from effect to cause. While backwards conditionals with contrary-to-fact antecedents are acknowledged to exist, they are considered nonstandard, or else thought to constitute a distinct, epistemic reading of the conditional. However, there is evidence from English and other languages that the restriction against backwards inference is not tied to counterfactuality. The same restriction applies when the indicative auxiliary ‘will’ is used to talk about the actual present situation, as when we say ‘Alma will be home right now’. The sentence is infelicitous if inferred from the evidence of an effect, such as Alma’s car being parked in the driveway. We conclude that counterfactuality and anti-abductivity are distinct and dissociable features of modal statements.

I recently developed a novel paradox involving a variety of representational states and activities, and I am wondering if readers might have any thoughts about my ideas here. To illustrate the paradox, I first prove that there are certain contingently true propositions that no one can occurrently believe. …

My new book, “Statistical Inference as Severe Testing: How to Get Beyond the Statistics Wars,” you might have discovered, includes Souvenirs throughout (A–Z). But there are some highlights within sections that might be missed in the excerpts I’m posting. …

Deflationists argue that ‘true’ is merely a logico-linguistic device for expressing blind ascriptions and infinite generalisations. For this reason, some authors have argued that deflationary truth must be conservative, i.e. that a deflationary theory of truth for a theory S (one that interprets a sufficient amount of mathematics, or syntax) must not entail sentences in S’s language that are not already entailed by S. However, it has been forcefully argued that any adequate theory of truth for S must be non-conservative and that, for this reason, truth cannot be deflationary (Shapiro, 1998; Ketland, 1999).

Assume that it is your evidence that determines what opinions you should have. I argue that since you should take peer disagreement seriously, evidence must have two features. (1) It must sometimes warrant being modest: uncertain what your evidence warrants, and thus uncertain whether you’re rational. (2) But it must always warrant being guided: disposed to treat your evidence as a guide. It is surprisingly difficult to vindicate these dual constraints. But diagnosing why this is so leads to a proposal, Trust, that is weak enough to allow modesty but strong enough to yield many guiding features. In fact, I argue that Trust is the Goldilocks principle, for it is necessary and sufficient to vindicate the claim that you should always prefer to use free evidence. Upshot: Trust lays the foundations for a theory of disagreement and, more generally, for an epistemology that permits self-doubt: a modest epistemology.

String theory has not come close to a complete formulation even after half a century of intense research. On the other hand, a number of features of the theory suggest that the theory, once completed, may be a final theory. It is argued in this chapter that those two conspicuous characteristics of string physics are related to each other. What links them together is the fact that string theory has no dimensionless free parameters at a fundamental level. The chapter analyses possible implications of this situation for the long-term prospects of theory building in fundamental physics.