
All of us engage in and make use of valid reasoning, but the reasoning
we actually perform differs in various ways from the inferences
studied by most (formal) logicians. Reasoning as performed by human
beings typically involves information obtained through more than one
medium. Formal logic, by contrast, has thus far been primarily
concerned with valid reasoning which is based on information in one
form only, i.e., in the form of sentences. Recently, many
philosophers, psychologists, logicians, mathematicians, and computer
scientists have become increasingly aware of the importance of
multimodal reasoning and, moreover, much research has been undertaken
in the area of nonsymbolic, especially diagrammatic, representation
systems.[1]
This entry outlines the overall directions of this new research area
and focuses on the logical status of diagrams in proofs, their
representational function and adequacy, different kinds of
diagrammatic systems, and the role of diagrams in

Decisions are typically about outcomes that happen later in time. As such they demand comparisons of the value of outcomes now versus outcomes later. Should I buy a new car or save for retirement? Have the last piece of cake tonight or tomorrow? Lower carbon emissions now or suffer greater loss later? Intertemporal decisions have triggered hundreds of studies across many fields. Popular subjects include personal finances, addiction, nutrition, health, marketing, and environmental conservation. In many of these decisions we tend to exhibit what is called a positive time preference; that is, all else being equal, we prefer positive goods, experiences, and states of affairs to be delivered sooner rather than later. Sweets delivered to me tomorrow aren’t as valuable to me as sweets I can eat today. Descriptive and normative inquiries tackle how we make intertemporal comparisons of utility in such cases and how we should. The present paper is about the second issue, the normative question that asks how we ought to translate future utility into present utility. My focus is restricted to individuals and not societies. I want to challenge the conventional wisdom dominating the social sciences and philosophy regarding temporal discounting, the practice of discounting the value of future utility.
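To make concrete why the choice of discounting schema matters normatively, here is a minimal sketch of my own (the functional forms are the standard textbook ones, and all parameter values are illustrative assumptions, not the paper's): exponential discounting is time-consistent, whereas hyperbolic discounting produces the preference reversals often cited in the descriptive literature.

```python
def exponential_discount(utility, delay, delta=0.9):
    """Present value under exponential discounting: u * delta**t."""
    return utility * delta ** delay

def hyperbolic_discount(utility, delay, k=0.5):
    """Present value under hyperbolic discounting: u / (1 + k*t)."""
    return utility / (1 + k * delay)

# Positive time preference: the same good is worth less when delayed.
# exponential_discount(100, 0) -> 100.0; exponential_discount(100, 1) -> 90.0

# Hyperbolic discounters reverse their preferences as delays grow:
# 100 units now beats 120 units in two periods, but 100 units in ten
# periods loses to 120 units in twelve periods.
sooner_near = hyperbolic_discount(100, 0)    # 100.0
later_near = hyperbolic_discount(120, 2)     # 60.0
sooner_far = hyperbolic_discount(100, 10)    # ~16.7
later_far = hyperbolic_discount(120, 12)     # ~17.1
```

Under the exponential schema the same pair of options is ranked the same way at every delay, which is why time-consistency arguments favor it normatively.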

This is a discussion of Delia Fara’s theory of vagueness, and of its solution to the sorites paradox, criticizing some of the details of the account, but agreeing that its central insight will be a part of any solution to the problem. I also consider a wider range of philosophical puzzles that involve arguments that are structurally similar to the argument of the sorites paradox, and argue that the main ideas of her account of vagueness help to respond to some of those puzzles.

This paper is a tribute to Delia Graff Fara. It extends her work on failures of metarules (conditional proof, RAA, contraposition, disjunction elimination) for validity as truth-preservation under a supervaluationist identification of truth with supertruth. She showed that such failures occur even in languages without special vagueness-related operators, for standards of deductive reasoning as materially rather than purely logically good, depending on a context-dependent background. This paper extends her argument to quantifier metarules like existential elimination, to ambiguity, and to deliberately vague standard mathematical notation. Supervaluationist attempts to qualify the metarules impose unreasonable cognitive demands on reasoning and underestimate her challenge.

A number of philosophers have attempted to solve the problem of null-probability possible events in Bayesian epistemology by proposing that there are infinitesimal probabilities. Hajek (2003) (more tentatively) and Easwaran (2014) (more definitively) have argued that because there is no way to specify a particular hyperreal extension of the real numbers, solutions to the regularity problem involving infinitesimals, or at least hyperreal infinitesimals, involve an unsatisfactory ineffability or arbitrariness. The arguments depend on the alleged impossibility of picking out a particular hyperreal extension of the real numbers and/or of a particular value within such an extension due to the use of the Axiom of Choice. However, it is false that the Axiom of Choice precludes a specification of a hyperreal extension—such an extension can indeed be specified. Moreover, for all we know, it is possible to explicitly specify particular infinitesimals within such an extension. Nonetheless, I prove that because any regular probability measure that has infinitesimal values can be replaced by one that has all the same intuitive features but other infinitesimal values, the heart of the arbitrariness objection remains.

The clock hypothesis is taken to be an assumption, independent of special relativity, that is necessary to describe accelerated clocks; it enables one to equate the time read off by a clock with the proper time. Here we consider a physical system, the light clock, proposed by Marzke and Wheeler. Recently, Fletcher proved a theorem showing that a sufficiently small light clock has a time reading that approximates the proper time to an arbitrary degree, and the clock hypothesis is not necessary to arrive at this result. We explore the consequences of this result for the status of the clock hypothesis and argue that there is no need for it in the special theory of relativity.
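The content of the result can be illustrated numerically. Proper time along a worldline is the integral of dτ = √(1 − v(t)²/c²) dt, and nothing about acceleration enters the formula beyond the velocity profile itself. The sketch below is my own illustration (units with c = 1; it is not Fletcher's construction), integrating that formula for an inertial and a uniformly accelerated clock.

```python
import math

C = 1.0  # speed of light in natural units (illustrative choice)

def proper_time(v, t_end, n=100_000):
    """Midpoint-rule integration of dtau = sqrt(1 - v(t)^2 / c^2) dt."""
    dt = t_end / n
    tau = 0.0
    for i in range(n):
        t = (i + 0.5) * dt
        tau += math.sqrt(1.0 - (v(t) / C) ** 2) * dt
    return tau

# Inertial clock at 0.6c: tau = 0.8 * t, the familiar time dilation.
tau_inertial = proper_time(lambda t: 0.6, 10.0)          # ~8.0

# Uniform proper acceleration: v(t) = t / sqrt(1 + t^2), for which
# tau = asinh(t) in closed form; the very same integral handles it.
tau_accelerated = proper_time(lambda t: t / math.sqrt(1.0 + t * t), 10.0)
```

The accelerated case needs no further hypothesis: the same integral that encodes time dilation for inertial motion already fixes the reading of the accelerated clock.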

The use of unrealistic assumptions in Economics is usually defended not only for pragmatic reasons, but also because of the intrinsic difficulties in determining the degree of realism of assumptions. Additionally, the criterion used for evaluating economic models is associated with their ability to provide accurate predictions.

According to Kratzer’s influential account (1981; 1991; 2012), epistemic must and might involve quantification over domains of possibilities determined by a modal base and an ordering source. Recently, this account has been challenged by invoking contexts of ‘epistemic tension’: i.e., cases in which an assertion that must φ is conjoined with the possibility that ¬φ, and cases in which speakers try to downplay a previous assertion that must φ, after finding out that ¬φ. Epistemic tensions have been invoked from two directions. von Fintel and Gillies (2010) propose a return to a simpler modal logic-inspired account: must and might still involve universal and existential quantification, but the domains of possibilities are determined solely by realistic modal bases. In contrast, Lassiter (2016), following Swanson (2006, 2011), proposes a more revisionary account which treats must and might as probabilistic operators. In this paper, we present a series of experiments to obtain reliable data on the degree of acceptability of different contexts of epistemic tensions. Our experiments include novel variations that, we argue, are required to make progress in this debate. We show that restricted quantificational accounts à la Kratzer (e.g., Kratzer, 1981, 2012; Roberts, 2015; Giannakidou and Mari, 2016) fit the overall pattern of results better than either of their recent competitors. In addition, our results help us identify which components of restricted quantificational accounts are crucial for their success, and on that basis propose some general constraints that should be satisfied by all candidate accounts of the modal auxiliaries.

The problem of the reduction of chemistry to physics has been traditionally addressed in terms of classical structural chemistry and standard quantum mechanics. In this work, we will study the problem from the perspective of the Quantum Theory of Atoms in Molecules (QTAIM), proposed by Richard Bader in the nineties. The purpose of this article is to unveil the role of QTAIM in the intertheoretical relations between chemistry and physics. We argue that, although the QTAIM solves two relevant obstacles to reduction by providing a rigorous definition of chemical bond and of atoms in a molecule, it appeals to concepts that are unacceptable in the quantum-mechanical context. Therefore, the QTAIM fails to provide the desired reduction. On the other hand, we will show that the QTAIM is more similar to Bohmian mechanics than to standard quantum mechanics, and that the basic elements of both theories are closely related.

This paper introduces some basic ideas and formalism of physics in noncommutative geometry. It is a draft (written back in 2011) of a chapter of Out of Nowhere, a book on quantum gravity that I am coauthoring with Christian Wuthrich. Although it has long been suggested that quantizing gravity – imposing canonical commutations in some way – will lead to the coordinate commutation relations of noncommutative geometry, there is no known formal requirement that this be so. Nevertheless, such relations do show up in theories of quantum gravity, for instance as the result of a possible Planck scale nonlocality in the interactions of the D-branes of string theory.
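For orientation, the coordinate commutation relations alluded to above take, in the simplest ("canonical") case, the standard textbook form (this is the generic form, not anything specific to the draft chapter):

```latex
% Canonical noncommutative coordinates: \theta is a constant
% antisymmetric matrix, in direct analogy with [\hat{x}, \hat{p}] = i\hbar.
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
\qquad \theta^{\mu\nu} = -\theta^{\nu\mu}
```

Just as the position-momentum commutator forbids sharp joint values of x and p, a nonzero θ forbids sharp joint values of the coordinates themselves, which is the sense in which spacetime points become "fuzzy".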

The locus classicus of the philosophical literature on the hole argument is the 1987 paper by Earman & Norton [“What Price Spacetime Substantivalism? The Hole Story” Br. J. Phil. Sci.]. This paper has a well-known backstory, concerning work by Stachel and Norton on Einstein’s thinking in the years 1913-15. Less well-known is a connection between the hole argument and Earman’s work on Leibniz in the 1970s and 1980s, which in turn can be traced to an argument first presented in 1975 by Howard Stein. Remarkably, this thread originates with a misattribution: the argument Earman attributes to Stein, which ultimately morphs into the hole argument, was not the argument Stein gave. The present paper explores this episode and presents some reflections on how it bears on the subsequent literature.

In their article on singularities and black holes in the Stanford Encyclopedia of Philosophy, Peter Bokulich and Erik Curiel raise a series of important philosophical questions regarding black holes, including the following: “Black holes appear to be crucial for our understanding of the relationship between matter and spacetime. . . . when matter forms a black hole, it is transformed into a purely gravitational entity. When a black hole evaporates, spacetime curvature is transformed into ordinary matter. Thus black holes offer an important arena for investigating the ontology of spacetime and ordinary objects.” [1] This paper aims to address these issues in the context of string theoretic models of black holes, with the aim of illuminating the ontological unification of gravity and matter, and the interpretation of cosmological models, within string theory. §1 will describe the central concepts of the theory: the fungibility of matter and geometry, and the reduction of gravity and supergravity. The ‘standard’ interpretation presented draws on that implicit in the thinking of many (but not all) string theorists, though made more explicit and systematic than usual. §2 will explain how to construct a stringy black hole, and some of its features, including evaporation. §3 will critically examine the assumptions behind such modeling, and their bearing on Curiel and Bokulich’s ontological questions.

The NP genie
Hi from the Q2B conference! Every nerd has surely considered the scenario where an all-knowing genie—or an enlightened guru, or a superintelligent AI, or God—appears and offers to answer any question of your choice. …

Just as a theory of representation is deficient if it can’t explain how misrepresentation is possible, a theory of computation is deficient if it can’t explain how miscomputation is possible. You might expect, then, that philosophers of computation have well-worked-out theories of miscomputation. But you’d be wrong. They have generally ignored miscomputation. My primary goal in this paper is to clarify both what miscomputation is and what needs to be accomplished in order to adequately explain it. Miscomputation is a special kind of malfunction. If the battery breaks, a system may fail to compute what it is supposed to compute. But it’s not miscomputing, because it’s not computing at all. Just as something doesn’t misrepresent unless it represents, something doesn’t miscompute unless it computes. To miscompute is to compute in a way that violates a computational norm. Consequently, an adequate account of miscomputation requires an account of what the system is computing when the system is violating the relevant computational norms.

John Stuart Mill’s A System of Logic, Ratiocinative and Inductive, being a connected view of the principles of evidence, and the methods of scientific investigation was the most popular and influential treatment of scientific method throughout the second half of the 19th century. As is well-known, there was a radical change in the view of probability endorsed between the first and second editions. There are three different conceptions of probability interacting throughout the history of probability: (1) Chance, or Propensity — for example, the bias of a biased coin. (2) Judgmental Degree of Belief — for example, the degree of belief one should have that the bias is between .6 and .7 after 100 trials that produce 81 heads. (3) Long-Run Relative Frequency — for example, proportion of heads in a very large, or even infinite, number of flips of a given coin.
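The three conceptions can be put side by side computationally using the coin example above. This is a sketch of my own (the grid approximation and the simulation are illustrative devices, not anything in Mill): stipulate a propensity, compute a degree of belief about it from data, and check the long-run frequency it generates.

```python
import random

heads, flips = 81, 100

# (2) Judgmental degree of belief that the bias lies in (0.6, 0.7),
# under a uniform prior, via a fine grid approximation of the posterior.
grid = [i / 10_000 for i in range(1, 10_000)]
weights = [p ** heads * (1.0 - p) ** (flips - heads) for p in grid]
total = sum(weights)
belief = sum(w for p, w in zip(grid, weights) if 0.6 < p < 0.7) / total
# With 81 heads in 100 flips the posterior concentrates near 0.81,
# so this particular degree of belief comes out small.

# (1) Chance, or propensity: a property of the coin itself, stipulated here.
bias = 0.81

# (3) Long-run relative frequency: proportion of heads in many flips.
rng = random.Random(0)
freq = sum(rng.random() < bias for _ in range(100_000)) / 100_000
```

The frequency in (3) hovers near the propensity in (1), while (2) is a claim about an agent's credences given evidence: three numerically related but conceptually distinct quantities.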

Quantum set theory (QST) and topos quantum theory (TQT) are two long-running projects in the mathematical foundations of quantum mechanics that share a great deal of conceptual and technical affinity. Most pertinently, both approaches attempt to resolve some of the conceptual difficulties surrounding quantum mechanics by reformulating parts of the theory inside of nonclassical mathematical universes, albeit with very different internal logics. We call such mathematical universes, together with those mathematical and logical structures within them that are pertinent to the physical interpretation, ‘Q-worlds’. Here, we provide a unifying framework that allows us to (i) better understand the relationship between different Q-worlds, and (ii) define a general method for transferring concepts and results between TQT and QST, thereby significantly increasing the expressive power of both approaches. Along the way, we develop a novel connection to paraconsistent logic and introduce a new class of structures that have significant implications for recent work on paraconsistent set theory.

First-order model theory, also known as classical model theory, is a
branch of mathematics that deals with the relationships between
descriptions in first-order languages and the structures that satisfy
these descriptions. From one point of view, this is a vibrant area of
mathematical research that brings logical methods (in particular the
theory of definition) to bear on deep problems of classical
mathematics. From another point of view, first-order model theory is
the paradigm for the rest of
model theory;
it is the area in which many of the broader ideas of model
theory were first worked out.

The surprisingly high reliability of Wikipedia has often been seen as a beneficial effect of the aggregation of diverse contributors, or as an instance of the wisdom of crowds phenomenon; additional factors such as elite contributors, Wikipedia’s policy or its administration have also been mentioned. We adjudicate between such explanations by modelling and simulating the evolution of a Wikipedia entry. The main threat to Wikipedia’s reliability, namely the presence of epistemically disruptive agents such as disinformers and trolls, turns out to be offset only by a combination of factors: Wikipedia’s administration and the possibility to instantly revert entries, both of which are insufficient when considered in isolation. Our results suggest that the reliability of Wikipedia should receive a pluralist explanation, involving factors of different kinds.
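In the spirit of the simulations described, a toy model makes the "combination of factors" point concrete. All mechanisms and parameter values below are invented for illustration and do not reproduce the paper's calibration: quality of a single entry evolves under good-faith edits and disruptive ones, with instant reversion and administrative bans as optional mechanisms.

```python
import random

def simulate(rounds=2000, p_troll=0.3, revert=True, admin=True, seed=1):
    """Toy Wikipedia entry: quality in [0, 1], edited once per round.

    Good-faith editors add small improvements; disruptive agents
    (disinformers, trolls) damage the entry. `revert` models instant
    reversion of most damaging edits; `admin` models occasional
    banning of disruptive editors, shrinking their share over time.
    """
    rng = random.Random(seed)
    quality, troll_share = 0.5, p_troll
    for _ in range(rounds):
        if rng.random() < troll_share:            # disruptive edit
            damage = rng.uniform(0.05, 0.2)
            if revert and rng.random() < 0.9:     # caught and reverted
                damage = 0.0
            if admin and rng.random() < 0.05:     # editor banned
                troll_share = max(0.0, troll_share - 0.01)
            quality = max(0.0, quality - damage)
        else:                                     # good-faith edit
            quality = min(1.0, quality + rng.uniform(0.0, 0.05))
    return quality

# With both mechanisms active, quality stays high; with neither,
# disruptive agents dominate and quality collapses.
q_both = simulate()
q_neither = simulate(revert=False, admin=False)
```

The toy model only illustrates how the mechanisms interact; the paper's actual conclusion, that each factor is insufficient in isolation, rests on its own calibrated simulations.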

This paper presents a simple but, by my lights, effective argument for a subclassical account of logic – an account according to which logical consequence is (properly) weaker than the standard, so-called classical account. Alas, the vast bulk of the paper is setup. Because of the many conflicting uses of ‘logic’ the paper begins, following a disclaimer on logic and inference, by fixing the sense of ‘logic’ in question, and then proceeds to rehearse both the target subclassical account of logic and its well-known relative (viz., classical logic). With background in place the simple argument – which takes up less than five pages – is advanced. My hope is that the minimal presentation will help to get ‘the simple argument’ plainly on the table, and that subsequent debate can move us forward.

Those who endorse the harm-based account of the wrongness of discrimination hold that ‘an instance of discrimination is wrong, when it is, because it makes people worse off’. In a coauthored piece with Adam Slavny, I pressed two objections against this view. First, the harm-based account implausibly fails to recognize that harmless discrimination can be wrong. Second, the harm-based account fails to identify all of the wrong-making properties of discriminatory acts. In the light of these failings, we concluded that a more promising account of the wrongness of discrimination must ‘focus not only on the harmful outcomes of discriminatory acts but also on the deliberation of the discriminator and in particular on the reasons that motivate or fail to motivate her action’. In this brief paper, I defend these conclusions against an objection that has recently been pressed against our view by Richard Arneson. This task is important not only because Arneson’s objection is an intriguing one, but also, and more importantly, because my response sheds further light on the content and structure of an attractive theory of wrongful discrimination, as well as on more fundamental ideas in moral philosophy.

Hume’s problem of induction can be analyzed in a number of different ways. At its strongest, it denies the existence of any well-justified, assumptionless inductive inference rule. At its weakest, it challenges our ability to articulate and apply good inductive inference rules. This paper examines an analysis that is closer to the latter camp. It reviews one answer to this problem, drawn from the Vapnik-Chervonenkis (VC) theorem in statistical learning theory, and argues for its inadequacy. In particular, I show that it cannot be computed, in general, whether we are in a situation where the VC theorem can be applied for the purpose we want it to.
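For orientation on the learning-theoretic answer: the VC dimension of a hypothesis class is the size of the largest point set the class can shatter, i.e., label in every possible way. For toy classes this is checkable directly, as in the sketch below (my own illustration, not the paper's; the paper's worry is precisely that no such check is computable for arbitrary classes).

```python
def threshold_labelings(points):
    """All dichotomies induced on `points` by classifiers h_t(x) = [x >= t]."""
    pts = sorted(points)
    # One threshold below all points, one in each gap, one above all:
    # these realize every labeling the class can produce on `points`.
    thresholds = ([pts[0] - 1.0]
                  + [(a + b) / 2.0 for a, b in zip(pts, pts[1:])]
                  + [pts[-1] + 1.0])
    return {tuple(x >= t for x in points) for t in thresholds}

def shatters(points):
    """True iff thresholds realize all 2^n labelings of `points`."""
    return len(threshold_labelings(points)) == 2 ** len(points)

# Thresholds shatter any single point but never a pair (the smaller
# point can't be labeled positive while the larger is negative), so
# the class has VC dimension 1 and the VC theorem applies to it.
```

The contrast the paper exploits is between such hand-checkable toy cases and the general situation, where deciding whether the VC machinery applies is not effectively computable.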

Most versions of classical physics imply that if the 4-volume of the entire spacetime is infinite or at least extremely large, then random fluctuations in the matter will by coincidence create copies of us in remote places, so-called “Boltzmann brains.” That is a problem because it leads to the wrong prediction that we should be Boltzmann brains. The question arises, how can any theory avoid making this wrong prediction? In quantum physics, it turns out that the discussion requires a formulation of quantum theory that is more precise than the orthodox interpretation. Using Bohmian mechanics for this purpose, we point out a possible solution to the problem based on the phenomenon of “freezing” of configurations. Key words: quantum fluctuation; Bunch-Davies vacuum; late universe; scalar fields in cosmology; de Sitter spacetime.

In this paper I investigate the meaning of one word of English, unless, and its theoretical implications. In numerous textbooks and grammars we can find the traditional view that unless is equivalent to if…not.

In recent years philosophers of science have explored categorical equivalence as a promising criterion for when two (physical) theories are equivalent. On the one hand, philosophers have presented several examples of theories whose relationships seem to be clarified using these categorical methods. On the other hand, philosophers and logicians have studied the relationships, particularly in the first order case, between categorical equivalence and other notions of equivalence of theories, including definitional equivalence and generalized definitional (aka Morita) equivalence. In this article, I will express some skepticism about this approach, both on technical grounds and conceptual ones. I will argue that “category structure” (alone) likely does not capture the structure of a theory, and discuss some recent work in light of this claim.

Excursion 3 Exhibit (i)
Exhibit (i) N-P Methods as Severe Tests: First Look (Water Plant Accident)
There’s been an accident at a water plant where our ship is docked, and the cooling system had to be repaired. …

Neyman & Pearson
3.2 N-P Tests: An Episode in Anglo-Polish Collaboration*
We proceed by setting up a specific hypothesis to test, H0 in Neyman’s and my terminology, the null hypothesis in R. A. Fisher’s. …

I can’t help thinking about geometric quantization. I feel it holds some lessons about the relation between classical and quantum mechanics that we haven’t fully absorbed yet. I want to play my cards fairly close to my chest, because there are some interesting ideas I haven’t fully explored yet… but still, there are also plenty of ‘wellknown’ clues that I can afford to explain. …

Herbert Simon introduced the term ‘bounded rationality’
(Simon 1957: 198) as a shorthand for his brief against neoclassical
economics and his call to replace the perfect rationality assumptions
of homo economicus with a conception of rationality tailored
to cognitively limited agents. Broadly stated, the task is to replace the global rationality of
economic man with the kind of rational behavior that is compatible
with the access to information and the computational capacities that
are actually possessed by organisms, including man, in the kinds of
environments in which such organisms exist.

In this paper we want to discuss the changing role of mathematics in science, as a way to discuss some methodological trends at work in big data science. More specifically, we will show how the role of mathematics has dramatically changed from its more classical approach. Classically, any application of mathematical techniques requires a previous understanding of the phenomena, and of the mutual relations among the relevant data; modern data analysis appeals, instead, to mathematics in order to identify possible invariants uniquely attached to the specific questions we may ask about the phenomena of interest. In other terms, the new paradigm for the application of mathematics does not require any understanding of the phenomenon, but rather relies on mathematics to organize data in such a way as to reveal possible invariants that may or may not provide further understanding of the phenomenon per se, but that nevertheless provide an answer to the relevant question. However, postponing or giving up altogether the understanding of phenomena and making it dependent on the application of mathematics calls for a different kind of understanding, namely the understanding of the reasons that make the mathematical methods and tools apt to answer a specific question.

At present, quantum theory leaves unsettled which quantities ontologically, physically exist in a quantum system. Do observables such as energy and position have meaningful values only at the precise moment of measurement, as in the Copenhagen interpretation? Or is position always definite and guided by the wave function, as in de Broglie-Bohm pilot wave theory? In the language of Bell, what are the “beables” of quantum theory and what values may they take in space and time? This is the quantum reality problem. A definitive answer requires not just describing which physical quantities exist in a quantum system, but describing what configurations of those quantities in space and time are allowed, and with what probability those configurations occur. Adrian Kent sets out a new vision of quantum theory along these lines. His interpretation supplements quantum theory to infer the value of physical quantities in spacetime from the asymptotic late-time behavior of the quantum system. In doing so, a Lorentz-covariant and single-world solution to the quantum reality problem is achieved. In this paper, the framework of Kent’s interpretation is presented from the ground up. After a broad overview, a derivation of the generalized Aharonov-Bergmann-Lebowitz (ABL) rule is provided before applying Kent’s interpretation to toy model systems, in both relativistic and nonrelativistic settings. By adding figures and discussion, a broad introduction is provided to Kent’s proposed interpretation of quantum theory.
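For orientation, the ungeneralized ABL rule that the derivation above builds on can be stated in its standard textbook form (suppressing unitary evolution between the three times; this is the generic rule, not Kent's generalization): for a projective measurement with projectors P_{a_j} performed between pre-selection of |ψ⟩ and post-selection of |φ⟩,

```latex
% Probability of outcome a_j given pre-selection |psi> and
% post-selection |phi>, with {P_{a_k}} the measurement projectors.
P(a_j \mid \psi, \phi)
  = \frac{\lvert \langle \phi \rvert P_{a_j} \lvert \psi \rangle \rvert^{2}}
         {\sum_{k} \lvert \langle \phi \rvert P_{a_k} \lvert \psi \rangle \rvert^{2}}
```

The rule is time-symmetric in the pre- and post-selected states, which is the feature Kent's late-time inference strategy exploits.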