The primary objective of this paper is to introduce a new epistemic paradox that puts pressure on the claim that justification is closed under multi-premise deduction. The first part of the paper will consider two well-known paradoxes—the lottery and the preface paradox—and outline two popular strategies for solving the paradoxes without denying closure. The second part will introduce a new, structurally related, paradox that is immune to these closure-preserving solutions. I will call this paradox the Paradox of the Pill. Since the prominent closure-preserving solutions do not apply to the new paradox, I will argue that it presents a much stronger case against the claim that justification is closed under deduction than its two predecessors. Besides presenting a more robust counterexample to closure, the new paradox also reveals that the strategies previously thought to get closure out of trouble are not sufficiently general to achieve this task, as they fail to apply to similar closure-threatening paradoxes in the same vicinity.
According to orthodox (Kolmogorovian) probability theory, conditional probabilities are by definition certain ratios of unconditional probabilities. As a result, orthodox conditional probabilities are regarded as undefined whenever their antecedents have zero unconditional probability. This has important ramifications for the notion of probabilistic independence.
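As a brief illustration in standard notation (not taken from the source), the orthodox ratio definition and its limitation are:

```latex
% Orthodox (Kolmogorovian) conditional probability: a ratio of
% unconditional probabilities, defined only when P(B) > 0.
P(A \mid B) \;=\; \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0.
% Independence is standardly glossed either as
% P(A \cap B) = P(A)\,P(B), or equivalently as P(A \mid B) = P(A);
% the second gloss becomes unavailable whenever P(B) = 0.
```

The ramification for independence is that the intuitive gloss "B makes no difference to A," i.e. \(P(A \mid B) = P(A)\), cannot even be stated when the condition has zero unconditional probability.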
The previous two chapters have sought to show that the probability calculus cannot serve as a universally applicable logic of inductive inference. We may well wonder whether there might be some other calculus of inductive inference that can be applied universally. It would, perhaps, arise through a weakening of the probability calculus. The principal source of difficulty addressed in those chapters was the additivity of the probability calculus. Such a weakening seems possible as far as additivity is concerned. Something like it is achieved with the Shafer-Dempster theory of belief functions. However, there is a second, lingering problem. Bayesian analyses require prior probabilities. As we shall see below, these prior probabilities are never benign. They always make a difference to the final result.
The many chapters of this book all aim to sustain a single conclusion. Inductive inferences are not warranted by formal schemas or rules. They are warranted by background facts. Over the last few years, I have had the opportunity of presenting this thesis and arguments for it in various philosophical forums. The reactions to it have been varied. Some find the idea illuminating and even obvious, once it is made explicit. They are supportive and I am grateful for it. Others are more neutral, reacting with various forms of indifference or incomprehension. Some set aside the question of whether they are or are not convinced by the main claim, or whether there is some way that they could help the speaker advance the project. Rather they hold to the lamentable idea that, no matter what, the job of an audience in a philosophy talk is to try to trip up the speaker with some artful sophistry. Still others are, perhaps, not quite sure of precisely what I am proposing and arguing. But they are nonetheless sure that it is a Very Bad Thing that must be opposed and stopped.
Erich Lehmann 20 November 1917 – 12 September 2009
Erich Lehmann was born 100 years ago today! (20 November 1917 – 12 September 2009). Lehmann was Neyman’s first student at Berkeley (Ph.D. 1942), and his framing of Neyman-Pearson (NP) methods has had an enormous influence on the way we typically view them. …
I argue that the function attributed to episodic memory by Mahr & Csibra (that is, grounding one’s claims to epistemic authority over past events) fails to support the essentially autonoetic character of such memories. I suggest, in contrast, that episodic event-memories are sometimes purely first-order, sometimes autonoetic, depending on relevance in the context.
This article develops an account of local epistemic practices on the basis of case studies from ethnobiology. I argue that current debates about objectivity often stand in the way of a more adequate understanding of local knowledge and ethnobiological practices in general. While local knowledge about the biological world often meets criteria for objectivity in philosophy of science, general debates about the objectivity of local knowledge can also obscure its unique epistemic features. In modification of Ian Hacking’s suggestion to discuss “ground level questions” instead of objectivity, I propose an account that focuses on both epistemic virtues and vices of local epistemic practices.
We investigate the conflict between the ex ante and ex post criteria of social welfare in a new framework of individual and social decisions, which distinguishes between two sources of uncertainty, here interpreted as an objective and a subjective source respectively. This framework makes it possible to endow the individuals and society not only with ex ante and ex post preferences, as is usually done, but also with interim preferences of two kinds, and correspondingly, to introduce interim forms of the Pareto principle. After characterizing the ex ante and ex post criteria, we present a first solution to their conflict that extends the former as much as possible in the direction of the latter. Then, we present a second solution, which goes in the opposite direction, and is also maximally assertive. Both solutions translate the assumed Pareto conditions into weighted additive utility representations, and both attribute to the individuals common probability values on the objective source of uncertainty, and different probability values on the subjective source. We discuss these solutions in terms of two conceptual arguments, i.e., the by now classic spurious unanimity argument and a novel informational argument labelled complementary ignorance.
In Rosencrantz and Guildenstern Are Dead, the two title characters are betting on coin throws. Rosencrantz has a standing bet on heads, and he keeps winning, pocketing coin after coin. We soon learn that this has been going on for some time, and that no fewer than 76 consecutive heads have been thrown, and counting — a situation which is making Guildenstern increasingly uneasy. The coins don’t appear to be double-headed or weighted or anything like that — just ordinary coins — leading Guildenstern to consider several unsettling explanations: that he is subconsciously willing the coins to land heads in order to cleanse himself of some repressed sin, that they are both trapped reliving the same moment in time over and over again, that the coins are being controlled by some menacing supernatural force. He then proposes a fourth hypothesis, which suggests a change of heart: that nothing surprising is happening at all and no special explanation is needed. He says, “… each individual coin spun individually is as likely to come down heads as tails and therefore should cause no surprise each individual time it does.” In the end, 92 heads are thrown without a single tail before the characters are interrupted.
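The arithmetic behind Guildenstern's unease is easy to make explicit. A minimal sketch, assuming the fair, independent coins that his fourth hypothesis entertains (the function name `run_probability` is mine, not from the source):

```python
def run_probability(n, p_heads=0.5):
    """Probability of n consecutive heads, assuming independent flips
    that each land heads with probability p_heads."""
    return p_heads ** n

# Each flip taken individually is unremarkable, but the run as a whole
# is astronomically improbable under the fair-coin hypothesis:
print(run_probability(76))  # ≈ 1.3e-23
print(run_probability(92))  # ≈ 2.0e-28
```

Guildenstern's point survives the arithmetic: each flip, taken individually, has probability 1/2, yet that is compatible with the 92-head run being an event of probability on the order of 10^-28.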
The notion of preference has a central role in many disciplines, including moral philosophy and decision theory. Preferences and their logical properties also have a central role in rational choice theory, a subject that in its turn permeates modern economics, as well as other branches of formalized social science. The notion of preference and the way it is analysed vary between these disciplines. A unified treatment that takes into account the needs of all these usages is still lacking. This entry surveys the most important philosophical uses of the preference concept and investigates their compatibilities and conflicts.
The Problem of Old Evidence is a perennial issue for Bayesian confirmation theory. Garber (1983) famously argues that the problem can be solved by conditionalizing on the proposition that a hypothesis deductively implies the existence of the old evidence. In recent work, Hartmann and Fitelson (2015) and Sprenger (2015) aim for similar, but more general, solutions to the Problem of Old Evidence. These solutions are more general because they allow the explanatory relationship between a new hypothesis and old evidence to be inductive, rather than deductive. In this paper, I argue that these solutions are either unsound or under-motivated, depending on the case of inductive explanation that we have in mind. This lends support to the broader claim that Garber-style Bayesian confirmation cannot capture the sense in which new hypotheses that do not deductively imply old evidence nevertheless seem to be confirmed via old evidence.
One part of the problem of anomaly is this. If a well-established scientific theory seems to predict something contrary to what we observe, we tend to stick to the theory, with barely a change in credence, while being dubious of the auxiliary hypotheses. …
Review of “Inadequate Equilibria,” by Eliezer Yudkowsky
Inadequate Equilibria: Where and How Civilizations Get Stuck is a little gem of a book: wise, funny, and best of all useful (and just made available for free on the web). …
In a famous series of experiments Justin Kruger and David Dunning found that people who scored in the lowest quartile of skill in grammar, logic, and (yes, they tried to measure this) humor tended to substantially overestimate their abilities, rating themselves as a bit above average in these skills. …
Network analysis needs tools to infer distributions over graphs of arbitrary size from a single graph. Assuming the distribution is generated by a continuous latent space model which obeys certain natural symmetry and smoothness properties, we establish three levels of consistency for non-parametric maximum likelihood inference as the number of nodes grows: (i) the estimated locations of all nodes converge in probability on their true locations; (ii) the distribution over locations in the latent space converges on the true distribution; and (iii) the distribution over graphs of arbitrary size converges.
One response to the problem of logical omniscience in standard possible worlds models of belief is to extend the space of worlds so as to include impossible worlds. It is natural to think that essentially the same strategy can be applied to probabilistic models of partial belief, for which parallel problems also arise. In this paper, I note a difficulty with the inclusion of impossible worlds in probabilistic models. Under weak assumptions about the space of worlds, most of the propositions which can be constructed from possible and impossible worlds are in an important sense inexpressible, leaving the probabilistic model committed to saying that agents in general have at least as many attitudes towards inexpressible propositions as they do towards expressible propositions. If it is reasonable to think that our attitudes are generally expressible, then a model with such commitments looks problematic.
September’s general elections have brought Germany its own Brexit/Trump moment. For the first time since 1945, a far-right nationalist party is part of the German national parliament. The Alternative for Germany, AfD, gained 12.6% of German votes. …
The previous chapter examined the inductive logic applicable to an infinite lottery machine. Such a machine generates a countably infinite set of outcomes, that is, there are as many outcomes as natural numbers, 1, 2, 3, … We found there that, if the lottery machine is to operate without favoring any particular outcome, the inductive logic native to the system is not probabilistic. A countably infinite set is the smallest in the hierarchy of infinities. The next routinely considered is a continuum-sized set, such as given by the set of all real numbers or even just by the set of all real numbers in some interval, from, say, 0 to 1.
Population ethics is the study of the unique ethical issues that arise when one’s actions can change who will come into existence: actions that lead to additional people being born, fewer people being born, or different people being born. The most obvious cases are those of an individual deciding whether to have a child, or of society setting the social policies surrounding procreation. However, issues of population ethics come up much more widely than this. How bad is it if climate change reduces the planet’s “carrying capacity”? How important is it to lower the risks of human extinction? How important is it, if at all, that humanity eventually seeks a future beyond Earth, allowing a much greater population?
The kind of skepticism that interests me in this book is not the skepticism that asks whether or not I know that this is my hand, or that you are not a zombie. Instead, it is part of an approach to epistemology that thinks of questions about knowledge, belief, and truth as being immediately tied to normative and evaluative questions. Much of the inspiration for this kind of skepticism derives from Socrates, or rather, the Socrates of Plato’s dialogues. In a famous line of the Apology, Socrates says that the unexamined life is not worth living for a human being (38a5-6). Ancient skepticism inherits this spirit. It is centrally about stepping back from belief-formation and counteracting one’s tendencies to be quick to judge. Closely related, it is concerned with the ways in which one can fail to understand one’s own thoughts, and fail to examine thoughts because one likes or dislikes them, or because one prefers to hold a view as opposed to holding no view. These psychological phenomena are taken to differ importantly from processes of rationally guided belief-formation, where a cognizer is inclined to accept a thought after careful consideration of whether it is true.
The liar paradox is widely conceived as a problem for logic and semantics. On the basis of empirical studies presented here, we suggest that there is an underappreciated psychological dimension to the liar paradox and related problems, conceived as a problem for human thinkers. Specific findings suggest that how one interprets the liar sentence and similar paradoxes can vary in relation to one’s capacity for logical and reflective thought, acceptance of certain logical principles, and degree of philosophical training, but also as a function of factors such as religious belief, gender, and whether the problem is treated as theoretical or practical. Though preliminary, these findings suggest that one reason the liar paradox resists a final resolution is that it engages both aspects described by so-called dual process accounts of human cognition.
In some sense, it is clear that the numbers count. That is, it is clear that the number of thinkers on a given side of a disputed issue is typically relevant to the degree of support their opinions provide. It is natural to think that numbers cannot be all that matter, though, for the extent to which the opinions are independent also seems to have substantial epistemic import. It is difficult, however, to capture explicitly the type of dependence and independence that can play this epistemic role. This paper investigates the issue, putting forward an expectational account of belief dependence and independence – one that can be applied whether we think in terms of credences or in terms of all-or-nothing beliefs.
Belief attribution, both in philosophy and in ordinary language, normally serves two different types of role. One role is predicting, tracking, or reporting what a person would verbally endorse. When we attribute belief to someone we are doing something like indirect quotation, speaking for them, expressing what we think they would say. …
The principles of Conditional Excluded Middle (CEM) and Simplification of Disjunctive Antecedents (SDA) have received substantial attention in isolation. Both principles are plausible generalizations about natural language conditionals. There is, however, little or no discussion of their interaction. This paper aims to remedy this gap and explore the significance of having both principles constrain the logic of the conditional. Our negative finding is that, together with elementary logical assumptions, CEM and SDA yield a variety of implausible consequences. Despite these incompatibility results, we open up a narrow space to satisfy both. We show that, by simultaneously appealing to the alternative-introducing analysis of disjunction and to the theory of homogeneity presuppositions, both principles can be validated. Furthermore, the theory that validates both principles resembles a recent semantics that is defended by Santorio on independent grounds. The cost of this approach is that it must give up the transitivity of entailment: we suggest that this is a feature, not a bug, and connect it with recent developments of intransitive notions of entailment.
The epistemology of modality has focused almost exclusively on knowledge of metaphysical modality. However, other kinds of so-called objective modality (in the sense of Williamson 2016b) such as nomic, practical, and ‘easy’ possibility can also appear epistemologically puzzling, and they are important topics in their own right. Thus the neglect of the epistemology of other objective modalities may look strange and even parochial. At worst, it may look similar to an approach to the epistemology of mathematics that only deals with knowledge of the theorems of some weak mathematical theory, such as Robinson arithmetic.
The no-miracles argument and the pessimistic induction are currently regarded as the strongest arguments for and against scientific realism, respectively, in philosophy of science. In this paper, I construct a new argument for scientific realism that I call the anti-induction for scientific realism. It holds that since past theories were unwarranted, present theories are warranted. I provide an example from the history of science to show that anti-inductions sometimes work in science. The anti-induction for scientific realism has several advantages over the no-miracles argument as a positive argument for scientific realism.
A controversial principle in Catholic moral theology is the principle of “counseling the lesser evil”, sometimes confusingly (or confusedly) presented as the “principle of the lesser evil”. The principle is one that the Church has not pronounced on. …
Gettier’s (1963) two examples are now widely believed to be counterexamples to the JTB account of knowledge: JTB: A thinker knows that p iff p is true, the thinker believes that p is true, and the thinker is justified in her belief. Gettier claimed that his cases work equally well against Chisholm’s (1957) ETB account of knowledge: ETB: A thinker knows that p iff p is true, the thinker believes that p is true, and the thinker has adequate evidence for p. It’s probably fair to say that most epistemologists agree with Gettier on these points. There seems to be a general consensus that Gettier’s protagonists lack knowledge and that they meet the conditions that these theories propose would be sufficient for knowledge.
This paper discusses how to update one’s credences based on evidence that has initial probability 0. I advance a diachronic norm, Kolmogorov Conditionalization, that governs credal reallocation in many such learning scenarios. The norm is based upon Kolmogorov’s theory of conditional probability. I prove a Dutch book theorem and converse Dutch book theorem for Kolmogorov Conditionalization. The two theorems establish Kolmogorov Conditionalization as the unique credal reallocation rule that avoids a sure loss in the relevant learning scenarios.
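In standard notation (not taken from the paper), the contrast at issue is between the elementary ratio rule, which is undefined on probability-0 evidence, and Kolmogorov's measure-theoretic replacement, on which conditional probability is defined relative to a sub-σ-algebra:

```latex
% Ratio rule: undefined when P(E) = 0.
P(H \mid E) = \frac{P(H \cap E)}{P(E)}
% Kolmogorov's theory instead takes P(H \mid \mathcal{F}) to be an
% \mathcal{F}-measurable random variable satisfying, for every
% F \in \mathcal{F},
\int_{F} P(H \mid \mathcal{F}) \, dP \;=\; P(H \cap F).
```

Because the Kolmogorov conditional probability is characterized by this integral equation rather than by division, it can be well-defined even on events of probability 0, which is what makes a conditionalization norm for such learning scenarios possible.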
In this article, I argue that it makes a moral difference whether an individual is worse off than she could have been. Here I part company with consequentialists such as Parfit and side with contractualists such as Scanlon. But, unlike some contractualists, I reject the view that all that matters is whether a principle can be justified to each particular individual, where such a justification is attentive to her interests, complaints, and other claims. The anonymous goodness of a distribution also matters. My attempt to reconcile contractualist and consequentialist approaches proceeds via a series of reflections on cases.