Johannes Kepler (1571–1630) is one of the most significant
representatives of the so-called Scientific Revolution of the
16th and 17th centuries. Although he received
only the basic training of a “magister” and was
professionally oriented towards theology at the beginning of his
career, he rapidly became known for his mathematical skills and
theoretical creativity. As a convinced Copernican, Kepler was able to
defend the new system on different fronts: against the old astronomers
who still sustained the system of Ptolemy, against the Aristotelian
natural philosophers, against the followers of the new “mixed
system” of Tycho Brahe—whom Kepler succeeded as Imperial
Mathematician in Prague—and even against the standard Copernican
position according to which the new system was to be considered merely
as a computational device and not necessarily a physical reality.
Traditionally, logic has been the dominant formal method within philosophy. Are logical methods still dominant today, or have the types of formal methods used in philosophy changed in recent times? To address this question, we coded a sample of philosophy papers from the late 2000s and from the late 2010s for the formal methods they used. The results indicate that (a) the proportion of papers using logical methods remained more or less constant over that time period but (b) the proportion of papers using probabilistic methods was approximately three times higher in the late 2010s than it was in the late 2000s. Further analyses explored this change by looking more closely at specific methods, specific levels of technical engagement, and specific subdisciplines within philosophy. These analyses indicate that the increasing proportion of papers using probabilistic methods was pervasive, not confined to particular probabilistic methods, levels of sophistication, or subdisciplines.
We present nine questions related to the concept of negation and, in passing, we refer to connections with the essays in this special issue. The questions were submitted to one of the most eminent logicians who contributed to the theory of negation, Prof. (Jon) Michael Dunn, but, unfortunately, Prof. Dunn was no longer able to answer them. Michael Dunn passed away on 5 April 2021, and the present special issue of Logical Investigations is dedicated to his memory. The questions concern (i) negation-related topics that have particularly interested Michael Dunn or to which he has made important contributions, (ii) some controversial aspects of the logical analysis of the concept of negation, or (iii) simply properties of negation in which we are especially interested. Though sadly and regrettably unanswered by the distinguished scholar who intended to reply, the questions remain and might stimulate answers by other logicians and further research.
As with most topics in philosophy, there is no consensus about what experimental philosophy is. Most broadly, experimental philosophy involves using scientific methods to collect empirical data for the purpose of casting light on philosophical issues. Such a definition threatens to be too broad, however: Taking the nature of matter to be a philosophical issue, research at the Large Hadron Collider would count as experimental philosophy. Others have suggested more narrow definitions, characterizing experimental philosophy in terms of the use of scientific methods to investigate intuitions. This threatens to be too narrow, however, excluding such work as Eric Schwitzgebel’s comparison of the rates of theft of ethics books to similar volumes from other areas of philosophy for the purpose of finding out whether philosophical training in ethics promotes moral behavior. While restricting experimental philosophy to the study of intuitions is too narrow, this nonetheless covers most of the research in this area. Focusing on this research, we begin by discussing some of the methods that have been used by experimental philosophers. We then distinguish between three types of goals that have guided experimental philosophers, illustrating these goals with some examples.
Eugen Fischer and colleagues expand on a body of empirical work offering a debunking explanation of a key assumption involved in the argument from illusion. Following Snowdon (1992), we can distinguish between the base case and the spreading step in the argument. Fischer et al. target the base case. In the most prominent current versions of the argument, the key move in the base case involves the phenomenal principle (Robinson, 1994, 32): “If there sensibly appears to a subject to be something which possesses a particular sensible quality then there is something of which the subject is aware which does possess that sensible quality.” In brief, Fischer et al. contend that the move here from a seemingly uncontroversial claim such as “the coin appears elliptical to me” to there being something of which the subject is aware that is elliptical requires that the initial claim be given a “literal interpretation” such that something elliptical has appeared to the subject. But they contend that under such an interpretation the claim should no longer be taken to be uncontroversial, assuming too much of what the argument needs to establish. And they argue that much of the intuitive appeal of this move can be explained in terms of accepting the claim based on the dominant usage of appearance verbs (e.g., I think the coin is elliptical), then shifting to the less salient phenomenal usage required for the conclusion. Fischer et al. then present the results of a series of nifty new studies in cross-cultural psycholinguistics to support the conclusion that people make stereotypical inferences warranted by the dominant sense of appearance verbs, even in contexts where this dominant sense is inappropriate.
Eugen Fischer and John Collins have brought together an impressive, and important, series of essays concerning the methodological debates between rationalists and naturalists, and how these debates have been impacted by work in experimental philosophy. The work at issue concerns the evidential value of intuitions, and as such is only a small part of the experimental philosophy corpus as I understand it. In fact, Fischer and Collins define experimental philosophy in this narrow sense in their introduction. On their view, experimental philosophy “builds on the assumption that, for better or worse, intuitions are crucially involved in philosophical work” (3). The parenthetical serves to emphasize that such work could either be pursued from a positive perspective aiming to vindicate the use of intuitions in philosophy or from a negative perspective aiming to undermine that use. Noting these two perspectives, it might then seem that experimental philosophy is neutral with regard to methodological debate: “experimental philosophy is not a party to the dispute between methodological rationalism and naturalism, but offers a new framework for settling it” (23).
Starting with the slogan that understanding is a ‘knowledge of causes,’ Stephen Grimm and John Greco have argued that understanding comes from a knowledge of dependence relations. Grounding is the trendiest dependence relation on the market, and if Grimm and Greco are correct, then instances of grounding should also give rise to understanding. In this paper, I will show that this prediction is correct – grounding does indeed generate understanding in just the way that Grimm and Greco anticipate. However, grounding examples of understanding also show that Grimm and Greco are not telling the full story when it comes to understanding. Understanding can only be generated by a particular subset of dependence relations — those dependence relations that are also explanatory. Grimm and Greco should thus appeal to a privileged class of dependence relations, relations like grounding that can give rise to explanation as well.
In the long run, the development of artificial intelligence (AI) is likely to be one of the biggest technological revolutions in human history. Even in the short run it will present tremendous challenges as well as tremendous opportunities. The more we do now to think through these complex challenges and opportunities, the better the prospects for the kind of outcomes we all hope for, for ourselves, our children, and our planet.
In this paper, we use the case of the COVID-19 pandemic in Europe to address the question of what kind of knowledge we should incorporate into public health policy. We show that policy-making in Europe during the COVID-19 pandemic has been biomedicine-centric in that its evidential basis marginalised input from non-biomedical disciplines. We then argue that in particular the social sciences could contribute essential expertise and evidence to public health policy in times of biomedical emergencies and that we should thus strive for a tighter integration of the social sciences in future evidence-based policy-making. This demand faces challenges on different levels, which we identify and discuss as potential inhibitors for a more pluralistic evidential basis.
Necessarily, if it is necessary that there is no God, then modal reality is bad. (Making the existence of God impossible is terrible!) Necessarily, if something is bad, it is possible for it not to be bad. …
A binary predicate R is standardly called symmetric if for every x and y, the statement R(x, y) is logically equivalent to R(y, x). Examples of symmetric predicates in English include relational adjectives, nouns and verbs, as in the following equivalent sentences.
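The symmetry condition can be checked mechanically when a binary predicate is modeled extensionally as a set of ordered pairs. A minimal sketch, with invented relation names and a toy domain:

```python
def is_symmetric(pairs):
    """Return True iff for every pair (x, y) in the relation,
    the reversed pair (y, x) is also in the relation."""
    relation = set(pairs)
    return all((y, x) in relation for (x, y) in relation)

# "sibling of" is symmetric; "parent of" is not.
sibling = {("ann", "bo"), ("bo", "ann")}
parent = {("ann", "bo")}
```

Here `is_symmetric(sibling)` holds while `is_symmetric(parent)` fails, mirroring the contrast between symmetric predicates like “sibling” and non-symmetric ones like “parent”.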
In 'The Means and the Good' (Analysis, forthcoming) Matthew Oliver argues that pluralist consequentialists can accommodate intuitions against using others as a means, on the model of how they can accommodate intuitions about desert: Just as it is bad for Emily to benefit from a stolen manuscript, it is bad for anyone to benefit from the use of another’s body or resources as a means. …
Writing comments on a post about adversarial collaboration feels like a place where I should be adversarial (if in a collaborative spirit). But I agree with basically everything Eric says here. Frankly, this is all spot on. You probably don’t want to read 500 words from me just saying “yep, this” and agreeing with his excellent, sensible advice, though. So, let me attempt to be provocative: Eric doesn’t go far enough! (Not that he was trying to, of course.) All philosophers should be asking themselves what empirical evidence would actually test their views. Collaboration should be the rule, not the exception. And we should expect collaborations to have an adversarial element, treating this as a feature, not a bug.
Conceptual engineering involves revising our concepts. It can be pursued as a specific philosophical methodology, but is also common in ordinary, non-philosophical, contexts. How does our capacity for conceptual engineering fit into human cognitive life more broadly? I hold that conceptual engineering is best understood alongside practices of conceptual exploration, examples of which include conceptual supposition (i.e., suppositional reasoning about alternative concepts), and conceptual comparison (i.e., comparisons between possible concept choices). Whereas in conceptual engineering we aim to change the concepts we use, in conceptual exploration, we reason about conceptual possibilities. I approach conceptual exploration via the linguistic tools we use to communicate about concepts, using metalinguistic negotiation, convention-shifting conditionals, and metalinguistic comparatives as my key examples. I present a linguistic framework incorporating conventions that can account for this communication in a unified way. Furthermore, I argue that conceptual exploration helps undermine skepticism about conceptual engineering itself.
Sometimes, learning about the origins of a belief can make it irrational to continue to hold that belief—a phenomenon we call ‘genealogical defeat’. According to explanationist accounts, genealogical defeat occurs when one learns that there is no appropriate explanatory connection between one’s belief and the truth. Flatfooted versions of explanationism have been widely and rightly rejected on the grounds that they would disallow beliefs about the future and other inductively-formed beliefs. After motivating the need for some explanationist account, we raise some problems for recent versions of explanationism. Learning from their failures, we then produce and defend a more resilient explanationism.
This paper defends the view, put roughly, that to think that p is to guess that p is the answer to the question at hand, and that to think that p rationally is for one’s guess to that question to be in a certain sense non-arbitrary. Some theses that will be argued for along the way include: that thinking is question-sensitive and, correspondingly, that ‘thinks’ is context-sensitive; that it can be rational to think that p while having arbitrarily low credence that p; that, nonetheless, rational thinking is closed under entailment; that thinking does not supervene on credence; and that in many cases what one thinks on certain matters is, in a very literal sense, a choice. Finally, since there are strong reasons to believe that thinking just is believing, there are strong reasons to think that all this goes for belief as well.
Open Problems Related to Quantum Query Complexity
My ACM TechTalk on quantum supremadvantage
This Erev Yom Kippur, I wish to repent for not putting enough quantum computing content on this blog. Of course, repentance is meaningless unless accompanied by genuine reform. …
Betting on collapse (EDC, ch.6)
Posted on Wednesday, 15 Sep 2021. Topic: decision theory
Chapter 6 of Evidence, Decision and Causality presents another alleged counterexample to CDT, involving a bet on the measurement of entangled particles. …
The field that has come to be known as the Critical Philosophy of
Race is an amalgamation of philosophical work on race that largely
emerged in the late 20th century, though it draws from earlier work. It
departs from previous approaches to the question of race that dominated
the modern period up until the era of civil rights. Rather than
focusing on the legitimacy of the concept of race as a way to
characterize human differences, Critical Philosophy of Race approaches
the concept with a historical consciousness about its function in
legitimating domination and colonialism, engendering a critical
approach to race and hence the name of the sub-field.
Here are two very common intuitions in the philosophy of mind:
Our experiences of the same things are approximately qualitatively the same: your perceptual experiences of white, or squareness, or the beat of a drum are approximately like mine. …
Buras and Cantrell have given a very clever ontological argument for the existence of God based on a desire for happiness. Here is a variant of their argument based on justice. Ought implies (metaphysical) possibility. …
One body of research in experimental philosophy indicates that non-philosophers by and large do not employ the concept of phenomenal consciousness. Another body of research, however, suggests that people treat phenomenal consciousness as essential for having free will. In this chapter, we explore the tension between these findings. We suggest that the dominant, ordinary usages of ‘consciousness’ concern notions of being awake, aware, and exercising control, all of which bear a clear connection to free will. Based on this, we argue that findings purporting to show that people take the capacity for phenomenal consciousness to be necessary for free will are better interpreted in terms of a non-phenomenal understanding of consciousness. We explore this suggestion by calling on extant work on the dimensions of mind perception, and we expand on it, presenting the results of a new study employing a global sample.
This paper is about two requirements on wish reports whose interaction motivates a novel semantics for these ascriptions. The first requirement concerns the ambiguities that arise when determiner phrases, e.g. definite descriptions, interact with ‘wish’. More specifically, several theorists have recently argued that attitude ascriptions featuring counterfactual attitude verbs license interpretations on which the determiner phrase is interpreted relative to the subject’s beliefs. The second requirement involves the fact that desire reports in general require decision-theoretic notions for their analysis. The current study is motivated by the fact that no existing account captures both of these aspects of wishing. I develop a semantics for wish reports that makes available belief-relative readings but also allows decision-theoretic notions to play a role in shaping the truth conditions of these ascriptions. The general idea is that we can analyze wishing in terms of a two-dimensional notion of expected utility.
We implement a recent characterization of metaphysical indeterminacy in the context of orthodox quantum theory, developing the syntax and semantics of two propositional logics equipped with determinacy and indeterminacy operators. These logics, which extend a novel semantics for standard quantum logic that accounts for Hilbert spaces with superselection sectors, preserve different desirable features of quantum logic and logics of indeterminacy. In addition to comparing the relative advantages of the two, we also explain how each logic answers Williamson’s challenge to any substantive account of (in)determinacy: For any proposition p, what could the difference between “p” and “it’s determinate that p” ever amount to?
I offer a case that quantum query complexity still has loads of enticing and fundamental open problems—from relativized QMA versus QCMA and BQP versus IP, to time/space tradeoffs for collision and element distinctness, to polynomial degree versus quantum query complexity for partial functions, to the Unitary Synthesis Problem and more.
Most authors who discuss willpower assume that everyone knows what it is, but our assumptions differ to such an extent that we talk past each other. We agree that willpower is the psychological function that resists temptations – variously known as impulses, addictions, or bad habits; that it operates simultaneously with temptations, without prior commitment; and that use of it is limited by its cost, commonly called effort, as well as by the person's skill at executive functioning. However, accounts are usually not clear about how motivation functions during the application of willpower, or how motivation is related to effort. Some accounts depict willpower as the perception or formation of motivational contingencies that outweigh the temptation, and some depict it as a continuous use of mechanisms that interfere with reweighing the temptation. Some others now suggest that impulse control can bypass motivation altogether, although they refer to this route as habit rather than willpower.
I argue against the claim that the fundamental form of trust is a 2-place relation of A trusting B and in favour of the fundamental form being a 4-place relation of A, by ψ-ing, trusting B to φ. I characterize trusting behaviour as behaviour that knowingly makes one reliant on someone doing what they are supposed to do in the collaborative enterprise that the trusting behaviour belongs to. I explain how trust is involved in the following collaborative enterprises: knowledge transfer – i.e. telling someone something; maintaining a relationship; and passing responsibility for an action on to someone else. And I finish by showing how our talk of trust in non-collaborative contexts – e.g. trusting a branch to support one’s weight – may be explained by reference to the central sort of collaborative trust. Key words: collaboration; reliance; communication; Faulkner; Simpson; Jones.
In his superb book, The Metaphysics of Representation, Williams sketches biconditional reductive definitions of representational states in nonrepresentational terms (xvii). The central idea is an extremely innovative variety of interpretationism about belief and desire. Williams is inspired by David Lewis but departs significantly from him. I am sympathetic to interpretationism for some basic beliefs and desires. However, I will raise three worries for Williams’s version (§2–4). Then, I will suggest a modified version (§5). I will conclude with a general question (§6).
The pattern of implicatures of modified numeral ‘more than n’ depends on the roundness of n. Cummins, Sauerland, and Solt (2012) present experimental evidence for the relation between roundness and implicature patterns, and propose a pragmatic account of the phenomenon. More recently, Hesse and Benz (2020) present more extensive evidence showing that implicatures also depend on the magnitude of n and propose a novel explanation based on the Approximate Number System (Dehaene, 1999). Despite the wealth of experimental data, no formal account has yet been proposed to characterize the full posterior distribution over numbers of a listener after hearing ‘more than n’. We develop one such account within the Rational Speech Act framework, quantitatively reconstructing the pragmatic reasoning of a rational listener. We show that our pragmatic account correctly predicts various features of the experimental data.
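The core of such an account can be illustrated with a generic Rational Speech Act listener. The sketch below is not the paper's model: the state space, utterance set, uniform prior, and rationality parameter are all illustrative assumptions, and it does not capture roundness or the Approximate Number System component. It shows only the basic RSA pipeline by which a pragmatic listener derives a posterior over quantities after hearing 'more than n'.

```python
import math

STATES = list(range(1, 21))                      # candidate quantities (assumed)
UTTERANCES = [("more than", n) for n in (5, 10)] # toy utterance set (assumed)
ALPHA = 4.0                                      # speaker rationality (assumed)

def literal(utterance, state):
    """Literal listener: 'more than n' is true of exactly the states above n."""
    _, n = utterance
    return 1.0 if state > n else 0.0

def speaker(state, utterance):
    """Softmax speaker: prefers truthful utterances with smaller literal support
    (i.e., more informative ones)."""
    truthful = [u for u in UTTERANCES if literal(u, state) > 0]
    if utterance not in truthful:
        return 0.0
    def score(u):
        support = sum(literal(u, s) for s in STATES)
        return math.exp(ALPHA * math.log(1.0 / support))
    total = sum(score(u) for u in truthful)
    return score(utterance) / total

def pragmatic_listener(utterance):
    """Bayesian listener: posterior over states proportional to
    (uniform) prior times speaker probability."""
    unnorm = {s: speaker(s, utterance) for s in STATES}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

posterior = pragmatic_listener(("more than", 10))
```

With this toy setup the posterior after 'more than 10' is uniform over 11–20; reproducing the roundness- and magnitude-dependent patterns in the experimental data requires the richer priors and utterance costs that the full account supplies.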
Scholars, journalists, and activists working on climate change often distinguish between “individual” and “structural” approaches to decarbonization. The former concern choices individuals can make to reduce their “personal carbon footprint” (e.g., eating less meat). The latter concern changes to institutions, laws, and other social structures. These two approaches are often framed as oppositional, representing a mutually exclusive forced choice between alternative routes to decarbonization. After presenting representative samples of this oppositional framing of individual and structural approaches in environmental communication, we identify four problems with oppositional thinking and propose five ways to conceive of individual and structural reform as symbiotic and interdependent.