What I call the active mind approach revolves around the claim that what is “on” a person’s mind is in an important sense brought on and held on to through the agent’s self-conscious rational activity. In the first part, I state the gist of this perspective in a deliberately strong way in order to create a touchstone for critical discussion. In the second part, I engage with two categories of our mental lives that seem to speak against construing the mind as active. First, I discuss affectivity, in particular emotion, and show that emotional episodes are active engagements. Second, I discuss habitual action, and in particular those manifestations of habit which are initially opaque to the agent. In my responses to both objections, the notion of a practical self-understanding will play a central role. The result will be a qualified defence and expansion of the active mind position.
A number of arguments purport to show that quantum field theory cannot be given an interpretation in terms of localizable particles. We show, in light of such arguments, that the classical ℏ → 0 limit can aid our understanding of the particle content of quantum field theories. In particular, we demonstrate that for the massive Klein-Gordon field, the classical limits of number operators can be understood to encode local information about particles in the corresponding classical field theory.
To make sense of large data sets, we often look for patterns in how data points are “shaped” in the space of possible measurement outcomes. The emerging field of topological data analysis (TDA) offers a toolkit for formalizing the process of identifying such shapes. This paper aims to discover why and how the resulting analysis should be understood as reflecting significant features of the systems that generated the data. I argue that a particular feature of TDA—its functoriality— is what enables TDA to translate visual intuitions about structure in data into precise, computationally tractable descriptions of real-world systems.
The distinguishability between pairs of quantum states, as measured by quantum fidelity, is formulated on phase space. The fidelity is physically interpreted as the probability that the pair are mistaken for each other upon a measurement. The mathematical representation is based on the concept of symplectic capacity in symplectic topology. The fidelity is the absolute square of the complex-valued overlap between the symplectic capacities of the pair of states. The symplectic capacity for a given state, onto any conjugate plane of degrees of freedom, is postulated to be bounded from below by the Gromov width h/2. This generalizes the Gibbs-Liouville theorem in classical mechanics, which states that the volume of a region of phase space is invariant under the Hamiltonian flow of the system, by constraining the shape of the flow. It is shown that for closed Hamiltonian systems, the Schrödinger equation is the mathematical representation for the conservation of fidelity.
This paper reports the first empirical investigation of the hypothesis that epistemic appraisals form part of the structure of concepts. To date, studies of concepts have focused on the way concepts encode properties of objects, and the way those features are used in categorisation and in other cognitive tasks. Philosophical considerations show the importance of also considering how a thinker assesses the epistemic value of beliefs and other cognitive resources, and in particular, concepts.
According to one narrative about the history of the concept of emergence in metaphysics and philosophy of science, when British emergentists initially appealed to emergence in the early twentieth century, they aimed to lay the groundwork for a philosophy of nature that was supposed to constitute a middle course between two antagonistic worldviews: reductive physicalism and non-physicalist dualism. While reductive physicalism aims to establish that all concrete goings-on, ranging from social phenomena to biological and chemical processes, are reducible to fundamental physical states and processes explicated by, and invoked in, an ideal physics, non-physicalist dualism holds that some phenomena resist any kind of physical reducibility, and are radically autonomous vis-à-vis physical goings-on. The emergentist idea is that a more plausible way of making sense of the natural world is to accept that some phenomena resist physical reduction, but that is not to say that such phenomena “float free” of the physical. Such phenomena are taken to be “emergent”, suggesting that there is an emergence relation between the emergent entities and their so-called physical “emergence bases”.
In a recent paper, Justin D’Ambrosio (2020) has offered an empirical argument in support of a negative solution to the puzzle of Macbeth’s dagger—namely, the question of whether, in the famous scene from Shakespeare’s play, Macbeth sees a dagger in front of him. D’Ambrosio’s strategy consists in showing that “seeing” is not an existence-neutral verb; that is, that the way it is used in ordinary language is not neutral with respect to whether its complement exists. In this paper, we offer an empirical argument in favor of an existence-neutral reading of “seeing”. In particular, we argue that existence-neutral readings are readily available to language users. We thus call into question D’Ambrosio’s argument for the claim that Macbeth does not see a dagger. According to our positive solution, Macbeth sees a dagger, even though there is not a dagger in front of him.
Effective altruism is based on a very simple idea: we should do the most good we can. Obeying the usual rules about not stealing, cheating, hurting, and killing is not enough, or at least not enough for those of us who have the good fortune to live in material comfort, who can feed, house, and clothe ourselves and our families and still have money or time to spare. …
In the 1920s, Ackermann and von Neumann, in pursuit of Hilbert’s Programme, were working on consistency proofs for arithmetical systems. One proposed method of giving such proofs is Hilbert’s epsilon-substitution method. There was, however, a second approach which was not reflected in the publications of the Hilbert school in the 1920s, and which is a direct precursor of Hilbert’s first epsilon theorem and a certain ‘general consistency result’ due to Bernays. An analysis of the form of this so-called ‘failed proof’ sheds further light on an interpretation of Hilbert’s Programme as an instrumentalist enterprise with the aim of showing that whenever a ‘real’ proposition can be proved by ‘ideal’ means, it can also be proved by ‘real’, finitary means.
On the basis of a coherently applied physicalist ontology, I will argue that there is nothing conceptual in logic and mathematics. What we usually call “mathematical concepts”—from the most exotic ones to the most “evident” ones—are just names tagged to various elements of mathematical formalism. In fact they have nothing to do with concepts, as they have nothing to do with the actual things; they can be completely ignored by both philosophy and physics.
ABSTRACT: Many artists, art critics, and poets suggest that an aesthetic appreciation of artworks may modify our perception of the world, including quotidian things and scenes. I call this Art-to-World, AtW. Focusing on visual artworks, in this paper I articulate an empirically-informed account of AtW that is based on content-related views of aesthetic experience, and on Goodman’s and Elgin’s concept of exemplification. An aesthetic encounter with an artwork demands paying attention to the aesthetic, expressive, or design properties that realize its purpose. Attention to these properties makes percipients better able to spot them in other entities and scenes as well. The upshot is that an aesthetic commerce with artworks enlarges the scope of what we are able to see and has therefore momentous epistemic consequences.
Leo Strauss was a twentieth-century German Jewish émigré
to the United States whose intellectual corpus spans ancient, medieval
and modern political philosophy and includes, among others, studies of
Plato, Maimonides, Machiavelli, Hobbes, Spinoza, and Nietzsche. Strauss wrote mainly as a historian of philosophy and most of his
writings take the form of commentaries on important thinkers and their
writings. Yet as he put it: “There is no inquiry into the
history of philosophy that is not at the same time a
philosophical inquiry” (PL, p. 41). While much of his
philosophical project involved an attempt to rethink pre-modern
philosophy, the impetus for this reconsideration and the philosophical
problems that vexed Strauss most were decidedly modern.
Early modern philosophy in Europe and Great Britain is awash with
discussions of the emotions: they figure not only in philosophical
psychology and related fields, but also in theories of epistemic
method, metaphysics, ethics, political theory and practical reasoning
in general. Moreover, interest in the emotions links philosophy with
work in other, sometimes unexpected areas, such as medicine, art,
literature, and practical guides on everything from child-rearing to
the treatment of subordinates. Because of the breadth of the topic,
this article can offer only an overview, but perhaps it will be enough
to give some idea how philosophically rich and challenging the
conception of the emotions was in this period.
Penelope Maddy’s Second Philosophy is one of the best-known approaches in recent philosophy of mathematics. She applies her second-philosophical method to analyze mathematical methodology by reconstructing historical cases in a setting of means-ends relations. However, outside of Maddy’s own work, this kind of methodological analysis has not yet been extensively used and analyzed. In the present work, we make a first step in this direction. We develop a general framework that allows us to clarify the procedure and aims of the Second Philosopher’s investigation into set-theoretic methodology; provides a platform to analyze the Second Philosopher’s methods themselves; and can be applied to further questions in the philosophy of set theory.
Joseph Henrich's ambitious tome, The WEIRDest People in the World, is driving me nuts. It's good enough and interesting enough that I want to read it. Henrich's general idea is that people in Western, Educated, Industrial, Rich, Democratic (WEIRD) societies differ psychologically from people in more traditionally structured societies, and that the family policies of the Catholic Church in medieval Europe lie at the historical root of this difference. …
People care very much about being listened to. In everyday talk, we make moral-sounding judgements of people as listeners: praising a doctor who listens well even if she does not have a ready solution, or blaming a boss who does not listen even if the employee manages to get her situation addressed. In this sense, listening is a normative behaviour: that is, we ought to be good listeners. Whilst several disciplines have addressed the normative importance of interpersonal listening—particularly sociology, psychology, and media and culture studies—analytic philosophy does not have a framework for dealing with listening as a normative interpersonal behaviour. Listening usually gets reduced to mere speech-parsing (in philosophy of language), or to a matter of belief and trust in the testimony of credible knowers (in social epistemology). My preliminary task is to analyse why this reductive view is taken for granted in the discipline, to diagnose the problem behind the reduction, and to propose a more useful alternative approach.
The article at hand presents the results of a literature review on the ethical issues related to scientific authorship. These issues are understood as questions and/or concerns about obligations, values or virtues in relation to reporting, authorship and publication of research results. For this purpose, the Web of Science core collection was searched for English resources published between 1945 and 2018, and a total of 324 items were analyzed. Based on the review of the documents, ten ethical themes have been identified, some of which entail several ethical issues. Ranked on the basis of their frequency of occurrence, these themes are: 1) attribution, 2) violations of the norms of authorship, 3) bias, 4) responsibility and accountability, 5) authorship order, 6) citations and referencing, 7) definition of authorship, 8) publication strategy, 9) originality, and 10) sanctions. In mapping these themes, the current article explores the major ethical issues and provides a critical discussion about the application of codes of conduct, various understandings of culture, and contributing factors to unethical behavior.
This article is concerned with one of the notable but forgotten research strands that developed out of French nineteenth-century positivism, a strand that turned attention to the study of scientific discovery and was actively pursued by French epistemologists around the turn of the twentieth century. I first sketch the context in which this research program emerged. I show that the program was a natural offshoot of French neopositivism; the latter was a current of twentieth-century thought that, even if implicitly, challenged the positivism of first-generation positivists such as Comte. I then survey what French epistemologists—including Ernest Naville, Élie Rabier, Pierre Duhem, Édouard Le Roy, Abel Rey, André Lalande, Théodule-Armand Ribot, Edmond Goblot, and Jacques Picard, among others—had to say about the logic, psychology, and sociology of discovery. My story demonstrates the inaccuracy of existing historical accounts of the philosophical study of scientific discovery.
It has been claimed that a unique feature of human culture is that it accumulates beneficial modifications over time. On the basis of a couple of methodological considerations, we here argue that, perhaps surprisingly, there is insufficient evidence for a proper test of this claim. And we indicate what further research would be needed to firmly establish the cumulativeness of human culture.
Tommaso Campanella (Stilo, 1568–Paris, 1639) was one of the
most important philosophers of the late Renaissance. Although his
best-known work today is the utopian text La città del
Sole (The City of the Sun), his thought was extremely
complex and engaged with all fields of learning. The fundamental core
of his thinking, which will be examined in this article, was concerned
with the philosophy of nature (what would nowadays be called science),
magic, political theory and natural religion.
It seems plausible that visual experiences of darkness have perceptual, phenomenal content which clearly differentiates them from absences of visual experiences. I argue, relying on psychological results concerning auditory attention, that the analogous claim is true about auditory experiences of silence. More specifically, I propose that experiences of silence present empty spatial directions like ‘right’ or ‘left’, and so have egocentric spatial content. Furthermore, I claim that such content is genuinely auditory and phenomenal in the sense that one can, in principle, recognize that she is experiencing silence. This position is far from obvious as the majority of theories concerning silence perception do not ascribe perceptual, phenomenal content to experiences of silence.
Over the past three years, I have returned to one question over and over again: how does technology reshape our moral beliefs and practices? In his classic study of medieval technology, Lynn White Jr argues that simple technological changes can have a profound effect on social moral systems. …
The most striking feature of Autrecourt’s academic career is his
condemnation in 1347. In almost every history of medieval philosophy,
his censure is presented as one of the most important events in
fourteenth-century Paris. In the older literature, Autrecourt’s
views have become linked to allegedly skeptical tendencies in
scholastic thought, and have been unduly shadowed by assumptions about
their relation to the views of William of Ockham. Over the last two
decades, however, it has become apparent that the study of
Autrecourt’s thought has been wrongly placed in the larger
context of the battle against Ockhamism at the University of Paris in
the years 1339–1347.
It is one of the great good fortunes of my life that I was able to count Dick as a friend for almost 40 years. I first met him shortly after I arrived at the University in 1975 as a new assistant professor in the Philosophy Department. I moved to California in 1999, but the friendship continued at a distance after that.
Amodal completion is the representation of those parts of the perceived object that we get no sensory stimulation from. While amodal completion is rife and plays an essential role in all sense modalities, philosophical discussions of this phenomenon have almost entirely been limited to vision. The aim of this paper is to examine in what sense we can talk about amodal completion in olfaction. We distinguish three different senses of amodal completion – spatial, temporal and feature-based completion – and argue that all three are present and play a significant role in olfaction.
This paper explores the feasibility of offering a restorative justice (RJ) approach in cases of domestic violence (DV). I argue that widely used RJ processes—such as ‘conferencing’ —are unlikely to be sufficiently safe or effective in cases of DV, at least as these processes are standardly designed and practiced (Sections 1-6). I then support the view that if RJ is to be used in cases of DV, then new specialist processes will need to be co-designed with key stakeholders to ensure they embody not only RJ principles, but also feminist theory and the concept of transformative justice (Section 7).
Formal criteria of theoretical equivalence are mathematical mappings between specific sorts of mathematical objects, notably including those objects used in mathematical physics. Proponents of formal criteria claim that results involving these criteria have implications that extend beyond pure mathematics. For instance, they claim that formal criteria bear on the project of using our best mathematical physics as a guide to what the world is like, and also have deflationary implications for various debates in the metaphysics of physics. In this paper, I investigate whether there is a defensible view according to which formal criteria have significant non-mathematical implications, of these sorts or any other, reaching a chiefly negative verdict. Along the way, I discuss various foundational issues concerning how we use mathematical objects to describe the world when doing physics, and how this practice should inform metaphysics. I diagnose the prominence of formal criteria as stemming from contentious views on these foundational issues, and endeavor to motivate some alternative views in their stead.

Formal criteria of theoretical equivalence are mathematical mappings between specific sorts of mathematical objects, such as sets of sentences (understood as syntactic strings), or sets of mathematical models, or categories of mathematical models (in the sense of category theory). Philosophers of science working on such criteria first associate different physical theories with some such mathematical objects. They then use theorems about which of these mathematical objects stand in one of these mathematical mappings to each other in order to draw conclusions about which physical theories are (or fail to be) “theoretically equivalent”.
In this paper, I argue that the philosophy of science has not paid enough attention to the future of science. Even though the philosophy of science has deepened our understanding of science, explicit conceptual tools for understanding how possible futures of science might be estimated are missing from its repertoire. I argue that the philosophy of science can achieve two main objectives of futures research: enhancing understanding and challenging conventional thinking. While there are legitimate concerns about the epistemic and ethical impossibility of predicting scientific innovations and discoveries, it is nevertheless possible to investigate a wide range of questions concerning the future of science. I sketch structural taxonomies as a tool for estimating possible futures of science. A structural taxonomy is a map of scenarios that are possible according to some philosophical theory of science. I show how the merits of such taxonomies can be assessed and how the assessment sheds new light on the existing philosophy of science. I conclude by noting that future-oriented thinking is highly valuable for our current understanding of science.
The value-ladenness of computer algorithms is typically framed around issues of epistemic risk. In this paper, I examine a deeper sense of value-ladenness: algorithmic methods are not only themselves value-laden, but also introduce value into how we reason about their domain of application. I call this domain distortion. In particular, using insights from jurisprudence, I show that the use of recidivism risk assessment algorithms (1) presupposes legal formalism and (2) blurs the distinction between liability assessment and sentencing, which distorts how the domain of criminal punishment is conceived and provides a distinctive avenue for values to enter the legal process.
This article surveys the debate focused on manipulation arguments against the compatibility of moral responsibility in the basic desert sense and the naturalistic determination of an action by factors beyond the agent’s control. Manipulation arguments draw an analogy between such causal determination and intentional deterministic manipulation by other agents, claiming that because intentional deterministic manipulation precludes responsibility, causal determination does so as well. The dialectical structure of these arguments is analyzed, and the main sorts of objections are discussed. Compatibilist responses to manipulation arguments can be divided into two categories. Soft-line replies do not resist the intuition that the manipulated agent is not morally responsible, and instead aim to show that an agent who is merely naturalistically determined differs from the manipulated agent in a way relevant to moral responsibility. Hard-line replies, by contrast, resist the intuition, essential to the incompatibilist’s case, that the intentionally and deterministically manipulated agent is not morally responsible. Versions of each type of reply are critically assessed.