The concept of wild food does not play a significant role in contemporary nutritional science and it is seldom regarded as a salient feature within standard dietary guidelines. The knowledge systems of wild edible taxa are indeed at risk of disappearing. However, recent scholarship in ethnobotany, field biology, and philosophy has demonstrated the crucial role of wild foods for food biodiversity and food security. The knowledge of how to use and consume wild foods is not only a means to deliver high-end culinary offerings, but also a way to foster alternative models of consumption. Our aim in this paper is to provide a conceptual framework for wild foods, which can account for diversified wild food ontologies. In the first section of the paper, we survey the main conception of wild foods provided in the literature, what we call the Nature View. We argue that this view falls short of capturing characteristics that are core to a sound account of wilderness in a culinary sense. In the second part of the paper, we provide the foundation for an improved model of wild food, which can countenance multiple dimensions and degrees characterizing wilderness in the culinary world. In the third part of the paper, we argue that, thanks to a more nuanced ontological analysis, the gradient framework can serve ethnobiologists, philosophers, scientists, and policymakers to represent and negotiate theoretical conflicts on the nature of wild food.
Quantum Field Theory (QFT) is the mathematical and conceptual
framework for contemporary elementary particle physics. It is also a
framework used in other areas of theoretical physics, such as
condensed matter physics and statistical mechanics. In a rather
informal sense QFT is the extension of quantum mechanics (QM), dealing
with particles, over to fields, i.e. systems with an infinite number
of degrees of freedom. (See the entry on
quantum mechanics.) In the last decade QFT has become a more widely discussed
topic in philosophy of science, with questions ranging from
methodology and semantics to ontology.
My aims in this essay are twofold. First (§§1-4), I want to get clear on the very idea of a theory of the history of philosophy, the idea of an overarching account of the evolution of philosophical reflection since the inception of written philosophy. And secondly (§§5-8), I want to actually sketch such a global theory of the history of philosophy, which I call the two-streams theory.
Inspired by the work of Stefano Zambelli on these topics, this paper examines the complex nature of the relation between technology and computability. This involves reconsidering the role of computational complexity in economics and then applying this to a particular formulation of the nature of technology as conceived within the Sraffian framework. A crucial element of this is to expand the concept of technique clusters. This allows for understanding that the set of possible techniques is of a higher cardinality of infinity than that of the points on a wage-profit frontier. This is associated with potentially deep discontinuities in production functions and a higher form of uncertainty involved in technological change and growth.
We present an objection to Beall and Henderson’s recent paper defending a solution to the fundamental problem of conciliar Christology using qua or secundum clauses. We argue that certain claims the acceptance/rejection of which distinguish the Conciliar Christian from others fail to so distinguish on Beall and Henderson’s 0-Qua view. This is because, on their 0-Qua account, these claims are either acceptable both to Conciliar Christians and to those who are not Conciliar Christians, or acceptable to neither.
Art works are artefacts and, like all artefacts, are the product of agency. How important is that for our engagement with them? For many artefacts, agency hardly matters. The paperclips on my desk perform their function without me having to think of them as the outputs of agency, though I might on occasion admire their design. But for those artefacts we categorise as works of art, the connection is important: if I treat something as art I need to see how it manifests the choices, preferences, actions and sensibilities of the maker. I am not asked to see it simply as a record of those things. The work is not valuable merely as a conduit to the qualities of the maker; it has final value and not merely instrumental value. Its value depends on its relation to the maker; in Korsgaard’s terms it is value that is final and extrinsic.
Relying on some auxiliary assumptions, usually considered mild, Bell’s theorem proves that no local theory can reproduce all the predictions of quantum mechanics. In this work, we introduce a fully local, superdeterministic model that, by explicitly violating settings independence—one of these auxiliary assumptions, requiring statistical independence between measurement settings and systems to be measured—is able to reproduce all the predictions of quantum mechanics. Moreover, we show that, contrary to widespread expectations, our model can break settings independence without an initial state that is too complex to handle, without visibly losing all explanatory power and without outright nullifying all of experimental science. Still, we argue that our model is unnecessarily complicated and does not offer true advantages over its non-local competitors. We conclude that, while our model does not appear to be a viable contender against its non-local counterparts, it provides the ideal framework to advance the debate over violations of statistical independence via the superdeterministic route.
Fragmentalism was originally introduced as a new A-theory of time. It was further refined and discussed, and different developments of the original insight have been proposed. In a celebrated paper, Jonathan Simon contends that fragmentalism delivers a new realist account of the quantum state—which he calls conservative realism—according to which: (i) the quantum state is a complete description of a physical system; (ii) the quantum (superposition) state is grounded in its terms; and (iii) the superposition terms are themselves grounded in local goings-on about the system in question. We will argue that fragmentalism, at least along the lines proposed by Simon, does not offer a new, satisfactory realist account of the quantum state. This raises the question of whether there are other viable forms of quantum fragmentalism.
This paper develops Richard Wollheim’s claim that the proper appreciation of a picture involves not only enjoying a seeing-in experience but also abiding by a standard of correctness. While scholars have so far focused on what fixes the standard, thereby discussing the alternative between intentions and causal mechanisms, the paper focuses on what the standard does, that is, establishing which kinds, individuals, features and standpoints are relevant to the understanding of pictures. It is argued that, while standards concerning kinds, individuals and features can be relevant also to ordinary perception, standards concerning standpoints are specific to pictorial experience. Drawing on all this, the paper proposes an ontology of depiction according to which a picture is constituted by both its visual appearance and its standard of correctness.
Proclus of Athens (*412–485 C.E.) was the most authoritative
philosopher of late antiquity and played a crucial role in the
transmission of Platonic philosophy from antiquity to the Middle Ages. For almost fifty years, he was head or ‘successor’
(diadochos, sc. of Plato) of the Platonic
‘Academy’ in Athens. Being an exceptionally productive
writer, he composed commentaries on Aristotle, Euclid and Plato,
systematic treatises in all disciplines of philosophy as it was at
that time (metaphysics and theology, physics, astronomy, mathematics,
ethics) and exegetical works on traditions of religious wisdom
(Orphism and Chaldaean Oracles).
I argue that in addressing worries about the validity and reliability of implicit measures of social cognition, theorists should draw on research concerning “entitativity perception.” In brief, an aggregate of people is perceived as highly “entitative” when its members exhibit a certain sort of unity. For example, think of the difference between the aggregate of people waiting in line at a bank versus a tight-knit group of friends: the latter seems more “groupy” than the former. I start by arguing that entitativity perception modulates the activation of implicit biases and stereotypes. I then argue that recognizing this modulatory role will help researchers to address concerns surrounding the validity and reliability of implicit measures.
6. We desire love as a function of the relational nature of our being. Ontologically, we are not complete or sufficient unto ourselves. We do not and cannot provide the 'space' (both physical and emotional) we must occupy in order to be what and as we are.
In the year 2000, in a paper titled Quantum Theory Needs No ‘Interpretation’, Chris Fuchs and Asher Peres presented a series of instrumentalist arguments against the role played by ‘interpretations’ in QM. Since then, quite regardless of the publication of this paper, the number of interpretations has experienced a continuous growth, constituting what Adán Cabello has characterized as a “map of madness”. In this work, we discuss the reasons behind this dangerous fragmentation in understanding and provide new arguments against the need for interpretations in QM which, unlike those of Fuchs and Peres, are derived from a representational realist understanding of theories, grounded in the writings of Einstein, Heisenberg and Pauli. Furthermore, we will argue that there are reasons to believe that the creation of ‘interpretations’ for the theory of quanta has functioned as a trap designed by anti-realists in order to imprison realists in a labyrinth with no exit. Taking as a standpoint David Deutsch’s critical analysis of the anti-realist understanding of physics, we attempt to address the references and roles played by ‘theory’ and ‘observation’. In this respect, we will argue that the key to escaping the anti-realist trap of interpretation is to recognize that, as Einstein told Heisenberg almost one century ago, it is only the theory which can tell you what can be observed. Finally, we will conclude that what QM needs is not a new interpretation but instead a consistent, coherent and unified theoretical (formal-conceptual) scheme which allows us to understand what the theory is really talking about. Keywords: Interpretation, explanation, representation, quantum theory.
This article sheds light on a response to experimental philosophy that has not yet received enough attention: the reflection defense. According to proponents of this defense, judgments about philosophical cases are relevant only when they are the product of careful, nuanced, and conceptually rigorous reflection. We argue that the reflection defense is misguided: We present five studies (N>1800) showing that people make the same judgments when they are primed to engage in careful reflection as they do in the conditions standardly used by experimental philosophers.
This paper argues that while the classical, essentialist conception of identity is appealing due to its simplicity, it does not adequately capture the complexity of professional or individual identity. The appeal to essentialism in librarianship contributes to some serious problems for the profession, such as exclusion and homogeneity in the workplace, high attrition rates of minority librarians, exploitation and alienation of an underrepresented workforce, as well as stereotyping. This paper examines the theoretical landscape with regard to the identity question and proposes a more fitting alternative to essentialism, namely the relational conception of identity. It then argues philosophically for the adoption of the relational account as a theoretical grounding for an understanding of the complex, fluid, and emergent nature of librarian identity within our dynamic profession.
It has been argued in various places that measurement-induced collapses in Orthodox Quantum Mechanics yield a genuine structural (or intrinsic) quantum arrow of time. In this paper, I will critically assess this proposal. I begin by distinguishing between a structural and a non-structural arrow of time. After presenting the proposal of a collapse-based arrow of time in some detail and discussing some criticisms it has faced, I argue, first, that any quantum arrow of time in Orthodox Quantum Mechanics cannot be defined for the entire universe and, second, that it requires non-dynamical information to be established. Consequently, I conclude that any quantum arrow of time in Orthodox Quantum Mechanics is, at best, local and non-structural, deflating the original proposal.
The notion of time reversal has caused some recent controversy in philosophy of physics. In this paper, I claim that the notion is more complex than usually thought. In particular, I contend that any account of time reversal presupposes, explicitly or implicitly, an answer to the following questions: (a) What is time-reversal symmetry predicated of? (b) What sorts of transformations should time reversal perform, and upon what? (c) What role does time-reversal symmetry play in physical theories? Each dimension, I argue, not only admits divergent answers, but also opens a dimension of analysis that feeds the complexity of time reversal: modal, metaphysical, and heuristic, respectively. The comprehension of this multi-dimensionality, I conclude, shows how philosophically rich the notion of time reversal is in philosophy of physics.
A widespread view in physics holds that the implementation of time reversal in standard quantum mechanics must be given by an anti-unitary operator. In foundations and philosophy of physics, however, there has been some discussion about the conceptual grounds of this orthodoxy, largely relying on either its obviousness or its mathematical-physical virtues. My aim in this paper is to substantively change the traditional structure of the debate by highlighting the philosophical commitments underlying the orthodoxy. I argue that the persuasive force of the orthodoxy greatly depends on a relationalist metaphysics of time and a by-stipulation view of time-reversal invariance. Only with such a philosophical background can the orthodoxy of time reversal in standard quantum mechanics succeed and be properly justified.
This article addresses three questions concerning Kant's views on non-rational animals: do they intuit spatio-temporal particulars, do they perceive objects, and do they have intentional states? My aim is to explore the relationship between these questions and to clarify certain pervasive ambiguities in how they have been understood. I first disambiguate various nonequivalent notions of objecthood and intentionality. I then look closely at several models of objectivity present in Kant's work, and at recent discussions of representational and relational theories of intentionality. I argue ultimately that, given the relevant disambiguations, the answers to all three questions will likely be positive. These results both support what has become known as the nonconceptualist reading of Kant, and make clearer the price the conceptualist must pay to sustain his or her position.
I argue that the science of the soul only covers sublunary living things. Aristotle cannot properly ascribe ψυχή to unmoved movers since they do not have any capacities that are distinct from their activities or any matter to be structured. Heavenly bodies do not have souls in the way that mortal living things do, because their matter is not subject to alteration or generation. These beings do not fit into the hierarchy of soul powers that Aristotle relies on to provide unity to ψυχή. Their living consists in their activities, not in having a capacity for activity.
Have you ever disagreed with your government’s stance about some significant social, political, economic, or even philosophical issue? For example: Healthcare policy? Response to a pandemic? Gender inequality? Structural racism? Drilling in the Arctic? Fracking? Approving or vetoing a military intervention in a foreign country? Transgender rights? Exiting some multi-national political alliance (for instance, the European Union)? The building of a 20 billion dollar wall? We’re guessing the answer is most likely ‘yes’.
Albert and Callender have challenged the received view that theories like classical electrodynamics and non-relativistic quantum mechanics are time-reversal invariant. According to their view of time-reversal invariance, these theories are not time-reversal invariant. If so, then the important metaphysical implication is that space-time must have a temporal orientation. There is a large debate about the best way of viewing time-reversal invariance, with many philosophers defending the standard notion contra Albert and Callender. In this paper, we will not be concerned so much with that aspect of the debate, but rather focus our attention on an aspect of the Albert and Callender view that has received little attention, namely the role of ontology. In the type of theories under consideration, the ontology is actually underdetermined. We will argue that, with a suitable choice of ontology, these theories are in fact time-reversal invariant according to their view.
The movement toward scientific literacy aims to cultivate a public able to make informed decisions about science in their own lives (e.g., personal health, sustainable practices, &c.) and their support of social policies for themselves, rather than passively accepting information they are given. Many people continue learning about science — its discoveries, nature, ramifications on society, and so on — through generalist media sources such as newspapers. What are they apt to learn from such sources? This paper examines the ways in which print journalism (sampled from three prominent newspapers) in the 2010s presents science — investigating, in particular, to what extent these sources attend to the methodology or the social–institutional processes by which particular results come about. We make a case for the significance of this question in connection with the public’s understanding and trust of science.
Both scientists and philosophers of science have recently emphasized the importance of promoting transparency in science. For scientists, transparency is a way to promote reproducibility, progress, and trust in research. For philosophers of science, transparency can help address the value-ladenness of scientific research in a responsible way. Nevertheless, the concept of transparency is a complex one. Scientists can be transparent about many different things, for many different reasons, on behalf of many different stakeholders. This paper proposes a taxonomy that clarifies the major dimensions along which approaches to transparency can vary. By doing so, it provides several insights that philosophers and other science-studies scholars can pursue. In particular, it helps address common objections to pursuing transparency in science, it clarifies major forms of transparency, and it suggests avenues for further research on this topic.
Philosophers of science are increasingly interested in engaging with scientific communities, policymakers, and members of the public; however, the nature of this engagement has not been systematically examined. Instead of delineating a specific kind of engaged philosophy of science, as previous accounts have done, this paper draws on literature from outside the discipline to develop a framework for analyzing different forms of broadly engaged philosophy of science according to two key dimensions: social interaction and epistemic integration. Clarifying the many forms of engagement available to philosophers of science can advance future scholarship on engagement and promote more strategic engagement efforts.
Data scientists take large quantities of noisy measurements and transform them into tractable, qualitative descriptions of the phenomena being measured. While this frequently involves statistical methods, the burgeoning field of data science distinguishes itself from statistics by branching out to a wider range of methods from mathematics and computer science. One such distinctly non-statistical method of growing popularity is topological data analysis (TDA). Topology is the study of the properties of shapes that are invariant under continuous deformations, such as stretching, twisting, bending, or re-scaling, but not tearing or gluing. TDA aims to identify the essential “structure” of a data set as it “appears” in an abstract space of measurement outcomes. This paper is an attempt to reconstruct the reasoning given by data scientists as to why and how the resulting analysis should be understood as reflecting significant features of the systems that generated the data.
The concept of indistinguishable particles in quantum theory is fundamental to questions of ontology. All ordinary matter is made of electrons, protons, neutrons, and photons and they are all indistinguishable particles. Yet the concept itself has proved elusive, in part because of the interpretational difficulties that afflict quantum theory quite generally, and in part because the concept was so central to the discovery of the quantum itself, by Planck in 1900; it came encumbered with revolution.
This post about epistemic injustice and implicit bias by Susanna Siegel is the third post of this week’s series on An Introduction to Implicit Bias: Knowledge, Justice, and the Social Mind (Routledge, 2020). …
This paper is a clarification and development of my interpretation of Sartre’s theory of bad faith in response to Ronald Santoni’s sophisticated critique, published in the same issue. Santoni rightly points out that the central claim of my interpretation is that bad faith is a fundamental project manifested in all our other projects. This paper therefore begins with a clarification of Sartre’s conception of a project, followed by an explanation of his claim that one project is fundamental, grounding an elucidation of the idea that bad faith is a fundamental project. The paper then uses this to address the central themes of Santoni’s critique of my interpretation. I argue that Sartre does not consider us to be ontologically and congenitally disposed to bad faith. The prevalence of bad faith is explained, on my reading of Sartre, by the social pressure to conform to it, which is inherent in the project itself. Santoni is right that this cannot really explain the prevalence of bad faith, but this is a problem with Sartre’s theory, not a problem for my interpretation of it. I then defend my claim that Sartre’s notion of seriousness is merely a strategy of bad faith by outlining an alternative strategy that Sartre does not consider. Finally, I argue that Sartre is right to deny that bad faith is an inherently cynical project, even though it is manipulative and self-serving, and even though it can be cynically motivated.
This post about embodied cognition and implicit bias by Céline Leboeuf is the second post of this week’s series on An Introduction to Implicit Bias: Knowledge, Justice, and the Social Mind (Routledge, 2020). …