Augustine is commonly interpreted as endorsing an extramission theory of perception in De quantitate animae. A close examination of the text shows, instead, that he is committed to its rejection. I end with some remarks about what it takes for an account of perception to be an extramission theory and with a review of the strength of evidence for attributing the extramission theory to Augustine on the basis of his other works.
This paper investigates the conceptual spaces account of graded membership in the case of gradable adjectives. Douven and collaborators have shown that the degree of membership of an item intermediate between two color categories (green vs. blue) or two shape categories (vase vs. bowl) can be derived from the categories’ typical instances. An issue left open is whether the conceptual spaces approach can account for graded membership in more abstract categories. In this paper we consider dimensional adjectives such as tall and expensive, for which the notion of prototypicality is more problematic. We present the results of an empirical study showing that the account can be extended successfully to that class, taking advantage of systematic relations of antonymy in those adjectives. The approach’s assumption that typical instances of a category are equally typical, and its ability to account for inter-individual differences in degree of membership, are discussed.
This book looks interesting:
• David S. Wilson and Alan Kirman, editors, Complexity and Evolution: Toward a New Synthesis for Economics, MIT Press, Cambridge Mass., 2016. You can get some chapters for free here. …
Head of Competence Center for Methodology and Statistics (CCMS)
Luxembourg Institute of Health
Automatic for the people? Not quite
What caught my eye was the estimable (in its non-statistical meaning) Richard Lehman tweeting about the equally estimable John Ioannidis. …
Common sense and traditional metaphysics alike accord shadows a secondary status in the order of things, relegating them from the first rank of genuine substances. Recall, for example, Shirley’s famous lyric: “The Glories of our blood and state // Are shadows, not substantial things”. Or how, in Shakespeare’s Titus Andronicus, Marcus bemoans of his brother Titus that “grief has so wrought on him, He takes false shadows for true substances” (III.ii.79-80).
Finlay (2006) and Schroeder (2007) have developed two similar probabilistic accounts of promotion. On their views, promoting a desire is increasing its probability of being realized (relative to some baseline). Behrends and DiPaolo (2011) have formulated an argument against understanding promotion in purely probabilistic terms. The same argument was later taken up (and further elaborated) by Coates (2014) and Sharadin (2015), who both develop their own understandings of promotion based on the criticism the argument delivers. Recently, in further exchanges on the issue, interesting alternative accounts of promotion have been proposed and more problems for a purely probabilistic understanding have been brought to the fore. Here, however, I would like to turn back the clock a little and call attention to a problem I see with the original argument against probabilistic accounts. More precisely, I am going to argue that the criticism as presented in Behrends and DiPaolo (2011) and in Sharadin (2015) fails. My argument is based on what I am going to call the Humean Core Idea.
Analogical reasoning addresses the question of how evidence from various phenomena can be amalgamated and made relevant for theory development and prediction. In the first part of my contribution, I review some influential accounts of analogical reasoning, both historical and contemporary, focusing in particular on Keynes, Carnap, Hesse, and more recently Bartha. In the second part, I sketch a general framework. To this end, a distinction between a predictive and a conceptual type of analogical reasoning is introduced. I then take up a common intuition according to which (predictive) analogical inferences hold if the differences between source and target concern only irrelevant circumstances. I attempt to make this idea more precise by addressing possible objections and, in particular, by specifying a notion of causal irrelevance based on difference making in homogeneous contexts.
It is widely believed that the semantic contents of some linguistic and mental representations are determined by factors independent of a person’s bodily makeup. Arguments derived from Hilary Putnam’s seminal Twin Earth thought experiment have been especially influential in establishing that belief. I claim that there is a neglected version of the mind-body relation which undermines those arguments and also excludes the possibility of zombies. It has been neglected because it is counterintuitive, but I show that it can nonetheless be intelligibly worked out in detail and all obvious objections met. This suggests that we may be faced with a choice between embracing a counterintuitive interpretation of the mind-body relation or accepting that a currently very promising theory in cognitive science, Prediction Error Minimization, faces a fundamental problem. Furthermore, blocking that threat entails that any physicalist/materialist theory of mind is freed from the spectre of zombie worlds. The proposal also makes the ideas of personal teleportation and mind uploading more plausible.
This paper provides a critical guide to the literature concerning the answer to the question: when does a quantum experiment have a result? This question was posed and answered by Rovelli, and his proposal was critiqued by Oppenheim, Reznik and Unruh, who also suggest another approach that (as they point out) leads to the quantum Zeno effect. What these two approaches have in common is the idea that a question about the time at which an event occurs can be answered through the instantaneous measurement of a projector (in Rovelli’s case, a single measurement; in that of Oppenheim et al., a repeated measurement). However, the interpretation of a projection as an instantaneous operation that can be performed on a system at a time of the experimenter’s choosing is problematic, particularly when it is the time of the outcome of the experiment that is at issue.
A striking characteristic of the highly successful techniques in molecular biology is that they are derived from natural systems. RNA interference (RNAi), for example, utilises a mechanism that evolved in eukaryotes to destroy foreign nucleic acid. Other examples include restriction enzymes, the polymerase chain reaction, green fluorescent protein and CRISPR-Cas. I propose that biologists exploit natural molecular mechanisms for the activity of their effectors (protein or nucleic acid) and for their biological specificity (the effector can cause precise reactions). I also show that the developmental trajectory of novel techniques in molecular biology, such as RNAi, passes through four characteristic phases. The first phase is the discovery of a biological phenomenon, typically through curiosity-driven research. The second is the identification of the mechanism’s trigger(s), effector and biological specificity. The third is the application of the technique. The final phase is the maturation and refinement of the technique. The development of new molecular biology techniques from nature is crucial for biological research; these techniques transform scientific knowledge and generate new knowledge.
The extended mind thesis claims that at least some cognitive processes extend beyond the organism’s brain in that they are constituted by the organism’s actions on its surrounding environment. A more radical move would be to claim that social actions performed by the organism could at least constitute some of its mental processes. This can be called the socially extended mind thesis. Based on the notion of affordance as developed in the ecological psychology tradition, I defend the position that perception extends to the environment. Then I will expand the notion of affordance to encompass social affordances. Thus, perception can in some situations also be socially extended.
The popular impression of Bohmian mechanics is that it is standard quantum mechanics with the addition of some extra gadgets (exact particle positions and a guiding equation for particle trajectories), the advantage being that the gadgets pave the way for a resolution of the measurement problem that eschews state vector reduction while restoring the determinism lost in standard quantum mechanics. In fact, Bohmian mechanics departs in significant ways from standard quantum mechanics. By itself this is not a basis for criticism; indeed, it makes Bohmian mechanics all the more interesting. But Bohmian mechanics is not, as the popular impression would have it, empirically equivalent to standard quantum mechanics in terms of probabilistic predictions for the outcomes of measurements of quantum observables. Indeed, in physically important applications to systems for which standard quantum mechanics delivers empirically well-confirmed probabilistic predictions, the sophisticated form of Bohmian mechanics designed to prove the global existence of Bohmian particle trajectories fails to deliver unequivocal predictions, even of a probabilistic variety, for the future behavior of said systems. Possible responses to this lacuna are discussed.
The idea that quantum probabilities are best construed as the personal/subjective degrees of belief of Bayesian agents is an old one. In recent years the idea has been vigorously pursued by a group of physicists who fly the banner of quantum Bayesianism (QBism). The present paper aims to identify the prospects and problems of implementing QBism, and it critically assesses the claim that QBism provides a resolution (or dissolution) of some of the long-standing foundational issues in quantum mechanics, including the measurement problem and puzzles of non-locality.
The biological sciences have always proven a fertile ground for philosophical analysis, one from which has grown a rich tradition stemming from Aristotle and flowering with Darwin. And although contemporary philosophy is increasingly becoming conceptually entwined with the study of the empirical sciences, with the data of the latter now regularly utilised in the establishment and defence of the frameworks of the former (a practice especially prominent in the philosophy of physics), the development of that tradition hasn’t received the wider attention it so thoroughly deserves. This review will briefly introduce some recent significant topics of debate within the philosophy of biology, focusing on those whose metaphysical themes (in everything from composition to causation) are likely to be of wide-reaching, cross-disciplinary interest.
I argue that there is an important similarity between causation and grounding. In particular I argue that, just as there is a type of scientific explanation that appeals to causal mechanisms—causal-mechanical explanation—there is a type of metaphysical explanation that appeals to grounding mechanisms—grounding-mechanical explanation. The upshot is that the role that grounding mechanisms play in certain metaphysical explanations mirrors the role that causal mechanisms play in certain scientific explanations. In this light, it becomes clear that grounding-mechanical explanations make crucial contributions to the evaluation of a variety of important philosophical theses, including priority monism and physicalism.
Neyman April 16, 1894 – August 5, 1981
For my final Jerzy Neyman item, here’s the post I wrote for his birthday last year:
A local acting group is putting on a short theater production based on a screenplay I wrote: “Les Miserables Citations” (“Those Miserable Quotes”). …
Here are two technical problems with consciousness causes collapse (ccc) interpretations of quantum mechanics. In both, suppose a quantum experiment with two possible outcomes, A and B, of equal probability 1/2. …
Polynesian sailors developed elaborate techniques for long-distance sea travel long before their European counterparts. They mapped out the elevation of the stars; they followed the paths of migrating birds; they observed sea swells and tidal patterns. …
Much recent work in modal epistemology assumes a kind of modal realism according to which reality includes basic modal elements—basic capacities, essences, counterfactuals, etc., which are simply out there, waiting to be discovered. Alternative views of modality put modal epistemology in a very different light. On the reductionist Humean view championed by Lewis (e.g. [Lewis 1986b], [Lewis 1986a], [Lewis 1994]), modal statements express ultimately non-modal propositions concerning the spatiotemporal distribution of categorical properties, and thus modal knowledge is not knowledge of special modal facts. On conventionalist accounts (like [Ayer 1936] or [Sidelle 1989]), modal knowledge is presumably knowledge of linguistic conventions. On projectivist accounts (like [Skyrms 1980] or [Blackburn 1986]), modal knowledge is not knowledge of genuine objective facts at all.
I’ll continue to post Neyman-related items this week in honor of his birthday. This isn’t the only paper in which Neyman makes it clear he denies a distinction between a test of statistical hypotheses and significance tests. …
Today is Jerzy Neyman’s birthday. I’ll post various Neyman items this week in honor of it, starting with a guest post by Aris Spanos. Happy Birthday Neyman! A. Spanos
A Statistical Model as a Chance Mechanism
Jerzy Neyman (April 16, 1894 – August 5, 1981) was a Polish-American statistician who spent most of his professional career at the University of California, Berkeley. …
In our ASSC20 symposium, “Does unconscious perception really exist?”, the four of us asked some difficult questions about the purported phenomenon of unconscious perception, disagreeing on a number of points. This disagreement reflected the objective of the symposium: not only to come together to discuss a single topic of keen interest to the ASSC community, but to do so in a way that would fairly and comprehensively represent the heterogeneity of ideas, opinions, and evidence that exists concerning this contentious topic. The crux of this controversy rests in no small part on disagreement about what is meant by the terms of the debate and how to determine empirically whether a state is unconscious or not.
Similar to other complex behaviors, language is dynamic, social, multimodal, patterned, and purposive, its purpose being to promote desirable actions or thoughts in others and self (Edelman, 2017b). An analysis of the functional characteristics shared by complex sequential behaviors suggests that they all present a common overarching computational problem: dynamically controlled constrained navigation in concrete or abstract situation spaces. With this conceptual framework in mind, I compare and contrast computational models of language and evaluate their potential for explaining linguistic behavior and for elucidating the brain mechanisms that support it.
The current issue of Philosophy Now has a little article on Carnap by one Alistair MacFarlane, a Scottish electrical engineer who has held a number of academic administrative posts. To judge by a few of the details he relates about Carnap’s life, he seems to have known or met Carnap personally, though he also commits a surprising number of factual errors. …
In the recent second edition of William Seager’s book Theories of Consciousness: An Introduction and Assessment he addresses some of my work on the higher-order theory. I haven’t yet read the entire book but he seems generally very skeptical of higher-order theories, which is fine. …
The theory of evolution by natural selection is, perhaps, the crowning intellectual achievement of the biological sciences. There is, however, considerable debate about which entity or entities are selected and what it is that fits them for that role. This article aims to clarify what is at issue in these debates by identifying four distinct, though often confused, concerns, and then showing how the debates over what constitute the units of selection depend to a significant degree on which of these four questions a thinker regards as central.
In philosophy of statistics, Deborah Mayo and Aris Spanos have championed the following epistemic principle, which applies to frequentist tests: Severity Principle (full): Data x (produced by process G) provide good evidence for hypothesis H (just) to the extent that test T severely passes H with x (Mayo and Spanos 2011, p. 162). They have also devised a severity score that is meant to measure the strength of the evidence by quantifying the degree of severity with which H passes the test T (Mayo and Spanos 2006, 2011; Spanos 2013). That score is a real number defined on the interval [0,1]. It is particularly high for hypotheses that differ substantially from the null hypothesis when a significant result is obtained using an underpowered test, which means that such hypotheses count as very well supported by the evidence according to that measure. However, it is now well documented that significant tests with low power yield inflated effect sizes: they systematically show departures from the null hypothesis H0 that are much greater than they really are. “Theoretical considerations prove that when true discovery is claimed based on crossing a threshold of statistical significance and the discovery study is underpowered, the observed effects are expected to be inflated” (Ioannidis 2008, p. 640). This is problematic in research contexts where the difference between H0 and H1 is particularly small and where the sample size is also small (see Button et al. 2013; Ioannidis 2008; Gelman and Carlin 2014 for examples).
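The inflation phenomenon described here (sometimes called the “winner’s curse”) is easy to reproduce by simulation. The sketch below is illustrative only: it assumes a one-sided z-test with known unit variance, and the true effect size, sample size, and significance threshold are arbitrary choices of mine, not values from the cited studies. Among the simulated experiments that cross the significance threshold, the average observed effect is several times the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)

true_effect = 0.2   # small true departure from H0: mu = 0 (assumed value)
n = 10              # small sample size -> low power (assumed value)
sims = 20_000       # number of simulated experiments

# Each row is one experiment: n draws from N(true_effect, 1).
samples = rng.normal(true_effect, 1.0, size=(sims, n))
means = samples.mean(axis=1)

# One-sided z-test with known sigma = 1: reject H0 when z > 1.645 (alpha = 0.05).
z = means * np.sqrt(n)
significant = z > 1.645

power = significant.mean()
observed = means[significant].mean()

print(f"power of the test        ~ {power:.2f}")
print(f"true effect size         = {true_effect}")
print(f"mean observed effect among significant runs ~ {observed:.2f} "
      f"(inflation factor ~ {observed / true_effect:.1f})")
```

Conditioning on significance truncates the sampling distribution of the observed mean from below, so with low power only unusually large sample means clear the threshold, which is exactly the systematic overestimation the Ioannidis quotation describes.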
Traditionally, theories of mindreading have focused on the representation of beliefs and desires. However, decades of social psychology and social neuroscience have shown that, in addition to reasoning about beliefs and desires, human beings also use representations of character traits to predict and interpret behavior. While a few recent accounts have attempted to accommodate these findings, they have not succeeded in explaining the relation between trait attribution and belief-desire reasoning. On the account I propose, character-trait attribution is part of a hierarchical system for action prediction, and serves to inform hypotheses about agents' beliefs and desires, which are in turn used to predict and interpret behavior.
It is a standard understanding that we live in time. In fact, the whole physical world as described in the sciences is based on the idea of objective (not absolute) time. For centuries we have defined time ever more minutely, basing it on finer and finer event measurements (from uncoiling springs to atomic clocks), so that we do not even notice that we have made an inductive leap when it comes to time: we can measure time, so we experience time. In the current work I wish to critique this inductive leap and examine what it means to experience time. We are embodied and embedded cognitive agents, constrained by our bodies as well as in continuous interaction with our environment. So another way to ask the question of temporal experience would be: how embodied is time? I posit that the experience of time spoken of in the general literature is a linguistic construct, in that the idea of an experience of time overshadows the actual phenomenal contents of time perception. Moreover, time perception itself comes from a post-facto judgment of events. It has also been observed that the order of events in time can be altered to create an illusion of a violation of causality itself. This points to the possibility that events are arranged in a temporal map that can be read off by higher cognitive substrates. In the current work we go on to explore the nature of such a map as it emerges from an embodied mind.
The demarcation between science and pseudoscience is part of the larger task of determining which beliefs are epistemically warranted. This entry clarifies the specific nature of pseudoscience in relation to other categories of non-scientific doctrines and practices, including science denial(ism) and resistance to the facts. The major proposed demarcation criteria for pseudoscience are discussed and some of their weaknesses are pointed out. In conclusion, it is emphasized that there is much more agreement on particular cases of demarcation than on the general criteria that such judgments should be based on.