Augustine is commonly interpreted as endorsing an extramission theory of perception in De quantitate animae. A close examination of the text shows, instead, that he is committed to its rejection. I end with some remarks about what it takes for an account of perception to be an extramission theory and with a review of the strength of evidence for attributing the extramission theory to Augustine on the basis of his other works.
Higher-order evidence is, roughly, evidence of evidence. The idea is that evidence comes in levels. At the lowest level is evidence of the familiar type: evidence concerning some proposition that is not itself about evidence. At a higher level, the evidence concerns some proposition about evidence at a lower level. Only in recent years has this less familiar type become a focus of epistemological attention, and work on it remains confined to a small circle of authors and receives far less attention than it deserves. It deserves to occupy center stage for several reasons. First, higher-order evidence arises in a diverse range of contexts, including testimony, disagreement, empirical observation, introspection, and memory, among others. Second, such evidence often plays a crucial epistemic role in these contexts. Third, the role it plays is complex, yields interesting epistemological puzzles, and therefore remains controversial and not yet fully understood. Although the ultimate goal of an investigation into higher-order evidence is an account of its epistemic significance, my present concern is more fundamental. I have two primary sets of goals here. The first is expositional: to serve as an introduction for readers new to the topic. The second is argumentative: to establish that existing characterizations of higher-order evidence and various related concepts are in dire need of refinement, to demonstrate that this lack of refinement is the source of major errors in the literature, and to provide the needed refinement to set the stage for further progress.
This paper investigates the conceptual spaces account of graded membership as applied to gradable adjectives. Douven and collaborators have shown that the degree of membership of an item intermediate between two color categories (green vs. blue) or two shape categories (vase vs. bowl) can be derived from the categories’ typical instances. An open issue is whether the conceptual spaces approach can account for graded membership in more abstract categories. In this paper we consider dimensional adjectives such as tall and expensive, for which the notion of prototypicality is more problematic. We present the results of an empirical study showing that the account extends successfully to this class, by exploiting the systematic relations of antonymy among such adjectives. We also discuss the approach’s assumption that typical instances of a category are equally typical, and its ability to account for inter-individual differences in degrees of membership.
John Locke (b. 1632, d. 1704) was a British philosopher, Oxford
academic and medical researcher. Locke’s monumental An Essay
Concerning Human Understanding (1689) is one of the first great
defenses of modern empiricism and concerns itself with determining the
limits of human understanding with respect to a wide spectrum of topics. It thus tells us in some detail what one can legitimately claim to
know and what one cannot. Locke’s association with Anthony Ashley
Cooper (later the First Earl of Shaftesbury) led him to become
successively a government official charged with collecting information
about trade and colonies, economic writer, opposition political
activist, and finally a revolutionary whose cause ultimately triumphed
in the Glorious Revolution of 1688.
Two-dimensional (2D) semantics is a formal framework that is used to
characterize the meaning of certain linguistic expressions and the
entailment relations among sentences containing them. Two-dimensional
semantics has also been applied to thought contents. In contrast with
standard possible worlds semantics, 2D semantics assigns extensions
and truth-values to expressions relative to two possible world
parameters, rather than just one. So a 2D semantic framework provides
finer-grained semantic values than those available within standard
possible world semantics, while using the same basic model-theoretic machinery.
Antoine Arnauld (1612–1694) was a powerful figure in the
intellectual life of seventeenth-century Europe. He had a long and
highly controversial career as a theologian, and was an able and
influential philosopher. His writings were published and widely read
over a period of more than fifty years and were assembled in
1775–1782 in forty-two large folio volumes. Evaluations of Arnauld’s work as a theologian vary. Ian Hacking,
for example, says that Arnauld was “perhaps the most brilliant
theologian of his time” (Hacking 1975a, 25). Ronald Knox, on the
other hand, says, “It was the fashion among the Jansenists to
represent Antoine Arnauld as a great theologian; he should be
remembered, rather as a great controversialist… A theologian by
trade, Arnauld was a barrister by instinct” (Knox 1950, 196).
Consider this Thomistic-style doctrine:
God’s believing that a contingent entity x exists is the cause of x’s existing. Let B be God’s believing that I exist. Then, either
(1) B exists in all possible worlds, or
(2) B exists in all and only the worlds where I exist. …
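The disjunction can be put in possible-worlds notation. The following formalization is my own sketch, not the original author's; E(x, w) abbreviates "x exists in world w":

```latex
% My sketch of the two horns of the dilemma. B is God's believing
% that I exist; E(x,w) abbreviates "x exists in world w".
\[
\text{(1)}\quad \forall w\, E(B,w)
\qquad \text{or} \qquad
\text{(2)}\quad \forall w\, \bigl( E(B,w) \leftrightarrow E(\text{me},w) \bigr)
\]
% On horn (1), B exists necessarily; on horn (2), B's existence is
% exactly as contingent as mine.
```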
Imogen Dickie, Fixing Reference (Oxford, 2016)
So, why is the proposal of Fixing Reference not guilty of circularity or equivocation or both? Here are two observations to prepare the way for the answer to this question that I want to propose. …
Platonists hold that properties exist independently of their instances. Heavyweight Platonists add the further thesis that the characterization of objects is grounded in or explained by the instantiation of a property, at least in fundamental cases. …
This book looks interesting:
• David S. Wilson and Alan Kirman, editors, Complexity and Evolution: Toward a New Synthesis for Economics, MIT Press, Cambridge Mass., 2016. You can get some chapters for free here. …
Head of Competence Center for Methodology and Statistics (CCMS)
Luxembourg Institute of Health
Automatic for the people? Not quite
What caught my eye was the estimable (in its non-statistical meaning) Richard Lehman tweeting about the equally estimable John Ioannidis. …
A ‘perceptual demonstrative’ thought is a thought of the kind standardly made available by a perceptual link with an ordinary thing, and standardly expressed using ‘this’ or ‘that’ – for example, when you look at a grapefruit on the table in front of you and think the thought you would express by saying ‘That is orange’, you are having a thought of this kind. …
Common-sense and traditional metaphysics alike accord shadows a secondary status in the order of things, relegating them from the first rank of genuine substances. Recall, for example, Shirley’s famous lyric: “The Glories of our blood and state / Are shadows, not substantial things”. Or how in Shakespeare’s Titus Andronicus, Marcus bemoans of his brother, Titus, that “grief has so wrought on him, He takes false shadows for true substances” (III.ii.79–80).
The 11th and final chapter of Idealism and Christian Theology is “Idealistic Ethics and Berkeley’s Good God” by Timo Airaksinen. This is a rich, complex, and careful treatment of Berkeley’s ethical thought. …
Philosophers have three broad methods for settling disputes: appeal to "common sense" or culturally common presuppositions, appeal to scientific evidence, and appeal to theoretical virtues like simplicity, coherence, fruitfulness, and pragmatic value. …
Finlay (2006) and Schroeder (2007) have developed two similar probabilistic accounts of promotion. On their views, promoting a desire is increasing its probability of being realized (relative to some baseline). Behrends and DiPaolo (2011) formulated an argument against understanding promotion in purely probabilistic terms. The same argument was later taken up (and further elaborated) by Coates (2014) and Sharadin (2015), who both develop their own understandings of promotion based on the criticism the argument delivers. Recently, in further exchanges on the issue, interesting alternative accounts of promotion have been proposed and further problems for a purely probabilistic understanding have been brought to the fore. Here, however, I would like to turn back the clock a little and call attention to a problem I see with the original argument against probabilistic accounts. More precisely, I am going to argue that the criticism as presented in Behrends and DiPaolo (2011) and in Sharadin (2015) fails. My argument is based on what I am going to call the Humean Core Idea.
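The probabilistic account can be stated schematically. The following Python sketch is my own illustration of the core idea only; the function name, the numbers, and the treatment of the baseline are illustrative assumptions, not Finlay's or Schroeder's formal apparatus:

```python
# A minimal sketch of a purely probabilistic account of promotion:
# an action promotes a desire just in case it raises the probability
# that the desire is realized above some baseline probability.
# (Illustrative only; how the baseline should be fixed is precisely
# what much of the ensuing debate is about.)

def promotes(prob_given_action: float, baseline: float) -> bool:
    """True iff acting raises the desire's chance of realization above the baseline."""
    return prob_given_action > baseline

# Suppose that, absent the action, my desire has a 0.2 chance of being realized.
print(promotes(0.5, 0.2))  # raising the chance to 0.5 promotes it: True
print(promotes(0.1, 0.2))  # lowering it to 0.1 does not: False
```

On this rendering, Behrends and DiPaolo's worry targets the comparison to the baseline itself, not the arithmetic.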
This post is about justification: the justification of perceptual demonstrative beliefs by uptake from perception, and of many of the beliefs we express using proper names by uptake from testimony. What I’m about to suggest is perhaps a little surprising. …
I regularly come across two objections to tactical voting, i.e. voting for Lesser Evil rather than Good in hopes of defeating the Greater Evil candidate. One objection is just the standard worry that individual votes lack instrumental value, debunked here. …
It wasn't just the positivists who thought there was a tight connection between meaning and truth in the case of a priori propositions:
However, it seems to me that nevertheless one ingredient of this wrong theory of mathematical truth [i.e. …
As I go through chapters of my Introduction to Formal Logic, heavily rewriting them for the second edition (only scattered paragraphs are surviving unaltered from the first edition, and I’m adding wholly new chapters too), I’m occasionally pulling other introductory books from my shelves, to check their coverage, to see how they handle particular points and whether they have nice expository ideas I could steal, er, gratefully emulate. …
Johann Christoph Friedrich Schiller (1759–1805) is best known
for his immense influence on German literature. In his relatively
short life, he authored an extraordinary series of dramas, including
The Robbers, Maria Stuart, and the trilogy
Wallenstein. He was also a prodigious poet, composing perhaps
most famously the “Ode to Joy” featured in the culmination
of Beethoven’s Ninth Symphony and enshrined, some two centuries
later, in the European
Hymn.[1]
In part through his celebrated friendship with Goethe, he edited
epoch-defining literary journals and exerted lasting influence on
German stage production.
Analogical reasoning addresses the question of how evidence from various phenomena can be amalgamated and made relevant for theory development and prediction. In the first part of my contribution, I review some influential accounts of analogical reasoning, both historical and contemporary, focusing in particular on Keynes, Carnap, Hesse, and, more recently, Bartha. In the second part, I sketch a general framework. To this purpose, I introduce a distinction between a predictive and a conceptual type of analogical reasoning. I then take up a common intuition according to which (predictive) analogical inferences hold if the differences between source and target concern only irrelevant circumstances. I attempt to make this idea more precise by addressing possible objections and, in particular, by specifying a notion of causal irrelevance based on difference-making in homogeneous contexts.
It is widely believed that the semantic contents of some linguistic and mental representations are determined by factors independent of a person’s bodily makeup. Arguments derived from Hilary Putnam’s seminal Twin Earth thought experiment have been especially influential in establishing that belief. I claim that there is a neglected version of the mind-body relation which undermines those arguments and also excludes the possibility of zombies. It has been neglected because it is counterintuitive, but I show that it can nonetheless be intelligibly worked out in detail and all obvious objections met. This suggests that we may be faced with a choice between embracing a counterintuitive interpretation of the mind-body relation and accepting that a currently very promising theory in cognitive science, Prediction Error Minimization, faces a fundamental problem. Furthermore, blocking that threat entails that any physicalist/materialist theory of mind is freed from the spectre of zombie worlds. The proposal also makes the ideas of personal teleportation and mind uploading more plausible.
This paper provides a critical guide to the literature on the question: when does a quantum experiment have a result? This question was posed and answered by Rovelli (Rovelli), and his proposal was critiqued by Oppenheim, Reznik and Unruh (Oppenheim et al.), who also suggest another approach that (as they point out) leads to the quantum Zeno effect. What these two approaches have in common is the idea that a question about the time at which an event occurs can be answered through the instantaneous measurement of a projector (in Rovelli’s case, a single measurement; in that of Oppenheim et al., a repeated measurement). However, the interpretation of a projection as an instantaneous operation that can be performed on a system at a time of the experimenter’s choosing is problematic, particularly when it is the time of the outcome of the experiment that is at issue.
A striking characteristic of the highly successful techniques in molecular biology is that they are derived from natural systems. RNA interference (RNAi), for example, utilises a mechanism that evolved in eukaryotes to destroy foreign nucleic acid. Other examples include restriction enzymes, the polymerase chain reaction, green fluorescent protein and CRISPR-Cas. I propose that biologists exploit natural molecular mechanisms for two features: the activity of their effectors (proteins or nucleic acids) and their biological specificity (the capacity of those effectors to cause precise reactions). I also show that the developmental trajectory of novel techniques in molecular biology, such as RNAi, comprises four characteristic phases. The first phase is the discovery of a biological phenomenon, typically through curiosity-driven research. The second is the identification of the mechanism’s trigger(s), its effector and its biological specificity. The third is the application of the technique. The final phase is the maturation and refinement of the technique. The development of new molecular biology techniques from nature is crucial for biological research: these techniques transform scientific knowledge and generate new knowledge.
It seems to be a received view about the relationship of traditional Aristotelian logic to modern quantificational logic that the inferences codified in the old-fashioned syllogisms - All men are mortal, Socrates is a man, etc. …
Today’s Virtual Colloquium is “God’s Standing to Forgive” by Brandon Warmke. Dr. Warmke received his PhD in philosophy from the University of Arizona in 2014 and is currently Assistant Professor of Philosophy at Bowling Green State University in Ohio. …
The extended mind thesis claims that at least some cognitive processes extend beyond the organism’s brain in that they are constituted by the organism’s actions on its surrounding environment. A more radical move would be to claim that social actions performed by the organism could at least constitute some of its mental processes. This can be called the socially extended mind thesis. Based on the notion of affordance as developed in the ecological psychology tradition, I defend the position that perception extends to the environment. Then I will expand the notion of affordance to encompass social affordances. Thus, perception can in some situations also be socially extended.
The popular impression of Bohmian mechanics is that it is standard quantum mechanics with the addition of some extra gadgets (exact particle positions and a guiding equation for particle trajectories), the advantages being that the gadgets pave the way for a resolution of the measurement problem that eschews state vector reduction while restoring the determinism lost in standard quantum mechanics. In fact, Bohmian mechanics departs in significant ways from standard quantum mechanics. By itself this is not a basis for criticism; indeed, it makes Bohmian mechanics all the more interesting. But Bohmian mechanics is not, as the popular impression would have it, empirically equivalent to standard quantum mechanics in terms of probabilistic predictions for the outcomes of measurements of quantum observables. Indeed, in physically important applications to systems for which standard quantum mechanics delivers empirically well-confirmed probabilistic predictions, the sophisticated form of Bohmian mechanics designed to prove the global existence of Bohmian particle trajectories fails to deliver unequivocal predictions, even of a probabilistic variety, for the future behavior of said systems. Possible responses to this lacuna are discussed.
The idea that quantum probabilities are best construed as the personal/subjective degrees of belief of Bayesian agents is an old one. In recent years the idea has been vigorously pursued by a group of physicists who fly the banner of quantum Bayesianism (QBism). The present paper aims to identify the prospects and problems of implementing QBism, and it critically assesses the claim that QBism provides a resolution (or dissolution) of some of the long-standing foundational issues in quantum mechanics, including the measurement problem and puzzles of non-locality.