Yesterday Ryan Mandelbaum, at Gizmodo, posted a decidedly tongue-in-cheek piece about whether or not the universe is a computer simulation. (The piece was filed under the category “LOL.”)
The immediate impetus for Mandelbaum’s piece was a blog post by Sabine Hossenfelder, a physicist who will likely be familiar to regulars here in the nerdosphere. …
Consider the crucible-of-character theodicy: the claim that God permits us to meet with great evils in order to form a character with virtues, like courage and sacrificial love, whose significant exercise requires significant evils. …
Kant famously claims that being is “obviously not a real predicate” (KrV, A 598/B 626), i.e. a determination or a property of a thing. Since Frege similarly holds that existence is not a first-level predicate of objects but a second-level predicate of concepts, it is not surprising that the two philosophers have been compared on this point. Indeed, Jonathan Bennett speaks of the “Kant-Frege view”, according to which Frege first gave solid logical foundations to Kant’s claim (Bennett 1974, 62–5, 231). To my mind, although there is some truth to the Kant-Frege view, there is a fundamental disparity between Kant’s and Frege’s conceptions of existence that far outweighs their similarities.
This is a phenomenological description of what is happening when we experience the death of another that interprets surviving or living on after such death by employing the term event. This term of art from phenomenology and hermeneutics is used to describe a disruptive and transformative experience of singularity. I maintain that the death of the other is an experience of an event because such death is unpredictable or without a horizon of expectation, excessive or without any principle of sufficient reason, and transformative or a death of the world itself.
I am teaching Introduction to Neuroscience this spring semester and am using An Introduction to Brain and Behavior, 5th edition, by Kolb et al. as the textbook (this is the book the biology program decided to adopt). …
Does time pass? A-theorists say it does; B-theorists disagree. However, both sides of the debate generally agree that it at least appears to us as though time passes, with B-theorists standardly taking the passage of time to be some kind of cognitive illusion. This paper rejects the idea that temporal passage forms part of our conscious representation of the world. I consider a range of explanatory strategies for the aspects of our temporal experience generally taken to be passage-like—which I term ‘temporal qualia’—and defend a reductionist account, according to which our temporal qualia are nothing more than our generally veridical experience of change, motion, succession, and other such features of the world well studied by empirical psychology. As such, I argue that our experience of time neither is illusory nor corresponds to temporal passage, and show that reductionism about temporal qualia is both continuous with and well supported by empirical work on time perception.
Non-relativistic quantum mechanics is grounded in ‘classical’ (Newtonian) space and time (NST). The mathematical description of these concepts entails that any two spatially separated objects are necessarily different, which implies that they are discernible (in classical logic, identity is defined by means of indiscernibility) — we say that the space is T2, or "Hausdorff". But quantum systems, in the most interesting cases, sometimes need to be taken as indiscernible, so that there is no way to tell which system is which, and this holds even in the case of fermions. In the NST setting, however, it seems that we can always give them an identity, which seems contrary to the physical situation. In this paper we discuss this topic through a case study (that of two infinite potential wells) and conclude that, once the quantum case is taken into account, that is, once physics enters the discussion, even NST cannot be used to say that the systems do have identity. Keywords: identity of quantum particles, spatial identity, space and time in quantum mechanics.
In 1919, Lukács posed the question, “What is orthodox Marxism?” Even for Lukács, there was an undertone of irony: if by orthodoxy we mean devoutness, then “the most appropriate answer [is] a pitying smile.” But Lukács also points out that the question can be understood and asked in such a way that it invites or even requires a different kind of answer. If we understand it as a question about quintessence, Lukács’ answer is as follows: The quintessence of Marxism does not reside in the results of Marx’s research or a “‘belief’ in one or another proposition,” nor in the “exegesis of a ‘holy book.’” Rather, “orthodoxy in matters of Marxism refers exclusively to method.” In this essay I want to reapply Lukács’ question to Critical Theory: What is orthodox Critical Theory? And I’d like to advocate an approach that could be called orthodox in three respects.
Friedrich Nietzsche (1844–1900) was a German philosopher and
cultural critic who published intensively in the 1870s and 1880s. He
is famous for uncompromising criticisms of traditional European
morality and religion, as well as of conventional philosophical ideas
and social and political pieties associated with modernity. Many of
these criticisms rely on psychological diagnoses that expose false
consciousness infecting people’s received ideas; for that
reason, he is often associated with a group of late modern thinkers
(including Marx and Freud) who advanced a “hermeneutics of
suspicion” against traditional values (see Foucault 1990,
Ricoeur 1970, Leiter 2004).
Friedrich Nietzsche (1844–1900) was a German philosopher of the late
19th century who challenged the foundations of Christianity and
traditional morality. He was interested in the enhancement of
individual and cultural health, and believed in life, creativity,
power, and down-to-earth realities, rather than those
situated in a world beyond. Central to his philosophy is the idea of
“life-affirmation,” which involves an honest questioning of
all doctrines that drain life’s expansive energies, however socially
prevalent those views might be. Often referred to as one of the first
existentialist philosophers along with Søren Kierkegaard
(1813–1855), Nietzsche’s revitalizing philosophy has inspired leading
figures in all walks of cultural life, including dancers, poets,
novelists, painters, psychologists, philosophers, sociologists and
social revolutionaries.
Today’s Virtual Colloquium is “Global and Local Atheisms” by Jeanine Diller. Dr. Diller received her PhD from the University of Michigan and is currently an assistant professor in the Department of Philosophy and Program on Religious Studies of the University of Toledo in Ohio. …
Neither Karl Popper, nor Frank Knight, nor Max Weber is cited or mentioned in Friedman’s famous 1953 essay “The Methodology of Positive Economics” (F53). However, they play a crucial role in F53. Making their contribution explicit suggests that F53 has been seriously misread in the past. I will first show that there are several irritating statements in F53 that are, taken together, not compatible with any of the usual readings of F53. Second, I show that an alternative reading of F53 can be achieved if one takes seriously Friedman’s reference to ideal types; “ideal type” is a technical term introduced by Max Weber. Friedman was familiar with Max Weber’s work through Frank Knight, who was his teacher in Chicago. Given that in F53’s view ideal types are fundamental building blocks of economic theory, it becomes clear why both instrumentalist and realist readings of F53 are inadequate. Third, the reading of F53 in terms of ideal types gives the role of elements from Popper’s falsificationist methodology in F53 a somewhat different twist. Finally, I show that the irritating passages of F53 make good sense under the new reading, including the infamous “the more significant the theory, the more unrealistic the assumptions”.
In the two months since I last blogged, the US has continued its descent into madness. Yet even while so many certainties have proven ephemeral as the morning dew—the US’s autonomy from Russia, the sanity of our nuclear chain of command, the outcome of our Civil War, the constraints on rulers that supposedly set us apart from the world’s dictator-run hellholes—I’ve learned that certain facts of life remain constant. …
Can we maintain that purple seems composed of red and blue without giving up the impenetrability of the red and blue parts that compose it? Brentano thinks we can. Purple, according to him, is a chessboard of red and blue tiles which, although individually too small to be perceived, are together indistinctly perceived within the purple. After a presentation of Brentano’s solution, we raise two objections to it. First, Brentano’s solution commits him to unperceivable intentional objects (the chessboard’s tiles). Second, his chessboard account fails in the end to explain the phenomenal spatial continuity of compound colours. We then sketch an alternative account, which, while holding fast to the phenomenal compoundedness of the purple and to the impenetrability of component colours, avoids introducing inaccessible intentional objects and compromising on the continuity of the purple. According to our proposal, instead of being indistinctly perceived spatial parts of the purple, red and blue are distinctly perceived non-spatial parts of it.
Moses ben Maimon [known to English-speaking audiences as Maimonides
and to Hebrew-speaking ones as Rambam] (1138–1204) is the greatest Jewish
philosopher of the medieval period and is still widely read today. The
Mishneh Torah, his 14-volume compendium of Jewish law,
established him as the leading rabbinic authority of his time and quite
possibly of all time. His philosophic masterpiece, the Guide of the
Perplexed, is a sustained treatment of Jewish thought and practice
that seeks to resolve the conflict between religious and secular
knowledge. Although heavily influenced by the Neo-Platonized
Aristotelianism that had taken root in Islamic circles, it departs from
prevailing modes of Aristotelian thought by emphasizing the limits of
human knowledge and the questionable foundations of significant parts
of astronomy and metaphysics.
In this work we present a dynamical approach to quantum logics. By changing the standard formalism of quantum mechanics to allow non-Hermitian operators as generators of time evolution, we address the question of how logics can evolve in time. In this way, we describe formally how a non-Boolean algebra may become a Boolean one under certain conditions. We present some simple models which illustrate this transition and develop a new quantum logical formalism based on complex spectral resolutions, a notion that we introduce in order to cope with the temporal aspect of the logical structure of quantum theory.
Although during the last decades the philosophy of chemistry has greatly extended its thematic scope, the problem of the relationship between chemistry and physics still attracts a great interest in the area. In particular, the main difficulties appear in the attempt to link the chemical description of atoms and molecules and the description supplied by quantum mechanics.
Juan Luis Vives (1493–1540) was a Spanish humanist and
educational theorist who strongly opposed scholasticism and made his
mark as one of the most influential advocates of humanistic learning
in the early sixteenth century. His works are not limited to education
but deal with a wide range of subjects including philosophy,
psychology, politics, social reform and religion. Vives was not a
systematic writer, which makes it difficult to classify him as a
philosopher. His thought is eclectic and pragmatic, as well as
historical, in its orientation. He took what he considered most valid
from a variety of sources and combined these elements into a
coherent whole.
The paper discusses major implications of high energy physics for the scientific realism debate. The first part analyses the ways in which aspects of the empirically well-confirmed standard model of particle physics are relevant for a reassessment of entity realism, ontological realism and structural realism. The second part looks at the implications of more far-reaching concepts like string theory. While those theories have not found empirical confirmation, if they turned out to be viable, their implications for the realism debate would be more substantial than those of the standard model.
Killer robots. You have probably heard about them. You may also have heard that there is a campaign to stop them. One of the main arguments that proponents of the campaign make is that they will create responsibility gaps in military operations. …
Omniscience is the property of having complete or maximal knowledge. Along with omnipotence and perfect goodness, it is usually taken to be
one of the central divine attributes. One source of the attribution
of omniscience to God derives from the numerous biblical passages that
ascribe vast knowledge to him. St. Thomas Aquinas (Summa
Theologiae I, q. 14), in his discussion of the knowledge of God,
cites such texts as Job 12:13: “With God are wisdom and
strength; he has counsel and understanding” and Rom. 11:33:
“O the depths of the riches and wisdom and knowledge of
God!” Another source is provided by the requirements of
formulating one or another theological doctrine.
There was a period in the 1970s when the admissions data for the UC–Berkeley graduate school (hereafter, BGS) exhibited some (prima facie) peculiar statistical correlations. Specifically, a strong negative correlation was observed between being female and being accepted into BGS. This negative correlation (in the overall population of BGS applicants) was (initially) a cause for some concern regarding the possibility of gender bias in the admissions process at BGS. However, closer scrutiny of the BGS admissions data from this period revealed that no individual department’s admissions data exhibited a negative correlation between being female and being admitted. In fact, every department reported a positive correlation between being female and being accepted. In other words, a correlation that appears at the level of the general population of BGS applicants is reversed in every single department of BGS. This sort of correlation reversal is known as Simpson’s Paradox. Because admissions decisions at BGS are made (autonomously) by each individual department, the absence of negative departmental correlations seems to rule out the gender bias hypothesis as the best (causal) explanation of the observed correlations in the data. As it happens, there was a strong positive correlation between being female and applying to a department with a (relatively) high rejection rate.
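A toy calculation makes the reversal concrete. The admission counts below are hypothetical (not the actual Berkeley figures), chosen so that women are admitted at a higher rate in each department yet at a lower rate overall:

```python
# Hypothetical admissions counts: (admitted, applied) per gender.
# Dept "A" admits generously; dept "B" rejects most applicants.
depts = {
    "A": {"female": (18, 20), "male": (80, 100)},
    "B": {"female": (20, 100), "male": (2, 20)},
}

def rate(admitted, applied):
    return admitted / applied

# Within each department, women have the higher admission rate.
for name, d in depts.items():
    f, m = rate(*d["female"]), rate(*d["male"])
    print(f"Dept {name}: female {f:.0%} vs male {m:.0%}")
    assert f > m

# Aggregated over departments, the correlation reverses, because most
# women applied to the high-rejection department "B".
totals = {"female": [0, 0], "male": [0, 0]}
for d in depts.values():
    for g in totals:
        totals[g][0] += d[g][0]
        totals[g][1] += d[g][1]
f_all, m_all = rate(*totals["female"]), rate(*totals["male"])
print(f"Overall: female {f_all:.0%} vs male {m_all:.0%}")
assert f_all < m_all
```

The per-department comparisons and the aggregate comparison both hold simultaneously, which is exactly the structure of Simpson’s Paradox described above.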
In this article, I argue that what is commonly lamented as the decline of qualitative research might be because of our own inability to reveal something true about being-in-the-world. Four problems with qualitative work are identified: making what is obvious inescapable, confusion around what constitutes qualitative research and phenomenology, uninformed and disrespectful mixing of methods, and devolution into “little t” truth. I finish by calling for bold, evocative interpretation, and by posing the question: What is the nature of the revolution that hermeneutics can foment?
According to the antirealist argument known as the pessimistic induction, the history of science is a graveyard of dead scientific theories and abandoned theoretical posits. Support for this pessimistic picture of the history of science usually comes from a few case histories, such as the demise of the phlogiston theory and the abandonment of caloric as the substance of heat. In this paper, I wish to take a new approach to examining the “history of science as a graveyard of theories” picture. Using JSTOR Data for Research and Springer Exemplar, I present new lines of evidence that are at odds with this pessimistic picture of the history of science. When rigorously tested against the historical record of science, I submit, the pessimistic picture of the history of science as a graveyard of dead theories and abandoned posits may turn out to be no more than a philosophers’ myth.
I argue that our judgements regarding the locally causal models which are compatible with a given quantum no-go theorem implicitly depend, in part, on the context of inquiry. It follows from this that certain no-go theorems, which are particularly striking in the traditional foundational context, have no force when the context switches to a discussion of the physical systems we are capable of building with the aim of classically reproducing quantum statistics. I close with a general discussion of the possible implications of this for our understanding of the limits of classical description, and for our understanding of the fundamental aim of physical investigation.
Naturalistic philosophers rely on literature search and review in a number of ways and for different purposes. Yet this article shows how processes of literature search and review are likely to be affected by widespread and systematic biases. A solution to this problem is offered here. Whilst the tradition of systematic reviews of literature from scientific disciplines has been neglected in philosophy, systematic reviews are important tools that minimize bias in literature search and review and allow for greater reproducibility and transparency. If naturalistic philosophers wish to reduce bias in their research, they should then supplement their traditional tools for literature search and review by including systematic methodologies.
Alcmaeon of Croton was an early Greek medical writer and
philosopher-scientist. His exact date, his relationship to other early
Greek philosopher-scientists, and whether he was primarily a medical
writer/physician or a typical Presocratic cosmologist, are all matters
of controversy. He is likely to have written his book sometime between
500 and 450 BCE. The surviving fragments and testimonia focus
primarily on issues of physiology, psychology, and epistemology and
reveal Alcmaeon to be a thinker of considerable originality. He was
the first to identify the brain as the seat of understanding and to
distinguish understanding from perception.
The first paper in Idealism and Christian Theology is James Spiegel’s “The Theological Orthodoxy of Berkeley’s Immaterialism.” This piece was originally published in Faith and Philosophy in 1996, though I must confess that I had not read it before today. …
Hume's account of human thought and cognition is central to his philosophical project. But despite this, and the general acknowledgement of Hume's importance as a philosopher, his account of cognition has often been viewed as limited and simplistic. While the worries about this account are
Traditional monotheism has long faced logical puzzles (omniscience, omnipotence, and more) [10, 11, 13, 14]. We present a simple but plausible ‘gappy’ framework for addressing these puzzles. By way of illustration we focus on God’s alleged stone problem. What we say about the stone problem generalizes to other familiar ‘paradoxes of omni-properties’, though we leave the generalization implicit. We assume familiarity with the proposed (subclassical) logic, but an appendix is offered as a brief review.