F. A. Hayek and the Epistemology of Politics is primarily intended as a contribution to the philosophy and methodology of the Austrian School of economics (pp. 1-2). However, as the symposium participants are all quick to note, several of the book’s central arguments, especially those advanced in the first chapter, are of potential significance far beyond Austrian economics. The arguments of the first chapter present an important methodological challenge to multiple fields of political inquiry, to traditional political philosophy and theory, and to modern political science, as well as a significant practical problem for anyone concerned with the effectiveness of political action. Professional political thinkers and laypersons alike conceive of the basic political problem as concerning the motivations, reasons, incentives, etc., of policymakers. On this way of thinking, the fundamental problem to be solved, analytically, by the disciplines of political inquiry, and, practically, in political life, is how to ensure that policymakers are adequately motivated to pursue policy goals either that are in constituents’ interests or that constituents want pursued. I do not deny the significance of this problem or the value of the proposed solutions, whether analytical or practical-constitutional, that have been offered in the long course of the history of politics and political thought. The book does not suggest that we should scrap thousands of years of political inquiry and start all over again.
Scott Scheall has done an admirable job of making the occasionally dry and complicated issues of Hayekian political theory readable and even amusing. And he shows that he is an attentive student of Friedrich Hayek, particularly in the emphasis he places on epistemic humility, which is certainly Hayek’s own principal teaching. But the result of Scheall’s skillful presentation is to lay bare just how flimsy that teaching really is as a guide to political wisdom, shorn of a normative framework.
This paper examines Heidegger’s position on a foundational distinction for Kantian and post-Kantian philosophy: that between acting ‘in the light of’ a norm and acting ‘merely in accordance with it’. In section 1, I introduce the distinction and highlight several relevant similarities between Kant and Heidegger on ontology and the first-person perspective. In section 2, I press the Kantian position further, focusing on the role of inferential commitments in perception: this provides a foil against which Heidegger’s account can be assessed. In section 3, I contrast this Kantian approach with Crowell’s highly sophisticated reading of Heidegger on care: I argue that, subject to certain conditions on how we view explanation, the two approaches are compatible and indeed mutually supporting. I close in section 4 by addressing an importantly distinct dimension of normativity, that marked by critique, broadly construed. I argue that we ultimately need to locate Heidegger in a context that runs from Kant’s ‘What is Enlightenment?’ through Nietzsche’s Genealogy.
In this paper, I motivate the addition of an actuality operator to relevant logics. Straightforward ways of doing this are in tension with standard motivations for relevant logics, but I show how to add the operator in a way that permits one to maintain the intuitions behind relevant logics. I close by exploring some of the philosophical consequences of the addition.
Consider these two very plausible theses:
(1) Whether I feel pain at t does not depend on any future facts.
(2) There is a length of time δt such that you cannot feel a pain lasting no more than δt, but you can feel a pain lasting 4δt. …
One approach to knowledge, termed the relevant alternatives theory, stipulates that a belief amounts to knowledge if one can eliminate all relevant alternatives to the belief in the epistemic situation. This paper uses causal graphical models to formalize the relevant alternatives approach to knowledge. On this theory, an epistemic situation is encoded through the causal relationships between propositions, which determine which alternatives are relevant and irrelevant. This formalization entails that statistical evidence is not sufficient for knowledge, provides a simple way to incorporate epistemic contextualism, and can rule out many Gettier cases from knowledge. The interpretation in terms of causal models offers more precise predictions for the relevant alternatives theory, strengthening the case for it as a theory of knowledge.
This paper uses the example of the Covid-19 pandemic to analyse the danger associated with insufficient pluralism in evidence-based public health policy. Drawing on certain elements in Paul Feyerabend’s political philosophy of science, it discusses reasons for implementing more pluralism as well as challenges to be tackled on the way forward.
I am indebted to my own teachers, Prof. Peter Koepke and Prof. Stefan Geschke, who taught me everything in these notes. Prof. Geschke’s scriptum for Einführung in die Logik und Modelltheorie (Bonn, Summer 2010) provided an invaluable basis for the compilation of these notes.
The vocabulary of human languages has been argued to support efficient communication by optimizing the trade-off between complexity and informativeness (Kemp & Regier 2012). The argument has been based on cross-linguistic analyses of vocabulary in semantic domains of content words such as kinship, color, and number terms. The present work extends this analysis to a category of function words: indefinite pronouns (e.g. someone, anyone, no-one, cf. Haspelmath 2001). We build on previous work to establish the meaning space and featural make-up for indefinite pronouns, and show that indefinite pronoun systems across languages optimize the complexity/informativeness trade-off. This demonstrates that pressures for efficient communication shape both content and function word categories, thus tying in with the conclusions of recent work on quantifiers by Steinert-Threlkeld (2019). Furthermore, we argue that the trade-off may explain some of the universal properties of indefinite pronouns, thus reducing the explanatory load for linguistic theories.
According to the Psychological-Continuity Account of What Matters, you are justified in having special concern for the well-being of a person at a future time if and only if that person will be psychologically continuous with you as you are now. On some versions of the account, the psychological continuity is required to be temporally ordered, whereas, on other versions, it is allowed to be temporally unordered. In this paper, I argue that the account is implausible if the psychological continuity is allowed to be temporally unordered. I also argue that, if the psychological continuity is required to be temporally ordered, it cannot plausibly be purely psychological (in the sense that the psychological continuity is not required to be caused through spatio-temporal continuity of a brain). The upshot is that no plausible version of the Psychological-Continuity Account of What Matters is purely psychological. So, psychological continuity is not what matters in survival.
How did I respond to those 7 burning questions at last week’s (“P-Value”) Statistics Debate? Here’s a fairly close transcript of my (a) general answer, and (b) final remark, for each question–without the in-between responses to Jim and David. …
We define mereologically invariant composition as the relation between a whole object and its parts when the object retains the same parts during a time interval. We argue that mereologically invariant composition is identity between a whole and its parts taken collectively. Our reason is that parts and wholes are equivalent measurements of a portion of reality at different scales in the precise sense employed by measurement theory. The purpose of these scales is the numerical representation of primitive relations between quantities of being. To show this, we prove representation and uniqueness theorems for composition. Thus, mereologically invariant composition is trans-scalar identity.
The philosophical impact of early German romanticism in general and Georg Philipp Friedrich von Hardenberg (Novalis) in particular has typically been traced back to a series of fragments and reflections on poetry, art, and beauty. Moreover, his name has been associated with an aestheticization of philosophy, an illegitimate valorizing of the medieval, and a politically reactionary program. This view of von Hardenberg, however, is to a large extent rooted in the image created posthumously by his increasingly conservative friends within the romantic circle. Furthermore, von Hardenberg’s philosophical reputation has been shaped by his critics, the most prominent of whom was Georg Wilhelm Friedrich Hegel.
Two of the most systematic and well-developed theories of social norms analyse such norms in terms of patterns of individual attitudes. On Bicchieri’s view (2006, 2017), social norms most centrally involve a pattern of preferences among the members of a relevant population, conditional on their normative and empirical expectations of other members. According to Brennan et al. (2013; hereafter I will refer to this as the ‘BEGS account’), social norms most centrally involve patterns of normative attitudes among the members of a given group, grounded in a social practice of that group. This paper argues that the existence of attitudinal social norms speaks in favour of Bicchieri’s preference-based view, and against the BEGS account’s normative attitude-based view. I will first present some reasons to think that there are attitudinal social norms – social norms that demand not just behaviour, but also a variety of attitudes. I will then argue that, with a very minor modification, Bicchieri’s account can properly capture such attitudinal social norms and that the BEGS account cannot.
According to Aristotle, the medical art aims at health, which is a virtue of the body, and does so in an unlimited way. Consequently, medicine does not determine the extent to which health should be pursued, and “mental health” falls under medicine only via pros hen predication. Because medicine is inherently oriented to its end, it produces health in accordance with its nature and disease contrary to its nature—even when disease is good for the patient. Aristotle’s politician understands that this inherent orientation can be systematically distorted, and so would see the need for something like the Hippocratic Oath.
Utterances of simple sentences containing taste predicates (e.g. delicious, fun, frightening) typically imply that the speaker has had a particular sort of firsthand experience with the object of predication. For example, an utterance of The carrot cake is delicious would typically imply that the speaker had actually tasted the cake in question, and is not, for example, merely basing her judgment on the testimony of others. According to one approach, this acquaintance inference is essentially an implicature, one generated by the Maxim of Quality together with a certain principle concerning the epistemology of taste (Ninan 2014). We first discuss some problems for this approach, problems that arise in connection with disjunction and generalized quantifiers. Then, after stating a conjecture concerning which operators ‘obviate’ the acquaintance inference and which do not, we build on Anand & Korotkova 2018 and Willer & Kennedy Forthcoming by developing a theory that treats the acquaintance requirement as a presupposition, albeit one that can be obviated by certain operators.
The TL;DR summary of what follows is that we should quantify the conventionality of a regularity (David-Lewis-style) as follows:
A regularity R in the behaviour of population P in a recurring situation S is a convention of depth x, breadth y and degree z when there is a recurring situation T that refines S, and in each instance of T there is a subpopulation K of P, such that it’s true and common knowledge among K in that instance that:
(A) BEHAVIOUR CONDITION: everyone in K conforms to R;
(B) EXPECTATION CONDITION: everyone in K expects everyone else in K to conform to R;
(C) SPECIAL PREFERENCE CONDITION: everyone in K prefers that they conform to R conditionally on everyone else in K conforming to R;
where x (depth) is the fraction of S-situations which are T, y (breadth) is the fraction of all Ps involved who are Ks in this instance, and z is the degree to which (A)-(C) obtaining resembles a coordination equilibrium that solves a coordination problem among the Ks. …
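As a toy numerical sketch (not from the post; all counts, names, and numbers below are stipulated purely for illustration), the depth and breadth fractions in the definition above can be computed directly from counts of situations and population members:

```python
# Toy illustration of the depth (x) and breadth (y) fractions in the
# Lewis-style definition above. Every number here is stipulated.

def convention_depth(num_s_instances: int, num_t_instances: int) -> float:
    """x: the fraction of S-situations that are instances of the refinement T."""
    return num_t_instances / num_s_instances

def convention_breadth(pop_size: int, subpop_size: int) -> float:
    """y: the fraction of the population P who belong to the subpopulation K."""
    return subpop_size / pop_size

# Suppose 80 of 100 recurring S-situations are T-situations, and in a given
# instance 45 of the 60 members of P belong to K.
x = convention_depth(100, 80)   # depth
y = convention_breadth(60, 45)  # breadth
print(x, y)  # 0.8 0.75
```

The degree z is not computed here, since it requires a measure of resemblance to a coordination equilibrium, which the definition leaves qualitative.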
Imagine you are an untenured Professor and the only woman and person of color amongst the faculty in a Philosophy department. You are frequently approached by students, typically women or members of other underrepresented groups, looking for mentorship and emotional support as they navigate their academic experience. While you believe this service work is valuable with a view to increasing the representation of minorities in philosophy, it is also emotionally draining and takes significant time away from your own research. You feel trapped. If you do this sort of mentorship work, you help diversify the field in a way that will be better for you and other members of underrepresented groups. Moreover, if you refuse to do this work, you indirectly help to maintain a status quo in which women and people of color like yourself remain dramatically underrepresented and under-served. But, by doing this service work, you compromise your own research, and reinforce a system where disproportionate burdens are placed on women and people of color, making them less likely to succeed in the profession.
Theories are indispensable to organize immunological data into coherent, explanatory, and predictive frameworks. Here we propose to combine different models to develop a unifying theory of immunity, which situates immunology in the wider context of physiology. We believe that the immune system will be increasingly understood as a central component of a network of partner physiological systems that connect to maintain homeostasis.
The main objective of immunology is to establish why and when an immune response occurs, that is, to determine a criterion of immunogenicity. According to the consensus view, the proper criterion of immunogenicity lies in the discrimination between self and nonself. Here we challenge this consensus by suggesting a simpler and more comprehensive criterion, the criterion of continuity. Moreover, we show that this criterion may be considered as an interpretation of the immune ‘‘self.’’ We conclude that immunologists can continue to speak of the self, provided that they admit that the self/nonself discrimination is not an adequate criterion of immunogenicity.
A wide variety of normative ethical views have been developed as the field of ethics has progressed. Proponents of each view disagree with one another about the deontic status of acts and about the exact right-making features of acts. Utilitarians, for instance, believe that an action is right because it maximizes utility, while Kantians believe an action is right because it accords with the categorical imperative. Virtue ethics is often understood as a set of normative ethical views that are purported rivals to versions of consequentialism, deontology, contractualism, and other normative ethical views. To be sure, not all accounts of virtue ethics are developed to fit this role. Some accounts of virtue ethics don’t consider the virtues to be directly tied to right action, but nevertheless assign the virtues a non-trivial role with respect to what makes an action right. Nevertheless, numerous contemporary accounts of virtue ethics are developed to rival existing substantive normative ethical views, irrespective of whether such accounts should be categorized in this manner. We will henceforth collectively refer to virtue ethics positions that purport to rival existing normative ethical views as VNET (for virtue-theoretic normative ethical theory).
I've been binge-watching Doctor Who, and two days ago I finished Susanna Clarke's new novel Piranesi. I love them both! Doctor Who is among my favorite TV series ever, and the images of Piranesi will probably linger with me for the rest of my life. …
Sharing, downloading, and reusing software is commonplace, some of which is carried out legally with open source software. When it is not legal, it is unclear how many infringements have taken place: does an infringement count for the artefact as a whole or for each source file of a computer program? To answer this question, it must first be established whether a computer program should be considered as an integral whole, a collection, or a mere set of distinct files, and why. We argue that a program is a functional whole, availing of, and combining, arguments from mereology, granularity, modularity, unity, and function to substantiate the claim. The argumentation and answer contribute to the ontology of software artefacts, may assist industry in litigation cases, and demonstrate that the notion of unifying relation is operationalisable.
Debate about the epistemic prowess of historical science has focused on local underdetermination problems generated by a lack of historical data; the prevalence of information loss over geological time, and the capacities of scientists to mitigate it. Drawing on Leonelli’s recent distinction between ‘phenomena-time’ and ‘data-time’, I argue that factors like data generation, curation and management significantly complicate and undermine this: underdetermination is a bad way of framing the challenges historical scientists face. In doing so, I identify circumstances of ‘epistemic scarcity’ where underdetermination problems are particularly salient, and discuss cases where ‘legacy data’—data generated using differing technologies and systems of practice—are drawn upon to overcome underdetermination. This suggests that one means of overcoming underdetermination is our knowledge of science’s past. Further, data-time makes agnostic positions about the epistemic fortunes of scientists working under epistemic scarcity more plausible. But agnosticism seems to leave philosophers without much normative grip. So, I sketch an alternative approach: focusing on the strategies scientists adopt to maximize their epistemic power in light of the resources available to them.
Which domains of biology do philosophers of biology primarily study? The fact that philosophy of biology has been dominated by an interest in evolutionary biology is widely acknowledged, but it has not been strictly demonstrated. Here I analyse the topics of all the papers published in Biology & Philosophy, just as the journal celebrates its thirtieth anniversary. I then compare the distribution of biological topics in Biology & Philosophy with that of the scientific journal Proceedings of the National Academy of Sciences of the USA, focusing on the recent period 2003-2015. This comparison reveals a significant mismatch between the distributions of these topics. I examine plausible explanations for that mismatch. Finally, I argue that many biological topics underrepresented in philosophy of biology raise important philosophical issues and should therefore play a more central role in future philosophy of biology.
May 2020 marked the 25th anniversary of the death of Miguel Sánchez-Mazas, founder of Theoria. An International Journal of Theory, History and Foundations of Science, and regarded as the person who brought mathematical logic to Spain. Here we present some of his biographical features and a summary of his contributions, from his early work in the 1950s - introducing contemporary advances in logic and philosophy of science in a philosophically backward milieu dominated by the scholasticism of that era in Spain - to the development of a project of Leibnizian lineage aimed at producing an arithmetic calculation that would elude some of the difficulties confronting Leibniz’s calculus. KEYWORDS: Miguel Sánchez-Mazas, Leibniz, numerical characteristic, calculation of norms, jurisprudence.
Noether’s first theorem does not establish a one-way explanatory arrow from symmetries to conservation laws, but such an arrow is widely assumed in discussions of the theorem in the physics and philosophy literature. It is argued here that there are pragmatic reasons for privileging symmetries, even if they do not strictly justify explanatory priority. To this end, some practical factors are adduced as to why Noether’s direct theorem seems to be more well-known and exploited than its converse, with special attention being given to the sometimes overlooked nature of Noether’s converse result and to its strengthened version due to Luis Martinez Alonso in 1979 and Peter Olver in 1986.
It is usual to identify initial conditions of classical dynamical systems with mathematical real numbers. However, almost all real numbers contain an infinite amount of information. I argue that a finite volume of space can’t contain more than a finite amount of information, hence that the mathematical real numbers are not physically relevant. Moreover, a better terminology for the so-called real numbers is “random numbers”, as their series of bits are truly random. I propose an alternative classical mechanics, which is empirically equivalent to classical mechanics, but uses only finite-information numbers. This alternative classical mechanics is non-deterministic, despite the use of deterministic equations, in a way similar to quantum theory. Interestingly, both alternative classical mechanics and quantum theories can be supplemented by additional variables in such a way that the supplemented theory is deterministic. Most physicists straightforwardly supplement classical theory with real numbers to which they attribute physical existence, while rejecting Bohmian mechanics as a supplemented quantum theory, arguing that Bohmian positions have no physical reality.
Studies on Platonic ‘Theoria motus abstracti’ are often focused on dynamics rather than kinematics, in particular on psychic self-motion. This state of affairs is, of course, far from being a bland academic accident: according to Plato, dynamics is the higher science while kinematics is lower on the ‘scientific’ spectrum. Furthermore, when scholars investigate Platonic abstract kinematics, they face a very limited set of texts. Among them, one of the most interesting undoubtedly remains a passage of Parmenides in which Plato challenges the puzzle of the ‘instant of change’, namely the famous text about the ‘sudden’ (τὸ ἐξαίφνης).
Biologists like to think of themselves as properly scientific behaviourists, explaining and predicting the ways that proteins, organelles, cells, plants, animals and whole biota behave under various conditions, thanks to the smaller parts of which they are composed. They identify causal mechanisms that reliably execute various functions such as copying DNA, attacking antigens, photosynthesising, discerning temperature gradients, capturing prey, finding their way back to their nests and so forth, but they don’t think that this acknowledgment of functions implicates them in any discredited teleology or imputation of reasons and purposes or understanding to the cells and other parts of the mechanisms they investigate.