1.
    What would I say is the most important takeaway from last week’s NISS “statistics debate” if you’re using (or contemplating using) Bayes factors (BFs), of the sort Jim Berger recommends, as replacements for P-values? …
    Found 11 minutes ago on D. G. Mayo's blog
  2.
    In this short survey article, I discuss Bell’s theorem and some strategies that attempt to avoid the conclusion of non-locality. I focus on two that intersect with the philosophy of probability: (1) quantum probabilities and (2) super-determinism. The issues they raise not only apply to a wide class of no-go theorems about quantum mechanics but are also of general philosophical interest.
    Found 1 hour, 43 minutes ago on PhilSci Archive
  3.
    The propensity nature of evolutionary fitness has long been appreciated and is nowadays amply discussed (Abrams, 2009, 2012; Ariew and Ernst, 2009; Ariew and Lewontin, 2004; Beatty and Finsen, 1989; Brandon, 1978; Drouet and Merlin, 2015; Mills and Beatty, 1979; Millstein, 2003, 2016; Pence and Ramsey, 2013; Sober, 1984, 2001, 2013, 2019; Walsh, 2010; Walsh, Ariew, and Matthen, 2016; etc.). The discussion has, however, on occasion followed long-standing conflations in the philosophy of probability between propensities, probabilities, and frequencies. In this article, I apply a more recent conception of propensities in modelling practice (the ‘complex nexus of chance’, CNC) to some key issues regarding whether and how fitness is explanatory, and how it ought to be represented mathematically. The ensuing complex nexus of fitness (CNF) emphasises the distinction between biological propensities and the probability distributions over offspring numbers that they give rise to, and how critical it is to distinguish the possession conditions of the underlying dispositional (physical and biological) properties from those of their probabilistic manifestations.
    Found 1 hour, 48 minutes ago on PhilSci Archive
  4.
    To a first approximation, epistemic utility theory is an application of standard decision theoretic tools to the study of epistemic rationality. The strategy consists in identifying a particular class of decision problems—epistemic decision problems—and using the recommendations that our decision theory makes for them in order to motivate principles of epistemic rationality. The resulting principles will of course be a function of, among other things, what we take epistemic decision problems to be and of what specific brand of decision theory we rely on. But regardless of the details, epistemic utility theory inherits from the decision theoretic framework a distinction between axiological notions—of epistemic value or epistemic utility—and deontological notions—like epistemic rationality or epistemic permissibility.
    Found 3 hours, 17 minutes ago on Alejandro Pérez Carballo's site
  5.
    There can be no question that Ernest Sosa is one of the most influential voices in contemporary epistemology. He has made pathbreaking contributions to a wide range of topics in the field and beyond, including on its most central issues such as the nature of knowledge, the structure of knowledge, the value of knowledge and the extent of knowledge. It is fair to say that his most widely discussed contributions, at least in recent times, are on virtue epistemology and safety conditions on knowledge. Whilst both topics are intimately related in Sosa’s own work, they have generated discussions that have lives of their own. Since the bulk of the contributions to this special issue also focus on these two topics, I will take a few moments to sketch a few key ideas in what follows.
    Found 1 day, 8 hours ago on Christoph Kelp's site
  6.
    This paper applies Edward Craig’s and Bernard Williams’ ‘genealogical’ method to the debate between relativism and its opponents in epistemology and in the philosophy of language. We explain how the central function of knowledge attributions -- to ‘flag good informants’ -- explains the intuitions behind five different positions (two forms of relativism, absolutism, contextualism, and invariantism). We also investigate the question whether genealogy is neutral in the controversy over relativism. We conclude that it is not: genealogy is most naturally taken to favour an anti-realism about epistemic norms. And anti-realism threatens absolutism.
    Found 1 day, 20 hours ago on Robin McKenna's site
  7.
    Many of us hold false beliefs about matters that are relevant to public policy such as climate change and the safety of vaccines. What can be done to rectify this situation? This question can be read in two ways. According to the descriptive reading, it concerns which methods will be effective in persuading people that their beliefs are false. According to the normative reading, it concerns which methods we are permitted to use in the service of persuading people. Some effective methods—a programme of brainwashing, say—would not be permissible. In this paper I compare “methods of rational persuasion” with what you might call “marketing methods” such as how one frames the problem of climate change. My aim is to show that “marketing methods” are preferable to “methods of rational persuasion”. My argument has two parts. First, I argue that the evidence suggests that “marketing methods” are more effective in persuading people to change their minds. Second, I argue that “marketing methods” are an acceptable response to the normative question.
    Found 1 day, 20 hours ago on Robin McKenna's site
  8.
    Keith DeRose’s new book The Appearance of Ignorance is a welcome companion volume to his 2009 book The Case for Contextualism. Where the latter focused on contextualism as a view in the philosophy of language, the former focuses on how contextualism contributes to our understanding of (and solution to) some perennial epistemological problems, with the skeptical problem being the main focus of six of the seven chapters. DeRose’s view is that a solution to the skeptical problem must do two things. First, it must explain how it is that we can know lots of things, such as that we have hands. Second, it must explain how it can seem that we don’t know these things. In slogan form, DeRose’s argument is that a contextualist semantics for knowledge attributions is needed to account for the “appearance of ignorance”—the appearance that we don’t know that skeptical hypotheses fail to obtain. In my critical discussion, I will argue inter alia that we don’t need a contextualist semantics to account for the appearance of ignorance, and in any case that the “strength” of the appearance of ignorance is unclear, as is the need for a philosophical diagnosis of it.
    Found 1 day, 20 hours ago on Robin McKenna's site
  9.
    Moral, social, political, and other “nonepistemic” values can lead to bias in science, from prioritizing certain topics over others to the rationalization of questionable research practices. Such values might seem particularly common or powerful in the social sciences, given their subject matter. However, I argue first that the well-documented phenomenon of motivated reasoning provides a useful framework for understanding when values guide scientific inquiry (in pernicious or productive ways). Second, this analysis reveals a parity thesis: values influence the social and natural sciences about equally, particularly because both are so prominently affected by desires for social credit and status, including recognition and career advancement. Ultimately, bias in natural and social science is both natural and social—that is, a part of human nature and considerably motivated by a concern for social status (and its maintenance). Whether the pervasive influence of values is inimical to the sciences is a separate question.
    Found 2 days, 5 hours ago on Josh May's site
  10.
    We show that under plausible levels of background risk, no theory of choice under risk—such as expected utility theory, prospect theory, or rank dependent utility—can simultaneously satisfy the following three economic postulates: (i) Decision makers are risk-averse over small gambles, (ii) they respect stochastic dominance, and (iii) they account for background risk.
    Found 2 days, 5 hours ago on Luciano Pomatto's site
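The impossibility result above concerns all theories of choice under risk at once, but a toy expected-utility example already shows why postulates (i) and (iii) interact. The sketch below is my illustration, not the paper's theorem: the CRRA utility form, the risk-aversion coefficient, and all wealth and gamble figures are hypothetical. An agent who accepts a small better-than-fair gamble in isolation can reject it once an independent zero-mean background risk is present (the phenomenon known as risk vulnerability).

```python
import itertools
import math

GAMMA = 3.0      # hypothetical CRRA risk-aversion coefficient
WEALTH = 1000.0  # hypothetical initial wealth

def u(x, gamma=GAMMA):
    # CRRA (constant relative risk aversion) utility
    return x ** (1 - gamma) / (1 - gamma)

def expected_utility(*lotteries):
    """Expected utility of wealth plus the sum of independent lotteries."""
    total = 0.0
    for combo in itertools.product(*lotteries):
        dx = sum(outcome for outcome, _ in combo)
        p = math.prod(prob for _, prob in combo)
        total += p * u(WEALTH + dx)
    return total

small_gamble = [(-100, 0.5), (+150, 0.5)]  # small, better-than-fair gamble
background = [(-300, 0.5), (+300, 0.5)]    # independent zero-mean background risk
nothing = [(0, 1.0)]

# Without background risk the agent accepts the small gamble ...
assert expected_utility(small_gamble) > expected_utility(nothing)
# ... but with background risk present either way, the same gamble is rejected.
assert expected_utility(small_gamble, background) < expected_utility(nothing, background)
```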
  11.
    How should governments decide between alternative taxation schemes, environmental protection regulations, infrastructure plans, climate change policies, healthcare systems, and other policies? One kind of consideration that should bear on such decisions is their effects on people’s well-being. The most rigorous methodology for evaluating such effects is the “social welfare function” (SWF) approach originating in the work of Abram Bergson and Paul Samuelson and further developed by Kenneth Arrow, Amartya Sen, and other economists.
    Found 2 days, 14 hours ago on PhilPapers
  12.
    F. A. Hayek and the Epistemology of Politics is primarily intended as a contribution to the philosophy and methodology of the Austrian School of economics (pp. 1-2). However, as the symposium participants are all quick to note, several of the book’s central arguments, especially those advanced in the first chapter, are of potential significance far beyond Austrian economics. The arguments of the first chapter present an important methodological challenge to multiple fields of political inquiry, to traditional political philosophy and theory, and to modern political science, as well as a significant practical problem for anyone concerned with the effectiveness of political action. Professional political thinkers and laypersons alike conceive the basic political problem to concern the motivations, reasons, incentives, etc., of policymakers. On this way of thinking, the fundamental problem to be solved, analytically, by the disciplines of political inquiry, and, practically, in political life, is how to ensure that policymakers are adequately motivated to pursue policy goals either that are in constituents’ interests or that constituents want pursued. I do not deny the significance of this problem or the value of the proposed solutions, whether analytical or practical-constitutional, that have been offered in the long course of the history of politics and political thought. The book does not suggest that we should scrap thousands of years of political inquiry and start all over again.
    Found 3 days ago on Reason Papers
  13.
    One approach to knowledge, termed the relevant alternatives theory, stipulates that a belief amounts to knowledge if one can eliminate all relevant alternatives to the belief in the epistemic situation. This paper uses causal graphical models to formalize the relevant alternatives approach to knowledge. On this theory, an epistemic situation is encoded through the causal relationships between propositions, which determine which alternatives are relevant and irrelevant. This formalization entails that statistical evidence is not sufficient for knowledge, provides a simple way to incorporate epistemic contextualism, and can rule out many Gettier cases from knowledge. The interpretation in terms of causal models offers more precise predictions for the relevant alternatives theory, strengthening the case for it as a theory of knowledge.
    Found 3 days, 6 hours ago on PhilPapers
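One way to get an intuitive feel for the formalization is a toy graph. The sketch below is my simplification with hypothetical node names, not the paper's formalism: it reads off the relevant alternatives to a hypothesis as the other direct causes that could have produced the same evidence, as in the familiar zebra/painted-mule case.

```python
# Toy causal structure of an epistemic situation: each proposition maps to
# its direct causes. (Node names and the rule below are illustrative only.)
parents = {
    "looks_striped": ["animal_is_zebra", "painted_mule"],
    "animal_is_zebra": ["zoo_stocking_policy"],
}

def relevant_alternatives(hypothesis, evidence):
    """Other direct causes of the evidence: states of the situation that
    could have produced the same appearance, hence must be ruled out
    before the belief counts as knowledge."""
    return [h for h in parents.get(evidence, []) if h != hypothesis]

print(relevant_alternatives("animal_is_zebra", "looks_striped"))
# the painted-mule hypothesis appears as the alternative to eliminate
```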
  14.
    This paper uses the example of the Covid-19 pandemic to analyse the danger associated with insufficient pluralism in evidence-based public health policy. Drawing on certain elements in Paul Feyerabend’s political philosophy of science, it discusses reasons for implementing more pluralism as well as challenges to be tackled on the way forward.
    Found 3 days, 10 hours ago on PhilSci Archive
  15.
    How did I respond to those 7 burning questions at last week’s (“P-Value”) Statistics Debate? Here’s a fairly close transcript of my (a) general answer, and (b) final remark, for each question, without the in-between responses to Jim and David. …
    Found 3 days, 18 hours ago on D. G. Mayo's blog
  16.
    Two of the most systematic and well-developed theories of social norms analyse such norms in terms of patterns of individual attitudes. On Bicchieri’s view (2006, 2017), social norms most centrally involve a pattern of preferences among the members of a relevant population, conditional on their normative and empirical expectations of other members. According to Brennan et al. (2013; hereafter I will refer to this as the ‘BEGS account’), social norms most centrally involve patterns of normative attitudes among the members of a given group, grounded in a social practice of that group. This paper argues that the existence of attitudinal social norms speaks in favour of Bicchieri’s preference-based view, and against the BEGS account’s normative attitude-based view. I will first present some reasons to think that there are attitudinal social norms – social norms that demand not just behaviour, but also a variety of attitudes. I will then argue that, with a very minor modification, Bicchieri’s account can properly capture such attitudinal social norms and that the BEGS account cannot.
    Found 4 days, 9 hours ago on Han van Wietmarschen's site
  17.
    The TL;DR summary of what follows is that we should quantify the conventionality of a regularity (David-Lewis-style) as follows: A regularity R in the behaviour of population P in a recurring situation S is a convention of depth x, breadth y and degree z when there is a recurring situation T that refines S, and in each instance of T there is a subpopulation K of P, such that it’s true and common knowledge among K in that instance that:
    (A) BEHAVIOUR CONDITION: everyone in K conforms to R
    (B) EXPECTATION CONDITION: everyone in K expects everyone else in K to conform to R
    (C) SPECIAL PREFERENCE CONDITION: everyone in K prefers that they conform to R conditionally on everyone else in K conforming to R.
    where x (depth) is the fraction of S-situations which are T, y (breadth) is the fraction of all Ps involved who are Ks in this instance, and z is the degree to which (A)-(C) obtaining resembles a coordination equilibrium that solves a coordination problem among the Ks. …
    Found 5 days, 6 hours ago on Robbie Williams's blog
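The definition is easier to scan set out as a display (same symbols as the post; the quantifier and set notation are mine):

```latex
% R is a convention in P, S of depth x, breadth y and degree z iff there is
% a refinement T of S and, in each instance of T, a subpopulation K of P
% such that it is true and common knowledge among K in that instance that:
\begin{align*}
&\text{(A) Behaviour:}\quad \forall i \in K,\ i \text{ conforms to } R\\
&\text{(B) Expectation:}\quad \forall i \in K,\ i \text{ expects each } j \in K \setminus \{i\} \text{ to conform to } R\\
&\text{(C) Preference:}\quad \forall i \in K,\ i \text{ prefers to conform to } R \text{ given that all } j \in K \setminus \{i\} \text{ conform}
\end{align*}
% with depth x = the fraction of S-instances that are T, breadth y = the
% fraction of the Ps who are Ks in the instance, and degree z = how closely
% (A)-(C) obtaining resembles a coordination equilibrium that solves a
% coordination problem among the Ks.
```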
  18.
    Debate about the epistemic prowess of historical science has focused on local underdetermination problems generated by a lack of historical data: the prevalence of information loss over geological time, and the capacities of scientists to mitigate it. Drawing on Leonelli’s recent distinction between ‘phenomena-time’ and ‘data-time’, I argue that factors like data generation, curation and management significantly complexify and undermine this: underdetermination is a bad way of framing the challenges historical scientists face. In doing so, I identify circumstances of ‘epistemic scarcity’ where underdetermination problems are particularly salient, and discuss cases where ‘legacy data’—data generated using differing technologies and systems of practice—are drawn upon to overcome underdetermination. This suggests that one resource for overcoming underdetermination is our knowledge of science’s past. Further, data-time makes agnostic positions about the epistemic fortunes of scientists working under epistemic scarcity more plausible. But agnosticism seems to leave philosophers without much normative grip. So, I sketch an alternative approach: focusing on the strategies scientists adopt to maximize their epistemic power in light of the resources available to them.
    Found 6 days, 7 hours ago on PhilSci Archive
  19.
    Epidemiological explanation often has a “black box” character, meaning the intermediate steps between cause and effect are unknown. Filling in black boxes is thought to improve causal inferences by making them intelligible. I argue that adding information about intermediate causes to a black box explanation is an unreliable guide to pragmatic intelligibility because it may mislead us about the stability of a cause. I diagnose a problem that I call wishful intelligibility, which occurs when scientists misjudge the limitations of certain features of an explanation. Wishful intelligibility gives us a new reason to prefer black box explanations in some contexts.
    Found 6 days, 18 hours ago on PhilSci Archive
  20.
    (1700 words; 8 minute read.) What rational polarization looks like. It’s September 21, 2020. Justice Ruth Bader Ginsburg has just died. Republicans are moving to fill her seat; Democrats are crying foul. Fox News publishes an op-ed by Ted Cruz arguing that the Senate has a duty to fill her seat before the election. …
    Found 1 week ago on Kevin Dorst's blog
  21.
    What is possible, according to the empiricist conception, is what our evidence positively allows; and what is necessary is what it compels. These notions, along with logical possibility, are the only defensible notions of possibility and necessity. In so far as nomic and metaphysical possibility are defensible, they fall within empirical possibility. These empirical conceptions are incompatible with traditional possible world semantics. Empirically necessary propositions cannot be defined as those true in all possible worlds. There can be empirical possibilities without empirical necessities. The duality of possibility and necessity can be degenerate and can even be falsified.
    Found 1 week ago on John Norton's site
  22.
    Consider an utterance of ‘Fish sticks are tasty’ as made by a speaker who likes fish sticks. How will the speaker assess this claim when, at some later point in her life, she comes to dislike fish sticks? As true or as false? Will she retract her earlier statement or stand by it? More generally, will she use her present taste standard in assessing the claim or the standard she had at the time of the original utterance? The answer to this question is of vital importance for the recent discussion on the semantics and pragmatics of so-called “predicates of personal taste” (e.g. “tasty” and “fun”).
    Found 1 week ago on Philosopher's Imprint
  23.
    Many philosophers think that common sense knowledge survives sophisticated philosophical proofs against it. It’s much more certain that things move than it is that the premises of Zeno’s counterarguments are true. What goes for Zeno’s arguments against motion arguably goes for philosophical arguments against causation, time, tables, human beings, knowledge, and more.
    Found 1 week ago on PhilPapers
  24.
    It is widely held that assertions are partially governed by an epistemic norm. But what is the epistemic condition set out in the norm? Is it knowledge, truth, belief, or something else? In this paper, I defend a view similar to that of Stanley (2008), according to which the relevant epistemic condition is epistemic certainty, where epistemic certainty (but not knowledge) is context-sensitive. I start by distinguishing epistemic certainty, subjective certainty, and knowledge. Then, I explain why it’s much more plausible to think that ‘certain’, rather than ‘know’, is context-sensitive. After that, I respond to an important worry raised by Pritchard, according to which the proposed view is too strong to accommodate our current practice of assertion. I then show that the main linguistic and conversational data advanced in the recent literature in favour of the knowledge condition are best explained by the certainty view. Finally, I offer two principled considerations: the certainty view is the only one compatible with three independently plausible claims and it fits very well with the common thought that knowledge does not entail certainty.
    Found 1 week ago on PhilPapers
  25.
    No Face Ghost by Duangrat Anutaratanya
    A picture of continuity
    Some beliefs are epistemically innocent when they are irrational but provide epistemic benefits that would not be available otherwise. We already saw some examples: delusion, confabulation, and optimistically biased beliefs. …
    Found 1 week, 1 day ago on The Brains Blog
  26.
    In a series of papers over the past twenty years, and in a new book, Igor Douven has argued that Bayesians are too quick to reject versions of inference to the best explanation that cannot be accommodated within their framework. In this paper, I survey Douven’s worries and attempt to answer them using a series of pragmatic and purely epistemic arguments that I take to show that Bayes’ Rule really is the only correct way to respond to your evidence.
    Found 1 week, 1 day ago on PhilPapers
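The contrast at issue can be made concrete with a small sketch (mine, not Douven's code). A bonus-style explanationist rule, in the vicinity of those Douven discusses, adds a small constant to the hypothesis that best explains the evidence before renormalizing, and so comes apart from strict conditionalization; taking "best explanation" to mean "highest likelihood" is a simplifying assumption here.

```python
def bayes_update(prior, likelihood):
    # Strict conditionalization (Bayes' Rule).
    post = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

def ibe_update(prior, likelihood, bonus=0.1):
    # Explanationist variant: a small bonus for the "best explanation",
    # here simplified to the hypothesis with the highest likelihood.
    best = max(likelihood, key=likelihood.get)
    post = {h: prior[h] * likelihood[h] + (bonus if h == best else 0.0)
            for h in prior}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

# Three coin-bias hypotheses; we observe a single toss landing heads.
prior = {"bias 0.25": 1/3, "bias 0.50": 1/3, "bias 0.75": 1/3}
like = {"bias 0.25": 0.25, "bias 0.50": 0.50, "bias 0.75": 0.75}

print(bayes_update(prior, like))  # Bayes: P(bias 0.75 | heads) = 0.5
print(ibe_update(prior, like))    # the explanationist rule boosts "bias 0.75" further
```

Repeated over many updates, the extra boost compounds, which is why the two rules can recommend different credences in the long run.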
  27.
    Landscape (The Promise), original artwork by Magdalena Antrobus
    Powerful agents
    We are likely to overestimate our capacities and make exceedingly rosy predictions about our future. This widespread bias towards optimism is a robust finding in psychology. …
    Found 1 week, 2 days ago on The Brains Blog
  28.
    Bayesian personalism models learning from experience as the updating of an agent’s credence function on the information the agent acquires. The standard updating rules are hamstrung for zero probability events. The maneuvers that have been proposed to handle this problem are examined and found wanting: they offer only temporary relief but no satisfying and stable long-term resolution. They do suggest a strategy for avoiding the problem altogether, but the price to be paid is a very crabbed account of learning from experience. I outline what Bayesians would need to do in order to come to grips with the problem rather than seeking to avoid it. Furthermore, I emphasize that an adequate treatment of the issues must work not only for classical probability but also for quantum probability, which is rarely discussed in the philosophical literature in the same breath with the updating problem. Since it is not obvious how the maneuvers applied to updating classical probability can be made to work for updating quantum probability, a rethinking of the problem may be required. At the same time I indicate that in some special cases quantum probability theory has a self-contained solution to the problem of updating on zero probability events, requiring no additional technical devices or rationality constraints.
    Found 1 week, 3 days ago on PhilSci Archive
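A minimal sketch of the opening point (mine, with a toy probability space): the ratio formula behind standard conditionalization simply has nothing to say when the evidence has prior probability zero.

```python
from fractions import Fraction

# Standard conditionalization: P(H | E) = P(H & E) / P(E).
# A toy probability space in which two worlds carry probability zero.
P = {"w1": Fraction(1, 2), "w2": Fraction(1, 2),
     "w3": Fraction(0), "w4": Fraction(0)}

def prob(event):
    return sum(P[w] for w in event)

def conditionalize(hypothesis, evidence):
    pe = prob(evidence)
    if pe == 0:
        raise ZeroDivisionError("ratio formula undefined: P(E) = 0")
    return prob(set(hypothesis) & set(evidence)) / pe

print(conditionalize({"w1"}, {"w1", "w2"}))  # well-defined: Fraction(1, 2)
try:
    conditionalize({"w3"}, {"w3", "w4"})     # evidence has prior probability 0
except ZeroDivisionError as err:
    print(err)                               # the standard rule is silent here
```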
  29.
    Three Little Wild Boar and the Big Bad Wolf by Duangrat Anutaratanya
    Incurable confabulators
    Philosophers sometimes describe humans as rational animals. It would be more accurate to say that we are confabulating animals. …
    Found 1 week, 3 days ago on The Brains Blog
  30.
    In this paper, I use interventionist causal models to identify some novel Newcomb problems, and subsequently use these problems to refine existing interventionist treatments of causal decision theory. The new Newcomb problems that stir trouble for existing interventionist treatments involve so-called “exotic choice” — i.e., decision-making contexts where the agent has evidence about the outcome of her choice. I argue that when choice is exotic, the interventionist can adequately capture causal-decision-theoretic reasoning by introducing a new interventionist approach to updating on exotic evidence. But I also argue that this new updating procedure is principled only if the interventionist trades in the typical interventionist conception of choice for an alternative Ramseyan conception. I end by arguing that the guide to exotic choice developed here may be useful in some everyday contexts, despite its name.
    Found 1 week, 3 days ago on PhilPapers
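To see the interventionist machinery in the ordinary (non-exotic) Newcomb case, here is a toy model (mine, with the textbook payoffs and a made-up predictor accuracy and prior; the paper's exotic-choice cases are more involved). Conditioning treats the act as evidence about the prediction, while intervening via do(act) severs the arrow from the agent's disposition into the act and leaves the prediction at its prior distribution.

```python
ACCURACY = 0.99      # hypothetical P(prediction matches the agent's disposition)
P_ONEBOX_TYPE = 0.5  # hypothetical prior that the agent is a one-boxing type

def payoff(act, prediction):
    # The opaque box holds $1,000,000 iff one-boxing was predicted;
    # two-boxing always adds the transparent box's $1,000.
    box_b = 1_000_000 if prediction == "one-box" else 0
    return box_b if act == "one-box" else box_b + 1_000

def other(act):
    return "two-box" if act == "one-box" else "one-box"

def evidential_value(act):
    # Conditioning: the act is strong evidence about the prediction.
    return ACCURACY * payoff(act, act) + (1 - ACCURACY) * payoff(act, other(act))

def causal_value(act):
    # do(act): the prediction keeps its prior distribution, unmoved by the act.
    p_pred_one = (P_ONEBOX_TYPE * ACCURACY
                  + (1 - P_ONEBOX_TYPE) * (1 - ACCURACY))
    return (p_pred_one * payoff(act, "one-box")
            + (1 - p_pred_one) * payoff(act, "two-box"))

print(evidential_value("one-box"), evidential_value("two-box"))  # EDT: one-box wins
print(causal_value("one-box"), causal_value("two-box"))          # CDT: two-box wins
```

Exotic choice breaks the tidy picture above because the agent then has evidence about the act's outcome itself, which is what motivates the paper's alternative updating procedure.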