1.
    The desirable gambles framework offers the most comprehensive foundations for the theory of lower previsions, which in turn affords the most general account of imprecise probabilities. Nevertheless, for all its generality, the theory of lower previsions rests on the notion of linear utility. This commitment to linearity is clearest in the coherence axioms for sets of desirable gambles. This paper considers two routes to relaxing this commitment. The first preserves the additive structure of the desirable gambles framework and the machinery for coherent inference but detaches the interpretation of desirability from the multiplicative scale invariance axiom. The second strays from the additive combination axiom to accommodate repeated gambles that return rewards by a non-stationary process that is not necessarily additive. Unlike the first approach, which is a conservative amendment to the desirable gambles framework, the second is a radical departure. Yet, common to both is a method for describing rewards called discounted utility.
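    One standard presentation of the coherence axioms for a set D of desirable gambles on a possibility space Ω runs as follows (formulations vary across the literature, so read this as a reference sketch rather than the paper's own statement):

      \begin{align*}
      \text{(D1)}\quad & f \leq 0 \;\Rightarrow\; f \notin D\\
      \text{(D2)}\quad & f \geq 0,\ f \neq 0 \;\Rightarrow\; f \in D\\
      \text{(D3)}\quad & f \in D,\ \lambda > 0 \;\Rightarrow\; \lambda f \in D && \text{(multiplicative scale invariance)}\\
      \text{(D4)}\quad & f, g \in D \;\Rightarrow\; f + g \in D && \text{(additive combination)}
      \end{align*}

    On this labelling, the paper's first route keeps the additive structure and coherent-inference machinery while detaching the interpretation of desirability from (D3); the second route departs from (D4).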
    Found 23 hours, 11 minutes ago on PhilSci Archive
  2.
    We’re always reading about how the pandemic has created a new emphasis on preprints, so it stands to reason that non-reviewed preposts would now have a place in blogs. Maybe then I’ll “publish” some of the half-baked posts languishing in draft on errorstatistics.com. …
    Found 1 day ago on D. G. Mayo's blog
  3.
    COVID-19 has substantially affected our lives during 2020. Since its beginning, several epidemiological models have been developed to investigate the specific dynamics of the disease. Early COVID-19 epidemiological models were purely statistical, based on a curve-fitting approach, and did not include causal knowledge about the disease. Yet, these models had predictive capacity; thus they were used to ground important political decisions, in virtue of the understanding of the dynamics of the pandemic that they offered. This raises a philosophical question about whether purely statistical models can yield understanding and, if so, what the relationship between prediction and understanding in these models is. Drawing on the model developed by the Institute for Health Metrics and Evaluation, we argue that early epidemiological models yielded a modality of understanding that we call descriptive understanding, which contrasts with the so-called explanatory understanding that is assumed to be the only form of scientific understanding.
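    As a toy illustration of what a purely statistical, curve-fitting model looks like in its simplest form (a sketch in the spirit of such models, not the IHME model discussed in the paper), one can fit a logistic curve to cumulative counts and extrapolate:

      # Toy curve-fitting "epidemic model": fit a logistic curve to synthetic
      # cumulative counts and extrapolate. Illustrative only; not the IHME model.
      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(t, K, r, t0):
          """Cumulative counts with ceiling K, growth rate r, midpoint t0."""
          return K / (1.0 + np.exp(-r * (t - t0)))

      rng = np.random.default_rng(0)
      days = np.arange(60)
      observed = logistic(days, 10000, 0.15, 30) + rng.normal(0, 200, days.size)

      params, _ = curve_fit(logistic, days, observed, p0=[5000, 0.1, 25])
      forecast = logistic(np.arange(60, 90), *params)   # extrapolate 30 days ahead
      print("fitted K, r, t0:", params)

    Nothing in such a fit encodes causal assumptions about transmission; whatever predictive capacity it has comes from the fitted curve alone, which is the feature the notion of descriptive understanding is meant to capture.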
    Found 1 day, 19 hours ago on PhilSci Archive
  4.
    The debate between ΛCDM and MOND is often cast in terms of competing gravitational theories. However, recent philosophical discussion suggests that the ΛCDM–MOND debate demonstrates the challenges of multiscale modeling in the context of cosmological scales. I extend this discussion and explore what happens when the debate is thought to be about modeling rather than about theory, offering a model-focused interpretation of the ΛCDM–MOND debate. This analysis shows how a model-focused interpretation of the debate provides a better understanding of challenges associated with extension to a different scale or domain, which are tied to commitments about explanatory fit.
    Found 1 day, 19 hours ago on PhilSci Archive
  5.
    When people combine concepts, the results are often characterised as “hybrid”, “impossible”, or “humorous”. However, when considered simply in terms of extensional logic, a novel concept understood as a conjunctive concept will often lack meaning, having an empty extension (consider “a tooth that is a chair”, “a pet flower”, etc.). Still, people use different strategies to produce new non-empty concepts: additive or integrative combination of features, alignment of features, instantiation, etc. All these strategies involve the ability to deal with conflicting attributes and the creation of new (combinations of) properties. We here consider in particular the case where a Head concept has superior ‘asymmetric’ control over steering the resulting concept combination (or hybridisation) with a Modifier concept. Specifically, we propose a dialogical approach to concept combination and discuss an implementation based on axiom weakening, which models the cognitive and logical mechanics of this asymmetric form of hybridisation.
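    As a deliberately crude illustration of the asymmetry at issue (a toy in plain code, not the paper's dialogical, axiom-weakening machinery over description-logic ontologies), one can picture the Head as overriding the Modifier wherever their attributes conflict, with the conflicting Modifier attributes "weakened" away:

      # Toy Head-dominated concept combination: on conflicting attributes the
      # Head wins and the Modifier's contribution is dropped. A caricature of
      # asymmetric hybridisation, not the paper's axiom-weakening approach.
      def combine(head: dict, modifier: dict) -> dict:
          combined = dict(modifier)   # start from the Modifier's attributes
          combined.update(head)       # Head overrides on any conflicting key
          return combined

      pet    = {"habitat": "domestic", "animate": True}
      flower = {"kind": "plant", "animate": False, "has_petals": True}

      # "pet flower" with Head = flower: the conflict on "animate" is resolved
      # in favour of the Head concept.
      print(combine(head=flower, modifier=pet))
      # {'habitat': 'domestic', 'animate': False, 'kind': 'plant', 'has_petals': True}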
    Found 2 days, 16 hours ago on Nicolas Troquard's site
  6.
    This paper aims to shed light on the relation between Boltzmannian statistical mechanics and Gibbsian statistical mechanics by studying the Mechanical Averaging Principle, which says that, under certain conditions, Boltzmannian equilibrium values and Gibbsian phase averages are approximately equal. What are these conditions? We identify three conditions each of which is individually sufficient (but not necessary) for Boltzmannian equilibrium values to be approximately equal to Gibbsian phase averages: the Khinchin condition, and two conditions that result from two new theorems, the Average Equivalence Theorem and the Cancelling Out Theorem. These conditions are not trivially satisfied, and there are core models of statistical mechanics, the six-vertex model and the Ising model, in which they fail.
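    Schematically, and only as a gloss on terms the paper defines precisely, the Mechanical Averaging Principle concerns the approximate equality

      $$ \langle f \rangle_{\mathrm{Gibbs}} \;=\; \int_{\Gamma} f(x)\,\rho(x)\,\mathrm{d}x \;\approx\; f_{\mathrm{eq}}, $$

    where ρ is the stationary Gibbsian probability density on the phase space Γ and f_eq is the Boltzmannian equilibrium value of the macrovariable f; the Khinchin condition, the Average Equivalence Theorem, and the Cancelling Out Theorem each supply sufficient conditions for this approximation to hold.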
    Found 2 days, 23 hours ago on PhilSci Archive
  7.
    The relational interpretation (or RQM, for Relational Quantum Mechanics) solves the measurement problem by considering an ontology of sparse relative facts. Facts are realized in interactions between any two physical systems and are relative to these systems. RQM’s technical core is the realisation that quantum transition amplitudes determine physical probabilities only when their arguments are facts relative to the same system. The relativity of facts can be neglected in the approximation where decoherence hides interference, thus making facts approximately stable.
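    For orientation, the background amplitude-to-probability rule (stated here in its textbook form, independently of RQM's relational gloss) is

      $$ P(b \mid a) \;=\; \lvert \langle b \mid a \rangle \rvert^{2}, $$

    and RQM's restriction, as summarised above, is that this expression yields a physical probability only when a and b are facts relative to the same system.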
    Found 2 days, 23 hours ago on PhilSci Archive
  8.
    Dupré and Nicholson (2018) defend the metaphysical thesis that the ‘living world’ is not composed of things or substances, as traditionally believed, but of processes. They advocate a process – as opposed to a substance – metaphysics and ontology, which turns out to be more empirically adequate to what contemporary biology suggests.
    Found 2 days, 23 hours ago on PhilSci Archive
  9.
    We propose that measures of information integration can be more straightforwardly interpreted as measures of agency rather than of consciousness. This may be useful to the goals of consciousness research, given how agency and consciousness are “duals” in many (though not all) respects.
    Found 2 days, 23 hours ago on PhilSci Archive
  10.
    The notion of growth is one of the most studied notions within economic theory and, traditionally, it is accounted for on the basis of a positivist thesis according to which assumptions are not relevant, as long as economic models have acceptable predictive power. Following this view, it does not matter whether assumptions are realistic or not. Arguments against this principle may involve a defense of realistic assumptions over highly idealized or false ones. This article aims in a different direction. Rather than demanding more realism, we can accept the spirit of this thesis but criticize the circularity that may arise from combining the different assumptions that are necessary for the explanation of economic growth in mainstream economics. Such circularity is a key aspect of the well-known problem of providing microfoundations for macroeconomic properties. It is here suggested that the notion of emergence could be appropriate to arrive at a better understanding of growth, clarifying the issues related to circularity, but without totally rejecting the usefulness of unrealistic assumptions.
    Found 2 days, 23 hours ago on PhilSci Archive
  11.
    Aesthetic values have featured in scientific practice for centuries, shaping what theories and experiments are pursued, what explanations are considered satisfactory and whether theories are trusted. How do such values enter the different levels of scientific practice, and should they influence our epistemic attitudes? In this chapter I explore these questions and consider how, as science progresses, the questions we ask about the role of aesthetic values might change. I start this chapter with an overview of the traditional philosophical distinction between context of discovery and context of justification, showing how aesthetic values were taken to be relevant to scientific discovery but not to scientific evaluation, which was regarded as value-free. I then proceed with an exploration of different levels of scientific activity, from designing experiments and reconstructing fossils to evaluating data. In this discussion we will see that the traditional distinction between context of discovery and justification seems to break down, as aesthetic values shape all levels of scientific activity. I then turn to the epistemological question: can beauty play an epistemic role, is it to be trusted, or is it a suspect value that might bias scientific inquiry? I explore how we could justify the epistemic import of aesthetic values and present some concerns as well. In the last section I ask whether we should expect the questions surrounding aesthetic values in scientific practice to change with scientific progress, as we enter the era of post-empirical physics and big data science and make more and more discoveries using AI.
    Found 6 days, 7 hours ago on PhilSci Archive
  12.
    I provide a critical commentary regarding the attitude of the logician and the philosopher towards the physicist and physics. The commentary is intended to showcase how a general change in attitude towards making scientific inquiries can be beneficial for science as a whole. However, such a change can come at the cost of looking beyond the categories of the disciplines of logic, philosophy and physics. It is through self-inquiry that such a change is possible, along with the realization of the essence of the middle that is otherwise excluded by choice. The logician, who generally holds a reverential attitude towards the physicist, can then actively contribute to the betterment of physics by improving the language through which the physicist expresses his experience. The philosopher, who otherwise chooses to follow the advancement of physics and gets stuck in the trap of sophistication of language, can then be of guidance to the physicist on intellectual grounds by having the physicist’s experience himself. In the course of this commentary, I provide a glimpse of how a truthful conversion of verbal statements to physico-mathematical expressions unravels the hitherto unrealized connection between the Heisenberg uncertainty relation and Cauchy’s definition of the derivative that is used in physics. The commentary can be an essential reading if the reader is willing to look beyond the categories of logic, philosophy and physics by being ‘nobody’.
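    For reference, the two standard pieces of mathematics the commentary brings together are the Heisenberg uncertainty relation and Cauchy's limit definition of the derivative, stated here in their textbook forms (the claimed connection between them is the author's own):

      $$ \Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}, \qquad f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}. $$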
    Found 6 days, 7 hours ago on PhilSci Archive
  13.
    Many physicists have thought that absolute time became otiose with the introduction of Special Relativity. William Lane Craig disagrees. Craig argues that although relativity is empirically adequate within a domain of application, relativity is literally false and should be supplanted by a Neo-Lorentzian alternative that allows for absolute time. Meanwhile, Craig and co-author James Sinclair have argued that physical cosmology supports the conclusion that physical reality began to exist at a finite time in the past. However, on their view, the beginning of physical reality requires the objective passage of absolute time, so that the beginning of physical reality stands or falls with Craig’s Neo-Lorentzian metaphysics. Here, I raise doubts about whether, given Craig’s Neo-Lorentzian metaphysics, physical cosmology could adequately support a beginning of physical reality within the finite past. Craig and Sinclair’s conception of the beginning of the universe requires a past boundary to the universe. A past boundary to the universe cannot be directly observed and so must be inferred from the observed matter-energy distribution in conjunction with auxiliary hypotheses drawn from a substantive physical theory. Craig’s brand of Neo-Lorentzianism has not been sufficiently well specified so as to infer either that there is a past boundary or that the boundary is located in the finite past. Consequently, Neo-Lorentzianism implicitly introduces a form of skepticism that removes the ability we might otherwise have had to infer a beginning of the universe. Furthermore, in analyzing traditional big bang models, I develop criteria that Neo-Lorentzians should deploy in thinking about the direction and duration of time in cosmological models generally. For my last task, I apply the same criteria to bounce cosmologies and show that Craig and Sinclair have been wrong to interpret bounce cosmologies as including a beginning of physical reality.
    Found 6 days, 7 hours ago on PhilSci Archive
  14.
    Explanatory realism is the view that explanations work by providing information about relations of productive determination such as causation or grounding. The view has gained considerable popularity in recent decades, especially in the context of metaphysical debates about non-causal explanation. What makes the view particularly attractive is that it fits nicely with the idea that not all explanations are causal whilst avoiding an implausible pluralism about explanation. Another attractive feature of the view is that it allows explanation to be a partially epistemic, context-dependent phenomenon. In spite of its attractiveness, explanatory realism has recently been subject to criticism. In particular, Taylor (Philos Stud 175(1):197–219, 2018) has presented four types of explanation that the view allegedly cannot account for. This paper defends explanatory realism against Taylor’s challenges. We will show that Taylor’s counterexamples either turn out to provide information about entities standing in productive determination relations or are not genuine explanations in the first place.
    Found 6 days, 11 hours ago on PhilPapers
  15.
    Johannes Kepler (1571–1630) is one of the most significant representatives of the so-called Scientific Revolution of the 16th and 17th centuries. Although he received only the basic training of a “magister” and was professionally oriented towards theology at the beginning of his career, he rapidly became known for his mathematical skills and theoretical creativity. As a convinced Copernican, Kepler was able to defend the new system on different fronts: against the old astronomers who still sustained the system of Ptolemy, against the Aristotelian natural philosophers, against the followers of the new “mixed system” of Tycho Brahe—whom Kepler succeeded as Imperial Mathematician in Prague—and even against the standard Copernican position according to which the new system was to be considered merely as a computational device and not necessarily a physical reality.
    Found 6 days, 20 hours ago on Wes Morriston's site
  16.
    Traditionally, logic has been the dominant formal method within philosophy. Are logical methods still dominant today, or have the types of formal methods used in philosophy changed in recent times? To address this question, we coded a sample of philosophy papers from the late 2000s and from the late 2010s for the formal methods they used. The results indicate that (a) the proportion of papers using logical methods remained more or less constant over that time period but (b) the proportion of papers using probabilistic methods was approximately three times higher in the late 2010s than it was in the late 2000s. Further analyses explored this change by looking more closely at specific methods, specific levels of technical engagement, and specific subdisciplines within philosophy. These analyses indicate that the increasing proportion of papers using probabilistic methods was pervasive, not confined to particular probabilistic methods, levels of sophistication, or subdisciplines.
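    As a hedged illustration of how such a shift in proportions across two sampling periods might be checked for statistical reliability (the counts below are invented for the sketch and are not the study's data), a standard two-proportion z-test suffices:

      # Illustrative two-proportion z-test for a change in the share of papers
      # using probabilistic methods across two periods. Counts are invented.
      from math import sqrt
      from statistics import NormalDist

      k1, n1 = 12, 200   # hypothetical late-2000s sample: probabilistic / total
      k2, n2 = 36, 200   # hypothetical late-2010s sample
      p1, p2 = k1 / n1, k2 / n2
      p_pool = (k1 + k2) / (n1 + n2)
      se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
      z = (p2 - p1) / se
      p_value = 2 * (1 - NormalDist().cdf(abs(z)))
      print(f"p1={p1:.2f}, p2={p2:.2f}, z={z:.2f}, two-sided p={p_value:.4f}")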
    Found 1 week ago on PhilSci Archive
  17.
    As with most topics in philosophy, there is no consensus about what experimental philosophy is. Most broadly, experimental philosophy involves using scientific methods to collect empirical data for the purpose of casting light on philosophical issues. Such a definition threatens to be too broad, however: if the nature of matter counts as a philosophical issue, research at the Large Hadron Collider would count as experimental philosophy. Others have suggested narrower definitions, characterizing experimental philosophy in terms of the use of scientific methods to investigate intuitions. This threatens to be too narrow, however, excluding such work as Eric Schwitzgebel’s comparison of the rates of theft of ethics books with similar volumes from other areas of philosophy for the purpose of finding out whether philosophical training in ethics promotes moral behavior. While restricting experimental philosophy to the study of intuitions is too narrow, this nonetheless covers most of the research in this area. Focusing on this research, we begin by discussing some of the methods that have been used by experimental philosophers. We then distinguish between three types of goals that have guided experimental philosophers, illustrating these goals with some examples.
    Found 1 week ago on Justin Sytsma's site
  18.
    Eugen Fischer and colleagues expand on a body of empirical work offering a debunking explanation of a key assumption involved in the argument from illusion. Following Snowden (1992), we can distinguish between the base case and the spreading step in the argument. Fischer et al. target the base case. In the most prominent current versions of the argument, the key move in the base case involves the phenomenal principle (Robinson, 1994, 32): “If there sensibly appears to a subject to be something which possesses a particular sensible quality then there is something of which the subject is aware which does possess that sensible quality.” In brief, Fischer et al. contend that the move here from a seemingly uncontroversial claim such as “the coin appears elliptical to me” to there being something of which the subject is aware that is elliptical requires that the initial claim be given a “literal interpretation” such that something elliptical has appeared to the subject. But they contend that under such an interpretation the claim should no longer be taken to be uncontroversial, assuming too much of what the argument needs to establish. And they argue that much of the intuitive appeal of this move can be explained in terms of accepting the claim based on the dominant usage of appearance verbs (e.g., I think the coin is elliptical), then shifting to the less salient phenomenal usage required for the conclusion. Fischer et al. then present the results of a series of nifty new studies in cross-cultural psycholinguistics to support the conclusion that people make stereotypical inferences warranted by the dominant sense of appearance verbs, even in contexts where this dominant sense is inappropriate.
    Found 1 week ago on Justin Sytsma's site
  19.
    Eugen Fischer and John Collins have brought together an impressive, and important, series of essays concerning the methodological debates between rationalists and naturalists, and how these debates have been impacted by work in experimental philosophy. The work at issue concerns the evidential value of intuitions, and as such is only a small part of the experimental philosophy corpus as I understand it. In fact, Fischer and Collins define experimental philosophy in this narrow sense in their introduction. On their view, experimental philosophy ‘‘builds on the assumption that, for better or worse, intuitions are crucially involved in philosophical work’’ (3). The parenthetical serves to emphasize that such work could either be pursued from a positive perspective aiming to vindicate the use of intuitions in philosophy or from a negative perspective aiming to undermine that use. Noting these two perspectives, it might then seem that experimental philosophy is neutral with regard to methodological debate: ‘‘experimental philosophy is not a party to the dispute between methodological rationalism and naturalism, but offers a new framework for settling it’’ (23).
    Found 1 week ago on Justin Sytsma's site
  20.
    In this paper, we use the case of the COVID-19 pandemic in Europe to address the question of what kind of knowledge we should incorporate into public health policy. We show that policy-making in Europe during the COVID-19 pandemic has been biomedicine-centric in that its evidential basis marginalised input from non-biomedical disciplines. We then argue that in particular the social sciences could contribute essential expertise and evidence to public health policy in times of biomedical emergencies and that we should thus strive for a tighter integration of the social sciences in future evidence-based policy-making. This demand faces challenges on different levels, which we identify and discuss as potential inhibitors for a more pluralistic evidential basis.
    Found 1 week ago on PhilSci Archive
  21.
    Writing comments on a post about adversarial collaboration feels like a place where I should be adversarial (if in a collaborative spirit). But I agree with basically everything Eric says here. Frankly, this is all spot on. You probably don’t want to read 500 words from me just saying “yep, this” and agreeing with his excellent, sensible advice, though. So, let me attempt to be provocative: Eric doesn’t go far enough! (Not that he was trying to, of course.) All philosophers should be asking themselves what empirical evidence would actually test their views. Collaboration should be the rule, not the exception. And we should expect collaborations to have an adversarial element, treating this as a feature, not a bug.
    Found 1 week, 1 day ago on Justin Sytsma's site
  22.
    Betting on collapse (EDC, ch.6) Posted on Wednesday, 15 Sep 2021. Topic: decision theory Chapter 6 of Evidence, Decision and Causality presents another alleged counterexample to CDT, involving a bet on the measurement of entangled particles. …
    Found 1 week, 1 day ago on wo's weblog
  23.
    I offer a case that quantum query complexity still has loads of enticing and fundamental open problems—from relativized QMA versus QCMA and BQP versus IP, to time/space tradeoffs for collision and element distinctness, to polynomial degree versus quantum query complexity for partial functions, to the Unitary Synthesis Problem and more.
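    For context on two of the problems named: the bounded-error quantum query complexities of collision and element distinctness on inputs of size N are themselves settled,

      $$ Q(\text{Collision}) = \Theta\!\left(N^{1/3}\right), \qquad Q(\text{Element Distinctness}) = \Theta\!\left(N^{2/3}\right), $$

    so the open questions referred to concern time/space tradeoffs for algorithms attaining these query counts, not the query counts themselves.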
    Found 1 week, 2 days ago on Scott Aaronson's site
  24.
    Most authors who discuss willpower assume that everyone knows what it is, but our assumptions differ to such an extent that we talk past each other. We agree that willpower is the psychological function that resists temptations – variously known as impulses, addictions, or bad habits; that it operates simultaneously with temptations, without prior commitment; and that use of it is limited by its cost, commonly called effort, as well as by the person’s skill at executive functioning. However, accounts are usually not clear about how motivation functions during the application of willpower, or how motivation is related to effort. Some accounts depict willpower as the perception or formation of motivational contingencies that outweigh the temptation, and some depict it as a continuous use of mechanisms that interfere with reweighing the temptation. Some others now suggest that impulse control can bypass motivation altogether, although they refer to this route as habit rather than willpower.
    Found 1 week, 2 days ago on Daniel Kelly's site
  25.
    The pattern of implicatures of the modified numeral ‘more than n’ depends on the roundness of n. Cummins, Sauerland, and Solt (2012) present experimental evidence for the relation between roundness and implicature patterns, and propose a pragmatic account of the phenomenon. More recently, Hesse and Benz (2020) present more extensive evidence showing that implicatures also depend on the magnitude of n and propose a novel explanation based on the Approximate Number System (Dehaene, 1999). Despite the wealth of experimental data, no formal account has yet been proposed to characterize the full posterior distribution over numbers of a listener after hearing ‘more than n’. We develop one such account within the Rational Speech Act framework, quantitatively reconstructing the pragmatic reasoning of a rational listener. We show that our pragmatic account correctly predicts various features of the experimental data.
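    A minimal Rational Speech Act listener for ‘more than n’ can be written down directly; the sketch below is the generic RSA recipe under simple assumptions (uniform prior, zero utterance cost, a hand-picked utterance set), not the authors' model, which additionally builds in roundness and the Approximate Number System:

      # Minimal RSA listener for 'more than n'. Generic sketch of the framework;
      # not the paper's model (which adds roundness/ANS components).
      import numpy as np

      states = np.arange(1, 101)            # candidate quantities 1..100
      utterances = [60, 65, 70, 75, 80]     # 'more than n' for these values of n
      alpha = 4.0                           # speaker rationality

      def literal(n):
          """Literal listener: uniform over states strictly greater than n."""
          truth = (states > n).astype(float)
          return truth / truth.sum()

      L0 = np.stack([literal(n) for n in utterances])   # shape (utterances, states)

      # Speaker: chooses utterances in proportion to L0(state | utterance)^alpha.
      unnorm = L0 ** alpha
      col_sums = unnorm.sum(axis=0, keepdims=True)
      S1 = np.divide(unnorm, col_sums, out=np.zeros_like(unnorm), where=col_sums > 0)

      # Pragmatic listener: Bayes over states given the speaker model and a
      # uniform prior; each row is the posterior after one utterance.
      prior = np.full(states.size, 1.0 / states.size)
      L1 = S1 * prior
      L1 = L1 / L1.sum(axis=1, keepdims=True)

      idx = utterances.index(70)
      print("expected quantity after 'more than 70':", float((states * L1[idx]).sum()))

    The pragmatic posterior concentrates just above the stated bound, because a speaker who knew the quantity was much larger would have chosen a more informative utterance; reproducing the observed roundness and magnitude effects is what the paper's fuller model is for.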
    Found 1 week, 2 days ago on Jakub Szymanik's site
  26.
    While evenness is understood to be maximal if all types (species, genotypes, alleles, etc.) are represented equally (via abundance, biomass, area, etc.), its opposite, maximal unevenness, either remains conceptually in the dark or is conceived as the type distribution that minimizes the applied evenness index. The latter approach, however, frequently leads to conceptual inconsistency due to the fact that the minimizing distribution is not specifiable or is monomorphic. The state of monomorphism, however, is indeterminate in terms of its evenness/unevenness characteristics. Indeed, the semantic indeterminacy also shows up in the observation that monomorphism represents a state of pronounced discontinuity for the established evenness indices. This serious conceptual inconsistency is latent in the widely held idea that evenness is an independent component of diversity. As a consequence, the established evenness indices largely appear as indicators of relative polymorphism rather than as indicators of evenness.
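    A concrete way to see the indeterminacy at monomorphism (using Pielou's familiar index as the example, which is my choice of illustration rather than the authors'): for S types with relative abundances p_i,

      $$ J \;=\; \frac{H}{\ln S}, \qquad H \;=\; -\sum_{i=1}^{S} p_i \ln p_i, $$

    so for a monomorphic community (S = 1, p_1 = 1) both H and ln S vanish and J takes the indeterminate form 0/0, which is exactly the kind of discontinuity at monomorphism described above.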
    Found 1 week, 3 days ago on PhilSci Archive
  27.
    Is freedom compatible with determinism? Davidson (in “Mental Events”) famously rephrased this question by replacing “freedom” with “anomaly of the mental”, that is, failure to fall under a law. In order to prove that the anomaly of the mental is compatible with other conjectures he makes, in particular that: (a) there is psycho-physical causation; (b) “where there is causality, there must be a law” (Davidson 1970, p. 208); and (c) the mental supervenes on the physical, Davidson proposed a model (i.e., an interpretation under which all these conjectures are true), that came to be known as anomalous monism.
    Found 1 week, 3 days ago on PhilSci Archive
  28.
    Phylogenetic models traditionally represent the history of life as having a strictly-branching tree structure. However, it is becoming increasingly clear that the history of life is often not strictly-branching; lateral gene transfer, endosymbiosis, and hybridization, for example, can all produce lateral branching events. There is thus motivation to allow phylogenetic models to have a reticulate structure. One proposal involves the reconciliation of genealogical discordance. Briefly, this method involves using patterns of disagreement – discordance – between trees of different genes to add lateral branching events to phylogenetic trees of taxa, and to estimate the most likely cause of these events. I use this practice to argue for: (1) a need for expanded accounts of multiple-models idealization, (2) a distinction between automatic and manual de-idealization, and (3) recognition that idealization may serve the meso-level aims of science in a different way than hitherto acknowledged.
    Found 1 week, 3 days ago on PhilSci Archive
  29.
    We were slightly concerned, upon having read Eric Winsberg, Jason Brennan and Chris Surprenant’s reply to our paper “Were Lockdowns Justified? A Return to the Facts and Evidence”, that they may have fundamentally misunderstood the nature of our argument, so we issue the following clarification, along with a comment on our motivations for writing such a piece, for the interested reader.
    Found 1 week, 3 days ago on PhilSci Archive
  30.
    Fixing the Past and the Laws (EDC, ch.5) Posted on Monday, 13 Sep 2021. Chapter 5 of Evidence, Decision and Causality presents a powerful challenge to CDT (drawing on Ahmed (2013) and Ahmed (2014)). Imagine you have strong evidence that a certain deterministic system S is the true system of laws in our universe. …
    Found 1 week, 3 days ago on wo's weblog