1.
    The issue of independent evidence is of central importance to hypothesis testing in evolutionary biology. Suppose you wanted to test the hypothesis that long fur is an adaptation to cold climate and short fur is an adaptation to warm climate. You look at 20 bear species; 10 live in a cold climate and have long fur, and 10 live in a warm climate and have short fur. Is there any reason to think that the data do not confirm the adaptive hypothesis? One worry is that the species in each group resemble each other merely because they inherited their fur length from a common ancestor of the group (and that the temperatures experienced by ancestors and descendants are similar). This influence of ancestor on descendant is often called phylogenetic inertia (e.g., see Harvey and Pagel 1991).
    Found 12 hours, 26 minutes ago on Elliott Sober's site
  2.
    There are many reasons for objecting to quantifying the ‘proof beyond reasonable doubt’ standard of criminal law as a percentage probability. They are divided into ethical and policy reasons, on the one hand, and reasons arising from the nature of logical probabilities, on the other. It is argued that these reasons are substantial and suggest that the criminal standard of proof should not be given a precise number. But those reasons do not rule out a minimal imprecise number. ‘Well above 80%’ is suggested as a standard, implying that any attempt by a prosecutor or jury to take the ‘proof beyond reasonable doubt’ standard to be 80% or less should be ruled out as a matter of law.
    Found 12 hours, 27 minutes ago on James Franklin's site
  3.
    Abstract: The paper aims to show how mathematical practice, in particular practice involving visual representations, can lead to new mathematical results. The argument is based on a case study from a relatively recent and promising mathematical subject: geometric group theory. The paper discusses how the representation of groups by Cayley graphs made it possible to discover new geometric properties of groups.
    Found 1 day ago on PhilSci Archive
  4.
    The idea that a serious threat to scientific realism comes from unconceived alternatives has been proposed by van Fraassen, Sklar, Stanford and Wray among others. Peter Lipton’s critique of this threat from underconsideration is examined briefly in terms of its logic and its applicability to the case of space-time and particle physics. The example of space-time and particle physics indicates a generic heuristic for quantitative sciences for constructing potentially serious cases of underdetermination, involving a one-parameter family of rivals T_m (m real and small) that work as a team rather than as a single rival against the default theory T. In important examples this new parameter has a physical meaning (e.g., particle mass) and makes a crucial conceptual difference, shrinking the symmetry group and in some cases putting gauge freedom, formal indeterminism vs. determinism, the presence of the hole argument, etc., at risk. Methodologies akin to eliminative induction or tempered subjective Bayesianism are more demonstrably reliable than the custom of attending only to “our best theory”: they can lead either to a serious rivalry or to improved arguments for the favorite theory. The example of General Relativity (massless spin 2 in particle physics terminology) vs. massive spin 2 gravity, a recent topic in the physics literature, is discussed. Arguably the General Relativity and philosophy literatures have ignored the most serious rival to General Relativity.
    Found 1 day ago on PhilSci Archive
  5.
    The slogan ‘Evidence of evidence is evidence’ may sound plausible, but what it means is far from clear. It has often been applied to connect evidence in the current situation to evidence in another situation. The relevant link between situations may be diachronic (White 2006: 538): is present evidence of past or future evidence of something present evidence of that thing? Alternatively, the link may be interpersonal (Feldman 2007: 208): is evidence for me of evidence for you of something evidence for me of that thing? Such interperspectival links have been discussed because they can destabilize interperspectival disagreements. In their own right they have become the topic of a lively recent debate (Fitelson 2012, Feldman 2014, Roche 2014, Tal and Comesaña 2014).
    Found 1 day, 2 hours ago on Timothy Williamson's site
  6.
    Okasha, in *Evolution and the Levels of Selection*, convincingly argues that two rival statistical decompositions of covariance, namely contextual analysis and the neighbour approach, are better causal decompositions than the hierarchical Price approach. However, he claims that this result cannot be generalized to the special case of soft selection, arguing that the Price approach represents a better option in that case. He provides several arguments to substantiate this claim. In this paper, I demonstrate that these arguments are flawed and argue that neither the Price equation nor the contextual and neighbour partitionings sensu Okasha are adequate causal decompositions in cases of soft selection. The Price partitioning is generally unable to detect cross-level by-products, and this naturally also applies to soft selection. Both the contextual and neighbour partitionings violate the fundamental principle of determinism that the same cause always produces the same effect. I argue that a fourth partitioning widely used in the contemporary social sciences under the generic term ‘hierarchical linear model’, related to contextual analysis broadly understood, addresses the shortcomings of the three other partitionings and thus represents a better causal decomposition.
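    For reference, a minimal statement of the partitioning at issue (the standard multilevel Price equation, ignoring transmission bias; this is the textbook form, not necessarily Okasha's exact notation): the change in the population mean character decomposes into a between-group and an averaged within-group component,
\[
\bar{w}\,\Delta\bar{z} \;=\; \mathrm{Cov}_k(W_k, Z_k) \;+\; \mathrm{E}_k\!\left[\mathrm{Cov}_j(w_{jk}, z_{jk})\right],
\]
    where $W_k$ and $Z_k$ are the mean fitness and mean character of group $k$, and $w_{jk}$, $z_{jk}$ those of individual $j$ within group $k$. The dispute is over whether either summand can be read as a causal contribution of selection at its level.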
    Found 1 day, 2 hours ago on Pierrick Bourrat's site
  7.
    We explore consequences of the view that to know a proposition your rational credence in the proposition must exceed a certain threshold. In other words, to know something you must have evidence that makes rational a high credence in it. We relate such a threshold view to Dorr et al.’s (Philosophical Studies 170(2): 277–287, 2014) argument against the principle they call Fair Coins: ‘If you know a coin won’t land tails, then you know it won’t be flipped.’ They argue for rejecting Fair Coins because it leads to a pervasive skepticism about knowledge of the future. We argue that the threshold view of evidence and knowledge gives independent grounds to reject Fair Coins.
    Found 2 days, 10 hours ago on Daniel Rothschild's site
  8.
    Aafira and Halim are both 90% confident that it will be sunny tomorrow. Aafira bases her credence on her observation of the weather today and her past experience of the weather on days that follow days like today — around nine out of ten of them have been sunny. Halim bases his credence on wishful thinking — he just really likes the sun. Aafira, it seems, is justified in her credence, while Halim is not. Just as one of your full or categorical beliefs might be justified if it is based on visual perception under good conditions, or on memories of recent important events, or on testimony from experts, so might one of your credences be; and just as one of your full beliefs might be unjustified if it is based on wishful thinking, or biased stereotypical associations, or testimony from ideologically driven news outlets, so might your credences be. In this paper, we seek an account of justified credence — in particular, we seek necessary and sufficient conditions for a credence to be justified. Our account will be reliabilist.
    Found 2 days, 11 hours ago on Richard Pettigrew's site
  9.
    By Aris Spanos. One of R. A. Fisher’s (17 February 1890 – 29 July 1962) most remarkable, but least recognized, achievements was to initiate the recasting of statistical induction. Fisher (1922) pioneered modern frequentist statistics as a model-based approach to statistical induction anchored on the notion of a statistical model, formalized by:
\[
\mathcal{M}_{\theta}(\mathbf{x}) = \{ f(\mathbf{x};\theta),\ \theta \in \Theta \}, \quad \mathbf{x} \in \mathbb{R}^n,\ \Theta \subset \mathbb{R}^m,\ m < n, \tag{1}
\]
where the distribution of the sample f(x; θ) ‘encapsulates’ the probabilistic information in the statistical model. …
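    To make (1) concrete, here is a minimal sketch (an illustration, not Spanos’s own example): the simple Normal model, where θ = (μ, σ²) indexes the family of joint densities of an IID Normal sample, and inference goes through the likelihood.

import numpy as np
from scipy import stats

def log_likelihood(theta, x):
    """Log of f(x; theta) for the simple Normal model:
    X_i ~ N(mu, sigma^2), IID; theta = (mu, sigma^2) in Theta = R x R+."""
    mu, sigma2 = theta
    return np.sum(stats.norm.logpdf(x, loc=mu, scale=np.sqrt(sigma2)))

# n = 100 observations (simulated here); note m = 2 < n = 100, as in (1).
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=100)

# The statistical model M_theta(x) is the whole family {f(x; theta)};
# maximum likelihood selects the member best supported by the data.
mu_hat, sigma2_hat = x.mean(), x.var()
print(mu_hat, sigma2_hat, log_likelihood((mu_hat, sigma2_hat), x))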
    Found 2 days, 11 hours ago on D. G. Mayo's blog
  10.
    In its most abstract form, an ontology is an account of the fundamental degrees of freedom in nature. The metaphysician asks: what are the independently varying components of nature, their internal degrees of freedom, and the configurations they can assume? The rationalist metaphysician supposes that we have some form of rational insight into the nature of reality. The naturalistic metaphysician relies on observation and experiment; her task is to infer ontology from data. Given an ontology and a set of laws, one can generate a range of possible behavior, so the naturalistic metaphysician faces an inverse problem: how does she infer backwards from a range of observed behavior to the underlying ontology?
    Found 3 days, 5 hours ago on Jenann Ismael's site
  11.
    Before the development of quantum mechanics, most of the philosophical discussion of probability focused on statistical probabilities. Philosophers of science have a particular interest in statistical probabilities because they play an important role in the testing and confirmation of theories, and they played a central role in the statistical mechanics of Boltzmann and Gibbs developed in the 19th century. Since the introduction of quantum mechanics, however, much of the philosophical attention has become focused on the interpretation of chances. These are the probabilities assigned to particular events (the detection of a photon at a certain location on a photographic plate, or the registration of the result of a spin experiment on a particular electron) by applications of the Born Rule. The appearance of chances in quantum mechanics marked the first time that probabilities made an explicit appearance in a fundamental theory. They raise new kinds of ontological questions. Unlike statistical probabilities (which pertain to classes of events), chances are single-case probabilities. And unlike credences (which represent the epistemic states of believers), chances purport to represent features of the physical world.
    Found 3 days, 6 hours ago on Jenann Ismael's site
  12.
    Several authors have claimed that prediction is essentially impossible in the general theory of relativity, the case being particularly strong, it is said, when one fully considers the epistemic predicament of the observer. Each of these claims rests on the support of an underdetermination argument and a particular interpretation of the concept of prediction. I argue that these underdetermination arguments fail and depend on an implausible explication of prediction in the theory. The technical results adduced in these arguments can be related to certain epistemic issues, but can only be misleadingly or mistakenly characterized as related to prediction.
    Found 4 days, 5 hours ago on PhilPapers
  13.
    I give an account of proof terms for derivations in a sequent calculus for classical propositional logic. The term for a derivation δ of a sequent Σ ≻ Δ encodes how the premises Σ and conclusions Δ are related in δ. This encoding is many-to-one in the sense that different derivations can have the same proof term, since different derivations may be different ways of representing the same underlying connection between premises and conclusions. However, not all proof terms for a sequent Σ ≻ Δ are the same. There may be different ways to connect those premises and conclusions.
    Found 4 days, 6 hours ago on Greg Restall's site
  14.
    Donkey sentences (e.g., ‘Every farmer who owns a donkey beats it’) have existential and universal readings, but they are not often perceived as ambiguous. We extend the pragmatic theory of homogeneity in plural definites by Križ (2016) to explain how context disambiguates donkey sentences. We propose that the denotations of such sentences produce truth value gaps (in certain scenarios the sentences are neither true nor false) and demonstrate that Križ’s pragmatic theory fills these gaps to generate the standard judgments of the literature. Building on Muskens’s (1996) Compositional Discourse Representation Theory, the semantic analysis defines a general schema for quantification that delivers the required truth value gaps. Given the independently motivated pragmatic account of homogeneity inferences, we argue that donkey ambiguities do not require plural information states, contra Brasoveanu 2008, 2010, or error states and supervaluationist determiners, contra Champollion 2016. Moreover, we point out several empirical issues with the trivalent dynamic fragment in Champollion 2016, all of which are avoided by not relying on plural information states. Yet, as in Champollion 2016, the parallel between donkey pronouns and definite plurals is still located in the pragmatics rather than in the semantics, which sidesteps problems known to arise for some previous accounts according to which donkey pronouns and definite plurals both have plural referents (Krifka 1996, Yoon 1996).
    Found 4 days, 7 hours ago on Lucas Champollion's site
  15.
    Computational complexity theory is a branch of computer science dedicated to classifying computational problems in terms of their difficulty. Unlike computability theory, whose object is to determine what we can compute in principle, the object of complexity theory is to inform us about our practical limits. It thus serves as a natural conceptual bridge between the study of mathematics and the study of technology, in the sense that computational complexity theory …
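    A toy contrast makes the practical-limits point vivid (an illustration, not from the paper): sorting a large list is cheap (O(n log n) comparisons), whereas the naive decision procedure for subset-sum below inspects up to 2^n subsets, which stops being practical at a few dozen items even though both problems are computable in principle.

from itertools import combinations

def subset_sum(nums, target):
    """Naive decision procedure for subset-sum: try all 2^n subsets.
    The problem is NP-complete; no polynomial-time algorithm is known."""
    return any(sum(combo) == target
               for r in range(len(nums) + 1)
               for combo in combinations(nums, r))

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5 = 9
# sorted(range(10**6)) finishes near-instantly; subset_sum on ~60 numbers
# could require on the order of 2^60 checks -- computable, but not feasible.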
    Found 4 days, 7 hours ago on PhilSci Archive
  16.
    According to the hierarchy of models (HoM) account of scientific experimentation developed by Patrick Suppes and elaborated by Deborah Mayo, theoretical considerations about the phenomena of interest are involved in an experiment through theoretical models that in turn relate to experimental data through data models, via the linkage of experimental models. In this paper, I dispute the HoM account in the context of present-day high-energy physics (HEP) experiments. I argue that even though the HoM account aims to characterize experimentation as a model-based activity, it does not involve a modeling concept for the process of data acquisition and thus fails to provide a model-based characterization of the theory-experiment relationship underlying this process. In order to characterize the foregoing relationship, I propose the concept of a model of data acquisition and illustrate it in the case of the ATLAS experiment at CERN’s Large Hadron Collider, where the Higgs boson was discovered in 2012. I show that the process of data acquisition in the ATLAS experiment is performed according to a model of data acquisition that specifies and organizes the experimental procedures necessary to select the data according to a predetermined set of selection criteria. I also point out that this data acquisition model is theory-laden, in the sense that the underlying data selection criteria are determined in accordance with the testable predictions of the theoretical models that the ATLAS experiment aims to test. I take the foregoing theory-ladenness to indicate that the relationship between the procedures of the ATLAS experiment and the theoretical models of the phenomena of interest is first established, prior to the formation of data models, through the data acquisition model of the experiment, thus not requiring the intermediary of other types of models as suggested by the HoM account. I therefore conclude that in the context of HEP experiments, the HoM account does not consistently extend to the process of data acquisition so as to include models of data acquisition.
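    A schematic sketch of what a model of data acquisition specifies (the cut values and variable names below are invented for illustration, not ATLAS’s actual trigger menu): events are kept only when they pass a predetermined set of selection criteria, and those criteria are fixed in light of the theoretical predictions under test.

# Hypothetical selection criteria, set in advance of data-taking and
# informed by the predictions being tested (e.g., a multi-lepton Higgs
# decay signature); not the real ATLAS trigger configuration.
CRITERIA = {
    "min_lepton_pt_gev": 25.0,
    "min_n_leptons": 2,
    "min_invariant_mass_gev": 60.0,
}

def passes_selection(event, criteria=CRITERIA):
    """Model of data acquisition in miniature: keep an event only if it
    meets every predetermined criterion."""
    leptons = [p for p in event["particles"] if p["type"] in ("e", "mu")]
    hard = [p for p in leptons if p["pt_gev"] >= criteria["min_lepton_pt_gev"]]
    return (len(hard) >= criteria["min_n_leptons"]
            and event["invariant_mass_gev"] >= criteria["min_invariant_mass_gev"])

event = {"particles": [{"type": "e", "pt_gev": 40.0},
                       {"type": "mu", "pt_gev": 31.5}],
         "invariant_mass_gev": 125.0}
print(passes_selection(event))  # True: this event enters the data set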
    Found 4 days, 7 hours ago on PhilSci Archive
  17.
    Electromagnetism is one of the oldest natural phenomena studied by modern science. The laws of electromagnetism evolved gradually from various experimental observations. In this process, Coulomb’s law came first, establishing the 1/r² dependence of the force between two charges. Then Faraday discovered the induction of voltage by a changing magnetic field, and Ampère quantified the magnetic field generated by an electric current. Finally, Maxwell introduced the concept of displacement current (i.e., a time-varying electric field giving rise to a magnetic field) and wrote all the laws of electromagnetism in the elegant form commonly known as Maxwell’s equations. These equations in differential form are given by:
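    In SI units, the standard differential form reads:
\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
\]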
    Found 4 days, 7 hours ago on PhilSci Archive
  18.
    Formal learning theory is the mathematical embodiment of a normative epistemology. It deals with the question of how an agent should use observations about her environment to arrive at correct and informative conclusions. Philosophers such as Putnam, Glymour and Kelly have developed learning theory as a normative framework for scientific reasoning and inductive inference. Terminology. Cognitive science and related fields typically use the term “learning” for the process of gaining information through observation— hence the name “learning theory”. To most cognitive scientists, the term “learning theory” suggests the empirical study of human and animal learning stemming from the behaviourist paradigm in psychology.
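    A minimal sketch of the formal picture (a toy in the spirit of Gold-style identification in the limit, not a construction from the SEP entry): the learner sees an ever-growing stream of observations and announces a conjecture after each one; it learns the target hypothesis in the limit if the conjectures eventually stabilize on the correct one.

from functools import reduce
from math import gcd

def conjecture(data):
    """After each observation, conjecture that the target set is
    'all multiples of d', where d is the gcd of the data seen so far."""
    return reduce(gcd, data)

# Observations drawn from the target "multiples of 6".
stream = [12, 18, 30, 6, 24]
for t in range(1, len(stream) + 1):
    print(f"after {t} observation(s): multiples of {conjecture(stream[:t])}")
# Conjectures: 12, 6, 6, 6, 6 -- on any complete enumeration of the target,
# they stabilize on the truth: identification in the limit.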
    Found 4 days, 7 hours ago on Stanford Encyclopedia of Philosophy
  19.
    It is not news that we often make discoveries or find reasons for a mathematical proposition by thinking alone. But does any of this thinking count as conducting a thought experiment? The answer to that question is “yes”, but without refinement the question is uninteresting. Suppose you want to know whether the equation 8x + 12y = 6 has a solution in the integers. You might mentally substitute some integer values for the variables and calculate. In that case you would be mentally trying something out, experimenting with particular integer values, in order to test the hypothesis that the equation has no solution in the integers. Not getting a solution first time, you might repeat the thought experiment with different integer inputs.
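    The experiment’s verdict can be made mechanical, and the underlying reason explicit (a sketch, not from the paper): every value of 8x + 12y is divisible by gcd(8, 12) = 4, and 4 does not divide 6, so no trial can ever succeed.

from math import gcd

# Mechanical analogue of mentally trying out particular integer values:
# search a finite window for solutions of 8x + 12y = 6.
hits = [(x, y) for x in range(-50, 51) for y in range(-50, 51)
        if 8 * x + 12 * y == 6]
print(hits)  # [] -- every trial fails

# The general fact (Bezout): ax + by = c has integer solutions iff
# gcd(a, b) divides c. Here gcd(8, 12) = 4 and 4 does not divide 6.
print(gcd(8, 12), 6 % gcd(8, 12))  # 4 2 -- nonzero remainder, no solution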
    Found 4 days, 7 hours ago on PhilSci Archive
  20.
    In this paper we argue that the different positions taken by Dyson and Feynman on the representational role of Feynman diagrams depend on different styles of scientific thinking. We begin by criticizing the idea that Feynman diagrams can be considered pictures or depictions of actual physical processes. We then show that the best interpretation of the role they play in quantum field theory and quantum electrodynamics is captured by Hughes’s Denotation, Demonstration and Interpretation (DDI) account of models, where “models” are to be interpreted as inferential, non-representational devices constructed in given social contexts by the community of physicists.
    Found 4 days, 7 hours ago on PhilSci Archive
  21.
    According to pancomputationalism, all physical systems – atoms, rocks, hurricanes, and toasters – perform computations. Pancomputationalism seems to be increasingly popular among some philosophers and physicists. In this paper, we interpret pancomputationalism in terms of computational descriptions of varying strength—computational interpretations of physical microstates and dynamics that vary in their restrictiveness. We distinguish several types of pancomputationalism and identify essential features of the computational descriptions required to support them. By tying various pancomputationalist theses directly to notions of what counts as computation in a physical system, we clarify the meaning, strength, and plausibility of pancomputationalist claims. We show that the force of these claims is diminished when weaknesses in their supporting computational descriptions are laid bare. Specifically, once computation is meaningfully distinguished from ordinary dynamics, the most sensational pancomputationalist claims are unwarranted, whereas the more modest claims offer little more than recognition of causal similarities between physical processes and the most primitive computing processes.
    Found 4 days, 7 hours ago on PhilSci Archive
  22.
    Hintikka taught us that S5 was the wrong epistemic logic because of the unwarranted powers of negative introspection afforded by the (5) schema, ♦p → □♦p. (Tim Williamson later targeted the (4) schema, and with it S4, but that is another story.) The punchline here is that the problem is really the (B) schema, also known as the Brouwersche schema: (B) p → □♦p. In fact, you should think of the (5) schema within S5 as the best hand to play in a classical modal system with (K) when you are dealt the (B) schema. That is the conclusion of (Wheeler 2015), which is about the logic of information rather than logic for lunatics. The behavior of (B), when interpreted as an epistemic modal rather than as a provability operator, is so bizarre, so unreasonable qua epistemic modal, that epistemic logicians should stop referring to (B) as the Brouwersche schema to avoid sullying Brouwer’s good name. Instead, I recommend hereafter for epistemic logicians to refer to (B) as The Blog Schema.
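    What (B) demands can be seen model-theoretically (an illustration, not from the paper): p → □♦p holds throughout a Kripke model whenever the accessibility relation is symmetric, and the schema is frame-valid exactly on symmetric frames; a toy checker shows how it fails when symmetry is dropped.

def holds_B(worlds, R, V):
    """Check the (B) schema p -> Box Diamond p at every world of a model:
    wherever p is true at w, each v accessible from w must itself see
    some world where p is true."""
    return all(
        any(V[u] for u in worlds if (v, u) in R)
        for w in worlds if V[w]
        for v in worlds if (w, v) in R
    )

worlds = {0, 1}
V = {0: True, 1: False}       # p true at world 0 only

print(holds_B(worlds, {(0, 1), (1, 0)}, V))  # True: relation is symmetric
print(holds_B(worlds, {(0, 1)}, V))          # False: 1 cannot see a p-world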
    Found 4 days, 7 hours ago on PhilSci Archive
  23.
    The Univalent Foundations (UF) offer a new picture of the foundations of mathematics largely independent from set theory. In this paper I will focus on the question of whether Homotopy Type Theory (HoTT) (as a formalization of UF) can be justified intuitively as a theory of shapes in the same way that ZFC (as a formalization of set-theoretic foundations) can be justified intuitively as a theory of collections. I first clarify what I mean by an “intuitive justification” by distinguishing between formal and pre-formal “meaning explanations” in the vein of Martin-Löf. I then explain why Martin-Löf’s original meaning explanation for type theory no longer applies to HoTT. Finally, I outline a pre-formal meaning explanation for HoTT based on spatial notions like “shape”, “path”, “point” etc. which in particular provides an intuitive justification of the axiom of univalence. I conclude by discussing the limitations and prospects of such a project.
    Found 4 days, 7 hours ago on PhilSci Archive
  24.
    Born’s rule, which interprets the square of the wave function as the probability of obtaining a specific value in a measurement, has been accepted as a postulate in the foundations of quantum mechanics. Although there have been many attempts at deriving this rule theoretically using different approaches, such as the frequency operator approach, many-worlds theory, Bayesian probability and envariance, the literature shows that the arguments in each of these methods are circular. In view of the absence of a convincing theoretical proof, some researchers have recently carried out experiments to validate the rule up to the maximum possible accuracy using multi-order interference (Sinha et al., Science, 329, 418 [2010]). But a convincing analytical proof of Born’s rule would make us understand the basic process responsible for the exact square dependence of probability on the wave function. In this paper, by generalizing the method of calculating probability in common experience to quantum mechanics, we prove Born’s rule for the statistical interpretation of the wave function.
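    For reference, the rule at issue: for a normalized state $|\psi\rangle$ and an observable with eigenstates $|a_i\rangle$, the probability of obtaining outcome $a_i$ in a measurement is
\[
P(a_i) \;=\; \left| \langle a_i | \psi \rangle \right|^2 ,
\]
    with the position-space density $|\psi(x)|^2$ playing the same role for detection at $x$.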
    Found 4 days, 8 hours ago on PhilSci Archive
  25.
    In this paper I argue that the consistency condition in Deutsch’s influential model for closed timelike curves (CTCs) differs significantly from the classical consistency condition (CCC) found in Lewis [15] and Novikov [16], as well as from the consistency condition found in the P-CTC model, the major rival to Deutsch’s approach. Both the CCC and the P-CTC consistency condition are formulable in the context of a single history of the world; Deutsch’s consistency condition relies on the existence of a structure of parallel worlds. I argue that Deutsch’s commitment to realism about parallel worlds puts his solutions to the information paradox in jeopardy: because of this metaphysical commitment, he is committed to the existence of physical situations that are in every way indistinguishable from the paradoxes he attempts to rule out by adopting the model in the first place. In particular, Deutsch’s proposed solution to the Knowledge Paradox, with its commitment to the actuality of the many worlds of the Everett interpretation (on which he relies to solve the paradoxes), guarantees the existence of worlds indistinguishable from worlds in which the genuine Knowledge Paradox arises.
    Found 4 days, 8 hours ago on PhilSci Archive
  26.
    It was first suggested by Albert that the existence of real, physical non-unitarity at the quantum level would yield a complete explanation for the increase of entropy over time in macroscopic systems. An alternative understanding of the source of non-unitarity is presented herein, in terms of the Transactional Interpretation. The present model provides a specific physical justification for Boltzmann’s Stosszahlansatz (assumption of molecular chaos), thereby changing its status from an ad hoc postulate to a theoretically grounded result, without requiring any change to the basic quantum theory. In addition, it is argued that TI provides an elegant way of reconciling, via collapse, the time-reversible Liouville evolution with the time-irreversible evolution inherent in master equations. The present model is contrasted with the GRW ‘spontaneous collapse’ theory previously suggested for this purpose by Albert.
    Found 4 days, 8 hours ago on PhilSci Archive
  27.
    One of Tarski’s stated aims was to give an explication of the classical conception of truth—truth as ‘saying it how it is’. Many subsequent commentators have felt that he achieved this aim. Tarski’s core idea of defining truth via satisfaction has now found its way into standard logic textbooks. This paper looks at such textbook definitions of truth in a model for standard first-order languages and argues that they fail from the point of view of explication of the classical notion of truth. The paper furthermore argues that a subtly different definition—also to be found in classic textbooks but much less prevalent than the kind of definition that proceeds via satisfaction—succeeds from this point of view.
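    The textbook satisfaction-based definition can be made concrete in code (a toy evaluator of my own devising; the formula encoding and the tiny language are invented for illustration): truth in a model is defined by recursion on formula structure via satisfaction under variable assignments.

def satisfies(model, formula, g):
    """Tarski-style satisfaction: model = (domain, interpretation),
    g = variable assignment. Formulas are nested tuples, e.g.
    ('forall', 'x', ('R', 'x', 'x'))."""
    domain, interp = model
    op = formula[0]
    if op == 'R':                      # atomic: relation applied to variables
        _, *args = formula
        return tuple(g[v] for v in args) in interp['R']
    if op == 'not':
        return not satisfies(model, formula[1], g)
    if op == 'and':
        return satisfies(model, formula[1], g) and satisfies(model, formula[2], g)
    if op == 'forall':                 # quantifier: vary the assignment
        _, var, body = formula
        return all(satisfies(model, body, {**g, var: d}) for d in domain)
    raise ValueError(op)

def true_in(model, sentence):
    """A sentence is true in a model iff satisfied by every (equivalently,
    some) assignment; for sentences the assignment is idle."""
    return satisfies(model, sentence, {})

# Model: domain {1, 2}, with R interpreted as <=
M = ({1, 2}, {'R': {(1, 1), (1, 2), (2, 2)}})
print(true_in(M, ('forall', 'x', ('R', 'x', 'x'))))  # True: <= is reflexive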
    Found 4 days, 8 hours ago on Nick Smith's site
  28.
    Torturing someone is gravely wrong because it causes grave harm to the victim, and the wickedness evinced in the act is typically proportional to the harm (as well as depending on many other factors). …
    Found 1 week ago on Alexander Pruss's Blog
  29.
    In his discussion of the four causes, Aristotle claims that ‘the hypotheses are material causes of the conclusion’ (Physics 2.3, Metaphysics Δ 2). This claim has puzzled commentators since antiquity. It is usually taken to mean that the premises of any deduction are material causes of the conclusion. By contrast, I argue that the claim does not apply to deductions in general but only to scientific demonstrations. In Aristotle’s view, the theorems of a given science are composites composed of the indemonstrable premises from which they are demonstrated. Accordingly, these premises are elements, and hence material causes, of the theorems. Given this, Aristotle’s claim can be shown to be well-motivated and illuminating.
    Found 1 week ago on Marko Malink's site
  30.
    Reliabilism about justified belief comes in two varieties: process reliabilism and indicator reliabilism. According to process reliabilism, a belief is justified if it is formed by a process that is likely to produce truths; according to indicator reliabilism, a belief is justified if it is likely to be true given the ground on which it is based. …
    Found 1 week, 1 day ago on M-Phi