
30808.432015
In a recent paper, S. Gao has claimed that, under the assumption that the initial state of the universe is a pure quantum state, only the many-worlds interpretation can account for the observed arrow of time. We show that his argument is untenable and that, if endorsed, it threatens to undermine the search for a scientific explanation of certain phenomena.

30854.432139
Multiscale modeling techniques have attracted increasing attention from philosophers of science, but the resulting discussions have almost exclusively focused on issues surrounding explanation (e.g., reduction and emergence). In this paper, I argue that besides explanation, multiscale techniques can serve important exploratory functions when scientists model systems whose organization at different scales is ill-understood. My account distinguishes explanatory and descriptive multiscale modeling based on which epistemic goal scientists aim to achieve when using multiscale techniques. In explanatory multiscale modeling, scientists use multiscale techniques to select information that is relevant to explain a particular type of behavior of the target system. In descriptive multiscale modeling, scientists use multiscale techniques to explore lower-scale features which could be explanatorily relevant to many different types of behavior, and to determine which features of a target system an upper-scale data pattern could refer to. Using multiscale models from data-driven neuroscience as a case study, I argue that descriptive multiscale models have an exploratory function because they are sources of potential explanations and serve as tools to reassess our conception of the target system.

30922.432168
Some authors, inspired by the theoretical requirements for the formulation of a quantum theory of gravity, proposed a relational reconstruction of the quantum parameter-time—the time of the unitary evolution, which would make quantum mechanics compatible with relativity. The aim of the present work is to follow the lead of those relational programs by proposing a relational reconstruction of the event-time—which orders the detection of the definite values of the system’s observables. Such a reconstruction will be based on the modal Hamiltonian interpretation of quantum mechanics, which provides a clear criterion to select which observables acquire a definite value and to specify in what situation they do so.

65692.432186
The Kepler problem is the study of a particle moving in an attractive inverse square force. In classical mechanics, this problem shows up when you study the motion of a planet around the Sun in the Solar System. …
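In standard textbook notation (not notation taken from this abstract), the attractive inverse-square force and its potential are

```latex
\mathbf{F}(\mathbf{r}) = -\frac{k}{r^{2}}\,\hat{\mathbf{r}},
\qquad
V(r) = -\frac{k}{r}, \qquad k > 0,
```

whose bound orbits are ellipses with the centre of force at one focus, in accordance with Kepler's first law.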

118428.432203
probabilistic incorrectness in the (over)rating of the subject, (ii) the possibility of imagining non-quantum scenarios that are nevertheless completely similar to that experiment, and (iii) the lack of ratified practical tests having genuine essence (i.e., non-counterfeit). So, the aforesaid experiment appears to be a simplistic thought exercise without any notable significance for quantum physics.

204295.432221
In his 1956 book ‘The Direction of Time’, Hans Reichenbach offered a comprehensive analysis of the physical ground of the direction of time, the notion of physical cause, and the relation between the two. I review its conclusions and argue that, in the light of recent advances, Reichenbach’s analysis provides the best account of the physical underpinning of these notions. I integrate into Reichenbach’s account recent results in cosmology and on the physical underpinning of records and agency, and discuss which questions it leaves open.

340792.43225
In this article, we discuss a simple argument that modal metaphysics is misconceived, and responses to it. Unlike Quine’s, this argument begins with the banal observation that there are different candidate interpretations of the predicate ‘could have been the case’. This is analogous to the observation that there are different candidate interpretations of the predicate ‘is a member of’. The argument then infers that the search for metaphysical necessities is misguided in much the way the ‘set-theoretic pluralist’ (Hamkins and Clarke-Doane [2017]) claims that the search for the true axioms of set theory is. We show that the obvious responses to this argument fail.

428437.432265
In 'Why I Am Not a Utilitarian', Michael Huemer objects that "there are so many counterexamples, and the intuitions about these examples are strong and widespread, it’s hard to see how utilitarianism could be justified overall." …

608789.43228
This paper proposes a framework for representing in Bayesian terms the idea that analogical arguments of various degrees of strength may provide inductive support to as-yet untested scientific hypotheses. On this account, contextual information plays a crucial role in determining whether, and to what extent, a given similarity or dissimilarity between source and target may confirm an empirical hypothesis over a rival one. In addition to showing confirmation by analogy to be compatible with the adoption of a Bayesian standpoint, the proposal outlined in this paper reveals a close agreement between the fulfillment of Hesse’s (1963) criteria for analogical arguments capable of inductive support and the attribution of confirmatory power by the lights of Bayesian confirmation theory. In this sense, the Bayesian representation not only enriches a framework, Hesse’s, of enduring relevance for understanding scientific activity, but may offer something akin to a proof of concept of it.
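As a toy illustration of the general Bayesian mechanism at work here (my own sketch with hypothetical numbers, not the paper's formal framework): a reported similarity E between source and target confirms hypothesis H over a rival exactly when E is more likely if H is true, and the degree of support scales with that likelihood ratio.

```python
def posterior(prior_h, likelihood_e_given_h, likelihood_e_given_rival):
    """Posterior probability of H after observing similarity evidence E,
    treating the rival hypothesis as H's complement (toy two-hypothesis case)."""
    joint_h = prior_h * likelihood_e_given_h
    joint_rival = (1 - prior_h) * likelihood_e_given_rival
    # Bayes' theorem: P(H|E) = P(H)P(E|H) / [P(H)P(E|H) + P(~H)P(E|~H)]
    return joint_h / (joint_h + joint_rival)

# A similarity three times likelier under H than under the rival raises
# H's probability; a dissimilarity likelier under the rival lowers it.
```

In this toy setting, `posterior(0.5, 0.6, 0.2)` exceeds the prior 0.5 while `posterior(0.5, 0.2, 0.6)` falls below it, mirroring confirmation and disconfirmation by analogy and disanalogy respectively.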

629726.432294
Call a quantifier ‘unrestricted’ if it ranges over absolutely all objects. Arguably, unrestricted quantification is often presupposed in philosophical inquiry. However, developing a semantic theory that vindicates unrestricted quantification proves rather difficult, at least as long as we formulate our semantic theory within a classical first-order language. It has been argued that using a type theory as framework for our semantic theory provides a resolution of this problem, at least if a broadly Fregean interpretation of type theory is assumed. However, the intelligibility of this interpretation has been questioned. In this paper I introduce a type-free theory of properties that can also be used to vindicate unrestricted quantification. This alternative emerges very naturally by reflecting on the features on which the type-theoretic solution of the problem of unrestricted quantification relies. Although this alternative theory is formulated in a nonclassical logic, it preserves the deductive strength of classical strict type theory in a natural way. The ideas developed in this paper make crucial use of Russell’s notion of range of significance.

687412.432307
is very reliable in assigning this credence—previously, he denied the effectiveness of masks, and other measures he claimed to be effective turned out not to be. We can conceptualise the reliability assigned to a person for a given proposition as a number ranging from 0 (completely unreliable) to 1 (completely reliable). A completely unreliable person would make entirely random reports. Their statements would never be related to the truth. Even if they were true, they would be true merely by coincidence. Others could never rely on what they said. To such a person, we could assign a reliability of 0. But they are a hypothetical person; most people of flesh and blood are not that unreliable, not even your erratic politician. Imagine that you assign a reliability of 0.2 to them regarding the claim that masks lower the risk of coronavirus transmission.

702607.432321
The first sentence in the title means roughly: all the people around are tired. The second means: do not mess with any of them. Even though the second sentence looks just like a negative counterpart of the first, it doesn’t have the expected compositional meaning: it doesn’t mean “do not mess with all the people”. This phenomenon is extremely general. It takes place with Bare Plurals, as in the title. It figures prominently in the behavior of Plural Definites (I spoke to the students in trouble ≈ ∀ / I didn’t speak to the students in trouble ≈ ¬∃). It also takes place with Donkey pronouns (Every farmer who had a donkey sold it ≈ ∀ / No man who had a donkey sold it ≈ ¬∃). These switches of quantificational force under polarity reversals call to mind Free Choice phenomena. In particular, a determiner like any is interpreted as a narrow-scope existential in a sentence like I didn’t talk to any student in trouble ≈ ¬∃; however, in positive environments, the existential meaning of any emerges as strengthened to a universal: I spoke to any student in trouble ≈ ∀. It is tempting to conjecture that the source of this uniform behavior is a uniform mechanism. While these constructions (Free Choice any, Bare Plurals, Plural Definites, and Donkey pronouns) have been studied extensively, and insightful approaches to Plural Definites in terms of Free Choice mechanisms have also been proposed (Bar-Lev 2018, 2021), a unitary analysis has not been attempted, to the best of my knowledge. In spite of the many challenges that a unified analysis faces, it is worth a try, for, if successful, it would considerably push forward our understanding of a wide range of very diverse constructions.

781502.432335
. Nathan Schachtman, Esq., J.D. Legal Counsel for Scientific Challenges
Of Significance, Error, Confidence, and Confusion – In the Law and In Statistical Practice
The metaphor of law as an “empty vessel” is frequently invoked to describe the law generally, as well as pejoratively to describe lawyers. …

781978.43236
A defence is offered of a version of the branch-counting rule for probability in the Everett interpretation (otherwise known as the many-worlds interpretation) of quantum mechanics that both depends on the state and is continuous in the norm topology on Hilbert space. The well-known branch-counting rule, for realistic models of measurements, in which branches are defined by decoherence theory, fails this test. The new rule hinges on the use of decoherence theory in defining branching structure, and specifically decoherent histories theory. On this basis ratios of branch numbers are defined, free of any convention. They agree with the Born rule, and deliver a notion of objective probability similar to naïve frequentism, save that the frequencies of outcomes are not confined to a single world at different times, but spread over worlds at a single time. Nor is the rule ad hoc: it is recognizably akin to the combinatorial approach to thermodynamic probability, as introduced by Boltzmann in 1879. It is identical to the procedure followed by Planck, Bose, Einstein, and Dirac in defining the equilibrium distribution of the Bose–Einstein gas. It also connects in a simple way with the decision-theory approach to quantum probability.

860985.432392
In this article, we provide three generators of propositional formulae for arbitrary languages, which uniformly sample three different formula spaces. They take the same three parameters as input, namely, a desired depth, a set of atomics, and a set of logical constants (with specified arities). The first generator returns formulae of exactly the given depth, using all or some of the propositional letters. The second does the same but samples up to the given depth. The third generator outputs formulae with exactly the desired depth and all the atomics in the set. To make the generators uniform (i.e., to make them return every formula in their space with the same probability), we prove various cardinality results about those spaces.
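The counting-based approach behind the first two generators can be sketched as follows. This is my own minimal reconstruction, not the paper's code: `count_le(d)` counts formulae of depth at most d, `sample_le(d)` draws uniformly from that space by weighting atoms against each connective's number of child-tuples, and `sample_eq(d)` conditions on exact depth by rejection.

```python
import random

def make_sampler(atoms, connectives):
    """Uniform samplers over spaces of propositional formulae.

    atoms: list of atomic names; connectives: dict name -> arity.
    An atom has depth 0; c(f1, ..., fk) has depth 1 + max depth of the fi.
    Formulae are represented as strings (atoms) or tuples (connective, children...).
    """
    atoms = list(atoms)

    def count_le(d):
        # Cardinality of the space of formulae of depth <= d.
        if d < 0:
            return 0
        if d == 0:
            return len(atoms)
        t = count_le(d - 1)
        return len(atoms) + sum(t ** k for k in connectives.values())

    def sample_le(d):
        # Uniform over formulae of depth <= d: each formula has
        # probability 1 / count_le(d).
        total = count_le(d)
        if d == 0 or random.randrange(total) < len(atoms):
            return random.choice(atoms)
        t = count_le(d - 1)
        # A connective of arity k heads t**k formulae of depth <= d.
        weights = {c: t ** k for c, k in connectives.items()}
        c = random.choices(list(weights), weights=list(weights.values()))[0]
        return (c,) + tuple(sample_le(d - 1) for _ in range(connectives[c]))

    def depth(f):
        return 0 if isinstance(f, str) else 1 + max(map(depth, f[1:]))

    def sample_eq(d):
        # Uniform over formulae of depth exactly d, by rejection from sample_le;
        # conditioning a uniform draw preserves uniformity on the smaller space.
        while True:
            f = sample_le(d)
            if depth(f) == d:
                return f

    return count_le, sample_le, sample_eq, depth
```

For instance, with atoms {p, q} and connectives {¬ (arity 1), ∧ (arity 2)}, there are 2 formulae of depth 0 and 2 + 2¹ + 2² = 8 of depth at most 1. Rejection sampling is the simplest way to get exact depth; the paper's own generators may well proceed differently.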

861188.432408
In this article, I show that the semantics one adopts for mass terms constrains the metaphysical claims one can make about mixtures. I first set out why mixtures challenge a singularist approach based on mereological sums. After discussing an alternative, non-singularist approach based on plural logic, I take chemistry into account and explain how it changes our perspective on these issues.

909857.432423
. John Park, MD
Radiation Oncologist
Kansas City VA Medical Center
Poisoned Priors: Will You Drink from This Well? As an oncologist, specializing in the field of radiation oncology, “The Statistics Wars and Intellectual Conflicts of Interest”, as Prof. Mayo’s recent editorial is titled, is one of practical importance to me and my patients (Mayo, 2021). …

1012852.432455
There are different varieties of conservatism concerning belief formation and revision. We assess the veritistic effects of a particular kind of conservatism commonly attributed to Quine: the so-called maxim of minimum mutilation, which states that agents should give up as few beliefs as possible when facing recalcitrant evidence. Based on a formal bounded-rationality model of belief revision …

1035863.43247
. Brian Dennis
Professor Emeritus
Dept Fish and Wildlife Sciences,
Dept Mathematics and Statistical Science
University of Idaho
Journal Editors Be Warned: Statistics Won’t Be Contained
I heartily second Professor Mayo’s call, in a recent issue of Conservation Biology, for science journals to tread lightly on prescribing statistical methods (Mayo 2021). …

1041526.432493
The modern abundance and prominence of data has led to the development of “data science” as a new field of enquiry, along with a body of epistemological reflections upon its foundations, methods, and consequences. This article provides a systematic analysis and critical review of significant open problems and debates in the epistemology of data science. We propose a partition of the epistemology of data science into the following five domains: (i) the constitution of data science; (ii) the kind of enquiry that it identifies; (iii) the kinds of knowledge that data science generates; (iv) the nature and epistemological significance of “black box” problems; and (v) the relationship between data science and the philosophy of science more generally.

1093780.43251
We propose a model of incomplete twofold multi-prior preferences, in which an act f is ranked above an act g only when f provides higher utility in a worst-case scenario than what g provides in a best-case scenario. The model explains failures of contingent reasoning, captured through a weakening of the state-by-state monotonicity (or dominance) axiom. Our model gives rise to rich comparative statics results, as well as extension exercises and connections to choice theory. We present an application to second-price auctions.
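The ranking rule described in the abstract can be sketched in a few lines. The states, outcomes, utilities, and sets of priors below are hypothetical illustrations of the worst-case-versus-best-case comparison, not the paper's formal model:

```python
def worst_case_eu(act, utility, priors):
    """Minimal expected utility of an act across a set of priors.
    act: dict state -> outcome; priors: list of dicts state -> probability."""
    return min(sum(p[s] * utility[act[s]] for s in act) for p in priors)

def best_case_eu(act, utility, priors):
    """Maximal expected utility of an act across a set of priors."""
    return max(sum(p[s] * utility[act[s]] for s in act) for p in priors)

def ranked_above(f, g, utility, priors_f, priors_g):
    # f is ranked above g only when f's worst case beats g's best case;
    # when neither direction holds, the two acts are simply incomparable,
    # which is what makes the preference relation incomplete.
    return worst_case_eu(f, utility, priors_f) > best_case_eu(g, utility, priors_g)
```

With a constant "safe" act and a risky act whose best case exceeds the safe act's guaranteed payoff, neither is ranked above the other, illustrating the incompleteness.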

1153617.432524
. Philip B. Stark
Professor
Department of Statistics
University of California, Berkeley
I enjoyed Prof. Mayo’s comment in Conservation Biology Mayo, 2021 very much, and agree enthusiastically with most of it. …

1378561.432541
Bylinina & Nouwen (2018) use the NPI-licensing properties of zero to argue that plural count nouns must be analyzed as having domains with the structure of a complete lattice, containing minimal objects whose count is zero. In this paper, I consider a wider array of data which indicates that the traditional analysis of plural count noun domains as join semilattices, lacking minimal objects, is correct. It follows from this view that sentences in which zero combines with a plural count noun are contradictions, but they come to have contingent truth conditions, I claim, because exhaustification can sometimes return exclusively exclusive content. Building on recent proposals by Bassi, Del Pinal & Sauerland (2021), I argue that the content of exhaustification just is exclusion of alternatives, and whether the prejacent is entailed depends on whether the exclusive proposition is not-at-issue vs. at-issue.

1413456.43257
Relational mechanics is a reformulation of mechanics (classical or quantum) for which space is relational. This means that the configuration of an N-particle system is a shape, which is what remains when the effects of rotations, translations and dilations are quotiented out. This reformulation of mechanics naturally leads to a relational notion of time as well, in which a history of the universe is just a curve in shape space without any reference to a special parametrization of the curve given by an absolute Newtonian time. When relational mechanics (classical or quantum) is regarded as fundamental, the usual descriptions in terms of absolute space and absolute time emerge merely as corresponding to the choice of a gauge. This gauge freedom forces us to recognize that what we have traditionally regarded as fundamental in physics might in fact be imposed by us through our choice of gauge. It thus imparts a somewhat Kantian aspect to physical theory.

1430608.432586
Any intermediate propositional logic (i.e., a logic including intuitionistic logic and contained in classical logic) can be extended to a calculus with epsilon- and tau-operators and critical formulas. For classical logic, this results in Hilbert’s ε-calculus. The first and second ε-theorems for classical logic establish conservativity of the ε-calculus over its classical base logic. It is well known that the second ε-theorem fails for the intuitionistic ε-calculus, as prenexation is impossible. The paper investigates the effect of adding critical ε- and τ-formulas and using the translation of quantifiers into ε- and τ-terms to intermediate logics. It is shown that conservativity over the propositional base logic also holds for such intermediate ετ-calculi. The “extended” first ε-theorem holds if the base logic is a finite-valued Gödel–Dummett logic, fails otherwise, but holds for certain provable formulas in infinite-valued Gödel logic. The second ε-theorem also holds for finite-valued first-order Gödel logics. The methods used to prove the extended first ε-theorem for infinite-valued Gödel logic suggest applications to theories of arithmetic.
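For reference, the critical formulas and quantifier translations at issue, in standard textbook ε-calculus notation (not notation taken from the paper), are

```latex
A(t) \rightarrow A(\varepsilon_x\, A(x)),
\qquad
A(\tau_x\, A(x)) \rightarrow A(t),
```

with the quantifiers then expressed as

```latex
\exists x\, A(x) \;\leftrightarrow\; A(\varepsilon_x\, A(x)),
\qquad
\forall x\, A(x) \;\leftrightarrow\; A(\tau_x\, A(x)).
```

Classically, τ is definable from ε by duality; over intermediate logics the two operators must be treated separately, which is part of what the paper's ετ-calculi track.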

1469236.432603
. Yudi Pawitan
Professor
Department of Medical Epidemiology and Biostatistics
Karolinska Institutet, Stockholm
Behavioral aspects in the statistical significance wargame
I remember with fondness the good old days when the only ‘statistical war’game was fought between the Bayesian and the frequentist. …

1474350.432622
The principal target of this article is the reification Bruineberg et al. perceive of formalism within the literature on the variational free energy minimisation (VFEM) framework. The authors do not provide a definition of reification, as none yet exists. Here I offer one. On this definition, the objects of the authors’ critiques fall short of full-blown reification—as do the authors themselves. Scientific modelling can often look a bit like playing a game of pretend. We play soldiers and pretend our sticks to be guns; we play explorers and pretend the sand to be lava. Misrepresentations, distortions, nonlinearities, untruths, and oddities run rampant in scientific models. Such idealisations are harmless, so long as we do not forget that we have made them; so long as we do not forget that the sand is really sand and our friends really our friends, and not lava or enemy soldiers after all.

1568799.432638
. Edward L. Ionides
Director of Undergraduate Programs and Professor,
Department of Statistics, University of Michigan
Ya’acov Ritov
Professor
Department of Statistics, University of Michigan
Thanks for the clear presentation of the issues at stake in your recent Conservation Biology editorial (Mayo 2021). …

1568800.432656
. Professor
Department of Statistical Sciences
University of Bologna
The ASA controversy on P-values as an illustration of the difficulty of statistics
“I work on Multidimensional Scaling for more than 40 years, and the longer I work on it, the more I realise how much of it I don’t understand. …

1572593.432674
In the last few years, machine learning researchers have proposed a plethora of prospective ‘statistical criteria of algorithmic fairness’, i.e. purely statistical necessary conditions that a predictive algorithm’s predictions must satisfy in order for the algorithm to count as fair. However, a mixture of formal no-go theorems and devastating counterexamples has served to undermine the philosophical credibility of almost all of these conditions. Only one statistical criterion retains anything like universal support, namely calibration within groups. In this paper, I (i) argue that calibration within groups is neither a necessary nor a sufficient condition for algorithmic fairness, (ii) propose, motivate, and defend a novel criterion, called ‘base rate tracking’, which evades the theorems and counterexamples that undermined existing criteria and allows us to accurately diagnose and quantify many paradigmatic instances of algorithmic unfairness, and (iii) re-evaluate the proper role of statistical criteria of algorithmic fairness in the project of ensuring the fair and equitable application of predictive algorithms in society.
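Both criteria mentioned in the abstract admit simple statistical checks. The sketch below is my own toy formulation, not the paper's definitions: calibration within groups requires that, among individuals given score s within group g, roughly a fraction s are positive, while a base-rate-tracking-style condition requires that the gap between groups' mean scores match the gap between their base rates.

```python
from collections import defaultdict

def calibrated_within_groups(scores, labels, groups, tol=0.1):
    """Check P(Y = 1 | S = s, G = g) ≈ s for every group g and score value s."""
    cells = defaultdict(list)
    for s, y, g in zip(scores, labels, groups):
        cells[(g, s)].append(y)
    return all(abs(sum(ys) / len(ys) - s) <= tol for (g, s), ys in cells.items())

def tracks_base_rates(scores, labels, groups, tol=0.1):
    """Mean-score gap between two groups ≈ their base-rate gap."""
    def mean(xs):
        return sum(xs) / len(xs)
    a, b = sorted(set(groups))
    score_gap = (mean([s for s, g in zip(scores, groups) if g == a])
                 - mean([s for s, g in zip(scores, groups) if g == b]))
    rate_gap = (mean([y for y, g in zip(labels, groups) if g == a])
                - mean([y for y, g in zip(labels, groups) if g == b]))
    return abs(score_gap - rate_gap) <= tol
```

On toy data where one group's true positive rate is 0.5 but its members are scored 0.9, both checks fail together; when scores match positive rates in each group, both pass. The tolerance parameter is, of course, an artifact of the sketch.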