
I wrote 300 issues of a column called This Week’s Finds, where I explained math and physics. In the fall of 2022 I gave ten talks based on these columns. I just finished giving eight more! Now I’m done. …

In recent decades, Bayesian modeling has achieved extraordinary success within perceptual psychology (Knill and Richards, 1996; Rescorla, 2015; Rescorla, 2020a; Rescorla, 2021). Bayesian models posit that the perceptual system assigns subjective probabilities (or credences) to hypotheses regarding distal conditions (e.g. hypotheses regarding possible shapes, sizes, colors, or speeds of perceived objects). The perceptual system deploys its subjective probabilities to estimate distal conditions based upon proximal sensory input (e.g. retinal stimulations). It does so through computations that are fast, automatic, subpersonal, and inaccessible to conscious introspection.
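The estimation strategy described here can be sketched as a toy Bayes update. The shape hypotheses, prior numbers, and likelihoods below are invented for illustration and are not taken from any of the cited models.

```python
# Minimal sketch of Bayesian perceptual estimation (illustrative only):
# the perceptual system holds prior credences over distal hypotheses and a
# likelihood linking each hypothesis to proximal sensory input, then
# selects the hypothesis with maximal posterior probability.

def posterior(prior, likelihood, observation):
    """Return P(hypothesis | observation) for each hypothesis."""
    unnormalized = {h: prior[h] * likelihood[h][observation] for h in prior}
    z = sum(unnormalized.values())
    return {h: p / z for h, p in unnormalized.items()}

# Hypothetical toy numbers: two candidate distal shapes, one retinal cue.
prior = {"convex": 0.7, "concave": 0.3}
likelihood = {"convex": {"shaded_top": 0.2, "shaded_bottom": 0.8},
              "concave": {"shaded_top": 0.9, "shaded_bottom": 0.1}}

post = posterior(prior, likelihood, "shaded_top")
estimate = max(post, key=post.get)  # MAP estimate of the distal shape
```

Despite the prior favoring "convex", the observation shifts the posterior toward "concave": the update is fast and mechanical, as the abstract emphasizes.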

Kolmogorov conditionalization is a strategy for updating credences based on propositions that have initial probability 0. I explore the connection between Kolmogorov conditionalization and Dutch books. Previous discussions of the connection rely crucially upon a factivity assumption: they assume that the agent updates credences based on true propositions. The factivity assumption discounts cases of misplaced certainty, i.e. cases where the agent invests credence 1 in a falsehood. Yet misplaced certainty arises routinely in scientific and philosophical applications of Bayesian decision theory. I prove a nonfactive Dutch book theorem and converse Dutch book theorem for Kolmogorov conditionalization. The theorems do not rely upon the factivity assumption, so they establish that Kolmogorov conditionalization has unique pragmatic virtues that persist even in cases of misplaced certainty.

When P(E) > 0, conditional probabilities P(H | E) are given by the ratio formula. An agent engages in ratio conditionalization when she updates her credences using conditional probabilities dictated by the ratio formula. Ratio conditionalization cannot eradicate certainties, including certainties gained through prior exercises of ratio conditionalization. An agent who updates her credences only through ratio conditionalization risks permanent certainty in propositions against which she has overwhelming evidence. To avoid this undesirable consequence, I argue that we should supplement ratio conditionalization with Kolmogorov conditionalization, a strategy for updating credences based on propositions E such that P(E) = 0. Kolmogorov conditionalization can eradicate certainties, including certainties gained through prior exercises of conditionalization. Adducing general theorems and detailed examples, I show that Kolmogorov conditionalization helps us model epistemic defeat across a wide range of circumstances.
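The ratio formula, and the persistence of certainty it entails, can be illustrated on a small finite space. The three-world example below is an invented toy case.

```python
# Sketch of ratio conditionalization on a finite space, illustrating the
# point above: once P(H) = 1, conditioning on any E with P(E) > 0
# leaves P(H | E) = 1, so ratio updating cannot eradicate certainty.

def condition(p, evidence):
    """Ratio formula: P'(w) = P(w)/P(E) for w in E, else 0 (needs P(E) > 0)."""
    pe = sum(p[w] for w in evidence)
    if pe == 0:
        raise ValueError("ratio formula undefined when P(E) = 0")
    return {w: (p[w] / pe if w in evidence else 0.0) for w in p}

def prob(p, event):
    return sum(p[w] for w in event)

p = {"w1": 0.6, "w2": 0.4, "w3": 0.0}  # the agent is certain of H = {w1, w2}
H = {"w1", "w2"}

p2 = condition(p, {"w2", "w3"})  # update on any evidence with P(E) > 0
```

After the update, the credence in H is still exactly 1: only a strategy that can condition on probability-0 propositions (such as Kolmogorov conditionalization) could dislodge it.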

Imagine that we are on a train playing with some mechanical systems. Why can’t we detect any differences in their behavior when the train is parked versus when it is moving uniformly? The standard answer is that boosts are symmetries of Newtonian systems. In this paper, I use the case of a spring to argue that this answer is problematic because symmetries are neither sufficient nor necessary for preserving its behavior. I also develop a new answer according to which boosts preserve the relational properties on which the behavior of a system depends, even when they are not symmetries.

There are two ways to characterize symmetric relations. One is intensional: necessarily, Rxy iff Ryx. In some discussions of relations, however, what is important is whether or not a relation gives rise to the same completion of a given type (fact, state of affairs, or proposition) for each of its possible applications to some fixed relata. Kit Fine calls relations that do so ‘strictly symmetric’. Is there a difference between the notions of necessary and strict symmetry that would prevent them from being used interchangeably in such discussions? I show that there is. While the notions coincide assuming an intensional account of relations and their completions, according to which relations/completions are identical if they are necessarily coinstantiated/equivalent, they come apart assuming a hyperintensional account, which individuates relations and completions more finely on the basis of relations’ real definitions. I establish this by identifying two definable relations, each of which is necessarily symmetric but nonetheless results in distinct facts when it applies to the same objects in opposite orders. In each case, I argue that these facts are distinct because they have different grounds.

Inspired by the early Wittgenstein’s concept of nonsense (meaning that which lies beyond the limits of language), we define two different, yet complementary, types of nonsense: formal nonsense and pragmatic nonsense. The simpler notion of formal nonsense is initially defined within Tarski’s semantic theory of truth; the notion of pragmatic nonsense, in turn, is formulated within the context of the theory of pragmatic truth, also known as quasi-truth, as formalized by da Costa and his collaborators. While an expression will be considered formally nonsensical if the formal criteria required for the assignment of any truth-value (whether true, false, pragmatically true, or pragmatically false) to such a sentence are not met, a (well-formed) formula will be considered pragmatically nonsensical if the pragmatic criteria (inscribed within the context of scientific practice) required for the assignment of any truth-value to such a sentence are not met. Thus, in the context of the theory of pragmatic truth, any (well-formed) formula of a formal language interpreted on a simple pragmatic structure will be considered pragmatically nonsensical if the set of primary sentences of such a structure is not well-built, that is, if it does not include the relevant observational data and/or theoretical results, or if it does include sentences that are inconsistent with such data.

In this discussion, we look at three potential problems that arise for Whiting’s account of normative reasons. The first has to do with the idea that objective reasons might have a modal dimension. The second and third concern the idea that there is some sort of direct connection between sets of reasons and the deliberative ought or the ought of rationality. We can see that we might be better served by using credences about reasons (i.e., creasons), rather than possessed or apparent reasons, to characterise any ought that is distinct from the objective ought.

Aristotle’s definition of syllogism in his Prior Analytics is usually charged with being too vague and, more specifically, with not being adequate to its supposed definiendum. Aristotle is supposed to define the stricter notion of “syllogism” (a sort of argument in which the premises are an appropriate pair of sentences formulated in predicative form, and the conclusion is a different sentence in predicative form), but he seems to produce a definition of some broader notion of valid argument or deduction in general. I believe that this charge, however plausible, is not correct. Aristotle’s definition of syllogism is far from clear, since it is phrased in his jargon and with his usual laconism. However, once we are in a better position to understand what he means by his peculiar phrasing, we can see that the definition he offers is appropriate for the very notion of syllogism. I mean that Aristotle is really defining that form of argument in which the conclusion is a predicative (or categorical) form attained by means of a premise-pair of the appropriate sort (with a middle term relating to each extreme in each premise).

Perturbative expansions have played a peculiarly central role in quantum field theory, not only in extracting empirical predictions but also in investigations of the theory’s mathematical and conceptual foundations. This paper brings the special status of QFT perturbative expansions into focus by tracing the history of mathematical physics work on perturbative QFT and situating a contemporary approach, perturbative algebraic QFT, within this historical context. Highlighting the role that perturbative expansions have played in foundational investigations helps to clarify the relationships between the formulations of QFT developed in mathematical physics and high-energy phenomenology.

In the last couple of decades, the most prominent argument from evil has been based on the idea that God couldn’t allow a gratuitous evil. Here is one way to define a gratuitous evil, paraphrasing Rowe:
 E is gratuitous if and only if there is no greater or equal good G that is only obtainable by God if God permits E or something equal or worse. …
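The quoted definition can be regimented as follows. The predicate names and ordering symbols are notational choices for this sketch, not Rowe's own: read G ⪰ E as "G's goodness is at least as great as E's badness" and E′ ⪰ E as "E′ is equally bad or worse than E".

```latex
% E is gratuitous iff no greater-or-equal good G is such that G obtains
% only if God permits E or something equally bad or worse.
\[
  \mathrm{Grat}(E) \;\leftrightarrow\;
  \neg \exists G \, \Bigl[ \mathrm{Good}(G) \,\wedge\, G \succeq E \,\wedge\,
  \bigl( \mathrm{Obtains}(G) \rightarrow
  \exists E' \, \bigl( E' \succeq E \,\wedge\,
  \mathrm{Permits}(\mathrm{God}, E') \bigr) \bigr) \Bigr]
\]
```

Put this way, the argument from evil needs only the further premises that some actual evil satisfies Grat(E) and that God would permit no such evil.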

This paper introduces the axiom of Negative Dominance, stating that, if a lottery f is strictly preferred to a lottery g, then some outcome in the support of f is strictly preferred to some outcome in the support of g. It is shown that, if preferences are incomplete on a sufficiently rich domain, then this plausible axiom, which holds for complete preferences, is incompatible with an array of otherwise plausible axioms for choice under uncertainty. In particular, in this setting, Negative Dominance conflicts with the standard Independence axiom. A novel theory, which includes Negative Dominance, and rejects Independence, is developed and shown to be consistent.
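The axiom can be stated operationally on finite lotteries. The toy outcomes and the deliberately incomplete preference relation below are assumptions for illustration, not examples from the paper.

```python
# Illustrative check of the Negative Dominance condition: if lottery f is
# strictly preferred to lottery g, then some outcome in f's support must
# be strictly preferred to some outcome in g's support.

def support(lottery):
    """Outcomes with positive probability."""
    return {x for x, p in lottery.items() if p > 0}

def negative_dominance_holds(f, g, strictly_better):
    """True iff some x in supp(f) is strictly preferred to some y in supp(g)."""
    return any((x, y) in strictly_better
               for x in support(f) for y in support(g))

# Toy incomplete preference over outcomes a, b, c: only "a over c" is settled.
strictly_better = {("a", "c")}
f = {"a": 0.5, "b": 0.5}
g = {"b": 0.5, "c": 0.5}
```

Here a strict preference for f over g would be licensed by Negative Dominance (a in supp(f) beats c in supp(g)), whereas a strict preference for g over f would not, since no outcome of g is settled as better than any outcome of f.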

Atomic and close-to-atomic scale manufacturing (ACSM) is the core competence of Manufacturing III. Unlike other conceptions or terminologies that focus only on atomic-level precision, ACSM defines a new realm of manufacturing in which quantum mechanics plays the dominant role in atom/molecule addition, migration, and removal, taking into account the uncertainty principle and the discrete nature of particles. As ACSM is still in its infancy, little has yet been systematically elaborated on its core proposition, so there is a need to understand its concept and vision. This article elucidates the development of ACSM and clarifies its proposition, aiming to achieve a clearer understanding of ACSM and to direct more effective efforts toward this promising area.

SAT solvers can solve a wide array of problems, and the models and proofs of unsatisfiability emitted by SAT solvers can be checked by verified software. In this way, the SAT toolchain is trustworthy. However, many applications are not expressed natively in SAT and must instead be encoded into SAT. These encodings are often subtle, and implementations are error-prone. Formal correctness proofs are needed to ensure that implementations are bug-free. In this paper, we present a library for formally verifying SAT encodings, written using the Lean interactive theorem prover. Our library currently contains verified encodings for the parity, at-most-one, and at-most-k constraints. It also contains methods of generating fresh variable names and combining subencodings to form more complex ones, such as one for encoding a valid Sudoku board. The proofs in our library are general, and so this library serves as a basis for future encoding efforts.

Conditional statements are ubiquitous, from promises and threats to reasoning and decision making. By now, logicians have studied them from many different angles, both semantic and proof-theoretic. This paper suggests two more perspectives on the meaning of conditionals, one dynamic and one geometric, that may throw yet more light on a familiar and yet in some ways surprisingly elusive and many-faceted notion.

Supersymmetry (SUSY) has long been considered an exceptionally promising theory. A central role in this promise has been played by naturalness arguments. Yet, given the absence of experimental findings, it is questionable whether the promise will ever be fulfilled. Here, I provide an analysis of the promises associated with SUSY, employing a concept of pursuitworthiness. A research program like SUSY is pursuitworthy if (1) it has the plausible potential to provide high epistemic gain and (2) that gain can be achieved with manageable research efforts. Naturalness arguments have been employed to support both conditions (1) and (2). First, SUSY has been motivated by way of analogy: the proposed symmetry between fermions and bosons is supposed to ‘protect’ the small Higgs mass from large quantum corrections, just as the electron mass is protected by chiral symmetry. Thus, SUSY held the promise of solving a major problem of the Standard Model of particle physics. Second, naturalness arguments have been employed to indicate that such gain is achievable at relatively low cost: SUSY discoveries seemed to be well in reach of upcoming high-energy experiments. While the first part of the naturalness argument may have the right form to facilitate considerations of pursuitworthiness, the second part of the argument has been problematically overstated.

A number of authors, including me, have argued that the output of our most complex climate models, that is, of global climate models and Earth system models, should be assessed possibilistically. Worries about the viability of doing so have also been expressed. I examine the assessment of the output of relatively simple climate models in the context of discovery and point out that this assessment is of epistemic possibilities. At the same time, I show that the concept of epistemic possibility used in the relevant studies does not fit available analyses of this concept. Moreover, I provide an alternative analysis that does fit the studies and broad climate modelling practices, and that meshes with my existing view that climate model assessment should typically be of real possibilities. On my analysis, to assert that a proposition is epistemically possible is to assert that it is not known to be false and is consistent with at least approximate knowledge of the basic way things are. Finally, I consider some of the implications of my discussion for available possibilistic views of climate model assessment and for worries about such views. I conclude that my view helps to address worries about such assessment and permits using the full range of climate models in it.

A formal theory of causal reasoning is presented that encompasses both Pearl’s approach to causality and several key formalisms of nonmonotonic reasoning in Artificial Intelligence. This theory will be derived from a single rationality principle of causal acceptance for propositions. However, this principle will also set the theory of causal reasoning apart from common representational approaches to reasoning formalisms.

Two types of formal models—landscape search tasks and two-armed bandit models—are often used to study the effects that various social factors have on epistemic performance. I argue that they can be understood within a single framework. In this unified framework, I develop a model that may be used to understand the effects of functional and demographic diversity and their interaction. Using the unified model, I find that the benefit of demographic diversity is most pronounced in a functionally homogeneous group, and decreases with the increase of functional diversity.
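A minimal version of the bandit half of this framework can be sketched as follows. The greedy learner, the pseudo-count priors, and the arm payoff probabilities are assumptions of this sketch, not the paper's model.

```python
# Toy two-armed bandit: an agent repeatedly pulls the arm with the higher
# estimated success rate and updates its estimate from observed outcomes.
import random

def run_bandit(p_arms, steps, seed=0):
    """Greedy learner on two Bernoulli arms; returns final rate estimates."""
    rng = random.Random(seed)
    successes = [1, 1]  # pseudo-count priors so estimates start at 1/2
    trials = [2, 2]
    for _ in range(steps):
        estimates = [successes[a] / trials[a] for a in (0, 1)]
        arm = 0 if estimates[0] >= estimates[1] else 1  # greedy choice
        reward = rng.random() < p_arms[arm]             # Bernoulli payoff
        successes[arm] += reward
        trials[arm] += 1
    return [successes[a] / trials[a] for a in (0, 1)]
```

Models in this literature typically study how sharing such estimates across a network of agents (a social factor) changes the group's chance of settling on the better arm; a purely greedy individual learner can lock in on the inferior arm.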

In this paper, I argue that the Hole Argument can be formulated without using the notion of isomorphism and for this reason it is not threatened by the criticism of Halvorson and Manchak (2022). I divide the Hole Argument, following Earman and Norton (1987), into two steps: the proof of the Gauge Theorem and the transition from the Gauge Theorem to the conclusion of radical indeterminism. I argue that the Gauge Theorem does not rely on the notion of isomorphism, but on the notion of the diffeomorphisminvariance of the equations of local spacetime theories; however, for this approach to work, the definition of such theories needs certain amendments with respect to its formulation by Earman and Norton. In the analysis of the second step, I postulate that we should use the notion of radical indeterminism instead of indeterminism simpliciter and that we should not decide in advance what kind of maps are to be used in comparing models. Instead, we can choose tentatively some kind of maps for this purpose and check whether a given choice leads to radical indeterminism involving empirically indistinguishable models. In this way, the usage of the notion of isomorphism is avoided also in the second step of the Hole Argument. A general picture is that physical equivalence can be established by means of an iterative procedure, in which we examine various candidate classes of maps and, depending on the outcomes, we need to broaden or narrow these classes; the Hole Argument can be viewed as a particular instance of this procedure.

Explicating the concept of coherence and establishing a measure for assessing the coherence of an information set are two of the most important tasks of coherentist epistemology. To this end, several principles have been proposed to guide the specification of a measure of coherence. We depart from this prevailing path by challenging two wellestablished and prima facie plausible principles: Agreement and Dependence. Instead, we propose a new probabilistic measure of coherence that combines basic intuitions of both principles, but without strictly satisfying either of them. It is then shown that the new measure outperforms alternative measures in terms of its truthtracking properties. We consider this feature to be central and argue that coherence matters because it is likely to be our best available guide to truth, at least when more direct evidence is unavailable.

I learned a lot from the comments on Part 3 and also this related thread on the Category Theory Community Server:
• Coalgebras, operational semantics and the Giry monad. I’d like to thank Matteo Cappucci, David Egolf, Tobias Fritz, Tom Hirschowitz, David Jaz Myers, Mike Shulman, Nathaniel Virgo and many others for help. …

Monists and pluralists disagree concerning how many ordinary objects there are in a single situation. For instance, pluralists argue that a statue and the clay it is made of have different properties, and thereby are different. The standard monist’s response is to hold that there is just a single object, and that, under the description “being a statue”, this object is, e.g., aesthetically valuable, and that, under the description “being a piece of clay”, it is not aesthetically valuable. However, Fine provided an ontological reading of the expression “an object under a description”: the theory of rigid embodiments. The debate between monists and pluralists is reduplicated in the domain of ordinary occurrences, like walks and conferences.

The inflation of Type I error rates is thought to be one of the causes of the replication crisis. Questionable research practices such as p-hacking are thought to inflate Type I error rates above their nominal level, leading to unexpectedly high levels of false positives in the literature and, consequently, unexpectedly low replication rates. In this article, I offer an alternative view. I argue that questionable and other research practices do not usually inflate relevant Type I error rates. I begin by introducing the concept of Type I error rates and distinguishing between statistical errors and theoretical errors. I then illustrate my argument with respect to model misspecification, multiple testing, selective inference, forking paths, exploratory analyses, p-hacking, optional stopping, double dipping, and HARKing. In each case, I demonstrate that relevant Type I error rates are not usually inflated above their nominal level, and in the rare cases that they are, the inflation is easily identified and resolved. I conclude that the replication crisis may be explained, at least in part, by researchers’ underestimation of theoretical errors and misinterpretation of statistical errors.
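The standard inflation story the abstract engages with rests on a simple calculation: with m independent tests of true null hypotheses at level α, the familywise false-positive probability is 1 − (1 − α)^m, and testing each hypothesis at the Bonferroni level α/m caps it near α again.

```python
# Familywise error rate for m independent tests of true nulls at a given
# significance level, and its Bonferroni-corrected counterpart.

def familywise_error(alpha, m):
    """P(at least one false positive) = 1 - (1 - alpha)^m."""
    return 1 - (1 - alpha) ** m

alpha, m = 0.05, 10
uncorrected = familywise_error(alpha, m)      # roughly 0.40 for these values
bonferroni = familywise_error(alpha / m, m)   # at most alpha
```

Whether the familywise rate, rather than the per-test rate, is the *relevant* Type I error rate in a given research context is precisely the kind of question the article takes up.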

In relevant logic, a conditional is provable only if its antecedent is relevant to its consequent—the provability of conditionals must respect relevance. What exactly respecting relevance amounts to has, however, been up for debate essentially ever since the notion was introduced. One of the more promising approaches to formally explicating respect for relevance is to articulate it in terms of variable sharing results. There are four such results in the extant literature, viz. ordinary variable sharing, strong variable sharing, depth relevance, and strong depth relevance. We’ll have more to say about these below. What’s important to note is that (a) each of these codifies a different way logics can be (or perhaps, can be said to be) relevant and (b) most of the results concerning variable sharing have roughly the form ‘thus and so logics enjoy such and such a variable-sharing property’. What has been lacking until recently was a serious exploration of the mechanisms by which logics come to have these properties. And—apart from the tantalizing-but-tentative results in the work of Gemma Robles and José Méndez (see e.g. Robles and Méndez 2011; Méndez and Robles 2012)—there’s been little to no exploration of what the broadest possible class of logics enjoying any of these properties might be.

We reexamine the old question of the extent to which mathematics may be compared to a game. Under the spell of Wittgenstein, we propose that the more refined object of comparison is a “motley of language games”, the nature of which was (implicitly) clarified by Hilbert: via different language games, axiomatization lies at the basis of both the rigour and the applicability of mathematics. In the “formalist” game, mathematics resembles chess via a clear conceptual dictionary. Accepting this resemblance: like positions in chess, mathematical sentences cannot be true or false; true statements in mathematics are about sentences, namely that they are theorems (if they are). In principle, the certainty of mathematics resides in proofs, but to this end, in practice these must be “surveyable”. Hilbert and Wittgenstein proposed almost opposite criteria for surveyability; we try to overcome their difference by invoking computer-verified proofs. The “applied” language game is based on Hilbert’s axiomatization program for physics (and other scientific disciplines), refined by Wittgenstein’s idea that theorems are yardsticks to which empirical phenomena may be compared, and further improved by invoking elements of van Fraassen’s constructive empiricism. From this perspective, in an appendix we also briefly review the varying roles and structures of axioms, definitions, and proofs in mathematics. Our view is not meant as a philosophy of mathematics by itself, but as a coat rack analogous to category theory, onto which various (traditional and new) philosophies of mathematics (such as formalism, intuitionism, structuralism, deductivism, and the philosophy of mathematical practice) may be attached and may even peacefully support each other.

The propensity interpretation of fitness (PIF) holds that evolutionary fitness is an objectively probabilistic causal disposition (i.e., a propensity) toward reproductive success. I characterize this as the conceptual foundation of the PIF. Reproductive propensities are meant to explain trends in actual reproductive outcomes. In this paper, I analyze the minimal theoretical and ontological commitments that must accompany the explanatory power afforded by the PIF’s foundation. I discuss three senses in which these commitments are less burdensome than has typically been recognized: the PIF’s foundation is (i) compatible with a principled pluralism regarding the mathematical relationship between measures of individual and trait reproductive success; (ii) independent of the propensity interpretation of probability; and (iii) independent of microphysical indeterminism. The most substantive ontological commitment of the PIF’s foundation is to objective modal structures wherein macrophysical probabilities and causation can be found, but I hedge against metaphysically inflationary readings of this modality.

It is possible that you are living in a simulation—that your world is computer-generated rather than physical. But how likely is this scenario? Bostrom and Chalmers each argue that it is moderately likely—neither very likely nor very unlikely. However, they adopt an unorthodox form of reasoning about self-location uncertainty. Our main contention here is that Bostrom’s and Chalmers’ premises, when combined with orthodoxy about self-location, yield instead the conclusion that you are almost certainly living in a simulation. We consider how this (surprising) conclusion might be resisted, and show that the analogy between Sleeping Beauty cases and simulation cases provides a new way of evaluating approaches to self-location uncertainty. In particular, we argue that some conditionalization-based approaches to self-location are problematically limited in their applicability.

It is usually thought that decoherence is necessary for the emergence of many worlds. In this paper, I argue that this may not be the case. First, I argue that the original synchronic decoherence condition leads to a contradiction in a thought experiment. Next, I argue that although the diachronic environmental decoherence condition may avoid the contradiction, it is not a necessary condition for the emergence of many worlds on a sufficiently short timescale. Finally, I argue that a more plausible necessary condition is the synchronic no-interference condition, and it can also avoid the contradiction.

To mitigate the Look Elsewhere Effect in multiple hypothesis testing using p-values, the paper suggests an “entropic correction” of the significance level at which the null hypothesis is rejected. The proposed correction uses the entropic uncertainty associated with the prior-to-test probability measure expressing how likely the confirming evidence is to occur at different values of the parameter. When the prior-to-test probability is uniform (embodying maximal uncertainty), the entropic correction coincides with the Bonferroni correction. When the prior-to-test probability embodies maximal certainty (is concentrated on the single value of the parameter at which the evidence is obtained), the entropic correction overrides the Look Elsewhere Effect completely by not requiring any correction of significance. The intermediate situation is illustrated by a simple hypothetical example. Interpreting the prior-to-test probability subjectively allows a Bayesian spirit to enter frequentist multiple hypothesis testing in a disciplined manner. If the prior-to-test probability is determined objectively, the entropic correction makes it possible to take into account, in a technically explicit way, the background theoretical knowledge relevant for the test.
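One natural way to realize such a correction, assumed here for illustration rather than taken from the paper's formalism, scales the significance level by exp(−H), where H is the Shannon entropy (natural logarithm) of the prior-to-test distribution: a uniform prior over m parameter values gives H = ln m and hence the Bonferroni level α/m, while a point-mass prior gives H = 0 and leaves α uncorrected, matching the two limiting cases described above.

```python
# Hypothetical entropic correction: alpha' = alpha * exp(-H(prior)),
# where H is the Shannon entropy (natural log) of the prior-to-test
# distribution over parameter values.
import math

def entropic_alpha(alpha, prior):
    """Entropy-scaled significance level (0 * log 0 taken as 0)."""
    h = -sum(p * math.log(p) for p in prior if p > 0)
    return alpha * math.exp(-h)

m = 20
uniform = [1 / m] * m            # maximal uncertainty -> Bonferroni alpha/m
point_mass = [1.0] + [0.0] * (m - 1)  # maximal certainty -> alpha unchanged
```

Intermediate priors then interpolate smoothly between no correction and the full Bonferroni correction, which is the behavior the abstract's "intermediate situation" illustrates.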