
This is a report on the project “Axiomatizing Conditional Normative Reasoning” (ANCoR, M 3240-N) funded by the Austrian Science Fund (FWF). The project aims to deepen our understanding of conditional normative reasoning by providing an axiomatic study of it at the propositional and first-order levels. The focus is on a particular framework, the so-called preference-based logic for conditional obligation, whose main strength lies in its treatment of contrary-to-duty reasoning and reasoning about exceptions. The project considers not only the metatheory of this family of logics but also its mechanization.

The Kuhnian view of theory choice (post-Structure) leaves considerable space for a diversity of theory-choice preferences. It remains mysterious, however, how scientists could ever converge on a theory, given this diversity. This paper will argue that there is a solution to the problem of convergence, which can be had even on Kuhn’s own terms.

According to the ω-rule, it is valid to infer that all natural numbers possess some property if 0 possesses it, 1 possesses it, 2 possesses it, and so on. The ω-rule is important because its inclusion in certain arithmetical theories results in true arithmetic. It is controversial because it seems impossible for finite human beings to follow, given that it seems to require accepting infinitely many premises. Inspired by a remark of Wittgenstein’s, I argue that the mystery of how we follow the ω-rule subsides once we treat the rule as helping to give meaning to the symbol “…”.

We give a new and elementary construction of primitive positive decomposition of higher-arity relations into binary relations on finite domains. Such decompositions come up in applications to constraint satisfaction problems, clone theory and relational databases. The construction exploits functional completeness of 2-input functions in many-valued logic by interpreting relations as graphs of partially defined multivalued ‘functions’. The ‘functions’ are then composed from ordinary functions in the usual sense. The construction is computationally effective and relies on well-developed methods of functional decomposition, but reduces relations only to ternary relations. An additional construction then decomposes ternary into binary relations, also effectively, by converting certain disjunctions into existential quantifications. The result gives a uniform proof of Peirce’s reduction thesis on finite domains, and shows that the graph of any Sheffer function composes all relations there.

We study logical reduction (factorization) of relations into relations of lower arity by Boolean or relative products that come from applying conjunctions and existential quantifiers to predicates, i.e. by primitive positive formulas of predicate calculus. Our algebraic framework unifies natural joins and data dependencies of database theory and relational algebra of clone theory with the bond algebra of C.S. Peirce. We also offer new constructions of reductions, systematically study irreducible relations and reductions to them, and introduce a new characteristic of relations, ternarity, that measures their ‘complexity of relating’ and allows us to refine reduction results. In particular, we refine Peirce’s controversial reduction thesis, and show that reducibility behavior is dramatically different on finite and infinite domains.

We argue that traditional formulations of the reduction thesis that tie it to privileged relational operations do not suffice for Peirce’s justification of the categories, and invite the charge of gerrymandering to make it come out as true. We then develop a more robust invariant formulation of the thesis by explicating the use of triads in any relational operations, which is immune to that charge. The explication also allows us to track how Thirdness enters the structure of higher-order relations, and even propose a numerical measure of it. Our analysis reveals new conceptual phenomena when negation or disjunction is used to compound relations.

We study the anchoring effect in a computational model of group deliberation on preference rankings. Anchoring is a form of path-dependence through which the opinions of those who speak early have a stronger influence on the outcome of deliberation than the opinions of those who speak later. We show that anchoring can occur even among fully rational agents. We then compare the respective effects of anchoring and three other determinants of the deliberative outcome: the relative weight or social influence of the speakers, the popularity of a given speaker’s opinion, and the homogeneity of the group. We find that, on average, anchoring has the strongest effect among these. We finally show that anchoring is often correlated with increases in proximity to single-plateauedness. We conclude that anchoring can constitute a structural bias that might hinder some of the otherwise positive effects of group deliberation.

The collapse of a quantum state can be understood as a mathematical way to construct a joint probability density even for operators that do not commute. We can formalize that construction as a non-commutative, non-associative collapse product that is nonlinear in its left operand as a model for joint measurements at timelike separation, in part inspired by the sequential product for positive semidefinite operators. The familiar collapse picture, in which a quantum state collapses after each measurement as a way to construct a joint probability density for consecutive measurements, is equivalent to a no-collapse picture in which Lüders transformers applied to subsequent measurements construct a Quantum-Mechanics-Free Subsystem of Quantum Non-Demolition operators, not as a dynamical process but as an alternative mathematical model for the same consecutive measurements. The no-collapse picture is especially simple when we apply signal analysis to millions or billions of consecutive measurements.

We provide a philosophical reconstruction and analysis of the debate on the scientific status of cosmic inflation that has played out in recent years. In a series of critical papers, Ijjas, Steinhardt, and Loeb have questioned whether current views on cosmic inflation qualify as scientific. Proponents of cosmic inflation, such as Guth and Linde, have in turn defended the scientific credentials of their approach. We argue that, while this defense, narrowly construed, is successful against Ijjas, Steinhardt, and Loeb, their reasoning does point to a significant epistemic issue that arises with respect to inflationary theory. We claim that a broadening of the concept of theory assessment to include meta-empirical considerations is needed to address that issue in an adequate way.

The formalism of generalized quantum histories allows a symmetrical treatment of space and time correlations, by taking different traces of the same history density matrix. We recall how to characterize spatial and temporal entanglement in this framework. An operative protocol is presented to map a history state onto the ket of a static composite system. We show, by examples, how the Leggett-Garg and the temporal CHSH inequalities can be violated in our approach.

In 1935, Schrödinger introduced what he considered to be a reductio against the Copenhagen interpretation of quantum mechanics. His argument was based on a “ridiculous case” that is widely used today to portray the counterintuitive nature of quantum superposition. Schrödinger imagined that a cat was placed out of sight in a box with a mechanism that would kill the cat within an hour with 50% probability. Since the deadly mechanism employed a quantum process for its trigger, he supposed the cat was in a quantum superposition of 50% Live Cat + 50% Dead Cat.

Negation is common to all human languages. What explains its universality? Our hypothesis is that the emergence of expressions for denial, such as the word ‘not’, is an adaptation to existing conditions in the social and informational environment: a specific linguistic form was co-opted to express denial, given a preference for information sharing, the limits of a finite lexicon, and localized social repercussions against synonymy. In support of our hypothesis, we present a costly signalling model of communication. The model formalizes ordinary aspects of Stalnakerian conversations, implements the conditions we isolated for the emergence of denial, and computes their long-term consequences through a widely employed evolutionary dynamics, whose results are calculated by computer simulations. The model shows that under a reasonable configuration of parameter values, functional pressure derived from conversational constraints favours the emergence of denial by means of a dedicated expression, such as the word ‘not’.

Wilhelm (Synthese 199:6357–6369, 2021) has recently defended a criterion for comparing structure of mathematical objects, which he calls Subgroup. He argues that Subgroup is better than SYM, another widely adopted criterion. We argue that this is mistaken; Subgroup is strictly worse than SYM. We then formulate a new criterion that improves on both SYM and Subgroup, answering Wilhelm’s criticisms of SYM along the way. We conclude by arguing that no criterion that looks only to the automorphisms of mathematical objects to compare their structure can be fully satisfactory.

Famously, Adrian Moore has argued that absolute representations of reality are possible: that it is possible to represent reality from no particular point of view. Moreover, Moore believes that absolute representations are a desideratum of physics. Recently, however, debates in the philosophy of physics have arisen regarding the apparent impossibility of absolute representations of certain aspects of nature in light of our current best theories of physics. Throughout this article, we take gravitational energy as a particular case study of an aspect of nature that seemingly does not admit of an absolute representation. There is, therefore, a prima facie tension between Moore’s a priori case on the one hand, and the state of play in modern physics on the other. This article overcomes this tension by demonstrating how, when formulated in the correct way, modern physics admits of an absolute representation of gravitational energy after all. In so doing, the article offers a detailed case study of Moore’s argument for absolute representations, clarifying its structure and bringing it into contact with the distinction drawn by philosophers of physics between coordinate-freedom and coordinate-independence, as well as the philosophy of spacetime physics.

Today I’d like to wrap up my discussion of how to implement the Game of Life in our agent-based modelling software called AlgebraicABMs. Kris Brown’s software for the Game of Life is here:
• game_of_life: code and explanation of the code. …
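Independently of Kris Brown’s AlgebraicABMs implementation linked above, the update rule itself can be sketched in a few lines of plain Python; the `step` function and the `blinker` pattern below are illustrative and not part of that library.

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) coordinates of live cells."""
    # Count live neighbours of every cell adjacent to at least one live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has 3 live neighbours,
    # or 2 live neighbours and is currently alive.
    return {
        cell
        for cell, n in counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker" oscillates with period 2: three cells in a row flip orientation.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Representing the board as a sparse set of live cells keeps the grid unbounded, which matches the Game of Life’s usual presentation on an infinite plane.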

In the context of the probabilistic fine-tuning argument for the existence of a divine designer, appeals to the existence of a multiverse have often seemed ad hoc. The situation is rather different, though, if we have independent evidence from physics for a multiverse. I argue that the fate of the fine-tuning argument depends on open questions in fundamental physics and cosmology. I also argue that the many-worlds interpretation of quantum mechanics opens up new routes to undercutting the force of the fine-tuning argument.

A distinction is made between superpositional and non-superpositional quantum computers. The notion of quantum learning systems (quantum computers that modify themselves in order to improve their performance) is introduced. A particular non-superpositional quantum learning system, a quantum neurocomputer, is described: a conventional neural network implemented in a system which is a variation on the familiar two-slit apparatus from quantum physics. This is followed by a discussion of the advantages that quantum computers in general, and quantum neurocomputers in particular, might bring, not only to our search for more powerful computational systems, but also to our search for greater understanding of the brain, the mind, and quantum physics itself.

Famous problems in variable-population welfare economics have led some to suggest that social welfare comparisons over such populations may be incomplete. In the theory of rational choice with incomplete preferences, attention has recently centered on the Expected Multi-Utility framework, which permits incompleteness but preserves vNM independence and can be derived from weak, attractive axioms. Here, we apply this framework to variable-population welfare economics. We show that Expected Multi-Utility for social preferences, combined with a stochastic ex-ante Pareto-type axiom, characterizes Expected Critical-Set Generalized Utilitarianism, in the presence of basic axioms. The further addition of Negative Dominance, an axiom recently introduced to the philosophy literature, yields a characterization of Expected Critical-Level Generalized Utilitarianism.

Although the basic posit of the many-worlds interpretation is that the ontology of our universe is a unitarily evolving, highly entangled wavefunction, we live in a world in which we perceive only a tiny part of the global wavefunction, and the quantum state corresponding to our world does not evolve unitarily, making our world an effectively open system. Recently, the common approach that within a world conservation laws hold only for an ensemble of measurements was questioned. The paper analyzes how this can affect viewing worlds as open systems and proposes a possible resolution of the unitarity puzzle.

This book is the second of two volumes on belief and counterfactuals. It consists of five of a total of eleven chapters. ... ... Finally, while merely a change in terminology, I should perhaps note that, throughout the second volume, I follow my own suggestion from the first volume of referring to subjective probabilities no longer as what they are not, viz., degrees of belief, but as what they are: degrees of certainty.

Given synthetic Euclidean geometry, I define the length λ(a, b) (of a segment ab) by taking equivalence classes with respect to the congruence relation ≡: i.e., λ(a, b) = λ(c, d) ↔ ab ≡ cd. By geometric constructions and explicit definitions, one may define the Length structure, 𝕃 = (L, 0, ⊕, ⪯), “instantiated by Euclidean geometry”, so to speak. One may show that this structure is isomorphic to the set of non-negative elements of the one-dimensional linearly ordered vector space over ℝ. One may define the notion of a numerical scale (for length) and a unit (for length). One may show how numerical scales for length are determined by Cartesian coordinate systems. One may also obtain a derivation of Maxwell’s quantity formula, Q = {Q}[Q], for lengths.

Before posting new reflections on where we are 5 years after the ASA P-value controversy (both my own and readers’), I will reblog some reader commentaries from 2022 in connection with my (2022) editorial in Conservation Biology: “The Statistical Wars and Intellectual Conflicts of Interest”. …

Why is copper red? Why is it so soft compared to, say, nickel—the element right next to it in the periodic table? Why is it such a good conductor of electricity? All of this stems from a violation of Hund’s rules. …

Let us consider an acyclic causal model M of the sort that is central to causal modeling (Spirtes et al. 1993/2000, Pearl 2000/2009, Halpern 2016, Hitchcock 2018). Readers familiar with such models can skip this section. M = ⟨S, F⟩ is a causal model if, and only if, S is a signature and F = {F1, . . . , Fn} represents a set of n structural equations, for a finite natural number n. S = ⟨U, V, R⟩ is a signature if, and only if, U is a finite set of exogenous variables, V = {V1, . . . , Vn} is a set of n endogenous variables that is disjoint from U, and R : U ∪ V → ℝ assigns to each exogenous or endogenous variable X in U ∪ V its range (not codomain) R(X) ⊆ ℝ. F = {F1, . . . , Fn} represents a set of n structural equations if, and only if, for each natural number i, 1 ≤ i ≤ n: Fi is a function from the Cartesian product ∏_{X ∈ U∪V∖{Vi}} R(X) of the ranges of all exogenous and endogenous variables other than Vi into the range R(Vi) of the endogenous variable Vi. The set of possible worlds of the causal model M is defined as the Cartesian product W = ∏_{X ∈ U∪V} R(X) of the ranges of all exogenous and endogenous variables.
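As a sketch only (the variable names, ranges, and equations below are invented for illustration, with finite ranges so that worlds can be enumerated), the definitions above can be rendered in plain Python:

```python
from itertools import product

# Toy acyclic causal model in the sense defined above (illustrative only).
# Signature: exogenous U = {"u"}, endogenous V = {"v1", "v2"};
# R assigns each variable a finite range of reals.
ranges = {"u": {0.0, 1.0}, "v1": {0.0, 1.0}, "v2": {0.0, 1.0, 2.0}}

# Structural equations: F_i determines V_i from the other variables.
# (For simplicity each function receives the full world but ignores
# its own variable, matching the official domain U ∪ V \ {V_i}.)
def F_v1(vals):
    return vals["u"]                  # v1 depends on u only

def F_v2(vals):
    return vals["u"] + vals["v1"]     # v2 depends on u and v1 (acyclic)

equations = {"v1": F_v1, "v2": F_v2}

# Possible worlds: the Cartesian product of all ranges.
variables = sorted(ranges)
worlds = [dict(zip(variables, vs))
          for vs in product(*(sorted(ranges[x]) for x in variables))]

# A world is a solution of the model if every structural equation holds in it.
solutions = [w for w in worlds
             if all(w[v] == f(w) for v, f in equations.items())]
```

With these equations, each assignment to the exogenous variable u fixes exactly one solution, which is what acyclicity guarantees in general.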

There is a growing consensus among philosophers that quantifying valueladen concepts can be epistemically successful and politically legitimate if all valueladen choices in the process of quantification are aligned with stakeholder values. I argue that proponents of this view have failed to argue for its basic premise: successful quantification is sufficiently unconstrained so that it can be achieved along specific, stakeholderrelative pathways. I then challenge this premise by considering a rare example of successful valueladen quantification in seismology. Seismologists quantified earthquake size precisely by excluding stakeholder values from measure design and testing.

The mass-count distinction is a morphosyntactic distinction among nouns in English and many other languages. Tree, chair, person, group, and portion are count nouns, which come with the plural and accept numerals such as one and first; water, rice, furniture, silverware, and law enforcement are mass nouns, which lack the plural and do not accept numerals. The morphosyntactic distinction is generally taken to have semantic content or reflect a semantic mass-count distinction. At the center of the semantic mass-count distinction is, in some way or another, a notion of being one or being a single entity, the basis of countability. There is little unanimity, however, about how the notion of being a single entity is to be understood and thus about what the semantic mass-count distinction consists in.

To analyse contingent propositions, this paper investigates how branching time structures can be combined with probability theory. In particular, it considers assigning infinitesimal probabilities—available in non-Archimedean probability theory—to individual histories. This allows us to introduce the concept of ‘remote possibility’ as a new modal notion between ‘impossibility’ and ‘appreciable possibility’. The proposal is illustrated by applying it to a future contingent and a historical counterfactual concerning an infinite sequence of coin tosses. The latter is a toy model that is used to illustrate the applicability of the proposal to more realistic physical models.

According to the orthodox view, one can appeal to the symmetries of a theory in order to show that it is impossible to measure the properties that are not invariant under such symmetries. For example, it is widely believed that the fact that boosts are symmetries of Newtonian mechanics entails that it is impossible to measure states of absolute motion in a Newtonian world (these states vary under boosts). This paper offers an overview of the various ways by which philosophers have spelled out the connection between the symmetries of a theory and the alleged impossibility of measuring some properties (the variant ones). The paper will use the case of absolute motion as a case study, and will discuss a recent unorthodox view according to which this kind of motion can actually be measured in Newtonian mechanics. The paper ends by considering some avenues by which the discussion can be further developed.

A wide variety of stochastic models of cladogenesis (based on speciation and extinction) lead to an identical distribution on phylogenetic tree shapes once the edge lengths are ignored. By contrast, the distribution of the tree’s edge lengths is generally quite sensitive to the underlying model. In this paper, we review the impact of different model choices on tree shape and edge length distribution, and the consequences for studying the properties of phylogenetic diversity (PD) as a measure of biodiversity, and the loss of PD as species become extinct at the present. We also compare PD with a stochastic model of feature diversity, and investigate some mathematical links and inequalities between these two measures plus their predictions concerning the loss of biodiversity under extinction at the present.
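To illustrate the PD measure discussed above, here is a small sketch of one common rooted variant (PD of a species set = total edge length of the smallest subtree connecting that set to the root); the toy tree and species names are invented for the example.

```python
# Toy rooted phylogeny, stored as child -> (parent, edge_length);
# the root has no entry.  Species A and B share ancestor x.
tree = {
    "A": ("x", 1.0),
    "B": ("x", 1.0),
    "x": ("root", 2.0),
    "C": ("root", 3.0),
}

def pd(species):
    """Phylogenetic diversity: sum of edge lengths on the union of
    root-ward paths from the given species, counting each edge once."""
    used = set()
    total = 0.0
    for s in species:
        node = s
        while node in tree:          # walk up until we reach the root
            parent, length = tree[node]
            if node not in used:     # shared ancestral edges count once
                used.add(node)
                total += length
            node = parent
    return total
```

Losing a species with a long, unshared pendant edge (like C here) removes more PD than losing one of a pair of close relatives (A or B), which is the sensitivity to edge lengths that the abstract highlights.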

This paper presents a new problem for the inference rule commonly known as Inference to the Best Explanation (IBE). The problem is that uncertainty about parts of one’s evidence may undermine the inferrability of a hypothesis that would provide the best explanation of that evidence, especially in cases where there is an alternative hypothesis that would provide a better explanation of only the more certain pieces of evidence. A potential solution to the problem is sketched, in which IBE is generalized to handle uncertain evidence by invoking a notion of evidential robustness.