
Works by Humberstone (1981, 2011), van Benthem (1981, 2016), Holliday (2014, forthcoming), and Ding & Holliday (2020) attempt to develop a semantics for modal logic in terms of “possibilities”, i.e., “less determinate entities than possible worlds” (Edgington 1985). These works take possibilities as semantically primitive entities, stipulate a number of semantic principles that govern them (namely, Ordering, Persistence, Refinement, Cofinality, Negation, and Conjunction), and then interpret a modal language via this semantic structure. In this paper, we define possibilities in object theory (OT) and derive, as theorems, the semantic principles stipulated in the works cited. We then raise a concern for any semantic investigation of possibilities conducted without a modal operator, and show that no such concern arises for the metaphysics of possibilities as developed in OT.

Hume’s argument against the credibility of testimony for miracles – in Section 10 of his Enquiry concerning Human Understanding – is one of the most famous in the philosophical canon. Yet both its interpretation and its assessment are highly controversial. I have discussed the most common interpretative issues elsewhere, and will mainly pass over these here (with references to those previous discussions in case readers wish to follow them up). My main aim now is to focus instead on the cogency and force of Hume’s argument, and how it relates to his more general scepticism about theism as manifested in his Dialogues concerning Natural Religion. So this is primarily a philosophical rather than interpretative investigation.

It has been suggested that the following three theses are incompatible: Moral Realism, Epistemicism about vagueness, and the claim that moral terms are vague. If this is so, (at least) one of these three must be rejected. This paper explores the possibility of resolving this trilemma by rejecting Moral Vagueness.

In this article I introduce a distinction between two types of reparametrization-invariant models, and I argue that while both suffer from a problem of time when canonical quantization methods are applied to them, its severity depends greatly on the type of model. Deparametrizable models are models that have time as a configuration or phase space variable, and for them the problem of time can be solved. In the case of nondeparametrizable models, we cannot find time in the configuration or phase space of the model, and hence the techniques that solve the problem in the deparametrizable case do not apply. This seems to signal that canonical quantization techniques fail to give a satisfactory quantization of nondeparametrizable models. As I argue that general relativity is nondeparametrizable, this implies that the canonical quantization of this theory may fail to provide a successful theory of quantum gravity.

This paper examines the development of causal perturbation theory, a reformulation of perturbative quantum field theory (QFT) that starts from a causality condition rather than a time-evolution equation. We situate this program alongside other causality-based reformulations of relativistic quantum theory which flourished in the postwar period, contrasting it in particular with axiomatic QFT. Whereas the axiomatic QFT tradition tried to move beyond the perturbative expansion, causal perturbation theory can be thought of as a foundational investigation of this approximation method itself. Unearthing this largely forgotten research program helps clarify questions of contemporary philosophical interest, for instance about the interpretative significance of the ultraviolet divergences which appear in the series expansion, but also helps us understand why causality conditions became so ubiquitous in postwar high-energy theory.

Expertise slows the progress of knowledge, some say. First, it delays arrival at the cutting edge: if you must master everything that came before, you may not begin original research until your 30s, when your brain is a rigid fossil and retirement is already near. …

Readers: I gave the Neyman Seminar at Berkeley last Wednesday, October 9, and had been so busy preparing it that I did not update my leisurely cruise for October. This is the second stop. I will shortly post remarks on the panel discussion that followed my Neyman talk (with panelists Ben Recht, Philip Stark, Bin Yu, and Snow Zhang), which was quite illuminating. …

I suggest that the current situation in quantum field theory (QFT) provides some reason to question the universal validity of ontological reductionism. I argue that the renormalization group flow is reversible except at fixed points, which makes the relation between large and small distance scales quite symmetric in QFT, opening up at least the technical possibility of a nonreductionist approach to QFT. I suggest that some conceptual problems encountered within QFT may potentially be mitigated by moving to an alternative picture in which it is no longer the case that the large supervenes on the small. Finally, I explore some specific models in which a form of nonreductionism might be implemented, and consider the prospects for future development of these models.

The current paper examines how a commitment to a principle, adhered to by an individual agent, becomes an accepted standard of an epistemic community. Addressing this question requires three steps: first, to define the terms used throughout the paper, and especially the characteristics of commitments to a principle. The second step is to find a mechanism through which such epistemic commitments are introduced to an epistemic community and in certain cases are adopted as the standard by the community. While there could be several such mechanisms, the current paper focuses on the practice of model formulation. The third step is to demonstrate the analytical framework developed in the first two steps in a case study. The case study chosen for this paper is the unique approach to feedback analysis adopted by the ecologist and population geneticist Richard Levins. In what follows I show that part of what made Levins’ approach unique was his Marxist commitments, and his attempt to embody those commitments in feedback analysis by formally representing them as modeling assumptions.

In this paper I raise a worry about the most widespread resolutions of the problem of time in canonical quantizations of general relativity. These resolutions are based on analogies with deparametrizable models, for which the problem can be solved, while I argue that there are good reasons for doubting these resolutions when the theory is not deparametrizable, as is the case for general relativity. I introduce an example of a nondeparametrizable model, a double harmonic oscillator system expressed by its Jacobi action, and argue that the problem of time for this model is not solvable, in the sense that its canonical quantization doesn’t lead to the quantum theory of two harmonic oscillators and the standard resolutions of the problem of time don’t work for this case. I argue that as general relativity is strongly analogous to this model, one should take seriously the view that the canonical quantization of general relativity doesn’t lead to a meaningful quantum theory. Finally, I comment that this has an impact on the foundations of different approaches to quantum gravity.

In this paper I introduce the idea of geometrogenesis as suggested in the group field theory (GFT) literature, and I offer a criticism of it. Geometrogenesis in the context of GFT is the idea that what we observe as the big bang is nothing but a phase transition from a nongeometric phase of the universe to a geometric one, which is the one we live in and the one to which spacetime concepts apply. GFT offers the machinery to speak about geometric and nongeometric phases, but I argue that there are serious conceptual issues that threaten the viability of the idea. Some of these issues are directly related to the foundations of GFT and concern the fact that it isn’t clear what GFT amounts to and how to understand it. The other main source of trouble has to do with geometrogenesis itself and its conceptual underpinnings, as it is unclear whether it requires the addition of an extra temporal or quasi-temporal dimension, which is unwanted and problematic.

In this paper I offer an introduction to group field theory (GFT) and to some of the issues affecting the foundations of this approach to quantum gravity. I first introduce covariant GFT as the theory that one obtains by interpreting the amplitudes of certain spin foam models as Feynman amplitudes in a perturbative expansion. However, I argue that it is unclear that this definition of GFTs amounts to anything beyond a computational rule for finding these transition amplitudes, and that GFT so defined doesn’t seem able to offer any new insight into the foundations of quantum gravity. Then I move to another formulation of GFT, which I call canonical GFT and which uses the standard structures of quantum mechanics. This formulation is widely used in cosmological applications of GFT, but I argue that it is only heuristically connected with the covariant version and with spin foam models. Moreover, I argue that this approach is affected by a version of the problem of time which raises worries about its viability. Therefore, I conclude that there are serious concerns about the justification and interpretation of GFT in either of its versions.

In this paper, I consider a recent controversy about whether first-class constraints generate gauge transformations in the case of electromagnetism. I argue that there is a notion of gauge transformation, the extended notion, which differs from the original gauge transformation of electromagnetism but is nevertheless not trivial, and which allows that claim to be made. I further argue that one can expect this claim to extend to more general theories, and that Dirac’s conjecture may be true for some physically reasonable theories, and only in this sense of gauge transformation. Finally, I argue that the extended notion of gauge transformation seems unnatural from the point of view of classical theories, but that it fits nicely with the way quantum versions of gauge theories are constructed.

In this paper I argue that the fundamental aspect of our notion of time is that it defines an order relation, be it a total order relation between configurations of the world or just a partial order relation between events. This position is in contrast with a relationalist view popular in the quantum gravity literature, according to which it is just correlations between physical quantities that we observe and which capture every aspect of temporality in the world, at least according to general relativity. I argue that the view of time as defining an order relation is perfectly compatible with the way general relativity is applied, while the relationalist view has to face some challenges. This debate is important not only from the perspective of the metaphysics of space and time and of how to interpret our physical theories, but also for the development and understanding of theories of quantum gravity.

Some authors have defended the claim that one needs to be able to define ‘physical coordinate systems’ and ‘observables’ in order to make sense of general relativity. Moreover, Rovelli (2002, Physical Review D 65(4): 044017) proposes a way of implementing these ideas by making use of a system of satellites that allows one to define a set of ‘physical coordinates’, the GPS coordinates. In this article I oppose these views in four ways. First, I defend an alternative way of understanding general relativity which implies that we have a perfectly fine interpretation of the models of the theory even in the absence of ‘physical coordinate systems’. Second, I analyze and challenge the motivations behind the ‘observable’ view. Third, I analyze Rovelli’s proposal and conclude that it does not allow extracting any physical information from our models that wasn’t available before. Fourth, I draw an analogy between general relativistic spacetimes and Newtonian spacetimes, which allows me to argue that since ‘physical observables’ are not needed in Newtonian spacetime, neither are they in general relativity. In this sense, I conclude that the ‘observable’ view of general relativity is unmotivated.

Ludwig Boltzmann (1844–1906) is generally acknowledged as one of
the most important physicists of the nineteenth century. Particularly
famous is his statistical explanation of the second law of
thermodynamics. The celebrated formula \(S = k \log W\), expressing a
relation between entropy \(S\) and probability \(W\) has been engraved
on his tombstone (even though he never actually wrote this formula
down). Boltzmann’s views on statistical physics continue to play
an important role in contemporary debates on the foundations of that
theory. However, Boltzmann’s ideas on the precise relationship between
the thermodynamical properties of macroscopic bodies and their
microscopic constitution, and on the role of probability in this
relationship, are involved, and they differed quite remarkably in
different periods of his life.
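The tombstone formula can be made concrete in a few lines of Python; this is a minimal sketch, in which the two-state toy system, the variable names, and the function `boltzmann_entropy` are invented purely for illustration:

```python
import math

# Boltzmann's constant in J/K (exact under the 2019 SI redefinition)
K_B = 1.380649e-23

def boltzmann_entropy(multiplicity):
    """Entropy S = k log W, where W is the number of microstates
    compatible with the macrostate and log is the natural logarithm."""
    return K_B * math.log(multiplicity)

# Toy illustration: a system of N two-state particles has W = 2**N
# equally likely microstates, so S grows linearly with N.
N = 100
S = boltzmann_entropy(2 ** N)
print(S)  # equals N * K_B * ln 2, about 9.57e-22 J/K
```

The example shows the point of the formula: entropy is additive because the logarithm turns the multiplicative growth of W into linear growth of S.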

Unwarranted and incorrect claims have been made in the philosophy literature regarding the quantum theory of molecules. Various influential authors (Lombardi and Castagnino 2010; Chang 2015; Cartwright 2022) have asserted that approximations used in the quantum chemistry of molecules, and specifically the Born-Oppenheimer approximation, violate the Heisenberg uncertainty principle and are thus in fundamental conflict with quantum theory. From this the failure of the reduction of chemistry to physics is adduced. We refute these claims on the basis of a (textbook-level) presentation of the mathematical details of the approximation, together with an analysis of the relevant physical idealizations. There are more subtle questions regarding the formal justification of a particular set of mathematical idealizations involved in modern formalisations of the Born-Oppenheimer approximation (Sutcliffe and Woolley 2012). Drawing upon recent work in the mathematical physics literature (Jecko 2014), we show how such idealizations may also be justified to the relevant standards of rigour. We conclude with a prospectus of wider philosophical issues regarding rigour, reduction, and idealization in the quantum theory of molecules. This prospectus sets an agenda for work in the philosophy of quantum chemistry that is more solidly grounded in scientific practice.

In previous work the author has proposed a different approach to the problem of von Neumann measurement and wave function collapse. Here we apply it to the collapse of degenerate states. Our predictions differ from those of von Neumann and, separately, Lüders in significant ways. An experiment is suggested that might distinguish between the possibilities.

I consider statistical criteria of algorithmic fairness from the perspective of the ideals of fairness to which these criteria are committed. I distinguish and describe three theoretical roles such ideals might play. The usefulness of this program is illustrated by taking Base Rate Tracking and its ratio variant as a case study. I identify and compare the ideals of these two criteria, then consider them in each of the aforementioned three roles for ideals. This ideals program may present a way forward in the normative evaluation of candidate statistical criteria of algorithmic fairness.
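To make the case study concrete, here is a minimal Python sketch of how the difference variant of such a criterion might be checked on toy data. The formulation used here (mean risk scores should track base rates across groups), the toy data, the tolerance, and the function name `base_rate_tracking` are my own illustrative assumptions, not the paper's definitions:

```python
# One reading of Base Rate Tracking, difference variant: the gap between
# the groups' mean risk scores should match the gap between their base
# rates. (The ratio variant would compare ratios instead of differences.)

def mean(xs):
    return sum(xs) / len(xs)

def base_rate_tracking(scores_a, labels_a, scores_b, labels_b, tol=1e-9):
    """True iff the mean-score gap equals the base-rate gap (within tol)."""
    gap_scores = mean(scores_a) - mean(scores_b)
    gap_rates = mean(labels_a) - mean(labels_b)
    return abs(gap_scores - gap_rates) <= tol

# Toy data: group A has base rate 0.5, group B 0.25, and the mean scores
# (0.5 and 0.25) track those rates exactly, so the criterion is satisfied.
scores_a, labels_a = [0.6, 0.4], [1, 0]
scores_b, labels_b = [0.3, 0.2], [1, 0, 0, 0]
print(base_rate_tracking(scores_a, labels_a, scores_b, labels_b))
```

The sketch illustrates the general shape of a statistical fairness criterion: a purely extensional test on scores and outcomes, whose normative content comes from the ideal of fairness it encodes.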

Based on unpublished archival material, some informal reactions by George Polya to Imre Lakatos’ “Proofs and Refutations” are presented. The archival material consists of letters from Polya to Lakatos in the period between 1957 and 1965. The letters show that Polya admired Lakatos’ work, but he also voiced some criticism, especially where Lakatos deviated from heuristics.

Standard decision theory studies one-shot decisions, where an agent faces a single choice. Real decision problems, one might think, are more complex. To find the way out of a maze, or to win a game of chess, the agent needs to make a series of choices, each dependent on the others. …

Katie Steele and H. Orri Stefánsson argue that, to reflect an agent’s limited awareness, the algebra of propositions on which that agent’s credences are defined should be relativised to their awareness state. I argue that this produces insurmountable difficulties. But the project of relativising the agent’s algebra to reflect their partial perspective need not be abandoned: the algebra can be relativised, not to the agent’s awareness state, but to what we might call their subjective modality.

Studies of visual event individuation often consider people’s representations of activities involving agents performing complex tasks. Concomitantly, theories of event individuation emphasize predictions about agents’ intentions. Studies that have examined simple, non-agential occurrences leave open the possibility that principles of visual object individuation play a role in visual event individuation. To unearth principles that may be sufficient for event individuation and that are distinct both from predictions about agents’ intentions and from visual object individuation, we draw on and extend studies that reveal object and event representation to be deeply analogous in our cognitive economy. We provide evidence that ‘temporal shaping’ is a sufficient low-level perceptual criterion for the visual individuation of events. In our study, temporal shaping is effected by the introduction of pauses into an otherwise continuous process. Future studies should address other visual mechanisms for introducing temporal shaping (e.g., color changes).

Landau and Peierls wrote down the Hamiltonian of a simplified version of quantum electrodynamics in the particle-position representation. We present a multi-time version of their Schrödinger equation, which bears several advantages over their original equation: the time evolution equations are simpler and more natural; they are more transparent with respect to the choice of gauge; and, perhaps most importantly, they are manifestly Lorentz covariant. We discuss properties of the multi-time equations. Along the way, we also discuss the Lorentz covariant 3d Dirac delta distribution for spacelike surfaces and the inner product of photon wave functions on spacelike surfaces in an arbitrary gauge.

In this paper I will address three topics in the logic of conditionals. The first is the question whether the class of ‘reasonable’ probability functions must be closed under conditionalization. The second topic is the character of logical consequence when probabilities of conditionals come into play. The third is more specific: I want to present a challenge to the possible worlds approach in formal semantics, in favor of an algebraic approach. For this I will use as a case study Alan Hájek’s views on counterfactual conditionals, and the problems infinity raises for them. Included in this will be reasons to expect algebras of propositions to be incomplete algebras. Throughout I will use as foil what is known variously as Stalnaker’s Thesis, or the Conditional Construal of Conditional Probability (CCCP). That is the thesis that the probability of a conditional A → B is the conditional probability of B given A, when defined. That the CCCP is tenable for a reasonable logic of conditionals I will presuppose in the body of the paper, but I will present its credentials in the Appendix. The CCCP is to be distinguished from the Extended Stalnaker’s Thesis, or Extended CCCP, that the conditional probability of A → B given C equals the conditional probability of B given A and C. That extended thesis has been demolished again and again, and will appear here only in a note, to be dismissed.
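The tension that drives the CCCP debate can be seen in a toy finite model. In this sketch, the three worlds, the probability measure, and the hand-picked Stalnaker-style selection function are all invented for illustration: once the conditional A → B is fixed as a set of worlds via a selection function, its unconditional probability need not equal the conditional probability of B given A.

```python
from fractions import Fraction as F

# Three worlds; propositions A and B are sets of worlds; P is a measure.
worlds = [1, 2, 3]
P = {1: F(1, 2), 2: F(1, 4), 3: F(1, 4)}
A = {1, 2}
B = {1}

def prob(prop):
    """Probability of a proposition (a set of worlds)."""
    return sum(P[w] for w in prop)

def cond_prob(b, a):
    """Conditional probability P(b | a) = P(a & b) / P(a)."""
    return prob(a & b) / prob(a)

# A Stalnaker-style selection function: sel[w] is "the closest A-world
# to w", chosen by hand here purely for illustration.
sel = {1: 1, 2: 2, 3: 1}

# The conditional A -> B holds at w iff the selected A-world is a B-world.
conditional = {w for w in worlds if sel[w] in B}

print(prob(conditional), cond_prob(B, A))  # prints 3/4 2/3 -- they differ
```

With this measure and selection function, P(A → B) = 3/4 while P(B|A) = 2/3, so no single set of worlds plays the role the CCCP demands for every measure at once; this is the kind of pressure that motivates looking beyond the possible worlds approach.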

The other night I spoke at a quantum computing event and was asked—for the hundredth time? the thousandth?—whether I agreed that the quantum algorithm called QAOA was poised to revolutionize industries by finding better solutions to NP-hard optimization problems. …

Human vision can detect a single photon, but the minimal exposure required to extract meaning from stimulation remains unknown. This requirement cannot be characterised by stimulus energy, because the system is differentially sensitive to attributes defined by configuration rather than physical amplitude. Determining minimal exposure durations required for processing various stimulus attributes can thus reveal the system’s priorities. Using a tachistoscope enabling arbitrarily brief displays, we establish minimal durations for processing human faces, a stimulus category whose perception is associated with several well-characterised behavioural and neural markers. Neural and psychophysical measures show a sequence of distinct minimal exposures for stimulation detection, object-level detection, face-specific processing, and emotion-specific processing. Resolving ongoing debates, face orientation affects minimal exposure but emotional expression does not.

This paper challenges the notion of emergent time in quantum cosmology by examining the reconciliation of the timeless Wheeler-DeWitt equation with the Universe’s dynamical evolution. It critically evaluates the analogy between the Wheeler-DeWitt and Klein-Gordon equations, highlighting challenges for the identification of an emergent time parameter. The paper concludes that refining this analogy may lead to a better understanding of emergent time in quantum cosmology, though one still not free of complications.

The concept of information was introduced in the middle of the last century by Shannon, and since then an entire branch of research has developed into what is called the Mathematical Theory of Communication, which studies the amount of information exchanged in a communication channel. In this article we use the concept of information to analyze the conceptual change that occurred with the Copernican Revolution, limiting ourselves to the concept of Physical Object and using the Dynamic Frames developed by Barsalou in Cognitive Science.
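Shannon's measure of the amount of information can be computed in a few lines. This is a minimal sketch of the standard entropy formula; the coin example and the function name `shannon_entropy` are purely illustrative:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss; a biased coin,
# being more predictable, carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.469
```

The drop from 1.0 to about 0.469 bits is the quantitative sense in which a more predictable source conveys less information per symbol.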

Most believe that there are no empirical grounds that make the adoption of quantum logic necessary. Ian Rumfitt has further argued that this adoption is not possible either, for the proof that distribution fails in quantum mechanics is rule-circular or unsound. I respond to Rumfitt by showing that neither is the case: rule-circularity disappears when an appropriate semantics is considered, and soundness is restored by slightly modifying standard quantum mechanics. Thus, although this is indeed not necessary, it is nevertheless possible for a quantum logician to rationally adjudicate against classical logic.