-
In Part 1, I explained how statistical mechanics is connected to a rig whose operations depend on a real parameter T and approach those of the ‘tropical rig’, with operations min and +, as T → 0. I also explained my hope that if we take equations from classical statistical mechanics, expressed in terms of this T-dependent rig, and let T → 0, we get equations in thermodynamics. …
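Assuming the standard log-sum-exp deformation (my assumption; the post's own formulas are elided here), the tropical limit can be checked numerically:

```python
import math

# Numerical sketch, assuming the usual log-sum-exp deformation: the
# T-dependent "addition"  x (+)_T y = -T * ln(e^(-x/T) + e^(-y/T))
# tends to min(x, y) as T -> 0, while ordinary + plays the role of
# multiplication; this is the sense in which the rig "tropicalises".
def oplus(x, y, T):
    m = min(x, y)                  # factor out the min for numerical stability
    return m - T * math.log(math.exp(-(x - m) / T) + math.exp(-(y - m) / T))

for T in (1.0, 0.1, 0.01):
    print(T, oplus(2.0, 3.0, T))   # approaches min(2.0, 3.0) = 2.0 from below
```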
-
I’m trying to work out how classical statistical mechanics can reduce to thermodynamics in a certain limit. I sketched out the game plan in Part 1 but there are a lot of details to hammer out. While I’m doing this, let me stall for time by explaining more precisely what I mean by ‘thermodynamics’. …
-
This paper proves normalisation theorems for intuitionist and classical negative free logic, without and with the operator ι for definite descriptions. Rules specific to free logic give rise to new kinds of maximal formulas in addition to those familiar from standard intuitionist and classical logic. When ι is added it must be ensured that reduction procedures involving replacements of parameters by terms do not introduce new maximal formulas of higher degree than the ones removed. The problem is solved by a rule that permits restricting these terms in the rules for ∀, ∃ and ι to parameters or constants. A restricted subformula property for deductions in systems without ι is considered. It is improved upon by an alternative formalisation of free logic building on an idea of Jaśkowski’s. In the classical system the rules for ι require treatment known from normalisation for classical logic with ∨ or ∃. The philosophical significance of the results is also indicated.
-
In 2016 Vladimir Voevodsky sent the author an email message where he explained his conception of mathematical structure using a historical example borrowed from the Commentary to the First Book of Euclid’s Elements by Proclus; this message was followed by a short exchange where Vladimir clarified his conception of structure. In this Chapter Voevodsky’s historical example is explained in detail, and the relevance of Voevodsky’s conception of mathematical structure in Homotopy Type Theory is shown. The Chapter also discusses some related historical and philosophical issues raised by Vladimir Voevodsky in the same email exchange. This includes a comparison of Voevodsky’s conception of mathematical structure with other conceptions of structure found in the current literature. The concluding part of this Chapter includes relevant fragments of the email exchange between Vladimir Voevodsky and the author.
-
A couple of years ago I showed how to construct hyperreal finitely additive probabilities on infinite sets that satisfy certain symmetry constraints and have the Bayesian regularity property that every possible outcome has non-zero probability. …
-
The evolutionary relationships between species are typically represented in the biological literature by rooted phylogenetic trees. However, a tree fails to capture ancestral reticulate processes, such as the formation of hybrid species or lateral gene transfer events between lineages, and so the history of life is more accurately described by a rooted phylogenetic network. Nevertheless, phylogenetic networks may be complex and difficult to interpret, so biologists sometimes prefer a tree that summarises the central tree-like trend of evolution. In this paper, we formally investigate methods for transforming an arbitrary phylogenetic network into a tree (on the same set of leaves) and ask which ones (if any) satisfy a simple consistency condition. This consistency condition states that if we add additional species into a phylogenetic network (without otherwise changing this original network) then transforming this enlarged network into a rooted phylogenetic tree induces the same tree on the original set of species as transforming the original network. We show that the LSA (lowest stable ancestor) tree method satisfies this consistency property, whereas several other commonly used methods (and a new one we introduce) do not. We also briefly consider transformations that convert arbitrary phylogenetic networks to another simpler class, namely normal networks.
-
This paper offers a new theory of donkey anaphora that does not include any dynamic component. Even if the approach is not dynamic, it retains a key aspect of the dynamic tradition, namely the view that information states include not just factual information about the world, but also information about discourse referents, e.g., variables. It also makes crucial use of plural assignment functions (sets of standard assignments, cf. van den Berg 1996; Nouwen 2003; Brasoveanu 2008). Unlike in dynamic approaches, sentences are evaluated as true or false relative to a pair (w, G), where w is a possible world and G is a plural assignment, with no reference to contexts or information states, and compositional semantics does not refer in any way to context update. In order to predict adequate meanings and felicity conditions, I combine two ingredients that have been used to account for presupposition projection, namely Trivalence (Peters 1979; Beaver and Krahmer 2001) and Schlenker’s Transparency Principle (Schlenker 2007, 2008).
-
We present Duper, a proof-producing theorem prover for Lean based on the superposition calculus. Duper can be called directly as a terminal tactic in interactive Lean proofs, but is also designed with proof reconstruction for a future Lean hammer in mind. In this paper, we describe Duper’s underlying approach to proof search and proof reconstruction with a particular emphasis on the challenges of working in a dependent type theory. We also compare Duper’s performance to Metis’ on pre-existing benchmarks to give evidence that Duper is performant enough to be useful for proof reconstruction in a hammer.
-
Bayesian epistemology is broadly concerned with providing norms for rational belief and learning using the mathematics of probability theory. But many authors have worried that the theory is too idealized to accurately describe real agents. In this paper I argue that Bayesian epistemology can describe more realistic agents while retaining sufficient generality by introducing ideas from a branch of mathematics called computable analysis. I call this program computable Bayesian epistemology. I situate this program by contrasting it with an ongoing debate about ideal versus bounded rationality. I then present foundational ideas from computable analysis and demonstrate their usefulness by proving the main result: on countably generated spaces there are no computable, finitely additive probability measures. On this basis I argue that bounded agents cannot have finitely additive credences, and so countable additivity is the appropriate norm of rationality. I conclude by discussing prospects for this research program.
-
We say of an isolated macroscopic quantum system in a pure state ψ that it is in macroscopic thermal equilibrium if ψ lies in or close to a suitable subspace Heq of Hilbert space. It is known that every initial state ψ will eventually reach macroscopic thermal equilibrium and stay there most of the time (“thermalize”) if the Hamiltonian is non-degenerate and satisfies the appropriate version of the eigenstate thermalization hypothesis (ETH), i.e., that every eigenvector is in macroscopic thermal equilibrium. Shiraishi and Tasaki recently proved the ETH for a certain perturbation H_fF^θ of the Hamiltonian H_fF of N ≫ 1 free fermions on a one-dimensional lattice. The perturbation is needed to remove the high degeneracies of H_fF. Here, we point out that also for degenerate Hamiltonians, all ψ thermalize if the ETH holds for every eigenbasis, and we prove that this is the case for H_fF. On top of that and more generally, we develop another strategy of proving thermalization, inspired by the fact that there is one eigenbasis of H_fF for which the ETH can be proven more easily and with smaller error bounds than for the others. This strategy applies to arbitrarily small generic perturbations of H_fF, which seem no less realistic than H_fF^θ, and to arbitrary spatial dimensions. In fact, we consider any given H0, suppose that the ETH holds for some but not necessarily every eigenbasis of H0, and add a small generic perturbation, H = H0 + λV with λ ≪ 1. Then, although H (which is non-degenerate) may still not satisfy the ETH, we show that nevertheless (i) every ψ thermalizes for most perturbations V, and more generally, (ii) for any subspace Hν (such as one corresponding to a non-equilibrium macro state), most perturbations V are such that most ψ from Hν thermalize.
-
Theories of qualitative probability provide a justification for the use of numerical probabilities to represent an agent’s degrees of belief. If a qualitative probability relation satisfies a set of well-known axioms then there is a probability measure that is compatible with that relation. In the particular case of subjective probability this means that we have sufficient conditions for representing an agent as having probabilistic beliefs. But the classical results are not constructive; there is in general no method for calculating the compatible measure from the qualitative relation. To address this problem this paper introduces the theory of computable qualitative probability. I show that there is an algorithm that computes a probability measure from a qualitative relation in highly general circumstances. Moreover, I show that, given a natural computability requirement on the qualitative relation, the resulting probability measure is also computable. Since computable probability is of growing interest in Bayesian epistemology, this result provides a valuable interpretation of that notion.
-
We consider and apply a multidimensional discrete-time delay autonomous third-order nonlinear vector difference equation system, where the orthogonal change after two reflections is given by a vector cross product leading to spinor rotations X_i − X_{i−2} = −X_{i−1} × F_i(X_{i−1}, X_{i−2}, X_{i−3}) ∈ R^3 or ∈ R^7, where the symmetry invariant C = X_i · X_{i−1} = X_{i−1} · X_{i−2} = ... allows at every step for a polar charge generating sign flip including expansion/contraction by a scalar factor. Using the Frenet frame approach defining orthogonal co-moving components with torsion and curvature parameters, both the orthogonal frame and the change of the frame are represented recurrently by the three position memory terms. The necessary cross and dot vector product (split-)algebra is encoded in variable multiplication tables in 3d and 7d. We discuss two special F_i types showing stable point densities, which are drifting limit cycles with sub-cycles, where the resulting smooth orbital spinor dynamics shows discrete atomic-type orbital eigenstates or local waves with characteristic numbers, non-local reflection, instability, hysteresis, interaction, helical emissions, and transition to chaos. Limit-cycle modular arithmetic with equivalence classes characterizes the resulting point groups, rings, or N_j-gons, where sub-cycles appear on three levels with a total limit cycle length given by the product of the three sub-cycle lengths.
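The structure of the recurrence can be sketched numerically. The F_i below is a hypothetical placeholder (the paper's two special F_i types are not reproduced here); the point is only the structural fact that each increment, being a cross product with X_{i−1}, is orthogonal to X_{i−1}:

```python
import numpy as np

# Minimal sketch of the recurrence X_i - X_{i-2} = -X_{i-1} x F_i in R^3,
# with a hypothetical placeholder F_i built from the position memory terms.
def F(x1, x2, x3):
    v = x2 + x3                       # placeholder memory term
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
X = [rng.standard_normal(3) for _ in range(3)]    # seeds X_0, X_1, X_2
for i in range(3, 50):
    step = -np.cross(X[-1], F(X[-1], X[-2], X[-3]))
    # the increment is orthogonal to X_{i-1}, up to rounding error
    assert abs(step @ X[-1]) < 1e-9 * (np.linalg.norm(step) * np.linalg.norm(X[-1]) + 1)
    X.append(X[-2] + step)
print(len(X))  # 50
```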
-
In a 1966 lecture, “What Are Logical Notions?”, Tarski proposed the following criterion of logicality: Invariance under Permutation: A notion is logical iff it is invariant under all permutations of the individuals in the “world” (or universe of discourse). By “notions” Tarski understood not linguistic or conceptual entities but objects of the kind referred to by such entities, i.e., objects in the world, including individuals, properties (sets), relations, and functions. “World” he understood as including both physical and mathematical objects and as forming a type-theoretic hierarchy, based on Principia Mathematica or a similar theory. In the present context it will be convenient to view objects as operators (characteristic functions representing them) and to use standard set theory with urelements rather than Principia Mathematica as our background theory.
-
I have never taken the contributions of Hermann Weyl to the foundations of intuitionistic mathematics at all seriously. For one thing, he was a turncoat, the Benedict Arnold of mathematical intuitionism. ... Weyl embraced (a ragged selection from) Brouwer’s ideas with his usual grandiose fanfare. However – by 1924 or so – he had slunk away from them ignominiously. ... As if he had not inflicted enough damage upon Brouwer’s intuitionism, Weyl set out to clarify the murk of the Brouwerian philosophy by folding into it liberal doses of the intellectually poisonous mire of Husserlian phenomenology. In that, Weyl only compounded his treachery and, like his latterday followers [Van Atten, D. van Dalen, & Tieszen 2002], ramped up the crazy. Without question, there is a hell for mathematicians – and Hermann Weyl now lies deep in its Ninth Circle. (McCarty 2021, 316)

There are good reasons for some of these claims. Unfortunately, the secondary literature remains largely ineffective in suppressing the intellectual toxins that McCarty, and many other philosophers, detected in Husserlian phenomenology. Most attempts to elucidate the effects of the latter on Weyl’s philosophical thinking have been just as successful. But as far as I can see, some of McCarty’s other claims are unwarranted. For one thing, it’s not clear that what he saw as treason can be held against Weyl. Surely some kinds of treason are morally justified, even obligatory. It seems to me that, like his character Aristides, Weyl treasured honesty more than loyalty. But I will pass no judgment on that, though I find McCarty’s vision of the mathematical underworld quite amusing, just as he may have intended it.
-
The formal, deductive system of ‘object theory’ (OT) has been put forward and applied in a number of papers since 1983. The canonical formulation, however, appears in Principia Logico-Metaphysica (Zalta m.s.), where it is first expressed in a 2nd-order quantified modal language.
-
We connect the problem of variable choice to the so-called Bertrand Paradox. We begin with a neglected argument by C.S. Peirce against conceptualism, or what we now call “objective Bayesianism.” Peirce skewers the conceptualist on a dilemma. Either the conceptualist will not be able to learn from her evidence or she will have to endorse a contradiction. In the argument for the second horn of his dilemma, Peirce produces an instance of Bertrand’s Paradox. We consider a version of the Paradox from White (2010) and his argument that the Principle of Indifference is unnecessary to generate Bertrand-style problems. We generalize these arguments and show that any non-trivial partition of the event-space has a paradox-generating counterpart. We observe that partitions are essential to variable definitions and argue that the problem of variable choice—previously thought only a problem for interventionists—is in fact a problem for everyone.
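The partition-dependence at the heart of Bertrand-style problems can be seen in a quick Monte Carlo illustration (my sketch, not the paper's construction):

```python
import numpy as np

# Bertrand's classic question: what is the probability that a "random chord"
# of the unit circle is longer than sqrt(3), the side of the inscribed
# equilateral triangle?  Three natural sampling methods, i.e. three choices
# of partition of the event space, give three different answers.
rng = np.random.default_rng(1)
n = 200_000

# (1) two independent uniform endpoints on the circle
a = rng.uniform(0, 2 * np.pi, n)
b = rng.uniform(0, 2 * np.pi, n)
len1 = 2 * np.abs(np.sin((a - b) / 2))

# (2) chord midpoint at a uniform distance from the centre
d = rng.uniform(0, 1, n)
len2 = 2 * np.sqrt(1 - d ** 2)

# (3) chord midpoint uniform over the disk
r = np.sqrt(rng.uniform(0, 1, n))
len3 = 2 * np.sqrt(1 - r ** 2)

p = [float(np.mean(L > np.sqrt(3))) for L in (len1, len2, len3)]
print(p)  # roughly [1/3, 1/2, 1/4]
```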
-
In this paper, I show how two modes of predication and quantification in a modal context allow one to (a) define what it is for an individual or relation to exist, (b) define identity conditions for properties and relations conceived hyperintensionally, (c) define identity conditions for individuals and prove the necessity of identity for both individuals and relations, (d) derive the central definition of free logic as a theorem, (e) define the essential properties of abstract objects and provide a framework for defining the essential properties of ordinary objects, and (f) derive a theory of truth. I also describe my indebtedness to the work of Terence Parsons, and take the opportunity to advance the discussion in connection with an objection raised to the theory of essential properties.
-
A bureaucracy has to determine the values of many decision variables while satisfying a set of constraints. The bureaucracy is not assumed to have any objective function beyond achieving a feasible solution, which can be viewed as “satisficing” à la Simon (1955). We assume that the variables are integer-valued and the constraints are linear. We show that simple and (arguably) natural versions of the problem are already NP-hard. We therefore look at decentralized decisions, where each office controls but one decision variable and can determine its value as a function of its past values. However, an attempt to consult more than a single past case can lead to Condorcet-style consistency problems. We prove an Arrovian result, showing that, under certain conditions, feasibility is guaranteed only if all offices mimic their decisions in the same past case. This result can be viewed as explaining a status quo bias.
-
A general mathematical framework, based on countable partitions of the natural numbers [1], is presented that allows one to provide a semantics for propositional languages. It has the particularity of allowing both the valuations and the interpretation sets for the connectives to discriminate the complexity of formulas. This allows different adequacy criteria to be used to assess formulas associated with the same connective but differing in complexity. The presented method can be adapted to a potentially infinite number of connectives and truth values; therefore, it can be considered a general framework for providing semantics for several of the known logic systems (e.g., LC, Ł3, LP, FDE). The presented semantics converge to different standard semantics if the complexity-separation procedure is annulled. Therefore, it can be understood as a framework that allows greater precision (in complexity terms) with respect to formula satisfaction. Naturally, because of how it is built, it can be incorporated into non-deterministic semantics. The presented procedure also allows generating valuations that grant a different truth value to each formula of the propositional language. As a positive side effect, our method allows a constructive proof of the equipotence between N and N^n for every natural n.
-
Introduction. In the previous paper it was suggested that Gödel’s phenomenological program has in recent years usually been replaced by its merely mathematical component. Here, we intend to clarify a possible path for this program by means of an analysis of the context principle. In a few words, Gödel seems to provide some kind of modal flavour for certain statements of set theory, namely, those that are ‘forced upon us’, say the axioms sanctioned by the so-called intrinsic criteria. In general, we claim that modalizing notions (not exclusively propositions) leads to conceptions such as those of the intuitionists or phenomenologists. Therefore, the connection of Gödel’s phenomenological program with these ideas seems natural, to say the least. Nevertheless, Gödel can be read as defending modal collapse to some degree, and so the interesting appeal of modalizing seems lost. It is here that we defend that the context principle, read psychologically (or, to be more precise, in an intentional manner), is of some help: the possibility of natural extensions of ZFC –as a psychological fact– is granted by how we work with the formal system and how certain notions appear in our framework.
-
Inferential expressivism makes a systematic distinction between inferences that are valid qua preserving commitment and inferences that are valid qua preserving evidence. I argue that the characteristic inferences licensed by the principle of comprehension, from x is P to x is in the extension of P and vice versa, fail to preserve evidence, but do preserve commitment. Taking this observation into account allows one to phrase inference rules for unrestricted comprehension without running into Russell’s paradox. In the resulting logic, one can derive full second-order arithmetic. Thus, it is possible to derive classical arithmetic in a consistent logic with unrestricted comprehension.
-
Consider a typical case of relaxation of a system, initially out of thermal equilibrium, to a state of thermal equilibrium. It might, for instance, involve two objects, initially at different temperatures, placed in thermal contact, which, after a while, come to have the same temperature. I’d like to highlight two things about processes of this sort. First, they represent a lost opportunity to obtain work from the system; one could have used the two objects as reservoirs for a heat engine, extracting heat from the warmer one, converting some to work and discarding the rest into the cooler one, until their temperatures are equal. The flip side of this coin is: once the temperatures have equalized, one has to do work or expend some resource that could be used to do work, in order to restore the initial state. We call processes like this dissipatory. Spontaneous equilibration involves dissipation.
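The lost opportunity can be made quantitative with a standard textbook example (my illustration, not from the passage): two identical blocks of heat capacity C at temperatures T1 > T2. Direct contact equalises them at the arithmetic mean and yields no work; a reversible heat engine run between them instead ends at the geometric mean, delivering the energy difference as work.

```python
import math

# Two identical blocks, heat capacity C (J/K), at temperatures T1 and T2 (K).
# Direct contact: final temperature is the arithmetic mean, no work extracted.
# Reversible engine: entropy-conserving operation ends at the geometric mean,
# and the leftover energy C*(T1 + T2 - 2*sqrt(T1*T2)) comes out as work.
C, T1, T2 = 1.0, 400.0, 100.0
T_contact = (T1 + T2) / 2            # final temp under pure heat flow
T_engine = math.sqrt(T1 * T2)        # final temp under reversible operation
W = C * (T1 + T2 - 2 * T_engine)     # work extracted by the engine
print(T_contact, T_engine, W)        # 250.0 200.0 100.0
```

Equilibrating by contact forfeits W, and restoring the initial state afterwards costs at least that much work: this is the dissipation the passage describes.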
-
We draw attention to a tendency of ancient Peripatetic logic echoing into the middle ages and beyond, to disregard limiting cases. Specific instances of this attitude have often been noticed, but they do not seem to have been taken together as illustrating a general pattern. We review the variety of ways in which the tendency manifested itself, discuss its apparent absence or at least reduced presence in Stoic logic, note some exceptions, and speculate on the reasons for its existence. The note raises as many questions as it answers, and should be seen as an invitation to further discussion.
-
The realization of large-scale complex engineered systems is contingent upon satisfaction of the preferences of the stakeholder. With numerous decisions being involved in all the aspects of the system lifecycle, from conception to disposal, it is critical to have an explicit and rigorous representation of stakeholder preferences to be communicated to key personnel in the organizational hierarchy. Past work on stakeholder preference representation and communication in systems engineering has been primarily requirement-driven. More recent value-based approaches still do not offer a rigorous framework on how to represent stakeholder preferences but assume that an overarching value function that can precisely capture stakeholder preferences exists. This article provides a formalism based on modal preference logic to aid in rigorous representation and communication of stakeholder preferences. Formal definitions for the different types of stakeholder preferences encountered in a systems engineering context are provided in addition to multiple theorems that improve the understanding of the relationship between stakeholder preferences and the solution space.
-
We begin with a brief explanation of our proof-theoretic criterion of paradoxicality—its motivation, its methods, and its results so far. It is a proof-theoretic account of paradoxicality that can be given in addition to, or alongside, the more familiar semantic account of Kripke. It is a question for further research whether the two accounts agree in general on what is to count as a paradox. It is also a question for further research whether and, if so, how the so-called Ekman problem bears on the investigations here of the intensional paradoxes. Possible exceptions to the proof-theoretic criterion are Prior’s Theorem and Russell’s Paradox of Propositions—the two best-known ‘intensional’ paradoxes. We have not yet addressed them. We do so here. The results are encouraging.
-
Informally rigorous mathematical reasoning is relevant. So too should be the premises to the conclusions of formal proofs that regiment it. The rule Ex Falso Quodlibet induces spectacular irrelevance. We therefore drop it. The resulting systems of Core Logic ℂ and Classical Core Logic ℂ⁺ can formalize all the informally rigorous reasoning in constructive and classical mathematics respectively. We effect a revised match-up between deducibility in Classical Core Logic and a new notion of relevant logical consequence. It matches the deducibility relation of Classical Core Logic better than does the Tarskian notion of consequence. It is implosive, not explosive.
-
In this note we extend a remarkable result of Brauer [2024] concerning propositional Classical Core Logic. We show that it holds also at first order. This affords a soundness and completeness result for Classical Core Logic. The ℂ⁺-provable sequents are exactly those that are uniform substitution instances of perfectly valid sequents, i.e. sequents that are valid and that need every one of their sentences in order to be so. Brauer [2020] showed that the notion of perfect validity itself is unaxiomatizable. In the Appendix we use his method to show that our notion of relevant validity in Tennant [2024] is likewise unaxiomatizable. It would appear that the taking of substitution instances is an essential ingredient in the construction of a semantical relation of consequence that will be axiomatizable—and indeed, by the rules of proof for Classical Core Logic.
-
This study seeks to reveal the proper source of the (correct) rules of natural deduction (and their associated rules of the sequent calculus). Perhaps surprisingly, this source consists of just the familiar truth tables (deriving from Frege). These tables can be construed inferentially. The primitive steps of value-computation correspond to primitive steps of ‘inference’. We shall call them, however, primitive steps (or rules) of evaluation. These can be steps of verification or of falsification. The rules of evaluation constitute the inductive clauses in a metalinguistic co-inductive definition of model-relative verifications and falsifications.
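The idea of reading truth-table rows as primitive steps of verification and falsification can be conveyed with a toy evaluator (an informal sketch of the general idea, not Tennant's formal system):

```python
# Each sub-evaluation returns 'V' (a verification) or 'F' (a falsification)
# relative to a model, mirroring how each truth-table row licenses a
# primitive step of evaluation.
def evaluate(formula, model):
    op = formula[0]
    if op == 'atom':
        return 'V' if model[formula[1]] else 'F'
    if op == 'not':
        # a verification of A is a falsification of not-A, and vice versa
        return 'F' if evaluate(formula[1], model) == 'V' else 'V'
    if op == 'and':
        # any falsified conjunct falsifies the whole conjunction
        vals = [evaluate(sub, model) for sub in formula[1:]]
        return 'V' if all(v == 'V' for v in vals) else 'F'
    if op == 'or':
        # any verified disjunct verifies the whole disjunction
        vals = [evaluate(sub, model) for sub in formula[1:]]
        return 'V' if any(v == 'V' for v in vals) else 'F'
    raise ValueError(f'unknown connective: {op}')

f = ('or', ('atom', 'p'), ('not', ('atom', 'q')))
print(evaluate(f, {'p': False, 'q': False}))  # V: the falsity of q verifies not-q
```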
-
We furnish a core-logical development of the ‘Gödel-numbering’ framework that allows metamathematicians to attain limitative results about arithmetical truth without incorporating a genuine truth-predicate into the language in a way that would lead to semantic closure. We show how Tarski’s celebrated theorem on the arithmetical undefinability of arithmetical truth can be established using only Core Logic in both the object language and the metalanguage. We do so at a high level of abstraction, by augmenting the usual first-order language of arithmetic with a primitive predicate Tr and then showing how it cannot be a truth-predicate for the augmented language.