
2022 | Book

Logical Foundations of Computer Science

International Symposium, LFCS 2022, Deerfield Beach, FL, USA, January 10–13, 2022, Proceedings


About this book

This book constitutes the refereed proceedings of the International Symposium on Logical Foundations of Computer Science, LFCS 2022, held in Deerfield Beach, FL, USA, in January 2022.
The 23 revised full papers were carefully reviewed and selected from 35 submissions. The scope of the Symposium is broad and includes constructive mathematics and type theory; homotopy type theory; logic, automata, and automatic structures; computability and randomness; logical foundations of programming; logical aspects of computational complexity; parameterized complexity; logic programming and constraints; automated deduction and interactive theorem proving; logical methods in protocol and program verification; logical methods in program specification and extraction; domain theory logics; logical foundations of database theory; equational logic and term rewriting; lambda and combinatory calculi; categorical logic and topological semantics; linear logic; epistemic and temporal logics; intelligent and multiple-agent system logics; logics of proof and justification; non-monotonic reasoning; logic in game theory and social software; logic of hybrid systems; distributed system logics; mathematical fuzzy logic; system design logics; other logics in computer science.

Table of Contents

Frontmatter
A Non-hyperarithmetical Gödel Logic
Abstract
Let \(\mathsf{{G}}_\downarrow \) be the Gödel logic whose set of truth values is \(\{0\} \cup \{1/n : n \ge 1\}\). Baaz-Leitsch-Zach have shown that \(\mathsf{{G}}_\downarrow \) is not recursively axiomatizable and Hájek showed that it is not arithmetical. We find the optimal strengthening of their theorems and prove that the set of validities of \(\mathsf{{G}}_\downarrow \) is \(\varPi ^1_1\)-complete and the set of satisfiable formulas in \(\mathsf{{G}}_\downarrow \) is \(\varSigma ^1_1\)-complete.
Juan Pablo Aguilera, Jan Bydzovsky, David Fernández-Duque
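For orientation, the standard first-order Gödel semantics over this truth-value set (textbook material, not a claim about the paper’s specific definitions; \(V_\downarrow \) is our shorthand) evaluates connectives by minima, maxima and a residuated implication:
\[
V_\downarrow = \{0\} \cup \{1/n : n \ge 1\}, \qquad v(A \wedge B) = \min (v(A), v(B)), \qquad v(A \vee B) = \max (v(A), v(B)),
\]
\[
v(A \rightarrow B) = \begin{cases} 1 & \text{if } v(A) \le v(B),\\ v(B) & \text{otherwise,} \end{cases} \qquad v(\forall x\, A) = \inf _{d} v(A[d/x]), \qquad v(\exists x\, A) = \sup _{d} v(A[d/x]),
\]
and a formula is valid in \(\mathsf{{G}}_\downarrow \) when it evaluates to 1 under every such interpretation.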
Andrews Skolemization May Shorten Resolution Proofs Non-elementarily
Abstract
In this paper we construct a sequence of formulas \(F_1, F_2, \ldots \) with resolution proofs \(\pi _1, \pi _2, \ldots \) of these formulas after Andrews Skolemization, such that the complexity of resolution proofs \(\pi '_1, \pi '_2, \ldots \) of these formulas after structural Skolemization admits no elementary bound in the complexity of \(\pi _1, \pi _2, \ldots \). The proofs are based on the elementary relation of resolution derivations with Andrews Skolemization to cut-free \(\mathbf{LK}^{+}\)-derivations and of resolution derivations with structural Skolemization to cut-free \(\mathbf{LK}\)-derivations. Therefore, this paper develops an application of the concept of only globally sound calculi to automated theorem proving.
Matthias Baaz, Anela Lolic
The Isomorphism Problem for FST Injection Structures
Abstract
An injection structure \({\mathcal A}= (A,f)\) is a set A together with a one-place one-to-one function f. \({\mathcal A}\) is a Finite State Transducer (abbreviated FST) injection structure if A is a regular set, that is, the set of words accepted by some finite automaton, and f is realized by a deterministic finite-state transducer. Automatic relational structures have been well-studied along with the isomorphism problem for automatic structures. For an FST injection structure \((A,f)\), the graph of f is not necessarily automatic. We continue the study of the complexity of FST injection structures by showing that the isomorphism problem for unary FST injection structures is decidable in quadratic time in the size (number of states) of the FST.
Douglas Cenzer, Richard Krogman
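A minimal illustrative example of such a structure (ours, not taken from the paper): over the unary alphabet \(\{a\}\), take
\[
A = a^* = \{\varepsilon , a, aa, \ldots \}, \qquad f(a^n) = a^{n+1}.
\]
Here A is regular, f is one-to-one and is realized by a deterministic transducer that copies its input and appends a single a at the end of the word, and \((A,f)\) consists of a single \(\omega \)-orbit \(\varepsilon \mapsto a \mapsto aa \mapsto \cdots \).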
Justification Logic and Type Theory as Formalizations of Intuitionistic Propositional Logic
Abstract
We explore two ways of formalizing Kreisel’s addendum to the Brouwer-Heyting-Kolmogorov interpretation. To do this we compare Artemov’s justification logic with simply typed \(\lambda \) calculus, by introducing a map from justification terms into \(\lambda \) terms, which can be viewed as a method of extracting the computational content of the justification terms. Then we examine the interpretation of Kreisel’s addendum in justification logic along with the image of the resulting justification terms under our map.
Neil J. DeBoer
Hyperarithmetical Worm Battles
Abstract
Japaridze’s provability logic \(\mathrm {GLP}\) has one modality [n] for each natural number and has been used by Beklemishev for a proof theoretic analysis of Peano arithmetic (\(\mathrm {PA}\)) and related theories. Among other benefits, this analysis yields the so-called Every Worm Dies (\(\mathrm {EWD}\)) principle, a natural combinatorial statement independent of \(\mathrm {PA}\). Recently, Beklemishev and Pakhomov have studied notions of provability corresponding to transfinite modalities in \(\mathrm {GLP}\). We show that indeed the natural transfinite extension of \(\mathrm {GLP}\) is sound for this interpretation, and yields independent combinatorial principles for the second order theory \(\mathrm {ACA}\) of arithmetical comprehension with full induction. We also provide restricted versions of \(\mathrm {EWD}\) related to the fragments \(\mathrm {I}\Sigma _n\) of Peano arithmetic.
David Fernández-Duque, Konstantinos Papafilippou, Joost J. Joosten
Parametric Church’s Thesis: Synthetic Computability Without Choice
Abstract
In synthetic computability, pioneered by Richman, Bridges, and Bauer, one develops computability theory without an explicit model of computation. This is enabled by assuming an axiom equivalent to postulating a function \(\phi \) to be universal for the space \(\mathbb{N} \rightarrow \mathbb{N}\) (CT\(_\phi \), a consequence of the constructivist axiom CT), Markov’s principle, and at least the axiom of countable choice. Assuming CT and countable choice invalidates the law of excluded middle, thereby also invalidating classical intuitions prevalent in textbooks on computability. On the other hand, results like Rice’s theorem are not provable without a form of choice.
In contrast to existing work, we base our investigations in constructive type theory with a separate, impredicative universe of propositions where countable choice does not hold and thus a priori CT\(_{\phi }\) and the law of excluded middle seem to be consistent. We introduce various parametric strengthenings of CT\(_{\phi }\), which are equivalent to assuming CT\(_\phi \) and an \(S^m_n\) operator for \(\phi \) like in the \(S^m_n\) theorem. The strengthened axioms allow developing synthetic computability theory without choice, as demonstrated by elegant synthetic proofs of Rice’s theorem. Moreover, they seem to be not in conflict with classical intuitions since they are consequences of the traditional analytic form of CT.
Besides explaining the novel axioms and proofs of Rice’s theorem we contribute machine-checked proofs of all results in the Coq proof assistant.
Yannick Forster
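As a rough orientation, glossing over the treatment of partiality and step-indexing and using notation that may differ from the paper’s: CT\(_\phi \) can be read as saying that \(\phi \) enumerates all functions, while a parametric strengthening additionally asks for indices to be computable uniformly in a parameter, in the spirit of the \(S^m_n\) theorem:
\[
\mathrm{CT}_\phi :\ \forall f : \mathbb{N} \rightarrow \mathbb{N}.\ \exists n.\ \forall x.\ \phi _n(x) = f(x), \qquad
\text{parametric form:}\ \forall f : \mathbb{N} \rightarrow \mathbb{N} \rightarrow \mathbb{N}.\ \exists \gamma : \mathbb{N} \rightarrow \mathbb{N}.\ \forall i\, x.\ \phi _{\gamma (i)}(x) = f(i,x).
\]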
Constructive and Mechanised Meta-Theory of Intuitionistic Epistemic Logic
Abstract
Artemov and Protopopescu proposed intuitionistic epistemic logic (IEL) to capture an intuitionistic conception of knowledge. By establishing completeness, they provided the base for a meta-theoretic investigation of IEL, which was continued by Krupski with a proof of cut-elimination, and Su and Sano establishing semantic cut-elimination and the finite model property. However, to the best of our knowledge, no analysis of these results in a constructive meta-logic has been conducted.
We aim to close this gap and investigate IEL in the constructive type theory of the Coq proof assistant. Concretely, we present a constructive and mechanised completeness proof for IEL, employing a syntactic decidability proof based on cut-elimination to constructivise the ideas from the literature. Following Su and Sano, we then also give constructive versions of semantic cut-elimination and the finite model property. Given our constructive and mechanised setting, all these results now bear executable algorithms. We expect that our methods used for mechanising cut-elimination and decidability also extend to other modal logics (and have verified this observation for the classical modal logic K).
Christian Hagemeier, Dominik Kirst
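For reference, the axiomatisation of IEL over intuitionistic propositional logic, as usually quoted from Artemov and Protopopescu (cited here from the literature, not from this paper), extends IPC with
\[
\mathbf{K}(A \rightarrow B) \rightarrow (\mathbf{K}A \rightarrow \mathbf{K}B), \qquad A \rightarrow \mathbf{K}A \ \ (\text{co-reflection}), \qquad \mathbf{K}A \rightarrow \lnot \lnot A \ \ (\text{intuitionistic reflection}),
\]
with modus ponens as the only rule; necessitation is derivable from co-reflection.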
A Parametrized Family of Tversky Metrics Connecting the Jaccard Distance to an Analogue of the Normalized Information Distance
Abstract
Jiménez, Becerra, and Gelbukh (2013) defined a family of “symmetric Tversky ratio models” \(S_{\alpha ,\beta }\), \(0\le \alpha \le 1\), \(\beta >0\). Each function \(D_{\alpha ,\beta }=1-S_{\alpha ,\beta }\) is a semimetric on the powerset of a given finite set.
We show that \(D_{\alpha ,\beta }\) is a metric if and only if \(0\le \alpha \le \frac{1}{2}\) and \(\beta \ge 1/(1-\alpha )\). This result is formally verified in the Lean proof assistant.
The extreme points of this parametrized space of metrics are \(J_1=D_{1/2,2}\), the Jaccard distance, and \(J_{\infty }=D_{0,1}\), an analogue of the normalized information distance of M. Li, Chen, X. Li, Ma, and Vitányi (2004).
Bjørn Kjos-Hanssen, Saroj Niraula, Soowhan Yoon
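As a hedged sanity check, using one common way of writing the symmetric Tversky ratio model with zero bias (the paper’s exact normalisation may differ), put \(c = |X \cap Y|\), \(m = \min (|X \setminus Y|, |Y \setminus X|)\) and \(M = \max (|X \setminus Y|, |Y \setminus X|)\):
\[
S_{\alpha ,\beta }(X,Y) = \frac{c}{c + \beta \,(\alpha m + (1-\alpha )M)},
\]
so \(D_{1/2,2} = \frac{m + M}{c + m + M} = \frac{|X \,\triangle \, Y|}{|X \cup Y|}\) is the Jaccard distance, while \(D_{0,1} = \frac{M}{c + M} = \frac{\max (|X \setminus Y|, |Y \setminus X|)}{\max (|X|, |Y|)}\) mirrors the normalized information distance \(\frac{\max (K(x \mid y), K(y \mid x))}{\max (K(x), K(y))}\).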
A Parameterized View on the Complexity of Dependence Logic
Abstract
In this paper, we investigate the parameterized complexity of model checking for Dependence Logic which is a well studied logic in the area of Team Semantics. We start with a list of nine immediate parameterizations for this problem, namely: the number of disjunctions (i.e., splits)/(free) variables/universal quantifiers, formula-size, the tree-width of the Gaifman graph of the input structure, the size of the universe/team, and the arity of dependence atoms. We present a comprehensive picture of the parameterized complexity of model checking and obtain a division of the problem into tractable and various intractable degrees. Furthermore, we also consider the complexity of the most important variants (data and expression complexity) of the model checking problem by fixing parts of the input.
Juha Kontinen, Arne Meier, Yasir Mahmood
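For readers unfamiliar with team semantics, the dependence atom is standardly evaluated on a team T, i.e. a set of assignments (this is the usual definition, not a detail specific to the paper):
\[
T \models \, =\!(\bar{x}, y) \iff \forall s, s' \in T:\ s(\bar{x}) = s'(\bar{x}) \ \Rightarrow \ s(y) = s'(y),
\]
that is, within the team the value of y is functionally determined by the values of \(\bar{x}\); the arity of the atom is the length of \(\bar{x}\).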
A Logic of Interactive Proofs
Abstract
We introduce the probabilistic two-agent justification logic \(\mathsf {IPJ}\), a logic in which we can reason about agents that perform interactive proofs. In order to study the growth rate of the probabilities in \(\mathsf {IPJ}\), we present a new method of parametrizing \(\mathsf {IPJ}\) over certain negligible functions. Further, our approach leads to a new notion of zero-knowledge proofs.
David Lehnherr, Zoran Ognjanović, Thomas Studer
Recursive Rules with Aggregation: A Simple Unified Semantics
Abstract
Complex reasoning problems are most clearly and easily specified using logical rules, but require recursive rules with aggregation such as counts and sums for practical applications. Unfortunately, the meaning of such rules has been a significant challenge, leading to many disagreeing semantics.
This paper describes a unified semantics for recursive rules with aggregation, extending the unified founded semantics and constraint semantics for recursive rules with negation. The key idea is to support simple expression of the different assumptions underlying different semantics, and orthogonally interpret aggregation operations using their simple usual meaning. We present a formal definition of the semantics, prove important properties of the semantics, and compare it with prior semantics. In particular, we present an efficient inference over aggregation that gives precise answers to all examples we have studied from the literature. We also applied our semantics to a wide range of challenging examples, and performed experiments on the most challenging ones, all confirming our analyzed results.
Yanhong A. Liu, Scott D. Stoller
Computational Properties of Partial Non-deterministic Matrices and Their Logics
Abstract
Incorporating non-determinism and partiality in the traditional notion of logical matrix semantics has proven to be decisive in a myriad of recent compositionality results in logic. However, several important properties which are known to be computable for finite matrices have not been studied in this wider context of partial non-deterministic matrices (PNmatrices).
This paper is dedicated to understanding how this generalization of the considered semantical structures affects the computational properties of basic problems regarding their induced logics, in particular their sets of theorems.
We will show that the landscape is quite rich, as some problems keep their computational status, for others the complexity increases, and for a few decidability is lost. Namely, we show that checking if the logics defined by two finite PNmatrices have the same theorems is undecidable. This latter result is obtained by reduction from the undecidable problem of checking universality of term-DAG-automata.
Sérgio Marcelino, Carlos Caleiro, Pedro Filipe
Soundness and Completeness Results for LEA and Probability Semantics
Abstract
In [2], a logical system called the logic of evidence aggregation (LEA) was introduced, along with an intended semantics for it called probability semantics. The goal was to describe probabilistic evidence aggregation in the setting of formal logic. However, as noted in that paper, LEA is not complete with respect to probability semantics. This leaves open the tasks of finding a sound and complete semantics for LEA and a proper axiomatization of probability semantics. In this paper we do both. We define a class of basic models called deductive basic models. We show LEA is sound and complete with respect to the class of deductive basic models. We also define an axiomatic system \(\mathsf{LEA}_+\) extending LEA and show it is sound and complete with respect to probability semantics.
Eoin Moore
On Inverse Operators in Dynamic Epistemic Logic
Abstract
We extend Dynamic Epistemic Logic with inverse operators \(\langle \!\langle \alpha ^{-1}\rangle \!\rangle \) of an action \(\alpha \) along the line of tense logics. The meaning of the formula \(\langle \!\langle \alpha ^{-1}\rangle \!\rangle \varphi \) is ‘\(\varphi \) is the case before an action \(\alpha \)’. This augmentation of expressivity enables us to capture important aspects of communication actions. We also propose its semantics using model transition systems provided in our previous work, which are a suitable framework for interpreting inverse operators. In this framework, we give several soundness/completeness correspondences, which lead to modular proofs of completeness of public announcement logic and epistemic action logic of Baltag-Moss-Solecki extended with inverse operators with respect to suitable classes of MTSs.
Shota Motoura, Shin-ya Katsumata
Betwixt Turing and Kleene
Abstract
Turing’s famous ‘machine’ model constitutes the first intuitively convincing framework for computing with real numbers. Kleene’s computation schemes S1–S9 extend Turing’s approach and provide a framework for computing with objects of any finite type. Various research programs have been proposed in which higher-order objects, like functions on the real numbers, are represented/coded as real numbers, so as to make them amenable to the Turing framework. It is then a natural question whether there is any significant difference between the Kleene approach and the Turing-approach-via-codes. Continuous functions being well-studied in this context, we study functions of bounded variation, which have at most countably many points of discontinuity. A central result is the Jordan decomposition theorem that a function of bounded variation on [0, 1] equals the difference of two monotone functions. We show that for this theorem and related results, the difference between the Kleene approach and the Turing-approach-via-codes is huge, in that full second-order arithmetic readily comes to the fore in Kleene’s approach, in the guise of Kleene’s quantifier \(\exists ^{3}\).
Dag Normann, Sam Sanders
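For reference, the classical notions involved (standard definitions, not the paper’s formalisation): f on [0, 1] has bounded variation when
\[
V_0^1(f) = \sup \Big \{ \sum _{i=0}^{n-1} |f(x_{i+1}) - f(x_i)| \ :\ 0 = x_0< x_1< \cdots < x_n = 1 \Big \} < \infty ,
\]
and Jordan’s theorem states that any such f can be written as \(f = g - h\) with \(g, h : [0,1] \rightarrow \mathbb{R}\) non-decreasing.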
Computability Models over Categories and Presheaves
Abstract
Generalising slightly the notions of a strict computability model and of a simulation between them, which were elaborated by Longley and Normann in [9], we define canonical computability models over certain categories and appropriate presheaves on them. We study the canonical total computability model over a category \(\mathcal {C}\) and a covariant presheaf on \(\mathcal {C}\), and the canonical partial computability model over a category \(\mathcal {C}\) with pullbacks and a pullback preserving, covariant presheaf on \(\mathcal {C}\). These computability models are shown to be special cases of a computability model over a category \(\mathcal {C}\) with a so-called base of computability and a pullback preserving, covariant presheaf on \(\mathcal {C}\). In this way Rosolini’s theory of dominions is connected with the theory of computability models. All our notions and results are dualised by considering certain (contravariant) presheaves on appropriate categories.
Iosif Petrakis
Reducts of Relation Algebras: The Aspects of Axiomatisability and Finite Representability
Abstract
In this paper, we show that the class of representable residuated semigroups has the finite representation property. That is, every finite representable residuated semigroup is representable over a finite base. This result gives a positive solution to Problem 19.17 from the monograph by Hirsch and Hodkinson [13].
We also show that the class of representable join semilattice-ordered semigroups is pseudo-universal and it has a recursively enumerable axiomatisation. For this purpose, we introduce representability games for join semilattice-ordered semigroups.
Daniel Rogozin
Between Turing and Kleene
Abstract
Turing’s famous ‘machine’ model constitutes the first intuitively convincing framework for computing with real numbers. Kleene’s computation schemes S1–S9 extend Turing’s approach to computing with objects of any finite type. Both frameworks have their pros and cons, and it is a natural question whether there is an approach that marries the best of both the Turing and Kleene worlds. In answer to this question, we propose a considerable extension of the scope of Turing’s approach. Central is a fragment of the Axiom of Choice involving continuous choice functions, going back to Kreisel-Troelstra and intuitionistic analysis. Put another way, we formulate a relation ‘is computationally stronger than’ involving third-order objects that overcomes (many of) the pitfalls of the Turing and Kleene frameworks.
Sam Sanders
Propositional Dynamic Logic with Quantification over Regular Computation Sequences
Abstract
We extend test-free regular propositional dynamic logic with operators expressing combinations of existential and universal quantifiers quantifying over computation sequences represented by a given regular expression and states accessible via these computation sequences. This extended language is able to express that there is a computation sequence represented by a given regular expression that leads only to states where a given formula is satisfied, or that for all computation sequences represented by a given regular expression there is a state accessible via the computation sequence where a given formula is satisfied. Such quantifier combinations are essential in expressing, for instance, that a given non-deterministic finite automaton accepts all words of a given regular language or that there is a specific sequence of actions instantiating a plan expressed by a regular expression that is guaranteed to accomplish a certain goal. The existential-universal quantifier combination is modelled by neighborhood functions. We prove that a rich fragment of our logic is decidable and \( EXPTIME \)-complete by embedding the fragment into deterministic propositional dynamic logic.
Igor Sedlár
Finite Generation and Presentation Problems for Lambda Calculus and Combinatory Logic
Abstract
We solve several finite generation and presentation problems for lambda calculus and combinatory logic.
Rick Statman
Exact and Parameterized Algorithms for Read-Once Refutations in Horn Constraint Systems
Abstract
In this paper, we discuss exact and parameterized algorithms for the problem of finding a read-once refutation in an unsatisfiable Horn Constraint System (HCS). Recall that a linear constraint system \(\mathbf{A \cdot x \ge b}\) is said to be a Horn constraint system, if each entry in \(\mathbf{A}\) belongs to the set \(\{0,1,-1\}\) and at most one entry in each row of \(\mathbf{A}\) is positive. In this paper, we examine the importance of constraints in which more variables have negative coefficients than have positive coefficients. There exist several algorithms for checking whether a Horn constraint system is feasible. To the best of our knowledge, these algorithms are not certifying, i.e., they do not provide a certificate of infeasibility. Our work is concerned with providing a specialized class of certificates called “read-once refutations”. In a read-once refutation, each constraint defining the HCS may be used at most once in the derivation of a refutation. The problem of checking if an HCS has a read-once refutation (HCS ROR) has been shown to be NP-hard. We analyze the HCS ROR problem from two different algorithmic perspectives, viz., parameterized algorithms and exact exponential algorithms.
K. Subramani, Piotr Wojciechowski
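A small illustrative instance (ours, not from the paper): the two Horn constraints below are jointly infeasible, and summing them with coefficient 1 each, so that every constraint is used exactly once, yields the contradiction \(0 \ge 1\), i.e. a read-once refutation:
\[
x_1 - x_2 \ge 0, \qquad x_2 - x_1 \ge 1, \qquad \text{sum:}\ \ 0 = (x_1 - x_2) + (x_2 - x_1) \ge 0 + 1 = 1.
\]
Each constraint has coefficients in \(\{0,1,-1\}\) with at most one positive coefficient, so this is a Horn constraint system in the sense above.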
Dialectica Logical Principles
Abstract
Gödel’s Dialectica interpretation was designed to obtain a relative consistency proof for Heyting arithmetic, to be used in conjunction with the double negation interpretation to obtain the consistency of Peano arithmetic. In recent years, proof theoretic transformations (so-called proof interpretations) that are based on Gödel’s Dialectica interpretation have been used systematically to extract new content from proofs, and so the interpretation has found relevant applications in several areas of mathematics and computer science. Following our previous work on ‘Gödel fibrations’, we present a (hyper)doctrine characterisation of the Dialectica which corresponds exactly to the logical description of the interpretation. To show that, we derive the soundness of the interpretation of the implication connective, as expounded on by Spector and Troelstra, in the categorical model. This requires extra logical principles, going beyond intuitionistic logic, namely Markov’s Principle (MP) and the Independence of Premise (IP) principle, as well as some choice. We show how these principles are satisfied in the categorical setting, establishing a tight (internal language) correspondence between the logical system and the categorical framework. This tight correspondence should come in handy not only when discussing the applications of the Dialectica already known, like its use to extract computational content from (some) classical theorems (proof mining), its use to help model specific abstract machines, etc., but also to help devise new applications.
Davide Trotta, Matteo Spadetto, Valeria de Paiva
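The two extra principles named above are usually stated as follows (standard formulations from the Dialectica literature, with \(A_0\) quantifier-free and x not free in the premise; the paper’s precise variants may differ):
\[
\mathrm{MP}:\ \lnot \lnot \, \exists x\, A_0(x) \rightarrow \exists x\, A_0(x), \qquad
\mathrm{IP}:\ (\forall y\, A_0(y) \rightarrow \exists x\, B(x)) \rightarrow \exists x\, (\forall y\, A_0(y) \rightarrow B(x)),
\]
together with (a form of) the axiom of choice; these are the principles the abstract refers to in connection with the soundness of the implication clause.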
Small Model Property Reflects in Games and Automata
Abstract
The small model property is an important property that implies decidability. We show that the small model size is directly related to some important resources in games and automata for checking provability.
Maciej Zielenkiewicz
Backmatter
Metadata
Title
Logical Foundations of Computer Science
Editors
Sergei Artemov
Prof. Anil Nerode
Copyright Year
2022
Electronic ISBN
978-3-030-93100-1
Print ISBN
978-3-030-93099-8
DOI
https://doi.org/10.1007/978-3-030-93100-1
