
2019 | Book

Description Logic, Theory Combination, and All That

Essays Dedicated to Franz Baader on the Occasion of His 60th Birthday

Edited by: Carsten Lutz, Prof. Uli Sattler, Prof. Cesare Tinelli, Anni-Yasmin Turhan, Prof. Frank Wolter

Publisher: Springer International Publishing

Book series: Lecture Notes in Computer Science


About this book

This Festschrift has been put together on the occasion of Franz Baader's 60th birthday to celebrate his fundamental and highly influential scientific contributions. The 30 papers in this volume cover several scientific areas that Franz Baader has been working on during the last three decades, including description logics, term rewriting, and the combination of decision procedures. We hope that readers will enjoy the articles gathered in Franz's honour and appreciate the breadth and depth of his favourite areas of computer science.

Table of Contents

Frontmatter
A Tour of Franz Baader’s Contributions to Knowledge Representation and Automated Deduction
Abstract
This article provides an introduction to the Festschrift that has been put together on the occasion of Franz Baader’s 60th birthday to celebrate his fundamental and highly influential scientific contributions. We start with a brief and personal overview of Franz’s career, listing some important collaborators, places, and scientific milestones, and then provide first person accounts of how each one of us came in contact with Franz and how we benefitted from his collaboration and mentoring. Our selection is not intended to be complete and it is in fact strongly biased by our own personal experience and preferences. Many of Franz’s contributions that we had to leave out are discussed in later chapters of this volume.
Carsten Lutz, Uli Sattler, Cesare Tinelli, Anni-Yasmin Turhan, Frank Wolter
Hierarchic Superposition Revisited
Abstract
Many applications of automated deduction require reasoning in first-order logic modulo background theories, in particular some form of integer arithmetic. A major unsolved research challenge is to design theorem provers that are “reasonably complete” even in the presence of free function symbols ranging into a background theory sort. The hierarchic superposition calculus of Bachmair, Ganzinger, and Waldmann already supports such symbols, but, as we demonstrate, not optimally. This paper aims to rectify the situation by introducing a novel form of clause abstraction, a core component in the hierarchic superposition calculus for transforming clauses into a form needed for internal operation. We argue for the benefits of the resulting calculus and provide two new completeness results: one for the fragment where all background-sorted terms are ground and another one for a special case of linear (integer or rational) arithmetic as a background theory.
Peter Baumgartner, Uwe Waldmann
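As a schematic illustration of clause abstraction (a toy example, not the paper's refined rule): a mixed literal such as \(f(x) > 2\), where \(f\) is a free function symbol ranging into the integer sort, can be purified by introducing a fresh background-sorted variable, yielding the clause \(y \not\approx f(x) \vee y > 2\). After abstraction, the background prover only ever sees pure arithmetic literals such as \(y > 2\), while the remaining literals are handled by the foreground calculus.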
Theory Combination: Beyond Equality Sharing
Abstract
Satisfiability is the problem of deciding whether a formula has a model. Although it is not even semidecidable in first-order logic, it is decidable in some first-order theories or fragments thereof (e.g., the quantifier-free fragment). Satisfiability modulo a theory is the problem of determining whether a quantifier-free formula admits a model that is a model of a given theory. If the formula mixes theories, the considered theory is their union, and combination of theories is the problem of combining decision procedures for the individual theories to get one for their union. A standard solution is the equality-sharing method by Nelson and Oppen, which requires the theories to be disjoint and stably infinite. This paper surveys selected approaches to the problem of reasoning in the union of disjoint theories, that aim at going beyond equality sharing, including: asymmetric extensions of equality sharing, where some theories are unrestricted, while others must satisfy stronger requirements than stable infiniteness; superposition-based decision procedures; and current work on conflict-driven satisfiability (CDSAT).
Maria Paola Bonacina, Pascal Fontaine, Christophe Ringeissen, Cesare Tinelli
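To illustrate equality sharing with a toy example not drawn from the chapter: the mixed formula \(f(x) - f(y) = 1 \wedge x = y\), over the union of linear arithmetic and the theory of uninterpreted functions, is first purified into \(z_1 - z_2 = 1\), \(z_1 = f(x)\), \(z_2 = f(y)\), and \(x = y\). The functional part derives the shared equality \(z_1 = z_2\) from \(x = y\); propagating it to the arithmetic part contradicts \(z_1 - z_2 = 1\), so the formula is unsatisfiable in the union.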
Initial Steps Towards a Family of Regular-Like Plan Description Logics
Abstract
A wide range of ordinary Description Logics (DLs) have been explored by considering collections of concept/role constructors, and types of terminologies, yielding an array of complexity results. Representation and reasoning with plans is a very important topic in AI, yet there has been very little work on finding and studying DL constructors for plan concepts.
We start to remedy this problem here by considering Plan DLs where concept instances are sequences of action instances, and hence plan concepts can be viewed as analogues of formal languages, describing sets of strings. Inspired by the clasp system, we consider using regular-like expressions, obtaining a rich variety of Plan DLs based on combinations of regular-like expression constructors, including sequence (concatenation), alternation (union, disjunction), looping (Kleene star), conjunction (intersection), and complement. To model the important notion of concurrency, we also consider interleaving.
We present results from the formal language literature which have immediate bearing on the complexity of DL-like reasoning tasks. However, we also focus on succinctness of representation, and on expressive power, issues first studied by Franz Baader for ordinary DLs.
Alexander Borgida
Reasoning with Justifiable Exceptions in Contextualized Knowledge Repositories
Abstract
The Contextualized Knowledge Repository (CKR) framework has been proposed as a description logics-based approach for contextualization of knowledge, a well-known area of study in AI. CKR knowledge bases are structured in two layers: a global context contains context-independent knowledge and the contextual structure, while local contexts hold context-specific knowledge bases. In practical uses of CKR, it is often desirable that global knowledge can be “overridden” at the local level, that is, to recognize local pieces of knowledge that need not satisfy a general axiom. Targeting this need, in our recent works we presented an extension of CKR with global defeasible axioms, which apply to local instances unless an exception for overriding exists; such an exception, however, requires that a justification is provable from the knowledge base. In this paper we apply this framework to the basic description logic \(\mathcal{E}\mathcal{L}_\bot \). We provide a formalization of \(\mathcal{E}\mathcal{L}_\bot \) CKRs with global defeasible axioms and study their semantic and computational properties. Moreover, we present a translation of CKRs to datalog programs under the answer set semantics for instance checking.
Loris Bozzato, Thomas Eiter, Luciano Serafini
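Schematically (a generic illustration of the mechanism): a global defeasible axiom stating that birds normally fly is inherited by every local context, but a context about Antarctic wildlife may override it for a particular penguin instance, provided a justification for this exception is provable from the knowledge base. Unjustified overriding is not permitted, which keeps the defeasible semantics close to the classical one.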
Strong Explanations for Nonmonotonic Reasoning
Abstract
The ability to generate explanations for inferences drawn from a knowledge base is of utmost importance for intelligent systems. A central notion in this context is that of minimal subsets of the knowledge base entailing a certain formula. Such subsets are often referred to as justifications, and their identification is called axiom pinpointing. As observed by Franz Baader, this concept of explanations is useful for monotonic logics, in which additional information can never invalidate former conclusions. However, for nonmonotonic logics the concept simply makes no sense. In this paper, we introduce a different notion, called strong explanation. Strong explanations coincide with the standard notion for monotonic logics, but also handle the nonmonotonic case adequately.
Gerhard Brewka, Markus Ulbricht
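A standard example shows why minimal entailing subsets break down nonmonotonically: with a default rule stating that birds normally fly, the subset \(\{\mathit{Bird}(t)\}\) of a knowledge base entails \(\mathit{Flies}(t)\), yet the full knowledge base containing also \(\mathit{Penguin}(t)\) does not. A minimal entailing subset is thus no guarantee that the conclusion is actually drawn, which is the gap that strong explanations are designed to close.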
A KLM Perspective on Defeasible Reasoning for Description Logics
Abstract
In this paper we present an approach to defeasible reasoning for the description logic \(\mathcal {ALC}\). The results discussed here are based on work done by Kraus, Lehmann and Magidor (KLM) on defeasible conditionals in the propositional case. We consider versions of a preferential semantics for two forms of defeasible subsumption, and link these semantic constructions formally to KLM-style syntactic properties via representation results. In addition to showing that the semantics is appropriate, these results pave the way for more effective decision procedures for defeasible reasoning in description logics. With the semantics of the defeasible version of \(\mathcal {ALC}\) in place, we turn to the investigation of an appropriate form of defeasible entailment for this enriched version of \(\mathcal {ALC}\). This investigation includes an algorithm for the computation of a form of defeasible entailment known as rational closure in the propositional case. Importantly, the algorithm relies completely on classical entailment checks and shows that the computational complexity of reasoning over defeasible ontologies is no worse than that of the underlying classical \(\mathcal {ALC}\). Before concluding, we take a brief tour of some existing work on defeasible extensions of \(\mathcal {ALC}\) that go beyond defeasible subsumption.
Katarina Britz, Giovanni Casini, Thomas Meyer, Ivan Varzinczak
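The classical penguin example conveys the setting (writing \(C \mathrel{\widetilde{\sqsubseteq}} D\) for the defeasible subsumption “C is usually a D”; the notation here is only schematic): from \(\mathsf{Bird} \mathrel{\widetilde{\sqsubseteq}} \mathsf{Flies}\), \(\mathsf{Penguin} \sqsubseteq \mathsf{Bird}\) and \(\mathsf{Penguin} \mathrel{\widetilde{\sqsubseteq}} \lnot\mathsf{Flies}\), rational closure treats penguins as exceptional birds, concluding that typical birds fly while typical penguins do not, rather than forcing \(\mathsf{Penguin} \sqsubseteq \bot\) as a purely classical reading would.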
Temporal Logic Programs with Temporal Description Logic Axioms
Abstract
In this paper we introduce a combination of Answer Set Programming (ASP) and Description Logics (DL) (in particular, \(\mathcal{ALC}\)) on top of a modal temporal basis using connectives from Linear-time Temporal Logic (LTL). On the one hand, for the temporal extension of \(\mathcal{ALC}\), we start from Baader et al.’s proposal \(\mathcal{ALC}\)-LTL, which restricts the use of temporal operators to occur only in front of DL axioms. On the other hand, for the temporal extension of ASP we use its formalization in terms of Temporal (Quantified) Equilibrium Logic (TEL). This choice is convenient since (non-temporal) Equilibrium Logic has already been used to capture the semantics of hybrid theories, that is, combinations of ASP programs with DL axioms. Our proposal, called \(\mathcal{ALC}\)-TEL, actually interprets \(\mathcal{ALC}\) axioms in terms of their translation into first-order sentences, so that the semantics of TEL is eventually used in the background. The resulting formalism conservatively extends TEL, hybrid theories and \(\mathcal{ALC}\)-LTL as particular cases.
Pedro Cabalar, Torsten Schaub
The What-To-Ask Problem for Ontology-Based Peers
Abstract
The issue of cooperation, integration, and coordination between information peers has been addressed over the years both in the context of the Semantic Web and in several other networked environments, including data integration, Peer-to-Peer and Grid computing, service-oriented computing, distributed agent systems, and collaborative data sharing. One of the main problems arising in such contexts is how to exploit the mappings between peers in order to answer queries posed to one peer. We address this issue for peers managing data through ontologies and in particular focus on ontologies specified in logics of the DL-Lite family. Our goal is to present some basic, fundamental results on this problem. In particular, we focus on a simplified setting based on just two interoperating peers, and we investigate how to solve the so-called “What-To-Ask” problem: find a way to answer queries posed to a peer by relying only on the query answering service available at the queried peer and at the other peer. We show both a positive and a negative result. Namely, we first prove that a solution to this problem always exists when the ontology is specified in \(\textit{DL-Lite}_{\mathcal {R}} \), and we provide an algorithm to compute it. Then, we show that for the case of \(\textit{DL-Lite}_{\mathcal {F}} \) the problem may have no solution. We finally illustrate that a solution to our problem can still be found even for more general networks of peers, and for any language of the DL-Lite family, provided that we interpret mappings according to an epistemic semantics, rather than the usual first-order semantics.
Diego Calvanese, Giuseppe De Giacomo, Domenico Lembo, Maurizio Lenzerini, Riccardo Rosati
From Model Completeness to Verification of Data Aware Processes
Abstract
Model Completeness is a classical topic in model-theoretic algebra, and its sources of inspiration are areas like algebraic geometry and field theory. Yet, recently, there have been remarkable applications in computer science: these applications range from combined decision procedures for satisfiability and interpolation, to connections between temporal logic and monadic second-order logic, and to model checking. In this paper we mostly concentrate on the last one: we study verification over a general model of so-called artifact-centric systems, which are used to capture business processes by giving equal importance to control-flow and data-related aspects. In particular, we are interested in assessing (parameterized) safety properties irrespective of the initial database instance. We view such artifact systems as array-based systems, establishing a correspondence with model checking based on Satisfiability-Modulo-Theories (SMT). Model completeness comes into the picture in this framework by supplying quantifier elimination algorithms for suitable existentially closed structures. Such algorithms, whose complexity is unexpectedly low in some cases of our interest, are exploited during search and to represent the sets of reachable states. Our first implementation, built on top of the mcmt model checker, makes all our foundational results fully operational and quite effective, as demonstrated by our first experiments.
Diego Calvanese, Silvio Ghilardi, Alessandro Gianola, Marco Montali, Andrey Rivkin
Situation Calculus Meets Description Logics
Abstract
For more than six years, the groups of Franz Baader and Gerhard Lakemeyer have collaborated in the area of decidable verification of Golog programs. Golog is an action programming language, whose semantics is based on the Situation Calculus, a variant of full first-order logic. In order to achieve decidability, the expressiveness of the base logic had to be restricted, and using a Description Logic was a natural choice. In this chapter, we highlight some of the main results and insights obtained during our collaboration.
Jens Claßen, Gerhard Lakemeyer, Benjamin Zarrieß
Provenance Analysis: A Perspective for Description Logics?
Abstract
Provenance analysis aims at understanding how the result of a computational process with a complex input, consisting of multiple items, depends on the various parts of this input. In database theory, provenance analysis based on interpretations in commutative semirings has been developed for positive database query languages, to understand which combinations of the atomic facts in a database can be used for deriving the result of a given query. In joint work with Val Tannen, we have recently proposed a new approach for the provenance analysis of logics with negation, such as first-order logic and fixed-point logic. It is based on new semirings of dual-indeterminate polynomials or dual-indeterminate formal power series, which are obtained by taking quotients of traditional provenance semirings by congruences that are generated by products of positive and negative provenance tokens. This provenance approach has also been applied to fragments of first-order logics such as modal and guarded logics. In this paper, we explore the question whether, and to what extent, the provenance approach might be useful in the field of description logics.
Katrin M. Dannert, Erich Grädel
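For a flavour of semiring provenance (a standard example from the database literature, not specific to this chapter): if the facts \(R(a)\) and \(S(a)\) are annotated with provenance tokens \(p\) and \(r\), then the query \(\exists x\,(R(x) \wedge S(x))\) obtains the provenance polynomial \(p \cdot r\), while \(R(a) \vee S(a)\) obtains \(p + r\); multiplication tracks the joint use of facts and addition tracks alternative derivations. The dual-indeterminate construction extends this bookkeeping to negated facts.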
Extending \(\mathscr {E\!L}^{++}\) with Linear Constraints on the Probability of Axioms
Abstract
One of the main reasons to employ a description logic such as \(\mathscr {E\!L}^{++}\) is the fact that it has efficient, polynomial-time algorithmic properties such as deciding consistency and inferring subsumption. However, simply by adding negation of concepts to it, we obtain the expressivity of description logics whose decision procedure is ExpTime-complete. A similar complexity explosion occurs if we add probability assignments on concepts. To lower the resulting complexity, we instead concentrate on assigning probabilities to axioms/GCIs. We show that the consistency detection problem for such a probabilistic description logic is NP-complete, and present a linear-algebraic deterministic algorithm to solve it, using the column generation technique. We also examine and provide algorithms for the probabilistic extension problem, which consists of inferring the minimum and maximum probabilities for a new axiom, given a consistent probabilistic knowledge base.
Marcelo Finger
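A small arithmetic example of the probabilistic extension problem (illustrative only, simplified from the formalism): if axioms \(a_1\) and \(a_2\) are asserted with probability at least \(0.7\) and \(0.8\) respectively, then every admissible distribution satisfies \(P(a_1 \wedge a_2) \ge 0.7 + 0.8 - 1 = 0.5\), so a consequence requiring both axioms inherits the minimum probability \(0.5\). Tight bounds of this kind are obtained from linear programs over the probabilities of possible worlds, which column generation keeps manageable.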
Effective Query Answering with Ontologies and DBoxes
Abstract
The goal of this chapter is to survey the formalisation of a precise and uniform integration between first-order ontologies, first-order queries, and classical relational databases (DBoxes). We include here non-standard variants of first-order logic, such as the one with active domain semantics and standard name assumption, used typically in database theory. We present a general framework for the rewriting of a domain-independent first-order query in the presence of an arbitrary domain-independent first-order logic ontology over a signature extending a database signature with additional predicates. The framework supports deciding the existence of a logically equivalent and – given the ontology – safe-range first-order reformulation (called exact reformulation) of a domain-independent first-order query in terms of the database signature, and if such a reformulation exists, it provides an effective approach to construct the reformulation based on interpolation using standard theorem proving techniques (i.e., tableaux). Since the reformulation is a safe-range formula, it is effectively executable as an SQL query. We finally present an application of the framework to the very expressive \(\mathcal {ALCHOI}\) and \(\mathcal {SHOQ}\) description logics ontologies, by providing effective means to compute safe-range first-order exact reformulations of queries.
Enrico Franconi, Volha Kerhet
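A minimal example of an exact reformulation (hypothetical, for illustration): if the ontology contains \(\forall x\,(A(x) \leftrightarrow B(x) \wedge C(x))\), where \(B\) and \(C\) are database predicates and \(A\) is not, then the query \(A(x)\) has the safe-range reformulation \(B(x) \wedge C(x)\) over the database signature, directly executable as an SQL join. Interpolation is the mechanism that finds such reformulations automatically whenever they exist.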
Checking the Data Complexity of Ontology-Mediated Queries: A Case Study with Non-uniform CSPs and Polyanna
Abstract
It has recently been shown that first-order- and datalog-rewritability of ontology-mediated queries (OMQs) with expressive ontologies can be checked in NExpTime using a reduction to CSPs. In this paper, we present a case study for OMQs with Boolean conjunctive queries and a fixed ontology consisting of a single covering axiom \(A \sqsubseteq F \sqcup T\), possibly supplemented with a disjointness axiom for T and F. The ultimate aim is to classify such OMQs according to their data complexity: \(\textsc {AC}^0\), L, NL, P or coNP. We report on our experience with trying to distinguish between OMQs in P and coNP using the reduction to CSPs and the Polyanna software for finding polymorphisms.
Olga Gerasimova, Stanislav Kikot, Michael Zakharyaschev
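The intuition behind the CSP connection can be sketched as follows: under the covering axiom \(A \sqsubseteq F \sqcup T\), a Boolean conjunctive query is a certain answer exactly when it holds in every completion of the data that labels each \(A\)-individual with \(F\) or \(T\). Deciding whether some completion avoids the query is thus a colouring-style constraint satisfaction problem, and the complexity of that CSP governs the data complexity of the OMQ.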
Perceptual Context in Cognitive Hierarchies
Abstract
Cognition does not only depend on bottom-up sensor feature abstraction, but also relies on contextual information being passed top-down. Context is higher-level information that helps to predict belief states at lower levels. The main contribution of this paper is to provide a formalisation of perceptual context and its integration into a new process model for cognitive hierarchies. Several simple instantiations of a cognitive hierarchy are used to illustrate the role of context. Notably, we demonstrate the use of context in a novel approach to visually track the pose of rigid objects with just a 2D camera.
Bernhard Hengst, Maurice Pagnucco, David Rajaratnam, Claude Sammut, Michael Thielscher
Do Humans Reason with \(\mathcal{E}\)-Matchers?
Abstract
The Weak Completion Semantics is a novel, integrated and computational cognitive theory. Recently, it has been applied to ethical decision making. To this end, it was extended by equational theories as needed by the fluent calculus. To compute least models, equational matching problems have to be solved. Do humans consider equational matching in reasoning episodes?
Steffen Hölldobler
Pseudo-contractions as Gentle Repairs
Abstract
Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed over the past decades, with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
Vinícius Bitencourt Matos, Ricardo Guimarães, Yuri David Santos, Renata Wassermann
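A propositional toy example conveys the idea: to retract \(p\) from the knowledge base \(\{p \wedge q\}\), outright deletion of \(p \wedge q\) needlessly loses \(q\), whereas a pseudo-contraction replaces \(p \wedge q\) by its weaker consequence \(q\). Gentle repairs apply the same weakening strategy to ontology axioms instead of removing them entirely.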
FunDL
A Family of Feature-Based Description Logics, with Applications in Querying Structured Data Sources
Abstract
Feature-based description logics replace the notion of roles, interpreted as binary relations, with features, interpreted as unary functions. Another notable feature of these logics is their use of path functional dependencies that allow for complex identification constraints to be formulated. The use of features and path functional dependencies makes the logics particularly well suited for capturing and integrating data sources conforming to an underlying object-relational schema that include a variety of common integrity constraints. We first survey expressive variants of feature logics, including the boundaries of decidability. We then survey a restricted tractable family of feature logics suited to query answering, and study the limits of tractability of reasoning.
Stephanie McIntyre, David Toman, Grant Weddell
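As a schematic example of a path functional dependency (notation simplified here): a constraint of the form \(\mathsf{Person}\!: \mathsf{ssn} \rightarrow \mathit{id}\) states that any two \(\mathsf{Person}\) objects agreeing on the value of the feature \(\mathsf{ssn}\) must be the same object, mirroring a relational key; longer feature paths yield the more complex identification constraints mentioned above.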
Some Thoughts on Forward Induction in Multi-Agent-Path Finding Under Destination Uncertainty
Abstract
While the notion of implicit coordination helps to design frameworks in which agents can cooperatively act with only minimal communication, it so far lacks exploiting observations made while executing a plan. In this note, we have a look at what can be done in order to overcome this shortcoming, at least in a specialized setting.
Bernhard Nebel
Temporally Attributed Description Logics
Abstract
Knowledge graphs are based on graph models enriched with (sets of) attribute-value pairs, called annotations, attached to vertices and edges. Many application scenarios of knowledge graphs crucially rely on the frequent use of annotations related to time. Building upon attributed logics, we design description logics enriched with temporal annotations whose values are interpreted over discrete time. Investigating the complexity of reasoning in this new formalism, it turns out that reasoning in our temporally attributed description logic \(\mathcal {ALCH} ^{\mathbb {T}}_@\) is highly undecidable; thus we establish restrictions where it becomes decidable, and even tractable.
Ana Ozaki, Markus Krötzsch, Sebastian Rudolph
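Schematically (the concrete syntax here is only indicative), a temporally annotated axiom might state \(\exists \mathsf{worksFor}@\langle \mathsf{since}\!: 2010 \rangle.\top \sqsubseteq \mathsf{Employee}\): anyone with a \(\mathsf{worksFor}\) edge annotated as holding since 2010 is an employee, with values like \(2010\) interpreted over the discrete timeline mentioned above.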
Explaining Axiom Pinpointing
Abstract
Axiom pinpointing refers to the task of highlighting (or pinpointing) the axioms in an ontology that are responsible for a given consequence to follow. This is a fundamental task for understanding and debugging very large ontologies. Although the name axiom pinpointing was only coined in 2003, the problem itself has a much older history, even if considering only description logic ontologies. In this work, we try to explain axiom pinpointing: what it is; how it works; how it is solved; and what it is useful for. To answer these questions, we take a historic look at the field, focusing mainly on description logics, and the specific contributions stemming from one researcher, who started it all in more than one sense.
Rafael Peñaloza
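A minimal example: in the ontology \(\{A \sqsubseteq B,\; B \sqsubseteq C,\; A \sqsubseteq C\}\), the consequence \(A \sqsubseteq C\) has two justifications, \(\{A \sqsubseteq C\}\) and \(\{A \sqsubseteq B,\; B \sqsubseteq C\}\). Axiom pinpointing amounts to computing all such minimal entailing subsets, whose number can grow exponentially, which is what makes the task hard for large ontologies.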
Asymmetric Unification and Disunification
Abstract
We compare two kinds of unification problems: Asymmetric Unification and Disunification, which are variants of Equational Unification. Asymmetric Unification is a type of Equational Unification where the instances of the right-hand sides of the equations are in normal form with respect to the given term rewriting system. In Disunification we solve equations and disequations with respect to an equational theory for the case with free constants. We contrast the time complexities of both and show that the two problems are incomparable: there are theories where one can be solved in polynomial time while the other is NP-hard. This goes both ways. The time complexity also varies based on the termination ordering used in the term rewriting system.
Veena Ravishankar, Kimberly A. Cornell, Paliath Narendran
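A toy example modulo the single rewrite rule \(x + 0 \rightarrow x\): for the equation \(y \approx^{?} v + w\), the substitution \(\{y \mapsto v + w\}\) is an asymmetric solution, since the instantiated right-hand side \(v + w\) is in normal form; the equational unifier \(\{v \mapsto y,\; w \mapsto 0\}\) is not, because it instantiates the right-hand side to the reducible term \(y + 0\).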
Building and Combining Matching Algorithms
Abstract
The concept of matching is ubiquitous in declarative programming and in automated reasoning. For instance, it is a key mechanism to run rule-based programs and to simplify clauses generated by theorem provers. A matching problem can be seen as a particular conjunction of equations where each equation has a ground side. We give an overview of techniques that can be applied to build and combine matching algorithms. First, we survey mutation-based techniques as a way to build a generic matching algorithm for a large class of equational theories. Second, combination techniques are introduced to get combined matching algorithms for disjoint unions of theories. Then we show how these combination algorithms can be extended to handle non-disjoint unions of theories sharing only constructors. These extensions are possible if an appropriate notion of normal form is computable.
Christophe Ringeissen
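For a concrete instance: the matching problem \(f(x, y) \approx^{?} f(a, b)\), whose right-hand side is ground, has the unique syntactic matcher \(\{x \mapsto a,\; y \mapsto b\}\); if \(f\) is commutative, \(\{x \mapsto b,\; y \mapsto a\}\) is a second matcher, illustrating how the equational theory determines both the set of solutions and the algorithmic techniques required.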
Presburger Concept Cardinality Constraints in Very Expressive Description Logics
Allegro sexagenarioso ma non ritardando
Abstract
Quantitative information plays an increasingly important role in knowledge representation. To this end, many formalisms have been proposed that enrich traditional KR formalisms by counting and some sort of arithmetic. Baader and Ecke (2017) propose an extension of the description logic \(\mathcal {ALC}\) by axioms which express correspondences between the cardinalities of concepts by means of Presburger arithmetic. This paper extends their results, enhancing the expressivity of the underlying logic as well as the constraints while preserving complexities. It also widens the scope of investigation from finite models to the classical semantics where infinite models are allowed. We also provide first results on query entailment in such logics. As opposed to prior work, our results are established by polynomially encoding the cardinality constraints in the underlying logic.
“Johann” Sebastian Rudolph
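An example of the kind of constraint in question (illustrative, not from the paper): a Presburger cardinality constraint such as \(2 \cdot |\mathsf{Lecturer}| \ge |\mathsf{Course}|\) relates the sizes of concept extensions globally, and a model must satisfy it alongside the ordinary \(\mathcal{ALC}\) axioms; once infinite models are allowed, such constraints additionally need a reading for infinite cardinalities.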
A Note on Unification, Subsumption and Unification Type
Abstract
Various forms of subsumption preorders are used in the literature for comparing unifiers and general solutions of a unification problem for generality and for defining the unification type. This note presents some of them and discusses their pros and cons. In particular, arguments against the exist-substitution-based subsumption preorder (ess) are discussed. A proposal for a further partition of the unification type nullary is made. Some historical notes are also included.
Manfred Schmidt-Schauß
15 Years of Consequence-Based Reasoning
Abstract
Description logics (DLs) are a family of formal languages for knowledge representation with numerous applications. Consequence-based reasoning is a promising approach to DL reasoning which can be traced back to the work of Franz Baader and his group on efficient subsumption algorithms for the \(\mathcal {EL}\) family of DLs circa 2004. Consequence-based reasoning combines ideas from hypertableaux and resolution in a way that has proved very effective in practice, and it still remains an active field of research. In this paper, we review the evolution of the field in the last 15 years and discuss the various consequence-based calculi that have been developed for different DLs, from the lightweight \(\mathcal {EL}\) to the expressive \(\mathcal {SROIQ}\). We thus provide a comprehensive and up-to-date analysis that highlights the common characteristics of these calculi and discusses their implementation.
David Tena Cucala, Bernardo Cuenca Grau, Ian Horrocks
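Two schematic rules convey the flavour of such calculi (close in spirit to the original \(\mathcal {EL}\) completion rules): from a derived \(H \sqsubseteq A\) and an ontology axiom \(A \sqsubseteq B\), derive \(H \sqsubseteq B\); and from \(H \sqsubseteq \exists r.K\), \(K \sqsubseteq A\) and an axiom \(\exists r.A \sqsubseteq B\), derive \(H \sqsubseteq B\). All inferences manipulate consequences directly, so the procedure never needs to build and backtrack over candidate models.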
Maximum Entropy Calculations for the Probabilistic Description Logic \(\mathcal{ALC}^{\mathsf{ME}}\)
Abstract
The probabilistic Description Logic \(\mathcal{ALC}^{\mathsf{ME}}\) extends the classical Description Logic \(\mathcal{ALC}\) with probabilistic conditionals of the form (D|C)[p] stating that “D follows from C with probability p.” Conditionals are interpreted based on the aggregating semantics where probabilities are understood as degrees of belief. For reasoning with probabilistic conditional knowledge bases in \(\mathcal{ALC}^{\mathsf{ME}}\), the principle of maximum entropy provides a valuable methodology following the idea of completing missing information in a most cautious way. In this paper, we give guidance on calculating approximations of the maximum entropy distribution for \(\mathcal{ALC}^{\mathsf{ME}}\) knowledge bases. For this, we discuss the benefits of solving the dual maximum entropy optimization problem instead of the primal problem. In particular, we show how representations of approximations of the maximum entropy distribution can be derived from the dual problem in time polynomial in the size of the underlying domain. The domain is the crucial quantity in practical applications. For a compact representation of the objective function of the dual maximum entropy optimization problem, we apply the principle of typed model counting.
Marco Wilhelm, Gabriele Kern-Isberner
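The underlying optimization has the standard form: maximize the entropy \(-\sum_{\omega} P(\omega) \log P(\omega)\) subject to linear constraints induced by the conditionals, whose solution is a Gibbs distribution \(P^{*}(\omega) = Z^{-1} \exp\big(\sum_i \lambda_i f_i(\omega)\big)\). The dual problem optimizes over the finitely many parameters \(\lambda_i\) rather than over one probability per world, which is why it scales far better with the domain size.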
Automating Automated Reasoning
The Case of Two Generic Automated Reasoning Tools
Abstract
The vision of automated support for the investigation of logics, proposed decades ago, has been implemented in many forms, producing numerous tools that analyze various logical properties (e.g., cut-elimination, semantics, and more). However, full ‘automation of automated reasoning’ in the sense of automatic generation of efficient provers has remained a ‘holy grail’ of the field. Creating a generic prover which can efficiently reason in a given logic is challenging, as each logic may be based on a different language, and involve different inference rules, that require different implementation considerations to achieve efficiency, or even tractability. Two recently introduced generic automated provers apply different approaches to tackle this challenge. \({\text {MetTeL}}\), based on the formalism of tableaux, automatically generates a prover for a given tableau calculus, by implementing generic proof-search procedures with optimizations applicable to many tableau calculi. \({\text {Gen2sat}}\), based on the formalism of sequent calculi, shifts the burden of search to the realm of off-the-shelf SAT solvers by applying a uniform reduction of derivability in sequent calculi to SAT. This paper examines these two generic provers, focusing in particular on criteria relevant for comparing their performance and usability. To this end, we evaluate the performance of the tools, and describe the results of a preliminary empirical study where user experiences of expert logicians using the two tools are compared.
Yoni Zohar, Dmitry Tishkovsky, Renate A. Schmidt, Anna Zamansky
On Bounded-Memory Stream Data Processing with Description Logics
Abstract
Various research groups of the description logic community, in particular the group of Franz Baader, have been involved in recent efforts on temporalizing or streamifying ontology-mediated query answering (OMQA). As a result, various temporal and streamified extensions of query languages for description logics with different expressivity were investigated. For practically useful implementations of OMQA systems over temporal and streaming data, efficient algorithms for answering continuous queries are indispensable. But, depending on the expressivity of the query and ontology language, finding an efficient algorithm may not always be possible. Hence, the aim should be to provide criteria for easily checking whether an efficient algorithm exists at all and, possibly, to describe such an algorithm for a given query. In particular, for stream data it is important to find simple criteria that help decide whether a given OMQA query can be answered with sub-linear space w.r.t. the length of a growing stream prefix. An important special case dealt with under the term “bounded memory” is that of testing for constant space. This paper discusses known syntactic criteria for bounded-memory processing of SQL queries over relational data streams and describes how these criteria from the database community can be lifted to criteria of bounded-memory query answering in the streamified OMQA setting. For illustration purposes, a syntactic criterion for bounded-memory processing of queries formulated in a fragment of the stream-temporal query language STARQL is given.
Özgür Lütfü Özçep, Ralf Möller
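Two classic examples from the database literature illustrate the distinction: a running aggregate such as SELECT MAX(temp) FROM Sensor can be maintained in constant space, since only the current maximum must be stored, whereas an equi-join of two unbounded streams is in general not computable with bounded memory, since a new tuple on one stream may join with tuples seen arbitrarily far back on the other.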
Backmatter
Metadata
Title
Description Logic, Theory Combination, and All That
Edited by
Carsten Lutz
Prof. Uli Sattler
Prof. Cesare Tinelli
Anni-Yasmin Turhan
Prof. Frank Wolter
Copyright year
2019
Electronic ISBN
978-3-030-22102-7
Print ISBN
978-3-030-22101-0
DOI
https://doi.org/10.1007/978-3-030-22102-7
