About This Book

This book constitutes the refereed proceedings of the 11th International Symposium on Foundations of Information and Knowledge Systems, FoIKS 2020, held in Dortmund, Germany, in February 2020.
The 19 revised full papers presented were carefully reviewed and selected from 33 submissions. The papers address various topics such as big data; database design; dynamics of information; information fusion; integrity and constraint management; intelligent agents; knowledge discovery and information retrieval; knowledge representation, reasoning and planning; logics in databases and AI; mathematical foundations; security in information and knowledge systems; semi-structured data and XML; social computing; the semantic web and knowledge management; and the world wide web.



Functional Dependencies in Incomplete Databases with Limited Domains

Missing data values are a pervasive problem in both research and industry. There are two general approaches to dealing with missing values in databases: they can either be ignored (removed) or imputed (filled in) with new values [9]. In the present paper, we follow the second approach. Possible worlds were introduced in [14, 16], where possible and certain keys, as well as weak and strong functional dependencies, were studied. In a preceding paper, we introduced the intermediate concept of strongly possible worlds, which are obtained by imputing values already existing in the table. This gives rise to strongly possible keys and strongly possible functional dependencies. We give a polynomial algorithm to verify a single spKey and show that, in general, it is NP-complete to verify an arbitrary collection of spKeys. We give a graph-theoretical characterization of the validity of a given spFD \(X\rightarrow _{sp}Y\). We analyze which weak/strong functional dependency axioms remain sound for strongly possible functional dependencies and give appropriate modifications of those that are not.
Munqath Alattar, Attila Sali
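The central notion can be illustrated with a brute-force sketch (our own toy example, not the paper's polynomial matching algorithm): a strongly possible world fills each null with a value already occurring in that attribute's column (its visible domain), and a set of attributes is an spKey if some such world makes it a key.

```python
from itertools import product

def visible_domain(table, col):
    """Values already occurring in a column -- the 'visible domain'."""
    return {row[col] for row in table if row[col] is not None}

def completions(row, table):
    """All ways to fill a row's nulls with visible-domain values."""
    choices = [
        [row[c]] if row[c] is not None else sorted(visible_domain(table, c))
        for c in range(len(row))
    ]
    return [tuple(comb) for comb in product(*choices)]

def is_sp_key(table, key_cols):
    """True iff some strongly possible world makes key_cols a key.

    Brute force over all combinations of row completions; the paper's
    polynomial algorithm for a single spKey avoids this blow-up.
    """
    for world in product(*(completions(row, table) for row in table)):
        projected = [tuple(r[c] for c in key_cols) for r in world]
        if len(set(projected)) == len(projected):
            return True
    return False

# Two tables over attributes (A, B); the second row has a missing B value.
t1 = [(1, "x"), (1, None), (2, "x")]   # B's visible domain is only {"x"}
t2 = [(1, "x"), (1, None), (2, "y")]   # now "y" is also available
print(is_sp_key(t1, [0, 1]), is_sp_key(t2, [0, 1]))  # False True
```

In `t1`, the incomplete row can only become `(1, "x")`, duplicating the first row; in `t2`, it can take `B = "y"`, so `{A, B}` is an spKey there.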

Normal Forms of Conditional Knowledge Bases Respecting Entailments and Renamings

Normal forms of conditional knowledge bases are useful to create, process and compare the knowledge represented by them. In this paper, we propose the reduced antecedent normal form (RANF) for conditional knowledge bases. Compared to the antecedent normal form, it represents conditional knowledge with significantly fewer conditionals. A set of transformation rules maps every knowledge base to a model equivalent knowledge base in RANF. The new notion of renaming normal form (\(\rho \)NF) of a conditional knowledge base takes signature renamings into account. We develop an algorithm for systematically generating conditional knowledge bases over a given signature that are both in RANF and in \(\rho \)NF. The generated knowledge bases are consistent, pairwise not antecedentwise equivalent and pairwise not equivalent under signature renaming. Furthermore, the algorithm is complete in the sense that, taking signature renamings and model equivalence into account, every consistent knowledge base is generated.
Christoph Beierle, Jonas Haldimann

On Matrices and K-Relations

We show that the matrix query language \(\mathsf {MATLANG}\) corresponds to a natural fragment of the positive relational algebra on K-relations. The fragment is defined by introducing a composition operator and restricting K-relation arities to two. We then proceed to show that \(\mathsf {MATLANG}\) can express all matrix queries expressible in the positive relational algebra on K-relations, when intermediate arities are restricted to three. Thus we offer an analogue, in a model with numerical data, to the situation in classical logic, where the algebra of binary relations is equivalent to first-order logic with three variables.
Robert Brijder, Marc Gyssens, Jan Van den Bussche
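The correspondence can be sketched in a few lines of Python (an illustration under the assumption K = the natural numbers with the usual + and *): a matrix becomes a binary K-relation mapping index pairs to semiring values, and the composition operator is exactly matrix multiplication.

```python
from collections import defaultdict

def compose(r, s):
    """Compose two binary K-relations, given as dicts (i, j) -> K-value.

    (r ; s)(i, k) = sum over j of r(i, j) * s(j, k) -- which is matrix
    multiplication when the domain elements are row/column indices.
    """
    out = defaultdict(int)
    for (i, j), a in r.items():
        for (j2, k), b in s.items():
            if j == j2:
                out[(i, k)] += a * b
    return dict(out)

# The 2x2 matrices [[1, 2], [3, 4]] and [[5, 6], [7, 8]] as K-relations.
A = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 4}
B = {(0, 0): 5, (0, 1): 6, (1, 0): 7, (1, 1): 8}
print(compose(A, B))  # {(0, 0): 19, (0, 1): 22, (1, 0): 43, (1, 1): 50}
```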

Social Consolidations: Rational Belief in a Many-Valued Logic of Evidence and Peerhood

We explore an interpretation of FVEL, a four-valued logic of evidence, where states represent agents, the propositional layer corresponds to the evidence available to these agents, and the relation corresponds to peerhood connections between them. Belief is determined based on the agent’s evidence, but also on her peers’ evidence. Consolidation functions are proposed, which map evidence situations to belief attitudes. We adapt some postulates of Social Choice Theory to our belief formation setting and, with them, we separate rational from irrational consolidations. We define a dynamic operator for addition and removal of evidence, which serves as a basis for some essential dynamic postulates and also for future developments on consolidations that take amounts of evidence into account. Our main technical result is a characterisation of a class of consolidations satisfying most of our rationality postulates.
Yuri David Santos

ASPARTIX-V19 - An Answer-Set Programming Based System for Abstract Argumentation

We present ASPARTIX-V, a tool for reasoning in abstract argumentation frameworks that is based on answer-set programming (ASP), in its 2019 release. ASPARTIX-V participated in this year’s edition of the International Competition on Computational Models of Argumentation (ICCMA’19) in all classical (static) reasoning tasks. In this paper we discuss extensions the ASPARTIX suite of systems has undergone for ICCMA’19. This includes incorporation of recent ASP language constructs (e.g. conditional literals), domain heuristics within ASP, and multi-shot methods. In particular, with this version of ASPARTIX-V we partially deviate from an earlier focus on monolithic approaches (i.e., one-shot solving via a single ASP encoding) to further enhance performance. We also briefly report on the results achieved by ASPARTIX-V in ICCMA’19.
Wolfgang Dvořák, Anna Rapberger, Johannes P. Wallner, Stefan Woltran

Proper Hierarchies in Polylogarithmic Time and Absence of Complete Problems

The polylogarithmic time hierarchy structures sub-linear time complexity. In recent work it was shown that all classes \(\tilde{\varSigma }_{m}^{ plog }\) and \(\tilde{\varPi }_{m}^{ plog }\) (\(m \in \mathbb {N}\)) in this hierarchy can be captured by semantically restricted fragments of second-order logic. In this paper the descriptive complexity theory of polylogarithmic time is taken further by showing that there are strict hierarchies inside each of the classes of the hierarchy. A straightforward consequence of this result is that there are no complete problems for these complexity classes, not even under polynomial time reductions.
Flavio Ferrarotti, Senén González, Klaus-Dieter Schewe, José María Turull-Torres

Diversity, Dependence and Independence

We introduce the concepts of dependence and independence in a very general framework. We use a concept of rank to study dependence and independence. By means of the rank we identify (total) dependence with inability to create more diversity, and (total) independence with the presence of maximum diversity. We show that our theory of dependence and independence covers a variety of dependence concepts, for example the seemingly unrelated concepts of linear dependence in algebra and dependence of variables in logic.
Pietro Galliani, Jouko Väänänen

Towards Probabilistic Reasoning in Type Theory - The Intersection Type Case

The development of different probabilistic models of uncertainty has been inspired by the rapid progress in various fields, e.g. AI and probabilistic programming. Lambda calculus is a universal model of computation suitable for expressing programming language concepts. Hence, different methods for probabilistic reasoning in lambda calculus have been investigated. In this paper, we develop a formal model for probabilistic reasoning about lambda terms with intersection types, which is a combination of lambda calculus and probabilistic logic. The language of lambda calculus with intersection types is endowed with a probabilistic operator. We propose a semantics based on the possible-world approach. An infinitary axiomatization is given for this system, and it is proved to be sound with respect to the proposed semantics.
Silvia Ghilezan, Jelena Ivetić, Simona Kašterović, Zoran Ognjanović, Nenad Savić

Measuring Inconsistency in a General Information Space

AI systems often need to deal with inconsistent information. For this reason, since the early 2000s, some AI researchers have developed ways to measure the amount of inconsistency in a knowledge base. By now there is a substantial body of research on various aspects of inconsistency measuring. The problem is that most of this work applies only to knowledge bases formulated as sets of formulas in propositional logic, and hence is not really applicable to the way information is actually stored. The purpose of this paper is to extend inconsistency measuring to real-world information. We first define the concept of a general information space, which encompasses various types of databases and scenarios in AI systems. Then we show how to transform any general information space into an inconsistency-equivalent propositional knowledge base, and finally apply propositional inconsistency measures to find the inconsistency of the general information space. Our method allows for the direct comparison of the inconsistency of different information spaces, even though the data is presented in different ways. We demonstrate the transformation on three general information spaces: a relational database, a graph database, and a Blocks world scenario, to which we apply several inconsistency measures after performing the transformation.
John Grant, Francesco Parisi
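As a rough illustration of the transformation idea (our own toy encoding, not the paper's construction), a key violation in a two-tuple relation can be encoded as a propositional knowledge base and measured with the minimal-inconsistent-subset count I_MI:

```python
from itertools import combinations, product

def satisfiable(formulas, atoms):
    """Brute-force SAT: formulas are predicates over a truth assignment."""
    return any(
        all(f(dict(zip(atoms, vals))) for f in formulas)
        for vals in product([False, True], repeat=len(atoms))
    )

def mi_measure(kb, atoms):
    """Count minimal inconsistent subsets of the knowledge base (I_MI)."""
    count = 0
    for r in range(1, len(kb) + 1):
        for subset in combinations(kb, r):
            if not satisfiable(list(subset), atoms):
                # minimal iff every proper subset is satisfiable
                if all(satisfiable(list(sub), atoms)
                       for sub in combinations(subset, r - 1)):
                    count += 1
    return count

# Key violation from a tiny (hypothetical) relation, propositionally:
# a: "employee 17 is in Sales", b: "employee 17 is in HR",
# plus the constraint that employee 17 has only one department.
atoms = ("a", "b")
kb = [
    lambda v: v["a"],                     # tuple 1
    lambda v: v["b"],                     # tuple 2
    lambda v: not (v["a"] and v["b"]),    # key constraint
]
print(mi_measure(kb, atoms))  # 1: the only minimal inconsistent set is all three
```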

Parameterised Complexity of Model Checking and Satisfiability in Propositional Dependence Logic

In this paper, we initiate a systematic study of the parameterised complexity in the field of Dependence Logics, which finds its origin in the Dependence Logic of Väänänen from 2007. We study a propositional variant of this logic (PDL) and investigate a variety of parameterisations with respect to the central decision problems. The model checking problem (MC) of PDL is NP-complete (Ebbing and Lohmann, SOFSEM 2012). The subject of this research is to identify a list of parameterisations (formula-size, formula-depth, treewidth, team-size, number of variables) under which MC becomes fixed-parameter tractable. Furthermore, we show that the number of disjunctions and the arity of dependence atoms (dep-arity) as parameters both yield paraNP-completeness results. Then we consider the satisfiability problem (SAT), which classically is known to be NP-complete as well (Lohmann and Vollmer, Studia Logica 2013). Here the picture differs: under team-size or dep-arity, SAT is paraNP-complete, whereas under all other mentioned parameters the problem is in FPT. Finally, we introduce a variant of the satisfiability problem, asking for teams of a given size, and show an almost complete picture for this problem.
Yasir Mahmood, Arne Meier

Utilizing Deep Learning and RDF to Predict Heart Transplantation Survival

In this paper, we describe the conversion of three different heart transplantation data sets to a Resource Description Framework (RDF) representation and how it can be utilized to train deep learning models. These models were used to predict the outcome of patients both pre- and post-transplant and to calculate their survival time.
The International Society for Heart & Lung Transplantation (ISHLT) maintains a registry of heart transplantations gathered from grafts performed worldwide. The American organization United Network for Organ Sharing (UNOS) and the Scandinavian Scandiatransplant are contributors to this registry, although they use different data models.
We designed a unified graph representation covering these three data sets and we converted the databases into RDF triples. We used the resulting triplestore as input to several machine learning models trained to predict different aspects of heart transplantation patients.
Recipient and donor properties are essential to predict the outcome of heart transplantation patients. In contrast with the manual techniques we used to extract data from the tabulated files, the RDF triplestore, together with SPARQL, enables us to experiment quickly and automatically with different combinations of feature sets, to predict survival, and to simulate the effectiveness of organ allocation policies.
Dennis Medved, Johan Nilsson, Pierre Nugues
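The benefit of the triple representation can be sketched without any RDF library: hypothetical table rows become (subject, predicate, object) triples, and a SPARQL-style basic graph pattern reduces to pattern matching over them. Field names and values below are invented for illustration.

```python
# Rows from an invented transplant table: one triple per cell, with the
# row key as subject and the column name as predicate.
rows = [
    {"id": "patient/1", "age": 54, "diagnosis": "DCM", "survived": True},
    {"id": "patient/2", "age": 61, "diagnosis": "IHD", "survived": False},
]

triples = [
    (row["id"], pred, obj)
    for row in rows
    for pred, obj in row.items()
    if pred != "id"
]

def match(pattern):
    """Match one (s, p, o) pattern; None plays the role of a SPARQL variable."""
    return [
        t for t in triples
        if all(q is None or q == v for q, v in zip(pattern, t))
    ]

# Analogous to: SELECT ?s WHERE { ?s :diagnosis "DCM" }
print([s for s, _, _ in match((None, "diagnosis", "DCM"))])  # ['patient/1']
```

Swapping in a different feature combination is then just a different pattern, which is the kind of quick experimentation the abstract describes.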

Game Description Logic with Integers: A GDL Numerical Extension

Many problems can be viewed as games, where one or more agents try to ensure that certain objectives hold no matter how the environment and the other agents behave. In recent years, a number of logical formalisms have been proposed for specifying games, among which the Game Description Language (GDL) was established as the official language for General Game Playing. Although numbers are recurring in games, describing games with numerical features in GDL requires enumerating all possible numeric values and the relations among them. Therefore, in this paper, we introduce the Game Description Logic with Integers (GDLZ) to describe games with numerical variables and numerical parameters, as well as to perform numerical comparisons. We compare our approach with GDL and show that, when describing the same game, GDLZ is more compact.
Munyque Mittelmann, Laurent Perrussel

Craig Interpolation of Epistemic Logics with Distributed Knowledge

Distributed knowledge among agents is an important topic in multi-agent systems. While semantic studies of distributed knowledge have been carried out by several authors in the context of epistemic logic, there are few proof-theoretic studies. This paper provides cut-free Gentzen-style sequent calculi for epistemic logics with distributed knowledge and establishes the Craig Interpolation Theorem for these logics by a constructive method, i.e., the Maehara method.
Ryo Murai, Katsuhiko Sano

On the Dynamics of Structured Argumentation: Modeling Changes in Default Justification Logic

This paper studies information changes in default justification logic with argumentation semantics. We introduce dynamic operators that combine belief revision and default theory tools to define both prioritized and non-prioritized operations of contraction, expansion and revision for justification logic-based default theories. This combination enriches both default logics and belief revision techniques. We argue that the kind of attack called “undermining” amounts to those operations that contract a knowledge base by an attacked formula.
Stipe Pandžić

Logic-Based Approach to Incremental Monitoring and Optimization on Strongly Distributed Data Streams

In this paper, we systematically apply logical reduction techniques to monitoring and optimization on distributed data streams. The first technique, Feferman-Vaught reductions, describes how queries over a disjoint union of data streams can be computed from queries over the components and queries over the index set. The second, syntactically defined translation schemes, describes possible transformations of data. Combining these two techniques allows us to consider monitoring and optimization not only on disjoint unions of data streams but on much richer compositions, which we call strongly distributed data streams. Our approach is applicable to both homogeneous and heterogeneous data streams. While, as a rule, the known approaches provide some approximation of the solution of the original problem, our method derives solutions over the components of a strongly distributed data stream such that their further processing gives a result that is equivalent to the solution of the original problem on the given data stream.
Elena V. Ravve

Realisability of Choreographies

Choreographies prescribe the rendez-vous synchronisation of messages in a system of communicating finite state machines. Such a system is called realisable, if the traces of the prescribed communication coincide with those of the asynchronous system of peers, where the communication channels either use FIFO queues or multiset mailboxes. In this paper we provide two necessary conditions for synchronisability and hence for realisability of communication choreographies. We show that both conditions together are sufficient. A simple consequence is that realisability in the presence of a choreography becomes decidable. The conditions permit realisable choreographies to be obtained by means of composition, and then choreographies can be further refined into concurrent systems of communicating machines.
Klaus-Dieter Schewe, Yamine Aït-Ameur, Sarah Benyagoub

Schema Optimisation Instead of (Local) Normalisation

Classical normalisation theory has a number of lacunae, although it is commonly and widely accepted and has been the basis of database theory since the 1980s. Most textbooks and monographs still follow this approach despite a good number of open problems. Today, modern object-relational DBMS offer far better capabilities than the systems built in the past on the strict relational paradigm. Constraint maintenance has been oriented towards transforming structures into structures that are free of functional dependencies besides key constraints. The maintenance of coherence constraints, such as two-type inclusion constraints, has been neglected, although this maintenance might be the most expensive one. In reality, normalisation is a local optimisation that exclusively considers functional dependency maintenance.
We thus need a different normalisation approach. This paper develops an approach towards the optimisation of schemata and global normalisation, resulting in denormalisation and in object-relational database schemata.
Bernhard Thalheim

Strongly Minimal MapReduce Algorithms: A TeraSort Case Study

MapReduce is a widely used parallel computing paradigm for the big data realm on the scale of terabytes and higher. The introduction of minimal MapReduce algorithms promised efficiency in load balancing among participating machines by ensuring that partition skew (where some machines end up processing a significantly larger fraction of the input than others) is prevented. Although minimal MapReduce algorithms guarantee load balancing within constant multiplicative factors, the constants are relatively large, which severely diminishes their theoretical appeal for true efficiency at scale.
We introduce the notion of strongly minimal MapReduce algorithms that provide strong guarantees of parallelization up to a small additive factor that diminishes with an increasing number of machines. We show that a strongly minimal MapReduce algorithm exists for sorting; this leads to strongly minimal algorithms for several fundamental database algorithms and operations that crucially rely on sorting as a primitive. Our techniques are general and apply beyond the analysis of strongly minimal MapReduce algorithms; we show that given a sufficiently high, but still realistic, sampling rate, the approximate partitions obtained from a particular sampling strategy are almost as good as the partitions produced by an ideal partitioning.
Daniel Xia, Michael Simpson, Venkatesh Srinivasan, Alex Thomo
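The sampling idea can be sketched as a simplified single-machine simulation (our own illustration with invented parameters, not the paper's algorithm): sample the input, take evenly spaced sample quantiles as splitters, and route each record to the machine owning its key range.

```python
import random

def sample_splitters(data, machines, rate, seed=0):
    """Pick machines-1 splitters as evenly spaced quantiles of a sample."""
    rng = random.Random(seed)
    sample = sorted(x for x in data if rng.random() < rate)
    step = max(1, len(sample) // machines)
    return sample[step::step][: machines - 1]

def partition(data, splitters):
    """Route each record to the part owning its key range."""
    parts = [[] for _ in range(len(splitters) + 1)]
    for x in data:
        i = sum(x >= s for s in splitters)   # index of x's key range
        parts[i].append(x)
    return parts

data = list(range(100_000))
random.Random(1).shuffle(data)
parts = partition(data, sample_splitters(data, machines=10, rate=0.05))
sizes = [len(p) for p in parts]
print(max(sizes) / (len(data) / 10))  # load of the busiest machine vs. ideal
```

With a sufficiently high sampling rate the busiest part stays close to the ideal 1/machines share, mirroring the abstract's claim that sampled partitions approach ideal partitioning.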

Event Sequence Interpretation of Structural Geomodels: A Knowledge-Based Approach for Extracting Tectonic Sequences

Obtaining past event occurrences and their temporal order is an important part of the cognition of the external world. We call this kind of task Event Sequence Interpretation (ESI). In this work, we focus on ESI in structural geomodels and propose a knowledge-based approach for extracting tectonic sequences, which is crucial for the cognition of structural geomodels.
As a cognitive task, tectonic sequence interpretation has not been highly automated, owing to the large amount of expert knowledge needed for recognition and reasoning. Meanwhile, manual ESI may introduce cognitive biases that ultimately lead to subjective uncertainty in the results, which affects the credibility of the interpretations and increases risks in oil and gas production. One potential solution is to make personal knowledge better available to computers so that computers can also perform ESI. Therefore, we propose a meta-model for formally representing expert knowledge. An instance of the knowledge representation (KR) meta-model is called an Event Pattern (EP), which describes the associations between event occurrences and geometric features in the models. Moreover, we propose a new pattern matching model, the Joint Prototype Model (JPM), to find evidence of event occurrences in the raw geological data. The temporal relations of the events can be extracted according to the spatial topology of the geological objects. Our approach can also be extended from structural geomodels to other spatial geometric models. We show the effectiveness of the approach through an application to a real structural geomodel dataset.
Xianglin Zhan, Cai Lu, Guangmin Hu
