2012 | Original Paper | Book Chapter
On Lifted Inference for a Relational Probabilistic Conditional Logic with Maximum Entropy Semantics
Authors: Annika Krämer, Christoph Beierle
Published in: Foundations of Information and Knowledge Systems
Publisher: Springer Berlin Heidelberg
When extending probabilistic logic to a relational setting, it is desirable to still be able to use the efficient inference mechanisms developed for the propositional case. In this paper, we investigate the relational probabilistic conditional logic FO-PCL, whose semantics employs the principle of maximum entropy. While in general this semantics is defined via the ground instances of the rules in an FO-PCL knowledge base $\mathcal{R}$, the maximum entropy model can be computed on the level of rules rather than on the level of their ground instances if $\mathcal{R}$ is parametrically uniform, thus providing lifted inference. We elaborate in detail the reasons precluding $\mathcal{R}$ from being parametrically uniform. Based on this investigation, we derive a new syntactic criterion for parametric uniformity and develop an algorithm that transforms any FO-PCL knowledge base $\mathcal{R}$ into an equivalent knowledge base $\mathcal{R}'$ that is parametrically uniform.
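To make the contrast between rule-level and instance-level computation concrete, the following is a small illustrative sketch of an FO-PCL-style conditional and its grounding. The predicate names, constants, and the probability value are assumptions chosen for illustration, not taken from the chapter itself:

```latex
% A relational conditional (B | A)[x]: an elephant X likes a keeper Y
% with probability 0.9.
( \mathit{likes}(X,Y) \mid \mathit{elephant}(X) \wedge \mathit{keeper}(Y) )\,[0.9]

% Grounding over the assumed constants dumbo (elephant) and fred, mary
% (keepers) yields the instances:
( \mathit{likes}(\mathit{dumbo},\mathit{fred})
    \mid \mathit{elephant}(\mathit{dumbo}) \wedge \mathit{keeper}(\mathit{fred}) )\,[0.9]
( \mathit{likes}(\mathit{dumbo},\mathit{mary})
    \mid \mathit{elephant}(\mathit{dumbo}) \wedge \mathit{keeper}(\mathit{mary}) )\,[0.9]
```

In general, computing the maximum entropy model introduces a separate optimization parameter for each such ground instance; when the knowledge base is parametrically uniform, all instances of a rule share a single parameter, which is what allows the computation to be lifted to the level of rules.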