
2015 | Book

Probabilistic Graphical Models

Principles and Applications


About this Book

This accessible text/reference provides a general introduction to probabilistic graphical models (PGMs) from an engineering perspective. The book covers the fundamentals for each of the main classes of PGMs, including representation, inference and learning principles, and reviews real-world applications for each type of model. These applications are drawn from a broad range of disciplines, highlighting the many uses of Bayesian classifiers, hidden Markov models, Bayesian networks, dynamic and temporal Bayesian networks, Markov random fields, influence diagrams, and Markov decision processes.

Features:
- presents a unified framework encompassing all of the main classes of PGMs;
- describes the practical application of the different techniques;
- examines the latest developments in the field, covering multidimensional Bayesian classifiers, relational graphical models, and causal models;
- provides exercises, suggestions for further reading, and ideas for research or programming projects at the end of each chapter.

Table of Contents

Frontmatter

Fundamentals

Frontmatter
Chapter 1. Introduction
Abstract
This introductory chapter starts by describing the effects of uncertainty in intelligent systems and presents a brief history of the development of uncertain reasoning in artificial intelligence. Then it presents the basic approach for probabilistic reasoning, motivating the development of probabilistic graphical models. It gives an overview of probabilistic graphical models, the types of models, and how these can be classified. It concludes with a description of the rest of the book.
Luis Enrique Sucar
Chapter 2. Probability Theory
Abstract
This chapter presents an overview of some basic concepts in probability theory which are important for understanding probabilistic graphical models. First, the main interpretations and mathematical definition of probability are introduced. Second, the basic rules of probability theory are presented, including the concept of conditional independence and Bayes rule. Third, an overview of random variables and some important distributions are described. Lastly, the basics of information theory are presented.
Luis Enrique Sucar
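As a pointer to the chapter's central rule, Bayes rule can be stated in its standard form (the notation here is generic, not tied to the book's examples): for a hypothesis H and evidence E,

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)},
\qquad
P(E) \;=\; \sum_i P(E \mid H_i)\, P(H_i)
```

where the denominator is obtained by marginalizing over a set of mutually exclusive and exhaustive hypotheses.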
Chapter 3. Graph Theory
Abstract
In this chapter, a review of some aspects of graph theory that are important for probabilistic graphical models is presented. After providing a definition of directed and undirected graphs, some basic theoretical graph concepts are introduced, including types of graphs, trajectories and circuits, and graph isomorphism. A section is dedicated to trees, an important type of graph. Some more advanced theoretical graph aspects required for inference in probabilistic models are introduced, including cliques, triangulated graphs, and perfect orderings. The chapter concludes with a description of the maximum cardinality search and graph filling algorithms.
Luis Enrique Sucar
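To make the maximum cardinality search mentioned in this chapter concrete, here is a minimal sketch in Python; the adjacency-dictionary representation and the example graph are assumptions for illustration, not the book's notation:

```python
def maximum_cardinality_search(adj):
    """Order the nodes of an undirected graph so that each node chosen
    has the largest number of already-numbered neighbors.
    `adj` maps each node to the set of its neighbors."""
    numbered = []              # ordering produced so far
    remaining = set(adj)
    while remaining:
        # pick the unnumbered node with most numbered neighbors (ties arbitrary)
        node = max(remaining, key=lambda v: len(adj[v] & set(numbered)))
        numbered.append(node)
        remaining.remove(node)
    return numbered

# Tiny example: a 4-cycle with a chord (a triangulated graph)
graph = {
    'A': {'B', 'C'},
    'B': {'A', 'C', 'D'},
    'C': {'A', 'B', 'D'},
    'D': {'B', 'C'},
}
print(maximum_cardinality_search(graph))  # e.g. ['A', 'B', 'C', 'D']
```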

Probabilistic Models

Frontmatter
Chapter 4. Bayesian Classifiers
Abstract
This chapter covers Bayesian classifiers. After a brief introduction to the classification problem, the Naive Bayesian classifier is presented, as well as its main variants: TAN and BAN. Then the semi-Naive Bayesian classifier is described. A multidimensional classifier may assign several classes to the same object. Two alternatives for multidimensional classification are analyzed: the multidimensional Bayesian network classifier and the Bayesian chain classifier. Then an introduction to hierarchical classification is presented. The chapter concludes by illustrating the application of Bayesian classifiers in two domains: skin pixel detection in images and drug selection for HIV treatment.
Luis Enrique Sucar
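As a rough illustration of the Naive Bayesian classifier covered in this chapter, the sketch below scores each class by multiplying its prior with the conditional probabilities of the observed attribute values, using the conditional independence assumption; the class names, attributes, and probability tables are invented for illustration:

```python
# Minimal Naive Bayes sketch with discrete attributes (illustrative numbers).
priors = {'spam': 0.3, 'ham': 0.7}
cond = {  # P(attribute is present | class)
    'spam': {'has_link': 0.8, 'all_caps': 0.6},
    'ham':  {'has_link': 0.2, 'all_caps': 0.1},
}

def naive_bayes_posterior(observed):
    """Return the normalized posterior over classes given the list of
    observed attributes that are present."""
    scores = {}
    for c in priors:
        score = priors[c]
        for attr in observed:
            score *= cond[c][attr]   # conditional independence given the class
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

print(naive_bayes_posterior(['has_link', 'all_caps']))
# 'spam' gets the higher posterior with these numbers
```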
Chapter 5. Hidden Markov Models
Abstract
Markov chains and hidden Markov models (HMMs) are particular types of PGMs that represent dynamic processes. After a brief introduction to Markov chains, this chapter focuses on hidden Markov models. The algorithms for solving the basic problems (evaluation, optimal sequence, and parameter learning) are presented. The chapter concludes with a description of several extensions to the basic HMM, and two applications: the “PageRank” procedure used by Google and gesture recognition.
Luis Enrique Sucar
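The evaluation problem mentioned in this chapter is typically solved with the forward algorithm; the sketch below computes the probability of an observation sequence for a tiny two-state model whose states, observations, and parameters are invented for illustration:

```python
# Forward algorithm sketch for a tiny HMM (illustrative parameters).
states = ['rain', 'sun']
start = {'rain': 0.5, 'sun': 0.5}
trans = {'rain': {'rain': 0.7, 'sun': 0.3},
         'sun':  {'rain': 0.4, 'sun': 0.6}}
emit  = {'rain': {'umbrella': 0.9, 'no_umbrella': 0.1},
         'sun':  {'umbrella': 0.2, 'no_umbrella': 0.8}}

def forward(observations):
    """Return P(observations) by summing over all hidden state sequences."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit[s][obs] * sum(alpha[p] * trans[p][s] for p in states)
                 for s in states}
    return sum(alpha.values())

print(forward(['umbrella', 'umbrella', 'no_umbrella']))
```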
Chapter 6. Markov Random Fields
Abstract
This chapter presents an introduction to Markov random fields (MRFs), also known as Markov networks, which are undirected graphical models. We describe how a Markov random field is represented, including its structure and parameters, with emphasis on regular MRFs. Then, a general stochastic simulation algorithm to find the optimum configuration of an MRF is described, including some of its main variants. The problem of parameter estimation for an MRF is addressed, considering the maximum likelihood estimator. Conditional random fields are also introduced. The chapter concludes with two applications of MRFs for image analysis, one for image de-noising and the other for improving image annotation by including spatial relations.
Luis Enrique Sucar
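One simple way to search for a good configuration of an MRF, closely related to the stochastic simulation methods this chapter covers, is iterated conditional modes (ICM), a deterministic variant; the toy de-noising sketch below flips each binary pixel to the value that best agrees with its observation and its neighbors. The weights and the example image are invented:

```python
# Toy binary image de-noising with an Ising-like MRF, solved with
# iterated conditional modes (ICM).  Pixels take values +1 / -1.
W_DATA, W_SMOOTH = 2.0, 1.0   # agreement with observation vs. with neighbors

def local_energy(x, obs, i, j, value):
    """Energy contribution of assigning `value` to pixel (i, j); lower is better."""
    e = -W_DATA * value * obs[i][j]
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < len(x) and 0 <= nj < len(x[0]):
            e -= W_SMOOTH * value * x[ni][nj]
    return e

def icm_denoise(obs, sweeps=5):
    x = [row[:] for row in obs]            # start from the noisy observation
    for _ in range(sweeps):
        for i in range(len(x)):
            for j in range(len(x[0])):
                x[i][j] = min((+1, -1),
                              key=lambda v: local_energy(x, obs, i, j, v))
    return x

noisy = [[1, 1, -1], [1, -1, 1], [1, 1, 1]]   # a mostly +1 patch with noise
print(icm_denoise(noisy))
```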
Chapter 7. Bayesian Networks: Representation and Inference
Abstract
This chapter introduces Bayesian networks, covering representation and inference. The basic representational aspects of a Bayesian network are presented, including the concept of D-Separation and the independence axioms. With respect to parameter specification, the two main alternatives for a compact representation are described, one based on canonical models and the other on graphical representations. Then the main algorithms for probabilistic inference are introduced, including belief propagation, variable elimination, conditioning, junction trees, loopy propagation, and stochastic simulation. The chapter concludes by illustrating the application of Bayesian networks in information validation and system reliability analysis.
Luis Enrique Sucar
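The exact inference algorithms listed in this chapter can be seen as efficient refinements of the brute-force idea of summing the joint distribution; the sketch below makes that baseline explicit on an invented three-node network (Rain and Sprinkler as parents of WetGrass), with made-up probabilities:

```python
# Brute-force inference on a tiny invented Bayesian network:
#   Rain -> WetGrass <- Sprinkler
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass = True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def joint(r, s, w):
    pw = P_wet[(r, s)] if w else 1 - P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * pw

# P(Rain | WetGrass = True): sum out Sprinkler, then normalize
num = {r: sum(joint(r, s, True) for s in (True, False)) for r in (True, False)}
total = sum(num.values())
print({r: p / total for r, p in num.items()})
```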
Chapter 8. Bayesian Networks: Learning
Abstract
This chapter gives an introduction to learning Bayesian networks, including both parameter and structure learning. Parameter learning includes how to handle uncertainty in the parameters and missing data; it also includes the basic discretization techniques. After describing the techniques for learning tree and polytree BNs, the two main types of methods for structure learning are described: score and search, and independence tests. We then describe how to combine expert knowledge and data. The chapter concludes with an application example in the area of pollution modeling.
Luis Enrique Sucar
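In the maximum likelihood case, parameter learning for a discrete Bayesian network node reduces to normalized counts of the variable's values for each configuration of its parents; the sketch below, with an invented dataset and variable names, also applies the common Laplacian smoothing used to handle uncertainty in the parameters:

```python
from collections import Counter

# Estimate the CPT P(X | Parent) from data by counting (maximum likelihood),
# with optional Laplacian smoothing.  The dataset is invented.
data = [  # (parent_value, x_value)
    ('cloudy', 'rain'), ('cloudy', 'rain'), ('cloudy', 'dry'),
    ('clear', 'dry'), ('clear', 'dry'), ('clear', 'rain'),
]
x_values = ['rain', 'dry']

def learn_cpt(data, x_values, smoothing=1.0):
    counts = Counter(data)
    parent_values = {p for p, _ in data}
    cpt = {}
    for p in parent_values:
        total = sum(counts[(p, x)] for x in x_values) + smoothing * len(x_values)
        cpt[p] = {x: (counts[(p, x)] + smoothing) / total for x in x_values}
    return cpt

print(learn_cpt(data, x_values))
```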
Chapter 9. Dynamic and Temporal Bayesian Networks
Abstract
Dynamic Bayesian network models extend BNs to represent the temporal evolution of a certain process. There are two basic types of Bayesian network models for dynamic processes: state based and event based. Dynamic Bayesian networks are state-based models that represent the state of each variable at discrete time intervals. Event-based models represent the changes in the state of each state variable; each temporal variable will then correspond to the time in which a state change occurs. In this chapter, we will review dynamic Bayesian networks and event networks, including representation, inference, and learning. The chapter includes two application examples: dynamic Bayesian networks for gesture recognition and temporal nodes Bayesian networks for HIV mutational pathways prediction.
Luis Enrique Sucar
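To illustrate the state-based (dynamic Bayesian network) view, the sketch below rolls a two-variable transition model forward one time step; the variables, CPTs, and independence structure are invented purely for illustration:

```python
# One prediction step of a tiny dynamic Bayesian network with two state
# variables, Location and Battery.  In this toy two-slice model each variable
# at time t+1 depends only on its own value at time t.
P_loc = {'home':   {'home': 0.9, 'office': 0.1},
         'office': {'home': 0.2, 'office': 0.8}}
P_bat = {'high': {'high': 0.7, 'low': 0.3},
         'low':  {'high': 0.0, 'low': 1.0}}

def predict(belief):
    """belief: dict mapping (location, battery) -> probability at time t."""
    new = {}
    for (loc, bat), p in belief.items():
        for nloc, pl in P_loc[loc].items():
            for nbat, pb in P_bat[bat].items():
                new[(nloc, nbat)] = new.get((nloc, nbat), 0.0) + p * pl * pb
    return new

belief0 = {('home', 'high'): 1.0}
print(predict(belief0))   # belief over the factored state at time t+1
```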

Decision Models

Frontmatter
Chapter 10. Decision Graphs
Abstract
This chapter introduces decision models. First, a brief review of the fundamentals of decision theory is presented. Second, we describe decision trees and their evaluation strategy. Third, influence diagrams are introduced, including two alternative evaluation strategies: variable elimination and transformation to a Bayesian network. The chapter concludes with an application of a decision model that acts as a caregiver to guide an elderly or handicapped person in washing her hands.
Luis Enrique Sucar
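The core operation when evaluating a decision tree or influence diagram is selecting the action with maximum expected utility; the sketch below does this for an invented one-shot, two-action problem:

```python
# Maximum expected utility for a one-shot decision (invented numbers).
# Evaluating a decision tree amounts to applying this computation at each
# decision node, working from the leaves back to the root.
actions = {
    'take_umbrella':  {'rain': 70, 'sun': 60},   # utility of each outcome
    'leave_umbrella': {'rain': 0,  'sun': 100},
}
P_weather = {'rain': 0.3, 'sun': 0.7}

def expected_utility(action):
    return sum(P_weather[w] * actions[action][w] for w in P_weather)

best = max(actions, key=expected_utility)
print(best, expected_utility(best))   # leave_umbrella scores 70 vs 63
```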
Chapter 11. Markov Decision Processes
Abstract
This chapter introduces sequential decision problems, in particular Markov decision processes (MDPs). A formal definition of an MDP is given, and the two most common solution techniques are described: value iteration and policy iteration. Then, factored MDPs are described, which provide a representation based on graphical models to solve very large MDPs. An introduction to partially observable MDPs (POMDPs) is also included. The chapter concludes by describing two applications of MDPs: power plant control and service robot task coordination.
Luis Enrique Sucar
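Value iteration, one of the two solution techniques named in this chapter, repeatedly applies the Bellman backup until the value function stabilizes; below is a minimal sketch on an invented two-state, two-action MDP:

```python
# Value iteration on a tiny invented MDP.
states = ['s0', 's1']
actions = ['stay', 'go']
transition = {   # transition[(s, a)] = {next_state: probability}
    ('s0', 'stay'): {'s0': 1.0},
    ('s0', 'go'):   {'s1': 0.9, 's0': 0.1},
    ('s1', 'stay'): {'s1': 1.0},
    ('s1', 'go'):   {'s0': 1.0},
}
reward = {'s0': 0.0, 's1': 1.0}   # reward for being in each state
gamma = 0.9                        # discount factor

def value_iteration(eps=1e-6):
    V = {s: 0.0 for s in states}
    while True:
        newV = {}
        for s in states:
            newV[s] = reward[s] + gamma * max(
                sum(p * V[s2] for s2, p in transition[(s, a)].items())
                for a in actions)
        if max(abs(newV[s] - V[s]) for s in states) < eps:
            return newV
        V = newV

print(value_iteration())
```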

Relational and Causal Models

Frontmatter
Chapter 12. Relational Probabilistic Graphical Models
Abstract
This chapter introduces relational probabilistic graphical models (RPGMs), which combine the expressive power of predicate logic with the uncertain reasoning capabilities of probabilistic graphical models. First, a brief review of propositional and predicate logic is presented. Then, two different relational probabilistic formalisms are described: probabilistic relational models and Markov logic networks. Finally, the application of the two previous approaches is illustrated in two domains, student modeling for a virtual laboratory and visual object recognition based on symbol-relational grammars.
Luis Enrique Sucar
Chapter 13. Graphical Causal Models
Abstract
This chapter gives an introduction to causal modeling, in particular to causal Bayesian networks. It starts by introducing causal models and their importance. Then causal Bayesian networks are described, including two types of causal reasoning, prediction and counterfactuals. It continues with the topic of learning causal models, presenting one of the state-of-the-art techniques. Finally, it shows an example of learning causal models from real-world data about children with Attention Deficit Hyperactivity Disorder.
Luis Enrique Sucar
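The difference between observing and intervening in a causal Bayesian network is usually expressed with the do-operator; for a single confounder Z of cause X and effect Y, the interventional distribution is obtained by adjustment (a standard formulation, not tied to the book's examples):

```latex
P(Y \mid do(X = x)) \;=\; \sum_{z} P(Y \mid X = x, Z = z)\, P(Z = z)
```

In contrast, the purely observational P(Y | X = x) would also reflect the influence of Z on X, which is exactly what prediction under intervention must remove.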
Backmatter
Metadata
Title
Probabilistic Graphical Models
Author
Luis Enrique Sucar
Copyright Year
2015
Publisher
Springer London
Electronic ISBN
978-1-4471-6699-3
Print ISBN
978-1-4471-6698-6
DOI
https://doi.org/10.1007/978-1-4471-6699-3