2014 | OriginalPaper | Chapter
Variational EM Learning of DSBNs with Conditional Deep Boltzmann Machines
Authors: Xing Zhang, Siwei Lyu
Published in: Artificial Neural Networks and Machine Learning – ICANN 2014
Publisher: Springer International Publishing
Variational EM (VEM) is an efficient parameter learning scheme for deep sigmoid belief networks (DSBNs), i.e., sigmoid belief networks with many layers of latent variables. The choice of the inference model that forms the variational lower bound of the log likelihood is critical in VEM learning. The mean-field approximation and the wake-sleep algorithm use simple inference models that are computationally efficient, but they may approximate the true posterior densities poorly when the latent variables have strong mutual dependencies. In this paper, we describe a variational EM learning method for DSBNs with a new inference model known as the conditional deep Boltzmann machine (cDBM), which is an undirected graphical model capable of representing complex dependencies among latent variables. We show that this algorithm does not require computing the intractable partition function of the undirected cDBM model, and that it can be accelerated with contrastive learning. The performance of the proposed method is evaluated and compared on handwritten digit data.
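To make the mean-field inference model mentioned in the abstract concrete, the following is a minimal toy sketch, not the paper's DSBN/cDBM method: a single-layer sigmoid belief network small enough that the exact posterior can be computed by enumeration, with a factorized (mean-field) distribution fitted by coordinate ascent on the variational lower bound (ELBO). All parameter values and dimensions here are arbitrary assumptions for illustration.

```python
import itertools
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy single-layer sigmoid belief network (illustrative sizes, not from the paper):
# p(h_j = 1) = sigmoid(c_j),  p(v_i = 1 | h) = sigmoid(W h + b)_i
H, V = 3, 5
W = rng.normal(size=(V, H))
b = rng.normal(size=V)
c = rng.normal(size=H)
v = rng.integers(0, 2, size=V).astype(float)  # one observed binary vector

def log_joint(h):
    """log p(v, h) for a single binary latent configuration h."""
    p_v = sigmoid(W @ h + b)
    lv = np.sum(v * np.log(p_v) + (1 - v) * np.log(1 - p_v))
    p_h = sigmoid(c)
    lh = np.sum(h * np.log(p_h) + (1 - h) * np.log(1 - p_h))
    return lv + lh

# Exact posterior over h by enumerating all 2^H configurations (feasible only for toy H).
configs = np.array(list(itertools.product([0.0, 1.0], repeat=H)))
logp = np.array([log_joint(h) for h in configs])
post = np.exp(logp - logp.max())
post /= post.sum()
exact_marg = configs.T @ post  # exact posterior marginals E[h_j | v]

def elbo(mu):
    """Variational lower bound under the factorized q(h) = prod_j Bernoulli(mu_j)."""
    val = 0.0
    for h in configs:
        q = np.prod(np.where(h == 1.0, mu, 1.0 - mu))
        val += q * (log_joint(h) - np.log(q))
    return val

# Mean-field E-step: coordinate ascent
#   mu_j <- sigmoid( E_{q(h_-j)}[ log p(v, h_j=1, h_-j) - log p(v, h_j=0, h_-j) ] ),
# with the expectation over h_-j taken exactly by enumeration.
mu = np.full(H, 0.5)
elbo0 = elbo(mu)
for _ in range(20):
    for j in range(H):
        others = [k for k in range(H) if k != j]
        diff = 0.0
        for h_rest in itertools.product([0.0, 1.0], repeat=H - 1):
            h = np.zeros(H)
            for idx, k in enumerate(others):
                h[k] = h_rest[idx]
            w = np.prod([mu[k] if h[k] == 1.0 else 1.0 - mu[k] for k in others])
            h[j] = 1.0
            lj1 = log_joint(h)
            h[j] = 0.0
            lj0 = log_joint(h)
            diff += w * (lj1 - lj0)
        mu[j] = sigmoid(diff)

print("exact marginals:", exact_marg)
print("mean-field marginals:", mu)
```

Each coordinate update maximizes the ELBO exactly in one coordinate, so the bound is monotonically non-decreasing, but because q is fully factorized, its marginals can deviate from the exact ones when the latent variables are strongly coupled; this is the gap the paper's cDBM inference model, with its undirected dependencies among latent variables, is designed to close.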