2000 MarkovChainSamplingMethods

From GM-RKB

Subject Headings: Markov Chain Algorithm

Notes

Quotes

Key Words

Auxiliary variable methods; Density estimation; Latent class models; Monte Carlo; Metropolis-Hastings algorithm.

Abstract

This article reviews Markov chain methods for sampling from the posterior distribution of a Dirichlet process mixture model and presents two new classes of methods. One new approach is to make Metropolis-Hastings updates of the indicators specifying which mixture component is associated with each observation, perhaps supplemented with a partial form of Gibbs sampling. The other new approach extends Gibbs sampling for these indicators by using a set of auxiliary parameters. These methods are simple to implement and are more efficient than previous ways of handling general Dirichlet process mixture models with non-conjugate priors.

1. Introduction

Modeling a distribution as a mixture of simpler distributions is useful both as a nonparametric density estimation method and as a way of identifying latent classes that can explain the dependencies observed between variables. Mixtures with a countably infinite number of components can reasonably be handled in a Bayesian framework by employing a prior distribution for mixing proportions, such as a Dirichlet process, that leads to a few of these components dominating. Use of countably infinite mixtures bypasses the need to determine the "correct" number of components in a finite mixture model, a task which is fraught with technical difficulties. In many contexts, a countably infinite mixture is also a more realistic model than a mixture with a small number of components.
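
To see concretely how a Dirichlet process prior leads to a few components dominating, the following minimal sketch draws cluster indicators from the Chinese restaurant process that a DP prior with concentration parameter alpha induces. The function name, the seed, and the choice alpha = 1.0 are illustrative assumptions, not part of the article.

```python
import numpy as np

def crp_indicators(n, alpha, seed=None):
    """Draw n cluster indicators from the Chinese restaurant process
    implied by a Dirichlet process prior with concentration alpha."""
    rng = np.random.default_rng(seed)
    counts = []        # counts[k] = number of observations in component k
    indicators = []
    for i in range(n):
        # Join existing component k with probability counts[k] / (i + alpha);
        # open a new component with probability alpha / (i + alpha).
        probs = np.array(counts + [alpha], dtype=float) / (i + alpha)
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)   # a new component is created
        else:
            counts[k] += 1
        indicators.append(int(k))
    return indicators, counts

_, counts = crp_indicators(1000, alpha=1.0, seed=0)
print(sorted(counts, reverse=True)[:10])  # a few components dominate
```

Although infinitely many components are possible in principle, a run like this typically produces only a handful of large components, which is the behavior the countably infinite mixture model exploits.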

Use of Dirichlet process mixture models has become computationally feasible with the development of Markov chain methods for sampling from the posterior distribution of the parameters of the component distributions and/or of the associations of mixture components with observations. Methods based on Gibbs sampling can easily be implemented for models based on conjugate prior distributions, but when non-conjugate priors are used, as is appropriate in many contexts, straightforward Gibbs sampling requires that an often difficult numerical integration be performed. West, Muller, and Escobar (1994) used a Monte Carlo approximation to this integral, but the error from using such an approximation is likely to be large in many contexts.
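
The difficult integration referred to here is the prior predictive probability of an observation under the base distribution G0, i.e. the integral of F(y | theta) with respect to G0(theta), which governs the probability of assigning an observation to a new component. Below is a minimal sketch of a Monte Carlo approximation in the spirit of West, Muller, and Escobar (1994); the Cauchy base measure, normal likelihood, and all names are assumptions chosen only to make the example genuinely non-conjugate.

```python
import numpy as np
from scipy.stats import norm

def mc_prior_predictive(y, g0_sampler, likelihood, m=100, seed=None):
    """Approximate the prior predictive integral of F(y | theta) dG0(theta)
    by averaging the likelihood over m draws of theta from G0.  With a
    non-conjugate prior this has no closed form, and the Monte Carlo
    error can be substantial unless m is large."""
    rng = np.random.default_rng(seed)
    thetas = g0_sampler(m, rng)
    return float(np.mean(likelihood(y, thetas)))

# Illustrative non-conjugate pairing: a Cauchy base measure G0 for a
# location parameter, with a normal likelihood F.
g0 = lambda m, rng: rng.standard_cauchy(size=m)
lik = lambda y, theta: norm.pdf(y, loc=theta, scale=1.0)

print(mc_prior_predictive(0.5, g0, lik, m=1000, seed=0))
```

The approximation error this introduces into the Gibbs sampler is exactly the weakness noted above, and avoiding it is what motivates the exact methods discussed next.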

MacEachern and Muller (1998) devised an exact approach for handling non-conjugate priors that uses a mapping from a set of auxiliary parameters to the set of parameters currently in use. Their "no gaps" and "complete" algorithms based on this approach are widely applicable, but somewhat inefficient. Walker and Damien (1998) applied a rather different auxiliary variable method to some Dirichlet process mixture models, but their method appears to be unsuitable for general use, as it again requires the computation of a difficult integral.

In this article, I review this past work and present two new approaches to Markov chain sampling. A very simple method for handling non-conjugate priors is to use Metropolis-Hastings updates with the conditional prior as the proposal distribution. A variation of this method may sometimes sample more efficiently, particularly when combined with a partial form of Gibbs sampling. Another class of methods uses Gibbs sampling in a space with auxiliary parameters. The simplest method of this type is very similar to the "no gaps" algorithm of MacEachern and Muller, but is more efficient. This approach also yields an algorithm that resembles use of a Monte Carlo approximation to the necessary integrals, but which does not suffer from any approximation error.
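
As a rough illustration of the first of these approaches, the sketch below updates a single indicator by proposing a value from its conditional prior given the other indicators (the Chinese restaurant process probabilities) and applying a Metropolis-Hastings accept/reject test. Because the proposal distribution is the conditional prior, the acceptance probability reduces to a ratio of likelihoods. This is a hedged reading of the method under an assumed model setup, not the paper's exact pseudocode, and all names are illustrative.

```python
import numpy as np

def mh_update_indicator(i, c, phi, y, alpha, g0_sampler, likelihood, rng):
    """One Metropolis-Hastings update of the indicator c[i], proposing
    from its conditional prior given the other indicators.  c is an
    integer array of indicators; phi maps component labels to parameters."""
    others = np.delete(c, i)
    labels, counts = np.unique(others, return_counts=True)
    # Conditional prior: an existing component with probability
    # proportional to its count among the other observations, a brand-new
    # component with probability proportional to alpha.
    probs = np.append(counts, alpha).astype(float)
    probs /= probs.sum()
    j = rng.choice(len(probs), p=probs)
    if j == len(labels):                 # a new component is proposed
        cand_label = max(phi) + 1
        cand_phi = g0_sampler(rng)       # draw its parameter from G0
    else:
        cand_label = int(labels[j])
        cand_phi = phi[cand_label]
    # The proposal is the conditional prior, so the acceptance
    # probability is just the likelihood ratio.
    a = likelihood(y[i], cand_phi) / likelihood(y[i], phi[c[i]])
    if rng.random() < min(1.0, a):
        phi[cand_label] = cand_phi
        c[i] = cand_label                # pruning of emptied components
    return c, phi                        # is omitted for brevity
```

In practice such an update would be swept over all observations and alternated with updates of the component parameters; the partial Gibbs sampling and auxiliary-parameter methods mentioned above refine this basic step.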


BibTeX

@article{2000_MarkovChainSamplingMethods,
  author  = {Radford M. Neal},
  title   = {Markov Chain Sampling Methods for Dirichlet Process Mixture Models},
  journal = {Journal of Computational and Graphical Statistics},
  year    = {2000},
  url     = {http://gs2040.sp.cs.cmu.edu/neal.pdf},
}