Monte-Carlo Sampling
1. Contents
This note covers the following topics:
Rejection sampling (Deng's Lecture Notes 8, Gelman Chapter 10); a minimal sketch follows below
Adaptive rejection sampling (Deng's Lecture Notes 8, with a great example on pp. 26-31)
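A minimal rejection-sampling sketch (my own illustration, not code from the references above), assuming a Beta(2, 5) target and a Uniform(0, 1) proposal; the function names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(x):
    # Beta(2, 5) density written out explicitly: 30 * x * (1 - x)^4 on [0, 1].
    return 30.0 * x * (1.0 - x) ** 4

# M must satisfy M >= max_x p(x) / q(x); here q(x) = 1 on [0, 1] and the
# target density peaks at about 2.46 (at x = 0.2), so M = 2.5 suffices.
M = 2.5

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        x = rng.uniform()   # propose x ~ q
        u = rng.uniform()   # accept with probability p(x) / (M * q(x))
        if u < target_pdf(x) / M:
            samples.append(x)
    return np.array(samples)

draws = rejection_sample(1000)
print(draws.mean())  # should be close to E[x] = 2 / 7 ≈ 0.286
```

The overall acceptance rate is 1/M, which is why a tight envelope matters; adaptive rejection sampling constructs such an envelope automatically for log-concave densities.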
Importance sampling (PRML Chapter 11 and corresponding notes); see the sketch below
Sampling-importance-resampling (PRML Chapter 11 and corresponding notes, Gelman Chapter 10)
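A minimal importance-sampling and SIR sketch (my own illustration, assuming a standard normal target and a wider normal proposal; all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Unnormalized log-target: standard normal (constants cancel below).
    return -0.5 * x ** 2

n = 10_000
x = rng.normal(0.0, 3.0, size=n)                 # draws from proposal q = N(0, 3^2)
log_q = -0.5 * (x / 3.0) ** 2 - np.log(3.0)      # shared 2*pi constant omitted;
                                                 # it cancels after normalization
log_w = log_p(x) - log_q                         # unnormalized log importance weights
w = np.exp(log_w - log_w.max())                  # subtract max for numerical stability
w /= w.sum()                                     # self-normalized weights

print("IS estimate of E[x]:", np.sum(w * x))     # close to 0

# Sampling-importance-resampling: resample in proportion to the weights
# to obtain approximate draws from the target.
resampled = rng.choice(x, size=1_000, replace=True, p=w)
print("SIR sample mean:", resampled.mean())
```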
Basics of Markov chains (PRML Chapter 11 and corresponding notes, Deng's Lecture Notes 11)
The Metropolis-Hastings algorithm (Deng's Lecture Notes 9, Gelman Chapter 11); a minimal sketch follows the sub-items below
- Convergence assessment and effective number of draws for the MH algorithm (Gelman Chapter 11, pp. 281-288; Deng's Lecture Notes 11)
- Jumping rules (Gelman Chapter 12, pp. 295-296; Deng's Lecture Notes 11)
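A minimal random-walk Metropolis sketch (my own illustration, assuming a standard normal target); the Gaussian jumping rule is symmetric, so the proposal ratio cancels in the acceptance probability:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Unnormalized log-target: standard normal.
    return -0.5 * x ** 2

def metropolis(n_iter, step=1.0, x0=0.0):
    x = x0
    chain = np.empty(n_iter)
    accepted = 0
    for t in range(n_iter):
        proposal = x + step * rng.normal()       # symmetric Gaussian jump
        # Accept with probability min(1, p(proposal) / p(x)).
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal
            accepted += 1
        chain[t] = x
    return chain, accepted / n_iter

chain, rate = metropolis(50_000)
# Discard burn-in draws, then check the moments: mean ≈ 0, sd ≈ 1.
print(rate, chain[10_000:].mean(), chain[10_000:].std())
```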
Gibbs Sampling (Deng's Lecture Notes 10, Gelman Chapter 11); a minimal sketch follows the sub-items below
The Gibbs sampler can be slow to converge when there is strong posterior dependence among parameters that cannot simply be removed by a linear transformation. Two remedies:
- Auxiliary variable method: introduce auxiliary variables to obtain conditionally independent components (see the example in Gelman Chapter 12, pp. 293-294)
- Parameter expansion method (see the example in Gelman Chapter 12, pp. 294-295)
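A minimal Gibbs-sampler sketch (my own illustration) for a bivariate normal with correlation rho, where both full conditionals are univariate normals; pushing rho toward 1 makes the conditionals very narrow and the chain mix slowly, which is exactly the dependence problem noted above:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_bivariate_normal(n_iter, rho=0.95):
    x = y = 0.0
    chain = np.empty((n_iter, 2))
    sd = np.sqrt(1.0 - rho ** 2)        # conditional standard deviation
    for t in range(n_iter):
        x = rng.normal(rho * y, sd)     # x | y ~ N(rho * y, 1 - rho^2)
        y = rng.normal(rho * x, sd)     # y | x ~ N(rho * x, 1 - rho^2)
        chain[t] = (x, y)
    return chain

chain = gibbs_bivariate_normal(20_000)
# Discard burn-in draws; the sample correlation should be close to rho.
print(np.corrcoef(chain[5_000:].T)[0, 1])
```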
Slice sampling (PRML Chapter 11 and corresponding notes)
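A minimal slice-sampling sketch (my own illustration, using the stepping-out and shrinkage procedures, with a standard normal target known only up to a constant):

```python
import numpy as np

rng = np.random.default_rng(0)

def p(x):
    # Unnormalized target density: standard normal shape.
    return np.exp(-0.5 * x ** 2)

def slice_sample(n_iter, x0=0.0, w=1.0):
    x = x0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        u = rng.uniform(0.0, p(x))      # auxiliary height under the density
        # Step out to find an interval containing the slice {x : p(x) > u}.
        left = x - w * rng.uniform()
        right = left + w
        while p(left) > u:
            left -= w
        while p(right) > u:
            right += w
        # Shrink: draw within the interval until a point on the slice is hit.
        while True:
            x_new = rng.uniform(left, right)
            if p(x_new) > u:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        chain[t] = x
    return chain

chain = slice_sample(10_000)
print(chain.mean(), chain.std())        # ≈ 0 and ≈ 1
```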
Hybrid Monte-Carlo (PRML Chapter 11 and corresponding notes, MCMC: Hamiltonian Monte Carlo and R. M. Neal's paper)
The leapfrog integrator introduces numerical error because the step size cannot be made arbitrarily small, so the Hamiltonian is only approximately conserved along a simulated trajectory; a Metropolis-style acceptance probability corrects for this discretization error so that the chain still targets the correct distribution.
We resample the momentum variable at the start of each outer iteration so that the total energy can change between trajectories, which yields an ergodic sampling scheme. Note that resampling the momentum does not change the position variable x, since the two are independent under the joint distribution.
Although they are independent in the target, Hamiltonian dynamics couples position and momentum within a trajectory, so the position is updated automatically as the dynamics evolve.
Because proposals follow long Hamiltonian trajectories rather than small local perturbations, we avoid random-walk behaviour. A minimal sketch follows.
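A minimal HMC sketch (my own illustration, assuming a standard normal target, not Neal's reference code): momentum is resampled in each outer iteration, the dynamics are simulated with leapfrog steps, and a Metropolis accept/reject corrects the discretization error:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Unnormalized log-target: standard normal.
    return -0.5 * x ** 2

def grad_log_p(x):
    # Gradient of the log-target, used by the leapfrog integrator.
    return -x

def hmc(n_iter, step=0.1, n_leapfrog=20, x0=0.0):
    x = x0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        r = rng.normal()                         # resample momentum ~ N(0, 1)
        x_new, r_new = x, r
        # Leapfrog integration: half momentum step, alternating full steps,
        # final half momentum step.
        r_new += 0.5 * step * grad_log_p(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step * r_new
            r_new += step * grad_log_p(x_new)
        x_new += step * r_new
        r_new += 0.5 * step * grad_log_p(x_new)
        # Metropolis correction using H(x, r) = -log p(x) + r^2 / 2:
        # accept with probability min(1, exp(H_old - H_new)).
        log_accept = (log_p(x_new) - 0.5 * r_new ** 2) - (log_p(x) - 0.5 * r ** 2)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        chain[t] = x
    return chain

chain = hmc(10_000)
print(chain.mean(), chain.std())                 # ≈ 0 and ≈ 1
```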
2. References
Wanlu Deng's Lecture Notes 8-11 (Tsinghua University, Bayesian Methods for Statistical Inference, Spring 2018).
Bishop, Christopher. Pattern Recognition and Machine Learning (PRML). Springer, 2006.
Gelman, Andrew, et al. Bayesian Data Analysis. 3rd ed. Chapman and Hall/CRC, 2013.
Neal, R. M. MCMC using Hamiltonian dynamics. In Handbook of Markov Chain Monte Carlo. Chapman and Hall/CRC, 2011.
Joan Bruna's homework 5 on HMC (New York University, DS.1005 Inference and Representation, Fall 2018).
My summary notes on PRML Chapter 11 follow below.