Hidden Markov Models and Sequential Monte Carlo methods


So-called hidden Markov (or state-space) models are time series models involving a "signal" (a Markov process $(X_t)$ describing the state of a system) that is observed imperfectly and noisily, in the form of data such as $Y_t = f(X_t) + \epsilon_t$. These models are widely used in many disciplines:

  • Finance: stochastic volatility ($X_t$ is the unobserved volatility). 
  • Engineering: target tracking ($X_t$ is the position of a moving target whose trajectory we wish to reconstruct); speech recognition ($X_t$ is a phoneme). 
  • Biostatistics: ecology ($X_t$ = population size); epidemiology ($X_t$ = number of infected individuals).
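As a concrete illustration, the following sketch simulates a basic stochastic volatility model of this kind: the signal $(X_t)$ is an autoregressive log-volatility process, and only the returns $Y_t$ are observed. The specific model and parameter values are illustrative assumptions, not taken from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic volatility model (illustrative parameters):
#   signal:      X_t = mu + rho * (X_{t-1} - mu) + sigma * U_t,   U_t ~ N(0, 1)
#   observation: Y_t = exp(X_t / 2) * V_t,                        V_t ~ N(0, 1)
mu, rho, sigma = -1.0, 0.9, 0.3
T = 200

x = np.empty(T)
x[0] = rng.normal(mu, sigma / np.sqrt(1 - rho**2))  # stationary initial law
for t in range(1, T):
    x[t] = mu + rho * (x[t - 1] - mu) + sigma * rng.normal()

y = np.exp(x / 2) * rng.normal(size=T)  # only y is observed; x stays hidden
```

The filtering problem of the course is then to recover the law of $X_t$ given $Y_0,\dots,Y_t$, which has no closed form for this model.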

The aim of this course is to present modern methods for the sequential analysis of such models, based on particle algorithms (Sequential Monte Carlo). The problems of filtering, smoothing, prediction, and parameter estimation will be discussed.
At the end of the course, we will also briefly discuss the extension of such algorithms to non-sequential problems, notably in Bayesian statistics.

Prerequisites:

  • 2A course "Simulation and Monte Carlo", or a similar course. 
  • The 3A courses "Computational Statistics" and "Bayesian Statistics" are recommended but not mandatory.

At the end of the course, the student will be able to: 

  • state the main properties of hidden Markov models; 
  • implement a particle filter to perform filtering and smoothing for a given HMM; 
  • estimate the parameters of such a model using various methods.


Course outline:

  1. Introduction: definition of hidden Markov models (HMMs); main properties; the notions of filtering, smoothing and prediction; forward-backward formulas.
  2. Discrete HMMs; the Baum-Petrie algorithm.
  3. Linear Gaussian HMMs; the Kalman filter.
  4. SMC algorithms for filtering an HMM.
  5. Parameter estimation in HMMs.
  6. Introduction to non-sequential applications of SMC algorithms.
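To give a flavour of the material on filtering, here is a minimal sketch of a bootstrap particle filter applied to a toy linear Gaussian HMM. The model, parameter values and variable names are illustrative assumptions (for a linear Gaussian model the Kalman filter gives the exact answer, which makes it a convenient test case); the course covers such algorithms in full generality.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear Gaussian HMM (illustrative parameters):
#   X_t = rho * X_{t-1} + sigma_x * U_t,   Y_t = X_t + sigma_y * V_t
rho, sigma_x, sigma_y = 0.9, 1.0, 0.5
T, N = 100, 1000  # time horizon, number of particles

# Simulate a trajectory and the corresponding observations.
x_true = np.empty(T)
x_true[0] = rng.normal(0, sigma_x / np.sqrt(1 - rho**2))
for t in range(1, T):
    x_true[t] = rho * x_true[t - 1] + sigma_x * rng.normal()
y = x_true + sigma_y * rng.normal(size=T)

# Bootstrap particle filter: propagate particles through the signal
# kernel, weight them by the likelihood of the new observation, then
# resample (multinomially) at every step.
particles = rng.normal(0, sigma_x / np.sqrt(1 - rho**2), size=N)
filt_means = np.empty(T)
for t in range(T):
    if t > 0:
        particles = rho * particles + sigma_x * rng.normal(size=N)
    logw = -0.5 * ((y[t] - particles) / sigma_y) ** 2  # Gaussian log-likelihood (up to a constant)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filt_means[t] = np.sum(w * particles)           # estimate of E[X_t | Y_{0:t}]
    particles = rng.choice(particles, size=N, p=w)  # multinomial resampling
```

The array `filt_means` approximates the filtering means; for this linear Gaussian model it can be checked against the exact Kalman filter output.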


References:

Del Moral, P. (2004). Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Springer.
Cappé, O., Moulines, E. and Rydén, T. (2005). Inference in Hidden Markov Models. Springer Series in Statistics, Springer.