The lectures will deal with the following issues:
- Discrete Stochastic Dynamic Programming Models (DSDPM). Introduction and definitions, the Bellman equation, introduction to recursive methods.
- An example of DSDPM. Solving a partial equilibrium search problem, maximum likelihood estimation.
- General Framework. State space, laws of motion, classification of solution methods, dimension of the integrals, the curse of dimensionality.
- Direct Solution Methods. Schooling decisions as an optimal stopping problem, the extreme-value dynamic logit model (Rust's method).
- Computationally Intensive Methods. Simulation Methods, Interpolation Methods.
- Identification and Estimation. A theorem by Hotz and Miller, the degree of underidentification, restrictions on preferences.
- Alternative Solution Methods. Estimation by conditional choice probabilities, OLS estimation, non-rational expectations (the expectation parametrization method).
- Applications. Examples of computer programs for various methods.
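To fix ideas before the lectures, the Bellman equation and value function iteration from the first topics can be illustrated on a textbook partial equilibrium job-search (optimal stopping) problem: each period an unemployed worker draws a wage offer and either accepts it (earning it forever) or rejects it, collects a benefit, and searches again. The sketch below is purely illustrative; all parameter values (discount factor, benefit level, wage grid) are assumptions, not taken from the course material.

```python
import numpy as np

# Toy job-search model solved by value function iteration on the
# Bellman equation  V(w) = max{ w/(1-beta), c + beta * E[V(w')] }.
# Parameter values are illustrative assumptions.
beta = 0.95                                     # discount factor
c = 25.0                                        # benefit while searching
wages = np.linspace(10.0, 60.0, 51)             # finite grid of wage offers
probs = np.full(wages.size, 1.0 / wages.size)   # uniform offer distribution

v = np.zeros_like(wages)                        # initial guess for V(w)
for _ in range(1000):
    accept = wages / (1.0 - beta)               # value of accepting offer w
    reject = c + beta * probs @ v               # value of rejecting and searching
    v_new = np.maximum(accept, reject)          # Bellman operator
    if np.max(np.abs(v_new - v)) < 1e-8:        # sup-norm convergence check
        v = v_new
        break
    v = v_new

# The optimal policy is a reservation-wage rule: accept the first
# offer whose acceptance value exceeds the continuation value.
reservation_wage = wages[wages / (1.0 - beta) >= c + beta * probs @ v][0]
```

Because the Bellman operator is a contraction with modulus `beta`, the iteration converges to the unique fixed point, and the resulting value function is nondecreasing in the offer, which is what makes the accept/reject policy a single reservation-wage cutoff.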
Bellman, R. (1957), Dynamic Programming, Princeton University Press, Princeton.
Eckstein, Z. and K. Wolpin (1989), "The Specification and Estimation of Dynamic Stochastic Discrete Choice Models", Journal of Human Resources, 24, 562-598.
Keane, M. P. and K. Wolpin (1997), "The Career Decisions of Young Men", Journal of Political Economy, 105, 473-522.
Rust, J. (1994), "Structural Estimation of Markov Decision Processes" in Handbook of Econometrics, Engle, R. and D. McFadden (eds.), Elsevier Science, North-Holland Publishers, Amsterdam, 3081-3143.