ENSAE Paris - School of engineering for economics, data science, finance and actuarial science

Advanced Machine Learning

Teacher

STROMME Austin

Department: Statistics

Objective

The objective of this course is to present the theoretical foundations of statistical learning, focusing mainly on binary classification.

The statistical (and algorithmic) complexity of this problem will first be studied through the analysis of Empirical Risk Minimization (ERM), and then through other general classes of algorithms: SVMs, neural networks, boosting, and random forests (if time permits). Their statistical properties will be discussed and compared.
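For concreteness, the standard setup behind this analysis can be sketched as follows (illustrative notation, not drawn from the course material itself): one observes i.i.d. labelled pairs, measures a classifier h by its misclassification risk, and ERM selects the classifier minimizing the empirical risk over a fixed class.

% Sketch of the standard binary classification / ERM setup (notation is illustrative, assuming i.i.d.
% pairs (X_i, Y_i) with labels in {0,1} and a class \mathcal{H} of candidate classifiers).
\[
  R(h) = \mathbb{P}\bigl(h(X) \neq Y\bigr),
  \qquad
  \widehat{R}_n(h) = \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{h(X_i) \neq Y_i\},
  \qquad
  \widehat{h}_n \in \mathop{\mathrm{arg\,min}}_{h \in \mathcal{H}} \widehat{R}_n(h).
\]

The theoretical part of the course then controls the excess risk R(\widehat{h}_n) - \inf_{h \in \mathcal{H}} R(h), for instance via the VC inequality appearing in the outline below.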

These algorithms will be applied to real data during the practical (TP) sessions.

Outline

  1. The framework and examples
  2. Empirical Risk Minimization
  3. VC dimension and the VC inequality
  4. Convexification and general losses
  5. Chaining
  6. Support Vector Machines
  7. Boosting
  8. Feed-Forward Neural Networks

References

Y. Mansour, Machine Learning: Foundations, Tel-Aviv University, 2013.
P. Rigollet, Mathematics of Machine Learning, MIT, 2015.
A. Ng, Machine Learning, Stanford, 2015.
S. Kakade and A. Tewari, Topics in Artificial Intelligence, TTIC, 2008.

L. Devroye, L. Györfi and G. Lugosi, A Probabilistic Theory of Pattern Recognition, Springer, 1996.
T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, 2009.