MAI0019
Statistical Learning Theory with Concentration Inequalities
Number of credits: 15 hp
Examiner: Timo Koski
Course literature:
- O. Catoni (2004): Statistical Learning Theory and Stochastic Optimization. Lecture Notes in Mathematics 1851. Springer-Verlag.
- V. N. Vapnik (1998): Statistical Learning Theory, Chapters 14-16. John Wiley & Sons.
- M. Vidyasagar (2003): Learning and Generalization. Springer.
- Material on Support Vector Machines.
Course contents:
- The course investigates tools for analysing the performance of simplified models used for prediction, estimation, and classification of complex data.
- The course starts with the theory and algorithms of support vector machines and with Vapnik-Chervonenkis theory.
- The techniques of analysis are rooted in information theory (minimax compression and learning, the 'blowing-up lemma'), PAC-Bayesian theorems, and concentration inequalities. The probabilistic tools are Bennett's, Hoeffding's, Chernoff's, Azuma's, and McDiarmid's inequalities; Hoeffding's inequality is stated after this list as a representative example.
- Oracle inequalities, non-asymptotic bounds on the statistical risk, self-boundedness of the Vapnik entropy, and concentration inequalities for statistical learning (the entropy method, logarithmic Sobolev inequalities) will be presented.
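As a representative example of the concentration inequalities listed above, stated here only for orientation (this is the standard formulation, not a quotation from the course literature): for independent random variables $X_1, \dots, X_n$ with $X_i \in [a_i, b_i]$ and $S_n = \sum_{i=1}^n X_i$, Hoeffding's inequality gives, for every $t > 0$,

\[
  \mathbb{P}\bigl( |S_n - \mathbb{E}[S_n]| \ge t \bigr)
  \le 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right).
\]

Bounds of this type are non-asymptotic: they hold for every fixed sample size $n$, which is what makes them suitable building blocks for the oracle inequalities and risk bounds treated in the course.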
Organisation:
Examination: Presentations by participants and home assignments.
Prerequisites:
- A graduate course in measure and integration theory (e.g., given by TM/MAI).
- A graduate course in Markov Chain Monte Carlo (e.g., given by mat.stat./MAI).
- A graduate course in probability and stochastic processes (e.g., given by mat.stat./MAI).
- A graduate course in statistical inference (e.g., given by stat./MAI).
- An undergraduate or graduate course in information theory (e.g., given by the Division of Data Transmission/ISY).
Last updated: 2014-04-29