CMPUT 656 - Matrix and Convex Methods in Machine Learning
Department of Computing Science
University of Alberta
Winter 2010

Instructor: Dale Schuurmans, Ath409, x2-4806, dale@cs.ualberta.ca
Room: CSC B-41
Time: TR 12:30-14:00
Office hours: R 14:00-15:00 (or by appointment)
This course will cover the fundamental concepts and key algorithms of matrix analysis and convex optimization as they apply to machine learning problems. A key subtopic will be representation learning based on sparse regularization and the nonsmooth optimization methods that support it. The lectures will be organized around important machine learning challenges, but will also focus on presenting globally optimal solution techniques from matrix theory and mathematical programming. An auxiliary goal of the course is to provide familiarity with the key algorithmic ideas behind matrix analysis and convex optimization, which are of growing importance in machine learning.
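To give a flavor of the sparse regularization and nonsmooth optimization just mentioned, here is a minimal sketch (not course material) of the lasso, i.e. l1-regularized least squares, solved by proximal gradient descent (ISTA); the problem sizes and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink every entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam=0.5, n_iters=500):
    """Minimize (1/2)||Xw - y||^2 + lam*||w||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2              # Lipschitz constant of the smooth part
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)               # gradient of the squared-loss term
        w = soft_threshold(w - grad / L, lam / L)  # nonsmooth proximal step
    return w

# Illustrative usage: recover a sparse weight vector from noisy measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]                  # only three nonzero coefficients
y = X @ w_true + 0.01 * rng.standard_normal(100)
print(np.round(lasso_ista(X, y), 2))           # most entries come out exactly zero
```

The l1 penalty is convex but nonsmooth at zero, which is exactly why a proximal (soft-thresholding) step replaces the plain gradient step; this interplay between convex formulations and nonsmooth solvers is the kind of technique the course develops.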
There are no formal prerequisites for this course, but I assume some prior familiarity with machine learning problems and techniques. Although much of the material is necessarily based on matrix analysis and multivariable optimization, I expect to fill in this background as the course proceeds. Nevertheless, a willingness to refresh basic linear algebra and calculus, and to acquire some knowledge of convex analysis as its usefulness becomes apparent, will be most helpful.
Course work:

| Component | Weight |
|---|---|
| 2 Assignments | 25% each |
| Project | 50% |
References:
There will be no official text for this course. Most of the material will be covered in lecture notes and supplementary excerpts that will be provided.
Supplementary References On-line:
- The Matrix Reference Manual, by Mike Brookes: www.ee.ic.ac.uk/hp/staff/dmb/matrix/intro.html
- The Matrix Cookbook, by Kaare Brandt Petersen and Michael Syskind Pedersen: matrixcookbook.com
- Convex Optimization, by Stephen Boyd and Lieven Vandenberghe: www.stanford.edu/~boyd/cvxbook.html
- Lecture Notes on Convex Programming, by Arkadi Nemirovski: www2.isye.gatech.edu/~nemirovs
(More will be added as the course proceeds.)
Schedule:

| Lecture | Topic | Date |
|---|---|---|
| Lecture 1 | Introduction | Tues Jan 5 |
| Part 1 | Basic Forms | |
| Lecture 2 | Supervised learning forms | Thur Jan 7 |
| Lecture 3 | Optimal clustering 1 | Tues Jan 12 |
| Lecture 4 | Optimal clustering 2 | Thur Jan 14 |
| Lecture 5 | Optimal embedding 1 | Tues Jan 19 |
| Lecture 6 | Optimal embedding 2 | Thur Jan 21 |
| Lecture 7 | Gaussian models | Tues Jan 26 |
| break | | |
| Part 2 | Generalized Loss Minimization | |
| Lecture 8 | Generalized loss: supervised learning | Thur Feb 4 |
| Lecture 9 | Generalized loss: clustering | Tues Feb 9 |
| Lecture 10 | Generalized loss: embedding | Thur Feb 11 |
| reading week | | |
| Part 3 | Representation Learning | |
| Lecture 11 | Supervised case: sparse regularization | Tues Feb 23 |
| Lecture 12 | Kernel learning | Thur Feb 25 |
| Lecture 13 | Sparse coding | Tues Mar 2 |
| Lecture 14 | Multitask representation learning | Thur Mar 4 |
| break | | |
| Lecture 15 | Deep learning | Tues Mar 16 |
| Part 4 | Structured-Output Learning | |
| Lecture 16 | Supervised structured-output learning | Thur Mar 18 |
| Lecture 17 | Structured clustering | Tues Mar 23 |
| Lecture 18 | Structured embedding | Thur Mar 25 |
| Part 5 | Compressed Sensing | |
| Lecture 19 | Compressed sensing & sparse reconstruction 1 | Tues Mar 30 |
| Lecture 20 | Compressed sensing & sparse reconstruction 2 | Thur Apr 1 |
| Lecture 21 | Relational representation learning 1 | Tues Apr 6 |
| Lecture 22 | Relational representation learning 2 | Thur Apr 8 |