CS 544 Optimization methods in vision and learning


Instructor: D.A. Forsyth, 3310 Siebel, daf@uiuc.edu

TA: Xiaoming Zhao, xz23@illinois.edu

Office hours:

MPs

Current Notes

  1. Week 1:
  2. Week 2:
  3. Week 3:
  4. Week 4: Constrained optimization basics
  5. Week 5: Inequalities, first look
  6. Week 6: Interior Point Methods
  7. Week 7: Finish Interior Point Methods; start with discrete problems
  8. Week 8: Various discrete problems
  9. Week 9: More discrete problems and graphical models
  10. Week 10: Graphical models and other approximate solution techniques; papers:

    These go far beyond where I'll go in lecture, but lectures will set you up to read these.

  11. Week 11: Finish graphical models; start various first order methods; papers and notes:

    These go far beyond where I'll go in lecture, but lectures will set you up to read these.

  12. Week 12: More first order methods; papers and notes:

    These go slightly beyond where I'll go in lecture.

  13. Week 13: Yet more first order methods; papers and notes:

    These go notably beyond where I'll go in lecture.

  14. Week 14: Gradient boost and a taste of Bayesian Optimization; papers and notes:

    These go notably beyond where I'll go in lecture.
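
Weeks 6 and 7 above cover interior point methods. As a concrete taste, here is a minimal log-barrier sketch for the one-variable problem "minimize x^2 subject to x >= 1"; this is illustrative code written for this page, not taken from the course notes, and all names (`solve_barrier`, `mu`, etc.) are invented:

```python
import math

# Log-barrier interior point sketch for: minimize x^2 subject to x >= 1.
# Replace the hard constraint with a barrier term -(1/t) * log(x - 1),
# solve the smooth problem with damped Newton, then increase t and repeat.

def barrier_gradient(x, t):
    return 2.0 * x - (1.0 / t) / (x - 1.0)

def barrier_hessian(x, t):
    return 2.0 + (1.0 / t) / (x - 1.0) ** 2

def solve_barrier(x=2.0, t=1.0, mu=10.0, outer=8, inner=50):
    for _ in range(outer):
        for _ in range(inner):          # damped Newton on the barrier problem
            step = barrier_gradient(x, t) / barrier_hessian(x, t)
            alpha = 1.0
            while x - alpha * step <= 1.0:   # backtrack to stay strictly feasible
                alpha *= 0.5
            x = x - alpha * step
        t *= mu                          # tighten the barrier
    return x

print(solve_barrier())  # approaches the constrained optimum x = 1 from the interior
```

As t grows, the unconstrained optima trace the central path toward the true constrained solution x = 1.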

Resources

    Continuous optimization books
  1. Numerical Optimization (Springer Series in Operations Research and Financial Engineering) by Jorge Nocedal and Stephen Wright, 2006
  2. Convex Optimization by Stephen Boyd and Lieven Vandenberghe, Cambridge, 2004
  3. Understanding and Using Linear Programming by J. Matoušek and B. Gärtner, Springer, 2007
  4. Introduction to Optimization, P. Pedregal, Springer, 2004
  5. Optimization for Machine Learning, Sra, Nowozin and Wright, MIT Press, 2011
  6. Foundations of Optimization by O. Güler, Springer, 2010
  7. Nonlinear Programming by Dimitri P. Bertsekas, Athena, 1999
  8. Practical Methods of Optimization by R. Fletcher, Wiley, 2000
  9. Practical Optimization by Philip E. Gill, Walter Murray, Margaret H. Wright, Academic, 1982
  10. The EM Algorithm and Extensions, 2nd Edition, by Geoffrey McLachlan and Thriyambakam Krishnan, Wiley, 2008

Notes From Last Time

  1. Week 1: Basic continuous optimization: descent directions, coordinate ascent, EM as coordinate ascent, Newton's method, stabilized Newton's method (my notes)
  2. Weeks 2 and 3: More basic continuous optimization: trust regions, dogleg method, subspace method (my notes); conjugate gradient (old notes); conjugate gradient and Polak-Ribière (notes; use these for conjugate gradient as well); quasi-Newton methods for big problems, including DFP, BFGS, and limited-memory methods (my notes)
  3. Weeks 4, 5, and 6: Constrained optimization methods
  4. Initial remarks on combinatorial optimization (my notes); flows and cuts (my notes)
  5. Alpha-expansion and alpha-beta swaps; MRFs as cuts (my notes)
  6. Stochastic Average Gradient (my notes)
  7. Matchings (my notes); bipartite graph matching as a linear program (my notes)
  8. Dual decomposition methods (my notes); ADMM (my notes; there really are two page 13's, sorry)
  9. The graphical models block
  10. Proximal Algorithms (my notes)
  11. Gradient Boost (my terse notes; more detailed notes with some stuff on xgboost)
  12. Striking behavior of gradient descent (my notes)
  13. The Linear Quadratic Regulator (my notes)
  14. Markov Decision Processes (my notes)
  15. Learning from an expert, I (my notes)
  16. Structure Learning (my notes)
  17. Papers
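
The week 1 notes above cover stabilized Newton's method. As a minimal sketch of one common stabilization, the code below adds a multiple of the identity until the Hessian admits a Cholesky factorization (so the step is a descent direction) and then backtracks with an Armijo line search, applied to the Rosenbrock function. This is illustrative code written for this page, not from the notes; names like `stabilized_newton` are invented:

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def hess(x):
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

def stabilized_newton(x, iters=100):
    for _ in range(iters):
        g, H = grad(x), hess(x)
        lam = 0.0
        # Hessian modification: grow lam until H + lam * I is positive
        # definite, guaranteeing the Newton step is a descent direction.
        while True:
            try:
                np.linalg.cholesky(H + lam * np.eye(2))
                break
            except np.linalg.LinAlgError:
                lam = max(2 * lam, 1e-3)
        step = np.linalg.solve(H + lam * np.eye(2), g)
        # Backtracking (Armijo) line search on the objective.
        t = 1.0
        while t > 1e-10 and rosenbrock(x - t * step) > rosenbrock(x) - 1e-4 * t * (g @ step):
            t *= 0.5
        x = x - t * step
    return x

print(stabilized_newton(np.array([-1.2, 1.0])))  # → near [1, 1]
```

Plain Newton can diverge or head to a saddle when the Hessian is indefinite; the modification plus line search is what the "stabilized" in the notes' title refers to.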
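
The gradient boost notes above fit a sum of weak learners to successive negative gradients of the loss; for squared loss the negative gradient is simply the residual. A minimal illustrative sketch with decision stumps on 1-D data follows (written for this page, not taken from the notes; all names are invented):

```python
# Gradient boosting for least squares with decision stumps, 1-D inputs.

def fit_stump(xs, rs):
    # Best single-threshold split minimizing squared error against residuals rs.
    best = None
    for thr in sorted(set(xs)):
        left = [r for x, r in zip(xs, rs) if x <= thr]
        right = [r for x, r in zip(xs, rs) if x > thr]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm)**2 for r in left) + sum((r - rm)**2 for r in right)
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x <= thr else rm

def gradient_boost(xs, ys, rounds=50, lr=0.3):
    f0 = sum(ys) / len(ys)            # start from the constant predictor
    preds = [f0] * len(xs)
    stumps = []
    for _ in range(rounds):
        # For squared loss, the negative gradient is the residual.
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: f0 + lr * sum(s(x) for s in stumps)

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a step function
model = gradient_boost(xs, ys)
print([round(model(x), 2) for x in xs])  # → [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
```

The shrinkage factor `lr` trades rounds for regularization; packages like xgboost, mentioned in the detailed notes, add regularized second-order fits on top of this basic recipe.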