Office Hours Time: WF 14h00-15h00, Location: 3310 Siebel
Alternative locations may be available
We are no longer meeting in person. I will release movies, readings and homeworks to keep the course running. You should have received email from me about this.
Key points:
Class Date | Readings | Movies |
---|---|---|
13 Mar | ch. 11, up to 11.4 | Bias+variance (end of 11 Mar lecture) |
13 Mar | | Simple model selection |
13 Mar | | Robust regression using IRLS |
13 Mar | | Regression using Generalized Linear Models |
25 Mar | Finish ch. 11 | Regression using the Lasso |
25 Mar | | Regression using Elastic Net |
27 Mar | 12.1 | Greedy stagewise regression |
27 Mar/1 April | 12.2 | Gradient boost |
1 April | 12.2 | Gradient boosting decision stumps |
1 April | 12.2 | Gradient boosting regression trees |
3 April | 13.1 | Markov chains - basic ideas |
3 April | 13.2 | Simulating a Markov chain |
3 April | 13.2 | Text models with Markov chains |
8 April | 13.2 | Hidden Markov Models - basic ideas |
8 April | 13.2 | Hidden Markov Models - dynamic programming |
8 April | 13.2 | Hidden Markov Models - an example |
10 April | 13.3 | Learning an HMM from data using EM (sorry, no short movie) |
Class Date | Readings | Movies |
---|---|---|
13 Mar | ch. 11 | Bias+variance; simple model selection; IRLS |
13 Mar/25 Mar | | Generalized linear models; Lasso |
25 Mar | | More generalized linear models; Lasso; Elastic net; some other stuff which you can ignore |
27 Mar, 1 April | 12.1, 12.2 | Boosting and Gradient boost |
3 April | 13.1 | Introductory Markov chains (the chapter reference is wrong - I changed the chapter numbers - it's an old movie) |
3 April/7 April | 13.2 and 13.3 | Simulating Markov chains; text models; Hidden Markov Models; dynamic programming |
10 April | 13.3 | Learning an HMM from data using EM |
I will be absent 31 Jan (sorry!). Also absent 21 Feb, aargh! Also absent 13 Mar, mild signs of illness so self-isolating, aargh! Check for movies!
Getting into the class In the past, we've been able to admit everyone who wanted to get into the in-person version of the class after the first rush settled down. Will this be true this semester? Who knows? Not me. PLEASE do not come and tell me that you really want to get in, or that your cat died and its last words were that you should take the class, or anything like that. We are not going to go over an enrollment of 100. Corollary: if you plan to drop, do so early; someone else wants your seat.
Can I get in even though I won't be able to come to lecture because I'm doing something else, but I'll watch the movies? I think this strategy is unwise, but I suppose it's not really my problem.
Can I audit? The main resource limit on the physical class is physical seats in the room. We cannot have an overcrowded room. If physical seats are open, sure (I'm always happy to have an audience); but please don't take a seat that should be occupied by someone who is registered.
https://piazza.com/class/k62hiyy1jr81jm?cid=8
Evaluation is by homeworks and a take-home final.
I will shortly post a policy on collaboration and plagiarism.
I will start at the beginning of the textbook and proceed to the end, covering approximately one chapter per week. You'll notice there are 19 substantive chapters and 15 weeks; the pace will vary a little, but in week N I expect to be close to chapter 19*N/15 (so by week 5, about chapter 6). Read the textbook. I wrote it specifically for this course, AND it's free. I will split lecture time between sketching important points described in the text and solving problems. If you haven't read the text, this might be quite puzzling.
Applied Machine Learning, D.A. Forsyth, Springer, 2019
Important In the past, people have brought the PDF with them on mobile devices. I think this is a good idea. Or you could buy a paper copy. The PDF is a free download from the UIUC library (you have to be on the intranet to download it, I think).
TBA
Class Date | Topic |
---|---|
22-Jan | Classification; nearest neighbors |
24-Jan | Naive Bayes; SVMs |
29-Jan | Learning theory |
31-Jan | High dimensional data; covariance matrices |
5-Feb | Diagonalizing covariance matrices; multivariate normals |
7-Feb | Principal components |
12-Feb | More PCA; NIPALS |
14-Feb | Low rank models |
19-Feb | Canonical correlation analysis |
21-Feb | We did not meet; movies in lieu on Announcements page |
26-Feb | Vector quantization |
28-Feb | End VQ; start EM |
4-Mar | EM; start regression |
6-Mar | Basic least squares regression; R-squared; outliers |
11-Mar | Multivariate regression; using leverage, Cook's distance, standardized residuals; bias and variance |
13-Mar | We did not meet; movies in lieu on Announcements page |
Probability and Statistics for Computer Science, D.A. Forsyth