Professor: Adam Oberman https://www.adamoberman.net/
Teaching Assistant: Viet Nguyen firstname.lastname@example.org
Class time: Tuesday and Thursday 2:35pm - 3:55 pm. Lea 14
Office Hours: Tuesday and Thursday 1:05pm-1:30pm and 4:05pm-4:30pm. Wednesday 3:30-4:00pm (by appointment). Occasionally family life interferes with my schedule, and I may miss the office hour. An email in advance is appreciated, but is not necessary.
Lecture notes and assignments: https://adam-oberman.github.io/
Assignment submission: https://mycourses2.mcgill.ca/
A mathematically rigorous approach to Machine Learning (ML), focusing on a precise presentation of ML models and on proofs of in-distribution generalization bounds.
Students will be expected to have seen and coded ML models before. Experience with mathematical proofs and with probability is expected.
Use Shalev-Shwartz for the introduction and definitions, and Mohri for the proofs, which are more precise.
Be sure to discuss an early reference as well: track citations back to the early work, which may be clearer.
01/05/2023 Zoom lecture, covered first day handout and projects. Overview of Ch 2 of Bach: understanding "all you need is scale" vs. No Free Lunch Theorem.
01/10/2023 In class. Ch 2 Bach Lecture 1 2023.01.10 562.pdf
01/12/2023 Thursday. Cheat Sheet for Measure Theory/Probability/ML Notation. PAC Learning, reference: Mohri Ch2, SS 2.3.1 & Ch 3. Lecture 3 2022.01.12.pdf
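The PAC guarantee covered here can be summarized, for a finite hypothesis class in the consistent (realizable) case, roughly in the following form (cf. Mohri Ch 2; here m is the sample size and delta the confidence parameter):

```latex
% PAC learning bound, finite hypothesis class, consistent case
% (roughly the form proved in Mohri et al., Ch. 2).
% With probability at least 1 - \delta over an i.i.d. sample S of size m,
% any hypothesis h_S \in H consistent with S satisfies
R(h_S) \;\le\; \frac{1}{m}\left(\ln|H| + \ln\frac{1}{\delta}\right).
```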
01/19 Thursday. Reference: Excerpt from Jeff Calder notes on Calculus of Variations
01/24 and 26/2023 Rademacher Complexity. Reference: Chapter 3 of Foundations of Machine Learning by Mohri, Rostamizadeh, Talwalkar.
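As a companion to the reading, here is a minimal sketch of the empirical Rademacher complexity from Mohri et al., Ch. 3, estimated by Monte Carlo for a small finite hypothesis class. The class H, the sample size, and all numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

m = 50  # sample size
# Predictions of each hypothesis on a fixed sample S, with values in {-1, +1}.
# Here: 8 random "hypotheses" stand in for a finite class H (an assumption).
H = rng.choice([-1.0, 1.0], size=(8, m))

def empirical_rademacher(H, n_draws=2000, rng=rng):
    """Monte Carlo estimate of E_sigma[ sup_h (1/m) sum_i sigma_i h(x_i) ]."""
    m = H.shape[1]
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, m))  # Rademacher signs
    # correlations[k, j] = (1/m) sum_i sigma[k, i] * H[j, i]
    correlations = sigma @ H.T / m
    return correlations.max(axis=1).mean()  # sup over H, average over sigma

print(empirical_rademacher(H))
```

For a class of |H| hypotheses this estimate scales like sqrt(2 ln|H| / m), matching Massart's lemma from the same chapter.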
02/07/2023 and 02/09/2023:
Convex Learning Problems, Ch 12, S-S
Stability theory and generalization Ch 13, S-S
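To make the convex-learning setting of S-S Ch 12 concrete, here is a minimal sketch, with synthetic data invented for the example: linear regression with the squared loss. The empirical risk is convex in w, so gradient descent reaches the global minimizer, checked here against the least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

def risk(w):
    """Empirical risk: average squared loss (convex in w)."""
    return 0.5 * np.mean((X @ w - y) ** 2)

w = np.zeros(3)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of the empirical risk
    w -= lr * grad

w_star = np.linalg.lstsq(X, y, rcond=None)[0]  # closed-form minimizer
print(np.allclose(w, w_star, atol=1e-3))
```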
02/14 and 16: Stability, SS Thm 13.2, Mohri Thm 14.2
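The stability result cited here (Mohri Thm 14.2) gives a generalization bound roughly of the following form; here beta is the uniform-stability parameter and M a bound on the loss:

```latex
% Uniform-stability generalization bound (roughly the form of
% Mohri et al., Thm 14.2): if the algorithm is \beta-uniformly stable
% and the loss is bounded by M, then with probability at least
% 1 - \delta over the sample S of size m,
R(h_S) \;\le\; \widehat{R}_S(h_S) + \beta
  + \left(2m\beta + M\right)\sqrt{\frac{\ln(1/\delta)}{2m}}.
```

For regularized convex losses, beta scales like O(1/m), which makes the bound nontrivial.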
02/21 and 23:
Reference https://udlbook.github.io/udlbook/, Chapters 14-18
Study notes on Rademacher Complexity MC_562_Rademacher.pdf
Study notes on Stability Generalization bounds MC_562_Stability.pdf