What's in here?
All notes are written in a modern style with explicit definition/theorem references and hyperlinks. Also, the figures are drawn professionally and cleanly.
Junior @University of Michigan
Linear Programming (MATH561/IOE510/TO518)
This is the first course in Jon Lee's series of rigorous graduate-level courses on large-scale mathematical programming. Topics include Duality Theorems, a mathematically rigorous treatment of the Simplex Algorithm, Complementary Slackness, Large-Scale Linear Programming, Sensitivity Analysis, and Integer Programming, along with their applications.
This course is not intended to teach you how to hand-solve small-scale linear programming problems; rather, it's intended to give a rigorous foundation for solving large-scale linear programming problems algorithmically. We rely on Gurobi to solve various problems in the assignments.
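As a quick illustration of the duality at the heart of the course (a generic sketch of mine, not taken from the course materials), the standard primal-dual pair looks like this:

```latex
\[
\begin{aligned}
\text{(P)}\quad \max_{x}\ & c^\top x  & \qquad \text{(D)}\quad \min_{y}\ & b^\top y \\
\text{s.t.}\ & Ax \le b,\ x \ge 0     & \text{s.t.}\ & A^\top y \ge c,\ y \ge 0.
\end{aligned}
\]
```

Weak duality gives $c^\top x \le b^\top y$ for any feasible pair; strong duality says the optimal values coincide whenever either problem has a finite optimum, and complementary slackness characterizes when a feasible pair is jointly optimal.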
Analysis of Social Networks (EECS544/EECS444)
This is a graduate-level course about social network analysis taught by Vijay G Subramanian, aiming at a rigorous mathematical understanding of various social network algorithms and theories. Topics include Graph Partitioning Algorithms, Stochastic Processes, Random Graph Theory, and Algorithmic Game Theory, including Auctions and Matching Market Algorithms.
The course title suggests a rather narrow audience, but one can actually get a lot out of this course, especially some classical graph algorithms with their theoretical analysis.
Algebraic Topology (MATH592)
This is a graduate-level course taught by Jennifer Wilson introducing Algebraic Topology. Topics include CW Complexes, the Fundamental Group, the van Kampen Theorem, and Homology, along with applications such as the Lefschetz Fixed-Point Theorem.
Some topology and abstract algebra background is required, especially group theory; other than that, the course is fairly self-contained.
Real Analysis (MATH597)
This is the graduate-level real analysis course taught by Jinho Baik. Topics include Measure Theory, Hilbert Spaces, Banach Spaces, L^p Spaces, and some Fourier Analysis. While focusing on real measures, we did discuss signed and complex measures for completeness.
This course is rigorous and well-structured and acts as a prerequisite for Functional Analysis (MATH 602). It's fairly self-contained, requiring only some previous exposure to mathematical analysis.
Senior @University of Michigan
Randomness and Computation (EECS572, TA)
This is an advanced graduate-level theory course taught by Mahdi Cheraghchi, focused on randomized complexity and related topics. Topics include various Randomized Algorithms, Randomized Complexity, Markov Chains, Random Walks, Expander Graphs, Pseudo-Random Generators, and Hardness vs. Randomness.
Overall, a rigorous course covering all the background knowledge one might need for research in related fields. I'm grateful to have been a teaching assistant for this course, together with Neophytos Charalambides, as an undergrad.
Approximation Algorithms and Hardness of Approximation (EECS598-001)
This is a graduate-level algorithms course taught by Euiwoong Lee, focusing on methods for designing and analyzing approximation algorithms, together with the theoretical background for showing hardness of approximation. Topics include Covering, Clustering, Network Design, and CSPs. We also discussed the Lasserre (SoS) Hierarchy, the Unique Games Conjecture, and Probabilistically Checkable Proofs.
This is one of the most exciting courses I have taken: algorithm design, hardness of approximation, and fancy topics such as the SoS hierarchy, PCPs, and the UGC are all fun to learn, especially the complexity theory of approximation.
Functional Analysis (MATH602)
This is the graduate-level functional analysis course taught by Joseph Conlon. The focus is rather standard: Banach and Hilbert Space Theory; Bounded Linear, Compact, and Self-Adjoint Operators; Representation Theorems; the Hahn-Banach and Open Mapping Theorems; and Spectral Theory. We also covered some point-set topology along the way.
A rigorous course that gives you the tools needed for analyzing function spaces. It provides a solid understanding of infinite-dimensional vector spaces and of how to deal with operators on these spaces.
Introduction to Cryptography (EECS475, TA)
This is an upper-level theory course on formal cryptography and related topics taught by Mahdi Cheraghchi. Topics include various Historical Ciphers, Perfect Secrecy, Symmetric Encryption (including Pseudo-Random Generators, Stream Ciphers, and Pseudo-Random Functions and Permutations), Message Authentication, Cryptographic Hash Functions, and Public-Key Encryption.
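As a tiny illustration of the perfect secrecy topic (my own sketch, not course code): the one-time pad XORs the message with a uniformly random key of the same length, which makes the ciphertext statistically independent of the message.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with the corresponding key byte."""
    assert len(key) == len(message), "the key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

# Decryption is the same XOR, since (m ^ k) ^ k == m.
otp_decrypt = otp_encrypt

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))  # uniform key, used only once
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg
```

Perfect secrecy holds only if the key is truly uniform, as long as the message, and never reused; reusing a pad leaks the XOR of the two plaintexts.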
An interesting foray into theoretical cryptography. I'm grateful to have been a teaching assistant for this course, together with Nikhil Shagrithaya, as an undergrad.
Mathematical Logic (MATH681)
This is the graduate-level mathematical logic course taught by Matthew Harrison-Trainor, aiming to gain insights into all other branches of mathematics, such as algebraic geometry, analysis, etc. Specifically, we cover Model Theory beyond the basic foundational ideas of logic.
"Learn some fundamental stuff and show off to your friends" is basically my mindset for taking this course 🤪 But seriously, learning something fundamental at this level is a new experience and challenge for me; but hey, it's my last semester, so I might as well relax and see how it goes!
Riemannian Geometry (MATH635)
This is an advanced graduate-level differential geometry course focused on Riemannian geometry, taught by Lydia Bieri. Topics include local and global aspects of differential geometry and their relation to the underlying topology.
I have always wanted a solid understanding of differential geometry, since recent advances in machine learning theory rely quite heavily on related concepts in particular branches such as optimization and the well-known manifold hypothesis, or, more practically, manifold learning.
First Year Ph.D. @University of Illinois Urbana-Champaign
Empirical Process Theory (STAT576)
This is an advanced graduate-level statistics course focused on empirical process theory, taught by Sabyasachi Chatterjee. Topics include classical Concentration Inequalities, Expected Suprema of Empirical Processes, Applications to M-Estimation, and Fixed-Design Non-Parametric Regression.
This is one of the hardest courses I have taken, due to how messy some of the calculations are and to common tricks from theoretical statistics that I wasn't familiar with. However, the material is quite relevant to modern generalization theory for deep learning (indeed a foundation of it), so I actually quite enjoyed the course.
- I also took Nonlinear Programming (MATH663/IOE611), but the professor provided excellent lecture slides, so I won't bother scribing it myself.