
CMU Online

Publicly Accessible CMU Courses

Education and knowledge should be accessible to anyone who is willing to learn. Many great course offerings from the top-ranked Computer Science program at Carnegie Mellon University have lectures that are publicly available. The purpose of this page is to curate these courses so that people can self-study and learn from some of the best professors in the world without the hefty college price tag. I hope this resource will be helpful to people in industry aiming to improve their knowledge, to current students making course decisions or self-learning, and even to prospective students wanting a taste of what a CMU education looks like. In addition to courses, I have also included links to weekly seminar series held by various research groups, for people interested in getting closer to state-of-the-art developments.

Courses which are accessible to undergraduates will have a tag. This does not mean that their difficulty is unsuitable for graduate students; on the contrary, almost all such classes are cross-listed as graduate courses and are commonly taken by graduate students at CMU. There are also some graduate classes (based on the course number) which are suitable for advanced undergraduates; I have marked these as undergraduate as well. Classes that require significant background and are less accessible will have a tag.

Some courses are only publicly available on Panopto instead of Youtube, in which case I have provided a link to the Panopto playlist. This is not necessarily a downside: Panopto can be more convenient than Youtube, since it lets you customize the speaker and presentation views.

Within each category, the courses are not ordered in any way that indicates difficulty or the order in which they should be taken; consider the ordering arbitrary.

Please let me know if there are any dead links, or if you spot an error on the page. In addition, if there are courses I missed that might be useful to others, you can either modify the relevant data file here and make a pull request, or let me know via email. I would very much appreciate it!

The list of publicly available courses is heavily skewed towards the departments and faculty members who make their lecture recordings public, and therefore omits many classes that Computer Science majors at CMU tend to take. In particular, courses from the following departments in the School of Computer Science are not present: Computational Biology, Human-Computer Interaction, and the Institute for Software Research.

The content on this page is personally curated; it does not reflect the views of CMU or SCS, and is not endorsed by either party.

Machine Learning

The machine learning courses in this section generally assume working knowledge of probability, statistics, calculus, and linear algebra. If you have not taken any prior machine learning classes, 10-701 Introduction to Machine Learning or 10-715 Advanced Introduction to Machine Learning is the recommended prerequisite for most of these courses. Both introductory courses are available in the table below.

Course Name
10-414/714 Deep Learning Systems, Algorithms and Implementation (Fall 2022)
Instructors: Zico Kolter , Tianqi Chen

Deep learning methods have revolutionized a number of fields in Artificial Intelligence and Machine Learning in recent years. Their widespread adoption has in no small part been driven by the availability of easy-to-use deep learning systems, such as PyTorch and TensorFlow. Yet despite their widespread availability and use, it is much less common for students to get involved with the internals of these libraries and to understand how they function at a fundamental level. Understanding these libraries deeply will help you make better use of their functionality, and enable you to develop or extend them when needed to fit your own custom use cases in deep learning.

The goal of this course is to provide students an understanding and overview of the “full stack” of deep learning systems, ranging from the high-level modeling design of modern deep learning systems, to the basic implementation of automatic differentiation tools, to the underlying device-level implementation of efficient algorithms. Throughout the course, students will design and build from scratch a complete deep learning library, capable of efficient GPU-based operations, automatic differentiation of all implemented functions, and the necessary modules to support parameterized layers, loss functions, data loaders, and optimizers. Using these tools, students will then build several state-of-the-art modeling methods, including convolutional networks for image classification and segmentation, recurrent networks and self-attention models for sequential tasks such as language modeling, and generative models for image generation.
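To give a flavor of the automatic differentiation component described above, here is a minimal reverse-mode autodiff sketch in the spirit of what the course has students build. The `Value` class, its operator set, and the overall design are illustrative assumptions on my part, not the course's actual codebase.

```python
class Value:
    """A scalar node in a computation graph, tracking its gradient."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # filled in by the op that created this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1, so the upstream gradient passes through.
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # Product rule: each input's gradient is scaled by the other input.
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# z = x*y + x, so dz/dx = y + 1 = 4 and dz/dy = x = 2.
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

A full library along these lines would generalize the scalar `data` to tensors and add the layer, loss, and optimizer modules the description mentions, but the chain-rule bookkeeping stays essentially this simple.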
11-711 Advanced NLP (Fall 2022)
Instructors: Graham Neubig , Robert E. Frederking

Natural language processing technology attempts to model human language with computers, tackling a wide variety of problems from automatic translation to question answering. CS11-711 Advanced Natural Language Processing (at Carnegie Mellon University's Language Technologies Institute) is an introductory graduate-level course on natural language processing aimed at students who are interested in doing cutting-edge research in the field. In it, we describe fundamental tasks in natural language processing such as syntactic, semantic, and discourse analysis, as well as methods to solve these tasks. The course focuses on modern methods using neural networks, and covers the basic modeling and learning algorithms required for them. The class culminates in a project in which students attempt to reimplement and improve upon a research paper in a topic of their choosing.
10-708 Probabilistic Graphical Models (Spring 2020)
Instructor: Eric P. Xing

Many of the problems in artificial intelligence, statistics, computer systems, computer vision, natural language processing, and computational biology, among many other fields, can be viewed as the search for a coherent global conclusion from local information. The probabilistic graphical models framework provides a unified view for this wide range of problems, enabling efficient inference, decision-making, and learning in problems with a very large number of attributes and huge datasets. This graduate-level course will provide you with a strong foundation for both applying graphical models to complex problems and addressing core research topics in graphical models. The class will cover classical families of undirected and directed graphical models (i.e., Markov Random Fields and Bayesian Networks), modern deep generative models, as well as topics in graph neural networks and causal inference. It will also cover the necessary algorithmic toolkit, including variational inference and Markov Chain Monte Carlo methods.
10-725 Convex Optimization (Fall 2022)
Instructor: Yuanzhi Li

Public Panopto Lecture Videos Link
Convex optimization has been an extremely useful tool in machine learning, capable of solving many real-world problems efficiently, such as linear regression, logistic regression, linear programming, and semidefinite programming. However, in modern machine learning, as non-convex neural networks have grown to become the dominant models in the field, the principles developed in traditional convex optimization are becoming insufficient.

In these lectures, you will see a convex optimization course that hasn't been taught (or anything close to it) anywhere else: this newly designed course spans the classical topics in convex optimization, such as Gradient Descent, Stochastic Gradient Descent, Mirror Descent, Accelerated Gradient Descent, and the Interior Point Method, and makes connections all the way to optimization topics in deep learning, such as the Neural Tangent Kernel, learning rate schedules, momentum, Batch Normalization, and Linear Reinforcement Learning.

By the end of the course, you will have a sense of the spirit of convex optimization, and of how it can still be applied to other real-world problems that are not convex at all.
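To make the most classical of these topics concrete, here is a hedged sketch of plain gradient descent on a convex least-squares objective f(w) = ||Xw - y||² / (2n). The step size, iteration count, and synthetic data are arbitrary choices for illustration, not anything prescribed by the course.

```python
import numpy as np

# Synthetic noiseless regression problem: y = X @ w_true.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

w = np.zeros(3)
step = 0.1  # fixed step size; convex analysis tells us when this converges
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of the least-squares loss
    w -= step * grad

print(np.round(w, 3))  # converges toward w_true = [1.0, -2.0, 0.5]
```

For a smooth convex objective like this one, a small enough fixed step size guarantees convergence to the global minimum; the interesting part of the course is what survives of such guarantees once the objective is a non-convex neural network loss.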
10-701 Introduction to Machine Learning (Fall 2020)
Instructors: Ziv Bar-Joseph , Eric P. Xing

Machine Learning is concerned with computer programs that automatically improve their performance through experience (e.g., programs that learn to recognize human faces, recommend music and movies, and drive autonomous robots). This course covers the theory and practical algorithms for machine learning from a variety of perspectives. We cover topics such as Linear Regression, SVMs, Neural Networks, Graphical Models, Clustering, etc. Programming assignments include hands-on experiments with various learning algorithms. This course is designed to give a PhD-level student a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in machine learning.
10-715 Advanced Introduction to Machine Learning (Fall 2015)
Instructors: Barnabas Poczos , Alex Smola

The rapid improvement of sensory techniques and processor speed, and the availability of inexpensive massive digital storage, have led to a growing demand for systems that can automatically comprehend and mine massive and complex data from diverse sources. Machine Learning is becoming the primary mechanism by which information is extracted from Big Data, and a primary pillar that Artificial Intelligence is built upon.

This course is designed for Ph.D. students whose primary field of study is machine learning, or who intend to make machine learning methodological research a main focus of their thesis. It will give students a thorough grounding in the algorithms, mathematics, theories, and insights needed to do in-depth research and applications in machine learning. The topics of this course will in part parallel those covered in the general graduate machine learning course (10-701), but with a greater emphasis on depth in theory and algorithms. The course will also include additional advanced topics such as RKHS and representer theory, Bayesian nonparametrics, additional material on graphical models, manifolds and spectral graph theory, reinforcement learning and online learning, etc.
11-485/785 Introduction to Deep Learning (Spring 2023)
Instructors: Bhiksha Raj , Rita Singh

"Deep Learning" systems, typified by deep neural networks, are increasingly taking over AI tasks, ranging from language understanding, speech and image recognition, to machine translation, planning, and even game playing and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric desirable to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market. In this course we will learn about the basics of deep neural networks and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply deep learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.
10-703 Deep Reinforcement Learning & Control (Fall 2022)
Instructor: Katerina Fragkiadaki

Public Panopto Lecture Videos Link
This course will cover the latest advances in reinforcement learning and imitation learning. This is a fast-developing research field, and an official textbook is available for only about one fourth of the course material; the rest will be taught from recent research papers. The course brings together many disciplines of Artificial Intelligence to show how to develop intelligent agents that can learn to sense the world and learn to act, by imitating others or by maximizing sparse rewards. Particular focus will be given to incorporating visual sensory input and learning suitable visual state representations.
11-777 Multi-Modal Machine Learning (Fall 2020)
Instructor: Louis-Philippe Morency

Multimodal machine learning (MMML) is a vibrant multi-disciplinary research field which addresses some of the original goals of artificial intelligence by integrating and modeling multiple communicative modalities, including linguistic, acoustic, and visual messages. With the initial research on audio-visual speech recognition and more recently with language & vision projects such as image and video captioning, this research field brings some unique challenges for multimodal researchers, given the heterogeneity of the data and the contingency often found between modalities. This course will teach fundamental mathematical concepts related to MMML, including multimodal alignment and fusion, heterogeneous representation learning, and multistream temporal modeling. We will also review recent papers describing state-of-the-art probabilistic models and computational algorithms for MMML and discuss the current and upcoming challenges. The course will present the fundamental mathematical concepts in machine learning and deep learning relevant to the five main challenges in multimodal machine learning: (1) multimodal representation learning, (2) translation & mapping, (3) modality alignment, (4) multimodal fusion, and (5) co-learning. These include, but are not limited to, multimodal auto-encoders, deep canonical correlation analysis, multi-kernel learning, attention models, and multimodal recurrent neural networks. The course will also discuss many of the recent applications of MMML, including multimodal affect recognition, image and video captioning, and cross-modal multimedia retrieval.
10-605/805 Machine Learning with Large Datasets (Fall 2016)
Instructor: William Cohen

Large datasets are difficult to work with for several reasons. They are difficult to visualize, and it is difficult to understand what sort of errors and biases are present in them. They are computationally expensive to process, and often the cost of learning is hard to predict - for instance, an algorithm that runs quickly on a dataset that fits in memory may be exorbitantly expensive when the dataset is too large for memory. Large datasets may also display qualitatively different behavior in terms of which learning methods produce the most accurate predictions. This course is intended to provide students with practical knowledge of, and experience with, the issues involved in working with large datasets. Among the issues considered are: scalable learning techniques, such as streaming machine learning techniques; parallel infrastructures such as map-reduce; practical techniques for reducing the memory requirements of learning methods, such as feature hashing and Bloom filters; and techniques for analysis of programs in terms of memory, disk usage, and (for parallel methods) communication complexity.
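As a taste of the memory-reduction techniques the description mentions, here is a hedged sketch of feature hashing (the "hashing trick"): arbitrarily many string features are mapped into a fixed-size vector, so the learner's memory footprint is independent of the vocabulary size. The function name, dimension, and use of MD5 are illustrative choices, not the course's code.

```python
import hashlib

def hashed_features(tokens, dim=16):
    """Project a bag of tokens into a fixed dim-sized vector."""
    vec = [0.0] * dim
    for tok in tokens:
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        index = h % dim  # which bucket the feature lands in
        # Signed hashing: a pseudo-random +/-1 sign makes collisions cancel
        # in expectation rather than accumulate.
        sign = 1.0 if (h // dim) % 2 == 0 else -1.0
        vec[index] += sign
    return vec

print(hashed_features("the cat sat on the mat".split()))
```

The price is that distinct features can collide in a bucket, but with a signed hash the collisions are unbiased, and in practice accuracy degrades gracefully as `dim` shrinks.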
15-388 Practical Data Science (Fall 2019)
Instructor: Zico Kolter

Public Panopto Lecture Videos Link
Data science is the study and practice of how we can extract insight and knowledge from large amounts of data. It is a burgeoning field, currently attracting substantial demand from both academia and industry.

This course provides a practical introduction to the “full stack” of data science analysis, including data collection and processing, data visualization and presentation, statistical model building using machine learning, and big data techniques for scaling these methods. Topics covered include: collecting and processing data using relational methods, time series approaches, graph and network models, free text analysis, and spatial geographic methods; analyzing the data using a variety of statistical and machine learning methods, including linear and non-linear regression and classification, unsupervised learning and anomaly detection, plus advanced machine learning methods like kernel approaches, boosting, or deep learning; visualizing and presenting data, particularly focusing on the case of high-dimensional data; and applying these methods to big data settings, where multiple machines and distributed computation are needed to fully leverage the data.
15-780 Graduate Artificial Intelligence (Spring 2014)
Instructors: Zico Kolter , Zachary B. Rubinstein

CMU AI Seminar (Ongoing)

Link to Youtube Channel
This seminar aims to cover a wide variety of AI topics, such as computer vision, natural language processing (NLP), theoretical ML, AI fairness & ethics, cognitive science, AI hardware, etc.

Computer Systems

15-213 Introduction to Computer Systems is a required course for all CS majors at CMU, and is the prerequisite for all subsequent systems classes. It is a good place to start if you are new to computer systems. At most other universities, this class would be referred to as an operating systems class.

Course Name
15-213/513 Introduction to Computer Systems (Fall 2015)
Instructors: Randal E. Bryant , David R. O'Hallaron

The Introduction to Computer Systems course provides a programmer's view of how computer systems execute programs, store information, and communicate. It enables students to become more effective programmers, especially in dealing with issues of performance, portability and robustness. It also serves as a foundation for courses on compilers, networks, operating systems, and computer architecture, where a deeper understanding of systems-level issues is required. Topics covered include: machine-level code and its generation by optimizing compilers, performance evaluation and optimization, computer arithmetic, memory organization and management, networking technology and protocols, and supporting concurrent computation.
15-445/645 Database Systems (Fall 2022)
Instructor: Andy Pavlo

This course is on the design and implementation of database management systems. Topics include data models (relational, document, key/value), storage models (n-ary, decomposition), query languages (SQL, stored procedures), storage architectures (heaps, log-structured), indexing (order preserving trees, hash tables), transaction processing (ACID, concurrency control), recovery (logging, checkpoints), query processing (joins, sorting, aggregation, optimization), and parallel architectures (multi-core, distributed). Case studies on open-source and commercial database systems are used to illustrate these techniques and trade-offs. The course is appropriate for students that are prepared to flex their strong systems programming skills.
15-721 Advanced Database Systems (Spring 2023)
Instructor: Andy Pavlo

This course is a comprehensive study of the internals of modern database management systems. It will cover the core concepts and fundamentals of the components that are used in both high-performance transaction processing systems (OLTP) and large-scale analytical systems (OLAP). The class will stress both efficiency and correctness of the implementation of these ideas. All class projects will be in the context of a real in-memory, multi-core database system. The course is appropriate for graduate students in software systems and for advanced undergraduates with dirty systems programming skills.
15-418/618 Parallel Computer Architecture and Programming (Spring 2016)
Instructors: Kayvon Fatahalian , Randal E. Bryant

Public Panopto Lecture Videos Link
From smart phones, to multi-core CPUs and GPUs, to the world's largest supercomputers, parallel processing is ubiquitous in modern computing. The goal of this course is to provide a deep understanding of the fundamental principles and engineering trade-offs involved in designing modern parallel computing systems as well as to teach parallel programming techniques necessary to effectively utilize these machines. Because writing good parallel programs requires an understanding of key machine performance characteristics, this course will cover hardware design and how that affects software design.
¡Databases! – A Database Seminar Series (Fall 2020)

A weekly seminar series on databases hosted by CMU, streamed on Zoom and open to the public. Each speaker will present the implementation details of their respective systems and examples of the technical challenges they faced when working with real-world customers.

It is possible that similar seminars will be held in the future; check their Youtube channel for the latest updates.

Computer Science Theory

15-251 Great Ideas in Theoretical Computer Science is the introductory CS theory class taken by CS majors in their first year, and would be a good place to start if you are new to CS theory. All subsequent theory classes assume knowledge from 15-251.

Course Name
15-251 Great Ideas in Theoretical Computer Science (Spring 2016)
Instructor: Ryan O'Donnell

This course is about how to use theoretical ideas to formulate and solve problems in computer science. It integrates mathematical material with general problem solving techniques and computer science applications. Examples are drawn from algorithms, complexity theory, game theory, probability theory, graph theory, automata theory, algebra, cryptography, and combinatorics. Assignments involve both mathematical proofs and programming.
15-455 Undergraduate Complexity Theory (Spring 2017)
Instructor: Ryan O'Donnell

The course will overlap significantly with "Part 3: Complexity" of Sipser's textbook. Topics will include models of computation (Turing machines, circuits, ...); time and space complexity; hierarchy theorems by diagonalization; P vs. NP; theorems of Ladner, Mahaney, Savitch, and Immerman-Szelepcsényi; randomized computation; and the polynomial-time hierarchy. We will also cover some non-traditional topics of current research interest, such as the Exponential Time Hypothesis, Feige's Hypothesis, and fine-grained complexity.
15-459 Quantum Computation (Fall 2018)
Instructor: Ryan O'Donnell

This course will be an introduction to quantum computation and quantum information theory, from the perspective of theoretical computer science. Topics include: qubit operations, multi-qubit systems, partial measurements, entanglement, quantum teleportation and quantum money, the quantum circuit model, the Deutsch-Jozsa and Simon's algorithms, number theory and Shor's algorithm, Grover's algorithm, quantum complexity theory, limitations, and current practical developments.
15-850 Advanced Algorithms (Spring 2023)
Instructor: Anupam Gupta

Public Panopto Lecture Videos Link
An intensive graduate course on the design and analysis of algorithms. This course is primarily aimed at graduate and advanced undergraduate students interested in doing research in theoretical computer science. It is more specialized and in-depth than the graduate-level Algorithms in the Real World course (15-750) and the undergraduate/masters-level Algorithms course (15-451/651). We will cover advanced algorithmic ideas (some classical, some very recent) and the theory behind them (theorems, proofs). For topics that were covered in 451/750, the goal is to cover advanced content, proofs, and new techniques.
15-855 Graduate Complexity Theory (Fall 2017)
Instructor: Ryan O'Donnell
Recommended Pre-requisites: 15-455 Undergraduate Complexity Theory

The overarching goal is for students to learn the foundational aspects of structural complexity theory developed in the period 1970–2010. Students should learn the foundational relationships between time, space, and nondeterminism in computational complexity. Students will learn about circuit classes, how they relate to uniform models of computation, tools for proving circuit lower bounds, and also reasons why proving lower bounds is difficult. Students will be exposed to the pervasive role of randomness in complexity theory, even in the proof of purely deterministic results. By the end of the course, students will be able to study and contribute to cutting-edge research in computational complexity theory.
15-751 Theory Toolkit (Spring 2020)
Instructor: Ryan O'Donnell

The goal of this class is to provide students with the tools needed to be able to read and participate in contemporary research in theoretical computer science. The main focus will be on learning fundamental CS Theory concepts and mathematical tools. Students should learn to apply basic techniques in asymptotic and probabilistic analysis. They should gain familiarity and comfort with a wide variety of tools in theoretical computer science, including analysis of Boolean functions, error-correcting codes, pseudorandomness, graph expansion, information theory, and communication complexity. They will learn applications of linear programming and semidefinite programming in constraint satisfaction problems, with a view from proof complexity. They will also be exposed to different computational intractability assumptions, and learn how they are applied. A secondary focus of the course will be developing good practices in mathematical writing and good research habits. A goal for the students will be to improve their facility with LaTeX and their ability to write clear, correct proofs.
15-859GG Proofs vs Algorithms: The Sum-of-Squares Method (Fall 2020)
Instructor: Pravesh Kothari

This seminar examines the sum-of-squares method, with an emphasis on recent developments and open directions for research. We will cover the use of this method in designing algorithms for both worst-case optimization problems, such as Max-Cut and Coloring, and average-case problems arising in machine learning and algorithmic statistics, such as learning mixtures of Gaussians and robust statistics. We will also cover techniques for proving lower bounds on this method, and its connections to restricted algorithmic frameworks such as the Statistical Query model, the low-degree likelihood ratio method, and extended formulations.
15-859S Analysis of Boolean Functions (Fall 2012)
Instructor: Ryan O'Donnell

Boolean functions are perhaps the most basic object of study in theoretical computer science. They also arise in several other areas of mathematics, including combinatorics (graph theory, extremal combinatorics, additive combinatorics), metric and Banach spaces, statistical physics, and mathematical social choice. In this course we will study Boolean functions via their Fourier transform and other analytic methods. Highlights will include applications in property testing, social choice, learning theory, circuit complexity, pseudorandomness, constraint satisfaction problems, additive combinatorics, hypercontractivity, Gaussian geometry, random graph theory, and probabilistic invariance principles.
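For concreteness, the Fourier expansion at the heart of the course is the standard fact that every function $f\colon \{-1,1\}^n \to \mathbb{R}$ can be written uniquely as a multilinear polynomial in its inputs:

$$ f(x) = \sum_{S \subseteq [n]} \hat{f}(S) \prod_{i \in S} x_i, \qquad \hat{f}(S) = \mathop{\mathbf{E}}_{x}\Big[f(x) \prod_{i \in S} x_i\Big], $$

where the expectation is over a uniformly random $x \in \{-1,1\}^n$. Many of the listed applications amount to relating combinatorial properties of $f$ to the distribution of its Fourier coefficients $\hat{f}(S)$.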
Theory Lunch (Ongoing)

Link to Youtube Channel
Theory Lunch is an informal seminar run by the Algorithms and Complexity Theory Group at CMU. The meetings have various forms: talks on recently completed results, joint reading of an interesting paper, presentations of current work in progress, exciting open problems, etc.

Programming Language Theory

Unfortunately, there are not many publicly available programming language (PL) theory lectures. However, Robert Harper and Jan Hoffmann, staples of the PL scene at CMU, both frequently give lectures for the Oregon Programming Languages Summer School. These are much smaller in scope than a comparable full-semester course at CMU, but still go through many key ideas nonetheless. I have therefore decided to include these resources rather than have nothing, which would be a shame, as CMU is a powerhouse in PL after all.

Why are so many courses numbered 15-819? This is because the course number is a catch-all for advanced topics in PL theory, so while the course number stays the same, the title and content vary.

15-312 Foundations of Programming Languages is the introductory class of the PL track, and would be a good place to start. The other introductory PL class, 15-317 Constructive Logic, unfortunately does not have any publicly available lectures.

Course Name
15-312 Foundations of Programming Languages (OPLSS Summer 2018)
Instructor: Jan Hoffmann

Note that the videos in the playlist are from a similar set of lectures given at OPLSS rather than at CMU, as the CMU lectures were not recorded.
This course discusses in depth many of the concepts underlying the design, definition, implementation, and use of modern programming languages. Formal approaches to defining the syntax and semantics are used to describe the fundamental concepts underlying programming languages. A variety of programming paradigms are covered such as imperative, functional, logic, and concurrent programming. In addition to the formal studies, experience with programming in the languages is used to illustrate how different design goals can lead to radically different languages and models of computation.
15-819 Homotopy Type Theory (Fall 2012)
Instructor: Robert Harper

This is a graduate research seminar on Homotopy Type Theory (HoTT), a recent enrichment of Intuitionistic Type Theory (ITT) to include "higher-dimensional" types. The dimensionality of a type refers to the structure of its paths, the constructive witnesses to the equality of pairs of elements of a type, which themselves form a type, the identity type. In general a type is infinite dimensional in the sense that it exhibits non-trivial structure at all dimensions: it has elements, paths between elements, paths between paths, and so on to all finite levels. Moreover, the paths at each level exhibit the algebraic structure of a (higher) groupoid, meaning that there is always the "null path" witnessing reflexivity, the "inverse" path witnessing symmetry, and the "concatenation" of paths witnessing transitivity such that group-like laws hold "up to higher homotopy". Specifically, there are higher-dimensional paths witnessing the associative, unital, and inverse laws for these operations. Altogether this means that a type is a weak ∞-groupoid.
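As a small illustration of the lowest levels of this structure (my own sketch, not course material): the reflexivity, symmetry, and transitivity operations on paths described above already exist as terms of the identity type in ordinary intuitionistic type theory, as the following Lean 4 snippet shows (using only core Lean).

```lean
-- Paths (identity proofs) between elements of a type,
-- with the groupoid-like operations described above.

-- reflexivity: the "null path" at a
example {α : Type} (a : α) : a = a := rfl

-- symmetry: the "inverse" path
example {α : Type} (a b : α) (p : a = b) : b = a := p.symm

-- transitivity: "concatenation" of paths
example {α : Type} (a b c : α) (p : a = b) (q : b = c) : a = c := p.trans q
```

In HoTT the group-like laws for these operations hold only up to paths one level higher, which is exactly the weak ∞-groupoid structure the seminar studies.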
15-816 Advanced Topics in Logic: Substructural Logics (Spring 2012)
Instructor: Frank Pfenning

Public Panopto Lecture Videos Link
This graduate course provides an introduction to substructural logics, such as linear, ordered, affine, bunched, or separation logic, with an emphasis on their applications in computer science. This includes the design and theory of programming constructs for concurrent message-passing computation and techniques for specifying and reasoning about programming languages in the form of substructural operational semantics.
15-819 Resource Aware Programming Languages (OPLSS Summer 2019)
Instructor: Jan Hoffmann

Note that the videos in the playlist are from a similar set of lectures given at OPLSS rather than at CMU, as the CMU lectures were not recorded.
Resource usage—the amount of time, memory, and energy a program requires for its execution—is one of the central subjects of computer science. Nevertheless, resource usage often does not play a central role in classical programming-language concepts such as operational semantics, type systems, and program logics. The first part of the course revisits these concepts to model and analyze resource usage of programs in a compositional and mathematically-precise way. The emphasis is on practical, type-based techniques that automatically inform programmers about the resource usage of their code.

The second part of the course studies probabilistic programming languages. Such languages describe probability distributions and can be used to precisely describe and analyze probabilistic models. Probabilistic languages are for instance used in statistical modeling and to facilitate probabilistic inference. The focus of the course is on the semantics and analysis of probabilistic languages. Interestingly, the techniques from the first part can be applied to automatically analyze the expected cost and results of probabilistic programs.

The third part of the course presents an application of automatic resource bound analysis in Nomos, a programming language for implementing digital contracts. Nomos is based on session types, which are a focus in this part. Moreover, blockchains and smart contracts are briefly covered to motivate the need for resource bounds.
15-819 Computational Type Theory (OPLSS Summer 2018)
Instructor: Robert Harper

Note that the videos in the playlist are from a similar set of lectures given at OPLSS rather than at CMU, as the CMU lectures were not recorded.
Type theory may be presented semantically, with types specifying the behavior of programs, or axiomatically, as a formal system admitting many interpretations. From a computer science perspective, the computational semantics is of the essence, and the role of formalisms is pragmatic, the means for writing programs and building proof checkers. A central tool in either case is the method of logical relations, or Tait's method. Semantically, a wide range of typing constructs can be expressed using logical relations. Formally, logical relations are fundamental to the study of properties of languages, such as decidability of typing or termination properties.

In this course we will study a range of typing concepts using these methods, from simple typing concepts such as functions and data structures found in many programming languages, to polymorphic type theory, which accounts for abstract types, and to dependent type theory, which accounts for mathematical proofs and the theory of program modules. Specific topics will vary according to the needs and interests of the students.

Computer Graphics

Course Name
15-462/662 Computer Graphics (Fall 2020)
Instructor: Keenan Crane

For whatever reason, embedding this playlist directly results in an error, so here's a direct link: https://www.youtube.com/playlist?list=PL9_jI1bdZmz2emSh0UQ5iOdT2xRHFHL7E
This course provides a comprehensive introduction to computer graphics. It focuses on fundamental concepts and techniques, and their cross-cutting relationship to multiple problem domains in graphics (rendering, animation, geometry, imaging). Topics include: sampling, aliasing, interpolation, rasterization, geometric transformations, parameterization, visibility, compositing, filtering, convolution, curves & surfaces, geometric data structures, subdivision, meshing, spatial hierarchies, ray tracing, radiometry, reflectance, light fields, geometric optics, Monte Carlo rendering, importance sampling, camera models, high-performance ray tracing, differential equations, time integration, numerical differentiation, physically-based animation, optimization, numerical linear algebra, inverse kinematics, Fourier methods, data fitting, and example-based synthesis.
15-458/858 Discrete Differential Geometry (Spring 2021)
Instructor: Keenan Crane

This course focuses on three-dimensional geometry processing, while simultaneously providing a first course in traditional differential geometry. Our main goal is to show how fundamental geometric concepts (like curvature) can be understood from complementary computational and mathematical points of view. This dual perspective enriches understanding on both sides, and leads to the development of practical algorithms for working with real-world geometric data. Along the way we will revisit important ideas from calculus and linear algebra, putting a strong emphasis on intuitive, visual understanding that complements the more traditional formal, algebraic treatment. The course provides essential mathematical background as well as a large array of real-world examples and applications. It also provides a short survey of recent developments in digital geometry processing and discrete differential geometry. Topics include: curves and surfaces, curvature, connections and parallel transport, exterior algebra, exterior calculus, Stokes’ theorem, simplicial homology, de Rham cohomology, Helmholtz-Hodge decomposition, conformal mapping, finite element methods, and numerical linear algebra. Applications include: approximation of curvature, curve and surface smoothing, surface parameterization, vector field design, and computation of geodesic distance.
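One of the listed applications, approximation of curvature, can be illustrated in a few lines: the discrete analogue of curvature at a vertex of a polygonal curve is the turning (exterior) angle, and for a simple closed polygon these angles sum to 2π, a discrete version of the turning number theorem. A minimal sketch (the function names are my own, not from the course):

```python
import math

def turning_angle(p_prev, p, p_next):
    # discrete curvature at vertex p: the exterior (turning) angle
    # between the incoming and outgoing edge directions
    v1 = (p[0] - p_prev[0], p[1] - p_prev[1])
    v2 = (p_next[0] - p[0], p_next[1] - p[1])
    d = math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])
    # wrap the angle difference into (-pi, pi]
    while d <= -math.pi:
        d += 2 * math.pi
    while d > math.pi:
        d -= 2 * math.pi
    return d

def total_curvature(polygon):
    # sum of turning angles over all vertices of a closed polygon;
    # equals 2*pi for a simple closed curve traversed counterclockwise
    n = len(polygon)
    return sum(
        turning_angle(polygon[i - 1], polygon[i], polygon[(i + 1) % n])
        for i in range(n)
    )
```

For a unit square traversed counterclockwise, each vertex contributes a turn of π/2, and the total is 2π regardless of how the curve is sampled, which is the kind of structure-preserving discretization the course emphasizes.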


Robotics

Course Name
16-715 Advanced Robot Dynamics and Simulation (Fall 2022)
Instructor: Zachary Manchester

This course explores the fundamental mathematics behind modeling the physics of robots, as well as state-of-the-art algorithms for robot simulation. We will review classical topics like Lagrangian mechanics and Hamilton’s Principle of Least Action, as well as modern computational methods like discrete mechanics and fast linear-time algorithms for dynamics simulation. A particular focus of the course will be rigorous treatments of 3D rotations and non-smooth contact interactions (impacts and friction) that are so prevalent in robotics applications. We will use numerous case studies to explore these topics, including quadrotors, fixed-wing aircraft, wheeled vehicles, quadrupeds, humanoids, and manipulators. Homework assignments will focus on practical implementation of algorithms and a course project will encourage students to apply simulation methods to their own research.
16-745 Optimal Control and Reinforcement Learning (Spring 2022)
Instructor: Zachary Manchester

This is a course about how to make robots move through and interact with their environment with speed, efficiency, and robustness. We will survey a broad range of topics from nonlinear dynamics, linear systems theory, classical optimal control, numerical optimization, state estimation, system identification, and reinforcement learning. The goal is to provide students with hands-on experience applying each of these ideas to a variety of robotic systems so that they can use them in their own research.
Robotics Institute Seminar Series (Ongoing)

Robotics Seminar series hosted by Carnegie Mellon University's Robotics Institute.

Mathematics and Statistics

In general, the math department does not record any of their lectures, so this portion is pretty dry. However, it was worth having this section just to include Po-Shen Loh's lectures. He coaches the USA IMO team and is well-loved by his students.

Course Name
21-228 Discrete Mathematics (Spring 2021)
Instructor: Po-Shen Loh

Note that videos in the playlist are in reverse chronological order.
Combinatorics, which concerns the study of discrete structures such as sets and networks, is as ancient as humanity's ability to count. Yet although in the beginning, combinatorial problems were solved by pure ingenuity, today that alone is not enough. A rich variety of powerful methods have been developed, often drawing inspiration from other fields such as probability, analysis, algorithms, and even algebra and topology.

This course provides a general introduction to this beautiful subject. Students will learn both classical methods for clever counting, as well as how to apply more modern methods to analyze recursions and sequences. Students will also learn fundamental techniques in graph theory. Throughout the course, students will encounter novel challenges that blur the lines between combinatorics and other subjects, such as number theory, probability, and geometry, so that they develop the skills to creatively combine seemingly disparate areas of mathematics.
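As a small example of the "recursions and sequences" theme (my own illustration, not course material): the number of binary strings of length n with no two consecutive 1s satisfies the Fibonacci-style recursion a(n) = a(n-1) + a(n-2), since a valid string either ends in 0 appended to a valid string of length n-1, or ends in 01 appended to a valid string of length n-2.

```python
from itertools import product

def count_no_11(n: int) -> int:
    # a(n) = a(n-1) + a(n-2), with a(0) = 1 (empty string)
    # and a(1) = 2 (the strings "0" and "1")
    if n == 0:
        return 1
    a, b = 1, 2
    for _ in range(n - 1):
        a, b = b, a + b
    return b

def count_no_11_brute(n: int) -> int:
    # brute-force enumeration, to sanity-check the recursion
    return sum(1 for s in product("01", repeat=n)
               if "11" not in "".join(s))
```

Proving such a recursion correct by exhibiting a bijective decomposition, then analyzing the resulting sequence, is exactly the style of argument this course practices.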

This course is structured around challenge. Lecture topics are hand-picked to reflect the rather advanced ability level of the general CMU student, and consequently, much of the curriculum sequence is original. Homework and exam problems are particularly difficult, and require creative problem-solving rather than application of learned techniques. To encourage students to truly develop these skills, collaboration is encouraged on homework, and exams (which are non-collaborative) will be open-notes.
21-738 Extremal Combinatorics (Spring 2022)
Instructor: Po-Shen Loh

Note that videos in the playlist are in reverse chronological order.
This is one of the core graduate courses in advanced combinatorial methods, for the Algorithms, Combinatorics, and Optimization program at CMU. As such, it will be a rigorous and challenging introduction to extremal combinatorics, aimed to provide the necessary background for cutting-edge research in this area. Typical problems concern the maximum or minimum values attained by common graph parameters over certain interesting families of combinatorial objects.

Extremal results are not only interesting in their own right, but are also useful in applications because sometimes one can already draw valuable conclusions from the combinatorial basis of a problem with deeper mathematical structure. In that case, the most easily accessible graph-theoretical properties are often the values of the graph parameters, such as the number of edges, diameter, size of the largest set of pairwise non-adjacent vertices, etc. It is therefore useful to understand what other properties are already implied once certain graph parameters lie in certain ranges.
36-705 Intermediate Statistics (Fall 2016)
Instructor: Larry Wasserman

This course covers the fundamentals of theoretical statistics. Topics include: probability inequalities, point and interval estimation, minimax theory, hypothesis testing, data reduction, convergence concepts, Bayesian inference, nonparametric statistics, bootstrap resampling, VC dimension, prediction and model selection.