Subject Updates Fall 2023

    The following subjects will be offered:


    6.C25 Real World Computation with Julia (was 6.S083)
    • Level: Undergraduate
    • Units: 3-0-9
    • Prereqs: 6.100A, 18.03, and 18.06
    • Instructor: Professor Alan Edelman
    • Schedule: TR1-2:30, room 2-190
    • Satisfies: AUS2, II, Concentration in Numerical Methods

    Description

    Focuses on algorithms and techniques for writing and using modern technical software in a job, lab, or research group environment that may consist of interdisciplinary teams, where performance may be critical, and where the software needs to be flexible and adaptable. Topics include automatic differentiation, matrix calculus, scientific machine learning, parallel and GPU computing, and performance optimization, with introductory applications to climate science, economics, agent-based modeling, and other areas. Labs and projects focus on performant, readable, and composable algorithms and software. Programming will be in Julia. Expects students to have some familiarity with Python, Matlab, or R; no Julia experience is necessary.
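
    Julia is the language of the class; as a language-agnostic taste of one listed topic, automatic differentiation, here is a minimal forward-mode sketch using dual numbers (written in Python for accessibility; illustrative only, not course material):

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# A dual number (value, deriv) carries a value and its derivative;
# arithmetic propagates derivatives exactly by the chain rule
# (no finite-difference approximation involved).
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and df/dx at x in a single forward pass."""
    return f(Dual(x, 1.0)).deriv

# d/dx of x^2 + 3x at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```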


    6.C27/6.C67 Computational Imaging: Physics and Algorithms (was 6.S045)
    • Level: Undergraduate/Graduate
    • Units: 3-0-9
    • Prereqs: 18.C06 and (1.00, 1.000, 2.086, 3.019, or 6.100A)
    • Instructors: Professors George Barbastathis (gbarb@mit.edu), Rajeev Ram (rajeev@mit.edu), Sixian You (sixian@mit.edu), and James LeBeau (lebeau@mit.edu)
    • Schedule: Lec: MW11, room 36-156; Rec: F11, room 34-304
    • Satisfies: UG: AUS; EE Track: Systems Science; G: AAGS, Concentration subject in Applied Physics

    Description

    Explores the contemporary computational understanding of imaging: encoding information about a physical object onto a form of radiation, transferring the radiation through an imaging system, converting it to a digital signal, and computationally decoding and presenting the information to the user. Introduces a unified formulation of computational imaging systems as a three-round “learning spiral”: the first two rounds describe the physical and algorithmic parts in two exemplary imaging systems. The third round involves a class project on an imaging system chosen by students. Undergraduate and graduate versions share lectures but have different recitations. Involves optional “clinics” to even out background knowledge of linear algebra, optimization, and computational imaging-related programming best practices for students of diverse disciplinary backgrounds. Students taking graduate version complete additional assignments.
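
    The encode/transfer/decode pipeline can be sketched with a deliberately simple hypothetical system: a 1-D "integrating" sensor whose measurements are running sums of the object, decoded computationally by differencing (an illustration only, not one of the course's imaging systems):

```python
# Toy computational-imaging pipeline: a 1-D "object" is encoded by an
# integrating sensor (each measurement is the running sum of the signal),
# then decoded computationally by differencing. Real systems replace these
# steps with physical optics and more sophisticated inverse algorithms.

def encode(signal):
    """Forward model: measurement i is the cumulative sum x_0 + ... + x_i."""
    out, total = [], 0.0
    for x in signal:
        total += x
        out.append(total)
    return out

def decode(measurements):
    """Inverse model: recover the signal by first differences."""
    return [m - (measurements[i - 1] if i else 0.0)
            for i, m in enumerate(measurements)]

obj = [1.0, 4.0, 2.0, 0.0, 3.0]
y = encode(obj)      # what the sensor records: [1, 5, 7, 7, 10]
print(decode(y))     # recovers the original object
```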


    6.S043/6.S953 Machine Learning for Drug Discovery (Cancelled FT23)
    • Level: Undergraduate/Graduate
    • Units: 3-0-9
    • Prereqs: 6.3900 or 6.C01, or equivalents
    • Instructor: Prof. Regina Barzilay (regina@csail.mit.edu)
    • Schedule:
    • Satisfies: AUS (U); 6-4 AUS (U); Concentration subject in BioEECS (G), AAGS (G), grad_AUS (G)

    Description

    Covers concepts and machine learning algorithms for design and development of therapeutics, ranging from early molecular discovery to optimization of clinical trials.  Emphasis on learning neural representations of biological objects (e.g., small molecules, proteins, cells) and modeling their interactions. Covers machine learning for property prediction, de-novo molecular design, binding, docking, target discovery, and methods for experimental design. Students enrolled in the graduate version complete additional assignments.


    6.S062 Generative Machine Learning in K-12 Education (meets with MAS.S10, MAS.S60)
    • Level: Undergraduate
    • Units: 2-0-10
    • Prereqs: 6.100A or permission of instructor
    • Instructors: Professors Hal Abelson (hal@mit.edu), Randy Davis (davis@csail.mit.edu), and Cynthia Breazeal (cynthiab@media.mit.edu)
    • Schedule:  T1-3, room 34-303
    • Satisfies: II

    Description

    The introduction of transformer architectures in 2017 triggered an evolution in machine learning that today lets anyone make original computer-generated essays, stories, pictures, videos, and programs, all without the need to code. Participants in this class will get a foundation in this technology and explore new opportunities that it enables in K-12 education. Much of the work will be project-based, involving the construction of new learning tools and testing them with K-12 students and their teachers.


    6.S890 Topics in Multiagent Learning
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1220 or 6.7201, 6.1200
    • Instructors: Professors Costis Daskalakis (costis@csail.mit.edu) and Gabriele Farina
    • Schedule: TR11-12:30, room 3-333
    • Satisfies: AUS2, II, 6-4 AUS, AAGS, 6-3 Track in Theory, Concentration subject in Theoretical CS, AI

    Description

    Presents research topics at the interface of computer science, machine learning, and game theory. Explores computational aspects of strategic behavior for decision-makers in nonstationary multiagent environments, game-theoretic notions of optimality that are applicable to these settings, and how decision-makers may learn optimal strategies from repeated interaction. Presents equilibrium computation algorithms, complexity barriers for equilibria and fixed points, the theory of learning in games, and multi-agent reinforcement learning. Also covers practical aspects of learning in games, including recent progress in solving Go, poker, and other large games.
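
    The idea of learning equilibria from repeated interaction can be illustrated with multiplicative-weights (Hedge) self-play in matching pennies: neither player converges pointwise, but the time-averaged strategies approach the game's unique Nash equilibrium. This is a standard textbook construction, not code from the course:

```python
import math

# Multiplicative-weights (Hedge) self-play in matching pennies: the row
# player wants the coins to match, the column player wants a mismatch.
# The unique Nash equilibrium is (1/2, 1/2) for both players, and the
# time-averaged strategies of no-regret learners converge toward it.

A = [[1.0, -1.0], [-1.0, 1.0]]   # row player's payoffs; column player gets -A

def mwu_selfplay(T=20000, eta=0.01):
    w_row, w_col = [2.0, 1.0], [1.0, 1.0]   # asymmetric start to break symmetry
    avg_row, avg_col = [0.0, 0.0], [0.0, 0.0]
    for _ in range(T):
        p = [w / sum(w_row) for w in w_row]
        q = [w / sum(w_col) for w in w_col]
        for i in range(2):
            avg_row[i] += p[i] / T
            avg_col[i] += q[i] / T
        # Expected payoff of each pure action against the opponent's mix.
        u_row = [sum(A[i][j] * q[j] for j in range(2)) for i in range(2)]
        u_col = [-sum(A[i][j] * p[i] for i in range(2)) for j in range(2)]
        w_row = [w_row[i] * math.exp(eta * u_row[i]) for i in range(2)]
        w_col = [w_col[j] * math.exp(eta * u_col[j]) for j in range(2)]
        # Renormalize to avoid overflow; only the ratio of weights matters.
        w_row = [w / sum(w_row) for w in w_row]
        w_col = [w / sum(w_col) for w in w_col]
    return avg_row, avg_col

avg_row, avg_col = mwu_selfplay()
print(avg_row, avg_col)   # both close to [0.5, 0.5]
```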


    6.S891 Algorithmic Counting and Sampling: Probability, Polynomials, and More
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1220 and (6.3700 or 6.3702 or 6.7700) or permission of instructor
    • Instructor: Professor Kuikui Liu (liukui@mit.edu)
    • Schedule: TR9:30-11, room 34-301
    • Satisfies: AUS2, II, 6-4 AUS, AAGS, 6-3 Track in Theory, Concentration subject in Theoretical CS, AI

    Description

    This course introduces the modern theory of algorithms for sampling from high-dimensional probability distributions and estimating their partition functions. These fundamental algorithmic primitives lie at the foundation of statistics, physics, and machine learning. We will study a diverse set of algorithmic paradigms for solving these problems based on Markov chains, decay of correlations, geometry of polynomials, and more. We will further study the rich set of analytic, probabilistic, and algebraic tools used to give performance guarantees to these algorithms.
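
    A minimal instance of the counting/sampling connection described above is the hard-core model at fugacity 1, whose partition function is simply the number of independent sets of a graph. Brute force is exact but exponential; Glauber dynamics samples and scales much further (an illustrative sketch, not course code):

```python
from itertools import combinations
import random

# Counting independent sets = evaluating the hard-core model's partition
# function at fugacity 1. Brute force works on tiny graphs; Glauber
# dynamics (heat-bath resampling of one vertex at a time) samples from
# the uniform distribution over independent sets.

def is_independent(vertices, edges):
    s = set(vertices)
    return not any(u in s and v in s for u, v in edges)

def count_independent_sets(n, edges):
    return sum(1 for k in range(n + 1)
               for subset in combinations(range(n), k)
               if is_independent(subset, edges))

def glauber_step(state, n, edges, rng):
    """Resample the occupancy of one uniformly random vertex (fugacity 1)."""
    v = rng.randrange(n)
    state.discard(v)
    # Occupy v with probability 1/2 if doing so keeps the set independent.
    if rng.random() < 0.5 and is_independent(state | {v}, edges):
        state.add(v)
    return state

# Path graph 0-1-2 has 5 independent sets: {}, {0}, {1}, {2}, {0,2}.
edges = [(0, 1), (1, 2)]
print(count_independent_sets(3, edges))   # 5

rng = random.Random(0)
state = set()
for _ in range(1000):
    glauber_step(state, 3, edges, rng)
print(is_independent(state, edges))       # chain stays on independent sets: True
```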


    6.S896 Algorithmic Statistics
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1220 and 18.600 and 18.200 (other probability and linear algebra subjects also accepted); graduate-level mathematical maturity
    • Instructors: Professors Sam Hopkins (samhop@mit.edu) and Costis Daskalakis (costis@csail.mit.edu)
    • Schedule: TR2:30-4, room 32-124
    • Satisfies: II, 6-3 Track:Theory, AAGS, Concentration subject in Theoretical CS or AI

    Description

    Introduction to algorithms and computational complexity for high-dimensional statistical inference problems, with focus on provable polynomial-time guarantees. Covers modern algorithm design techniques via convex programming and Sum of Squares method, graphical models as a language to describe complex but tractable high-dimensional learning problems and associated learning algorithms, and basics of complexity for statistical problems, including statistical query and low-degree lower bounds and reductions.


    6.S898 Deep Learning
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: (6.3900[6.036] or 6.C01 or 6.3720[6.401]) and (6.3700[6.041] or 6.3800[6.008] or 18.05) and (18.C06 or 18.06)
    • Instructors: Professors Phillip Isola (phillipi@mit.edu) and Sara Beery (beery@mit.edu)
    • Schedule: TR1-2:30, room 2-190
    • Satisfies: AUS2, II; AAGS, grad_AUS2; Concentration Subject in AI

    Description

    Fundamentals of deep learning, including both theory and applications. Topics include neural net architectures (MLPs, CNNs, RNNs, graph nets, transformers), geometry and invariances in deep learning, backpropagation and automatic differentiation, learning theory and generalization in high-dimensions, and applications to computer vision, natural language processing, and robotics.
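
    Backpropagation, one of the listed topics, is the chain rule applied systematically; a standard sanity check compares the analytic gradient to a finite-difference estimate. The single-neuron model below is a generic illustration, not course material:

```python
import math

# Gradient of a single-neuron model y = sigmoid(w*x + b) under squared
# loss, computed by hand via the chain rule (backpropagation in
# miniature) and verified against a central finite-difference estimate.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, t):
    y = sigmoid(w * x + b)
    return 0.5 * (y - t) ** 2

def grad(w, b, x, t):
    """Analytic gradients dL/dw and dL/db via the chain rule."""
    y = sigmoid(w * x + b)
    dz = (y - t) * y * (1.0 - y)   # dL/dy * dy/dz
    return dz * x, dz              # dz/dw = x, dz/db = 1

w, b, x, t = 0.7, -0.2, 1.5, 1.0
gw, gb = grad(w, b, x, t)
eps = 1e-6
num_gw = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
print(abs(gw - num_gw) < 1e-8)   # analytic and numeric gradients agree: True
```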


    6.S951 Modern Mathematical Statistics
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: (6.S06 or 18.C06) and (6.3700 or 18.600) and (18.650 or 6.3720) and 18.100 or equivalent
    • Instructor: Prof. Martin Wainwright (mjwain@mit.edu)
    • Schedule: TR9:30-11, room 36-156
    • Satisfies: AUS; 6-4 AUS; AAGS; AI & Signal Systems Concentration

    Description

    Graduate class in mathematical statistics aimed at students interested in statistical research. Requires previous exposure to undergraduate statistics (e.g., 18.650 or 6.3720), along with a strong undergraduate background in linear algebra, probability, and real analysis. Emphasis on proofs and fundamental understanding.


    6.S954 Algorithmic Lower Bounds: Fun with Hardness Proofs
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1210
    • Instructor: Prof. Erik Demaine (edemaine@mit.edu)
    • Schedule:  MW3-4:30, room 32-082
    • Satisfies: grad_AUS; AAGS; CS Theory Track; Theoretical CS Concentration

    Description

    A practical algorithmic approach to proving problems computationally hard for various complexity classes: P, NP, ASP, #P, APX, W[1], PSPACE, EXPTIME, PPAD, R. Variety of hardness proof styles, reductions, and gadgets. Hardness of approximation, counting solutions, and fixed-parameter algorithms. Connection between games and computation, with many examples drawn from games and puzzles.


    6.S955 Applied Numerical Algorithms
    • Level: Graduate
    • Units:  3-0-9
    • Prereqs: linear algebra, multivariable differential calculus, and basic coding
    • Instructor:  Prof. Justin Solomon (jsolomon@mit.edu)
    • Schedule: TR2:30-4, room 24-307
    • Satisfies: AUS; 6-3 CS Theory Track; 6-4 AUS; AAGS; Graphics HCI, Numerical Methods Concentration

    Description

    Broad survey of numerical algorithms used in graphics, vision, robotics, machine learning, and scientific computing, with emphasis on incorporating these algorithms into downstream applications.  We focus on challenges that arise in applying/implementing numerical algorithms and recognizing which numerical methods are relevant to different applications.  Topics include numerical linear algebra (QR, LU, SVD matrix factorizations; eigenvectors; conjugate gradients), ordinary and partial differential equations (divided differences, finite element method), and nonlinear systems and optimization (gradient descent, Newton/quasi-Newton methods, gradient-free optimization, constrained optimization).  Examples and case studies drawn from the computer science literature.
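
    As a taste of the nonlinear-systems material, Newton's method illustrates the survey's emphasis on knowing when a method applies: it converges quadratically near a simple root, but needs a derivative and a reasonable starting point (a generic sketch, not course code):

```python
# Newton's method for root finding, one of the nonlinear-systems topics
# listed above: iterate x <- x - f(x)/f'(x). Near a simple root the
# convergence is quadratic, so the number of correct digits roughly
# doubles per step.

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of f(x) = x^2 - 2 starting from x0 = 1: converges to sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
print(root)   # ~1.41421356237...
```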


    6.S979 Values and AI: Accidents, Alignment, and Misuse
    • Level: Graduate
    • Units: 2-0-10
    • Prereqs: 6.7900 or permission of instructor
    • Instructor: Professor Dylan Hadfield-Menell (dhm@csail.mit.edu)
    • Schedule: T1-3, room 34-301
    • Satisfies: AAGS; Concentration Subject in AI

    Description

    An interdisciplinary graduate seminar that covers the interactions between values and the development of artificial intelligence tools. The course will focus on the intersection between technology, ethics, and law. We will consider several applications of AI, with a focus on recommendation systems, large language models, and autonomous vehicles. Topics will include: preference elicitation, (cooperative) inverse reinforcement learning, planning under uncertainty, uncertainty quantification, reward design, qualitative evaluation, interpretable machine learning, regulation of emerging technology, human-robot interaction, principal-agent problems, societal feedback loops, existential risk from advanced AI, multi-agent cooperation, participatory design, and democratic AI. Students will be graded on short reading quizzes, a paper presentation, and a final paper.


    6.S980 Machine Learning for Inverse Graphics
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.3900[6.036] or permission of instructor
    • Instructor: Professor Vincent Sitzmann (sitzmann@mit.edu)
    • Schedule: TR2:30-4, room 4-270
    • Satisfies:  AAGS, grad_AUS2; Concentration subject in AI or Graphics & HCI

    Description

    From a single picture, humans reconstruct a mental representation of the underlying 3D scene that is incredibly rich in information such as shape, appearance, physical properties, purpose, how things would feel, smell, sound, etc. These mental representations allow us to understand, navigate, and interact with our environment in our everyday lives. We learn this with little supervision, mainly by interacting with and observing the world around us.

    Emerging neural scene representations aim to build models that replicate this behavior: trained in a self-supervised manner, they reconstruct rich representations of 3D scenes that can then be used in downstream tasks such as computer vision, robotics, and graphics.

    This course covers fundamental and advanced techniques in this field at the intersection of computer vision, computer graphics, and deep learning. It will lay the foundations of how cameras see the world, how we can represent 3D scenes for artificial intelligence, how we can learn to reconstruct these representations from only a single image, how we can guarantee certain kinds of generalization, and how we can train these models in a self-supervised way.


    6.S981 Introduction to Program Synthesis
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1010, 6.1200 or equivalent
    • Instructor: Prof. Armando Solar-Lezama (asolar@csail.mit.edu)
    • Schedule: TR1-2:30, room 26-328
    • Satisfies: grad_AUS, 6-3 Track: Programming Principles and Tools, II, AAGS, Concentration subject in Computer Systems

    Description

    The goal of this course is to provide a comprehensive introduction to the field of program synthesis, an emerging field that sits at the intersection of programming systems, formal methods, and artificial intelligence. The course will be divided into three major sections: the first focuses on program induction from examples and covers a variety of techniques to search large program spaces. The second focuses on synthesis from expressive specifications and the interaction between synthesis and verification. The third focuses on synthesis with quantitative specifications and the intersection between program synthesis and machine learning. The course will be graded on the basis of three problem sets and an open-ended final project.
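
    The first section's core idea, inductive synthesis as search over a program space, can be sketched by enumerating expressions of a tiny hypothetical DSL in order of size until one is consistent with the given input-output examples (illustrative only, not course code):

```python
from itertools import product

# Inductive program synthesis in miniature: enumerate expressions over a
# tiny hypothetical DSL (variable x, small constants, +, *) in order of
# size, and return the first one consistent with all input-output examples.

LEAVES = ["x", "1", "2", "3"]
OPS = ["+", "*"]

def enumerate_exprs(size):
    """Yield all DSL expressions with exactly `size` nodes."""
    if size == 1:
        yield from LEAVES
    else:
        for lsize in range(1, size - 1):
            for op, left, right in product(
                    OPS, enumerate_exprs(lsize),
                    enumerate_exprs(size - 1 - lsize)):
                yield f"({left} {op} {right})"

def synthesize(examples, max_size=7):
    """Return the smallest expression matching every (input, output) pair."""
    for size in range(1, max_size + 1):
        for expr in enumerate_exprs(size):
            if all(eval(expr, {"x": x}) == y for x, y in examples):
                return expr
    return None

# Examples consistent with f(x) = 2x + 3.
print(synthesize([(0, 3), (1, 5), (4, 11)]))   # an expression equal to 2x + 3
```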


    6.5630 Advanced Topics in Cryptography
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.5620
    • Instructor: Professor Yael Tauman Kalai (tauman@mit.edu)
    • Schedule: F1-4, room 26-322

    Description

    In this course we will learn about the evolution of proofs in computer science.  In particular, we will learn about fascinating proof systems such as interactive proofs, multi-prover interactive proofs and probabilistically checkable proofs.  We will then show how to use cryptography to convert these powerful proof systems into succinct non-interactive arguments (SNARGs).  


    6.5940 TinyML and Efficient Deep Learning Computing (was 6.S965)
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1910 and 6.3900
    • Instructor: Professor Song Han
    • Schedule: TR3:30-5, room 36-156
    • Satisfies: AAGS, grad_AUS2; Concentration subject in Computer Systems or AI

    Description

    Introduces efficient deep learning computing techniques that enable powerful deep learning applications on resource-constrained devices. Topics include model compression, pruning, quantization, neural architecture search, distributed training, data/model parallelism, gradient compression, and on-device fine-tuning. Also introduces application-specific acceleration techniques for video recognition, point clouds, and generative AI (diffusion models, LLMs). Students will get hands-on experience accelerating deep learning applications with an open-ended design project.
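
    Of the listed topics, quantization is the easiest to sketch: symmetric int8 quantization stores each weight as a signed 8-bit integer plus one shared float scale, trading a bounded rounding error for a 4x memory reduction versus float32 (a generic sketch, not code from the course):

```python
# Symmetric int8 quantization of a weight tensor: store each weight as a
# signed 8-bit integer plus one shared float scale. The per-weight
# rounding error is at most half a quantization step.

def quantize(weights, bits=8):
    qmax = 2 ** (bits - 1) - 1                 # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    q = [max(-qmax, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

weights = [0.42, -1.27, 0.003, 0.9, -0.55]
q, scale = quantize(weights)
recovered = dequantize(q, scale)
max_err = max(abs(w - r) for w, r in zip(weights, recovered))
print(max_err <= scale / 2)   # rounding error within half a step: True
```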


    6.7120/6.7121 Principles of Modeling, Computing and Control for Decarbonized Electric Energy Systems (was 6.S082 & 6.S967)
    • Level: Undergraduate/Graduate
    • Units: 4-0-8
    • Prereqs: 6.2200, (6.2000 and 6.3100), or permission of instructor
    • Instructor: Professor Marija Ilic, (ilic@mit.edu)
    • Schedule: Lec: MW10:30-12, room 26-322; Recitation: R11 (26-314)
    • Satisfies: AAGS Control; Concentration in Control; AUS2; grad_AUS2

    Description

    Introduces fundamentals of electric energy systems as complex dynamical network systems. Topics include coordinated and distributed modeling and control methods for efficient and reliable power generation, delivery, and consumption; data-enabled algorithms for integrating clean intermittent resources, storage, and flexible demand, including electric vehicles; examples of network congestion management, frequency, and voltage control in electrical grids at various scales; and design and operation of supporting markets. Students taking graduate version complete additional assignments.


    6.7920 Reinforcement Learning: Foundations and Methods (was 6.246)
    • Level: Graduate
    • Units: 4-0-8
    • Prereqs: 6.3700 or permission of instructor
    • Instructor: Prof. Cathy Wu (cathywu@mit.edu)
    • Schedule: TR2:30-4, room 4-237
    • Satisfies: Concentration subject in Control

    Description

    Examines reinforcement learning (RL) as a methodology for approximately solving sequential decision-making under uncertainty, with foundations in optimal control and machine learning. Provides a mathematical introduction to RL, including dynamic programming, statistical, and empirical perspectives, and special topics. Core topics include: dynamic programming, special structures, finite and infinite horizon Markov Decision Processes, value and policy iteration, Monte Carlo methods, temporal differences, Q-learning, stochastic approximation, and bandits. Also covers approximate dynamic programming, including value-based methods and policy space methods. Applications and examples drawn from diverse domains. Focus is mathematical, but is supplemented with computational exercises. An analysis prerequisite is suggested but not required; mathematical maturity is necessary.
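
    The value iteration topic above can be sketched on a tiny hypothetical MDP: repeatedly applying the Bellman optimality backup V(s) <- max_a [r(s,a) + gamma * V(s')] converges to the optimal values (illustrative only, not course material):

```python
# Value iteration on a tiny hypothetical MDP (not an example from the
# course). State 0 chooses between "stay" (reward 1.2, remain in 0) and
# "quit" (reward 2.0, move to the absorbing state 1); discount 0.5.
# Staying forever is optimal: V(0) = 1.2 / (1 - 0.5) = 2.4 > 2.0.

GAMMA = 0.5
# transitions[s][a] = (reward, next_state); dynamics are deterministic here.
transitions = {
    0: {"stay": (1.2, 0), "quit": (2.0, 1)},
    1: {"stay": (0.0, 1)},                    # absorbing, zero reward
}

def value_iteration(iters=100):
    V = {0: 0.0, 1: 0.0}
    for _ in range(iters):
        # Bellman optimality backup for every state simultaneously.
        V = {s: max(r + GAMMA * V[s2] for r, s2 in acts.values())
             for s, acts in transitions.items()}
    return V

V = value_iteration()
print(V[0], V[1])   # approximately 2.4 and 0.0
```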