Subject Updates Fall 2023

    The following subjects will be offered:

    6.S043/6.S953 Machine Learning for Drug Discovery
    • Level: Undergraduate/Graduate
    • Units: 3-0-9
    • Prereqs: 6.3900 or 6.C01, or equivalents
    • Instructor: Prof. Regina Barzilay (regina@csail.mit.edu)
    • Schedule: MW2:30-4, room 34-302
    • Satisfies: AUS (U); 6-4 AUS (U); Concentration subject in BioEECS (G), AAGS (G), grad_AUS (G)

    Description

    Covers concepts and machine learning algorithms for design and development of therapeutics, ranging from early molecular discovery to optimization of clinical trials.  Emphasis on learning neural representations of biological objects (e.g., small molecules, proteins, cells) and modeling their interactions. Covers machine learning for property prediction, de-novo molecular design, binding, docking, target discovery, and methods for experimental design. Students enrolled in the graduate version complete additional assignments.


    6.S062 Generative Machine Learning in K-12 Education
    • Level: Undergraduate
    • Units: 2-0-10
    • Prereqs: 6.100A or permission of instructor
    • Instructor:  Prof. Hal Abelson (hal@mit.edu); Prof. Cynthia Breazeal (cynthiab@media.mit.edu)
    • Schedule:  T1-3, room 34-303
    • Satisfies: II

    Description

    The introduction of transformer architectures in 2017 triggered an evolution in machine learning that today lets anyone make original computer-generated essays, stories, pictures, videos and programs, all without the need to code.  Participants in this class will get a foundation in this technology and explore new opportunities that it enables in K-12 education.  Much of the work will be project-based, involving the construction of new learning tools and testing them with K-12 students and their teachers.


    6.S898 Deep Learning
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: (6.3900[6.036] or 6.C01 or 6.3720[6.401]) and (6.3700[6.041] or 6.3800[6.008] or 18.05) and (18.C06 or 18.06)
    • Instructor: Prof. Phillip Isola (phillipi@mit.edu) and Prof. Sara Beery (beery@mit.edu)
    • Schedule: TR1-2:30, room 37-212
    • Satisfies: AUS2, II; AAGS, grad_AUS2; Concentration Subject in AI

    Description

    Fundamentals of deep learning, including both theory and applications. Topics include neural net architectures (MLPs, CNNs, RNNs, graph nets, transformers), geometry and invariances in deep learning, backpropagation and automatic differentiation, learning theory and generalization in high dimensions, and applications to computer vision, natural language processing, and robotics.


    6.S951 Modern Mathematical Statistics
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: (6.C06 or 18.C06) and (6.3700 or 18.600) and (18.650 or 6.3720) and 18.100, or equivalent
    • Instructor:  Prof. Martin Wainwright (mjwain@mit.edu), Prof. Alexander Rakhlin (rakhlin@mit.edu), Prof. Philippe Rigollet (rigollet@math.mit.edu)
    • Schedule: TR9:30-11, room 36-156
    • Satisfies: AUS; 6-4 AUS; AAGS; AI & Signal Systems Concentration

    Description

    Graduate class in mathematical statistics targeted at students interested in statistical research.  Requires previous exposure to undergraduate statistics (e.g., 18.650/6.3720), along with a strong undergraduate background in linear algebra, probability, and real analysis.  Emphasis on proofs and fundamental understanding.


    6.S954 Algorithmic Lower Bounds: Fun with Hardness
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.1210
    • Instructor: Prof. Erik Demaine (edemaine@mit.edu)
    • Schedule:  MW3-4:30, room 32-082
    • Satisfies: grad_AUS; AAGS; CS Theory Track, Theoretical CS Concentration

    Description

    A practical algorithmic approach to proving problems computationally hard for various complexity classes: P, NP, ASP, #P, APX, W[1], PSPACE, EXPTIME, PPAD, R. Variety of hardness proof styles, reductions, and gadgets. Hardness of approximation, counting solutions, and fixed-parameter algorithms. Connection between games and computation, with many examples drawn from games and puzzles.


    6.S955 Applied Numerical Algorithms
    • Level: Graduate
    • Units:  3-0-9
    • Prereqs: linear algebra, multivariable differential calculus, and basic coding
    • Instructor:  Prof. Justin Solomon (jsolomon@mit.edu)
    • Schedule: TR2:30-4, room 24-307
    • Satisfies: AUS; 6-3 CS Theory Track; 6-4 AUS; AAGS; Graphics & HCI, Numerical Methods Concentration

    Description

    Broad survey of numerical algorithms used in graphics, vision, robotics, machine learning, and scientific computing, with emphasis on incorporating these algorithms into downstream applications.  We focus on challenges that arise in applying/implementing numerical algorithms and recognizing which numerical methods are relevant to different applications.  Topics include numerical linear algebra (QR, LU, SVD matrix factorizations; eigenvectors; conjugate gradients), ordinary and partial differential equations (divided differences, finite element method), and nonlinear systems and optimization (gradient descent, Newton/quasi-Newton methods, gradient-free optimization, constrained optimization).  Examples and case studies drawn from the computer science literature.


    6.S980 Machine Learning for Inverse Graphics
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.3900[6.036] OR permission of instructor
    • Instructor: Prof. Vincent Sitzmann (sitzmann@mit.edu)
    • Schedule: TR2:30-4, room 4-270
    • Satisfies:  AAGS, grad_AUS2; AI or Graphics & HCI concentration subjects

    Description

    From a single picture, humans reconstruct a mental representation of the underlying 3D scene that is incredibly rich in information such as shape, appearance, physical properties, purpose, how things would feel, smell, sound, etc. These mental representations allow us to understand, navigate, and interact with our environment in our everyday lives. We learn this with little supervision, mainly by interacting with and observing the world around us.

    Emerging neural scene representations aim to build models that replicate this behavior: Trained in a self-supervised manner, the goal is to reconstruct rich representations of 3D scenes that can then be used in downstream tasks such as computer vision, robotics, and graphics. 

    This course covers fundamental and advanced techniques in this field at the intersection of computer vision, computer graphics, and deep learning. It will lay the foundations of how cameras see the world, how we can represent 3D scenes for artificial intelligence, how we can learn to reconstruct these representations from only a single image, how we can guarantee certain kinds of generalization, and how we can train these models in a self-supervised way.


    6.5630 Advanced Topics in Cryptography
    • Level: Graduate
    • Units: 3-0-9
    • Prereqs: 6.5620
    • Instructor: Prof. Yael Tauman Kalai (tauman@mit.edu)
    • Schedule: TBA

    Description

    This course is about the evolution of proofs in computer science.  We will learn about the power of interactive proofs, multi-prover interactive proofs, and probabilistically checkable proofs.  We will then show how to use cryptography to convert these powerful proof systems into succinct, computationally sound non-interactive arguments (SNARGs).