Subject Updates Fall 2022

    These are the subject updates for the Fall 2022 term.

    When new subject numbers are listed below, previous subject numbers are listed in [brackets].


    6.1900 [6.0004] / 6.S077 Introduction to Low-Level Programming in C and Assembly

    Half-term subject, taught in both the first half (H1) and the second half (H2); half-semester assignments are announced before the semester starts

    Level: Undergraduate

    Units:  2-2-2

    Prereqs:  6.100A [6.0001]

    Instructors:  Joseph Steinmeyer (jodalys@mit.edu) and Silvina Hanono Wachman (silvina@mit.edu)

    Schedule: M12:30-2, room 34-101

    Satisfies:  same as 6.1900 [6.0004]

    Description

    Introduction to C and assembly language for students coming from a Python background (6.0001). Studies the C language, focusing on memory and related topics: pointers, how different data structures are stored in memory, the stack, and the heap. The aim is to build a strong understanding of the constraints involved in manipulating complex data structures in modern computational systems. Studies assembly language to build a firm understanding of how high-level languages are translated into machine-level instructions.

    Since 6.1900 is taught in both halves of the semester, and requires hardware and lab space, you will be assigned to a section in either the first half or second half in order to balance section size. Please preregister or register for 6.1900. You will be asked for your constraints and preferences, and before the semester starts, you will receive an assignment to either the first half or the second half of the semester. (The subject number for the first half course may be 6.S077 for scheduling reasons, but 6.S077 is the same as 6.1900.)


    6.S045 Computational Imaging: Physics and Algorithms

    Level: Undergraduate

    Units: 3-0-9

    Prerequisites: (1.000 or 1.00 or 2.086 or 3.019 or 6.100A[6.0001]) and 18.C06

    Instructors: Professors George Barbastathis (gbarb@mit.edu), Rajeev Ram (rajeev@mit.edu), Sixian You (sixian@mit.edu), James LeBeau (lebeau@mit.edu)

    Schedule:  Lectures MW11, room 36-156; Recitation F11, room 34-302

    Satisfies:

    Description

    Contemporary understanding of imaging is essentially computational: it involves encoding information about a physical object onto a form of radiation, transferring the radiation through the imaging system, converting it to a digital signal, and finally computationally decoding the object information and presenting it to the user. This class introduces a unified formulation of computational imaging systems in the form of a three-round “learning spiral”: in the first two rounds, the instructors describe the physical and algorithmic parts of two exemplary imaging systems. Students conduct the third round themselves, in the context of a class project on an imaging system of their choice. The undergraduate and graduate versions share lectures but have different recitations. Throughout the term we also conduct optional “clinics” to even out background knowledge of linear algebra, optimization, and computational imaging-related programming best practices for students of diverse disciplinary backgrounds.
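
    As a point of orientation (a standard textbook formulation, not necessarily the one used in class), this encode/decode view can be written as a linear inverse problem: the measurement y is an encoding y = A x + n of the object x through the forward operator A with noise n, and computational decoding recovers an estimate by regularized least squares,

        \hat{x} = \arg\min_x \| A x - y \|_2^2 + \lambda R(x),

    where the regularizer R encodes prior knowledge about the object and \lambda trades data fidelity against that prior.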


    6.S046/6.S976  Silicon Photonics

    Units: 3-0-9

    Level: Undergraduate/Graduate

    Prereqs: 6.2300 [6.013] or 8.07

    Instructor: Prof. Jelena Notaros (notaros@mit.edu)

    Schedule: MW3-4:30, room 26-328

    Satisfies: Physics Concentration, AUS2, DLAB, AAGS, grad_AUS2, TQE (6.S976)

    Description

    Introduces students to the field of silicon photonics with topics spanning silicon-photonics-based devices, circuits, systems, platforms, and applications. Covers the foundational concepts behind silicon photonics based in electromagnetics, optics, and device physics; the design of silicon-photonics-based devices (including waveguides, couplers, splitters, resonators, antennas, modulators, detectors, and lasers) using both theoretical analysis and numerical simulation tools; the engineering of silicon-photonics-based circuits and systems with a focus on a variety of application areas (spanning computing, communications, sensing, quantum, displays, and biophotonics); the development of silicon-photonics-based platforms, including fabrication and materials considerations; and the characterization of these silicon-photonics-based devices and systems through hands-on laboratory demonstrations and projects. Students taking the graduate version complete additional assignments.


    6.S040 Computational Foundations for Ethical ML in Life Sciences and Health Care

    Level: Undergraduate

    Units: 2-0-4

    Prerequisites: 6.3900[6.036] or 6.C01 or (pre-req or coreq 6.9320[6.904]*)

    Instructor: Prof. Regina Barzilay, regina@csail.mit.edu

    Schedule: Lecture Thursdays 12:30-2:30, room 24-307

    Satisfies: EECS elective when taken concurrently with 6.9320[6.904]

    Enrollment limited to 20

    Description

    The class focuses on designing machine learning methods for life sciences and their ethical implications. The class will cover common pitfalls in data preparation and algorithm design that lead to biased and unsafe systems. Next, we will cover algorithmic solutions that can address these deficiencies, including automatic bias detection, algorithms for robust learning in the presence of bias, privacy-preserving data access, and methods for uncertainty estimation. We will also discuss the impact of regulatory policies on algorithm design and ethical considerations. The class will combine lectures and group discussions.
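
    As a minimal illustration of the kind of automatic bias detection mentioned above (a sketch under simple assumptions, not course material; the function name is ours), one common first check compares a classifier's positive-prediction rates across groups:

        import numpy as np

        def demographic_parity_gap(y_pred, group):
            """Absolute gap in positive-prediction rates between two groups.

            y_pred: binary predictions (0/1); group: binary group membership (0/1).
            A large gap is one simple red flag for a biased classifier.
            """
            y_pred, group = np.asarray(y_pred), np.asarray(group)
            return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

        # Toy example: the model flags group 1 far more often than group 0.
        print(demographic_parity_gap([1, 0, 0, 0, 1, 1, 1, 1],
                                     [0, 0, 0, 0, 1, 1, 1, 1]))   # prints 0.75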

    Course 6 has proposed that students who complete this subject in the fall term of AY2022-2023 be awarded CI-M (Communication Intensive in the Major) credit. The subject is currently under review by the Subcommittee on the Communication Requirement (SOCR). A final decision should be made no later than Drop Date, November 23, 2022.

    If you have questions about this subject or your individual status with respect to the Communication Requirement, please email commreq@mit.edu.

    *6.9320 [6.904] (Ethics for Engineers) can be taken concurrently, or it can serve as a prerequisite if taken in an earlier term.


    6.S082/6.S967 Principles of Modeling, Computing and Control for Decarbonized Electric Energy Systems

    Level: Undergraduate/Graduate

    Units: 4-0-8

    Prerequisites:  6.2000 [6.002] and 6.3000 [6.003], or permission of instructor

    Instructor: Dr. Marija Ilic (ilic@mit.edu)

    Schedule: Lectures MW1-2:30, room 34-302; Recitation F11:30-12:30, room 5-233, or F4, room 38-166

    Satisfies: AAGS Control; Concentration in Control; AUS2; grad_AUS2; TQE (6.S967)

    Description

    The overarching goal of this course is to prompt students to apply systems-level thinking and engage in emerging research on efficient, sustainable, and physically and economically feasible electric power systems. It offers modeling principles for modern electric power systems, starting from a brief review of their structure and physical components. In particular, a novel unified modeling of energy/power dynamics is introduced to conceptualize their operation and control and to manage temporal, functional, and spatial complexity. Examples of potential benefits from novel hardware and software technologies are introduced to show how digitalization and distributed control play a key role on the path to decarbonized electric energy services.


    6.S083 Julia – Solving Real-World Problems with Computation (meets with 1.S992, 16.S686, 18.S191, 22.S093)

    Level: Undergraduate

    Units:  3-0-9

    Prereqs:  18.03, 18.06, 6.100A[6.0001], or equivalents

    Instructor:  Professor Alan Edelman (edelman@mit.edu)

    Schedule:  TR1-2:30, room 2-131

    Satisfies:  AUS2, II, Concentration in Numerical Methods

    Description

    Focuses on algorithms and techniques for writing and using modern technical software in a job, lab, or research-group environment that may consist of interdisciplinary teams, where performance may be critical, and where the software needs to be flexible and adaptable. Topics covered include automatic differentiation, matrix calculus, scientific machine learning, parallel and GPU computing, and performance optimization, with introductory applications to climate science, economics, agent-based modeling, and other areas. Labs and projects focus on performant, readable, composable algorithms and software. Programming will be in Julia. We expect students to have some familiarity with Python, Matlab, or R; no prior Julia experience is expected.
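
    Class programming is in Julia, but the flavor of one listed topic, automatic differentiation, can be sketched in a few lines of Python (an illustrative sketch, not course material) using dual numbers:

        class Dual:
            """Dual number a + b*eps with eps**2 == 0; the eps part carries the derivative."""
            def __init__(self, val, eps=0.0):
                self.val, self.eps = val, eps
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.eps + o.eps)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val, self.val * o.eps + self.eps * o.val)
            __rmul__ = __mul__

        def derivative(f, x):
            return f(Dual(x, 1.0)).eps   # seed dx/dx = 1, read off df/dx

        # d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
        print(derivative(lambda x: x * x + 3 * x, 2.0))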


    6.S084 Linear Algebra and Optimization (meets with 18.C06)

    Level: Undergraduate

    Units: 5-0-7

    Prereqs: 18.02

    Instructors: Professors Ankur Moitra (moitra@mit.edu) and Pablo Parrilo (parrilo@mit.edu)

    Schedule: Lecture MWF1, room 4-370; Recitation TR10 (2-135) or TR12 (4-149) or TR1 (4-149) or TR3 (2-139)

    Satisfies: may be substituted for the 18.06 requirement

    Description

    Introductory course in linear algebra and optimization, assuming no prior exposure to linear algebra and starting from the basics, including vectors, matrices, eigenvalues, singular values, and least squares. Covers the basics in optimization including convex optimization, linear/quadratic programming, gradient descent, and regularization, building on insights from linear algebra. Explores a variety of applications in science and engineering, where the tools developed give powerful ways to understand complex systems and also extract structure from data.
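
    As a small worked example in the spirit of this syllabus (our own sketch, not course material), least squares can be solved either in closed form via the normal equations or iteratively by gradient descent:

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(50, 3))                   # data matrix
        x_true = np.array([1.0, -2.0, 0.5])
        b = A @ x_true + 0.01 * rng.normal(size=50)    # noisy observations

        # Closed form: solve the normal equations A^T A x = A^T b.
        x_ls = np.linalg.solve(A.T @ A, A.T @ b)

        # Gradient descent on f(x) = 0.5*||Ax - b||^2; the gradient is A^T (Ax - b).
        x = np.zeros(3)
        step = 1.0 / np.linalg.norm(A.T @ A, 2)        # 1/L, a safe step size
        for _ in range(5000):
            x -= step * (A.T @ (A @ x - b))

        print(x_ls, x)                                 # both approach x_true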


    6.S897 Advanced Sensorimotor Learning

    Level: Graduate

    Units: 2-0-10

    Prereqs: 6.8200[6.484]

    Instructor: Prof. Pulkit Agrawal, pulkitag@mit.edu

    Schedule: Tuesdays 11-1, room 4-257

    Satisfies: AAGS, Concentration subject in AI

    Description

    Surveys advanced concepts in implementing machine learning algorithms for control through a course project and reading research papers. Topics include reinforcement learning, self-supervised learning, learning from demonstrations, model-based learning, sim-to-real transfer, and machine learning challenges unique to building sensorimotor systems. Discusses when it is appropriate to use learning-based methods for decision-making problems, which algorithms to use, and best practices for adapting state-of-the-art methods to problems of student interest. Instruction involves one lecture a week, in which students present research papers or their projects, and a semester-long course project.


    6.S898 Deep Learning

    Level: Graduate

    Units:  3-0-9

    Prereqs:  (6.3900[6.036] or 6.C01 or 6.3720[6.401]) and (6.3700[6.041] or 6.3800[6.008] or 18.05) and (18.C06 or 18.06)

    Instructors:  Professors Phillip Isola (phillipi@mit.edu) and Stefanie Jegelka (stefje@csail.mit.edu)

    Schedule: TR1-2:30, room 4-231

    Satisfies:  AUS2, II; AAGS, grad_AUS2; Concentration Subject in AI

    Enrollment limited

    Description

    Fundamentals of deep learning, including both theory and applications. Topics include neural net architectures (MLPs, CNNs, RNNs, transformers), backpropagation and automatic differentiation, learning theory and generalization in high dimensions, and applications to computer vision, natural language processing, and robotics.
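
    To make the backpropagation topic concrete (a minimal sketch of the standard algorithm, not the course's code), here is a one-hidden-layer network trained by applying the chain rule by hand:

        import numpy as np

        rng = np.random.default_rng(0)
        W1, b1 = 0.5 * rng.normal(size=(8, 2)), np.zeros(8)
        W2, b2 = 0.5 * rng.normal(size=(1, 8)), np.zeros(1)

        def step(x, y, lr=0.1):
            # Forward pass.
            h = np.tanh(W1 @ x + b1)
            y_hat = (W2 @ h + b2)[0]
            # Backward pass: chain rule, output layer first.
            d_yhat = y_hat - y                           # dL/dy_hat for L = 0.5*(y_hat - y)**2
            dW2, db2 = d_yhat * h[None, :], np.array([d_yhat])
            dh_pre = (d_yhat * W2[0]) * (1 - h ** 2)     # tanh'(z) = 1 - tanh(z)**2
            dW1, db1 = np.outer(dh_pre, x), dh_pre
            # Gradient step (in place).
            for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
                p -= lr * g
            return 0.5 * (y_hat - y) ** 2

        data = [((0., 0.), 0.), ((0., 1.), 1.), ((1., 0.), 1.), ((1., 1.), 0.)]
        for _ in range(2000):                            # fit XOR; the loss should shrink
            loss = sum(step(np.array(x), y) for x, y in data)
        print(loss)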


    6.S899 Learning of Time Series with Interventions (meets with IDS.S24)

    Level: Graduate

    Units: 3-0-9

    Prerequisites: Besides general mathematical maturity, the minimal suggested requirement for the course is linear algebra (e.g., 18.06 / 18.700)

    Instructors: Professors Munther Dahleh (dahleh@mit.edu) and Devavrat Shah (devavrat@mit.edu)

    Schedule: TR1-2:30, room 35-225

    Satisfies: AAGS, grad_AUS2, 6-4 AUS, Concentration subject in AI

    Description

    A time series is a time-stamped set of noisy observations from an underlying process that evolves over time. These observations depend on each other in a particular but unknown fashion. Examples of such series include stock values, the value of a currency with respect to the dollar, mean housing prices, the number of Covid-19 infections, and the pitch angle of an airplane during flight. Modeling such processes for the purpose of prediction or intervention is a fundamental problem in statistical learning. This is a research-oriented graduate-level course that will address three lines of development:

    Learning Structured Models: In this part of the course, we focus on learning the underlying stochastic dynamic model that generates the data. We discuss how algorithms depend on the underlying class of models adopted for this learning. We address the accuracy and reliability of our learned models, particularly if the model class does not contain a ‘true’ model. We extend these ideas to dynamic systems with flexible input design. (A small worked example of this kind of model fitting appears after the three parts below.)

    Prediction: In this part of the course, we make no assumptions about how the data is generated and focus on predicting the next outcome of the process from past observations. In this context, we analyze matrix and tensor completion methods for providing such predictions, and we analyze the accuracy of these predictions in the presence of noise, missing data, and, finally, interventions.

    Optimal Intervention and RL: A key ingredient of RL is a simulator that can estimate the value of a reward for a given intervention. In this part of the course, we build on techniques from reinforcement learning, as well as on the first two parts, to show how new interventions/controls with better outcomes can be derived. The course will consist of three guided projects, corresponding to each of the units above, as well as a final project selected by the students.
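
    As the small worked example promised in the first part (our own sketch; the course's methods are far more general), fitting an autoregressive AR(p) model to a series reduces to least squares on lagged copies of the data:

        import numpy as np

        def fit_ar(series, p):
            """Least-squares fit of x_t = a_1 x_{t-1} + ... + a_p x_{t-p}."""
            n = len(series)
            X = np.column_stack([series[p - k - 1 : n - k - 1] for k in range(p)])
            coeffs, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
            return coeffs

        rng = np.random.default_rng(0)
        x = np.zeros(500)
        for t in range(2, 500):                 # simulate a stable AR(2) with noise
            x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + 0.1 * rng.normal()
        print(fit_ar(x, 2))                     # recovers roughly [0.6, -0.2]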


    6.S974/6.1420[6.054] Fixed Parameter and Fine-grained Complexity

    Level: Graduate/Undergraduate

    Units: 3-0-9

    Prereqs:  6.1200[6.042] and 6.1210[6.006] and (6.1220[6.046] or 6.1400[6.045] or 18.404)

    Instructors: Professors Ryan Williams (rrw@mit.edu) and Virginia Williams (virgi@mit.edu)

    Schedule: TR2:30-4, room 5-134

    Satisfies: AUS2, II; AAGS, grad_AUS2; Concentration subject in Theoretical Computer Science

    Description

    An overview of the theory of parameterized algorithms and the “problem-centric” theory of fine-grained complexity, both of which reconsider how to measure the difficulty and feasibility of solving computational problems. Topics include: fixed-parameter tractability (FPT) and its characterizations, the W-hierarchy (W[1], W[2], W[P], etc.), 3-SUM hardness, all-pairs shortest paths (APSP) equivalences, strong exponential time hypothesis (SETH) hardness of problems, and connections to circuit complexity and other aspects of computing.
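
    For orientation (an illustrative sketch, not course material): 3-SUM, one of the anchor problems of fine-grained complexity, asks whether some three array elements sum to zero. The classic sort-plus-two-pointers algorithm below runs in O(n^2) time, and 3-SUM hardness is the conjecture that no algorithm does substantially better:

        def three_sum(arr):
            """Return True iff a + b + c == 0 for entries at three distinct indices."""
            a = sorted(arr)
            n = len(a)
            for i in range(n - 2):
                lo, hi = i + 1, n - 1
                while lo < hi:                  # two-pointer scan of the sorted suffix
                    s = a[i] + a[lo] + a[hi]
                    if s == 0:
                        return True
                    if s < 0:
                        lo += 1
                    else:
                        hi -= 1
            return False

        print(three_sum([3, -1, 7, -2, 12]))    # True: 3 + (-1) + (-2) == 0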



    6.S977 The Sum of Squares Method

    Level: Graduate

    Units: 3-0-9

    Prereqs:

    Schedule:  Friday 9:30-12:30, room 66-144

    Instructor: Prof. Sam Hopkins (samhop@mit.edu)

    Satisfies: AAGS, grad_AUS2; Theoretical Computer Science concentration subject

    Description

    Study of algorithms and computational complexity through the lens of the Sum of Squares method (SoS), a powerful approach to algorithm design generalizing linear programming and spectral methods. Specific sub-topics vary and are chosen with student input, potentially including algorithms for combinatorial and continuous optimization (graphs, constraint satisfaction problems, unique games conjecture), applications to high-dimensional algorithmic statistics (robustness, privacy, method of moments), applications to quantum information, and an SoS perspective on computational complexity (of NP-hard problems and/or of statistical inference).
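
    For a one-equation illustration of the method (standard background, not the course syllabus): to certify that \gamma is a lower bound on a polynomial p, it suffices to exhibit a decomposition

        p(x) - \gamma = \sum_i q_i(x)^2,

    and searching for such a certificate with \deg q_i \le d is a semidefinite program in the coefficients of the q_i; linear programming and spectral relaxations can be recovered as special cases at low degree.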


    6.S978 Tissue vs. Silicon in Machine Learning CANCELLED (moved to spring term 2023)

    Level: Graduate

    Units: 3-0-9

    Prereqs: none

    Instructor:  Prof. Nir Shavit (shanir@csail.mit.edu)

    Schedule: Tuesdays 1-4, room 37-212

    Satisfies:  Concentration subject in AI; AAGS

    Description

    A seminar style class that investigates learnings from neurobiology in the design of ML hardware and software. Students should have an understanding of ML techniques.


    6.S980 Machine Learning for Inverse Graphics

    Level: Graduate

    Units: 3-0-9

    Prereqs: 6.3900[6.036] or 6.3700[6.041] or 6.1200[6.042] or 18.06 (i.e., an introductory machine learning or related math subject)

    Instructor: Professor Vincent Sitzmann (sitzmann@mit.edu)

    Schedule: TR2:30-4, room 32-124

    Satisfies:  AAGS, grad_AUS2; AI or Graphics & HCI concentration subjects

    Description

    From a single picture, humans reconstruct a mental representation of the underlying 3D scene that is incredibly rich in information, such as shape, appearance, physical properties, purpose, and what things would feel, smell, or sound like. These mental representations allow us to understand, navigate, and interact with our environment in our everyday lives. We learn to do this with little supervision, mainly by interacting with and observing the world around us.

    Emerging neural scene representations aim to build models that replicate this behavior: Trained in a self-supervised manner, the goal is to reconstruct rich representations of 3D scenes that can then be used in downstream tasks such as computer vision, robotics, and graphics. 

    This course covers fundamental and advanced techniques in this field at the intersection of computer vision, computer graphics, and deep learning. It will lay the foundations of how cameras see the world, how we can represent 3D scenes for artificial intelligence, how we can learn to reconstruct these representations from only a single image, how we can guarantee certain kinds of generalization, and how we can train these models in a self-supervised way.
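
    One concrete piece of the “how cameras see the world” foundation is the standard pinhole model (given here as background; the course may present it differently): a 3D point (X, Y, Z) in camera coordinates projects to pixel coordinates (u, v) via the intrinsics matrix K,

        (u, v, 1)^T \sim K \, (X/Z, Y/Z, 1)^T,
        \qquad
        K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix},

    where f_x, f_y are the focal lengths in pixels and (c_x, c_y) is the principal point. Recovering the scene on the other side of this projection is part of what makes the problem “inverse.”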


    6.S981 Introduction to Program Synthesis

    Level: Graduate

    Units: 3-0-9

    Prereqs: 6.1010[6.009] and 6.1200[6.042] or equivalent

    Instructor:  Professor Armando Solar-Lezama (asolar@csail.mit.edu)

    Schedule:  TR1-2:30, room 26-328

    Satisfies: AAGS, grad_AUS2, Concentration Subject in Systems

    Description

    The goal of this course is to provide a comprehensive introduction to the field of software synthesis, an emerging field that sits at the intersection of programming systems, formal methods, and artificial intelligence. The course is divided into three major sections: the first focuses on program induction from examples and covers a variety of techniques to search large program spaces; the second focuses on synthesis from expressive specifications and the interaction between synthesis and verification; and the third focuses on synthesis with quantitative specifications and the intersection of program synthesis and machine learning. The course will be graded on the basis of three problem sets and an open-ended final project.
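
    To give a feel for the first section, searching large program spaces (a deliberately tiny sketch; real synthesizers prune and deduplicate aggressively), here is enumerative synthesis from input-output examples over a toy grammar:

        import itertools

        def synthesize(examples, max_rounds=3):
            """Enumerate expressions over the grammar e ::= x | 1 | e+e | e*e
            until one matches every (input, output) example."""
            exprs = [("x", lambda x: x), ("1", lambda x: 1)]
            for _ in range(max_rounds):
                for s, f in exprs:
                    if all(f(i) == o for i, o in examples):
                        return s
                new = []
                for (s1, f1), (s2, f2) in itertools.product(exprs, repeat=2):
                    new.append((f"({s1}+{s2})", lambda x, f1=f1, f2=f2: f1(x) + f2(x)))
                    new.append((f"({s1}*{s2})", lambda x, f1=f1, f2=f2: f1(x) * f2(x)))
                exprs += new
            return None

        # Find a program consistent with f(1) = 3 and f(2) = 5 (e.g., 2*x + 1).
        print(synthesize([(1, 3), (2, 5)]))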


    6.S982 Clinical Data Learning, Visualization, and Deployments (meets with HST.953)

    Level: Graduate

    Units:  3-0-9

    Prereqs: 6.7900[6.867] and HST.956

    Instructor:  Professor Marzyeh Ghassemi (mghassemi@gmail.com)

    Schedule: Lectures: F9:30-12:30, room E25-117

    Satisfies: AAGS, grad_AUS2, BIOEECS_AAGS; AI or BioEECS Concentration Subject

    Description

    Examines the practical considerations for operationalizing machine learning in healthcare settings, with a focus on robust, private, and fair modeling using real retrospective healthcare data. Explores everything from the pre-modeling creation of dataset pipelines to the post-modeling “implementation science” that addresses how models are incorporated at the point of care. Students will complete three homeworks (one each on machine learning, visualization, and implementation), followed by a course project proposal and presentation. Students gain experience in dataset creation and curation, machine learning training, and the visualization and deployment considerations that target utility and clinical value. Most importantly, students will appreciate the multidisciplinary nature of data science by partnering with computer scientists, engineers, social scientists, and clinicians.


    6.S965 TinyML and Efficient Deep Learning Computing

    Level: Graduate

    Units: 3-0-9

    Prereqs: 6.1910 [6.004] and 6.3900 [6.036], or equivalents

    Instructor:  Professor Song Han (songhan@mit.edu)

    Schedule: TR3:30-5, room 36-156

    Satisfies: AAGS, grad_AUS2; Concentration subject in Computer Systems or AI

    Description

    This course introduces tiny machine learning techniques that enable powerful deep learning applications on resource-constrained devices. Topics include model compression, pruning, quantization, neural architecture search, distributed training, gradient compression, on-device transfer learning, federated learning, efficient kernel design, auto-tuning, benchmarking and profiling, and quantum machine learning. It also introduces application-specific tinyML techniques for video recognition, GANs, point clouds, and natural language understanding. Students will get hands-on experience implementing deep learning applications on microcontrollers and mobile phones, and quantum ML on real quantum machines, with an open-ended design project.
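
    As a minimal illustration of one listed topic, quantization (a generic post-training sketch, not the course's labs), weights can be mapped from float32 to int8, a 4x memory reduction:

        import numpy as np

        def quantize_int8(w):
            """Symmetric linear quantization of a float tensor to int8."""
            scale = np.abs(w).max() / 127.0          # largest magnitude maps to 127
            q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
            return q, scale

        def dequantize(q, scale):
            return q.astype(np.float32) * scale

        w = np.random.default_rng(0).normal(size=(4, 4)).astype(np.float32)
        q, s = quantize_int8(w)
        print(np.max(np.abs(w - dequantize(q, s))))  # error is at most scale / 2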


    6.7950[6.246] Reinforcement Learning: Foundations and Methods

    Level: Graduate

    Units: 3-0-9

    Prereqs: see below

    Instructor: Prof. Cathy Wu (cathywu@mit.edu)

    Schedule: TR2:30-4, room 4-237

    Satisfies: Concentration subject in Control

    More information can be found at https://web.mit.edu/6.7950/www/

    Enrollment limited

    Description

    Reinforcement learning (RL) as a methodology for approximately solving sequential decision-making under uncertainty, with foundations in optimal control and machine learning. Finite-horizon and infinite-horizon dynamic programming, focusing on discounted Markov decision processes. Value and policy iteration. Monte Carlo, temporal differences, Q-learning, and stochastic approximation. Approximate dynamic programming, including value-based methods and policy-space methods. Special topics at the boundary of theory and practice in RL. Applications and examples drawn from diverse domains. While an analysis prerequisite is not required, mathematical maturity is necessary.
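
    To make the value iteration topic concrete (a textbook sketch, not course-provided code), here is value iteration on a small discounted MDP:

        import numpy as np

        def value_iteration(P, R, gamma=0.9, tol=1e-8):
            """P[a] is the |S| x |S| transition matrix for action a; R[s, a] is the reward.
            Returns the optimal value function of the discounted MDP."""
            n_states, n_actions = R.shape
            V = np.zeros(n_states)
            while True:
                # Bellman backup: Q[s, a] = R[s, a] + gamma * sum_s' P[a][s, s'] V[s']
                Q = R + gamma * np.stack([P[a] @ V for a in range(n_actions)], axis=1)
                V_new = Q.max(axis=1)
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new
                V = V_new

        # Two-state, two-action toy MDP.
        P = np.array([[[0.9, 0.1], [0.1, 0.9]],     # transitions under action 0
                      [[0.5, 0.5], [0.5, 0.5]]])    # transitions under action 1
        R = np.array([[1.0, 0.0],                   # rewards R[s, a]
                      [0.0, 2.0]])
        print(value_iteration(P, R))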

    Expectations and prerequisites: There is a large class participation component. In terms of prerequisites, students should be comfortable at the level of receiving an A grade in probability (6.3700[6.041] or equivalent), machine learning (6.7900[6.867] or equivalent), convex optimization (from 6.7200[6.255] / 6.3900[6.036] / 6.7900[6.867] or equivalent), linear algebra (18.06 or equivalent), and programming (Python). Mathematical maturity is required. This is not a Deep RL course. This class is most suitable for PhD students who have already been exposed to the basics of reinforcement learning and deep learning (as in 6.3900[6.036] / 6.7900[6.867] / 1.041 / 1.200), and are conducting or have conducted research in these topics.