More than 300 students signed up for the 2018 deep-learning class. Photo: Lillie Paquette, School of Engineering
Catherine O'Neill Grace | EECS Contributor
Can a computer learn to diagnose a collapsed lung from real X-rays? Write new music and poetry in a particular style? Or even teach itself, from scratch, how to steer an autonomous vehicle? Introduction to Deep Learning (6.S191), a course designed and led by students, teaches how deep learning enables these activities and many others, including machine translation, image recognition, and game-playing. (Learn more about the class in this brief video.)
Offered for the third year, and for the second year with EECS student coordinators Alexander Amini and Ava Soleimany, the course will be held during MIT’s winter Independent Activities Period (IAP) in early 2019. The for-credit class features lectures, peer brainstorming sessions, and labs in the software library TensorFlow. In a final competition, students propose new deep-learning algorithms or applications, ranging from those involving fundamental theory to those in the health care or fashion industries, and pitch their ideas live to a panel of industry sponsors. (And as if that isn’t enough, there’s also a pretty cool “Think Deeper” class T-shirt.)
Amini is a PhD candidate in computer science, advised by Daniela Rus, Andrew (1956) and Erna Viterbi Professor of EECS and director of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). A National Science Foundation (NSF) Fellow, Amini received bachelor’s and master’s degrees in EECS, with a minor in mathematics, from MIT. Soleimany, a PhD candidate in biophysics at Harvard and also an NSF Fellow, received a bachelor’s degree in computer science and molecular biology from MIT. She is a member of the Laboratory for Multiscale Regenerative Technologies at MIT's Koch Institute, advised by Sangeeta Bhatia, the John J. and Dorothy Wilson Professor at MIT’s Institute for Medical Engineering and Science (IMES) and in EECS.
“We are passionate about expanding beyond the current state-of-the-art deep-learning methods, and we look forward to how we can build general-purpose, intelligent machines capable of solving complex tasks,” Amini says. “That starts with providing MIT students of all backgrounds with a solid foundation of modern deep learning algorithms.”
A subset of artificial intelligence (AI), deep learning focuses on building predictive models from data. “Neural networks are built of layers of processing units called neurons. Layers of these neurons capture a hierarchy of concepts, and ‘deep’ refers to a network with many layers,” Amini says. Deep learning’s foundations have existed for decades, Soleimany notes, adding: “But the field has exploded in the last five years because of the availability of larger datasets and the advancements in hardware that really enable deep learning.”
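The idea Amini describes — layers of neurons stacked into a “deep” network — can be sketched in a few lines of code. This is a minimal NumPy illustration (the course itself uses TensorFlow); the layer sizes, random weights, and ReLU activation here are illustrative assumptions, not details from the class.

```python
import numpy as np

def relu(x):
    # A common per-neuron activation: pass positives through, zero out negatives.
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer is a (weights, biases) pair; "deep" means many layers stacked.
    for W, b in layers:
        x = relu(x @ W + b)
    return x

rng = np.random.default_rng(0)
sizes = [4, 16, 16, 3]  # input -> two hidden layers of neurons -> output
layers = [(0.1 * rng.standard_normal((m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

out = forward(rng.standard_normal((2, 4)), layers)
print(out.shape)  # two inputs, each mapped through the stack to 3 outputs
```

Each layer transforms its input and hands the result to the next, which is how the network builds up the hierarchy of concepts Amini mentions.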
Amini and Soleimany observed that in artificial intelligence courses at the undergraduate and graduate levels, deep learning is typically highlighted only briefly.
Slideshow: Scenes from Introduction to Deep Learning, IAP 2018
“We saw that there was a need for a class where practical deep learning skills were the main focus. We’re looking at the foundational principles of what’s out there and what’s up and coming,” Amini says. “While many of these advancements of deep learning are incredible, we also spend time focusing on the limitations and drawbacks of these algorithms, as well as the aspects which they currently can’t handle well.”
In the 2018 course, students learned how to build a deep learning-based computer vision system to identify and diagnose a pneumothorax (collapsed lung), using a large chest X-ray dataset. Students also developed software that learned how to generate music by learning from snippets of modern pop songs. “You train the algorithms to do it,” Amini explains.
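“You train the algorithms to do it” — the model starts with no diagnostic knowledge and improves by repeatedly adjusting its parameters to reduce its error on labeled examples. Here is a hedged, self-contained sketch of that training loop on synthetic data; the real course project used a large chest X-ray dataset and a deep network, whereas this stand-in uses random feature vectors and simple logistic regression.

```python
import numpy as np

# Synthetic stand-in for a diagnostic dataset: 200 examples with 10 features
# each, labeled by a hidden rule the model must learn (purely illustrative).
rng = np.random.default_rng(42)
n, d = 200, 10
X = rng.standard_normal((n, d))
true_w = rng.standard_normal(d)
y = (X @ true_w > 0).astype(float)  # synthetic "diagnosis" labels

w = np.zeros(d)   # the model knows nothing at the start
lr = 0.5
losses = []
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probability of positive label
    losses.append(-np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)))
    w -= lr * (X.T @ (p - y)) / n        # gradient step toward fewer mistakes

print(losses[0] > losses[-1])  # True: the loss falls as the model trains
```

The same train-by-example loop, scaled up to convolutional networks and real X-rays, is what let the 2018 students build their pneumothorax classifier.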
More than 300 students took the introductory course during IAP 2018. “Not only did we have undergrads and graduate students, we had postdocs and professors take our class, too,” says Soleimany. Among their most engaged students: George Zweig, a particle physicist who introduced the quark model during his graduate years. A Nobel Prize contender, Zweig had a longstanding career in physics and later transitioned into neuroscience. “He’s 81 years old, and there he was in the front row,” Soleimany says. “He was doing the homework every day and engaging with the other students, asking questions.”
Amini attributes the class’s widespread appeal to the course creators’ intentional approach to curriculum design. “Our vision is for students to be able to appreciate deep learning regardless of their background,” he says. “We teach the practical things, and they then get to implement them — and they pitch their work to billion-dollar companies.”
The 2019 class runs from Jan. 28–Feb. 1, 2019. Visit http://introtodeeplearning.com/ for information and links to registration.