Recent chair announcements within EECS

A collage of professional headshots includes Phillip Isola, Will Oliver, Costis Daskalakis, Manish Raghavan, Stefanie Mueller, Martin Wainwright, Muriel Médard, Martha Gray, Polina Golland, and David Perreault.

The Department of Electrical Engineering and Computer Science (EECS) recently announced the following chair appointments, all effective July 1, 2022.

Karl Berggren has been named the Joseph F. and Nancy P. Keithley Professor. Berggren heads the Quantum Nanostructures and Nanofabrication Group. He is also Director of the Nanostructures Laboratory in the Research Laboratory of Electronics and is a core faculty member in the Microsystems Technology Laboratory (MTL). From December 1996 to September 2003, Berggren served as a staff member at MIT Lincoln Laboratory in Lexington, Massachusetts, and from 2010 to 2011 he was on sabbatical at the Delft University of Technology.

His current research focuses on methods of nanofabrication, especially applied to superconductive quantum circuits, photodetectors, high-speed superconductive electronics, and energy systems. His thesis work focused on nanolithographic methods using neutral atoms.

Costis Daskalakis has been named the Inaugural Armen Avanessians (1982) Professor. Daskalakis is a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and an affiliate of the Laboratory for Information and Decision Systems (LIDS) and the Operations Research Center (ORC). He is also an investigator in the MIT Institute for Foundations of Data Science (MIFODS). He primarily works on computation theory and its interface with game theory, economics, probability theory, statistics and machine learning.

Daskalakis completed his undergraduate studies in Greece, at the National Technical University of Athens, and obtained a PhD in Computer Science at UC Berkeley. He was a postdoctoral researcher at Microsoft Research New England from 2008 to 2009, and has been on the MIT faculty since 2009.

Polina Golland has been named the Inaugural Sunlin (1996) and Priscilla Chou Professor. A principal investigator in CSAIL at MIT, Golland develops novel techniques for biomedical image analysis and understanding. She is interested in shape modeling and representation, predictive modeling, and visualization of statistical models. Her current research focuses on developing statistical analysis methods for characterizing biological processes based on image information. In this domain, she models biological shape and function, how they relate to each other, and how they vary across individuals.

After studying computer science at the Technion – Israel Institute of Technology and earning bachelor’s and master’s degrees there, Golland earned her Ph.D. at MIT in 2001. She joined the MIT faculty in 2003.

Martha Gray has been named the Whitaker Professor in Biomedical Engineering. Gray is also a core faculty member at the Institute for Medical Engineering and Science (IMES), and is a member of the faculty of the Harvard-MIT Program in Health Sciences and Technology (HST).

Trained in computer science and electrical and biomedical engineering (BS in computer science from Michigan State University; SM ’81; PhD ’86), and having served on the MIT faculty for three decades, Gray was the first woman to lead a science or engineering department at MIT. For more than 13 years she directed HST (where she received her PhD in medical engineering), and she currently directs MIT LinQ, which operates several multi-institutional ventures focused on accelerating and deepening early-career researchers’ potential for impact.

Muriel Médard has been named the NEC Professor of Software Science and Engineering. Médard leads the Network Coding and Reliable Communications Group within RLE. Her research interests are in the areas of network coding and reliable communications, particularly for optical and wireless networks. Her work on network coding and its hardware implementation, together with her original algorithms, has received widespread recognition and awards.

Médard obtained three bachelor’s degrees (EECS 1989, Mathematics 1989, and Humanities 1991), as well as her M.S. (1991) and Sc.D. (1995), all from MIT. Additionally, she has co-founded three companies to commercialize network coding: CodeOn, Steinwurf, and Chocolate Cloud.

Will Oliver has been named the Henry Ellis Warren (1894) Professor in Electrical Engineering and Computer Science. Holding a joint appointment, Oliver is also a professor of physics. A Lincoln Laboratory Fellow, the Associate Director of RLE, and the Director of the Center for Quantum Engineering, Oliver works with the Quantum Information and Integrated Nanosystems Group at Lincoln Laboratory and the Engineering Quantum Systems Group at MIT, where he provides programmatic and technical leadership for programs related to the development of quantum and classical high-performance computing technologies for quantum information science applications. His interests include the materials growth, fabrication, design, and control of superconducting quantum processors, as well as the development of cryogenic packaging and control electronics involving cryogenic CMOS and single-flux-quantum digital logic.

Oliver received his B.S. in Electrical Engineering and B.A. in Japanese from the University of Rochester (NY), his M.S. in Electrical Engineering and Computer Science from MIT, and his Ph.D. in Electrical Engineering from Stanford University.

David Perreault has been named the Ford Foundation Professor of Engineering. Perreault’s research interests include design, manufacturing, and control techniques for power electronic systems and components, and their use in a wide range of applications. He also consults in industry, and co-founded Eta Devices, Inc. (acquired by Nokia in 2016) and Eta Wireless, Inc. (acquired by Murata in 2021), both startup companies focused on power management for high-efficiency RF power amplifiers. Over the years, Perreault has held multiple roles within the department, including a stint as Associate Department Head from November 2013 to December 2016.

Perreault received his B.S. from Boston University in 1989, and the S.M. and Ph.D. degrees from MIT in 1991 and 1997, respectively.  In 1997, he joined the MIT Laboratory for Electromagnetic and Electronic Systems as a Postdoctoral Associate, and became a Research Scientist in the laboratory in 1999.

Martin Wainwright has been named the Cecil H. Green Professor. Wainwright’s research interests include high-dimensional statistics, statistical machine learning, information theory, and optimization theory. Prior to joining MIT, Wainwright was the Howard Friesen Chair at the University of California at Berkeley, with a joint appointment between the Department of Statistics and the Department of EECS.

Wainwright received his bachelor’s degree in mathematics from the University of Waterloo, Canada, and his Ph.D. in EECS from MIT.

Career Development Chairs

Phillip Isola has been named the Class of 1948 Career Development Professor. Isola’s research explores learning representations that capture the commonalities between disparate domains, and thereby achieve generality; directly linking experiences via visual translation; and designing representations that can adapt quickly. A leader in the use of machine learning to analyze and create images, Isola introduced, in a series of 2017 papers, a solution to the problem of image-to-image translation. His most recent work addresses another fundamental computer vision problem: the requirement for large amounts of labeled, or supervised, training data, which limits most learning-based approaches to computer vision.

Isola joined EECS as an Assistant Professor in July 2018. He received his Ph.D. in 2015 from the Department of Brain and Cognitive Sciences (BCS) at MIT before taking on a postdoctoral position at Berkeley, followed by a visiting research scientist position at OpenAI.

Stefanie Mueller has been named the TIBCO Career Development Professor. Mueller leads the Human Computer Interaction (HCI) Engineering group at MIT CSAIL; in her research, she develops novel hardware and software systems that advance personal fabrication technologies. Of her group, she says, “Our long-term vision is to give physical objects digital capabilities, such as allowing physical objects to change their appearance as easily as we can change the color of digital models today.”

Mueller earned her Bachelor’s degree from the University of Applied Science Harz in 2010, and her MSc and her PhD from the Hasso Plattner Institute, in 2013 and 2016, respectively.

Manish Raghavan joined the Sloan School of Management and the Department of EECS as an assistant professor in September 2022, and has been named the Drew Houston (2005) Career Development Professor. His research interests lie in the application of computational techniques to domains of social concern, including algorithmic fairness and behavioral economics, with a particular focus on the use of algorithmic tools in the hiring pipeline.

Raghavan received his bachelor’s degree in electrical engineering and computer science from the University of California, Berkeley, and PhD from the Computer Science department at Cornell University. Prior to joining MIT, he was a postdoctoral fellow at the Harvard Center for Research on Computation and Society.

6.9930 New Women in EECS Seminar: another fun semester of conversation and networking underway!

For the fall term of 2022, 6.9930 New Women in EECS Seminar is off to a terrific start! The seminar, originally offered in 2005, is now in its 18th year of supporting women PhD students in the first semester of the EECS doctoral program. The seminar has several goals, including (1) providing an opportunity for networking among the women; (2) offering support in the first semester as the women transition from their undergraduate programs to graduate school; (3) providing information about EECS, MIT, and the surrounding area; and, importantly, (4) hosting conversations with guests from around the Institute, including those who are on the PhD journey themselves and those who have completed a PhD in EECS. The seminar meets Fridays at 9am over breakfast and lively conversation!

On October 7, the seminar attendees welcomed a very special guest: EECS Department Head Professor Asu Ozdaglar, herself an alumna of the EECS program at MIT (2003). Professor Ozdaglar shared her story of completing her undergraduate degree in Turkey, coming to the graduate program in EECS to earn her PhD degree, then taking on a leadership role as Director of LIDS, then Associate Department Head, and on to Department Head of EECS and Deputy Dean of Academics for the Schwarzman College of Computing at MIT. 

Pictured, from left to right: first-year PhD student Ane Zuniga; Department Head Asu Ozdaglar; Professor Leslie Kolodziejski. Photo credit: Janet Fischer.

Professor Ozdaglar openly shared her strategies, advice, and wisdom regarding how she leads and manages the largest department at MIT. Students asked, “Would you do it all again, and what might you do differently?” Professor Ozdaglar reflected that, although she never anticipated or planned such a journey, she would do it again, as she is continually learning and greatly enjoys working with so many talented, dedicated, and amazing people in the EECS department and across the rest of MIT. As for what she might do differently, she replied that she would be more strategic in networking with colleagues while working toward her PhD; Professor Ozdaglar shared that as a graduate student she worked very hard and focused much of her time on her research. Now, she wishes she had taken more time to engage with her lab-mates and other graduate students to create more community and friendships along the PhD journey.

The seminar group meets regularly to provide support and share wisdom. Photo credit: Janet Fischer.

Other topics in the conversation included work-life balance and making time for both family and herself, as well as managing her sizable (10-person) research group. Professor Ozdaglar carves out blocks of time each week to ensure she can engage with and mentor her group members and help advance their research agendas. As the seminar hour drew to a close, attendees were grateful to have met Professor Ozdaglar, to learn about her experiences, and to benefit from her sage wisdom and advice. As one graduate student attendee left the room to begin her Friday, she shared that she was very inspired by the conversation and happy she had woken up so early to attend!

Prepared each morning, 8-minute soft-boiled eggs are a standard item on the 6.9930 breakfast menu, a nutritious way for attendees to begin their Friday and end the week! Photo credit: Janet Fischer.

As Thriving Stars, officially launched on October 12, 2021, completes its first year, the incoming class of EECS PhD students identifying as women is the largest in the history of the department: EECS is welcoming 47 new women, which also represents the largest-ever percentage of women in an incoming graduate class, at 30 percent. The seminar accordingly has its largest-ever enrollment, with 23 first-year PhD students registered. Other topics addressed as part of the seminar curriculum include the importance of networking and mentoring, thinking outside of the lab and striving to be a whole person, and survival tips for navigating MIT. Guests from the department and around the Institute will join the seminar, and alumni will return to share their advice.

MIT system “sees” the inner structure of the body during physical rehab

A woman wearing workout gear studded with electrical nodes practices lifting and flexing her foot as a virtual model of the human body, with muscle groups highlighted, appears next to her.

A growing number of people are living with conditions that could benefit from physical rehabilitation, but there aren’t enough physical therapists (PTs) to go around. Population growth and aging, along with higher rates of severe ailments, are driving the need for PTs faster than the profession can keep up.

An upsurge in sensor-based techniques, such as on-body motion sensors, has provided some autonomy and precision for patients who could benefit from robotic systems to supplement human therapists. Still, the minimalist watches and rings currently available rely largely on motion data alone, lacking the more holistic picture a physical therapist pieces together, which includes muscle engagement and tension in addition to movement.

This muscle-motion language barrier recently prompted the creation of an unsupervised physical rehabilitation system, MuscleRehab, by researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Massachusetts General Hospital. There are three ingredients: motion tracking that captures motion activity, an imaging technique called electrical impedance tomography (EIT) that measures what the muscles are up to, and a virtual reality (VR) headset and tracking suit that lets a patient watch themselves perform alongside a physical therapist. 

Patients put on the sleek, ninja-esque, all-black tracking suit and perform various exercises, such as lunges, knee bends, deadlifts, leg raises, knee extensions, squats, fire hydrants, and bridges, while the system measures the activity of the quadriceps, sartorius, hamstrings, and abductors. The motion-tracking system captures 3D movement data.


In the virtual environment, patients exercise under two conditions. In both, their avatar performs alongside a physical therapist. In the first condition, only the motion-tracking data is overlaid onto the patient’s avatar. In the second, the patient also puts on the EIT sensing straps, so the avatar shows both motion and muscle engagement.

With these two conditions, the team compared the exercise accuracy and handed the results to a professional therapist, who explained which muscle groups were supposed to be engaged during each of the exercises. By visualizing both muscle engagement and motion data during these unsupervised exercises instead of just motion alone, the overall accuracy of exercises improved by 15 percent. 

The team then cross-compared how much of the exercise time the correct muscle group was engaged under the two conditions. In the condition that displayed muscle engagement data in real time, that visualization itself served as the feedback. From the monitored and recorded engagement data, the PTs reported a much better understanding of the quality of a patient’s exercise, and said it helped them evaluate the current regimen and adjust exercises based on those statistics.

“We wanted our sensing scenario to not be limited to a clinical setting, to better enable data-driven unsupervised rehabilitation for athletes in injury recovery, patients currently in physical therapy, or those with physical limiting ailments, to ultimately see if we can assist with not only recovery, but perhaps prevention,” says Junyi Zhu, MIT PhD student in electrical engineering and computer science, CSAIL affiliate, and lead author on a new paper about MuscleRehab. “By actively measuring deep muscle engagement, we can observe if the data is abnormal compared to a patient’s baseline, to provide insight into the potential muscle trajectory.” 

Current sensing technologies focus mostly on tracking behaviors and heart rates, but Zhu was interested in finding a better way than electromyography (EMG) to sense the engagement (blood flow, stretching, contracting) of different layers of the muscles. EMG only captures muscle activity right beneath the skin, unless it’s done invasively. 

Zhu has been digging into the realm of personal health-sensing devices for some time. He was inspired to use EIT, which measures the electrical conductivity of muscles, by a 2021 project in which he used the noninvasive imaging technique to create a toolkit for designing and fabricating health- and motion-sensing devices. To his knowledge, EIT, which is usually used for monitoring lung function, detecting chest tumors, and diagnosing pulmonary embolism, had not previously been applied to sensing muscle engagement.

With MuscleRehab, the EIT sensing board serves as the “brains” behind the system. It’s accompanied by two straps filled with electrodes that slip onto a user’s upper thigh to capture 3D volumetric data. The motion-capture process uses 39 markers and a number of cameras that record at very high frame rates. The EIT sensing data highlights actively triggered muscles on the display, with a given muscle appearing darker the more it is engaged.

Currently, MuscleRehab focuses on the upper thigh and the major muscle groups within it, but down the line the team would like to expand to the glutes. The team is also exploring potential avenues for using EIT in radiotherapy in collaboration with Piotr Zygmanski, a medical physicist at Brigham and Women’s Hospital and Dana-Farber Cancer Institute and associate professor of radiation oncology at Harvard Medical School.

“We are exploring utilization of electrical fields and currents for detection of radiation as well as for imaging of the dielectric properties of patient anatomy during radiotherapy treatment, or as a result of the treatment,” says Zygmanski. “Radiation induces currents inside tissues and cells and other media — for instance, detectors — in addition to making direct damage at the molecular level (DNA damage). We have found the EIT instrumentation developed by the MIT team to be particularly suitable for exploring such novel applications of EIT in radiotherapy. We are hoping that with the customization of the electronic parameters of the EIT system we can achieve these goals.”

“This work advances EIT, a sensing approach conventionally used in clinical settings, with an ingenious and unique combination with virtual reality,” says Yang Zhang, assistant professor in electrical and computer engineering at the UCLA Samueli School of Engineering, who was not involved in the paper. “The enabled application that facilitates rehabilitation potentially has a wide impact across society to help patients conduct physical rehabilitation safely and effectively at home. Such tools to eliminate the need for clinical resources and personnel have long been needed for the lack of workforce in healthcare.”

The paper’s MIT co-authors are graduate students Yuxuan Lei and Gila Schein, MIT undergraduate student Aashini Shah, and MIT Professor Stefanie Mueller, all CSAIL affiliates. Other authors are Hamid Ghaednia, instructor at the Department of Orthopaedic Surgery of Harvard Medical School and co-director of Center for Physical Artificial Intelligence at Mass General Hospital; Joseph Schwab, chief of the Orthopaedic Spine Center, director of spine oncology, co-director of the Stephan L. Harris Chordoma Center, and associate professor of orthopedic surgery at Harvard Medical School; as well as Casper Harteveld, associate dean and professor at Northeastern University. They will present the paper at The ACM Symposium on User Interface Software and Technology later this month. 

L. Rafael Reif receives National Academy of Engineering’s Simon Ramo Founders Award

President L. Rafael Reif stands in front of his desk.

The National Academy of Engineering (NAE) has named MIT President L. Rafael Reif as the winner of its 2022 Simon Ramo Founders Award.

Reif, who has served as MIT’s president for more than 10 years, is being honored “for pioneering leadership to reimagine and advance higher education, university-based entrepreneurship, the future of computing, the future of work, sustainability and semiconductor technology,” according to the academy’s citation.

The Founders Award was established in 1965 by the NAE to honor an outstanding member or international member who has upheld the ideals and principles of the NAE through professional, educational, and personal achievement and accomplishment. Reif accepted the award on Sunday at the NAE annual meeting.

“I am delighted to be recognized by the National Academy of Engineering and its members because choosing to become an engineer was one of the most important decisions I ever made,” Reif said. “As a leader, one is often required to make important decisions under conditions of great uncertainty — which is exactly what engineering trains you to do.”

Reif joined MIT in 1980 as an assistant professor of electrical engineering and later served for seven years as provost, becoming MIT’s 17th president in 2012. He announced in February that he will step down from the role at the end of 2022.

As president, Reif led the Institute through a period of dynamic growth as well as novel challenges such as the Covid-19 pandemic. He oversaw the creation of an innovation ecosystem on campus and in Kendall Square, including the formation of “tough tech” accelerator The Engine; launched the MIT Schwarzman College of Computing to bring the power of computing and artificial intelligence to all fields of study; helped to reimagine the future of higher education through open-source online learning initiatives such as edX; and cleared new pathways for MIT scholars to develop solutions for addressing climate change.

Reif also promoted health and well-being among MIT students, faculty, and staff; championed MIT’s international community; and worked to revitalize the MIT campus. And, he convened MIT experts to deliver nationally significant reports on topics such as the future of work and reasserting U.S. leadership in the semiconductor industry.

After he steps down as president, Reif will take a sabbatical, then return to the faculty of the Department of Electrical Engineering and Computer Science.

“We live in a moment when society is starving for principled, inspired, constructive leadership, in the face of immense global challenges,” Reif said in his acceptance remarks. “In a time of such tremendous need, I believe in the vision, potential and capacity to do good of the members of the National Academy of Engineering, and that includes all of you here tonight. And I trust you will each find many ways to contribute to move the needle in the right direction for the benefit of humankind.

“I am deeply fortunate to have been welcomed into this transformative profession so many years ago, and I am profoundly grateful for the immense honor of this truly wonderful award.”

Four from MIT receive NIH New Innovator Awards for 2022

Clockwise from top left: MIT NIH New Innovator Award winners Lindsay Case, Siniša Hrvatin, Caroline Uhler, and Deblina Sarkar.

The National Institutes of Health (NIH) has awarded grants to four MIT faculty members as part of its High-Risk, High-Reward Research program.

The program supports unconventional approaches to challenges in biomedical, behavioral, and social sciences. Each year, NIH Director’s Awards are granted to program applicants who propose high-risk, high-impact research in areas relevant to the NIH’s mission. In doing so, the NIH encourages innovative proposals that, due to their inherent risk, might struggle in the traditional peer-review process.

This year, Lindsay Case, Siniša Hrvatin, Deblina Sarkar, and Caroline Uhler have been chosen to receive the New Innovator Award, which funds exceptionally creative research from early-career investigators. The award, which was established in 2007, supports researchers who are within 10 years of their final degree or clinical residency and have not yet received a research project grant or equivalent NIH grant.

Lindsay Case, the Irwin and Helen Sizer Department of Biology Career Development Professor and an extramural member of the Koch Institute for Integrative Cancer Research, uses biochemistry and cell biology to study the spatial organization of signal transduction. Her work focuses on understanding how signaling molecules assemble into compartments with unique biochemical and biophysical properties to enable cells to sense and respond to information in their environment. Earlier this year, Case was one of two MIT assistant professors named as Searle Scholars.

Siniša Hrvatin, who joined the School of Science faculty this past winter, is an assistant professor in the Department of Biology and a core member at the Whitehead Institute for Biomedical Research. He studies how animals and cells enter, regulate, and survive states of dormancy such as torpor and hibernation, aiming to harness the potential of these states therapeutically.

Deblina Sarkar is an assistant professor and AT&T Career Development Chair Professor at the MIT Media Lab​. Her research combines the interdisciplinary fields of nanoelectronics, applied physics, and biology to invent disruptive technologies for energy-efficient nanoelectronics and merge such next-generation technologies with living matter to create a new paradigm for life-machine symbiosis. Her high-risk, high-reward proposal received the rare perfect impact score of 10, which is the highest score awarded by NIH.

Caroline Uhler is a professor in the Department of Electrical Engineering and Computer Science and the Institute for Data, Systems, and Society. In addition, she is a core institute member at the Broad Institute of MIT and Harvard, where she co-directs the Eric and Wendy Schmidt Center. By combining machine learning, statistics, and genomics, she develops representation learning and causal inference methods to elucidate gene regulation in health and disease.

The High-Risk, High-Reward Research program is supported by the NIH Common Fund, which oversees programs that pursue major opportunities and gaps in biomedical research that require collaboration across NIH Institutes and Centers. In addition to the New Innovator Award, the NIH also issues three other awards each year: the Pioneer Award, which supports bold and innovative research projects with unusually broad scientific impact; the Transformative Research Award, which supports risky and untested projects with transformative potential; and the Early Independence Award, which allows especially impressive junior scientists to skip the traditional postdoctoral training program to launch independent research careers.

This year, the High-Risk, High-Reward Research program is awarding 103 awards, including eight Pioneer Awards, 72 New Innovator Awards, nine Transformative Research Awards, and 14 Early Independence Awards. These 103 awards total approximately $285 million in support from the institutes, centers, and offices across NIH over five years. “The science advanced by these researchers is poised to blaze new paths of discovery in human health,” says Lawrence A. Tabak, DDS, PhD, who is performing the duties of the director of NIH. “This unique cohort of scientists will transform what is known in the biological and behavioral world. We are privileged to support this innovative science.”

Learning on the edge

Microcontrollers, miniature computers that can run simple commands, are the basis for billions of connected devices, from internet-of-things (IoT) devices to sensors in automobiles. But cheap, low-power microcontrollers have extremely limited memory and no operating system, making it challenging to train artificial intelligence models on “edge devices” that work independently from central computing resources.

Training a machine-learning model on an intelligent edge device allows it to adapt to new data and make better predictions. For instance, training a model on a smart keyboard could enable the keyboard to continually learn from the user’s writing. However, the training process requires so much memory that it is typically done using powerful computers at a data center, before the model is deployed on a device. This is more costly and raises privacy issues since user data must be sent to a central server.

To address this problem, researchers at MIT and the MIT-IBM Watson AI Lab developed a new technique that enables on-device training using less than a quarter of a megabyte of memory. Other training solutions designed for connected devices can use more than 500 megabytes of memory, greatly exceeding the 256-kilobyte capacity of most microcontrollers (there are 1,024 kilobytes in one megabyte).

The intelligent algorithms and framework the researchers developed reduce the amount of computation required to train a model, which makes the process faster and more memory efficient. Their technique can be used to train a machine-learning model on a microcontroller in a matter of minutes.

This technique also preserves privacy by keeping data on the device, which could be especially beneficial when data are sensitive, such as in medical applications. It also could enable customization of a model based on the needs of users. Moreover, the framework preserves or improves the accuracy of the model when compared to other training approaches.

“Our study enables IoT devices to not only perform inference but also continuously update the AI models to newly collected data, paving the way for lifelong on-device learning. The low resource utilization makes deep learning more accessible and can have a broader reach, especially for low-power edge devices,” says Song Han, an associate professor in the Department of Electrical Engineering and Computer Science (EECS), a member of the MIT-IBM Watson AI Lab, and senior author of the paper describing this innovation.

Joining Han on the paper are co-lead authors and EECS PhD students Ji Lin and Ligeng Zhu, as well as MIT postdocs Wei-Ming Chen and Wei-Chen Wang, and Chuang Gan, a principal research staff member at the MIT-IBM Watson AI Lab. The research will be presented at the Conference on Neural Information Processing Systems.

Han and his team previously addressed the memory and computational bottlenecks that exist when trying to run machine-learning models on tiny edge devices, as part of their TinyML initiative.

Lightweight training

A common type of machine-learning model is known as a neural network. Loosely based on the human brain, these models contain layers of interconnected nodes, or neurons, that process data to complete a task, such as recognizing people in photos. The model must be trained first, which involves showing it millions of examples so it can learn the task. As it learns, the model increases or decreases the strength of the connections between neurons, which are known as weights.

The model may undergo hundreds of updates as it learns, and the intermediate activations must be stored during each round. (Activations are the intermediate results each layer produces as data flows through the network.) Because there may be millions of weights and activations, training a model requires much more memory than running a pre-trained model, Han explains.

Han and his collaborators employed two algorithmic solutions to make the training process more efficient and less memory-intensive. The first, known as sparse update, uses an algorithm that identifies the most important weights to update at each round of training. The algorithm starts freezing the weights one at a time until it sees the accuracy dip to a set threshold, then it stops. The remaining weights are updated, while the activations corresponding to the frozen weights don’t need to be stored in memory.
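The idea behind sparse update can be sketched in a few lines of code. The version below is an illustrative simplification, not the paper's actual algorithm: it ranks weights by gradient magnitude (our assumed importance heuristic) and updates only the top fraction, leaving the rest frozen.

```python
import numpy as np

def sparse_update(weights, grads, lr=0.1, frac=0.25):
    """Update only the top `frac` fraction of weights, ranked by |gradient|.

    Frozen weights are left untouched, so their activations would not
    need to be stored during backpropagation.
    """
    k = max(1, int(frac * weights.size))
    flat = np.abs(grads).ravel()
    # indices of the k largest-magnitude gradients
    top = np.argpartition(flat, -k)[-k:]
    mask = np.zeros(weights.size, dtype=bool)
    mask[top] = True
    mask = mask.reshape(weights.shape)
    # apply the gradient step only where the mask is True
    return np.where(mask, weights - lr * grads, weights)

w = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.1, 0.0, 0.5, 0.0])
w_new = sparse_update(w, g, lr=0.1, frac=0.25)
# only index 2 (the largest gradient) changes: 3.0 -> 2.95
```

In a real system the importance ranking and freezing schedule would be determined during the accuracy-threshold search described above, rather than recomputed per step.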

“Updating the whole model is very expensive because there are a lot of activations, so people tend to update only the last layer, but as you can imagine, this hurts the accuracy. For our method, we selectively update those important weights and make sure the accuracy is fully preserved,” Han says.

Their second solution involves quantized training and simplifying the weights, which are typically 32 bits. An algorithm rounds the weights so they are only eight bits, through a process known as quantization, which cuts the amount of memory for both training and inference. Inference is the process of applying a model to a dataset and generating a prediction. Then the algorithm applies a technique called quantization-aware scaling (QAS), which acts like a multiplier to adjust the ratio between weight and gradient, to avoid any drop in accuracy that may come from quantized training.
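The quantization step can be illustrated with a generic per-tensor scheme. This is a rough sketch of 8-bit rounding only; the scale factor here is our assumption, and it does not implement the paper's actual QAS procedure, which additionally rescales gradients.

```python
import numpy as np

def quantize_int8(w):
    """Round 32-bit weights to 8-bit integers with a per-tensor scale."""
    m = float(np.max(np.abs(w)))
    scale = m / 127.0 if m > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the 8-bit representation."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.02], dtype=np.float32)
q, s = quantize_int8(w)      # 8 bits per weight instead of 32
w_hat = dequantize(q, s)     # close to w, up to rounding error of ~s
```

Storage per weight drops by 4x, at the cost of a small rounding error that quantization-aware scaling is designed to keep from degrading accuracy.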

The researchers developed a system, called a tiny training engine, that can run these algorithmic innovations on a simple microcontroller that lacks an operating system. This system changes the order of steps in the training process so more work is completed in the compilation stage, before the model is deployed on the edge device.

“We push a lot of the computation, such as auto-differentiation and graph optimization, to compile time. We also aggressively prune the redundant operators to support sparse updates. Once at runtime, we have much less workload to do on the device,” Han explains.

A successful speedup

Their optimization only required 157 kilobytes of memory to train a machine-learning model on a microcontroller, whereas other techniques designed for lightweight training would still need between 300 and 600 megabytes.

They tested their framework by training a computer vision model to detect people in images. After only 10 minutes of training, it learned to complete the task successfully. Their method was able to train a model more than 20 times faster than other approaches.

Now that they have demonstrated the success of these techniques for computer vision models, the researchers want to apply them to language models and different types of data, such as time-series data. At the same time, they want to use what they’ve learned to shrink the size of larger models without sacrificing accuracy, which could help reduce the carbon footprint of training large-scale machine-learning models.

“AI model adaptation/training on a device, especially on embedded controllers, is an open challenge. This research from MIT has not only successfully demonstrated the capabilities, but also opened up new possibilities for privacy-preserving device personalization in real-time,” says Nilesh Jain, a principal engineer at Intel who was not involved with this work. “Innovations in the publication have broader applicability and will ignite new systems-algorithm co-design research.”

“On-device learning is the next major advance we are working toward for the connected intelligent edge. Professor Song Han’s group has shown great progress in demonstrating the effectiveness of edge devices for training,” adds Jilei Hou, vice president and head of AI research at Qualcomm. “Qualcomm has awarded his team an Innovation Fellowship for further innovation and advancement in this area.”

This work is funded by the National Science Foundation, the MIT-IBM Watson AI Lab, the MIT AI Hardware Program, Amazon, Intel, Qualcomm, Ford Motor Company, and Google.

MIT engineers build a battery-free, wireless underwater camera

Two rounded glass compartments, one holding a computer chip, frame a view of the Charles River.

Scientists estimate that more than 95 percent of Earth’s oceans have never been observed, which means we have seen less of our planet’s ocean than we have the far side of the moon or the surface of Mars.

The high cost of powering an underwater camera for a long time, by tethering it to a research vessel or sending a ship to recharge its batteries, is a steep challenge preventing widespread undersea exploration.

MIT researchers have taken a major step to overcome this problem by developing a battery-free, wireless underwater camera that is about 100,000 times more energy-efficient than other undersea cameras. The device takes color photos, even in dark underwater environments, and transmits image data wirelessly through the water.

The autonomous camera is powered by sound. It converts mechanical energy from sound waves traveling through water into electrical energy that powers its imaging and communications equipment. After capturing and encoding image data, the camera also uses sound waves to transmit data to a receiver that reconstructs the image. 

Because it doesn’t need a power source, the camera could run for weeks on end before retrieval, enabling scientists to search remote parts of the ocean for new species. It could also be used to capture images of ocean pollution or monitor the health and growth of fish raised in aquaculture farms.

“One of the most exciting applications of this camera for me personally is in the context of climate monitoring. We are building climate models, but we are missing data from over 95 percent of the ocean. This technology could help us build more accurate climate models and better understand how climate change impacts the underwater world,” says Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab, and senior author of a new paper on the system.

Joining Adib on the paper are co-lead authors and Signal Kinetics group research assistants Sayed Saad Afzal, Waleed Akbar, and Osvy Rodriguez, as well as research scientist Unsoo Ha, and former group researchers Mario Doumet and Reza Ghaffarivardavagh. The paper is published today in Nature Communications.

Going battery-free

To build a camera that could operate autonomously for long periods, the researchers needed a device that could harvest energy underwater on its own while consuming very little power.

The camera acquires energy using transducers made from piezoelectric materials that are placed around its exterior. Piezoelectric materials produce an electric signal when a mechanical force is applied to them. When a sound wave traveling through the water hits the transducers, they vibrate and convert that mechanical energy into electrical energy.

Those sound waves could come from any source, like a passing ship or marine life. The camera stores harvested energy until it has built up enough to power the electronics that take photos and communicate data.

To keep power consumption as low as possible, the researchers used off-the-shelf, ultra-low-power imaging sensors. But these sensors only capture grayscale images. And since most underwater environments lack a light source, they needed to develop a low-power flash, too.

“We were trying to minimize the hardware as much as possible, and that creates new constraints on how to build the system, send information, and perform image reconstruction. It took a fair amount of creativity to figure out how to do this,” Adib says.

They solved both problems simultaneously using red, green, and blue LEDs. When the camera captures an image, it shines a red LED and then uses image sensors to take the photo. It repeats the same process with green and blue LEDs.

Even though each raw image looks black and white, the red, green, and blue light reflected from the scene is recorded in the bright regions of each photo, Akbar explains. When the image data are combined in post-processing, the color image can be reconstructed.

“When we were kids in art class, we were taught that we could make all colors using three basic colors. The same rules follow for color images we see on our computers. We just need red, green, and blue — these three channels — to construct color images,” he says.
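The reconstruction Akbar describes amounts to treating the three single-LED captures as the channels of one color image. A minimal sketch of that post-processing step (the function name and toy frames are ours):

```python
import numpy as np

def reconstruct_color(red_frame, green_frame, blue_frame):
    """Stack three grayscale captures (one per LED color) into an RGB image."""
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Toy 2x2 grayscale frames: the scene reflects mostly red light.
h, w = 2, 2
r = np.full((h, w), 200, dtype=np.uint8)  # capture under the red LED
g = np.full((h, w), 50, dtype=np.uint8)   # capture under the green LED
b = np.full((h, w), 10, dtype=np.uint8)   # capture under the blue LED

img = reconstruct_color(r, g, b)  # shape (2, 2, 3): a color image
```

In practice the frames would also need alignment and calibration before being combined, but the channel-stacking principle is the same.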

Sending data with sound

Once image data are captured, they are encoded as bits (1s and 0s) and sent to a receiver one bit at a time using a process called underwater backscatter. The receiver transmits sound waves through the water to the camera, which acts as a mirror to reflect those waves. The camera either reflects a wave back to the receiver or changes its mirror to an absorber so that it does not reflect back.

A hydrophone next to the transmitter senses if a signal is reflected back from the camera. If it receives a signal, that is a bit-1, and if there is no signal, that is a bit-0. The system uses this binary information to reconstruct and post-process the image.
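The receiver-side decoding described above is essentially on-off signaling: one bit per interval, 1 if an echo is detected and 0 if not. A toy decoder might look like the following; the bit ordering and framing here are our assumptions, not details from the paper.

```python
def bits_to_bytes(bits):
    """Pack a list of 0/1 bits (most significant bit first) into bytes."""
    assert len(bits) % 8 == 0, "expect whole bytes"
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b  # shift in one bit at a time
        out.append(byte)
    return bytes(out)

# Two example bytes: 0100_1000 and 0110_1001 decode to ASCII "Hi".
decoded = bits_to_bytes([0,1,0,0,1,0,0,0, 0,1,1,0,1,0,0,1])
print(decoded)  # b'Hi'
```

A real system would add synchronization and error correction on top of this, since underwater acoustic channels are noisy.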

“This whole process, since it just requires a single switch to convert the device from a nonreflective state to a reflective state, consumes five orders of magnitude less power than typical underwater communications systems,” Afzal says.

The researchers tested the camera in several underwater environments. In one, they captured color images of plastic bottles floating in a New Hampshire pond. They were also able to take such high-quality photos of an African starfish that tiny tubercles along its arms were clearly visible. The device was also effective at repeatedly imaging the underwater plant Aponogeton ulvaceus in a dark environment over the course of a week to monitor its growth.

Now that they have demonstrated a working prototype, the researchers plan to enhance the device so it is practical for deployment in real-world settings. They want to increase the camera’s memory so it could capture photos in real-time, stream images, or even shoot underwater video.

They also want to extend the camera’s range. They successfully transmitted data 40 meters from the receiver, but pushing that range wider would enable the camera to be used in more underwater settings.

“This will open up great opportunities for research both in low-power IoT devices as well as underwater monitoring and research,” says Haitham Al-Hassanieh, an assistant professor of electrical and computer engineering at the University of Illinois Urbana-Champaign, who was not involved with this research.

This research is supported, in part, by the Office of Naval Research, the Sloan Research Fellowship, the National Science Foundation, the MIT Media Lab, and the Doherty Chair in Ocean Utilization.

2022-23 EECS Faculty Award Roundup


This ongoing list of awards and recognitions won by our faculty is updated throughout the year, beginning in September.

Pulkit Agrawal, Assistant Professor, was awarded a Multidisciplinary University Research Initiative (MURI) award for 2023, for his project, “Neuro‐Inspired Distributed Deep Learning (NIDDL)”.

Mohammad Alizadeh, Associate Professor, was awarded the 2022 ACM Grace Murray Hopper Award “for pioneering and impactful contributions to data center networks”.

Jacob Andreas, Assistant Professor of EECS, was awarded the School of Engineering’s Junior Bose Award for excellence in teaching.

Regina Barzilay, School of Engineering Distinguished Professor for AI and Health, was elected to the National Academy of Engineering for “machine learning models that understand structures in text, molecules, and medical images”.

Connor Wilson Coley, Assistant Professor, was named in the first cohort of AI2050 Early Career Fellows by Schmidt Futures.

Costis Daskalakis, the Armen Avanessians (1982) Professor, was named to the 2022 cohort of ACM Fellows “for contributions to the foundations of algorithmic game theory, mechanism design, sublinear algorithms, and theoretical machine learning.”

Manya Ghobadi, Assistant Professor, was awarded the ACM-W Rising Star Award.

Dylan Hadfield-Menell, Assistant Professor, was named in the first cohort of AI2050 Early Career Fellows by Schmidt Futures.

Ruonan Han, Associate Professor, was awarded the 2023 IEEE Solid-State Circuits Society New Frontier Award.

Song Han, Associate Professor, was named to the cohort of 2023 Sloan Research Fellows by the Sloan Foundation.

Piotr Indyk, Thomas D. and Virginia W. Cabot Professor, was elected a member of the American Academy of Arts & Sciences.

Yael Tauman Kalai, Adjunct Associate Professor of EECS, was named the recipient of the 2022 ACM Prize in Computing for “fundamental contributions to cryptography”.

Dina Katabi, Thuan (1990) and Nicole Pham Professor, was elected a member of the National Academy of Sciences.

Farnaz Niroui, Assistant Professor, was awarded the DARPA Young Faculty Award.

Jelena Notaros, Assistant Professor of EE, was awarded the 2022 Advanced Photonics Congress Student Paper Prize at Optica’s Advanced Photonics Congress, along with her coauthor Sabrina Corsetti.

Jelena Notaros, Assistant Professor of EE, was awarded the NSF Career Award in 2022.

David Perreault, Ford Foundation Professor of Engineering, was awarded the 2024 IEEE William E. Newell Power Electronics Award.

Jonathan Ragan-Kelley, Esther and Harold E. Edgerton Assistant Professor, was named to the cohort of 2023 Sloan Research Fellows by the Sloan Foundation.

Ronitt Rubinfeld, Edwin Sibley Webster Professor, was named a 2023 Guggenheim Fellow by the John Simon Guggenheim Memorial Foundation.

Devavrat Shah, Andrew (1956) and Erna Viterbi Professor, along with coauthors Mohammad Alizadeh, Abdullah Alomar, Anish Agarwal, and collaborators from MIT CSAIL, received the Best Paper Award at the 20th USENIX Symposium on Networked Systems Design and Implementation (NSDI ‘23) for their paper “CausalSim: A Causal Framework for Unbiased Trace-Driven Simulation”.

Justin Solomon, Associate Professor, has been awarded MIT’s annual Harold E. Edgerton Faculty Achievement Award.

Gerald Sussman, Panasonic Professor, has been awarded the IEEE Educational Activities Board Major Education Innovation Award.

Russell Tedrake, Toyota Professor, was awarded MIT School of Engineering’s 2023 Teaching With Digital Technology Award.

Caroline Uhler, Professor of EECS and in the Institute for Data, Systems and Society (IDSS), has been named a Fellow of the Society for Industrial and Applied Mathematics (SIAM), Class of 2023.  

Caroline Uhler, Professor of EECS and in the Institute for Data, Systems and Society (IDSS), received the NIH New Innovator Award for 2022.

Vinod Vaikuntanathan, Professor, was awarded the 2023 IACR “Test of Time” Award for his Crypto 2008 paper titled “A Framework for Efficient and Composable Oblivious Transfer”.

Joel Voldman, faculty head of EE and Clarence J. LeBel Professor in Electrical Engineering and Computer Science, was awarded MIT School of Engineering’s 2023 Teaching With Digital Technology Award.

Mengjia Yan, Assistant Professor of CS, was awarded the Intel® Rising Star Faculty Award 2022.

Investigating at the interface of data science and computing

Guy Bresler sits on the bottom step of a spiral staircase.

A visual model of Guy Bresler’s research would probably look something like a Venn diagram. He works at the four-way intersection where theoretical computer science, statistics, probability, and information theory collide.

“There are always new things to be done at the interface. There are always opportunities for entirely new questions to ask,” says Bresler, an associate professor who recently earned tenure in MIT’s Department of Electrical Engineering and Computer Science (EECS).

A theoretician, he aims to understand the delicate interplay between structure in data, the complexity of models, and the amount of computation needed to learn those models. Recently, his biggest focus has been trying to unveil fundamental phenomena that are broadly responsible for determining the computational complexity of statistics problems — and finding the “sweet spot” where available data and computation resources enable researchers to effectively solve a problem.

When trying to solve a complex statistics problem, there is often a tug-of-war between data and computation. Without enough data, the computation needed to solve a statistical problem can be intractable, or at least consume a staggering amount of resources. But get just enough data and suddenly the intractable becomes solvable; the amount of computation needed to come up with a solution drops dramatically.

The majority of modern statistical problems exhibit this sort of trade-off between computation and data, with applications ranging from drug development to weather prediction. Another well-studied and practically important example is cryo-electron microscopy, Bresler says. With this technique, researchers use an electron microscope to take images of molecules in different orientations. The central challenge is how to solve the inverse problem — determining the molecule’s structure given the noisy data. Many statistical problems can be formulated as inverse problems of this sort.

One aim of Bresler’s work is to elucidate relationships between the wide variety of different statistics problems currently being studied. The dream is to classify statistical problems into equivalence classes, as has been done for other types of computational problems in the field of computational complexity. Showing these sorts of relationships means that, instead of trying to understand each problem in isolation, researchers can transfer their understanding from a well-studied problem to a poorly understood one, he says.

Adopting a theoretical approach

For Bresler, a desire to theoretically understand various basic phenomena inspired him to follow a path into academia.

Both of his parents worked as professors and showed how fulfilling academia can be, he says. His earliest introduction to the theoretical side of engineering came from his father, who is an electrical engineer and theoretician studying signal processing. Bresler was inspired by his work from an early age. As an undergraduate at the University of Illinois at Urbana-Champaign, he bounced between physics, math, and computer science courses. But no matter the topic, he gravitated toward the theoretical viewpoint.

In graduate school at the University of California at Berkeley, Bresler enjoyed the opportunity to work in a wide variety of topics spanning probability, theoretical computer science, and mathematics. His driving motivator was a love of learning new things.

“Working at the interface of multiple fields with new questions, there is a feeling that one had better learn as much as possible if one is to have any chance of finding the right tools to answer those questions,” he says.

That curiosity led him to MIT for a postdoc in the Laboratory for Information and Decision Systems (LIDS) in 2013, and then he joined the faculty two years later as an assistant professor in EECS, a member of LIDS, and a core faculty member in the Institute for Data, Systems, and Society (IDSS). He was named an associate professor in 2019.

Bresler says he was drawn to the intellectual atmosphere at MIT, as well as the supportive environment for launching bold research quests and trying to make progress in new areas of study.

Opportunities for collaboration

“What really struck me was how vibrant and energetic and collaborative MIT is. I have this mental list of more than 20 people here who I would love to have lunch with every single week and collaborate with on research. So just based on sheer numbers, joining MIT was a clear win,” he says.

He’s especially enjoyed collaborating with his students, who continually teach him new things and ask deep questions that drive exciting research projects. One such student, Matthew Brennan, who was one of Bresler’s closest collaborators, tragically and unexpectedly passed away in January 2021.

The shock from Brennan’s death is still raw for Bresler, and it derailed his research for a time.

“Beyond his own prodigious capabilities and creativity, he had this amazing ability to listen to an idea of mine that was almost completely wrong, extract from it a useful piece, and then pass the ball back,” he says. “We had the same vision for what we wanted to achieve in the work, and we were driven to try to tell a certain story. At the time, almost nobody was pursuing this particular line of work, and it was in a way kind of lonely. But he trusted me, and we encouraged one another to keep at it when things seemed bleak.”

Those lessons in perseverance fuel Bresler as he and his students continue exploring questions that, by their nature, are difficult to answer.

One area he’s worked in on-and-off for over a decade involves learning graphical models from data. Models of certain types of data, such as time-series data consisting of temperature readings, are often constructed by domain experts who have relevant knowledge and can build a reasonable model, he explains.

But for many types of data with complex dependencies, such as social network or biological data, it is not at all clear what structure a model should take. Bresler’s work seeks to estimate a structured model from data, which could then be used for downstream applications like making recommendations or better predicting the weather.

The basic question of identifying good models, whether algorithmically in a complex setting or analytically, by specifying a useful toy model for theoretical analysis, connects the abstract work with engineering practice, he says.

“In general, modeling is an art. Real life is complicated and if you write down some super-complicated model that tries to capture every feature of a problem, it is doomed,” says Bresler. “You have to think about the problem and understand the practical side of things on some level to identify the correct features of the problem to be modeled, so that you can hope to actually solve it and gain insight into what one should do in practice.”

Outside the lab, Bresler often finds himself solving very different kinds of problems. He is an avid rock climber and spends much of his free time bouldering throughout New England.

“I really love it. It is a good excuse to get outside and get sucked into a whole different world. Even though there is problem solving involved, and there are similarities at the philosophical level, it is totally orthogonal to sitting down and doing math,” he says.

Massachusetts Microelectronics Internship Program: a big focus on critical tiny components

A row of students stand in the lobby of MIT's Lincoln Laboratory.

Jane Halpern | Department of Electrical Engineering and Computer Science

One of the most critical components of our technological future is easy to overlook. Microelectronics, the devices and circuits at the core of computer and communication chips, are aptly named: built on the micrometer (and nanometer!) scale, they cannot be seen with the naked eye, yet they power almost everything around us, from smartwatches, cell phones, and computers to electric vehicles and the sophisticated tools used in DNA sequencing and drug discovery. Although we often take them for granted, the pandemic and the supply chain issues it has fostered have highlighted how critical these tiny electronic building blocks are. Unfortunately, although microelectronics got its start in the U.S., domestic chip production over the last 30 years has lagged behind that of other countries, creating a critical shortage of national capability in research, development, and manufacturing.

The Massachusetts Microelectronics Internship Program (MMIP) is trying to change that. Designed by a coalition of universities, private companies, and state governmental organizations, the new initiative was launched over the summer to connect undergraduates interested in learning more about microelectronics with industry.

“We understand that it’s sometimes difficult for first-years and sophomores to find internship opportunities in this fast-moving highly-technical field,” says Tomás Palacios, Professor of Electrical Engineering and Industry Officer at MIT EECS Alliance, one of the program’s sponsors. “We want to fix that and make sure that anyone interested in hardware and the exciting field of semiconductors and electronics has internship opportunities from the very beginning, so we created this program.”

Open to all freshmen and sophomores registered at Massachusetts universities, the MMIP offers a ten-week, full-time internship opportunity in microelectronics and hardware—with no prior experience required. The program includes training modules, mentor support, and network-building events designed to help students find their way in a burgeoning industry.

“The Commonwealth of Massachusetts has prioritized internships, mentorships, and apprenticeships as a critical way to get students interested in tech and innovation careers,” says Christine Nolan, Director of the Center for Advanced Manufacturing at the MassTech Collaborative, a state economic development agency that sponsors the program. “Through this effort, we’re directly addressing the talent gap that exists within microelectronics and hardware, by introducing the career opportunities early-on in the student’s college experience and doing so through hands-on training. By making these paid positions, we’re also recognizing the challenges around equity and economic opportunity, allowing any interested student to get a foothold in this industry.”

The potential impact that these students can have in the world is nearly limitless; as computing transforms industries, the demand for the hardware that makes this possible (i.e., semiconductors and microelectronics) continues to grow.

“The semiconductor industry forms the foundation of the digital revolution that is shaping the future of economies, and indeed, the future of humanity and our planet as well,” says Vincent Roche, CEO of program sponsor Analog Devices (ADI). “The foundation of the semiconductor industry, though, is the incredible people bringing its innovations to life. ADI is excited to be a founding member of the Massachusetts Microelectronics Internship Program. This program is a unique opportunity for students just beginning their careers to grow their network, work hand-in-hand with their peers and industry experts on the technologies that are shaping our future, and gain the real-world skills and knowledge that will empower them to become our industry’s next generation of leaders.”

This summer, in the program’s very first season, a cohort of 20 freshman and sophomore students from different Massachusetts universities and colleges was selected from approximately one hundred applicants and spent the summer interning at Massachusetts microelectronics companies. They received personalized mentorship and participated in a variety of training events and visits to microelectronics companies such as Raytheon Technologies, MITRE, and Lincoln Laboratory, where they had an opportunity to learn about semiconductor fabrication, enter a cleanroom facility where some of the most advanced chips for satellites are made, and try to hack modern microelectronics chips.

“Workforce is one of the most essential elements in reasserting U.S. leadership in microelectronics, which is why we believe participating in the new Massachusetts Microelectronics Internship Program is so important,” says Robert Atkins, Head of the Advanced Technology Division at Lincoln Laboratory, another one of the MMIP’s sponsoring organizations. “We are blessed to have significant microelectronics fabrication facilities here at Lincoln Laboratory, and while those facilities are critical to our research, they also provide a wonderful learning environment. Our hope is to use this setting to engage students, motivating them to pursue careers in microelectronics.”

The application process for the second class of the MMIP is just starting, and the application deadline is October 15. As the program develops and grows, more students across Massachusetts and the nation will discover how they, too, can play a key role in the hardware that drives computing, energy, communications, and health, even as freshmen and sophomores.

WEBSITES RELATED TO THIS ANNOUNCEMENT

MMIP Program website and application site: https://www.ma-microelectronics.org/

The Role of Universities in Reasserting US Leadership in Microelectronics: https://usmicroelectronics.mit.edu/

Massachusetts Technology Collaborative: https://masstech.org/