Researchers create a tool for accurately simulating complex systems

Researchers often use simulations when designing new algorithms, since testing ideas in the real world can be both costly and risky. But since it’s impossible to capture every detail of a complex system in a simulation, they typically collect a small amount of real data that they replay while simulating the components they want to study.

Known as trace-driven simulation (the small pieces of real data are called traces), this method sometimes results in biased outcomes. As a result, researchers might unknowingly choose an algorithm that is not the best one they evaluated, and that performs worse on real data than the simulation predicted.

MIT researchers have developed a new method that eliminates this source of bias in trace-driven simulation. By enabling unbiased trace-driven simulations, the new technique could help researchers design better algorithms for a variety of applications, including improving video quality on the internet and increasing the performance of data processing systems.

The researchers’ machine-learning algorithm draws on the principles of causality to learn how the data traces were affected by the behavior of the system. In this way, they can replay the correct, unbiased version of the trace during the simulation.

When compared to a previously developed trace-driven simulator, the researchers’ simulation method correctly predicted which newly designed algorithm would be best for video streaming — meaning the one that led to less rebuffering and higher visual quality. Existing simulators that do not account for bias would have pointed researchers to a worse-performing algorithm.

“Data are not the only thing that matter. The story behind how the data are generated and collected is also important. If you want to answer a counterfactual question, you need to know the underlying data generation story so you only intervene on those things that you really want to simulate,” says Arash Nasr-Esfahany, an electrical engineering and computer science (EECS) graduate student and co-lead author of a paper on this new technique.

He is joined on the paper by co-lead authors and fellow EECS graduate students Abdullah Alomar and Pouya Hamadanian; recent graduate student Anish Agarwal PhD ’21; and senior authors Mohammad Alizadeh, an associate professor of electrical engineering and computer science; and Devavrat Shah, the Andrew and Erna Viterbi Professor in EECS and a member of the Institute for Data, Systems, and Society and of the Laboratory for Information and Decision Systems. The research was recently presented at the USENIX Symposium on Networked Systems Design and Implementation.

Specious simulations

The MIT researchers studied trace-driven simulation in the context of video streaming applications.

In video streaming, an adaptive bitrate algorithm continually decides the video quality, or bitrate, to transfer to a device based on real-time data on the user’s bandwidth. To test how different adaptive bitrate algorithms impact network performance, researchers can collect real data from users during a video stream for a trace-driven simulation.

They use these traces to simulate what would have happened to network performance had the platform used a different adaptive bitrate algorithm in the same underlying conditions.
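The replay step is straightforward to picture in code. The sketch below is a minimal, hypothetical illustration: the bitrate ladder, the naive throughput-based policy, and the trace values are all invented for this example, not taken from any real streaming platform.

```python
# Toy trace-driven simulation for adaptive bitrate (ABR) streaming.
# Everything here (ladder, policy, trace) is hypothetical.

BITRATES_KBPS = [300, 750, 1200, 2850]  # invented bitrate ladder

def simple_abr(measured_bandwidth_kbps):
    """Pick the highest bitrate at or below the measured bandwidth."""
    feasible = [b for b in BITRATES_KBPS if b <= measured_bandwidth_kbps]
    return feasible[-1] if feasible else BITRATES_KBPS[0]

def replay_trace(trace_kbps, abr_policy):
    """Replay a recorded bandwidth trace against a candidate policy.

    Note the built-in assumption: the recorded bandwidth values are
    replayed unchanged no matter which policy is being evaluated.
    """
    return [abr_policy(bw) for bw in trace_kbps]

# A recorded bandwidth trace (kbps), replayed under a candidate policy.
trace = [900, 1400, 600, 3000, 2000]
print(replay_trace(trace, simple_abr))  # prints [750, 1200, 300, 2850, 1200]
```

The replay loop silently treats the trace as independent of the policy being evaluated, which is exactly the exogeneity assumption at issue here.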

Researchers have traditionally assumed that trace data are exogenous, meaning they aren’t affected by factors that are changed during the simulation. They would assume that, during the period when they collected the network performance data, the choices the bitrate adaptation algorithm made did not affect those data.

But this is often a false assumption that results in biases about the behavior of new algorithms, making the simulation invalid, Alizadeh explains.

“We recognized, and others have recognized, that this way of doing simulation can induce errors. But I don’t think people necessarily knew how significant those errors could be,” he says.

To develop a solution, Alizadeh and his collaborators framed the issue as a causal inference problem. To collect an unbiased trace, one must understand the different causes that affect the observed data. Some causes are intrinsic to a system, while others are affected by the actions being taken.

In the video streaming example, network performance is affected by the choices the bitrate adaptation algorithm made, but it is also affected by intrinsic elements, like network capacity.

“Our task is to disentangle these two effects, to try to understand what aspects of the behavior we are seeing are intrinsic to the system and how much of what we are observing is based on the actions that were taken. If we can disentangle these two effects, then we can do unbiased simulations,” he says.

Learning from data

But researchers often cannot directly observe intrinsic properties. This is where the new tool, called CausalSim, comes in. The algorithm can learn the underlying characteristics of a system using only the trace data.

CausalSim takes trace data that were collected through a randomized control trial, and estimates the underlying functions that produced those data. The model tells the researchers, under the exact same underlying conditions that a user experienced, how a new algorithm would change the outcome.
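The disentangling step can be caricatured with a toy additive model. The sketch below assumes, purely for illustration, that each observation is an intrinsic value plus a fixed per-action offset and that the randomized groups are balanced; CausalSim itself learns latent factors with machine learning and does not rely on this simple form.

```python
# Caricature of counterfactual trace estimation: separate what the data
# say about the system (intrinsic state) from what they say about the
# action taken. Assumes a toy additive model, observation = intrinsic +
# effect(action). An illustration of the idea, not the paper's method.
from collections import defaultdict

def estimate_action_effects(observations, actions):
    """With randomized actions, groups see comparable intrinsic states,
    so per-action means differ (roughly) only by the actions' effects."""
    by_action = defaultdict(list)
    for obs, act in zip(observations, actions):
        by_action[act].append(obs)
    means = {a: sum(v) / len(v) for a, v in by_action.items()}
    baseline = min(means.values())
    return {a: m - baseline for a, m in means.items()}

def counterfactual_trace(observations, actions, new_action):
    """Strip each observation of the effect of the action actually
    taken, then add back the effect of the action to simulate."""
    effects = estimate_action_effects(observations, actions)
    intrinsic = [obs - effects[a] for obs, a in zip(observations, actions)]
    return [x + effects[new_action] for x in intrinsic]
```

For example, if observed values `[10, 20, 15, 25]` were collected under randomized actions `["A", "A", "B", "B"]`, the estimator attributes an offset of 5 to action `"B"` and replays the counterfactual all-`"B"` trace as `[15, 25, 15, 25]`.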

Using a typical trace-driven simulator, bias might lead a researcher to select a worse-performing algorithm because the simulation wrongly indicates it should be better. CausalSim helps researchers select the best algorithm among those they tested.

The MIT researchers observed this in practice. When they used CausalSim to design an improved bitrate adaptation algorithm, it led them to select a new variant that had a stall rate that was nearly 1.4 times lower than a well-accepted competing algorithm, while achieving the same video quality. The stall rate is the amount of time a user spent rebuffering the video.

By contrast, an expert-designed trace-driven simulator predicted the opposite. It indicated that this new variant should cause a stall rate that was nearly 1.3 times higher. The researchers tested the algorithm on real-world video streaming and confirmed that CausalSim was correct.

“The gains we were getting in the new variant were very close to CausalSim’s prediction, while the expert simulator was way off. This is really exciting because this expert-designed simulator has been used in research for the past decade. If CausalSim can so clearly be better than this, who knows what we can do with it?” says Hamadanian.

During a 10-month experiment, CausalSim consistently improved simulation accuracy, resulting in algorithms that made about half as many errors as those designed using baseline methods.

In the future, the researchers want to apply CausalSim to situations where randomized control trial data are not available or where it is especially difficult to recover the causal dynamics of the system. They also want to explore how to design and monitor systems to make them more amenable to causal analysis.

Researchers develop novel AI-based estimator for manufacturing medicine

When medical companies manufacture the pills and tablets that treat any number of illnesses, aches, and pains, they need to isolate the active pharmaceutical ingredient from a suspension and dry it. The process requires a human operator to monitor an industrial dryer, agitate the material, and watch for the compound to take on the right qualities for compressing into medicine. The job depends heavily on the operator’s observations.   

Methods for making that process less subjective and a lot more efficient are the subject of a recent Nature Communications paper authored by researchers at MIT and Takeda. The paper’s authors devise a way to use physics and machine learning to categorize the rough surfaces that characterize particles in a mixture. The technique, which uses a physics-enhanced autocorrelation-based estimator (PEACE), could change pharmaceutical manufacturing processes for pills and powders, increasing efficiency and accuracy and resulting in fewer failed batches of pharmaceutical products.  

“Failed batches or failed steps in the pharmaceutical process are very serious,” says Allan Myerson, a professor of practice in the MIT Department of Chemical Engineering and one of the study’s authors. “Anything that improves the reliability of the pharmaceutical manufacturing, reduces time, and improves compliance is a big deal.”

The team’s work is part of an ongoing collaboration between Takeda and MIT, launched in 2020. The MIT-Takeda Program aims to leverage the experience of both MIT and Takeda to solve problems at the intersection of medicine, artificial intelligence, and health care.

In pharmaceutical manufacturing, determining whether a compound is adequately mixed and dried ordinarily requires stopping an industrial-sized dryer and taking samples off the manufacturing line for testing. Researchers at Takeda thought artificial intelligence could improve the task and reduce stoppages that slow down production. Originally the research team planned to use videos to train a computer model to replace a human operator. But determining which videos to use to train the model still proved too subjective. Instead, the MIT-Takeda team decided to illuminate particles with a laser during filtration and drying, and measure particle size distribution using physics and machine learning. 

“We just shine a laser beam on top of this drying surface and observe,” says Qihang Zhang, a doctoral student in MIT’s Department of Electrical Engineering and Computer Science and the study’s first author. 

A physics-derived equation describes the interaction between the laser and the mixture, while machine learning characterizes the particle sizes. The technique doesn't require stopping and restarting the drying process, which makes the entire job safer and more efficient than the standard operating procedure, according to George Barbastathis, professor of mechanical engineering at MIT and corresponding author of the study.

The machine-learning algorithm also does not require large training datasets, because the physics allows for speedy training of the neural network.

“We utilize the physics to compensate for the lack of training data, so that we can train the neural network in an efficient way,” says Zhang. “Only a tiny amount of experimental data is enough to get a good result.”
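The general pattern, in which a physics model supplies most of the structure so that only a few free parameters are left for the data to pin down, can be sketched as follows. The forward model and measurements below are hypothetical stand-ins, not the actual PEACE equations from the paper.

```python
# Sketch of physics-informed fitting: a known forward model leaves a
# single unknown (particle size), so a handful of measurements suffice.
# The scattering model here is invented for illustration.

def physics_model(size, angle):
    """Hypothetical forward model: predicted signal for a given size."""
    return size / (1.0 + angle * angle)

def fit_size(measurements, lr=0.05, steps=500):
    """Fit the one free parameter by gradient descent on squared error."""
    size = 1.0  # initial guess
    for _ in range(steps):
        grad = 0.0
        for angle, signal in measurements:
            pred = physics_model(size, angle)
            # derivative of (pred - signal)^2 with respect to size
            grad += 2.0 * (pred - signal) / (1.0 + angle * angle)
        size -= lr * grad / len(measurements)
    return size

# Three (angle, signal) pairs are enough because physics fixed the rest.
data = [(0.0, 2.0), (1.0, 1.0), (2.0, 0.4)]
print(round(fit_size(data), 2))  # prints 2.0
```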

Today, the only inline processes used for particle measurements in the pharmaceutical industry are for slurry products, where crystals float in a liquid. There is no method for measuring particles within a powder during mixing. Powders can be made from slurries, but when a liquid is filtered and dried its composition changes, requiring new measurements. In addition to making the process quicker and more efficient, using the PEACE mechanism makes the job safer because it requires less handling of potentially highly potent materials, the authors say. 

The ramifications for pharmaceutical manufacturing could be significant, allowing drug production to be more efficient, sustainable, and cost-effective, by reducing the number of experiments companies need to conduct when making products. Monitoring the characteristics of a drying mixture is an issue the industry has long struggled with, according to Charles Papageorgiou, the director of Takeda’s Process Chemistry Development group and one of the study’s authors. 

“It is a problem that a lot of people are trying to solve, and there isn’t a good sensor out there,” says Papageorgiou. “This is a pretty big step change, I think, with respect to being able to monitor, in real time, particle size distribution.”

Papageorgiou said that the mechanism could have applications in other industrial pharmaceutical operations. At some point, the laser technology may be able to train video imaging, allowing manufacturers to use a camera for analysis rather than laser measurements. The company is now working to assess the tool on different compounds in its lab. 

The results come directly from collaboration between Takeda and three MIT departments: Mechanical Engineering, Chemical Engineering, and Electrical Engineering and Computer Science. Over the last three years, researchers at MIT and Takeda have worked together on 19 projects focused on applying machine learning and artificial intelligence to problems in the health-care and medical industry as part of the MIT-Takeda Program. 

Often, it can take years for academic research to translate to industrial processes. But researchers are hopeful that direct collaboration could shorten that timeline. Takeda is a walking distance away from MIT’s campus, which allowed researchers to set up tests in the company’s lab, and real-time feedback from Takeda helped MIT researchers structure their research based on the company’s equipment and operations. 

Combining the expertise and mission of both entities helps researchers ensure their experimental results will have real-world implications. The team has already filed for two patents and has plans to file for a third.  

Open-source platform simulates wildlife for soft robotics designers

Since the term “soft robotics” was adopted in 2008, engineers in the field have been building diverse representations of flexible machines useful in exploration, locomotion, rehabilitation, and even space. One source of inspiration: the way animals move in the wild.

A team of MIT researchers has taken this a step further, developing SoftZoo, a bio-inspired platform that enables engineers to study soft robot co-design. The framework jointly optimizes a robot's design, which determines what it will look like, and its control, the system that drives its motion, helping users automatically generate blueprints for potential machines.

Taking a walk on the wild side, the platform features 3-D models of animals such as panda bears, fish, sharks, and caterpillars as designs that can simulate soft robotics tasks like locomotion, agile turning, and path following in different environments. Whether in snow, desert, clay, or water, the platform demonstrates the performance trade-offs of various designs in different terrains.

“Our framework can help users find the best configuration for a robot’s shape, allowing them to design soft robotics algorithms that can do many different things,” says MIT PhD student Tsun-Hsuan Wang, an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL) who is a lead researcher on the project. “In essence, it helps us understand the best strategies for robots to interact with their environments.”

SoftZoo is more comprehensive than similar platforms, which already simulate design and control, because it models movement that reacts to the physical features of various biomes. The framework’s versatility comes from a differentiable multiphysics engine, which allows for the simulation of several aspects of a physical system at the same time, such as a baby seal turning on ice or a caterpillar inching across a wetland environment. The engine’s differentiability optimizes co-design by reducing the number of often-expensive simulations required to solve computational control and design problems. As a result, users can design and move soft robots with more sophisticated, specified algorithms.
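The payoff of differentiability can be illustrated with a toy objective. Below, an invented one-line function stands in for a full simulated rollout; because it is differentiable in both the design parameter and the control parameter, a single gradient loop improves them jointly. This is a sketch of the co-design idea only, not SoftZoo's multiphysics engine.

```python
# Toy gradient-based co-design: optimize design and control together.
# The objective is invented; its cross term couples the two parameters,
# so neither can be tuned well in isolation.

def performance(design, control):
    """Hypothetical stand-in for a simulated rollout's score.
    Peaks at design=1, control=2, where every penalty term vanishes."""
    return -((design - 1) ** 2 + (control - 2) ** 2
             + 0.5 * (design - control + 1) ** 2)

def co_design(design=0.5, control=0.5, lr=0.05, steps=2000):
    """Gradient ascent on both parameters at once."""
    for _ in range(steps):
        # analytic gradients of `performance`
        g_d = -2.0 * (design - 1) - (design - control + 1)
        g_c = -2.0 * (control - 2) + (design - control + 1)
        design += lr * g_d
        control += lr * g_c
    return design, control

d, c = co_design()
print(round(d, 3), round(c, 3))  # prints 1.0 2.0
```

In a real differentiable engine the gradients come from automatic differentiation through the physics simulation rather than hand-derived formulas, but the joint update over body and controller is the same idea.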

The system’s ability to simulate interactions with different terrain illustrates the importance of morphology, a branch of biology that studies the shapes, sizes, and forms of different organisms. Depending on the environment, some biological structures are more optimal than others, much like comparing blueprints for machines that complete similar tasks. 

These biological outlines can inspire more specialized, terrain-specific artificial life. “A jellyfish’s gently undulating geometry allows it to efficiently travel across large bodies of water, inspiring researchers to develop new breeds of soft robots and opening up unlimited possibilities of what artificial creatures cultivated entirely in silico can be capable of,” says Wang. “Additionally, dragonflies can perform very agile maneuvers that other flying creatures cannot complete because they have special structures on their wings that change their center of mass when they fly. Our platform optimizes locomotion the same way a dragonfly is naturally more adept at working through its surroundings.”

Robots previously struggled to navigate through cluttered environments because their bodies were not compliant with their surroundings. With SoftZoo, though, designers could develop the robot’s brain and body simultaneously, co-optimizing both terrestrial and aquatic machines to be more aware and specialized. With increased behavioral and morphological intelligence, the robots would then be more useful in completing rescue missions and conducting exploration. If a person went missing during a flood, for example, the robot could potentially traverse the waters more efficiently because it was optimized using methods demonstrated in the SoftZoo platform.

“SoftZoo provides open-source simulation for soft robot designers, helping them build real-world robots much more easily and flexibly while accelerating the machines’ locomotion capabilities in diverse environments,” adds study co-author Chuang Gan, a research scientist at the MIT-IBM Watson AI Lab who will soon be an assistant professor at the University of Massachusetts at Amherst.

“This computational approach to co-designing the soft robot bodies and their brains (that is, their controllers) opens the door to rapidly creating customized machines that are designed for a specific task,” adds Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor in the MIT Department of Electrical Engineering and Computer Science (EECS), who is another author of the work.

Before any type of robot is constructed, the framework could be a substitute for field testing unnatural scenes. For example, assessing how a bear-like robot behaves in a desert may be challenging for a research team working in the urban plains of Boston. Instead, soft robotics engineers could use 3-D models in SoftZoo to simulate different designs and evaluate how effective the algorithms controlling their robots are at navigation. In turn, this would save researchers time and resources.

Still, the limitations of current fabrication techniques stand in the way of bringing these soft robot designs to life. “Transferring from simulation to physical robot remains unsolved and requires further study,” says Wang. “The muscle models, spatially varying stiffness, and sensorization in SoftZoo cannot be straightforwardly realized with current fabrication techniques, so we are working on these challenges.”

In the future, the platform’s designers are eyeing applications in human mechanics, such as manipulation, given its ability to test robotic control. To demonstrate this potential, Wang’s team designed a 3-D arm throwing a snowball forward. By including the simulation of more human-like tasks, soft robotics designers could then use the platform to assess soft robotic arms that grasp, move, and stack objects.

Wang, Gan, and Rus wrote a paper on the work alongside EECS PhD student and CSAIL affiliate Pingchuan Ma, Harvard University postdoc Andrew Spielberg PhD ’21, Carnegie Mellon University PhD student Zhou Xian, UMass Amherst Associate Professor Hao Zhang, and MIT professor of brain and cognitive sciences and CSAIL affiliate Joshua B. Tenenbaum.

Wang completed this work during an internship at the MIT-IBM Watson AI Lab, with the NSF EFRI Program, DARPA MCS Program, MIT-IBM Watson AI Lab, and gift funding from MERL, Cisco, and Amazon all providing support for the project. The team’s research will be presented at the 2023 International Conference on Learning Representations this month.

Jelena Notaros receives 2023 NSF CAREER Award

Jelena Notaros, Assistant Professor in EECS, has received the 2023 NSF CAREER Award. The Faculty Early Career Development (CAREER) Program is a Foundation-wide activity that offers the National Science Foundation’s most prestigious awards in support of early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization.

With this prestigious five-year award, Notaros and her group plan to develop novel integrated-photonics-based platforms, devices, and systems that emit light spanning the visible spectrum with complex, reconfigurable holographic emission profiles. These fundamental developments will be applied to enable next-generation chip-based solutions for augmented-reality displays, impacting consumer, military, and medical applications; optical traps, impacting medical diagnostics, cell experimentation, and human health; and 3D printers, impacting the way society creates.
 
In addition to these research advancements, this award will allow Notaros and her group to have a broad impact on the future of a well-trained and diverse workforce in the field of silicon photonics and in STEM as a whole. The proposed educational and teaching initiatives will tightly integrate research and education by developing educational materials and curricula that are closely tied to the proposed research activities, and will have a broad impact on the future of photonics education by influencing pedagogy nationwide and engaging learners from diverse backgrounds.

Jelena Notaros is the Robert J. Shillman (1974) Career Development Assistant Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, a Principal Investigator in the MIT Research Laboratory of Electronics, and a Core Faculty Member of the MIT Microsystems Technology Laboratories. She received her Ph.D. and M.S. degrees from the Massachusetts Institute of Technology in 2020 and 2017, respectively, and B.S. degree from the University of Colorado Boulder in 2015. 

MIT engineers “grow” atomically thin transistors on top of computer chips

Emerging AI applications, like chatbots that generate natural human language, demand denser, more powerful computer chips. But semiconductor chips are traditionally made with bulk materials, which are boxy 3D structures, so stacking multiple layers of transistors to create denser integrations is very difficult.

However, semiconductor transistors made from ultrathin 2D materials, each only about three atoms in thickness, could be stacked up to create more powerful chips. To this end, MIT researchers have now demonstrated a novel technology that can effectively and efficiently “grow” layers of 2D transition metal dichalcogenide (TMD) materials directly on top of a fully fabricated silicon chip to enable denser integrations.

Growing 2D materials directly onto a silicon CMOS wafer has posed a major challenge because the process usually requires temperatures of about 600 degrees Celsius, while silicon transistors and circuits could break down when heated above 400 degrees. Now, the interdisciplinary team of MIT researchers has developed a low-temperature growth process that does not damage the chip. The technology allows 2D semiconductor transistors to be directly integrated on top of standard silicon circuits.

In the past, researchers have grown 2D materials elsewhere and then transferred them onto a chip or a wafer. This often causes imperfections that hamper the performance of the final devices and circuits. Also, transferring the material smoothly becomes extremely difficult at wafer scale. By contrast, this new process grows a smooth, highly uniform layer across an entire 8-inch wafer.

The new technology is also able to significantly reduce the time it takes to grow these materials. While previous approaches required more than a day to grow a single layer of 2D materials, the new approach can grow a uniform layer of TMD material in less than an hour over entire 8-inch wafers.

Due to its rapid speed and high uniformity, the new technology enabled the researchers to successfully integrate a 2D material layer onto much larger surfaces than has been previously demonstrated. This makes their method better-suited for use in commercial applications, where wafers that are 8 inches or larger are key.

“Using 2D materials is a powerful way to increase the density of an integrated circuit. What we are doing is like constructing a multistory building. If you have only one floor, which is the conventional case, it won’t hold many people. But with more floors, the building will hold more people that can enable amazing new things. Thanks to the heterogeneous integration we are working on, we have silicon as the first floor and then we can have many floors of 2D materials directly integrated on top,” says Jiadi Zhu, an electrical engineering and computer science graduate student and co-lead author of a paper on this new technique.

Zhu wrote the paper with co-lead author Ji-Hoon Park, an MIT postdoc; corresponding authors Jing Kong, professor of electrical engineering and computer science (EECS) and a member of the Research Laboratory of Electronics; and Tomás Palacios, professor of EECS and director of the Microsystems Technology Laboratories (MTL); as well as others at MIT, MIT Lincoln Laboratory, Oak Ridge National Laboratory, and Ericsson Research. The paper appears today in Nature Nanotechnology.

Slim materials with vast potential

The 2D material the researchers focused on, molybdenum disulfide, is flexible, transparent, and exhibits powerful electronic and photonic properties that make it ideal for a semiconductor transistor. It is composed of a single layer of molybdenum atoms sandwiched between two layers of sulfur atoms.

Growing thin films of molybdenum disulfide on a surface with good uniformity is often accomplished through a process known as metal-organic chemical vapor deposition (MOCVD). Molybdenum hexacarbonyl and diethyl sulfide, two organic chemical compounds that contain molybdenum and sulfur atoms, vaporize and are heated inside the reaction chamber, where they “decompose” into smaller molecules. Then they link up through chemical reactions to form chains of molybdenum disulfide on a surface.

But decomposing these molybdenum and sulfur compounds, which are known as precursors, requires temperatures above 550 degrees Celsius, while silicon circuits start to degrade when temperatures surpass 400 degrees.

So, the researchers started by thinking outside the box — they designed and built an entirely new furnace for the metal-organic chemical vapor deposition process.

The furnace consists of two chambers: a low-temperature region in the front, where the silicon wafer is placed, and a high-temperature region in the back. Vaporized molybdenum and sulfur precursors are pumped into the furnace. The molybdenum stays in the low-temperature region, where the temperature is kept below 400 degrees Celsius — hot enough to decompose the molybdenum precursor but not so hot that it damages the silicon chip.

The sulfur precursor flows through into the high-temperature region, where it decomposes. Then it flows back into the low-temperature region, where the chemical reaction to grow molybdenum disulfide on the surface of the wafer occurs.

“You can think about decomposition like making black pepper — you have a whole peppercorn and you grind it into a powder form. So, we smash and grind the pepper in the high-temperature region, then the powder flows back into the low-temperature region,” Zhu explains.

Faster growth and better uniformity

One problem with this process is that silicon circuits typically have aluminum or copper as a top layer so the chip can be connected to a package or carrier before it is mounted onto a printed circuit board. But sulfur causes these metals to sulfurize, the same way some metals rust when exposed to oxygen, which destroys their conductivity. The researchers prevented sulfurization by first depositing a very thin layer of passivation material on top of the chip. Then later they could open the passivation layer to make connections.

They also placed the silicon wafer into the low-temperature region of the furnace vertically, rather than horizontally. By placing it vertically, neither end is too close to the high-temperature region, so no part of the wafer is damaged by the heat. Plus, the molybdenum and sulfur gas molecules swirl around as they bump into the vertical chip, rather than flowing over a horizontal surface. This circulation effect improves the growth of molybdenum disulfide and leads to better material uniformity.

In addition to yielding a more uniform layer, their method was also much faster than other MOCVD processes. They could grow a layer in less than an hour, while typically the MOCVD growth process takes at least an entire day.

Using the state-of-the-art MIT.nano facilities, they were able to demonstrate high material uniformity and quality across an 8-inch silicon wafer, which is especially important for industrial applications where bigger wafers are needed.

“By shortening the growth time, the process is much more efficient and could be more easily integrated into industrial fabrications. Plus, this is a silicon-compatible low-temperature process, which can be useful to push 2D materials further into the semiconductor industry,” Zhu says.

In the future, the researchers want to fine-tune their technique and use it to grow many stacked layers of 2D transistors. In addition, they want to explore the use of the low-temperature growth process for flexible surfaces, like polymers, textiles, or even papers. This could enable the integration of semiconductors onto everyday objects like clothing or notebooks.

“This work made important progress in the synthesis technology of monolayer molybdenum disulfide material,” says Han Wang, the Robert G. and Mary G. Lane Endowed Early Career Chair and Associate Professor of Electrical and Computer Engineering and Chemical Engineering and Materials Science at the University of Southern California, who was not involved with this research. “The new capability of low thermal budget growth on an 8-inch scale enables the back-end-of-line integration of this material with silicon CMOS technology and paves the way for its future electronics application.”

This work is partially funded by the MIT Institute for Soldier Nanotechnologies, the National Science Foundation Center for Integrated Quantum Materials, Ericsson, MITRE, the U.S. Army Research Office, and the U.S. Department of Energy. The project also benefitted from the support of TSMC University Shuttle.

President Yoon Suk Yeol of South Korea visits MIT

President Yoon Suk Yeol of South Korea visited MIT on Friday, participating in a roundtable discussion with Institute leaders and faculty about biomedical research and discussing the fundamentals of technology-driven innovation clusters. 

South Korea, Yoon noted in his remarks, has highly regarded educational institutions, hospitals, and research facilities, along with robust legal and business systems. However, he added, the country still aims to develop the kind of biomedical innovation cluster exemplified by the Kendall Square area in Cambridge, Massachusetts, where a confluence of established and startup firms, academic research, and agile investment capital has created a world center of bioscience work. 

“We need [clusters] to make the whole greater than the sum of the parts,” Yoon said during the event, held in the MIT.nano building. 

Yoon’s visit included a look at the Cryo-Electron Microscopy Facility in MIT.nano, which enables nearly atom-level evaluation of the structures of molecules and numerous other organic materials. It also featured presentations from MIT faculty, with a follow-up discussion among the participants. 

“I hope the Republic of Korea can benchmark what you are doing,” Yoon told the MIT participants, in remarks translated for the audience. “I know that won’t happen overnight.”

Yoon made the visit in the midst of a six-day state visit to the U.S., which included a formal White House state dinner this week hosted by U.S. President Joe Biden. The trip was aimed at further strengthening what is now a 70-year alliance between the two countries, with both security and economic topics on the agenda.

The visit to MIT was hosted by Richard Lester, associate provost for international activities and the Japan Steel Industry Professor of Nuclear Science and Engineering, and Anantha Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. Thomas Schwartz, the Boris Magasanik Professor of Biology at MIT, initiated the visit by showing Yoon the microscopy facility. Yoon then proceeded to the roundtable discussion, which was moderated by Chandrakasan.

Lester welcomed Yoon to MIT, saying it was a “great honor” to have him at the Institute. 

For his part, Yoon offered remarks to formally start the discussion, noting among other things that “boosting investments in science and technology” on an ongoing basis was “extremely important” to his government. 

Six MIT professors spoke to Yoon about different aspects of biotechnology and enhancing innovation. Robert Langer, the David H. Koch Institute Professor, outlined the model of research-driven biotech startups he has helped create in recent decades. The group of about 40 firms Langer has helped found includes Moderna, the high-profile Covid-19 vaccine developer. Such ventures have had global impact, while changing the local cityscape around MIT. 

“It’s been remarkable to see the area around MIT transform,” Langer said, while suggesting that South Korean work in the field was clearly on the rise. “There is a lot of interest in biotechnology companies in [South] Korea,” Langer also noted. 

Dina Katabi, the Thuan and Nicole Pham Professor at MIT, explained how AI could be used to better diagnose some serious illnesses, including Parkinson’s, by using new tools to assess brain activity and its relationship to the potential for disease formation. “We can innovate in medicine, once you have such data,” she said. 

In some cases, new biotech tools are being deployed against familiar contagions. James Collins, the Termeer Professor of Medical Engineering and Science, discussed how he and colleagues are using AI tools to develop new medicines that could be used against increasingly antibiotic-resistant bacterial illnesses. Without new advances, Collins said, such illnesses could kill 10 million people a year by 2050. 

“We’re using AI models not only to discover but to design new antibiotics,” Collins said. “We think collaboration across nation-states, as well as universities worldwide, is really going to be needed to address this crisis.”

In response to prepared questions from the South Korean delegation, MIT faculty also talked about an array of topics pertaining to research, development, and innovation.

Collin Stultz, the Nina T. and Robert H. Rubin Professor in Electrical Engineering and Computer Science as well as co-director of the Harvard-MIT Program in Health Sciences and Technology (HST) and associate director of MIT’s Institute for Medical Engineering and Science (IMES), talked about developing talent in the biomedical field.

“Cultivating innovation in this space requires educating scholars who not only have technical expertise but have a real understanding of the biological and biomedical questions that are the most pressing,” Stultz said, noting that HST students work in hospital settings. “A hallmark of the HST program is not only to get specialized training in particular fundamental engineering or science disciplines … but also [to go] very deep into our actions with the medical community.”

Kwanghun Chung, an associate professor in the Department of Chemical Engineering, a core member of IMES, an investigator in The Picower Institute for Learning and Memory, and a faculty member in Brain and Cognitive Sciences, was asked to talk about the different components making up a successful innovation ecosystem. He noted that the process was analogous to seeding many things in nature, in order to create extensive growth. His remarks drew a response from Yoon, who added, “I realize such a forest doesn’t happen overnight.” 

Giovanni Traverso, the Karl Van Tassel Career Development Professor, showed Yoon models of small drug-delivery systems he and his colleagues have developed, demonstrating, for instance, how such devices attach to their targets properly.

In addition to Yoon, the participants from South Korea included Lee Jong Ho, minister of science and ICT; Choi Sang Mok, senior secretary to the president for economic affairs; Songyee Yoon PhD ’00, a member of the MIT Corporation and president and chief strategy officer of NCSOFT, a Korean gaming company; and Kim Young Tae, Seoul National University Hospital president and CEO.

Lee, who at the start of his career was a postdoc at MIT’s Microsystems Technology Laboratory, called the MIT visit a “very meaningful opportunity” to discuss trends and strategies, in a statement released by the South Korean government after the event. 

Yoon was elected as president in March of 2022. He grew up in Seoul, graduated from Seoul National University after studying law, and has had a lengthy career as a prosecutor. Yoon served as the country’s prosecutor general from 2019 to 2021. 

MIT faculty tackle big ideas in a symposium kicking off Inauguration Day

Big ideas took the stage on Monday morning, ahead of the inauguration of MIT’s 18th president, Sally Kornbluth. As final preparations were underway on Killian Court for the afternoon’s ceremonies, members of the MIT community gathered to welcome Kornbluth with an academic symposium exploring the theme “Where Big Ideas Come From — and Why They Matter.”

Held at MIT’s Samberg Conference Center and streamed online, the symposium featured eight MIT faculty members representing a range of disciplines across the Institute. They took turns presenting their research and sharing their perspectives on how MIT can cultivate ideas and innovations to meet the major challenges of the world today. 

“It’s wonderful to see you all at the start of quite an exciting day,” Kornbluth said, greeting the audience. “MIT has a bit more than 1,000 faculty members spread across dozens of fields. So you can consider the brilliant speakers today as a sort of tasting menu to whet your appetite to the rich intellectual environment of MIT.”

The event was convened by Anantha Chandrakasan, dean of the School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science, who kicked off the symposium by enthusiastically welcoming Kornbluth to MIT.

“I’ve been inspired by your seemingly unbounded curiosity and passion for learning,” Chandrakasan said. “These two qualities that we find so important in our students, we find reflected in you.”

Chandrakasan went on to emphasize the need for curiosity and collaboration across multiple fields in tackling the world’s most pressing problems. 

“As our global challenges become increasingly urgent, now more than ever we are in need of big ideas, which our faculty, students, staff, and alumni are working hard to generate,” he continued. “Problems like climate change simply cannot be solved unless researchers from nearly every discipline collaborate. Today we’ll hear of ideas that span disciplines.”

Finding a flow

Cullen Buie, associate professor of mechanical engineering, was the first speaker to take the stage. In opening his talk, Buie marveled at the progress of gene therapy, pointing out that the technology has advanced in recent years such that some patients’ own cells can be genetically altered to eradicate diseases such as sickle cell disease and certain types of leukemia.

“These amazing technologies are ushering in a new frontier,” Buie said. “But many patients will not receive these therapies.”

The reason, he found, was that it simply takes too long to make these therapies. And the major bottleneck stems from one specific step in the manufacturing process: the delivery of genetic material into a patient’s cells. In learning about the different ways in which researchers are attempting to streamline gene therapy, Buie came across a company that was close to automating the entire manufacturing process, save for the crucial step of gene delivery, which involved painstaking, highly individualized labwork.

“This company had created a sports car for cell engineering, but it was limited to second gear because the gene delivery was too slow,” Buie said.

This realization kicked off an idea: What if genes could be delivered in a faster, continuously flowing fashion? The idea sparked Kytopen, a startup co-founded by Buie and former MIT postdoc and research scientist Paolo Garcia, who developed a new technology to quickly and continuously deliver genetic material into human cells.

The technology, he says, is “poised to revolutionize the field.” And closer to home, the idea would help patients like Buie’s own son, who has sickle cell disease and has experienced the excruciating “pain crises” that the condition brings on.

With Kytopen, and the technologies that Buie and his colleagues are developing, he hopes to have a solution for his son and others who suffer from genetic diseases.

“These big ideas come from you, from me, from anyone who sees someone like my son suffering, and they decide they want to do something about it,” Buie said.

Medical subscriptions

As innovations are made in medicine, there will also need to be innovations in how we finance them — a point that was made by Andrew Lo, the Charles E. and Susan T. Harris Professor of Finance, who followed Buie’s talk. He began by echoing Buie’s excitement for new gene therapies that target and correct “typos” in DNA. These therapies have been proven to cure certain rare diseases, and are on the verge of success with more common diseases.

“This is a really big deal,” Lo said. “The problem is, these cures don’t come cheap. There are eye-popping numbers that come with these therapies. They’re hitting health care budgets really hard. Can we afford them?”

Lo believes that we can, with a new, subscription-based model that he likens to Netflix for health care. He envisions that a company’s health plan could pay a subscription fee directly to various drug manufacturers, such as Novartis, which in turn would administer therapies to patients at little to no additional cost.

“A Netflix model could offer a menu of therapies and could give more access to therapies,” Lo said.
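To make the arithmetic behind such a model concrete, here is a toy calculation. All numbers are invented for illustration; they are not figures from Lo or Quantile Health. The idea is simply that a flat per-member fee sized to expected usage can absorb rare but very expensive cures:

```python
# Hypothetical illustration of the subscription ("Netflix") pricing idea:
# instead of paying the full list price each time a cure is administered,
# a health plan pays a flat annual fee per member that covers expected usage.

def breakeven_annual_fee(members: int, incidence: float, therapy_cost: float) -> float:
    """Flat per-member annual fee that matches expected therapy spending."""
    expected_patients = members * incidence
    return expected_patients * therapy_cost / members

# A plan with 1,000,000 members, a 1-in-50,000 annual incidence of a disease,
# and a $2M one-time cure:
fee = breakeven_annual_fee(1_000_000, 1 / 50_000, 2_000_000)
print(f"${fee:.2f} per member per year")  # $40.00 per member per year
```

The point of the sketch is that an eye-popping $2 million price tag, spread across a large risk pool, becomes a modest and predictable line item rather than a budget shock.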

He and a colleague, Yutong Sun, have founded a company, Quantile Health, to develop and test such a subscription-based health plan.

“Finance doesn’t always have to be a zero-sum game,” Lo said. “With the right kind of financing, scale, and business model, we can do well by doing good. And we can do it now.”

A longstanding investment

Anne White, associate provost and associate vice president for research administration, spoke next about the promise of fusion technology for providing electricity at a large scale.

“If you could harness the energy released from fusion reactions from deuterium and tritium, you could power a city the size of Boston with a pickup truck of fuel,” White said. “No greenhouse emissions, minimal waste, little land usage, small environmental impact. This is the huge promise of fusion.”

That promise, White said, is closer to realization than ever, thanks to a convergence of four essential pieces: community consensus and mature science, policy and public-private partnership, private funding, and longstanding university leadership.

MIT, she pointed out, has been significantly involved in this endeavor, in part through the continued work of researchers in the Plasma Science and Fusion Center, the development of a new superconducting magnet technology to enable smaller, faster fusion devices, and various MIT startups focused on advancing key steps of fusion technology.

“The fusion ecosystem is flourishing,” White said. “We could be 15 years away from a demonstration of electricity from fusion — a very clean, very safe, nearly unlimited source of low-carbon energy.”

“Big ideas don’t come out of thin air,” she concluded. “They come from longstanding investment in people and their ideas that might just change the world.”

Accounting for carbon

Jinhua Zhao is looking to decarbonize transportation by changing the way we behave. Zhao, who is the Edward and Joyce Linde Associate Professor of City and Transportation Planning, is also the founder and director of the MIT Mobility Initiative, where he brings together work across the Institute on transportation research, education, entrepreneurship, and civic engagement.

Zhao opened his talk with a picture of Beijing traffic today — an image that he pointed out is not too different from Boston gridlock. By contrast, he showed the same region of Beijing 40 years ago, its streets filled with pedestrians, bicyclists, and electric trolleys.

“We call this sustainable, low-carbon, active traffic — all this beautiful vocabulary,” Zhao said. “In the past 40 years, we have moved away from this paradigm” in favor of a gasoline-based infrastructure. Now, in order to move away from a gasoline economy, he says electrification is key. But so is a change in our own behavior.

“What did you have for dinner last night? How much did that cost you? How much time did it take you? How much carbon did that meal cost?” Zhao said. “Most of us would have no clue. We wouldn’t even have the order of magnitude right.”

“We want to establish money, time, and carbon as fundamental units of societal accounting,” said Zhao, who has founded a company, Tram Global, that is building a digital marketplace to reward people who take actions to reduce their daily carbon emissions.

“To change individual behavior is hard,” said Zhao. “Norms are sticky, and take about a generation to change. But maybe we can accelerate that process just enough to save our planet.”

A musical pivot

Eran Egozy, professor of the practice in music technology, was next to speak, participating via video. The co-founder of the videogame developer Harmonix took the symposium audience through the early days of the company and the development of its most popular game, “Guitar Hero,” which aimed to let every player experience the joy of making music.

Egozy, who is himself a clarinetist, remembers the painful early days of learning the instrument — an experience that defeats many beginners.

“If you can overcome this chasm, and get to the point where you derive real pleasure from playing your instrument, you can launch yourself into this beautiful world of music making,” he said.

While this vision ultimately led to a very successful product in “Guitar Hero,” Egozy said the team developed 10 games along the way that were commercial failures.

“But we learned from our mistakes, pivoted and adjusted, lived another day, and made another product or game that was a little better than the previous one,” he said.

“Guitar Hero” has since inspired many to take up actual instruments, including the rapper and singer/songwriter Post Malone and composer and guitarist Yasmin Williams. Egozy closed his talk by playing a video of Williams performing, in a twist of “karmic awesomeness,” a cover of Post Malone’s “Sunflower.”

A new twist

Pablo Jarillo-Herrero, the Cecil and Ida Green Professor of Physics, next introduced the attendees to quantum matter and the study of multiple interacting particles at the quantum scale. He pointed out that the behavior of a single particle such as an electron is well-understood. But when part of a cloud of many interacting particles, electrons collectively demonstrate entirely new and unknown behaviors that can result in “fascinating states of matter.”

Jarillo-Herrero and his colleagues have discovered many new quantum states of matter that can arise when two layers of carbon atoms, or sheets of graphene, are placed atop each other and twisted just slightly. At certain angles, the particles within the sheets can take on unexpected, exotic properties, such as superconductivity, that were not possible with each sheet separately.
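The role of the twist angle can be made concrete with textbook moiré geometry (a standard estimate, not a detail reported in the article): two identical lattices rotated by a small angle theta produce a superlattice with period roughly a / (2 sin(theta/2)), where a is the atomic lattice constant.

```python
import math

# Back-of-the-envelope moiré geometry for twisted bilayer graphene.
# a_nm is graphene's lattice constant, about 0.246 nm.

def moire_period_nm(twist_deg: float, a_nm: float = 0.246) -> float:
    theta = math.radians(twist_deg)
    return a_nm / (2 * math.sin(theta / 2))

# Near the "magic angle" of about 1.1 degrees, the moiré period swells to
# roughly 13 nm, dozens of times the atomic spacing -- the long length scale
# on which the new collective electron behavior emerges.
print(f"{moire_period_nm(1.1):.1f} nm")  # 12.8 nm
```

The smaller the twist, the longer the moiré period, which is why tiny changes of angle have such outsized effects on the electronic states.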

“With this new platform, we have realized all of the phases of quantum matter known in nature, and a few new ones,” Jarillo-Herrero said. “We don’t understand why this happens, but there’s hope that if we can, we can design new technologies, such as better magnets for fusion technology.”

A radio world

Dina Katabi, the Thuan and Nicole Pham Professor of Electrical Engineering and Computer Science, is harnessing the behavior of radio signals to continuously and noninvasively read a patient’s health status.

Rather than hook patients up to electrodes and monitors to track their heartbeats, brain activity, and breathing, she says that wireless devices that emit radio signals could be sensitive enough to track a patient’s vitals. Her group has developed devices that they are currently using to collect data from patients with Parkinson’s disease, Alzheimer’s disease, Crohn’s disease, and Covid-19, and are using artificial intelligence and machine learning techniques to decipher meaningful patterns in the data.

For Parkinson’s in particular, they are hoping the radio-based monitoring could help to diagnose the disease much earlier in its progression. For most patients, a diagnosis comes only after motor symptoms such as tremors and stiffness become apparent. But, Katabi noted, even James Parkinson, the first to describe the disease’s symptoms, observed changes in breathing early in its course.

Katabi wondered: Could Parkinson’s be diagnosed much earlier through changes in a patient’s breathing? She and her colleagues are gathering data from patients, and developing AI methods to pick up on meaningful patterns in breathing that could enable an early diagnosis.

“We envision this being in every home, like smoke detectors,” says Katabi, who proposed that devices could be programmed to check for signs of various diseases, conditions, and behaviors. “We could write applications to check on Grandma if she’s taking her medications,” she offered.

Iterative invention

Sangeeta Bhatia, the John J. and Dorothy Wilson Professor of Health Sciences and Technology and of Electrical Engineering and Computer Science, emphasized the importance of “convergence” in cultivating new inventions. Her group combines the fields of nanotechnology and medicine to develop new therapies for patients with cancer by harnessing the unique properties of materials at the nanoscale.

“Just as these materials have unique physical properties at nanoscale, they also have unique biological properties,” Bhatia said. “How they traffic inside the body changes with their size. So, you can design materials that speak the language of biology.”

To her point, Bhatia’s group has recently developed an ultrasensitive sensor in the form of particles that are each 1/1,000 the width of a human hair and that can travel through a patient’s bloodstream and detect the presence of cancer. The particles are small enough to pass through the kidney and out of the body, making the sensor “completely noninvasive.”

The team set up a company, Glympse Bio, to advance the technology, and has explored a variety of methods to deliver the sensors, including as a urine test akin to a pregnancy test, and an oral test similar to a breathalyzer.

“Invention is an iterative process,” Bhatia said. “I imagine it’s like writing a song. You start in one direction and make it up as you go. Invention begets invention.”

Taking note

The symposium closed out with a roundtable discussion moderated by Boston Globe Media CEO Linda Pizzuti Henry SM ’05, who asked the group to offer advice for Kornbluth as she takes on the MIT presidency.

“Climate change is a big one,” White said. “We will have to pull from every discipline, every department, will have to inspire people through art and music to think about the planet, and combine life sciences with computing to connect back to the climate in inventive ways that we haven’t even thought of. So, keep the imagination going in thinking about climate change.”

“It’s important for our future that we are more inclusive than we have been historically,” Bhatia added. She noted that there are 130 startups with roots at the Koch Institute, but that she and her colleagues found there could have been 40 more, because women at MIT were founding life-sciences companies at disproportionately low rates. “If we are really going to meet the future, we have to make sure we get all the minds at the table.”

For Jarillo-Herrero, support of basic research was foremost, notably for research in quantum science, which could ultimately impact everything from computing to environmental sensing to health care monitoring.

“When you start exploring things you don’t understand, something could happen that could change the world,” he said.

“Think big,” Lo added. “Because we are now at an inflection point in society in dealing with difficult challenges, and if we don’t deal with it successfully, there won’t be future generations. I think MIT is the perfect place, we are poised to have that impact.”

Ghobadi named ACM-W Rising Star

The Association for Computing Machinery (ACM) has presented Manya Ghobadi, MIT EECS associate professor and CSAIL principal investigator, with the ACM-W Rising Star Award for her impactful research on computer networks. Each year, the ACM honors a woman who has made significant contributions to the discipline early in her career.

Ghobadi’s current research centers on building efficient network infrastructures that optimize resource use and energy consumption while maintaining high availability. A key aspect of her work is enabling physical-layer reconfigurability in modern networks to achieve high throughput, low latency, and fast recovery from failures. Ghobadi uses optical devices and other advanced hardware to develop architectures, algorithms, and protocols that make the most of systems and network infrastructure.

Ghobadi’s recent research on efficient systems for machine learning exemplifies her “rising star” status. In a series of technical innovations, Ghobadi’s research group proposed several practical solutions to improve the efficiency of machine-learning datacenters. Her group showed that distributed deep neural network (DNN) training workloads do not satisfy common assumptions about datacenter traffic, making today’s networks a bottleneck for large DNN training jobs. To address this challenge, her team proposed TopoOpt, a reconfigurable optical datacenter for distributed DNN training. TopoOpt co-optimizes the distributed training process across three dimensions (computation, communication, and network topology), using an alternating optimization technique and Euler’s totient function.
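As a rough sketch of where the totient function comes in (our reading of the general number-theoretic idea, not the authors’ code): among n nodes, the stride-s permutation i → (i + s) mod n visits every node in a single ring exactly when gcd(s, n) = 1, and Euler’s totient φ(n) counts how many such candidate ring permutations exist.

```python
from math import gcd

# Euler's totient: the number of integers in 1..n coprime to n.
def totient(n: int) -> int:
    return sum(1 for s in range(1, n + 1) if gcd(s, n) == 1)

# Strides whose permutation i -> (i + s) mod n forms one full ring over
# all n nodes; these are candidate topologies a search can choose among.
def ring_strides(n: int) -> list[int]:
    return [s for s in range(1, n) if gcd(s, n) == 1]

print(totient(8), ring_strides(8))  # 4 [1, 3, 5, 7]
```

Enumerating only these coprime strides keeps the topology search over valid single-ring permutations, rather than over all n! orderings.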

Ghobadi’s work has inspired sophisticated and adaptable algorithms for modern network applications and datacenter environments. Last year, her team demonstrated the flaws of fair-sharing, a “holy grail” of congestion control algorithms, for machine learning clusters. The researchers proposed a novel scheduler that automatically interleaves jobs on network links, reducing the number of congestion events by an order of magnitude and showing the importance of re-evaluating traditional congestion control paradigms.

“I’m deeply honored and humbled by this recognition,” says Ghobadi. “My journey in computer science and technology began during my undergraduate studies at Sharif University of Technology in Tehran, Iran, where I pursued a Bachelor’s degree in Computer Engineering. It was during this time that I developed a strong passion for technology, driven by the potential of using computation to make a positive impact on society.”

The ACM-W Rising Star Award was first given out in 2020 to MIT EECS associate professor and CSAIL principal investigator Vivienne Sze, whose innovative work includes accelerators that assist with power reduction in computer vision and deep learning. As the 2023 recipient, Ghobadi has made significant contributions to the systems and networking domain early on in her career, developing technologies now utilized by Microsoft, Google, Meta, and Juniper Networks in their real-world systems.

Ghobadi has been recognized as a promising innovator before, having won the SIGCOMM Rising Star Award this past October. She has also received a Sloan Fellowship in Computer Science, an NSF Career Award, the first Optica Simmons Memorial Speakership, and best paper awards at the Conference on Machine Learning and Systems (MLSys) and ACM Internet Measurement Conference (IMC).

Drones navigate unseen environments with liquid neural networks

In the vast, expansive skies where birds once ruled supreme, a new crop of aviators is taking flight. These pioneers of the air are not living creatures, but rather a product of deliberate innovation: drones. But these aren’t your typical flying bots, humming around like mechanical bees. Rather, they’re avian-inspired marvels that soar through the sky, guided by liquid neural networks to navigate ever-changing and unseen environments with precision and ease.

Inspired by the adaptable nature of organic brains, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have introduced a method for robust flight navigation agents to master vision-based fly-to-target tasks in intricate, unfamiliar environments. The liquid neural networks, which can continuously adapt to new data inputs, showed prowess in making reliable decisions in unknown domains like forests, urban landscapes, and environments with added noise, rotation, and occlusion. These adaptable models, which outperformed many state-of-the-art counterparts in navigation tasks, could enable potential real-world drone applications like search and rescue, delivery, and wildlife monitoring.

The researchers’ recent study, published today in Science Robotics, details how this new breed of agents can adapt to significant distribution shifts, a long-standing challenge in the field. The team’s new class of machine-learning algorithms captures the causal structure of tasks from high-dimensional, unstructured data, such as pixel inputs from a drone-mounted camera. These networks can then extract crucial aspects of a task (i.e., understand the task at hand) and ignore irrelevant features, allowing the navigation skills they acquire to transfer seamlessly to new environments.

“We are thrilled by the immense potential of our learning-based control approach for robots, as it lays the groundwork for solving problems that arise when training in one environment and deploying in a completely distinct environment without additional training,” says Daniela Rus, CSAIL director and the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT. “Our experiments demonstrate that we can effectively teach a drone to locate an object in a forest during summer, and then deploy the model in winter, with vastly different surroundings, or even in urban settings, with varied tasks such as seeking and following. This adaptability is made possible by the causal underpinnings of our solutions. These flexible algorithms could one day aid in decision-making based on data streams that change over time, such as medical diagnosis and autonomous driving applications.”

A daunting challenge was at the forefront: Do machine-learning systems understand, from data alone, the task they are given when flying drones toward an unlabeled object? And would they be able to transfer their learned skill and task to new environments with drastic changes in scenery, such as flying from a forest to an urban landscape? What’s more, unlike the remarkable abilities of our biological brains, deep learning systems struggle with capturing causality, frequently overfitting their training data and failing to adapt to new environments or changing conditions. This is especially troubling for resource-limited embedded systems, like aerial drones, that need to traverse varied environments and respond to obstacles instantaneously.

The liquid networks, in contrast, offer promising preliminary indications of their capacity to address this crucial weakness in deep learning systems. The team first trained its system on data collected by a human pilot, to see how the learned navigation skills transferred to new environments under drastic changes in scenery and conditions. Unlike traditional neural networks, which only learn during the training phase, a liquid neural network’s parameters can change over time, making it not only interpretable, but also more resilient to unexpected or noisy data.
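A minimal single-neuron sketch conveys what “parameters can change over time” means here. This is our simplified illustration of a liquid time-constant (LTC) cell, not the authors’ implementation: the cell’s effective time constant depends on the input, so its dynamics, not just its output, adapt to the data it sees.

```python
import math

# One liquid time-constant (LTC) neuron, integrated with explicit Euler:
#   dx/dt = -x / tau + f(w * u + b) * (A - x)
# The gate f depends on the input u, so the rate at which the state x
# relaxes -- its effective time constant -- changes with the data stream.

def ltc_step(x: float, u: float, dt: float = 0.01,
             tau: float = 1.0, w: float = 1.0, b: float = 0.0,
             A: float = 1.0) -> float:
    f = math.tanh(w * u + b)       # input-dependent gate
    dxdt = -x / tau + f * (A - x)  # leak term plus gated drive toward A
    return x + dt * dxdt           # one Euler update of the ODE state

# Drive the neuron with a constant input; the state settles between 0 and A,
# and how fast it settles depends on the input itself.
x = 0.0
for _ in range(1000):
    x = ltc_step(x, u=2.0)
print(round(x, 3))
```

Because the state evolves as an ODE whose coefficients are input-dependent, the cell keeps adapting at inference time, which is the property the CSAIL team credits for resilience to unseen environments.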

In a series of quadrotor closed-loop control experiments, the drones underwent range tests, stress tests, target rotation and occlusion, hiking with adversaries, triangular loops between objects, and dynamic target tracking. They tracked moving targets and executed multi-step loops between objects in never-before-seen environments, surpassing the performance of other cutting-edge counterparts.

The team believes that the ability to learn from limited expert data and understand a given task while generalizing to new environments could make autonomous drone deployment more efficient, cost-effective, and reliable. Liquid neural networks, they noted, could enable autonomous air mobility drones to be used for environmental monitoring, package delivery, autonomous vehicles, and robotic assistants. 

“The experimental setup presented in our work tests the reasoning capabilities of various deep learning systems in controlled and straightforward scenarios,” says MIT CSAIL Research Affiliate Ramin Hasani. “There is still so much room left for future research and development on more complex reasoning challenges for AI systems in autonomous navigation applications, which has to be tested before we can safely deploy them in our society.”

“Robust learning and performance in out-of-distribution tasks and scenarios are some of the key problems that machine learning and autonomous robotic systems have to conquer to make further inroads in society-critical applications,” says Alessio Lomuscio, professor of AI safety in the Department of Computing at Imperial College London. “In this context, the performance of liquid neural networks, a novel brain-inspired paradigm developed by the authors at MIT, reported in this study is remarkable. If these results are confirmed in other experiments, the paradigm here developed will contribute to making AI and robotic systems more reliable, robust, and efficient.”

Clearly, the sky is no longer the limit, but rather a vast playground for the boundless possibilities of these airborne marvels. 

Hasani and PhD student Makram Chahine; Patrick Kao ’22, MEng ’22; and PhD student Aaron Ray SM ’21 wrote the paper with Ryan Shubert ’20, MEng ’22; MIT postdocs Mathias Lechner and Alexander Amini; and Rus.

This research was supported, in part, by Schmidt Futures, the U.S. Air Force Research Laboratory, the U.S. Air Force Artificial Intelligence Accelerator, and the Boeing Co.

Moving perovskite advancements from the lab to the manufacturing floor

Tandem solar cells are made of stacked materials — such as silicon paired with perovskites — that together absorb more of the solar spectrum than single materials can, dramatically increasing efficiency. Their potential to generate significantly more power than conventional cells could make a meaningful difference in the race to combat climate change and transition to a clean-energy future.

However, current methods to create stable and efficient perovskite layers require time-consuming, painstaking rounds of design iteration and testing, inhibiting their development for commercial use. Today, the U.S. Department of Energy Solar Energy Technologies Office (SETO) announced that MIT has been selected to receive an $11.25 million cost-shared award to establish a new research center to address this challenge by using a co-optimization framework guided by machine learning and automation.

A collaborative effort with lead industry participant CubicPV, solar startup Verde Technologies, and academic partners Princeton University and the University of California San Diego (UC San Diego), the center will bring together teams of researchers to support the creation of perovskite-silicon tandem solar modules that are co-designed for both stability and performance, with goals to significantly accelerate R&D and the transfer of these achievements into commercial environments.

“Urgent challenges demand rapid action. This center will accelerate the development of tandem solar modules by bringing academia and industry into closer partnership,” says MIT professor of mechanical engineering Tonio Buonassisi, who will direct the center. “We’re grateful to the Department of Energy for supporting this powerful new model and excited to get to work.”

Adam Lorenz, CTO of solar energy technology company CubicPV, stresses the importance of thinking about scale, alongside quality and efficiency, to accelerate the perovskite effort into the commercial environment. “Instead of chasing record efficiencies with tiny pixel-sized devices and later attempting to stabilize them, we will simultaneously target stability, reproducibility, and efficiency,” he says. “It’s a module-centric approach that creates a direct channel for R&D advancements into industry.”

The center will be named Accelerated Co-Design of Durable, Reproducible, and Efficient Perovskite Tandems, or ADDEPT. The grant will be administered through the MIT Research Laboratory for Electronics (RLE).

David Fenning, associate professor of nanoengineering at UC San Diego, has worked with Buonassisi since 2014 on the idea of merging materials, automation, and computation, specifically at the intersection of artificial intelligence and solar. Now, a central thrust of the ADDEPT project will be to deploy machine learning and robotic screening to optimize the processing of perovskite-based solar materials for efficiency and durability.

“We have already seen early indications of successful technology transfer between our UC San Diego robot PASCAL and industry,” says Fenning. “With this new center, we will bring research labs and the emerging perovskite industry together to improve reproducibility and reduce time to market.”

“Our generation has an obligation to work collaboratively in the fight against climate change,” says Skylar Bagdon, CEO of Verde Technologies, which received the American-Made Perovskite Startup Prize. “Throughout the course of this center, Verde will do everything in our power to help this brilliant team transition lab-scale breakthroughs into the world where they can have an impact.”

Several of the academic partners echoed the importance of the joint effort between academia and industry. Barry Rand, professor of electrical and computer engineering at the Andlinger Center for Energy and the Environment at Princeton University, pointed to the intersection of scientific knowledge and market awareness. “Understanding how chemistry affects films and interfaces will empower us to co-design for stability and performance,” he says. “The center will accelerate this use-inspired science, with close guidance from our end customers, the industry partners.”

A critical resource for the center will be MIT.nano, a 200,000-square-foot research facility set in the heart of the campus. MIT.nano Director Vladimir Bulović, the Fariborz Maseeh (1990) Professor of Emerging Technology, says he envisions MIT.nano as a hub for industry and academic partners, facilitating technology development and transfer through shared lab space, open-access equipment, and streamlined intellectual property frameworks.

“MIT has a history of groundbreaking innovation using perovskite materials for solar applications,” says Bulović. “We’re thrilled to help build on that history by anchoring ADDEPT at MIT.nano and working to help the nation advance the future of these promising materials.”

MIT was selected as a part of the SETO Fiscal Year 2022 Photovoltaics (PV) funding program, an effort to reduce costs and supply chain vulnerabilities, further develop durable and recyclable solar technologies, and advance perovskite PV technologies toward commercialization. ADDEPT is one project that will tackle perovskite durability, which will extend module life. The overarching goal of these projects is to lower the levelized cost of electricity generated by PV.

Research groups involved with the ADDEPT project at MIT include Buonassisi’s Accelerated Materials Laboratory for Sustainability (AMLS), Bulović’s Organic and Nanostructured Electronics (ONE) Lab, and the Bawendi Group led by Moungi Bawendi, the Lester Wolfe Professor of Chemistry. Also working on the project is Jeremiah Mwaura, research scientist in the ONE Lab.