Learning to grow machine-learning models
New LiGO technique accelerates training of large machine-learning models, reducing the monetary and environmental cost of developing AI applications.
Efficient technique improves machine-learning models’ reliability
The method enables a model to determine its confidence in a prediction, while using no additional data and far fewer computing resources than other methods.
Putting a new spin on computer hardware
Luqiao Liu uses a quantum property known as electron spin to build low-power, high-performance computer memories and programmable computer chips.
A faster way to preserve privacy online
New research enables users to search for information without revealing their queries, based on a method that is 30 times faster than comparable prior techniques.
In some cases, models trained on synthetic data can be more accurate than models trained on real data, which could sidestep some of the privacy, copyright, and ethical concerns associated with using real data.
Expanding the MIT-IBM Watson AI Lab’s network of neurons
On October 6, nearly 50 undergraduate and graduate students and postdocs, primarily from MIT, attended the MIT-IBM Watson AI Lab’s networking event. The goal was to connect young…
Learning on the edge
A new technique enables AI models to continually learn from new data on intelligent edge devices like smartphones and sensors, reducing energy costs and privacy risks.
New hardware offers faster computation for artificial intelligence, with much less energy
Engineers working on “analog deep learning” have found a way to propel protons through solids at unprecedented speeds.
Student-powered machine learning
Recent MEng graduates reflect on their application-focused research as affiliates of the MIT-IBM Watson AI Lab.
On the road to cleaner, greener, and faster driving
Researchers use artificial intelligence to help autonomous vehicles avoid idling at red lights.