
Solving the “Whac-a-mole dilemma”: A smarter way to debias AI vision models

May 1, 2026

A new debiasing technique called WRING avoids creating or amplifying biases that can occur with existing debiasing approaches.

A faster way to estimate AI power consumption

April 29, 2026

The “EnergAIzer” method generates reliable results in seconds, enabling data center operators to efficiently allocate resources and reduce wasted energy.

Self-organizing “pencil beam” laser could help scientists design brain-targeted therapies

April 29, 2026

MIT researchers leveraged a surprise discovery to devise a faster and more precise biomedical imaging technique.

Teaching AI models to say “I’m not sure”

April 24, 2026

A new training method improves the reliability of AI confidence estimates without sacrificing performance, addressing a root cause of hallucination in reasoning models.

New chip can protect wireless biomedical devices from quantum attacks

April 24, 2026

Ultra-efficient chip design enables extremely strong cryptography algorithms to run on energy-constrained edge devices.

Jacob Andreas and Brett McGuire named Edgerton Award winners

April 17, 2026

The associate professors of EECS and chemistry, respectively, are honored for exceptional contributions to teaching, research, and service at MIT.

[Photo: Vinod Vaikuntanathan teaching Advanced Topics in Cryptography: Learning with Errors and Post-Quantum Cryptography (Course 6.876J) in 2018.]

Vinod Vaikuntanathan earns 2026 Guggenheim Fellowship

April 14, 2026

The fellowship is awarded annually to leading thinkers, innovators, and creators across art, science, and scholarship in support of work on pressing contemporary issues.

Helping data centers deliver higher performance with less hardware

April 14, 2026

Researchers developed a system that intelligently balances workloads to improve the efficiency of flash storage hardware in a data center.

Jack Dennis, professor emeritus of computer science and engineering, dies at 94

April 9, 2026

The first leader of the Computation Structures Group, he pioneered the development of dataflow models of computation.

New technique makes AI models leaner and faster while they’re still learning

April 9, 2026

Researchers use control theory to shed unnecessary complexity from AI models during training, cutting compute costs without sacrificing performance.