II - Computer Science (Theory)

  • We study several natural problems in which an unknown distribution over an unknown population of vectors needs to be recovered from partial or noisy samples. Such problems naturally arise in a variety of contexts in learning, clustering, statistics, data mining and database privacy, where loss and error may be introduced by nature, inaccurate measurements, or on purpose.
  • Applying random linear projections, a.k.a. "sketching", has become a standard technique for analyzing high-dimensional data sets. The resulting algorithms are embarrassingly parallel and well suited to stream processing (a minimal illustration appears after this list).
  • In his seminal paper, Myerson [1981] provides a revenue-optimal auction for a seller looking to sell a single item to multiple bidders. Extending this auction to the simultaneous sale of multiple heterogeneous items has been one of the central problems in Mathematical Economics. We provide such an extension that is also computationally efficient (a toy illustration of the single-item case appears after this list).
  • We solve a 20-year-old problem posed by Yannakakis and prove that there exists no polynomial-size linear program (LP) whose associated polytope projects to the traveling salesman polytope, even if the LP is not required to be symmetric.
  • Shafi Goldwasser, the RSA Professor of Electrical Engineering and Computer Science at MIT and a principal investigator at the Computer Science and Artificial Intelligence Laboratory, is among three MIT faculty selected as Simons Investigators by the Simons Foundation.
  • Bonnie Berger, professor of applied math and computer science at MIT and head of the Computation and Biology Group, together with former and current students, has developed an algorithm that allows researchers to search huge amounts of data in genome databases efficiently, even as the rate of genome sequencing threatens to outpace researchers' ability to analyze the added data.
  • Daskalakis and students use game theory to tackle a 30-year-old economics problem, extending a Nobel winner’s work on single-item auctions to auctions involving multiple items.
  • The goal of this project is to develop powerful algorithmic sampling techniques that allow one to estimate parameters of the data by viewing only a minuscule portion of it (a toy estimator appears after this list). Such parameters may be combinatorial, such as whether a large network has the "six degrees of separation" property; algebraic, such as whether the data is well approximated by a linear function; or even distributional, such as whether the data comes from a distribution over a large number of distinct elements.
  • At a major Computer Science and Artificial Intelligence Laboratory event held May 31, 2012 at the MIT Stata Center, MIT president Susan Hockfield announced a new CSAIL initiative known as "bigdata@CSAIL", and Intel CTO Justin Rattner announced that MIT would house a new Intel research center focused on techniques for the science and engineering of big data -- the huge amounts of information generated by Web users and networked sensors. In addition, Massachusetts Gov. Deval Patrick announced the Massachusetts Big Data Initiative to investigate how big-data technologies can improve government.
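
The sketching item above can be made concrete with a minimal example. This is a hedged sketch, not any group's actual pipeline: it assumes a dense Gaussian sketch matrix and synthetic data, whereas practical systems often use sparse or structured sketches (e.g., Count-Sketch) for speed.

```python
# Minimal sketch of dimensionality reduction via a random linear projection.
# Assumptions: dense Gaussian sketch matrix, synthetic data; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 10_000, 200, 50           # ambient dim, sketch dim, number of points
X = rng.standard_normal((n, d))     # high-dimensional data set

# Gaussian sketch: entries i.i.d. N(0, 1/k), so E||Sx||^2 = ||x||^2.
S = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ S.T                         # each row is a k-dimensional sketch

# Pairwise distances are preserved up to a (1 +/- eps) factor w.h.p.
i, j = 3, 17
print(np.linalg.norm(X[i] - X[j]), np.linalg.norm(Y[i] - Y[j]))
```

Because the map x -> Sx is linear, a sketch can be updated incrementally as stream items arrive, and partial sketches computed on different machines can simply be summed, which is why such algorithms parallelize so easily.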
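
For context on the auctions item above, here is a toy illustration of Myerson's classical single-item result, not the multi-item extension the bullet describes. For i.i.d. bidders with values uniform on [0, 1], the virtual value is phi(v) = v - (1 - F(v))/f(v) = 2v - 1, so the revenue-optimal auction is a second-price auction with reserve price 1/2; the simulation parameters below are illustrative.

```python
# Toy simulation of Myerson's optimal single-item auction for two bidders
# with values drawn i.i.d. from U[0, 1]; the optimal reserve price is 1/2.
import random

def second_price_with_reserve(values, reserve):
    """Revenue of a second-price auction with a reserve price."""
    top, runner_up = sorted(values, reverse=True)[:2]
    if top < reserve:
        return 0.0                      # no sale: highest bid below reserve
    return max(reserve, runner_up)      # winner pays reserve or runner-up bid

trials = 200_000
for r in (0.0, 0.5):
    avg = sum(second_price_with_reserve([random.random(), random.random()], r)
              for _ in range(trials)) / trials
    print(f"reserve {r}: average revenue {avg:.3f}")   # ~1/3 vs. ~5/12
```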
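
The sampling project above rests on the fact that many parameters can be estimated from a sample whose size depends only on the desired accuracy, not on the size of the data. The following toy estimator (the predicate and parameters are hypothetical) uses a standard Hoeffding bound to pick the sample size:

```python
# Estimate the fraction of items satisfying a predicate to within additive
# error eps, with failure probability at most delta (Hoeffding bound).
import math, random

def estimate_fraction(data, predicate, eps=0.05, delta=0.01):
    m = math.ceil(math.log(2 / delta) / (2 * eps ** 2))  # sample size
    sample = random.choices(data, k=m)                   # sample with replacement
    return sum(predicate(x) for x in sample) / m

data = list(range(10_000_000))
print(estimate_fraction(data, lambda x: x % 2 == 0))     # ~0.5 from ~1,060 samples
```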
