Swarthmore College Department of Computer Science

Talk by Ben Vigoda

Building machine learning with analog, continuous-time computers
Friday, September 10, 2004
3pm in Science Center 240

Abstract

Could an entirely different theory of computation enable us to build ever more powerful computers? In the past, to make more powerful computers, we have continued to scale down to smaller and smaller transistors. But as transistors shrink, they increasingly exhibit their inherent physical dynamics, which are not digital but statistical, analog, and continuous-time.

As computer scaling begins to reach fundamental physical limits in the next few years, rather than intentionally suppressing these physical dynamics, we must find a way to embrace them. A principled approach is to view circuits as propagating probabilities in a message-passing algorithm on a graph. Using this alternative model of computation, we can create robust, programmable, high-speed, low-power "analog" computers that compute a broad class of machine learning algorithms, with immediate applications to communications systems and fault-tolerant logic architectures.

Biography

Benjamin Vigoda was an Intel Student Fellow at the MIT Media Laboratory in the Center for Bits and Atoms. In his PhD research he developed a new set of analog VLSI (AVLSI) circuits for implementing computation by belief propagation. More generally, his research has looked at various aspects of the statistics of nonlinear distributed systems. Ben earned his undergraduate degree in physics from Swarthmore College in 1996. He has worked at the Santa Fe Institute on alternative models of computation, and at Hewlett-Packard Labs, where he helped transfer academic research to product divisions. He is now a researcher at Mitsubishi Electric Research Labs and a visiting scientist at the MIT Media Lab.