Research Interests

With my students, I do research in the areas of robotics and computer vision, which present some of the most exciting challenges to anyone interested in
artificial intelligence. I am especially keen on Bayesian inference approaches to the difficult inverse problems that keep popping up in these areas. In many cases, exact solutions
to these problems are intractable, so we investigate whether Monte Carlo (sampling-based) approximations can be applied instead. We think they can.
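To make the idea concrete, here is a minimal sketch of a Monte Carlo approximation to Bayesian inference, on a hypothetical toy inverse problem (not one from my research): inferring a scalar x from a noisy observation y, where the exact posterior happens to have a closed form we can check against. The problem setup, function names, and parameters are all illustrative assumptions.

```python
import math
import random

# Hypothetical toy inverse problem: infer x from a noisy observation y.
# Prior: x ~ N(0, 1); likelihood: y | x ~ N(x, sigma^2) with sigma = 0.5.
# Because everything is Gaussian, the exact posterior mean is known,
# so we can sanity-check the sampling-based approximation.

def log_likelihood(x, y, sigma=0.5):
    # Gaussian log-likelihood of observing y given state x (constant dropped).
    return -0.5 * ((y - x) / sigma) ** 2

def posterior_mean_mc(y, n=100_000, seed=0):
    """Self-normalized importance sampling, using the prior as the proposal."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)             # sample from the prior
        w = math.exp(log_likelihood(x, y))  # importance weight = likelihood
        num += w * x
        den += w
    return num / den

y = 1.2
exact = y / (1.0 + 0.5 ** 2)   # conjugate closed form: 0.96
approx = posterior_mean_mc(y)  # Monte Carlo estimate, close to exact
```

The same pattern, sampling from a tractable distribution and reweighting by the likelihood, scales to inverse problems where no closed form exists, which is exactly the intractable regime the text refers to.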

Since coming to Georgia Tech I have explored the theme of probabilistic, model-based reasoning paired with randomized approximation methods in three main research areas:

Detailed Research Statement

My scientific interests are driven by the vision that new ways of computing will enable us to tackle problems of unprecedented scale in the coming decades. Many important
open problems hinge on our ability to make sense of vast amounts of data, generated by an explosively growing number of digital interfaces to the physical world. I am primarily driven by
such problems in the area of robotics and especially computer vision, as cameras are by far the highest bandwidth sensors that interface robots and computers to the real world. And in
computer vision and robotics, I am especially attracted to problems where there is more of everything: more robots, more cameras, more world to interface with. How can we make sense of a
year's worth of recorded behavior inside a beehive? How could we model the evolution of a city from tens of thousands of historical images? What if we insert one hundred robots in an
unknown environment with the goal to model it? Increasing computing power is inadequate by itself to tackle questions of this magnitude. The answer lies in the development of new computing
paradigms that are especially attuned to deal with problems of this nature.

In light of this vision, my research focuses on developing computationally efficient algorithms to construct models of physical phenomena from massive amounts of noisy,
ambiguous data. It is my firm belief that this should be done by building algorithms on a strong theoretical foundation, where explicit assumptions and approximations guide the search for
efficiency. In my view, the most fruitful theoretical framework for problems of this type is probability theory, which directly accounts for the imperfect nature of the data.
However, data does not exist in a vacuum: there is considerable expert domain knowledge that provides a context for how the data came into existence. Thus, a key to efficient algorithms is
the development of representations that exploit this knowledge. Finally, whereas exact probabilistic reasoning is often prohibitively expensive, one can devise theoretically sound
approximation algorithms that come at a fraction of the cost. Hence, my research deals with finding novel solutions in the following three directions: