A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, next-generation computing should be built around anticipatory user interfaces that are human-centered, designed for humans on the basis of models of human behavior. Such interfaces should transcend the traditional keyboard and mouse to include natural, humanlike interactive functions, including the ability to understand and emulate human behaviors such as affective and social signaling. This article discusses how far we are from enabling computers to understand human behavior.