Both direct and evolved behavior-based approaches to mobile robots have yielded a number of interesting demonstrations of robots that navigate, map, plan, and operate in the real world. The work can best be described as attempts to emulate insect-level locomotion and navigation, with very little work on behavior-based, non-trivial manipulation of the world. There have been some behavior-based attempts at exploring social interactions, but these too have been modeled after the sorts of social interactions we see in insects. Thinking about how to scale from all this insect-level work to full human-level intelligence and social interaction, however, leads to a synthesis that is very different from that imagined in traditional Artificial Intelligence and Cognitive Science. We report on work towards that goal.