To truly understand language, an intelligent system must be able to connect
words, phrases, and sentences to its perception of objects and events in the
world. Ideally, an AI system would be able to learn language like a human
child, by being exposed to utterances in a rich perceptual environment. The
perceptual context would provide the necessary supervisory information, and
learning the connection between language and perception would ground the
system's semantic representations in its perception of the world. As a step in
this direction, we are developing systems that learn semantic parsers and
language generators from sentences paired only with their perceptual context.
This project is part of our broader research on natural language learning and
is supported by National Science Foundation grants IIS-0712097 and
IIS-1016312.

Sonal Gupta, Joohyun Kim, Kristen Grauman, and Raymond Mooney. In Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD), pp. 457--472, Antwerp, Belgium, September 2008.