Computers Learning Like Humans Through NLP

Northwestern University reports, “Someday we might be able to build software for our computers simply by talking to them. Ken Forbus, Walter P. Murphy Professor of Electrical Engineering and Computer Science, and his team have developed a program that allows them to teach computers as you would teach a small child—through natural language and sketching. Called Companion Cognitive Systems, the architecture has the capacity for human-like learning. ‘We think software needs to evolve to be more like people,’ Forbus says. ‘We’re trying to understand human minds by trying to build them.’ Forbus has spent his career working in the area of artificial intelligence, creating computer programs and simulations in an attempt to understand how the human mind works. At the heart of the Companions project is the claim that much of human learning is based on analogy. When presented with a new problem or situation, the mind identifies similar past experiences to form an action. This allows us to build upon our own knowledge with the ability to continually adapt and learn.”
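The retrieval step in analogy-based learning can be sketched in miniature. The toy Python below is not the Companions architecture itself; it only illustrates the idea of matching a new situation against stored experiences by comparing their surface content (the predicate names, case data, and scoring scheme are all invented for the example):

```python
from collections import Counter

def content_vector(case):
    """Flat bag of predicate names -- a crude stand-in for a
    richer structural representation of an experience."""
    return Counter(pred for (pred, *_args) in case)

def overlap_score(probe, memory_case):
    """Dot product of content vectors: a cheap first-pass similarity."""
    pv, mv = content_vector(probe), content_vector(memory_case)
    return sum(pv[k] * mv[k] for k in pv)

def retrieve(probe, memory):
    """Return the stored experience whose content best matches the probe."""
    return max(memory, key=lambda case: overlap_score(probe, case))

# Two remembered situations, expressed as predicate tuples (invented data).
memory = [
    [("player", "X"), ("marks", "X", "corner"), ("wins", "X")],
    [("unit", "settler"), ("builds", "settler", "city")],
]

# A new tic-tac-toe situation recalls the tic-tac-toe memory, not the
# civilization-building one, because its predicates overlap more.
probe = [("player", "O"), ("marks", "O", "center")]
print(retrieve(probe, memory))
```

The real system reportedly goes much further, mapping the *structure* of past experiences onto the new one, but even this flat comparison shows why accumulated experiences make the next retrieval more informative.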

The article continues, “In one line of experiments, Forbus and his team teach the Companion how to play games, including tic-tac-toe and Freeciv, a strategy game in which players build a civilization. For tic-tac-toe, the user starts by introducing the game using natural language, letting the Companion know it’s a two-player marking game. Next the user sketches the three-by-three game board. The user introduces the idea of players by saying things like ‘X is a player’ and drawing an X. Game play is also described in natural language, such as ‘X goes first,’ ‘X and O take turns marking empty squares,’ and so on. Through demonstration, the user shows the Companion how to win, teaching more of the rules along the way.”
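The statements quoted above suggest how natural-language instruction could be turned into game rules. As a hedged illustration only (the pattern set and fact format below are invented, not Forbus's actual interpreter), a minimal sketch might map each teaching sentence to a predicate-style fact:

```python
import re

# Hypothetical mini-grammar covering the kinds of statements quoted above.
PATTERNS = [
    (re.compile(r"(\w+) is a player"),          lambda m: ("player", m.group(1))),
    (re.compile(r"(\w+) goes first"),           lambda m: ("first-mover", m.group(1))),
    (re.compile(r"(\w+) and (\w+) take turns"), lambda m: ("alternates", m.group(1), m.group(2))),
]

def interpret(sentence):
    """Map one instruction sentence to a fact tuple, or None if unrecognized."""
    for pattern, build in PATTERNS:
        m = pattern.search(sentence)
        if m:
            return build(m)
    return None

lessons = [
    "X is a player",
    "O is a player",
    "X goes first",
    "X and O take turns marking empty squares",
]
rules = [fact for s in lessons if (fact := interpret(s))]
print(rules)
```

A system teaching itself from demonstration would then check observed moves against rules accumulated this way, refining them when the teacher corrects a misinterpretation.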