Humans, Do You Speak !~+V•&T1F0()?

Software that will let people and robots communicate and plan difficult, complex tasks, such as dismantling a nuclear power plant, is under development at the University of Aberdeen, Scotland. It will translate symbols of mathematical logic into text and vice versa, so humans and robots can hold a two-way conversation, each in its own language.

Researchers at the university's School of Natural and Computing Sciences expect their technology to be used in several industries. These range from unmanned exploration of hostile environments, such as the deep sea or the Martian surface, to more mundane tasks, such as maintaining and repairing railway lines.

Software that will let people and robots communicate to plan difficult and complex tasks, such as dismantling a nuclear power plant, is being developed at a Scottish university. (Source: Wikimedia Commons/Stefan Kühn)

In these situations, robots could become more autonomous if they could operate for long periods without continuous guidance from humans and make their own decisions after processing data. The problem is that, as things stand, robots can make mistakes that aren't apparent to humans or to the robots themselves, or do things that humans don't understand. In an operation as dangerous and complex as decommissioning a nuclear power plant, the results could be disastrous.

"Evidence shows there may be mistrust when there are no provisions to help a human to understand why an autonomous system has decided to perform a specific task, at a particular time, and in a certain way," said Dr. Wamberto Vasconcelos, senior lecturer in the Department of Computing Science, in a press release. "What we are creating is a new generation of autonomous systems, which are able to carry out a two-way communication with humans. The ability to converse with such systems will provide us with a novel tool to quickly understand, and if necessary correct, the actions of an automated system, increasing our confidence in, and the usefulness of, such systems."

To develop the autonomous robotics systems, the project will use Natural Language Generation (NLG), which translates complex information and data into simple text summaries. The university's School of Natural and Computing Sciences staff includes several NLG researchers.

In NLG, the information and data begin as symbols of mathematical logic. (Some representative logic symbols are shown in this article's headline, in no particular order.) They are automatically transformed into simple text, so that humans and robots can discuss and plan a set of tasks before the robot carries them out.
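To give a feel for the idea, here is a minimal Python sketch of mapping logic symbols to English phrases. The symbol table and the token-by-token "grammar" are illustrative assumptions for this article, not the Aberdeen system's actual translation rules, which would need to handle full sentence structure.

```python
# Illustrative symbol table: logic symbols and the English phrases
# they might be rendered as. (Assumed for this sketch, not taken
# from the Aberdeen project.)
SYMBOLS = {
    "∀": "for every",
    "∃": "there exists",
    "∧": "and",
    "∨": "or",
    "¬": "not",
    "→": "implies",
}

def verbalize(formula: str) -> str:
    """Replace each whitespace-separated logic symbol with an
    English phrase; tokens not in the table pass through unchanged."""
    return " ".join(SYMBOLS.get(token, token) for token in formula.split())

print(verbalize("∀ task ( safe(task) → execute(task) )"))
# prints: for every task ( safe(task) implies execute(task) )
```

A real NLG system would go well beyond this token substitution, choosing word order, aggregating related facts, and producing fluent summaries rather than a literal reading of the formula.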

Later, when the robot is engaged in a task, the human can communicate with it using a keyboard. Humans can ask the robot questions about why it's taking certain actions or making specific decisions, and request justifications for them. Humans can also provide the robot with additional information it can integrate into its plans, suggest alternatives, and point out problems with the robot's chosen course of action.

Vasconcelos said his team hopes the systems they are developing will be applicable not only to robots, but also to mobile phones, "which can interact with a human in useful ways, which up until now haven't been explored."

The research is funded by a £1.1 million (US$1.7 million) grant from the UK's Engineering and Physical Sciences Research Council, a government agency that funds research and training.

ttemple, everything you said is correct about non-autonomous robots. This research team, like several others, is developing intelligent, autonomous robots, something very different. William's comment below, "Human-Robot communications", captures this difference.

Absolutely, William K. That is a very good description of the issue. Even routine functions can quickly change into ones that require past experience. That is why a lot of experts operate on "gut feel": they can't explain their correct actions, because those actions are based on experience of similar occurrences. That simply cannot be captured in a program.

I have programmed industrial robots, and the closest those robots came to "insight" was knowing that they had to slow down in order to accurately make a turn. This presents a quandary of sorts when the robot is doing something like laying a sealant along a sealing surface, where a large-radius rounded corner is not what is needed. The solution was to bring the robot to a point, then make a separate move from that point to the change-in-direction point, and then start off in the new direction. A simple work-around. But if the robot had been able to tell us that it needed to do something in order to change direction, it might have been easier to figure out. Instead, it was necessary to read the 4,000-page instruction manual.
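For readers who haven't fought this on a teach pendant, here is a rough Python sketch of the work-around the commenter describes: splitting one blended move (which rounds off the corner) into separate point-to-point moves with an exact stop at the corner. The geometry is generic and the function name is made up for illustration; it isn't any particular controller's motion language.

```python
import math

def corner_via_point(p_start, p_corner, p_end, approach=5.0):
    """Return waypoints that force an exact corner instead of a
    blended, rounded one: a slow-down point just before the corner,
    the exact corner, then a restart point in the new direction.
    Points are 2-D (x, y) tuples; `approach` is the stand-off distance."""
    def toward(a, b, d):
        # Point at distance d from b, back along the segment b -> a.
        vx, vy = a[0] - b[0], a[1] - b[1]
        n = math.hypot(vx, vy)
        return (b[0] + vx / n * d, b[1] + vy / n * d)
    return [
        toward(p_start, p_corner, approach),  # decelerate to here first
        p_corner,                             # exact corner, zero blending
        toward(p_end, p_corner, approach),    # then accelerate away
    ]
```

Commanding each waypoint as its own move keeps the tool path square at the corner, at the cost of the robot coming to a full stop there.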

The problem with attempting to give robots insight is that it may easily lead to giving them self-awareness, which would probably lead to robots having emotions, and that could be VERY BAD. That is because robot source code is written by programmers, and programmers are not normal people. We need to always remember that, and beware.
