
Robot dog leads the pack

Training a guide dog for the visually impaired is expensive and time-consuming

Robotic dogs may be faster, cheaper, and without physical limitations

Student team’s prototype won first place in Intel Cup electronic design contest

According to Guide Dogs of America, a 16- to 18-month-old puppy will go through four to six months of training before it can become a guide dog. And that doesn't account for the financial cost of the training.

Robot Rover. Designed for wheelchair users, this robotic dog prototype is programmed to recognize dangers and guide the visually impaired. Courtesy 3TV/CBS5.

“The main motivations for this problem are how long it currently takes to train guide dogs for use and the cost of doing so,” computer science senior Stephen Lockhart said. “Along with this are the dog’s physical limitations such as color blindness.”

The team developed a robotic guide dog paired with a motorized wheelchair for use by individuals who are visually impaired. Both prototypes are currently too small to be used by people, but they allow the concept to be easily demonstrated and tested for future development.

Many different systems were integrated for the guide dog to function properly. The guide dog uses many of the same communication protocols found in a smartphone, like Wi-Fi and Bluetooth.

“In addition to these protocols, we had to develop an algorithm in order to get the systems to behave in the way we wanted them to,” said Richard Simpson, a computer systems engineering and engineering (robotics) major. “For example, when the dog sees a cone we needed to make certain that it would decide to move around the cone rather than plowing into it.”
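The team's actual logic was built visually in VIPLE, but the kind of decision rule Simpson describes can be sketched in a few lines of plain Python. All names here (`choose_action`, the label strings) are illustrative assumptions, not the team's code:

```python
def choose_action(detections):
    """Pick a movement command from a list of detected object labels.

    Mirrors the behavior described above: when the vision system
    reports a cone ahead, steer around it rather than plowing into it.
    """
    if "cone" in detections:
        return "turn_right"   # steer around the obstacle
    if "red_light" in detections:
        return "stop"         # wait at the stoplight
    return "forward"          # path is clear

print(choose_action(["cone"]))       # turn_right
print(choose_action(["red_light"]))  # stop
print(choose_action([]))             # forward
```

In practice such rules are chained and prioritized, which is exactly the sort of flow a visual programming environment like VIPLE makes easy to express as connected blocks.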

The guide dog also is equipped with Amazon’s Alexa technology to understand verbal commands.

“I believe using Alexa mostly came out of the necessity of needing an easy method of control that would work for a person who is blind,” Lockhart said. “We also demonstrated without the voice control since purely relying on voice commands in a noisy urban environment might prove more difficult than a set of buttons to press for commands.”

For the guide dog to “see” its surroundings, a GoPro camera was strapped to its head.

“The images it sees are then passed onto the wheelchair’s computer for a combination of AI visual recognition and specific filters for things like a cone or stoplight,” Lockhart said. “This information is then sent to a laptop running our logic system in a program called VIPLE.”

While that information is sent to the computer, a user can speak a command to Alexa, which is received by the laptop and fed into VIPLE (Visual Internet of Things/Robotics Programming Language Environment) along with the visual information from the GoPro. Depending on what is "seen" and the command given, instructions are sent back to the wheelchair and on to the guide dog, which determines what movement to make.
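The control loop described above merges two inputs, a voice command and the latest vision result, into one movement instruction. A minimal sketch of that merge, with assumed function and label names (the real system runs in VIPLE with Alexa and a GoPro, not this code):

```python
def next_instruction(voice_command, detections):
    """Combine the user's spoken command with what the camera 'sees'."""
    if voice_command == "stop":
        return "stop"              # user override always wins
    if "red_light" in detections:
        return "stop"              # safety rule driven by vision
    if "cone" in detections:
        return "avoid_obstacle"    # vision modifies the commanded path
    if voice_command == "go":
        return "forward"
    return "idle"                  # no command, nothing to do

print(next_instruction("go", ["cone"]))  # avoid_obstacle
print(next_instruction("go", []))        # forward
print(next_instruction("stop", []))      # stop
```

Note the ordering: safety checks from the vision pipeline take priority over the spoken command, so a "go" in front of a red light still produces a stop.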

Despite ultimately winning first place, the project was not without its difficulties.

“We faced challenges in having the vision be consistent and not misidentifying a red T-shirt as a red stoplight,” Lockhart said.

“Another major challenge area was the motions of the dog. It had great difficulty turning left for a long time, and its movements could change drastically depending on the surface it was on.”

Leading up to the event, team mentors Yinong Chen, a computer science and engineering senior lecturer, and Jinhui Zhu, a visiting scholar at ASU’s School of Computing, Informatics, and Decision Systems Engineering, were very optimistic about the project’s performance in the competition.

“I honestly didn’t quite know what to expect for the competition and was mostly more concerned with getting things working properly before worrying about our place in the competition,” Lockhart said.

“I don’t think I really gave it much thought until Shanghai, but Professor Chen’s confidence rubbed off on most of us, so we were aiming to at the very least match previous ASU teams in terms of standing.”

Lockhart and Simpson may have been the only two team members to travel to China, but the team’s five other student members and multiple team mentors were rooting for them from home.

Computer systems engineering majors Matthew Koltes and Tyler Pavkov, computer science majors Aubree Dagilis and Yichenglong Zhong, and Denis Liu, a high school junior from Corona Del Sol High School in Tempe, Arizona, rounded out the team.

The guide dog project is a part of Chen’s ongoing and future-oriented computer science education projects.

“These projects are based on the integration of VIPLE, Alexa-based voice control, and machine learning,” Chen said. “The guide dog and wheelchair in this Intel Cup project extend the types of devices that VIPLE can control, and thus offer more platforms for teaching and research.”


Disclaimer: While Science Node ™ does its best to provide complete and up-to-date information, it does not warrant that the information is error-free and disclaims all liability with respect to results from the use of the information.
