Robotics researchers at Queensland University of Technology, Brisbane, Australia, have developed new technology to equip underground mining vehicles to navigate autonomously through dust, camera blur and bad lighting. Using mathematics and biologically-inspired algorithms, the technology uses vehicle-mounted cameras to track the location of the vehicle in tunnels to meter-accuracy.

The research has been led by a team from the Australian Centre for Robotic Vision (also at QUT), including Professor Michael Milford, in collaboration with partners Caterpillar, Mining3 and the Queensland Government.

Several factors are driving this development: autonomous vehicles are increasingly used in the underground mining industry; the machinery must navigate a harsh environment and a maze of tunnels; and mine operators typically rely on costly sensing or infrastructure modifications to track it. New positioning technology promises to increase the efficiency and safety of underground operations.

“This is stage one of the project,” Professor Milford said. “It’s commercially important to be able to track the location of all your mobile assets in an underground mine, especially if you can do it cheaply without needing to install extra infrastructure or use expensive laser sensing.”

“We have developed a positioning system that uses cameras rather than lasers, based on more than a decade of research in biologically-inspired navigation technology.”
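The article does not publish the algorithm, but Milford's group is known for sequence-based, biologically-inspired place recognition (e.g. its earlier SeqSLAM work). The toy sketch below illustrates the general idea only: a vehicle's recent camera frames are compared against a previously recorded map of the tunnel, and the best-matching stretch gives the vehicle's position. All frame contents and the matching scheme here are illustrative assumptions, not the deployed system.

```python
# Toy sketch of sequence-based visual place matching (illustrative only;
# not the actual QUT/Caterpillar system). Frames are flat lists of
# grayscale pixel values from heavily downsampled camera images.

def frame_difference(a, b):
    """Sum of absolute pixel differences between two downsampled frames."""
    return sum(abs(pa - pb) for pa, pb in zip(a, b))

def localise(map_frames, query_seq):
    """Return the index in the map where the query sequence matches best.

    Matching short *sequences* of frames, rather than single images,
    makes the position estimate robust to the odd bad frame
    (dust, blur, headlight glare).
    """
    n = len(query_seq)
    best_idx, best_score = None, float("inf")
    for start in range(len(map_frames) - n + 1):
        score = sum(frame_difference(map_frames[start + i], query_seq[i])
                    for i in range(n))
        if score < best_score:
            best_idx, best_score = start, score
    return best_idx
```

For example, if the map is ten frames recorded along a tunnel and the vehicle's last three frames resemble map frames 5 to 7, `localise` returns 5: the matched position in the map.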

GPS cannot be used underground, and wireless sensor networks are less reliable due to interference from the rock mass and a lack of access points.

Professor Milford added that the conditions at mine sites were challenging. “It wasn’t all plain sailing for our experiments as the research systems did not work that well when first tested in mine site environments,” he said.

“We had to add some additional intelligence to the technology, to deal with the challenging environment. We developed a system which could intelligently evaluate the usefulness of the images coming in from the camera – and disregard ones that were blurry, dusty, or that were washed out from incoming vehicle lights.”
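One way to "intelligently evaluate the usefulness" of incoming frames, as described above, is to gate them on simple quality metrics before they reach the positioning system. The sketch below is a hypothetical illustration of that idea, assuming grayscale images as 2-D lists of pixel values; the metrics (Laplacian-variance sharpness, mean brightness) and thresholds are assumptions, not the published method.

```python
# Hypothetical frame-quality gate (illustrative assumptions, not the
# actual QUT algorithm): reject frames that are blurry or dusty (low
# sharpness), too dark, or washed out by incoming vehicle lights.

def sharpness(img):
    """Variance of a simple Laplacian response; low values suggest blur."""
    h, w = len(img), len(img[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def mean_brightness(img):
    flat = [p for row in img for p in row]
    return sum(flat) / len(flat)

def usable(img, min_sharp=5.0, lo=20, hi=235):
    """True if the frame is sharp enough and neither too dark nor washed out."""
    return (sharpness(img) >= min_sharp
            and lo <= mean_brightness(img) <= hi)
```

A high-contrast frame passes the gate, while a featureless (blurred or dust-obscured) frame or an over-exposed one is disregarded, so only informative images feed the position estimate.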

“Further field trips will enable us to start testing the second stage of the project, a more precise positioning technology,” Milford said. “If you can track the vehicle’s position to within a few centimetres then you can use that technology to run the vehicle autonomously.”