Seeing with sound

23rd March 2017

The 3D ultrasound sensor system created by Toposens enables users to ‘see’ via sound

Barbara Brauner reveals a new technology set to shake up the application of ultrasound for a variety of tasks

Bats and a few other animals have the power to see their environment via ultrasound.

Even though ultrasound is all around us - in cars, factories, ships, medicine, etc - researchers and developers have not been able to use ultrasound sensors in more than one dimension, like the bat does. Until now, that is.

In 2016, a young German company showcased its first prototype of a 3D ultrasound sensor system.

Toposens is a start-up from Munich and its algorithms enable a small ultrasound sensor system to turn its surroundings into a ‘3D ultrasound map’.

The game-changing technology can be used for collision avoidance of autonomous vehicles, people tracking in smart buildings and gesture control for consumer electronics.

Over the course of the past year, Toposens implemented the technology into pilot projects, refined the sensor and is now offering the first sensor version as an evaluation kit to R&D departments for testing and prototyping.

“It’s a completely new technology. Customers are very interested, but don’t know what to expect. We want to give them the possibility to test our 3D ultrasound sensor system,” says managing director Tobias Bahnemann.

The sensor system uses the principle of echolocation, similar to the bat. Ultrasound signals are emitted by the sensor and reflected by the objects in the sensor’s range.

The signals travel back to the sensor and an algorithm turns the runtimes into 3D position information in real time. The output of the sensor is x, y and z coordinates in absolute units, together with an approximate size of the objects.
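The first step of that conversion can be sketched in a few lines: the measured runtime is a round trip, so halving the echo path gives the distance to the reflecting object. This is a minimal illustration of the principle, not Toposens' actual algorithm, and the names and values are assumptions.

```python
# Hypothetical sketch: turning an ultrasonic echo's round-trip
# time ("runtime") into a distance. Illustrative only.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def runtime_to_distance(runtime_s: float) -> float:
    """The echo travels to the object and back, so halve the path."""
    return SPEED_OF_SOUND * runtime_s / 2.0

# An echo returning after 35 ms corresponds to an object about 6 m
# away - the stated range of the sensor.
print(runtime_to_distance(0.035))
```

Combining such distances from several transducers is what lets the algorithm resolve a full x, y, z position rather than a single range value.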

The sensor can be connected to the computer via Bluetooth or USB, and the sensor data can be easily visualised.

Various filter options enable the quick configuration of the sensor parameters and thus support rapid prototyping.
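One such filter might restrict detections to a distance window, discarding near-field noise and out-of-range echoes. The sketch below is a guess at what such a filter could look like during prototyping; the parameter names are illustrative, not the evaluation kit's API.

```python
# Hypothetical range filter for 3D detections given as (x, y, z)
# tuples in centimetres. Illustrative only.

def filter_by_range(points, min_cm=20.0, max_cm=600.0):
    """Keep only detections whose distance falls inside the window."""
    def dist(p):
        x, y, z = p
        return (x * x + y * y + z * z) ** 0.5
    return [p for p in points if min_cm <= dist(p) <= max_cm]

points = [(0.0, 0.0, 10.0), (50.0, 0.0, 120.0), (0.0, 0.0, 700.0)]
print(filter_by_range(points))  # only the middle point survives
```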

The current sensor version is 10 x 4.5 x 2cm, has a range of 6m and an opening angle of 90°.

The accuracy is approximately 1-5cm. The sensor scans with an average frame rate of 30fps and uses only 0.4W.

Ultrasound is not sensitive to varying lighting conditions, darkness, dirt or dust. It does not intrude into people's privacy, since no personal features are identified.

Standard hardware components can be used, which makes the sensor system affordable.
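The 6m range and 90° opening angle together define a cone of coverage. As a rough illustration (assuming the sensor looks along the +z axis, which is purely a convention chosen here), a point is visible when it lies within 45° of that axis and inside the range:

```python
# Hypothetical sketch: is a point inside the sensor's stated field
# of view (6 m range, 90 degree full opening angle)? Coordinate
# conventions are illustrative.

import math

MAX_RANGE_M = 6.0
OPENING_ANGLE_DEG = 90.0  # full cone angle

def in_field_of_view(x: float, y: float, z: float) -> bool:
    dist = math.sqrt(x * x + y * y + z * z)
    if z <= 0 or dist > MAX_RANGE_M:
        return False
    # The point's angle off the sensor axis must stay within half
    # the opening angle (45 degrees).
    off_axis = math.degrees(math.acos(z / dist))
    return off_axis <= OPENING_ANGLE_DEG / 2

print(in_field_of_view(0.0, 0.0, 5.0))  # straight ahead, in range
print(in_field_of_view(0.0, 5.0, 0.1))  # far off-axis
```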

Finally, people can ‘see’ via sound – an ability that is especially important for robots, as they can now integrate ultrasonic machine vision, which enables them to better interact with their environment.