ROBOTIC NAVIGATION

In this use case, we mounted our Ultrasonic 3D Echolocation Sensor on a robot to:

  • Navigate through an environment
  • Detect objects and obstacles, including transparent surfaces
  • Process the sensor data in ROS

The video below shows the sensor data we receive when the sensor is mounted on a robot.


Keep in mind that our sensor is still in an early prototype stage.

SENSOR SYSTEM USED:

ULTRASONIC 3D ECHOLOCATION SENSOR

Ultrasonic 3D Echolocation Sensors combine the time-of-flight principle of conventional ultrasonic sensors with triangulation and advanced signal processing algorithms. As a result, the sensors can locate objects in three-dimensional space and distinguish between multiple objects in a single scan.
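To illustrate the principle, here is a minimal sketch of how a 3D position can be recovered from echo distances measured at several known receiver positions. This is a generic trilateration example, not the Toposens implementation; the receiver layout and the `trilaterate` function are illustrative assumptions.

```python
import numpy as np

def trilaterate(receivers, distances):
    """Estimate a 3D point from distances to known receiver positions.

    Linearizes the sphere equations |x - p_i| = d_i by subtracting the
    first one, then solves the resulting linear system in least squares.
    """
    p = np.asarray(receivers, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (p[1:] - p[0])  # one row per receiver pair (i, 0)
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)) - (d[1:] ** 2 - d[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: four receivers on a small sensor head (positions in metres).
# They must not be coplanar, or the depth component becomes ambiguous.
receivers = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.05, 0.05, 0.05)]
target = np.array([0.3, 0.2, 1.0])  # ground-truth object position
distances = [np.linalg.norm(target - np.array(r)) for r in receivers]
print(trilaterate(receivers, distances))  # ~ [0.3, 0.2, 1.0]
```

In a real sensor, the measured quantity is the echo travel time, which is converted to distance via the speed of sound before a step like this one.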

Learn more about the Technology

Increasing automation in industrial environments and our everyday lives requires autonomously moving robots that take over repetitive and increasingly demanding tasks. The Ultrasonic 3D Echolocation Sensor can supply your robot with the necessary information about its close-range environment. Objects are detected from the locations of the observed echoes, protecting the robot from colliding with obstacles, including protruding or transparent objects.

Ultrasonic 3D Echolocation Sensor mounted on a TurtleBot3.

In this use case, an Ultrasonic 3D Echolocation Sensor was integrated into a mobile robot (TurtleBot3) and driven manually through a previously defined arena.

With the help of the “Toposens” ROS packages, the sensor data were read out and converted into a point-cloud data type. The robot was configured and controlled using the “TurtleBot3” ROS packages, and the data were visualized in RViz.
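The conversion step can be sketched as follows. This is a simplified stand-in for what the ROS packages do, assuming each echo carries Cartesian coordinates and an intensity in the sensor frame; the actual Toposens message layout and thresholds may differ.

```python
import numpy as np

def echoes_to_points(echoes, min_intensity=0.1):
    """Convert raw echo records into an N x 3 point array.

    Each echo is assumed to be (x, y, z, intensity); weak echoes below
    `min_intensity` are dropped as likely noise. The resulting array is
    the kind of data that would be packed into a PointCloud2 message.
    """
    arr = np.asarray(echoes, dtype=float).reshape(-1, 4)
    keep = arr[:, 3] >= min_intensity
    return arr[keep, :3]

echoes = [
    (0.10, 0.05, 0.80, 0.9),    # strong echo from an obstacle
    (-0.20, 0.00, 1.50, 0.02),  # weak echo, filtered out
    (0.00, -0.10, 0.60, 0.5),
]
points = echoes_to_points(echoes)
print(points.shape)  # (2, 3)
```

In a ROS node, an array like this would typically be published as a `sensor_msgs/PointCloud2` message so that tools such as RViz can render it.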

This use case shows how an Ultrasonic 3D Echolocation Sensor can be used together with autonomous technologies such as robots to locate objects and perceive the environment. Our sensor system thus helps autonomous systems avoid collisions and navigate safely. A key advantage of our sensors is the reliable detection of objects regardless of their color, transparency, or lighting conditions.

Point cloud data before and after post-processing.

We are also working on software modules that process the sensor data to produce occupancy grid maps, as well as software modules for automatic path planning and autonomous navigation.


If you are interested in using our sensor system in one of your applications, or in this robotic application specifically, contact us!

Do you want to know how to use
Advanced Ultrasonic Sensors for your application?

CONTACT US!