Covering the near-field environment
How can it be that a staggering one in five motor vehicle accidents takes place in a parking lot? Even at low speed in a parking situation, drivers cannot perceive their environment flawlessly, despite parking assistance functions such as cameras and park distance control (PDC). Tight spots in multi-storey car parks and crowded parking areas in front of shopping malls present everyday driving challenges. Seen in this light, it is quite reckless not to equip a car with cutting-edge sensor technology, leaving several moving tons of metal without eyes. With the right sensors, cars can even park autonomously in the most difficult situations – and every ride starts in a parking position and ends with a parking maneuver. It is therefore crucial for an autonomous car to reliably perceive its near-field environment.
While existing sensor technologies mostly focus on covering far distances, the immediate environment around a car (0 – 5 m) is often left out of the discussion. That is where the 3D sensor startup Toposens comes into play. Toposens developed the world's first 3D ultrasound sensor. The sensor uses the principle of echolocation, like a bat, to perceive its environment in 3D in real time, and is targeted at covering the near-field environment (0 – 5 m) of autonomous vehicles. It is small, robust against external influences, provides a lean and simple data stream, and is inexpensive to produce. Without a doubt, Toposens' newest "Automotive Dev Kit" marks a major step towards improved near-field perception for both current vehicle generations and autonomous vehicles.
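The echolocation principle mentioned above can be illustrated with a short sketch. Toposens does not publish its signal-processing pipeline, so the following is only an illustrative model under simplifying assumptions: each of several monostatic sensors at a known mount position measures a round-trip echo time, the speed of sound converts that time into a range, and the ranges from four or more sensors are combined into a 3D position by linearized trilateration. All names and positions below are hypothetical.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant here)

def echo_distance(round_trip_s):
    """Range to a reflector from a monostatic echo: the pulse
    travels out and back, so divide the path length by two."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def trilaterate(sensors, distances):
    """Least-squares 3D position from >= 4 sensor positions and ranges.

    Each sphere equation |p - s_i|^2 = d_i^2 is linearized by
    subtracting the first one, giving 2 (s_i - s_0) . p =
    (d_0^2 - d_i^2) - (|s_0|^2 - |s_i|^2), then solved with lstsq.
    """
    s = np.asarray(sensors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (s[1:] - s[0])
    b = (d[0] ** 2 - d[1:] ** 2) - (np.sum(s[0] ** 2) - np.sum(s[1:] ** 2, axis=1))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

With four sensors at non-coplanar positions, the linear system has three equations in three unknowns and recovers the reflector position exactly in the noise-free case; real ultrasound data would of course require echo detection and filtering first.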
Launch at CES 2019: Automotive Development Kit
At CES 2019, Toposens presents its latest product – the Automotive Development Kit – to the public for the first time. The DevKit is aimed at automotive companies that want to prototype and implement the 3D ultrasound sensor technology on their own. It consists of four 3D ultrasound sensors, a control unit that aggregates the data stream, and a graphical user interface. The Automotive DevKit can be mounted onto an autonomous vehicle, where it maps the vehicle's environment, detecting both stationary and moving objects.
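One job of a control unit like the one described above is to merge the four per-sensor point streams into a single point cloud in the vehicle's coordinate frame. How the Toposens control unit does this is not documented here, so the sketch below is purely illustrative: it assumes each sensor reports points in its own frame and has a known mount position and yaw angle on the car, and it fuses the streams with a rigid-body transform per sensor. All function names and mount parameters are hypothetical.

```python
import numpy as np

def yaw_rotation(theta):
    """Rotation matrix for a yaw (heading) angle about the vertical axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def to_vehicle_frame(points, mount_position, mount_yaw):
    """Transform (N x 3) sensor-frame points into the vehicle frame."""
    points = np.atleast_2d(np.asarray(points, dtype=float))
    return points @ yaw_rotation(mount_yaw).T + np.asarray(mount_position, dtype=float)

def aggregate(frames):
    """Merge per-sensor detections into one vehicle-frame point cloud.

    frames: list of (points, mount_position, mount_yaw) tuples,
    one per sensor (e.g. four sensors around the car's bumpers).
    """
    return np.vstack([to_vehicle_frame(p, t, y) for p, t, y in frames])
```

In a real integration the mount poses would come from calibration and would include full 3D orientation, but the core idea – one rigid transform per sensor, then concatenation – stays the same.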
Sophisticated algorithms running on a chip on the sensor system itself enable the sensor to "see with sound" in 3D. This unique combination of standard hardware and proprietary software creates a 3D sensor that is perfectly suited to applications in the autonomous vehicle space.