How Can Self-Driving Cars Have Better Vision?

Enterprise Technology Review | Wednesday, May 13, 2020

With the help of camera, radar, and lidar sensors, autonomous vehicles are provided with superhuman vision.

Fremont, CA: In order to drive better than humans, autonomous vehicles must first be able to see better than humans. The most significant hurdle is building reliable vision capabilities for self-driving cars. However, developers have been able to create a detection system that can “see” a vehicle’s environment better than human eyes can by combining a variety of sensors. Two key elements ensure that what a car detects is accurate: diversity, meaning different types of sensors, and redundancy, meaning overlapping sensors that verify each other’s readings. The three predominant autonomous vehicle sensors are camera, radar, and lidar. Working together, they provide the car with visuals of its surroundings and help it detect the speed, distance, and three-dimensional shape of nearby objects. Additional sensors, such as inertial measurement units, help track a vehicle’s acceleration and location. To understand how these sensors work on a self-driving car and improve on human driving vision, here is a look at the most commonly used ones.
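To make the redundancy idea concrete, here is a minimal Python sketch of how overlapping sensors can cross-check one another: a detection is kept only if a second, different sensor type reports something at roughly the same spot. The `Detection` type, the positions, and the tolerance are invented for illustration, not taken from any real driving stack.

```python
# Minimal sketch of cross-sensor redundancy: an object counts as
# confirmed only when at least two *different* sensor types report
# it at roughly the same position. All values here are made up.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str   # "camera", "radar", or "lidar"
    x: float      # metres ahead of the vehicle
    y: float      # metres left (+) / right (-) of the vehicle

def confirmed(detections, tolerance_m=1.0):
    """Keep detections corroborated by a different sensor type nearby."""
    kept = []
    for a in detections:
        for b in detections:
            if (a.sensor != b.sensor
                    and abs(a.x - b.x) <= tolerance_m
                    and abs(a.y - b.y) <= tolerance_m):
                kept.append(a)
                break
    return kept

readings = [
    Detection("camera", 12.1, -0.4),
    Detection("radar",  12.3, -0.5),  # agrees with the camera -> kept
    Detection("lidar",  40.0,  3.0),  # seen by only one sensor -> dropped
]
print(confirmed(readings))
```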

Camera Never Lies

From photos to video, cameras are the most accurate way to create a visual representation of the world, especially when it comes to self-driving cars. Autonomous vehicles depend on cameras placed on every side of the car, which combine to provide a 360-degree view of the environment. Some of these cameras cover a wide field at short range, while others focus on a narrower view to provide long-range visuals. Some cars also integrate fish-eye cameras, whose super-wide lenses provide a panoramic view and give the vehicle a complete picture of what is behind it so it can park itself.
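As an illustration of how overlapping fields of view add up to full coverage, the toy Python check below samples every bearing around the car and asks whether some camera sees it. The mounting headings, fields of view, and camera names are invented example values, not a real vehicle configuration.

```python
# Toy check that a set of cameras jointly covers the full 360 degrees
# around the car. All headings and fields of view are example values.
cameras = [
    {"name": "front",        "heading_deg": 0,   "fov_deg": 120},
    {"name": "left",         "heading_deg": 90,  "fov_deg": 120},
    {"name": "rear_fisheye", "heading_deg": 180, "fov_deg": 190},
    {"name": "right",        "heading_deg": 270, "fov_deg": 120},
]

def sees(cam, bearing_deg):
    """True if bearing_deg falls inside this camera's field of view."""
    # Smallest angular distance between the bearing and the camera axis.
    diff = abs((bearing_deg - cam["heading_deg"] + 180) % 360 - 180)
    return diff <= cam["fov_deg"] / 2

def covers_full_circle(cams):
    """Sample every whole degree and require some camera to see it."""
    return all(any(sees(c, b) for c in cams) for b in range(360))

print(covers_full_circle(cameras))  # True: the four arcs overlap
```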

Laser Focus

Common sensors like cameras and radar are used in most new cars for advanced driver assistance and park assist, and under human supervision they can also cover the lower levels of autonomy. For full driverless capability, lidar, a sensor that measures distances by pulsing lasers, has proved incredibly useful. Lidar gives self-driving cars a 3D view of their environment, and it works well even in low-light conditions.
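The ranging principle behind lidar is simple time-of-flight arithmetic: a pulse travels out to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A back-of-the-envelope Python sketch, where the 200-nanosecond echo time is a made-up example value:

```python
# Time-of-flight calculation behind lidar ranging. The pulse travels
# to the target and back, so distance is half the round trip.
C = 299_792_458  # speed of light, m/s

def lidar_range_m(round_trip_s):
    """Distance to a target from the round-trip time of one pulse."""
    return C * round_trip_s / 2

# A pulse returning after ~200 nanoseconds hit something ~30 m away.
echo = 200e-9  # seconds (example value)
print(f"{lidar_range_m(echo):.1f} m")  # -> 30.0 m
```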

The Radar

In times of low visibility, such as night driving, radar sensors can supplement camera vision and improve detection for self-driving cars. Radar, traditionally used to detect ships, aircraft, and weather formations, works by transmitting radio waves in pulses. Like a vehicle’s cameras, radar sensors typically surround the car so that objects can be detected at every angle. While they can determine speed and distance, they cannot distinguish between different types of vehicles.
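Radar’s speed measurement rests on the Doppler effect: the frequency shift of the returned wave is proportional to the target’s speed along the line of sight. A quick Python sketch follows; the 77 GHz carrier reflects common automotive radar bands, but the 10 kHz shift is an invented example value.

```python
# Illustrative Doppler calculation: the echo's frequency shift tells
# how fast a target moves toward or away from the sensor.
C = 299_792_458    # speed of light, m/s
CARRIER_HZ = 77e9  # typical automotive radar carrier frequency

def radial_speed_ms(doppler_shift_hz):
    """Target speed along the line of sight; positive = approaching."""
    # The factor of 2 accounts for the wave travelling out and back.
    return doppler_shift_hz * C / (2 * CARRIER_HZ)

shift = 10_000  # Hz, example echo shifted up by 10 kHz
print(f"{radial_speed_ms(shift):.1f} m/s")  # -> ~19.5 m/s (~70 km/h)
```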
