Autonomous Vehicles – How Far From Real Humanless Driving?

April 05 2021
Author: Varchasvi Shankar

Autonomous Vehicles (AVs) and Electric Vehicles (EVs) are often paired together. However, AVs are not nearly as far along as EVs, certainly not from a deployment standpoint. The two are not co-dependent but co-enabling: AVs don't have to be electric, yet some electrification technologies, such as brake-by-wire, are beneficial to automated-driving development. So there is good reason for them to go together.

What exactly is an Autonomous Vehicle?

It is a vehicle capable of sensing its environment and navigating without human input. Advanced control systems interpret sensory information to identify appropriate navigation paths, obstacles, and any relevant signage, for example, a change in the speed limit.

The underlying technology of vehicle automation is mechatronics, the combination of mechanical and electrical components. Behind this sits the use of artificial intelligence.

When we discuss autonomous vehicles, people are usually interested in two main things. First, there's the technology: how does it work? What's under the hood, inside that black box? The other critical issue is policy.

Sensors, Processing, Actuators

When we say technology, there are three main areas it falls into: sensors, processing, and actuators. These are not necessarily listed in order of complexity; ranked that way, actuators would come first. Actuators are very simple. They perform tasks such as controlling the steering, managing the brakes, and holding the throttle.

Sensors

Sensors accomplish a more significant task: they test and report on what is going on, creating situational awareness for the vehicle. And then, of course, there is processing, and a lot goes into it. For example, let's look at a Google car. A Google car has a few primary sensors, one of which is GPS, which is pretty standard.

Alongside GPS, many such vehicles carry what's called LIDAR. The car could also have multiple stereo cameras to produce a 3-dimensional image, as well as radar. A Google car has one or more radar sensors and an ultrasonic position estimator, along with a processor that does all the data handling.

LIDAR and AVs

What is LIDAR? LIDAR is short for Light Detection and Ranging. It uses lasers and optics to identify objects around the vehicle: a laser emitter sends out a beam, the beam strikes an object and reflects back, and a receiver captures the reflected beam.

A laser sits on top of the vehicle and rotates. As it rotates, the laser sweeps the entire landscape, and the data it captures are fed into the LIDAR system, where the processor analyzes them.

LIDAR provides a way to capture both optical and distance information. The light goes out and bounces back, and the time it takes to return indicates the object's distance.
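
To make the time-of-flight idea concrete, here is a minimal Python sketch. The function name and the example round-trip time are illustrative; real LIDAR units typically report ranges directly rather than raw timings.

```python
# Minimal sketch of the LIDAR time-of-flight principle described above.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to an object from the time a laser pulse takes to return.

    The pulse travels to the object and back, so the one-way distance
    is half the round-trip distance.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse that returns after ~600 nanoseconds hit something ~90 m away.
print(f"{lidar_range(600e-9):.1f} m")  # -> 89.9 m
```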

As the LIDAR scans, it paints a picture of the environment along with the distance from the car to each of those objects. The whole scene is rendered as a 3-dimensional picture that is continuously updated.

Ultrasonic Position Estimator (UPE)

Next on the technology list is the Ultrasonic Position Estimator (UPE). The UPE uses high-frequency sound pulses; that's the ultrasound. Unlike LIDAR, which gives high-precision information, the UPE provides very coarse information, but that information is used in conjunction with GPS and high-resolution mapping of the area.
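
The ranging principle is the same time-of-flight idea as LIDAR, only with sound in place of light. A minimal sketch, assuming the sensor reports the echo's round-trip time:

```python
# Minimal sketch of ultrasonic ranging. Names and values are illustrative.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def ultrasonic_range(round_trip_time_s: float) -> float:
    """Coarse distance estimate from an ultrasonic echo."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2

# An echo arriving after 12 ms puts the object roughly 2 m away.
print(f"{ultrasonic_range(0.012):.2f} m")  # -> 2.06 m
```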

So when a Google car drives, it already knows exactly where it is going, and it knows all the main features of the terrain it is driving through. As the car goes along, it knows its position down to the inch. All of this relies on very high-precision mapping.

Different sensors suit different applications. Radar sensors can detect objects more than 100 meters away, making radar good for long range. LIDAR is well suited to objects at mid-range, with a detection accuracy of +/- 0.3 meters at 90 meters and a maximum range of around 300 meters. The UPE, in turn, is best suited to short-range analysis of nearby objects.

Algorithms

Now, when we talk about processing, we are talking about algorithms, and many different algorithms are used in AVs. There are what are called low-level algorithms that deal with the raw data.

There is, for instance, image processing for the 3D cameras: processing the image data to identify essential objects. There is also LIDAR processing and radar processing. These low-level algorithms digest the raw information and produce the critical inputs that the higher-level processing uses to make decisions. And in terms of higher-level algorithms, there are several.

Suppose, for example, we have a feature extraction algorithm. This algorithm finds lane lines and extracts those features. It also identifies, say, a signpost and what type of sign is on it.
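
Here is a minimal lane-line extraction sketch using OpenCV's Canny edge detector and probabilistic Hough transform. Production systems use far more robust pipelines; this only illustrates the step from raw camera pixels to lane-line features, and "road.jpg" is a placeholder path.

```python
import cv2
import numpy as np

frame = cv2.imread("road.jpg")                   # raw camera image (placeholder file)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # drop colour information
blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # suppress sensor noise
edges = cv2.Canny(blurred, 50, 150)              # binary edge map

# Extract straight-line segments, which is roughly what painted lane
# markings look like from the camera's point of view.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)

if lines is not None:
    for x1, y1, x2, y2 in lines.reshape(-1, 4):
        # Overlay each extracted feature on the original frame.
        cv2.line(frame, (int(x1), int(y1)), (int(x2), int(y2)), (0, 255, 0), 2)
```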

Object Classifications

Object classification is also essential: is the system seeing a bicyclist, a pedestrian, a dog, a fire hydrant, or a traffic light? The system needs to identify and classify objects because the vehicle must know how to respond to a pedestrian versus a lamppost.
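
A toy sketch of why the class label matters: the planned response depends on what the object is, not merely that something is there. The classifier itself (typically a deep network) is abstracted away, and the labels and responses are made-up illustrative values.

```python
# Map object classes to driving responses; labels and responses are illustrative.
RESPONSES = {
    "pedestrian":    "yield: slow down and prepare to stop",
    "bicyclist":     "yield: give wide clearance when passing",
    "dog":           "caution: unpredictable, reduce speed",
    "fire_hydrant":  "ignore: static roadside object, stay in lane",
    "traffic_light": "obey: read its state before proceeding",
}

def plan_response(label: str) -> str:
    # Anything unrecognised is treated conservatively as an obstruction.
    return RESPONSES.get(label, "caution: unknown object, slow down")

print(plan_response("pedestrian"))
print(plan_response("shopping_cart"))  # falls back to the cautious default
```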

AV processors need to be able to identify and classify what is in their field of view or path of travel. Some objects also need to be flagged as obstructions: not only is there an object, but that object is in the way.

Beyond deciding what constitutes an obstruction, there is mapping and localization. If you are going to go from point A to point B autonomously, you need some way of knowing exactly where you are at all times and what objects are in your path. That is the idea of mapping and localization, an essential aspect of AV dynamics modeling.
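
A toy version of the obstruction question, under the assumption that the planned path is a list of waypoints and an object only matters if it sits inside a corridor swept along that path. All coordinates are made-up illustrative values.

```python
import math

def distance_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b (all (x, y) metres)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def is_obstruction(obj_xy, path, half_width_m=1.5):
    """True if the object lies within the vehicle's swept corridor."""
    return any(distance_to_segment(obj_xy, path[i], path[i + 1]) <= half_width_m
               for i in range(len(path) - 1))

path = [(0, 0), (10, 0), (20, 5)]       # planned waypoints (metres)
print(is_obstruction((12, 0.5), path))  # True: inside the corridor
print(is_obstruction((12, 6.0), path))  # False: safely off to the side
```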

Autonomous Vehicles and Electrical Vehicles

Electric cars have come a long way since 'Who Killed the Electric Car?', a 2006 documentary film that explores the creation, limited commercialization, and subsequent destruction of the battery EV in the United States.

A decade and a half later, people are talking about how EVs and AVs will change the dynamics of mobility.

AVs and EVs are entirely different technologies, but their rise in popularity has happened simultaneously. They are intertwined to the point where AVs in the not-so-distant future will not be gas-powered; they will ultimately be a type of EV.

However, AVs are not nearly as far along as EVs, certainly not from a deployment standpoint. Yet, they are not co-dependent but are co-enabling.

Autonomous Vehicles don't have to be electric. Some of the electrification drive technologies, brake-by-wire for example, are beneficial to automated-driving development. It's nice if they go together, but that will come down to regulatory policy decisions. The benefits promised by AVs are increased access for areas that currently have little mobility and the easing of congestion in overly populated areas.

Autonomous Vehicle Challenges

Autonomous Vehicles have come a long way and will surely change the future of the automobile industry, but that change will come with challenges that can't be ignored.

AVs not only bring comfort to passengers but also promise to ease pollution, parking, and traffic problems. Yet the AV industry faces other challenges as well, from environmental to legal to technological.

The first challenge AVs need to address is responsibility for accidents caused by an autonomous vehicle, where the cause could be anything from technical failure to bad weather. Who is liable: the vehicle manufacturer? The passenger? Recent reports suggest that a fully autonomous car may not have a dashboard or even a steering wheel, so a human passenger would not even have the option of taking control of the vehicle in an emergency.

Humans can make eye contact with pedestrians and other drivers, reading their facial expressions and body language to make split-second judgments. Will autonomous cars be able to replicate this connection? Will they have the same life-saving abilities as human drivers?

Even though AVs are carefully programmed, technical glitches are still possible. Consider a pedestrian trying to cross the road just as the vehicle ahead makes a sudden turn: will the AV behind that human-driven vehicle sense the danger and apply the brakes in time?

And these are not all the challenges; there are still unsolved questions about AVs that need more research and development. Will AVs be able to sense danger inside tunnels or in bad weather? How will they handle bumper-to-bumper traffic? Will AVs be assigned to a specific lane?

A major challenge is testing in heavy storms, rainfall, snow, and other bad weather. What if snow covers the road and the lane dividers disappear? How will an AV's cameras and sensors track lane markings that are obscured by water, oil, ice, or debris?

How Autonomous Vehicles Can Change the Future

Reduce Accidents

Humans make errors; the technology only does what it is programmed to do. Because the technology, not a human, controls the AV, many errors are prevented before they happen. AVs are not distracted by external factors the way humans, who are prone to interruptions, are. They use sophisticated algorithms to determine the correct stopping distance, follow lane lines, maintain the proper distance between vehicles, and obey speed limits, thereby significantly lessening the chance of accidents.
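
A hedged, worked sketch of the stopping-distance reasoning: total stopping distance is reaction distance plus braking distance. The friction coefficient and the reaction times are illustrative round numbers, not values from any real AV stack.

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_m_s: float, reaction_time_s: float,
                      friction: float = 0.7) -> float:
    """Reaction distance plus braking distance: v*t + v^2 / (2*mu*g)."""
    return speed_m_s * reaction_time_s + speed_m_s ** 2 / (2 * friction * G)

v = 100 / 3.6  # 100 km/h expressed in m/s
print(f"human (~1.5 s reaction): {stopping_distance(v, 1.5):.0f} m")  # ~98 m
print(f"AV    (~0.2 s reaction): {stopping_distance(v, 0.2):.0f} m")  # ~62 m
```

The braking distance is the same for both; the faster reaction is where an automated system shortens the total.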

Reduce Traffic

Autonomous Vehicles work on programmed algorithms and follow lane lines strictly. They maintain proper gaps and brake or accelerate in a coordinated way, which significantly reduces congestion and improves traffic flow by increasing lane capacity. The technology is even good enough to detect hand signals from motorists and react accordingly.
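
One way to see the lane-capacity point is a fixed time headway: the gap behind the vehicle ahead scales with speed. A minimal sketch, where the 0.9-second AV headway is an illustrative assumption:

```python
def following_gap_m(speed_m_s: float, time_headway_s: float) -> float:
    """Distance to keep behind the vehicle ahead at a given speed."""
    return speed_m_s * time_headway_s

v = 90 / 3.6  # 90 km/h expressed in m/s
print(f"human (2.0 s rule):     {following_gap_m(v, 2.0):.1f} m")  # 50.0 m
print(f"AV    (0.9 s, assumed): {following_gap_m(v, 0.9):.1f} m")  # 22.5 m
```

A shorter safe headway means more vehicles per kilometre of lane, which is the congestion benefit described above.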

Reduce Parking Congestion

Fast-growing, unplanned, and busy stores, shopping centers, and office buildings leave less parking space and create problems for human drivers. AVs offer a solution: they can drop passengers off at a dedicated stop and then head off to a vacant parking spot.

IN CONCLUSION

In terms of sensing, we as humans take in limited sensory information; for example, we can't see in front of and behind us at the same time, even though machine systems can. What we do have is plenty of processing power, which is why we are able to drive relatively safely.

AVs are still quite a few years away from being production-ready, whether the obstacle is policy or simply safety. They still have a way to go.