Wanna go for a long ride…

The rise of self-driving cars is now a reality, and they will arrive sooner than the science-fiction movies and books once suggested. A self-driving car, also known as a robot car, autonomous car, or driverless car, is a vehicle capable of sensing its environment and moving with little or no human input. Cars today already include many semi-autonomous features, such as assisted parking, blind-spot monitoring, lane-keep assistance, forward-collision warning, and automatic emergency braking. Fully autonomous vehicles, able to operate without human control, are rapidly becoming a reality, although the technology still needs some time to gather speed and momentum.

The main motivation for bringing this technology to the road is to reduce the number of accidents, most of which are caused by human error. Achieving that requires sensors, devices, machines, systems, and far less human intervention. This challenge is occupying the top lines of the automotive and technology sectors, promising to change how transportation happens like never before.

Sensors, connectivity, and software/hardware control algorithms are the industry's three main challenges. Sensors such as RADAR, ultrasonics, LIDAR, GPS modules, odometry, inertial measurement units, and cameras provide the input necessary to navigate the car safely and securely. Connectivity gives cars access to up-to-date traffic, weather, road-surface, construction, map, and road-infrastructure information, while the vehicle continuously scans the area around it for hazards such as other vehicles, pedestrians, traffic lights, and road markings. This data is used to monitor the car's surrounding operating environment so it can brake, accelerate, or steer in anticipation of hazardous conditions.

At its core is a computer brain that can learn, adapt, and make decisions based on experience rather than following a fixed, pre-programmed set of commands, which is essential for dealing with changing traffic conditions or unmapped roads. Finally, software and control algorithms are needed to reliably capture the data from sensors and connectivity and make decisions the way human drivers do. The decision-making algorithms must handle a multitude of simple and complex driving situations flawlessly, with no chance of miscommunication, and the software that implements them must be robust and fault-tolerant.

Level Classification:

There are six levels of automation for driverless cars:

At Level 0 (No Automation): All major systems are controlled by the human. The driver is in control of all aspects of driving, from steering and operating the pedals to navigating and monitoring the surroundings

At Level 1 (Driver Assistance — Hands-On): The driver and the automated system share control of the vehicle. Examples are Adaptive Cruise Control (ACC) and Parking Assistance. The driver must be ready to retake full control at any time. Lane Keeping Assistance (LKA) Type II is a further example of Level 1 self-driving

At Level 2 (Partial Automation — Hands Off): The automated system takes full control of the vehicle (accelerating, braking, and steering). The driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. In fact, contact between hand and wheel is often mandatory during driving, to confirm that the driver is ready to intervene

At Level 3 (Conditional Automation — Eyes Off): The vehicle can handle situations that call for an immediate response, like emergency braking, and performs certain “safety-critical functions” under various traffic or environmental conditions. Level 3 autonomy allows the vehicle to monitor its surroundings and move between lanes. However, the driver must still be ready to resume control of the car if required

At Level 4 (High Automation — Mind Off): The car is fully autonomous in some driving scenarios, though not all. The vehicle can operate without human input, but a human driver must still be on board. It can control the ignition, steering, braking, and acceleration, and it carries technologies that let it monitor its surroundings across a broad range of environments. Level 4 cars can also handle any parking duties. Crucially, even if the driver fails to intervene when something goes wrong, a Level 4 car will continue to drive itself safely

At Level 5 (Full Automation — Steering Wheel Optional): No human intervention is required at all; a robotic taxi is one example. The car is completely capable of self-driving in any situation and needs no pedals, no steering wheel, and no human on board. This is complete A-to-B autonomy, with no input from the driver: at this level of automation, humans have no need, and no means, to intervene

Sensors and Technologies:

Modern self-driving cars generally use Bayesian simultaneous localization and mapping (SLAM) algorithms, which fuse data from multiple sensors and an offline map into a current position estimate and map updates. Driverless vehicles also require some form of machine vision for visual object recognition. The underlying neural networks depend on an extensive amount of data extracted from real-life driving scenarios, which lets them learn the best course of action. Together, these technologies enable vehicles to operate at the six increasingly sophisticated levels of automation described above. The main sensors in autonomous cars include:

Ultrasonic sensors: Use high-frequency sound waves and the time for the echo to bounce back to calculate distance. They generally have low resolution and are used for short distances; best in close range
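The bounce-back principle above reduces to a one-line calculation. This is a minimal sketch, assuming dry air at roughly room temperature; the echo time is an invented illustrative value.

```python
# Ultrasonic ranging sketch: distance from the round-trip time of a
# sound pulse.  The 343 m/s speed of sound (dry air, ~20 C) and the
# echo time below are illustrative assumptions.

SPEED_OF_SOUND = 343.0  # m/s

def ultrasonic_distance(echo_time_s):
    """Halve the trip: the pulse travels out to the obstacle and back."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# An echo after about 5.83 ms corresponds to roughly 1 m, comfortably
# inside the close range where ultrasonic sensors work best.
print(ultrasonic_distance(0.00583))
```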

Odometry: Sensors use wheel rotation and speed to estimate how far the vehicle has traveled
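Wheel odometry can be sketched from encoder ticks: each full wheel revolution advances the car by one circumference. The tick counts and wheel radius below are made-up illustrative values, not taken from any real vehicle.

```python
import math

# Wheel-odometry sketch: distance travelled from wheel-encoder ticks.
# All parameter values here are illustrative assumptions.

def distance_travelled(ticks, ticks_per_rev, wheel_radius_m):
    """Each wheel revolution advances the car one circumference."""
    revolutions = ticks / ticks_per_rev
    return revolutions * 2.0 * math.pi * wheel_radius_m

# 1000 ticks on a 100-tick-per-revolution encoder with a 0.3 m wheel:
d = distance_travelled(1000, 100, 0.3)
print(round(d, 2))  # about 18.85 m
```

In practice such estimates drift with wheel slip, which is why odometry is fused with GPS and other sensors rather than trusted alone.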

Dedicated short-range communication (DSRC): Used by V2V and V2I systems to send and receive vehicle information. DSRC is a wireless communication standard that enables reliable data transmission in active safety applications, permitting a vehicle to communicate with other vehicles

Infrared sensors: Use infrared spectrum to identify and track objects that are hard to detect in low lighting conditions

Global Positioning System (GPS): Determines the position of the car using satellite signals, with accuracy typically within several meters. Current GPS technology is therefore limited to coarse positioning; more accurate, advanced GPS is in development

Cameras: With a complex software suite but relatively simple hardware, cameras are essential for spotting things like lane lines on the highway, speed signs, and traffic lights. With better machine vision, cars can use cameras to identify everything they see and navigate accordingly, providing real-time obstacle detection

Light Detection and Ranging (LIDAR): Uses laser beams to estimate the distance between obstacles and the sensor with high resolution. The spinning unit seen on top of many self-driving cars is the LIDAR. It fires out millions of laser pulses every second, measures how long each takes to bounce back, and uses that data to build a precise, constantly updated 3D map that spots obstacles instantly and is easier for a computer to understand than a 2D camera image
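The time-of-flight idea behind LIDAR can be sketched in a few lines: halve the light's round-trip distance for range, then sweep the beam angle to turn ranges into map points. The flight time below is an invented illustrative value.

```python
import math

# LIDAR sketch: each laser return yields a range from its time of
# flight at the speed of light; sweeping the beam angle converts
# ranges into 2-D map points.  Input values are illustrative.

C = 299_792_458.0  # speed of light, m/s

def lidar_range(time_of_flight_s):
    """Out-and-back trip, so halve the light-travel distance."""
    return C * time_of_flight_s / 2.0

def to_point(angle_rad, range_m):
    """Convert one (angle, range) return into an (x, y) map point."""
    return (range_m * math.cos(angle_rad), range_m * math.sin(angle_rad))

# A return after roughly 66.7 ns means an obstacle about 10 m away.
r = lidar_range(66.7e-9)
print(round(r, 2), to_point(0.0, r))
```

Accumulating millions of such points per second is what produces the dense 3D map described above.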

Radio Detection and Ranging (RADAR): Uses electromagnetic waves in certain bands that reflect off objects to determine their speed and distance. Radars bounce radio waves around to see their surroundings and are especially good at spotting big metallic objects; they are cheap and reliable. Separate short-range and long-range radars cover different depths

Central computer: Acts as the brain of the vehicle, receiving information from the various components and directing the vehicle's actions

Artificial intelligence: AI is a major focus of autonomous-vehicle testing and development, and the vehicles apply AI (a collection of discrete technologies) in new and innovative ways. Deep learning, which mimics neuron activity, supports functions like voice and speech recognition, voice search, image recognition and processing, motion detection, and data analysis. To perceive their visual surroundings, most self-driving cars combine three sensing systems: cameras, RADAR, and LIDAR. The AI synthesizes the data from these different systems to fully map out the surroundings and watch for unexpected obstacles. Machine learning trains computers to do things like detect lane lines and identify cyclists, vehicles, and infrastructure by showing them millions of examples
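Synthesizing data from several sensors usually means weighting each source by how much it is trusted. Here is a minimal one-dimensional sketch of the Bayesian (Kalman-style) fusion idea behind SLAM-type localization; the positions and variances are invented illustrative numbers.

```python
# Minimal 1-D Bayesian fusion sketch: combine a drifty dead-reckoned
# position with a noisy GPS fix.  All values are illustrative
# assumptions, not from any real AV stack.

def fuse(estimate, est_var, measurement, meas_var):
    """Fuse two Gaussian beliefs about the same quantity.

    The result is pulled toward whichever source has the smaller
    variance, which is the core of Bayesian sensor fusion.
    """
    k = est_var / (est_var + meas_var)           # Kalman gain
    fused = estimate + k * (measurement - estimate)
    fused_var = (1.0 - k) * est_var
    return fused, fused_var

# Odometry says 105.0 m (variance 4.0); GPS says 100.0 m (variance 1.0).
# The fused estimate sits closer to the more trusted GPS fix.
pos, var = fuse(105.0, 4.0, 100.0, 1.0)
print(round(pos, 2), round(var, 2))  # 101.0 0.8
```

Note that the fused variance (0.8) is lower than either input's, which is why fusing sensors beats using any one of them alone.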

Decision Making Approaches:

Fully autonomous cars can make thousands of decisions for every mile traveled. They need to do so correctly and consistently. Currently, AV designers use a few primary methods to keep their cars on the right path

Rule-based decision making: Engineers come up with all possible combinations of if-then rules and then program vehicles accordingly
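The if-then idea above can be sketched as an ordered rule check. The rules and thresholds here are invented for illustration; real rule sets are vastly larger.

```python
# Sketch of rule-based decision making: an ordered list of if-then
# rules evaluated against the current situation.  Rule names and
# thresholds are illustrative assumptions.

def decide(situation):
    if situation.get("pedestrian_ahead"):
        return "emergency_brake"
    if situation.get("gap_to_lead_car_m", float("inf")) < 20:
        return "slow_down"
    if situation.get("traffic_light") == "red":
        return "stop"
    return "cruise"

print(decide({"gap_to_lead_car_m": 15}))    # slow_down
print(decide({"traffic_light": "red"}))     # stop
print(decide({"pedestrian_ahead": True}))   # emergency_brake
```

The weakness of the pure rule-based approach is visible even here: every situation not anticipated by a rule silently falls through to the default.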

Hybrid approach: This employs both neural networks and rule-based programming as the best solution

Brute force: Engineers expose vehicles to millions of driving miles to establish statistically that the systems are safe and operate as expected
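A quick calculation shows why brute-force validation takes so many miles. This sketch uses the standard zero-failure Poisson bound; the target failure rate below is an illustrative assumption, not a regulatory figure.

```python
import math

# Brute-force validation sketch: how many failure-free test miles are
# needed to claim, at a given confidence, that the failure rate is
# below a target?  Zero-failure Poisson bound: N >= -ln(1 - conf) / rate.

def miles_required(target_failures_per_mile, confidence):
    return -math.log(1.0 - confidence) / target_failures_per_mile

# To show fewer than 1 failure per 100 million miles at 95% confidence:
n = miles_required(1e-8, 0.95)
print(f"{n:.3e} miles")  # on the order of 3e8 miles
```

Hundreds of millions of miles per claim is exactly why the simulation-based approaches below are attractive.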

Software-in-the-loop or model-in-the-loop simulations: A more feasible approach combines real-world tests with simulations, which can greatly reduce the number of testing miles required and is already familiar in the automotive industry. Digital-twin technology plays a major role in these tasks

Hardware-in-the-loop (HIL) simulations: Used to validate the operation of actual hardware. HIL simulations feed pre-recorded sensor data into the real systems and test their responses

Actuators: Actuators are the components of a machine responsible for moving and controlling the system. They are like the muscles of our body, responding to electrochemical signals from the brain. When the decision-making process calls for it, for example, the actuators keep the brakes applied
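Between a decision ("hold this speed") and the actuator sits a feedback controller. This is a minimal proportional-integral (PI) sketch; the gains and speeds are illustrative assumptions, not tuned for any real vehicle.

```python
# Actuation sketch: a minimal PI speed controller that turns a target
# speed into throttle or brake commands.  Gains and speeds are
# illustrative assumptions.

class PIController:
    def __init__(self, kp, ki):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def command(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral
        # Positive output drives the throttle, negative the brakes.
        return ("throttle", u) if u >= 0 else ("brake", -u)

ctrl = PIController(kp=0.5, ki=0.1)
# Travelling at 30 m/s with a 25 m/s target: the controller brakes.
print(ctrl.command(target=25.0, actual=30.0, dt=0.1))
```

The integral term is what lets the controller cancel steady errors (a headwind, a slope) that a purely proportional rule would leave uncorrected.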

Safety Concerns:

Whilst there is a great deal of apprehension around the safety of driverless cars on our roads, the latest vehicular communications have been designed to enable continuous, reliable, high-speed, authenticable interactions between moving vehicles. To ensure consistently safe operation, autonomous vehicles are equipped with numerous cameras and other types of sensors to carefully monitor the external environment in which they operate.

Types of connectivity:

There are 5 ways a vehicle can be connected to its surroundings and communicate with them

V2I (Vehicle to Infrastructure): This technology captures data generated by the vehicle and provides information about the infrastructure to the driver. V2I communicates information about safety, mobility, or environment-related conditions, allowing the vehicle to exchange data with the surrounding infrastructure and operate within the bounds of speed limits, traffic lights, and signage

V2V (Vehicle to Vehicle): This technology communicates information about the speed and position of surrounding vehicles through a wireless exchange. The goals are to avoid accidents, ease traffic congestion, and have a positive impact on the environment. A suitable road structure and an agreed-on set of rules of the road to guide self-driving vehicles are also essential

V2C (Vehicle to Cloud): This technology exchanges information with a cloud system, letting the vehicle use data from cloud-connected industries such as energy, transportation, and smart homes, and make use of the IoT

V2P (Vehicle to Pedestrian): This technology senses information about its environment and communicates it to other vehicles, infrastructure, and personal mobile devices. This enables the vehicle to communicate with pedestrians and is intended to improve safety and mobility on the roads

V2X (Vehicle to Everything): This technology interconnects all types of vehicles and infrastructure systems with one another. The connectivity spans cars, highways, ships, trains, and airplanes. V2X technology is projected to significantly improve transportation safety
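At the heart of V2V-style exchange is a small periodic message carrying each vehicle's position and speed. This sketch shows one such message and a naive closing-speed warning; the field names, one-dimensional geometry, and 2-second horizon are illustrative assumptions (real deployments use standardized formats such as the SAE J2735 Basic Safety Message).

```python
from dataclasses import dataclass

# V2V sketch: a basic safety message plus a naive collision warning.
# Fields, geometry, and thresholds are illustrative assumptions.

@dataclass
class BasicSafetyMessage:
    vehicle_id: str
    position_m: float   # 1-D position along the lane, for simplicity
    speed_mps: float

def collision_warning(ego, other, horizon_s=2.0):
    """Warn if the gap to the other vehicle closes within the horizon."""
    gap = other.position_m - ego.position_m
    closing = ego.speed_mps - other.speed_mps
    return closing > 0 and gap / closing < horizon_s

ego = BasicSafetyMessage("ego", 0.0, 30.0)
lead = BasicSafetyMessage("lead", 30.0, 10.0)
print(collision_warning(ego, lead))  # True: the 30 m gap closes in 1.5 s
```

Because the warning uses the other car's broadcast speed rather than a line-of-sight sensor, it works even when the hazard is hidden behind another vehicle, which is the main safety argument for V2V.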

5G Communications for Self-Driving Cars:

5G promises faster speeds than 4G (potentially 10 to 20 gigabits per second), lower latency, quicker responses to device requests, and the ability to connect to multiple devices at once without sacrificing performance. 5G operates using a type of encoding called OFDM and runs on frequencies both below and above 6 GHz. Cellular V2X (C-V2X) is a developing communication platform that leverages LTE; it provides an integrated solution for V2V, V2I, and V2N using cellular networks.

Autonomous Vehicles:

Software:

Whereas the hardware components of an autonomous car enable it to perform functions such as seeing, communicating, and moving, the software is like the brain: it processes information about the environment so that the car knows what action to take, whether to move, stop, or slow down. Autonomous-vehicle software can be categorized into three stages:

Perception Stage: The vehicle turns the raw information coming in from its sensors into actual meaning, for example converting camera data into visual representations of lanes, objects, and signs

Planning Stage: The vehicle combines the sensing information processed during the perception stage with the incoming V2X information to determine how to behave

Control Stage: The car translates its decisions, such as whether to brake, steer, or accelerate, into actions
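The three stages above can be sketched as a tiny pipeline. The data and decision logic are deliberately trivial placeholders; the field names and the 10-metre threshold are invented for illustration.

```python
# Sketch of the three-stage pipeline: perception -> planning -> control.
# All names and thresholds are illustrative assumptions.

def perceive(raw_sensor_frame):
    """Perception: turn raw readings into meaning (detected objects)."""
    return {"obstacle_ahead": raw_sensor_frame["lidar_min_range_m"] < 10.0}

def plan(world_model, v2x_info):
    """Planning: combine perception with V2X input to pick a behaviour."""
    if world_model["obstacle_ahead"] or v2x_info.get("hazard_reported"):
        return "stop"
    return "proceed"

def control(decision):
    """Control: translate the decision into actuator commands."""
    if decision == "stop":
        return {"brake": 1.0, "throttle": 0.0}
    return {"brake": 0.0, "throttle": 0.3}

frame = {"lidar_min_range_m": 6.5}
cmd = control(plan(perceive(frame), {"hazard_reported": False}))
print(cmd)  # braking: an obstacle sits inside 10 m
```

Keeping the stages as separate functions mirrors how real AV stacks isolate perception, planning, and control so each can be tested and validated independently.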

Hardware:

The necessary hardware can be divided into built-in and brought-in connection systems. Most brought-in devices are plugged into the OBD (onboard diagnostics) port for power and access to vehicle data. Built-in solutions were mostly driven by safety, while brought-in devices usually focus on customer segments and specific use cases.

Automobiles of the future will be as different from today as the first automobiles differed from the horse and buggy. Driverless cars will be significantly more energy-efficient, safer, less damaging to the environment, and more economical to operate than any mode of transportation in the human experience. The landscape of vehicular transportation and urban safety is undergoing a fundamental change due to automation.

The transition to the future of automotive transport will not be quick, but the pace of technological improvement will continue to increase. Most automakers have now understood the scale of the challenge and are pressing ahead to succeed in this technological space.

“Think of the autonomous vehicle as just being a computer on wheels. Think of how many times your computer does something that it wasn’t supposed to do.”