Opinion: Picking a Path for Autonomous Data

Artificial intelligence and automation could someday disrupt major sectors of the global economy, including trucking. While much is said about how automated operation could help mitigate the driver shortage, many developmental challenges remain before wide deployment of autonomous Class 8 trucks can be realized.


For one, Class 8 trucks are some of the heaviest and largest vehicles on the road. Plus, their shape presents both opportunities and hurdles for developers. These issues, and more, require technology more sophisticated than that found on autonomous passenger cars.

For example, Class 8 trucks require vision capabilities superior to those of smaller vehicles to allow earlier object detection. A truck's height gives cameras a superior vantage point for determining distance and assessing the driving environment — an advantage that can help with mapping an area. But trucks are also prone to blind spots given their length, which means more cameras are needed.

Plus, while the tractors belong to the motor carriers, cargo containers often do not, meaning that the two pieces of property are outfitted for autonomous driving separately yet must work in tandem once hooked up for the road.

To address these challenges, the trucking industry has two technological paths it can follow: object fusion and raw data fusion.

Object fusion, currently widespread in the general AV industry, relies on sensors to detect objects separately. With this technology, unrelated sensors — for example, cameras and radar — won’t necessarily agree on what is unfolding and how the system should respond. Stationary objects are at risk of going undetected, which could cause the system to make faulty decisions. To compensate, many developers of AV perception systems include additional sensors — most commonly lidar — with higher accuracy, but these sensors can be incredibly expensive, with some costing upward of tens of thousands of dollars each.
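To make the failure mode concrete, here is a toy sketch — not any vendor's actual implementation, and all names are illustrative — of object-level fusion, where each sensor independently emits an object list and the fusion layer keeps only detections the sensors agree on:

```python
# Illustrative sketch of object-level fusion: each sensor reports its own
# object list, and the fusion layer reconciles them, dropping detections
# that lack cross-sensor agreement.

def fuse_object_lists(camera_objs, radar_objs, max_gap_m=2.0):
    """Keep camera objects confirmed by a radar detection at a similar range."""
    fused = []
    for cam in camera_objs:
        confirmed = any(abs(cam["range_m"] - rad["range_m"]) < max_gap_m
                        for rad in radar_objs)
        if confirmed:
            fused.append(cam)
        # Unconfirmed objects are dropped -- this is how a stationary
        # obstacle seen by only one sensor can go undetected.
    return fused

camera = [{"label": "tire", "range_m": 40.0}]
radar = []  # radar processing often filters out stationary returns
print(fuse_object_lists(camera, radar))  # -> [] : the tire is missed
```

The dropped detection in the last line is the hazard the paragraph above describes: when the sensors disagree, the system can silently discard a real obstacle.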

Raw data fusion offers an alternative. This approach enables the detection of obstacles of any size and shape, and determines the amount of free space around an object without requiring an accurate classification of the object itself. The system's low-level fusion of camera features with pre-filtered radar detections enables accurate image-based ego-motion estimation (estimating the motion of the camera system), which separates objects in space and positions each object accurately.
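A toy sketch of the idea — not VAYAVISION's actual pipeline, and all names are illustrative: low-level fusion attaches raw radar ranges to image regions before any classification step, so free space can still be bounded by an object no classifier could name.

```python
# Toy sketch of low-level (raw data) fusion: attach raw radar ranges to
# image columns *before* classification, so an obstacle of arbitrary shape
# still limits the estimated free space.

def free_space_per_column(num_cols, radar_points, max_range_m=100.0):
    """radar_points: list of (column_index, range_m) raw detections."""
    free = [max_range_m] * num_cols
    for col, rng in radar_points:
        # Any raw return shrinks free space in that direction, whether or
        # not a classifier could label the object producing it.
        free[col] = min(free[col], rng)
    return free

# A return at 35 m in column 2 caps free space there; no object label needed.
print(free_space_per_column(4, [(2, 35.0)]))  # -> [100.0, 100.0, 35.0, 100.0]
```

Because the fusion happens on raw measurements rather than per-sensor object lists, an unclassifiable obstacle still constrains where the vehicle may drive.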

As a result, any obstacle on the road can be detected, including stationary obstacles with arbitrary shapes, sizes and looks. Roads are full of “unexpected” objects that are absent from training data sets, even when those sets are captured while traveling long distances.

Systems based mainly on deep neural networks fail to detect these unexpected objects; low-level fusion does not. An additional benefit is that sensors can be upgraded without major overhauls to the perception algorithms.

Someday, automated vehicles could be tasked with navigating urban areas and could master those environments using image-based learning and digitized maps maintained regularly by local agencies. But to unlock the promise of automated trucks, AV developers must hone vehicles' ability to detect everything in their path. After all, these trucks will operate in wide swaths of the country, experiencing a range of weather, environmental and infrastructure conditions. That includes detecting the unexpected — a tire on the road, an object protruding from another vehicle and so on. Further progress in autonomous driving for any vehicle broadly requires these improved detection capabilities, but the need is acute in trucking.

With a human truck driver along for the ride to take over should the need arise, an Embark autonomous truck earlier this year traveled 2,400 miles from Los Angeles to Jacksonville, Fla. Because a human driver was present, the truck had to take federally mandated rest breaks, bringing the trip's length to five days. When the day comes that technology and infrastructure are advanced enough that a human backup is no longer required, developers expect such a journey to take two days. Handling the range of natural environments and driving conditions such trucks might encounter will be a central focus of technological development in the years to come.

Ultimately, this technology could help the trucking industry realize gains in efficiency and productivity, fueling economic growth and job creation — all the more reason for the industry to commit to a clear-eyed vision for the future.

Ronny Cohen is co-founder and CEO of VAYAVISION, an environmental perception software solution provider for autonomous vehicles. Cohen has 30 years of experience in the technology industry, previously co-founding Backwell, a startup in the health industry.