Every business that needs to move goods or people is going to have to face it: There are likely to be millions of autonomous vehicles on the roads a decade from now. But who’s going to teach them all to drive?

Maybe your colleagues, or your competitors, if Intel subsidiary Mobileye gets its way. And it expects you to pay for the privilege.

Mobileye and other businesses, including Nvidia, Visteon and Toyota, are working to make car electronics as much a part of this year’s CES event in Las Vegas as consumer electronics.

New technology Mobileye is showing at CES 2018 in Las Vegas this week will allow cars with advanced driver assistance systems (ADAS) such as lane-keeping or adaptive cruise control to watch how and where you drive, gathering data that will be of use to fully autonomous vehicles in the future. It’s crowd-sourcing on wheels.

Mobileye expects that auto makers including BMW, VW and Nissan will sell millions of cars equipped with front-facing cameras and the fourth generation of its EyeQ chips this year. As the vehicles are driven around, these chips will scan their surroundings and build detailed maps relating road features to stationary landmarks such as traffic signs and buildings.

The chips are powerful enough to condense 300 million pixels per second of video into just 10 kilobytes of mapping data per kilometer, according to Mobileye’s Chief Communications Officer, Dan Galves.
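To get a feel for that compression ratio, here is a back-of-envelope calculation. The pixel rate and the 10 KB/km figure come from Mobileye; the vehicle speed and bytes-per-pixel values are assumptions for illustration only.

```python
# Rough estimate of the raw-video-to-map-data reduction described above.
PIXELS_PER_SECOND = 300e6   # stated by Mobileye
MAP_BYTES_PER_KM = 10e3     # 10 KB per kilometer, stated by Mobileye
SPEED_KMH = 100             # assumption: highway driving
BYTES_PER_PIXEL = 1         # assumption: ~1 byte per pixel of raw video

seconds_per_km = 3600 / SPEED_KMH                  # 36 s to cover 1 km
raw_bytes_per_km = PIXELS_PER_SECOND * seconds_per_km * BYTES_PER_PIXEL
ratio = raw_bytes_per_km / MAP_BYTES_PER_KM

print(f"raw video: ~{raw_bytes_per_km / 1e9:.1f} GB/km")   # ~10.8 GB/km
print(f"reduction: ~{ratio:,.0f} to 1")                    # ~1,080,000 to 1
```

Under those assumptions, roughly 10 GB of raw pixel data per kilometer is boiled down to 10 KB of map data, a reduction on the order of a million to one.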

That condensed data is then uploaded to Mobileye’s cloud database, part of what it calls its Road Experience Management program, where it can be used to train and guide future autonomous vehicles.

Walden Kirsch/Intel

Professor Amnon Shashua (left), senior vice president of Intel and CEO/CTO of Mobileye, arrived at the 2018 Consumer Electronics Show (CES) in Las Vegas in the back seat of an autonomous car. He joined Intel CEO Brian Krzanich on stage for Intel's preshow keynote on Monday, Jan. 8, 2018.

Owners and drivers concerned about privacy needn’t worry, said Galves: The data is anonymized. The only thing Mobileye records about its origin is the brand of vehicle it came from, and that’s only because of revenue-sharing agreements it has with certain auto makers.

But it’s worth bearing in mind that letting employees drive these vehicles into industrial or military establishments will allow them to create detailed 3D maps of potentially sensitive areas.

Mobileye’s EyeQ4 chip is powerful enough for so-called Level 2 and Level 3 autonomous vehicles -- ones that assist drivers with steering and braking, or do all of the driving under human supervision. It expects 11 auto makers to launch vehicles containing the chip this year or next.

Further out, it expects many auto makers to adopt its next-generation chip, the EyeQ5. Building a Level 4 or Level 5 autonomous vehicle, capable of driving without human supervision, should be possible with just a couple of these chips and an Intel Atom processor, according to Galves. The combination can deliver around 50 tera-operations per second of processing at a lower cost and power consumption than competing architectures that require several hundred tera-ops to achieve the same results, he said.

Mobileye is now working with Chinese auto maker SAIC on Level 3, 4 and 5 autonomous vehicles, it said at CES. SAIC will also deploy the REM data gathering system in China, working with local mapping company NavInfo.

Other chip makers and electronics companies aren’t just going to sit back and watch while Intel drives off with the whole market; they’re pushing for their share too.

Auto parts maker Visteon has been talking up its own autonomous vehicle platform, DriveCore, since the middle of last year, but now you can finally see its hardware, software and development environment components in the company’s booth at CES.

DriveCore is a modular platform, allowing auto makers to choose as much processing capacity as they need for the functions they’re building, according to Upton Bowden, Visteon’s New Technology Planning Director. It can scale from 500 Gflops (500 billion floating-point operations per second) to 20 Tflops by adding processor daughter cards. The platform is not tied to a particular processor or accelerator architecture, allowing auto makers that have committed to Nvidia or Qualcomm processors, say, to use it with the appropriate daughter cards.
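The daughter-card approach amounts to simple capacity planning: start from the base board and add cards until you meet the compute budget of the functions you’re building. The sketch below illustrates the idea; the per-card capacity is a hypothetical figure, not one Visteon has published, while the 500 Gflops floor and 20 Tflops ceiling are the stated range.

```python
import math

# DriveCore-style capacity planning sketch (illustrative assumptions).
BASE_GFLOPS = 500       # entry configuration, stated by Visteon
CARD_GFLOPS = 2_500     # assumed capacity of one daughter card (hypothetical)
MAX_GFLOPS = 20_000     # 20 Tflops top configuration, stated by Visteon

def cards_needed(target_gflops: float) -> int:
    """Smallest number of daughter cards that meets the compute target."""
    extra = max(0.0, target_gflops - BASE_GFLOPS)
    n = math.ceil(extra / CARD_GFLOPS)
    if BASE_GFLOPS + n * CARD_GFLOPS > MAX_GFLOPS:
        raise ValueError("target exceeds the platform's stated maximum")
    return n

print(cards_needed(500))      # 0 -- the base board suffices
print(cards_needed(10_000))   # 4 cards, at the assumed per-card capacity
```

Because the platform is processor-agnostic, the same planning logic would apply whether the cards carry Nvidia, Qualcomm, or other accelerators; only the per-card figure changes.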

The software layer too allows auto makers to mix and match algorithms from different sources, perhaps using off-the-shelf navigation systems with a customised vision system.

This modularity raises the possibility that buyers might be able to upgrade their vehicles with after-market processor additions, although, Bowden said, that’s unlikely to happen in practice for regulatory and product-liability reasons.

Businesses developing autonomous vehicle apps will find plenty to work with at CES. In addition to Visteon’s controller, Toyota Research Institute has wheeled out its latest autonomous vehicle platform in Vegas too. "Platform 3.0" is built on a Lexus LS 600hL and includes a 360-degree sensor system that can "see" up to 200 meters, the company said.

Nvidia is also at the show. Its CEO Jensen Huang said the company is working with Aurora Innovation on a hardware platform for Level 4 and 5 autonomous vehicles, built around Nvidia’s DRIVE Xavier processor. VW’s ID Buzz, an electric microbus due on the roads around 2022, meanwhile, will have an AI copilot built on Nvidia’s Drive IX technology, he said.

Uber, the ride-hailing company, is planning a fleet of self-driving cars and trucks, and their AI systems will use Nvidia chips, Huang said.

Rival ride-hailing company Lyft has its vision of a self-driving transport system on the road in Las Vegas this week. It has partnered with Aptiv (formerly Delphi Automotive) to offer rides around town in vehicles loaded with Aptiv’s autonomous technologies. These are only Level 3 vehicles, needing a human driver to get them out of and back into parking lots, but back in Palo Alto, Lyft is working on Level 5 driving technologies of its own.