Current LiDAR systems are mechanical, slow, and expensive. And while many companies are racing to mass-produce a cheap solid-state product, none is on the market yet. Moreover, those heading toward production can see only up to 250 meters in the best of circumstances. That may be enough for autonomous vehicles operating at low speeds in heavily mapped environments, but it might not give a vehicle traveling at high speed enough room to stop, and that goes doubly for semi-trucks.
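To see why range matters, a rough back-of-the-envelope calculation helps. The speed, reaction time, and braking figures below are illustrative assumptions, not AEye's numbers, but they show how quickly 250 meters of sensing range gets used up at highway speed:

```python
# Rough, illustrative stopping-distance estimate for a loaded semi-truck.
# All figures below are assumptions for illustration only.

speed_kmh = 105                      # assumed highway speed
speed_ms = speed_kmh / 3.6           # ~29.2 m/s
reaction_time_s = 1.5                # assumed sensing + decision latency
deceleration_ms2 = 2.5               # assumed braking rate for a loaded truck

reaction_distance = speed_ms * reaction_time_s
braking_distance = speed_ms ** 2 / (2 * deceleration_ms2)
total = reaction_distance + braking_distance

print(f"Reaction distance: {reaction_distance:.0f} m")   # ~44 m
print(f"Braking distance:  {braking_distance:.0f} m")    # ~170 m
print(f"Total:             {total:.0f} m")                # ~214 m
```

Under those assumptions a truck needs roughly 214 meters to come to a stop, which leaves very little margin inside a 250-meter sensing range.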

These limitations are just part of what's keeping autonomous vehicles from the mainstream market. However, AEye says it can fix that.

Its AE100 solid-state LiDAR is more than just a detection system--it's a computer vision system that physically combines LiDAR with a high-resolution, low-light camera in a single unit and uses artificial intelligence to process the data on the fly.

Camera pixels and the LiDAR voxels that make up a 3D point cloud of the scanned environment are merged in real time to create what the company calls "dynamic vixels," and a software-definable feedback loop zeroes in on objects deemed important in the field of view, much the way the human visual cortex focuses attention.
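AEye hasn't published the internals of this fusion, but the basic idea of pairing LiDAR points with camera pixels can be sketched. The camera intrinsics, field names, and projection model below are hypothetical stand-ins; the snippet simply projects each 3D point into the image and attaches the color it finds there:

```python
import numpy as np

# Hypothetical camera intrinsics (focal lengths and principal point, in pixels).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def fuse_points_with_image(points_xyz, intensities, image):
    """Pair each LiDAR return with the camera pixel it projects onto.

    points_xyz  : (N, 3) points in the camera frame (z pointing forward)
    intensities : (N,) LiDAR return intensities
    image       : (H, W, 3) RGB camera frame
    Returns a list of fused records -- a rough stand-in for a "vixel".
    """
    fused = []
    h, w, _ = image.shape
    for (x, y, z), intensity in zip(points_xyz, intensities):
        if z <= 0:                                   # behind the camera, skip
            continue
        u, v, _ = (K @ np.array([x, y, z])) / z      # pinhole projection
        u, v = int(round(u)), int(round(v))
        if 0 <= u < w and 0 <= v < h:
            r, g, b = image[v, u]
            fused.append({"xyz": (x, y, z),
                          "intensity": intensity,
                          "rgb": (int(r), int(g), int(b))})
    return fused
```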

In doing so, it discards the 90% of gathered data that AEye says is useless and sends only the crucial 10% to the vehicle's decision-making system. Because it only has to make sense of that 10%, the company says it's faster than any other system heading to market.
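The article doesn't say how AEye decides which 10% matters, so the snippet below is only a generic illustration of ranking points and forwarding a small fraction downstream. The relevance score here (distance to a region of interest) is entirely made up:

```python
import numpy as np

def prioritize_points(points_xyz, roi_center, keep_fraction=0.10):
    """Keep only the most 'relevant' fraction of LiDAR points.

    Relevance here is simply proximity to a region of interest --
    a placeholder for whatever the real system uses.
    """
    points_xyz = np.asarray(points_xyz)
    scores = -np.linalg.norm(points_xyz - roi_center, axis=1)  # closer = higher score
    n_keep = max(1, int(len(points_xyz) * keep_fraction))
    top_idx = np.argsort(scores)[-n_keep:]                     # top 10% by score
    return points_xyz[top_idx]

# Example: 10,000 scanned points, keep the ~1,000 nearest a crosswalk.
cloud = np.random.uniform(-50, 50, size=(10_000, 3))
kept = prioritize_points(cloud, roi_center=np.array([10.0, 2.0, 0.0]))
print(len(kept), "points forwarded to the decision-making system")
```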

"The ability of the 3D component of sensing to actually change its pattern and mechanically be merged with the camera allows for intelligence," explains AEye Chief of Staff Blair LaCorte. "Just like your eye, it can simultaneously see things in its periphery and put more energy on things it's focused on."

Right now there is an ongoing debate about whether autonomous vehicle systems are, or can ever be, better than humans at processing and reacting to critical situations. For now, people hold an edge: the human eye processes information at roughly 27 Hz, while most LiDAR systems scan at 10 Hz, and people combine multiple senses with past experience to quickly pick out what's important in their environment, ignore the rest, and decide how to react.

AEye has built a LiDAR system that does the same thing, but better--it detects objects at 1,000 meters, scans at 100 Hz, and uses a feedback loop to send and receive information so it can prioritize what's relevant near-instantaneously and influence the vehicle's path planning.

"We're searching 3,000 times faster and better than any system anywhere, but we're also predicting where we want to interrogate," says LaCorte in a phone interview. "So when I enter an intersection, it may focus on the entry points of the intersection, but it may also put extra energy in the direction I'm turning on an unprotected left-turn."

By combining the ability to search and focus on different things with feedback from the vehicle's path-planning system--so the sensor knows what the vehicle is doing and why--AEye has created an agile solid-state LiDAR with an architecture of intelligence. That means, for the first time, there could be a LiDAR that processes information the way humans do.

Backing up this claim is VSI Labs, which AEye says has independently verified the technology. VSI tested the LiDAR system's ability to track a 20-foot moving truck on an airport runway and found that it consistently followed the vehicle, continuously scanned down the length of the 914-meter runway and beyond it, and detected signs and markers en route.

In setting a new high bar for automotive-grade LiDAR, AEye has caught the attention of Silicon Valley heavyweights such as Intel Capital and Airbus Ventures, and one of its co-founders, Jordan Greene, was featured on the FORBES 30 Under 30 Manufacturing 2019 list. The company announced today that it closed a $40 million Series B funding round led by Taiwania Capital. It will use the funds to scale and enter production, and expects to have its AE200 unit, which is targeted toward passenger vehicles with Level 4 capabilities, in low-volume production during the first half of 2019. The AE100, which is targeted toward "robo-taxis" like Waymo's, will follow in the second half of 2019.

The AE100 is expected to retail for around $5,000, and the AE200 will cost in the range of $1,000, depending on the system configuration. The Bay Area-based company says 30 to 60 vehicles are road-testing its system, and it will announce more partners in the next few months.