Startup Maps AI into Flash Array

Mythic preps low-power inference processor

Rick Merritt, EETimes 5/16/2018 02:01 AM EDT

AUSTIN, Texas — Wedged between a coffee shop and a hair salon in a gentrifying suburb here, a couple dozen engineers are exploring a new direction in computing. Startup Mythic aims to map neural networks into NOR memory arrays, calculating and storing results in ways that shave power consumption by perhaps two orders of magnitude.

If it works, the startup could leapfrog digital processors and cores from the likes of Intel, established IP providers, and a handful of well-heeled startups in China. They all aim to fill sockets in next-generation surveillance cameras, drones, factory gear, and all sorts of other embedded systems jumping on the artificial-intelligence bandwagon, including, someday, self-driving cars.

“We knew from grad school that mixed-signal processing was a great fit for this app,” said David Fick, who launched the company with a colleague at the University of Michigan. “You need to store a lot of weights, and flash memory, with its adjustable threshold voltage on every transistor, is very appealing.”
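The idea Fick describes can be sketched numerically. In a compute-in-memory array, each flash cell's programmable conductance holds one network weight, input activations arrive as voltages on the rows, and the current summed on each column is a dot product, per Ohm's and Kirchhoff's laws. The model below is an illustrative sketch only; the array shape, values, and function names are assumptions, not Mythic's design.

```python
# Hypothetical model of analog multiply-accumulate in a flash array:
# weights live as cell conductances G (siemens), inputs as row voltages
# V (volts), and each column's bit-line current is I_j = sum_i V_i * G_ij.
# All names and numbers here are illustrative, not from Mythic.

def bitline_currents(conductances, voltages):
    """Return per-column currents for a rows-x-cols grid of cells."""
    n_rows = len(voltages)
    n_cols = len(conductances[0])
    return [
        sum(voltages[i] * conductances[i][j] for i in range(n_rows))
        for j in range(n_cols)
    ]

# A toy 3-input, 2-output layer: the matrix-vector product happens
# "inside" the memory, with no weight movement to a separate ALU.
G = [[0.5, 0.1],
     [0.2, 0.4],
     [0.3, 0.3]]
V = [1.0, 0.5, 2.0]

print(bitline_currents(G, V))
```

The power savings the company claims come from exactly this structure: the weights never leave the array, so the dominant energy cost of a digital accelerator, moving weights between memory and compute units, largely disappears.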