If a cell phone can essentially see, hear, and detect movement like a person, shouldn’t it start to think like a person, too? That’s the basis of Qualcomm’s Zeroth processor, designed to emulate millions of neurons, a fraction of the billions within the human brain.

A version of the Zeroth has already been built into a robotic platform that learns by being encouraged—quite literally, “good robot”—rather than being traditionally programmed, Qualcomm executives said.

For years, technologists have talked about personal assistants, pieces of code that pull in data and try to coalesce it into relevant, useful information. Qualcomm’s Zeroth could form the hardware foundation upon which future personal assistants are built.

“Wouldn’t it be swell to have a device that you could train?” said M. Anthony Lewis, the senior director and project engineer responsible for the Zeroth, in an interview. “It leads to the possibility of a customized user experience for each individual cellphone user, to be more like the phone that they want rather than the phone that they get.”

In a few years, Qualcomm envisions the Zeroth sitting alongside a future Qualcomm Snapdragon, Lewis said.

Snapdragon chips power a number of high-end smartphones and tablets, including the Samsung Galaxy S4, the Galaxy Note 3, the Google/Asus Nexus 7, and the HTC One mini, among others.

Conventional microprocessors were originally designed to work serially: executing one instruction, then the next, then the next. That led to ever-increasing clock speeds, to execute those instructions as fast as possible.

Then other improvements were introduced: wider buses, allowing the processor to chew on more data at a time, and finally parallelism, which gave rise to the multicore chips common today. The latter allows a microprocessor to execute an instruction on one core while another core handles a separate task simultaneously.
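The difference is easy to see in code. Here’s a minimal Python sketch (the workload and task sizes are illustrative, not tied to any particular chip) that runs the same CPU-bound job serially, then spreads it across multiple cores:

```python
import time
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    """A stand-in for a CPU-bound task."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [2_000_000] * 4

    # Serial: one instruction stream works through the tasks in order.
    start = time.perf_counter()
    serial = [crunch(n) for n in tasks]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    # Parallel: the same tasks handed to multiple cores at once.
    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(crunch, tasks))
    print(f"parallel: {time.perf_counter() - start:.2f}s")
```

On a multicore machine, the parallel run finishes in a fraction of the serial time, even though each individual core is no faster.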

Cognitive computing

Massively parallel processors are seen as the future, if only because they can work on a multitude of tasks at once. That’s how the human brain operates: processing the vast amount of data our eyes, ears, skin, nose, and mouth produce, building the sensory experience of a morning brunch on the patio of a mountain cabin, for example.

Instead of transistors and circuits, however, the brain uses a network of neurons to pass information. IBM and Google are working on so-called cognitive computing, as are national research initiatives in both the United States and the European Union. The power of a neural network is usually measured by the parameters, or connections, forged between its individual components; Lewis said that Zeroth is scalable to 10 million neurons and beyond, still a fraction of the billions of neurons within the brain itself.

Qualcomm’s neural-processing units pass data in very small “spikes” of information, Lewis said, rather than the 32- or 64-bit chunks most processors are used to. But run in parallel, these small spikes of data can transmit large amounts of information—and, Qualcomm hopes, run cool enough to serve as a coprocessor in a phone or a data center.
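Qualcomm hasn’t published the details of its spiking scheme, but a common textbook approach, rate coding, gives a feel for the idea: a value is carried not as a multi-bit word but as the frequency of one-bit spikes over a time window. This Python sketch is an illustration of that general concept, not Qualcomm’s actual protocol:

```python
import random

def encode(value, window=100):
    """Rate coding: each tick fires with probability `value` (0..1)."""
    return [1 if random.random() < value else 0 for _ in range(window)]

def decode(spikes):
    """Recover an estimate of the value by counting spikes."""
    return sum(spikes) / len(spikes)

train = encode(0.73)
print(decode(train))  # roughly 0.73; longer windows give better estimates
```

Each spike is tiny, but many spike trains running in parallel can carry a great deal of information, which is the trade-off the paragraph above describes.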

The problem with parallel processors is that programming them is relatively new territory, while programming in a serial fashion is well understood. To help solve this problem, Qualcomm plans to release a tool chain next year. “The quick-start guide shouldn’t say, step one—earn a degree in neuroscience. Step two, program the chip,” Lewis said.

Qualcomm also built a version of the Zeroth chip into a small wheeled robot, which the company trained to move around a playfield, stopping at certain squares along the way. The robot wasn’t ordered to move to specific squares; instead, being told “Good robot” when it stopped at the right ones taught it what it was supposed to do.
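That “good robot” feedback loop is, in spirit, simple reinforcement learning. Here’s a minimal Python sketch of the idea; the playfield, squares, and update rule are illustrative assumptions, not Qualcomm’s algorithm:

```python
import random

squares = ["A", "B", "C", "D"]
good_squares = {"B", "D"}            # where the trainer says "good robot"
preference = {s: 0.0 for s in squares}

def choose():
    """Mostly exploit what's been learned, occasionally explore."""
    if random.random() < 0.1:
        return random.choice(squares)
    return max(squares, key=preference.get)

for _ in range(500):
    square = choose()
    reward = 1.0 if square in good_squares else 0.0   # "Good robot!"
    # Nudge the preference for this square toward the observed reward.
    preference[square] += 0.1 * (reward - preference[square])

print(preference)  # B and D end up strongly preferred
```

No square is ever explicitly programmed in; the behavior emerges entirely from which choices get rewarded.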

And that same model could be used to “train” a future cell phone. “A cell phone is kind of like a robotic device,” Lewis said. “It lacks arms and legs, but it’s a robot in everything but name.”
