Global trade in food and luxuries has been going on for centuries, but the next big commodity looks set to be invisible, yet essential. As we change how we interact with each other and the world around us, artificial intelligence is destined to become far more than a future-gazing distraction for geeks and scientists.

"AI is going to be a commodity, available on mobiles and computer terminals," says technology analyst Peter Cochrane, "It may well be the most vital of all commodities, surpassing water, food, heat and light. Without it, we will certainly not survive as a species."

One of our problems is data - masses of it. A few hundred years of scientific inquiry and the invention of the data-generating and sharing mechanism that is the internet have left reams of crucial information unused and unanalysed.

Medical teams need to process and cross-reference both personal data and that from previous patients and studies, while financiers find it ever harder to wade through enough data to make coherent predictions of movements on the stock exchanges.

What these industries need - and will get - is "machine learning" from cloud-based computers capable not just of spotting trends in data but making predictions and learning from conclusions. AI is not about sentient robots, but machines that mimic our organic intelligence by adapting to, as well as recognising, patterns in data. AI is about making machines understand.

Watson is such a supercomputer. Comprising at least 90 servers and 16,000 gigabytes of RAM, Watson - developed by IBM - can process the equivalent of a million books per second. And using revolutionary smart learning software called DeepQA, Watson can interpret the meaning of questions. So what makes Watson different from other supercomputers? "Hardware-wise, there's not much," says Cochrane. "It's the more than 300 algorithms it uses that make the difference."
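The central idea behind those hundreds of algorithms can be sketched in a few lines: many independent scorers each rate every candidate answer, and the combined confidence picks a winner. The Python below is a heavily simplified, hypothetical illustration of that ensemble structure - Watson's real scorers weigh textual evidence drawn from millions of documents.

```python
# Toy sketch of DeepQA's central idea: many independent scoring
# algorithms each rate every candidate answer, and the combined
# confidence picks the winner. The two stand-in scorers below are
# hypothetical; they only illustrate the ensemble structure.
def pick_answer(question, candidates, scorers):
    best, best_conf = None, float("-inf")
    for cand in candidates:
        # Average every algorithm's score into one confidence figure
        conf = sum(score(question, cand) for score in scorers) / len(scorers)
        if conf > best_conf:
            best, best_conf = cand, conf
    return best

scorers = [
    # Word overlap between question and candidate
    lambda q, c: len(set(q.lower().split()) & set(c.lower().split())),
    # Mild preference for shorter answers
    lambda q, c: -len(c) / 100,
]

print(pick_answer("what is the capital of France",
                  ["Paris is the capital of France", "a large country"],
                  scorers))  # → Paris is the capital of France
```

The point is not any single clever scorer but the aggregation: adding, removing or re-weighting scorers changes the system's judgment without rewriting it.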

There are hundreds of other supercomputers, but none matches Watson at interpreting and answering questions - not by a long way.

Applications like Watson represent the latest benchmark of computing power, but the era of artificial intelligence as a whole began long ago. Most of us use it each time we punch a few words into a search engine.

"AI has been out there as a commodity for some time," says Steve Furber, the ICL Professor of Computer Engineering at the University of Manchester in Britain. "Google is built on AI as the foundation of its business; it's just not artificial 'common sense'." In dividing up your search terms and cross-referencing them with webpages to bring you the most relevant, ranked results, Google processes 24 petabytes (1,000 terabytes) of data per day, and it does so using top secret algorithms.

A few months ago Google X Labs announced it had developed a nine-layer neural network that could detect faces; a total of 16,000 computer cores spent three days sifting through 10 million random images on YouTube, recognising faces despite not being told what a face was. It even managed to tell the difference between human faces and cat faces.

This apes how the brain processes data, by creating a neural network that gets better at identifying links and relationships the more data it receives. It is like a student whose analytical powers increase with more tutoring.
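That "learning from tutoring" can be seen in the simplest possible artificial neuron. The sketch below is an illustrative single perceptron, not Google's nine-layer network: each time it sees a labelled example it nudges its weights, so its answers improve with training - real face-detecting networks stack many layers of such units.

```python
# Minimal sketch of learning from examples: a single artificial neuron
# (perceptron) adjusts its weights whenever it mislabels a training
# example, so accuracy improves with more "tutoring". Real networks
# like Google's face detector stack many layers of such units.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # one weight per input
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # 0 when correct; ±1 when wrong
            w[0] += lr * err * x1       # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# "Tutoring" data: teach the neuron the logical AND of its two inputs
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in samples])  # → [0, 0, 0, 1]
```

Nothing in the code says what AND means; the rule emerges from repeated exposure to examples, which is exactly the sense in which such systems "learn".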

The kind of contextual image recognition achieved in the labs is already being used by Google to refine results in Google Voice. The most striking change is the trend towards mobile web searching; the Pew Internet Project reports that about a quarter of US citizens mostly go online using a smartphone.

Despite the prevalence of the touch screen, interacting with computers by speaking to them is more convenient, and is destined to take over once accuracy increases.

Apple and Android smartphones already have this with Siri and Google Voice and there are third-party apps, such as Evi. "Siri and Evi are open text query systems that perform similar functions with lower performance (than Watson), but more applicability," says Furber.

Evi has been installed on more than a million mobile devices. "Apps such as Evi are taking AI into the mainstream," says its British inventor William Tunstall-Pedoe. "Evi has a huge knowledge base she can understand and reason with," he says; it contains almost one billion facts. "This knowledge includes common sense about the world that was previously limited to human beings."

As well as sending SMS using spoken commands, Evi will answer just about any question using a mix of its own database and the wider web, although it won't yet give tailored financial predictions. Nor does it learn about individual users. Could we all soon strike up a conversation with a supercomputer like Watson instead?

"There is no obvious reason why Watson could not be deployed in a similar manner to raise the standard of the 'common sense' aspects of these systems," says Furber. "Within 10 years we'll all have access to it."

This article appeared in the South China Morning Post print edition as Why the future looks bright