“Three years ago, we dedicated ourselves to the single greatest endeavour in the history of our company,” Jen-Hsun Huang, CEO of Nvidia, said at the GPU Technology Conference earlier this month. “We decided to be all in on AI. For the first time, we would design [a chip] that is dedicated to this field of work. Dedicated to accelerating AI; dedicated to accelerating deep learning.

“I think we are going to realize looking back that one of the biggest things that ever happened is AI.”

The Tesla P100 is the product of around $2.5 billion worth of research and development at the hands of thousands of computer engineers.

The Tesla P100 chip contains more than 15 billion transistors and is described by Nvidia's CEO as "a beast of a machine." (Image: Nvidia)

“The odds of this working at all is approximately zero,” said Huang. “We are changing so many things in one project. The Tesla P100 has five miracles.

“That’s got to be, wow, Christmas in April. This is so great. The technology has taken us a very long time to invent. I am so fricking excited about it… This is a beast of a machine, the densest computer ever made.”

Huang warned that if AI engineers are not able to create new algorithms that can take advantage of the chip, then Nvidia would have just made the “world’s most expensive brick.”

Nvidia will be betting, however, that this isn’t the case. The Tesla P100 is already in volume production, and most of the world’s major technology companies, including Facebook, Microsoft, Google and Chinese web giant Baidu, are investing heavily in artificial intelligence research.

Xuedong Huang, chief speech scientist at Microsoft Research, said that the Tesla P100 would enable the company’s researchers to “accelerate AI breakthroughs,” while Baidu’s chief scientist Andrew Ng said the new Pascal architecture was like nothing he had seen before.

“AI computers are like space rockets: The bigger the better,” Ng said in a statement. “Pascal’s throughput and interconnect will make the biggest rocket we’ve ever seen.”