High Performance Computing in Artificial Intelligence Applications with Paul Martino from Bullpen Capital

Raghav serves as Content Lead at Emerj, covering our major industry areas and conducting research. Raghav has a personal interest in robotics, and previously worked for research firms like Frost & Sullivan and Infiniti Research.

Brief recognition: Paul earned a bachelor’s degree in mathematics and computer science from Lehigh University and went on to gain a master’s degree in computer science from Princeton University. He was the CTO and founder of Tribe and the CEO and founder of Aggregate Knowledge (acquired by Neustar). Paul founded Bullpen Capital eight years ago.

Big Idea

Paul points out that over the last two years, Moore’s law for graphics processing units (GPUs – often used in machine learning applications requiring parallel computing power) has been advancing faster than for regular central processing units (CPUs – the compute resources in most everyday PCs or Macs).

Paul points out that the rise of machine learning is likely to create much greater demand for high-performance computing resources across a variety of industries. Some of these data-rich, machine learning-compatible industries include:

Oil and gas exploration and discovery

Targeted advertising

Drug discovery

Paul suggests that many applications in these industries form the bridge between AI and high-performance computing (HPC). Without changes to the underlying hardware in these applications, he believes it may not be possible to implement more complex AI algorithms in the industries that need them most.

The problem has to do with more than just GPU hardware:

“Today, it is often the case that businesses are not bound by the compute resources of their GPUs but rather by the I/O backchannel needed to get the datasets from place to place (the data and the GPU need to be in proximity for the computing to be possible at scale) … Moving the data may be a bigger problem than performing the computation on it.

Using a cloud data service seems like an obvious solution, but what most executives might not realize is that the I/O access time to the cloud is too slow for the AI algorithms to function, and data is distributed across multiple datacenters with even more latency.”

Paul adds that one of the emerging AI applications in this field today involves predicting which data a company might need to access from the cloud in advance, and ‘pre-caching’ that data to make sure it is delivered to the right GPU at the right time.
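The ‘pre-cache’ idea above can be illustrated with a toy sketch. Everything in this example is hypothetical and not from the episode: the `PredictivePrefetcher` class, its simple first-order frequency heuristic, and the `fetch_fn` callback are illustrative stand-ins for whatever prediction model and cloud storage API a real system would use.

```python
import collections

class PredictivePrefetcher:
    """Toy sketch of 'pre-caching': predict the next dataset chunk a
    GPU job will need and fetch it from remote storage ahead of time,
    so compute is not stalled waiting on slow I/O."""

    def __init__(self, fetch_fn):
        self.fetch_fn = fetch_fn  # slow remote read, e.g. a cloud storage call
        self.cache = {}           # chunk_id -> data already staged locally
        # transitions[a][b] counts how often chunk b was requested right after a
        self.transitions = collections.defaultdict(collections.Counter)
        self.last = None

    def _record(self, chunk_id):
        if self.last is not None:
            self.transitions[self.last][chunk_id] += 1
        self.last = chunk_id

    def predict_next(self):
        # First-order heuristic: the chunk that most often followed the
        # current one in past access history.
        followers = self.transitions.get(self.last)
        if followers:
            return followers.most_common(1)[0][0]
        return None

    def get(self, chunk_id):
        # Serve from the local cache when the prediction was right;
        # otherwise pay the slow remote fetch.
        data = self.cache.pop(chunk_id, None)
        if data is None:
            data = self.fetch_fn(chunk_id)
        self._record(chunk_id)
        # Stage the predicted next chunk while the GPU crunches this one.
        nxt = self.predict_next()
        if nxt is not None and nxt not in self.cache:
            self.cache[nxt] = self.fetch_fn(nxt)
        return data
```

Note that this sketch fetches synchronously to stay short; in a real system the staging fetch would run asynchronously (on a background thread or stream) so the data transfer overlaps with GPU computation. The point is not to reduce total transfers but to hide I/O latency behind compute.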

For now, the mission-critical applications where HPC may find immediate use seem to be in those sectors that are large enough to invest in such cutting-edge technology. Paul believes that we are starting to witness a ‘consumerization of AI’, and he expects to see more and more AI products and services catering to individual customers in the future.

He goes on to say that small and medium enterprises can get a sense of what might be coming by looking at the system upgrades being undertaken by established firms on the bleeding edge. Watching what the larger firms are being forced to do compute-wise, he suggests, may serve as a ‘crystal ball’ for what might trickle down to smaller companies in the next two to five years.

Interview Highlights with Paul Martino from Bullpen Capital

The main questions Paul answered on this topic are listed below. Listeners can use the embedded podcast player (at the top of this post) to jump ahead to sections they might be interested in:

(1:50) What has actually changed in high-performance computing to enable AI capabilities?

(4:01) Where will high-performance computing matter? Specifically, in which industries will it be mission-critical to upgrade computing power?

(12:30) What can businesses that need HPC today do to ensure a successful integration?

(15:25) What can smaller businesses expect in terms of HPC in the near future?

