System Bits: Dec. 5

Building deep learning hardware
A new course at MIT brings together electrical engineering and computer science to educate students in the highly sought-after field of deep learning. Vivienne Sze, an associate professor of electrical engineering and computer science at MIT, teaches the course, Hardware Architecture for Deep Learning, to only 25 students for now, compared to the bursting lecture halls characteristic of other MIT classes focused on machine learning and artificial intelligence. This course is different in that it has a long list of prerequisites and a heavy base of assumed knowledge. Students jump into deep water quickly, blazing through algorithmic design in a few weeks, covering the terrain of computer hardware design in a similar period, and then getting down to the real work of making the two fields work together.

The goal of the class is to teach students the interplay between two traditionally separate disciplines, Sze said. “How can you write algorithms that map well onto hardware so they can run faster? And how can you design hardware to better support the algorithm? It’s one thing to design algorithms, but to deploy them in the real world you have to consider speed and energy consumption.”
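The trade-off Sze describes can be illustrated with a toy calculation (a hypothetical sketch, not course material): the number of multiply-accumulate (MAC) operations a layer requires is a rough proxy for the time and energy the hardware must spend on it, so an algorithmic choice such as pruning weights directly changes the hardware workload.

```python
# Toy illustration of the algorithm/hardware interplay: counting
# multiply-accumulate (MAC) operations, a rough proxy for the energy
# and latency a chip spends on a neural-network layer.

def dense_macs(inputs: int, outputs: int) -> int:
    """MACs for a fully connected layer: every input feeds every output."""
    return inputs * outputs

def pruned_macs(inputs: int, outputs: int, sparsity: float) -> int:
    """MACs when a fraction `sparsity` of the weights are pruned to zero
    and the hardware is able to skip the zeroed work."""
    return int(inputs * outputs * (1.0 - sparsity))

full = dense_macs(1024, 1024)
pruned = pruned_macs(1024, 1024, 0.9)
print(full, pruned)  # 1048576 104857: roughly 10x fewer operations
```

The point of the sketch is only that the saving materializes in practice when the hardware can actually exploit the sparsity, which is exactly the co-design question the course addresses.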

Joel Emer, a professor of the practice in MIT’s Department of Electrical Engineering and Computer Science and a senior distinguished research scientist at the chipmaker Nvidia, co-teaches the course with Sze.

The two have also written a journal article that provides a comprehensive tutorial on, and survey of, recent advances toward efficient processing of deep neural networks. The paper serves as the main reference for the course.

In 2016, their group unveiled a new, energy-efficient computer chip optimized for neural networks, which could enable powerful artificial-intelligence systems to run locally on mobile devices. The groundbreaking chip, called “Eyeriss,” could also help usher in the internet of things.

Deep learning is a new name for an approach to artificial intelligence called neural networks, a means of doing machine learning in which a computer learns to perform a task by analyzing training examples, the researchers noted. Popular applications of deep learning are now everywhere, they said. The technique drives image recognition, self-driving cars, medical image analysis, surveillance and transportation systems, and language translation, among others.
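The phrase "learns by analyzing training examples" can be made concrete with a minimal sketch (an illustrative example, not drawn from the researchers' work): a single artificial neuron, the perceptron, adjusts its weights to reduce errors on labeled examples until it computes the logical AND function. Deep learning stacks many layers of such units, but the underlying principle is the same.

```python
# Minimal sketch of learning from training examples: a single
# artificial neuron (perceptron) learns the logical AND function
# by repeatedly nudging its weights toward the correct answers.

def train_perceptron(examples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            # Perceptron update rule: shift weights toward the target.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Labeled training examples for AND: only (1, 1) maps to 1.
and_examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_examples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in and_examples])  # [0, 0, 0, 1]
```

Nothing in the code is told what AND means; the behavior emerges entirely from the examples, which is the essence of the approach the article describes.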

Interestingly, Emer said the value of the hardware at the heart of deep learning is often overlooked. Practical and efficient neural networks, which computer scientists have researched off and on for 60 years, were infeasible without hardware to support deep learning algorithms. “Many AI accomplishments were made possible because of advances in hardware. Hardware is the foundation of everything you can do in software,” he said.

And because deep learning techniques are evolving rapidly, there is a pressing need for this sort of hardware. Some of the students coming out of the class might be able to contribute to that hardware revolution, Emer suggested.

AI index
A Stanford University-led team has launched the first index to track the state of artificial intelligence and measure technological progress, much as GDP and the S&P 500 index take the pulse of the U.S. economy and stock market.

Since the term “artificial intelligence” was first used in print in 1956, the one-time science fiction fantasy has progressed to the very real prospect of driverless cars, smartphones that recognize complex spoken commands, and computers that see, the researchers noted.

In an effort to track the progress of this emerging field, a Stanford-led group of leading AI thinkers called the AI100 built the index to provide a comprehensive baseline on the state of artificial intelligence.

A Stanford-led AI index reveals a dramatic increase in AI startups and investment as well as significant improvements in the technology’s ability to mimic human performance. Source: Stanford University

“The AI100 effort realized that in order to supplement its regular review of AI, a more continuous set of collected metrics would be incredibly useful,” said Russ Altman, a professor of bioengineering and the faculty director of AI100. “We were very happy to seed the AI Index, which will inform the AI100 as we move forward.”

Plastic + metal-organic frameworks = sensing, storage
A marriage between 3D printer plastic and a versatile material for detecting and storing gases could lead to inexpensive sensors and fuel cell batteries alike, according to National Institute of Standards and Technology (NIST) researchers.

The material in question, the metal-organic framework (MOF), is perhaps not as familiar a substance as plastic, but it may prove as broadly useful. MOFs are easy to make, cost little, and some of them are good at picking out a particular gas from the air.

MOFs have caught the attention of a team of scientists from NIST and American University because they might be good as the basis for inexpensive sensing technology. For example, certain MOFs are good at filtering out methane or carbon dioxide, both of which are greenhouse gases. The big problem is that newly made MOFs are tiny particles that in bulk have the consistency of dust. And it’s hard to build a usable sensor from a material that slips through your fingers.

To address this problem, the team decided to try mixing MOFs into the plastic used in 3D printers. Not only could the printer mold the plastic into any shape the team desired, but the plastic itself is permeable enough that gases pass right through it, allowing the embedded MOFs to snag the specific gas molecules the team wants to detect. The question was, would the MOFs work in the mix?

The team’s new research paper shows the idea has promise not only for sensing but for other applications as well. It demonstrates that the MOFs and the plastic get along well; for example, the MOFs don’t settle to the bottom of the plastic when it’s melted, but stay evenly distributed in the mixture. The team then moved on to mix in a specific MOF that’s good at capturing hydrogen gas and conducted testing to see how well the solidified mixture could store hydrogen.

And because the auto industry is still looking for an inexpensive, lightweight way to store fuel in hydrogen-powered cars, the researchers are hoping that MOFs in plastic might form the basis of the fuel tank.