Modeling: Modern films and TV shows are filled with spectacular computer-generated sequences computed by rendering systems that simulate the flow of light in a three-dimensional scene and convert the information into a two-dimensional image. But computing the thousands of light rays (per frame) needed to achieve accurate color, shadows, reflectivity and other light-based characteristics is a labor-intensive, time-consuming and expensive undertaking. An alternative is to render the images using only a few light rays. That saves time and labor but results in inaccuracies that show up as objectionable "noise" in the final image. UC Santa Barbara electrical and computer engineering Ph.D. student Steve Bako and his advisor, Pradeep Sen, are closing in on a solution. Over the past couple of years, the two have worked with researchers at Disney Research and Pixar Animation Studios to develop a new technology based on artificial intelligence (AI) and deep learning to eliminate that noise and enable … [Read more...] about Bringing Deep Learning to Big Screen Animation

Data Analysis: Scientists will use ORNL's computing resources, such as the Titan supercomputer, to develop deep learning solutions for data analysis. Credit: Jason Richards/Oak Ridge National Laboratory, U.S. Dept. of Energy. A team of researchers from Oak Ridge National Laboratory has been awarded nearly $2 million over three years from the Department of Energy to explore the potential of machine learning in revolutionizing scientific data analysis. The Advances in Machine Learning to Improve Scientific Discovery at Exascale and Beyond (ASCEND) project aims to use deep learning to help researchers make sense of the massive datasets produced at the world's most sophisticated scientific facilities. Deep learning is an area of machine learning that uses artificial neural networks to enable self-learning devices and platforms. The team, led by ORNL's Thomas Potok, includes Robert Patton, Chris Symons, Steven Young and Catherine Schuman. While deep learning has long been used to … [Read more...] about ORNL Researchers Turn to Deep Learning to Solve Science’s Big Data Problem

Data Analysis: These networks are called deep-learning networks because they interpose many neuronal processing layers between the input data and the output results calculated by the neural network, hence the use of the word deep in the deep-learning catchphrase. The resulting trained networks can be extremely valuable, as they can perform complex, real-world pattern recognition tasks very quickly on a variety of low-power devices, including sensors, mobile phones, and FPGAs, as well as quickly and economically in the data center. Generic applicability, high accuracy (sometimes better than human), and the ability to be deployed nearly everywhere explain why scientists, technologists, entrepreneurs and companies are all scrambling to take advantage of deep-learning technology. Machine learning went through a similar bandwagon stage in the 1980s, when superlatives were heaped on the technology and futurists discussed how machine learning was going to change the world. The genesis of the … [Read more...] about Deep Learning Revitalizes Neural Networks to Match or Beat Humans on Complex Tasks
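The layered structure described above can be sketched in a few lines: several weight matrices sit between the input and the output, each followed by a nonlinearity. The layer sizes, activation, and random weights below are illustrative assumptions for demonstration, not details from the article.

```python
import numpy as np

def relu(x):
    # Nonlinearity applied between layers
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# Three weight matrices interpose two hidden "neuronal processing layers"
# between the input and the output -- the stacking is what makes it "deep".
W1 = rng.standard_normal((4, 8))   # input (4 features) -> hidden layer 1
W2 = rng.standard_normal((8, 8))   # hidden layer 1 -> hidden layer 2
W3 = rng.standard_normal((8, 3))   # hidden layer 2 -> output (3 values)

def forward(x):
    h1 = relu(x @ W1)
    h2 = relu(h1 @ W2)
    return h2 @ W3

x = rng.standard_normal((1, 4))    # one input example
print(forward(x).shape)            # (1, 3)
```

The same forward pass runs unchanged on anything that can do small matrix multiplies, which is why trained networks deploy so widely, from data-center servers down to phones and FPGAs.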

"Deep Learning" computer systems, based on artificial neural networks that mimic the way the brain learns from an accumulation of examples, have become a hot topic in computer science. In addition to enabling technologies such as face- and voice-recognition software, these systems could scour vast amounts of medical data to find patterns that could be useful diagnostically, or scan chemical formulas for possible new pharmaceuticals. But the computations these systems must carry out are highly complex and demanding, even for the most powerful computers. Now, a team of researchers at MIT and elsewhere has developed a new approach to such computations, using light instead of electricity, which they say could vastly improve the speed and efficiency of certain deep learning computations. Their results appear today in the journal Nature Photonics in a paper by MIT postdoc Yichen Shen, graduate student Nicholas Harris, professors Marin Soljacic and Dirk Englund, and eight … [Read more...] about Learning With Light: New System Allows Optical ‘Deep Learning’

Computers can now beat humans at chess and Go, but it may be a while before people trust them to drive. The danger of self-driving cars was highlighted last year when Tesla's autonomous car collided with a truck it mistook for a cloud, killing its passenger. Self-driving cars depend on a form of machine learning called deep learning. Modeled after the human brain, layers of artificial neurons process and consolidate information, developing a set of rules to solve complex problems, from recognizing friends' faces online to translating email written in Chinese. The technology has achieved impressive feats of intelligence, but as more tasks become automated this way, concerns about safety, security, and ethics are growing. Deep learning systems do not explain how they make their decisions, and that makes them hard to trust. In a new approach to the problem, researchers at Columbia and Lehigh universities have come up with a way to automatically error-check the thousands to millions of … [Read more...] about Researchers Unveil Tool to Debug ‘Black Box’ Deep Learning Algorithms

A team of researchers from the University of California, Berkeley, the University of California, Davis and the Texas Advanced Computing Center (TACC) published the results of an effort to harness the power of supercomputers to train a deep neural network (DNN) for image recognition at rapid speed. The researchers efficiently used 1,024 Skylake processors on the Stampede2 supercomputer at TACC to complete a 100-epoch ImageNet training with AlexNet in 11 minutes, the fastest time recorded to date. Using 1,600 Skylake processors, they also bested Facebook's prior results by finishing a 90-epoch ImageNet training with ResNet-50 in 32 minutes and, for batch sizes above 20,000, their accuracy was much higher than Facebook's. (In recent years, the ImageNet benchmark, a visual database designed for use in image recognition research, has played a significant role in assessing different approaches to DNN training.) Using 512 Intel Xeon Phi chips on Stampede2, they finished the 100-epoch … [Read more...] about Supercomputing Speeds Up Deep Learning Training
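Training across hundreds of processors as described above typically uses data-parallel synchronous SGD: each worker computes gradients on its shard of a large global batch, the gradients are averaged (an all-reduce), and every worker applies the same update. A minimal single-process simulation of that pattern, using a toy linear model and a made-up worker count rather than anything from the actual Stampede2 runs:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear model y = w_true * x, fit with data-parallel SGD.
w_true = 3.0
w = 0.0
n_workers = 4          # stands in for the many Skylake nodes
lr = 0.1

for step in range(200):
    grads = []
    for _ in range(n_workers):
        # Each "worker" draws its own shard of the global batch...
        x = rng.standard_normal(256)
        y = w_true * x
        # ...and computes a local gradient of the squared error.
        grads.append(np.mean(2 * (w * x - y) * x))
    # All-reduce step: average the gradients, then apply one shared update.
    w -= lr * np.mean(grads)

print(round(w, 3))  # converges toward 3.0
```

Because the effective batch is the sum of all worker shards, adding workers grows the global batch size, which is why large-batch accuracy (the 20,000+ batch sizes mentioned above) becomes the central research question at this scale.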

Movidius on Wednesday announced that it's working with Google to put deep learning on mobile devices. Google will source Movidius' latest flagship chip, the MA2450, and software development environment, and will contribute to Movidius' neural network technology road map in return. That could result in smartphones and other mobile devices able to understand images and audio swiftly and accurately. The MA2450 is the most powerful iteration of Movidius' Myriad 2 vision processing unit, which the company said is the only commercial solution available to perform complex neural network computations. The Myriad 2 is the first always-on vision processor, Movidius said. It has a programmable architecture and comes with the Myriad Development Kit, or MDK, which includes a software development framework. That lets developers incorporate proprietary functions and build arbitrary processing pipelines while leveraging the vision, imaging, and linear algebra software libraries and … [Read more...] about Google, Movidius to Bring Deep Learning to Mobile Devices

We often talk about hybrid cloud business models, but virtually always in the context of traditional processor-bound applications. What if deep learning developers and service operators could run their GPU-accelerated model training or inference delivery service anywhere they wanted? What if they could do so without having to worry about which Nvidia graphics processing unit they were using, whether their software development environment was complete, or whether their development environment had all the latest updates? To make that happen, Nvidia's GPU Cloud, aka "NGC," pre-integrates all the software pieces into a modern container architecture and certifies a specific configuration for Amazon Web Services. Nvidia will certify configurations for other public clouds in the future. Nvidia built NGC to address several deep learning challenges. Foremost is that deep learning developers have been having trouble staying current with the latest frameworks and optimizing their frameworks for the … [Read more...] about ANALYSIS Nvidia Containerizes GPU-Accelerated Deep Learning

Nvidia earlier this month launched a massive new push for intelligent machines, including what is likely the most expensive volume workstation in the world designed for this purpose. IBM, which has a tight relationship with Nvidia, launched a quantum computing processor that has a good chance of massively increasing the speed and intelligence of thinking systems. IBM also has been the most aggressive in promoting the concept that systems such as these could have a dramatic effect on the performance of the people who use them. I think that after last week, regardless of your personal political preferences, you likely wish a lot of folks in Washington were wired to these machines, because it feels like the country is being run by partisan idiots at the moment. A deep learning system could have reversed the election results. It's too late for that, but it still could turn Trump from a train wreck into the best president the country has ever had. Furthermore, it actually could provide a … [Read more...] about OPINION How Deep Learning Could Fix Trump and Healthcare