2016-05-12

GPUs have helped researchers spark a deep-learning revolution that’s given computers superhuman capabilities.

They’ve already enabled breakthrough results on the industry-standard ImageNet benchmark. They’re powering Facebook’s “Big Sur” deep learning computing platform. They’re also accelerating major advances in deep learning across a broad range of fields.

GPUs have become the go-to technology for training deep neural networks. These systems allow computers to identify patterns and objects as well as — or in some cases, better than — humans (see “Accelerating AI with GPUs: A New Computing Model”).

Training these networks is just the start.

After training is completed, the networks are deployed into the field for “inference” — classifying data to “infer” a result. This involves running billions of computations based on the trained network to identify known patterns or objects. Think of voice-enabled internet searches and pedestrian detection in a self-driving car.
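To make that concrete, here’s a minimal, hypothetical sketch in NumPy of what inference amounts to: a forward pass through frozen, already-trained weights. The network, weights and shapes below are invented for illustration (they don’t come from any NVIDIA library); a production deployment runs billions of these same operations on a GPU.

```python
import numpy as np

# A minimal sketch of inference: a tiny two-layer network whose weights
# we pretend were learned during training. All names and shapes here are
# illustrative assumptions, not any real deployed model.

rng = np.random.default_rng(0)

# Stand-ins for weights produced by a completed training run.
W1 = rng.standard_normal((4, 8))   # input features -> hidden units
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 3))   # hidden units -> class scores
b2 = np.zeros(3)

def infer(x):
    """One forward pass: fixed arithmetic against frozen, trained
    weights. No learning happens here -- that's what inference means."""
    h = np.maximum(0.0, x @ W1 + b1)        # ReLU hidden layer
    scores = h @ W2 + b2
    exp = np.exp(scores - scores.max())     # softmax -> class probabilities
    return exp / exp.sum()

x = rng.standard_normal(4)                  # one incoming data sample
probs = infer(x)
print("predicted class:", int(np.argmax(probs)), "probabilities:", probs)
```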

Performance and Energy Efficiency

Here, too, GPUs offer big benefits.

A recent whitepaper demonstrated how NVIDIA GPUs and our Tegra SoCs deliver higher performance and energy efficiency than CPUs for image classification using the AlexNet neural network.

We’re continuing to increase the benefits of GPUs for deep learning inference.

Take our new NVIDIA GPU Inference Engine (GIE), for example. It’s a high-performance neural network inference solution for application deployment. It helps developers generate optimized implementations of trained neural network models for web, embedded and automotive applications. It also delivers the fastest inference performance on NVIDIA GPUs.
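As a rough illustration of one class of optimization an inference engine can apply (this is not GIE’s actual API or implementation, just a hedged sketch of the idea), the snippet below casts trained FP32 weights to FP16 before running the same computation, trading a little numeric precision for half the memory traffic:

```python
import numpy as np

# Illustrative only: reduced-precision arithmetic is one kind of
# deployment-time optimization an inference engine can perform.
# This sketch uses NumPy's float16 type; it is NOT the GIE API.

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256)).astype(np.float32)  # "trained" FP32 weights
x = rng.standard_normal(256).astype(np.float32)          # one input sample

# Deployment-time optimization: store and compute in half precision.
W_fp16 = W.astype(np.float16)   # halves the memory footprint
x_fp16 = x.astype(np.float16)

full = x @ W                                 # reference FP32 result
half = (x_fp16 @ W_fp16).astype(np.float32)  # reduced-precision result

# The half-precision output tracks the full-precision one closely.
print("max abs error:", np.max(np.abs(full - half)))
```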

Major Tech and Web Companies Turning to GPUs

As a result, major organizations are putting GPU-accelerated computing to work for a variety of inference tasks.

Twitter sees millions of images, videos and GIFs shared every day. Its Cortex team is using GPU-accelerated deep learning to process this flood of content in real time and help people discover the content that’s right for them.

According to Kevin Quennesson, engineering manager of Twitter Cortex, “Using NVIDIA GPUs helped us achieve savings in capital expenditures compared to using CPUs. It also enabled training of Twitter’s most advanced models in a matter of hours.”

Twitter isn’t alone.

Alibaba Group’s cloud computing business, AliCloud — China’s largest public cloud platform — is seeing significant deep learning performance gains in tests of image recognition, image classification and speech recognition using our Tesla GPUs.

JD, China’s largest and most popular direct e-commerce merchant, used GPU-accelerated deep learning to create JIMI, an online customer service robot that helps customers resolve service issues.

Visit our website if you’d like to learn more about GPU-accelerated deep learning.

Featured image by Rock1997 – Own work, GFDL, https://commons.wikimedia.org/w/index.php?curid=15642389
