GPU for AI Explained

Shimon Ben David. February 12, 2021

This post provides an introduction to AI that will help you understand what AI is, how to architect your infrastructure for AI, and the benefits of introducing Graphics Processing Units (GPUs) to support your AI pipeline. In this blog post you will learn about:

Overview

  • What Is AI?
  • What Are the Different Types of AI?
  • Introduction to GPUs in AI and Machine Learning
  • Why Include GPUs for AI & Deep Learning?
  • Factors to Consider When Designing ML Architectures
  • GPU in AI & Machine Learning Use Cases

What Is AI?

Artificial intelligence (AI) is the buzz in most organizations today. Many organizations are introducing new products and use cases based on their desires to implement AI, but what does AI really mean?
To put it simply, artificial intelligence is the quest to emulate the human brain with a computer: to simulate its neurons and their connections in a way that resembles human intelligence, and then put that simulation to use to achieve a specific result.
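To make the "simulate the neurons" idea concrete, here is a minimal sketch of a single artificial neuron: a weighted sum of inputs passed through an activation function. The weights, inputs, and threshold here are illustrative values chosen for this sketch, not taken from any real model.

```python
def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, plus a bias term
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    # Step activation: the neuron "fires" (returns 1) if the total crosses zero
    return 1 if total > 0 else 0

# Example: two inputs with hand-picked weights
# 1.0*0.6 + 0.5*(-0.4) - 0.1 = 0.3 > 0, so the neuron fires
print(neuron([1.0, 0.5], [0.6, -0.4], -0.1))  # prints 1
```

Real neural networks connect millions of such units and learn the weights from data rather than setting them by hand, but the basic computation is this simple.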

When we think about AI we often imagine androids that can walk, talk, and think like real people. In reality, however, the practice of AI is more focused on accomplishing specific tasks, often repetitively, as is the case with Machine Learning (ML).

For example, within the realm of AI/ML, image recognition software has been developed to train neural networks to identify objects even when they “see” them for the first time, such as helping doctors distinguish benign tumors from malignant ones. Other implementations of AI include the chatbot that communicates with a person about specific support issues regarding a product, and the drone that scans agricultural fields for pests that could damage crops. In these examples, no conventional program was written to specifically instruct the chatbot or the drone what to do; instead, AI/ML models were used to “teach” the software how to identify the images and how to respond.

What Are the Different Types of AI?

Over the last few decades AI has become an umbrella term. It currently encompasses different techniques, processes, and disciplines that are too numerous to list, but here’s a start:

  • Machine Learning (ML)
  • Deep Learning (DL)
  • Computer Learning (CL)
  • Natural Language Processing (NLP)
  • Natural Language Understanding (NLU)
  • Computer Vision (CV)
  • And more

Generally, Machine Learning can be categorized into one of the following three categories:

  1. Supervised learning–The model is given labeled data and must learn to predict those labels on new inputs. For example, a model is fed multiple images of a STOP sign in different weather conditions, at different times of day, and so on, and is then expected to identify STOP signs in images it has never seen.
  2. Unsupervised learning–The model is given unlabeled data and is expected to identify patterns within that data by itself. An example might be a movie recommender system that analyzes a person’s watch list and identifies additional movies that match the viewer’s taste.
  3. Reinforcement learning–The model receives positive or negative feedback about its behavior and adapts accordingly. An example is an autonomous car that constantly gets feedback to stay on the road.
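The supervised case above can be sketched in a few lines. This toy 1-nearest-neighbor classifier (all data points and labels here are made up for illustration) is given labeled examples and labels a new point it has never seen by picking the label of the closest training example:

```python
import math

# Labeled training data: (feature vector, label) pairs (hypothetical values)
labeled_data = [
    ((1.0, 1.0), "stop_sign"),
    ((1.2, 0.9), "stop_sign"),
    ((5.0, 5.0), "yield_sign"),
    ((5.2, 4.8), "yield_sign"),
]

def predict(point):
    # Return the label of the training example nearest to `point`
    return min(labeled_data, key=lambda item: math.dist(point, item[0]))[1]

print(predict((1.1, 1.1)))   # close to the stop-sign cluster
print(predict((4.9, 5.1)))   # close to the yield-sign cluster
```

Real supervised models (neural networks, gradient-boosted trees, and so on) learn far richer decision rules, but the contract is the same: labeled examples in, predictions on unseen inputs out.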

Introduction to GPUs in AI and Machine Learning

Traditionally, GPUs (Graphics Processing Units) were used to process 3D content in gaming, offloading vector calculations from the CPU. Over time, GPUs evolved to contain many cores that are highly efficient at the parallel computation the gaming industry needed. These cores were then put to work on other activities, either for their sheer computational power (for example, bitcoin mining) or simply to solve problems faster by letting all of the GPU’s cores work on a problem in parallel. Since AI/ML operations often require processing massive amounts of images and video (and, more recently, other types of data), and GPUs proved highly suitable for that, GPUs grew popular for AI, ML, DL, and more.

Different markets are using AI/ML for different use cases, and many times a single organization will have multiple AI/ML initiatives. For example, Financial Services will use AI/ML for things such as risk assessment, investment decisions, and compliance, while Life Sciences industries will use AI/ML for augmented intelligence decisions for doctors, compliance, and more.

Why Include GPUs for AI & Deep Learning?

GPUs are not mandatory for AI/ML workloads; some customers use CPUs or other accelerators for these activities. However, GPUs are extremely efficient at these workloads and often deliver the best performance, and here is why:

  • GPUs can perform many computations simultaneously
  • GPUs are highly efficient at the vector and matrix math that dominates ML workloads
  • GPUs contain thousands of cores that can work on a problem in parallel
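The parallelism described above is data parallelism: the same small operation (a "kernel") applied independently to every element of a large dataset. A minimal sketch of the idea, in plain Python (on a GPU, libraries such as CUDA, CuPy, or PyTorch would launch the kernel across thousands of cores at once instead of iterating):

```python
def kernel(x):
    # The per-element operation: a simple multiply-add, the kind of
    # arithmetic ML workloads repeat billions of times
    return 2.0 * x + 1.0

data = list(range(8))

# Conceptually, a GPU runs `kernel` on all elements simultaneously;
# Python's map runs them one by one, but since each work item is
# independent, nothing prevents them from executing in parallel.
result = list(map(kernel, data))
print(result[:3])  # [1.0, 3.0, 5.0]
```

Because each element's computation needs no information from its neighbors, throwing more cores at the problem scales almost linearly, which is exactly why GPUs shine on these workloads.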

Factors to Consider When Designing ML Architectures

There are many questions to ask when planning an AI/ML project, including the following:

  • What questions do we want the model to answer?
  • What data is relevant in order to answer these questions?
  • Where can the data be acquired from?
  • Where will we store the data, and how will we move the data around?
  • What is the expected dataset size?
  • Will we write the model ourselves, or can we use an existing one?
  • How will we validate the model and deploy new models?
  • And more….

GPU in AI & Machine Learning Use Cases

Possibilities abound when it comes to AI/ML use cases, so here is just a sampling:

  • Compliance–Updating the compliance team on new regulations and verifying that the organization abides by these regulations
  • Cyber Security–Making sure that an organization is secure by reviewing multiple video inputs in near real time and constantly scanning for and preventing breaches
  • Fraud Detection in Financial Services–Going over massive sets of financial data to detect and expose possible fraud
  • Conversational AI–Augmenting customer service centers to communicate with customers
  • And more….

Conclusion

AI and Machine Learning are disrupting almost every industry, and if you are not working with these technologies, you are already late to the game. The topics discussed in this blog post will not only help you understand the basics of AI/ML but also get you started with your first initiatives.

If you are just getting started, here is a guide with “10 Things to Know When Starting With AI”.

Additional Resources:

Optimizing Your Infrastructure for AI
Data Management in the Age of AI
The Infrastructure Behind SIRI & Alexa

