So You Think You Need A GPU? Google Colab Can Help!

I hope everyone who got a chance to go to SIOP last week enjoyed it. I liked all the discussion around AI in our field, but I'd like people to move beyond using it as a buzzword and start explaining what they mean by AI when it comes to their tools. I don't think asking them to say they're using an LSTM or random forests is giving away too much of their "secret sauce". One question or comment I got a few times went along the lines of "...one of our biggest issues was hardware... we just didn't have the compute power necessary to do a lot of the calculations, so it took a long time," or, from the Data Science panel, "...Don't I need an expensive computer to start doing REAL data science work?" So I thought I'd write a few articles showing how to get access to great compute hardware for free!

GPUs

First of all, look at them; they are a thing of beauty! If processing power is your bottleneck, a bad boy like this will solve your problems. In this article I'll walk you through how to get free access to this bad boy or one just like it. Then, in a follow-up article, I'll walk through when a GPU can help and when it might not add much value.

[Image: a GPU]

Are GPUs Powerful?

The short answer is......Absolutely YES!!

Below is a comparison of GPU speedups relative to an Intel E5 CPU.

[Image: chart of GPU speedups relative to an Intel E5 CPU]

And this one shows the number of training examples per second each device can handle. Also note that the top GPU mentioned here is only about 10x faster than a CPU, and the Tesla K80 is roughly another 14x faster than that.

[Image: chart of training examples processed per second, CPU vs. GPUs]

So the bottom line is yes: they can dramatically cut compute time on very large datasets.

So....

If your immediate thought is, "I don't have a gaming computer, and I don't want to pay to spin up an AWS or Azure instance, so I guess this isn't available to me"... right?

The short answer is no. Both Google and Kaggle are very generous and provide free GPU instances on their platforms. In this article I'll walk you through how to get access to Google's free offering and run some quick code on Colab; in a follow-up I'll show you how to do the same in Kaggle Kernels.

Google Colab

Google has an app in Drive called Google Colaboratory. It's designed to be a collaborative hub where you can share code and work on notebooks much the same way you would share Slides or Docs.

Step 1: Go to Google Drive, click "New", then "More", like this:

[Screenshot: the Google Drive "New" > "More" menu]

Step 2: Name Your Notebook. I named mine "GPU_in_Colab"

Step 3: Let's find out what version of Python it's running.

[Screenshot: checking the Python version in a Colab cell]
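If you'd rather type it than read it off the screenshot, a minimal version of that check looks like this (the exact version you see depends on the Colab image at the time you run it):

```python
# Print the version of Python the Colab runtime is using
import sys

print(sys.version)
```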

Step 4: Install a few packages that let us test out the GPU.

fastai automatically installs PyTorch, Pandas, Matplotlib, etc.

Quick note: if you prefix a command with "!" in a notebook cell, it is executed as a terminal (shell) command rather than as Python.

[Screenshot: installing the packages in a Colab cell]
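If you want to reproduce that cell yourself, the install is essentially a one-liner; note that exactly which dependencies fastai pulls in may have changed since this was written:

```python
# "!" runs this line as a shell command inside the notebook.
# Installing fastai pulls in PyTorch, pandas, matplotlib, etc. as dependencies.
!pip install fastai
```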

Step 5: Let's look at the device name that we have available to us on Colab.

[Screenshot: querying the name of the available CUDA device]
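The cell in the screenshot is essentially asking PyTorch which CUDA device it can see. A minimal version looks like this (and, as we're about to find out, it fails on a CPU-only runtime):

```python
import torch

# Ask PyTorch for the name of the first CUDA device.
# On a CPU-only runtime this raises an error because there is
# no CUDA-capable device attached.
print(torch.cuda.get_device_name(0))
```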

....Oops, we got a runtime error saying we don't have a CUDA-capable device. So... did I lie to you? Does Colab not give you access to a GPU?

It does, but you need to turn it on. How? Simply go to the Runtime dropdown and click "Change runtime type".

Step 6: Turn on the GPU

[Screenshot: the Runtime menu with "Change runtime type" highlighted]

You will then see the notebook settings dialog with a hardware accelerator selector. It defaults to None, but GPU and TPU options are available.

So, let's select GPU and hit Save.

[Screenshot: the hardware accelerator dropdown set to GPU]

Now, let's go try that same code again.

[Screenshot: the same device query now returning "Tesla K80"]

Now we have access to a Tesla K80, which costs about $2k alone.
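If you want more detail than just the device name, the nvidia-smi utility is available on the GPU runtime and reports the model, driver version, memory, and current utilization:

```python
# Show the attached GPU's model, driver, memory, and utilization
!nvidia-smi
```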

So, let's do some quick stuff to show it in action.

First, let's run one epoch on the CPU to see how long it takes, then compare that to one epoch on the GPU.

For the purposes of this comparison I'm going to use a notebook that walks you through state-of-the-art image recognition on the dogs vs. cats dataset. You can find the full notebook here.
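The notebook itself handles the training; if you just want to reproduce the timing part, wrapping the one-epoch training call with a timer is enough. A minimal sketch, where `train_one_epoch()` is a hypothetical stand-in for whatever your notebook's one-epoch call is (e.g. fastai's `fit_one_cycle(1)`):

```python
import time

def train_one_epoch():
    # Hypothetical stand-in: replace with your notebook's actual
    # one-epoch training call (e.g. learn.fit_one_cycle(1) in fastai).
    pass

start = time.perf_counter()
train_one_epoch()
elapsed = time.perf_counter() - start
print(f"One epoch took {elapsed / 60:.2f} minutes")
```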

Edited to add: shortly after publishing this blog post, Google again upgraded their GPUs to the Tesla T4 (see image below). I also provide an example of one epoch of training using the T4 below, which speeds up training by an additional 20%.

[Screenshot: Colab now reporting a Tesla T4]

Why Do GPUs Work So Well in Deep Learning?

Before we get into the comparison of a CPU vs. a GPU on an image classification problem, I think it's important to highlight why GPUs work so well. Essentially, GPUs are very good at parallel computation, which is exactly what matrix algebra needs. So anywhere you are doing matrix multiplication and/or calculating derivatives, a GPU will be extremely helpful.
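You can see this for yourself with nothing more than a big matrix multiplication. Here's a minimal sketch (the matrix size is arbitrary, and `torch.cuda.synchronize()` is needed because GPU kernels run asynchronously):

```python
import time
import torch

n = 4000
a = torch.randn(n, n)
b = torch.randn(n, n)

# Matrix multiply on the CPU
start = time.perf_counter()
c_cpu = a @ b
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    # Same multiply on the GPU
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()   # make sure the copies have finished
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()   # wait for the kernel to finish
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f}s  GPU: {gpu_time:.3f}s  "
          f"speedup: {cpu_time / gpu_time:.0f}x")
else:
    print(f"CPU: {cpu_time:.3f}s (no GPU attached)")
```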

Now, back to the example:

Here is a sample of the images:

[Image: sample images from the dogs vs. cats dataset]

Step 7: Run A CNN

In this example we will use the ResNet architecture, which was developed to handle many of the difficulties of training extremely deep neural networks. One of the biggest issues is the vanishing gradient problem, which ResNet addresses with residual (skip) connections.

This specific implementation is ResNet-34, which contains 34 layers. You can read more about ResNet here if you are interested.
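The notebook builds its model through fastai, but if you just want to poke at the architecture itself, torchvision ships a ResNet-34 you can create and move onto the GPU. A quick sketch (randomly initialized here, whereas the notebook starts from ImageNet-pretrained weights):

```python
import torch
from torchvision import models

# Build a ResNet-34 (random weights; the notebook uses pretrained ones via fastai)
model = models.resnet34()

# Count trainable parameters
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"ResNet-34 has {n_params / 1e6:.1f}M trainable parameters")

# Move the model to the GPU if one is attached to the runtime
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
print(f"Model is on: {device}")
```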

CPU

Here are the results of 1 epoch, or pass through the data, using the CPU from Google Colab.

[Screenshot: results of one training epoch on the Colab CPU]

It took just under 40 minutes to complete. As you can see from the training vs. validation loss, we still need a few more epochs; the common rule of thumb is that you want the training loss to end up slightly lower than the validation loss.

GPU

[Screenshot: results of one training epoch on the K80 GPU]

It took just under 2 minutes to do the exact same calculation on the exact same dataset using the K80 GPU.

GPU using the Tesla T4

[Screenshot: results of one training epoch on the Tesla T4 GPU]

Takeaway

GPUs can increase your compute efficiency many times over, and Google Colab is one easy (and free) way to get your hands on one. Hopefully this article helps you set up your own Colab environment and perhaps replicate the cats vs. dogs notebook I touched on here today.
