How Do I Import Kaggle Datasets Into Google Colab?

Kaggle is used by almost every data science aspirant. It contains datasets for all domains, and you can find one for just about every use case, including entertainment, medicine, e-commerce, and even astronomy. Its users can practise on a wide variety of datasets to sharpen their Data Science and Machine Learning skills.
In this article, I’ll show you how to import datasets from Kaggle directly into Google Colab notebooks.

Step 1: Select any dataset from Kaggle

The first and foremost step is to choose your dataset from Kaggle. You can select datasets from competitions too. For this article, I am choosing two datasets: one random dataset and one from an active competition.

Step 2: Download API Credentials

To download data from Kaggle, you need to authenticate with the Kaggle services. For this purpose, you need an API token. This token can be easily generated from the profile section of your Kaggle account. Simply navigate to your Kaggle profile and then:

Click the Account tab and then scroll down to the API section (Screenshot from Kaggle profile)

Click “Create New API Token”, and a file named “kaggle.json” will be downloaded. It contains your username and an API key.
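For reference, the file is a small JSON document with just two fields; the values below are placeholders, not real credentials:

{"username": "your-kaggle-username", "key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"}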

This is a one-time step, and you don’t need to generate the credentials every time you download a dataset.

Step 3: Set Up the Colab Notebook

Fire up a Google Colab notebook and connect it to the cloud instance (basically start the notebook interface). Then, upload the “kaggle.json” file that you just downloaded from Kaggle.
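You can upload the file through the file browser in the left sidebar, or, as a minimal sketch, with Colab’s built-in files helper:

from google.colab import files

# Opens a file picker in the notebook; select the kaggle.json you just downloaded
files.upload()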

Now you are all set to run the commands needed to load the dataset. Follow along with these commands:

Note: All the installation and shell commands here start with “!”. Because Colab instances are Linux-based, you can run standard Linux commands directly in the code cells.

1. Install the Kaggle library

! pip install kaggle

2. Make a directory named “.kaggle”

! mkdir ~/.kaggle

3. Copy the “kaggle.json” into this new directory

! cp kaggle.json ~/.kaggle/

4. Restrict the file’s permissions so that only the owner can read and write it (the Kaggle CLI warns if the key is readable by other users):

! chmod 600 ~/.kaggle/kaggle.json

The Colab notebook is now ready to download datasets from Kaggle.

All the commands needed to set up the Colab notebook (Notebook screenshot)

Step 4: Download datasets

Kaggle hosts two types of datasets: Competitions and Datasets. The procedure to download either type is the same, apart from minor changes.

Downloading Competitions dataset:

! kaggle competitions download <name-of-competition>

Here, the name of the competition is not the bold title displayed on the competition page. It is the slug of the competition URL, the part that follows “/c/”. Consider our example link:

“https://www.kaggle.com/c/google-smartphone-decimeter-challenge”

“google-smartphone-decimeter-challenge” is the name of the competition to be passed to the Kaggle command. This will start downloading the data into the instance’s allocated storage.
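For the example above, the full command would be:

! kaggle competitions download google-smartphone-decimeter-challenge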

The output of the command (Notebook screenshot)

Downloading Datasets:

These datasets are not part of any competition. You can download these datasets by:

! kaggle datasets download <name-of-dataset>

Here, the name of the dataset takes the form “user-name/dataset-name”. You can simply copy the trailing text after “www.kaggle.com/”. Therefore, in our case,

“https://www.kaggle.com/arenagrenade/the-complete-pokemon-images-data-set”

It will be: “arenagrenade/the-complete-pokemon-images-data-set”
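So the full command for this dataset becomes:

! kaggle datasets download arenagrenade/the-complete-pokemon-images-data-set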

If the dataset arrives as a zip archive, you can simply use the Linux unzip command to extract the data:

! unzip <name-of-file>
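For example, assuming the Pokémon dataset above downloads as an archive named after its slug (the exact file name may differ), the extraction would look like:

! unzip the-complete-pokemon-images-data-set.zip -d pokemon-images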

Bonus Tips

Tip 1: Download Specific Files

You just saw how to download datasets from Kaggle in Google Colab. If you are only interested in a specific file, you can use the “-f” flag followed by the name of the file, and only that file will be downloaded. The “-f” flag works for both the competitions and datasets commands.

Example:

! kaggle competitions download google-smartphone-decimeter-challenge -f baseline_locations_train.csv

The output of the command (Notebook screenshot)

You can check out the official Kaggle API documentation for more features and commands.

Tip 2: Load Kaggle Credentials from Google Drive

In Step 3, you uploaded “kaggle.json” while the notebook was running. Files uploaded to the notebook’s temporary storage are not retained after the notebook terminates.

This means you need to upload the JSON file every time the notebook is reloaded or restarted. To avoid this manual work:

1. Simply upload the “kaggle.json” to your Google Drive. For simplicity, upload it to the root folder rather than inside any folder structure.

2. Next, mount the drive to your notebook:

Steps to Mount Google Drive
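A minimal sketch of the mounting step, using Colab’s standard drive helper (the mount point matches the path used in the copy command below):

from google.colab import drive

# Prompts for authorization and mounts your Drive under /content/drive
drive.mount('/content/drive')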

3. The initial commands for installing the Kaggle library and creating a directory named “.kaggle” remain the same:

! pip install kaggle
! mkdir ~/.kaggle

4. Now, you need to copy the “kaggle.json” file from the mounted Google Drive to the current instance storage. The Drive is mounted under the “/content/drive/MyDrive” path. Just run the Linux copy command:

!cp /content/drive/MyDrive/kaggle.json ~/.kaggle/kaggle.json

Now you can use the Kaggle competitions and datasets commands to download datasets as usual. This method has the added advantage of not having to upload the credential file on every notebook re-run.

Downloading dataset after configuring API key
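Putting it all together, a single setup cell for this Drive-based workflow might look like the sketch below; the kaggle.json location and the example competition are taken from the steps above:

from google.colab import drive

# Mount Google Drive (prompts for authorization on first run)
drive.mount('/content/drive')

# Install the Kaggle CLI and copy the stored credentials into place
! pip install kaggle
! mkdir -p ~/.kaggle
! cp /content/drive/MyDrive/kaggle.json ~/.kaggle/kaggle.json
! chmod 600 ~/.kaggle/kaggle.json

# Download the example competition data used earlier in this article
! kaggle competitions download google-smartphone-decimeter-challenge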

Benefits of Using Google Colab

Google Colab is an excellent tool for putting data science ideas to the test. Free GPU support is one of its main advantages: aspiring data scientists are often short on computing resources, and Google Colab solves that hardware problem. Because Colab notebooks run on Linux instances, you can easily run all of the standard Linux commands and interact with the underlying system.
For practice datasets, the RAM and disk allocation are sufficient, but if your work demands additional compute power, you can upgrade to Colab Pro.
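For instance, after switching the runtime to a GPU (Runtime > Change runtime type), you can check which accelerator you have been allocated with a standard Linux command:

! nvidia-smi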
