
Using custom containers with AI Platform Training

Lab · 2 hours · 5 credits · Beginner

Overview

In this lab, you develop a multi-class classification model, package the model as a Docker image, and run the model on AI Platform Training as a training application. The training application trains a multi-class classification model that predicts the type of forest cover from cartographic data. The dataset used in the lab is based on the Covertype Data Set from the UCI Machine Learning Repository.

Scikit-learn is one of the most useful libraries for machine learning in Python. The training code uses Scikit-learn for data pre-processing and modeling.
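As a concrete illustration (not the lab's actual training code, which you complete in the notebook later in this lab), a Scikit-learn pipeline that combines pre-processing and a multi-class model might look like this sketch on synthetic stand-in data:

```python
# Illustrative sketch only: a Scikit-learn pipeline combining pre-processing
# and a multi-class classifier, in the spirit of the lab's training code.
# The data here is synthetic; the real lab trains on the Covertype dataset.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))      # stand-in for numeric cartographic features
y = rng.integers(0, 7, size=300)   # Covertype has 7 forest cover classes

pipeline = Pipeline([
    ("scale", StandardScaler()),               # pre-processing step
    ("clf", LogisticRegression(max_iter=500)), # multi-class model
])
pipeline.fit(X, y)
preds = pipeline.predict(X)
print(preds.shape)  # -> (300,) : one predicted class per row
```

Packaging code like this in a container is what lets AI Platform Training run it unchanged.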

The code is instrumented using the hypertune package, so it can be used with an AI Platform hyperparameter tuning job to search for the best combination of hyperparameter values by optimizing the metrics you specify.
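That instrumentation amounts to reporting your chosen metric after evaluation. A hedged sketch of the reporting call (the API shape comes from the cloudml-hypertune package; the metric name "accuracy" is an illustrative choice, and the guarded import lets the sketch run where the package is absent):

```python
# Sketch of metric reporting for AI Platform hyperparameter tuning.
# The hyperparameter_metric_tag must match the metric name configured
# in the tuning job; "accuracy" here is an illustrative assumption.
correct, total = 87, 100
accuracy = correct / total  # the metric the tuning job optimizes

try:
    import hypertune  # from the cloudml-hypertune package
    hpt = hypertune.HyperTune()
    hpt.report_hyperparameter_tuning_metric(
        hyperparameter_metric_tag="accuracy",
        metric_value=accuracy,
        global_step=1,
    )
except ImportError:
    # Outside an AI Platform job (or without the package), just log it.
    print(f"accuracy={accuracy}")
```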

Objectives

  • Create a training and validation split with BigQuery.
  • Wrap a machine learning model into a Docker container, and train it on AI Platform.
  • Use the hyperparameter tuning engine on Google Cloud to find the best hyperparameters.
  • Deploy a trained machine learning model on Google Cloud as a REST API, and query it.
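On the first objective, a reproducible split is usually built from a deterministic hash of a key column (in BigQuery, typically something like MOD(ABS(FARM_FINGERPRINT(key)), 10)); the notebook shows the lab's exact query. A pure-Python analog of the idea:

```python
# Deterministic hash-based train/validation split -- the same idea BigQuery
# expresses with MOD(ABS(FARM_FINGERPRINT(key)), 10). Hashing a stable key
# means every run assigns each row to the same bucket.
import hashlib

def split_bucket(key: str, buckets: int = 10) -> int:
    """Map a row key to a stable bucket in [0, buckets)."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % buckets

rows = [f"row-{i}" for i in range(1000)]
train = [r for r in rows if split_bucket(r) < 8]        # buckets 0-7: ~80%
validation = [r for r in rows if split_bucket(r) >= 8]  # buckets 8-9: ~20%
print(len(train), len(validation))
```

Because the bucket depends only on the key, re-running the split never moves a row between training and validation.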

Setup

For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.

  1. Sign in to Google Skills using an incognito window.

  2. Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
    There is no pause feature. You can restart if needed, but you have to start at the beginning.

  3. When ready, click Start lab.

  4. Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.

  5. Click Open Google Console.

  6. Click Use another account and copy/paste credentials for this lab into the prompts.
    If you use other credentials, you'll receive errors or incur charges.

  7. Accept the terms and skip the recovery resource page.

Activate Cloud Shell

Cloud Shell is a virtual machine that contains development tools. It offers a persistent 5-GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources. gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab completion.

  1. Click the Activate Cloud Shell button (Activate Cloud Shell icon) at the top right of the console.

  2. Click Continue.
    It takes a few moments to provision and connect to the environment. When you are connected, you are also authenticated, and the project is set to your PROJECT_ID.

Sample commands

  • List the active account name:
gcloud auth list

(Output)

Credentialed accounts:
 - <myaccount>@<mydomain>.com (active)

(Example output)

Credentialed accounts:
 - google1623327_student@qwiklabs.net
  • List the project ID:
gcloud config list project

(Output)

[core]
project = <project_ID>

(Example output)

[core]
project = qwiklabs-gcp-44776a13dea667a6

Note: Full documentation of gcloud is available in the gcloud CLI overview guide.

Task 1. Enable Cloud services

  1. In Cloud Shell, to set the project ID to your Google Cloud Project, run the following command:
export PROJECT_ID=$(gcloud config get-value core/project)
gcloud config set project $PROJECT_ID
  2. To enable the required Cloud services, run the following command:
gcloud services enable \
  cloudbuild.googleapis.com \
  container.googleapis.com \
  cloudresourcemanager.googleapis.com \
  iam.googleapis.com \
  containerregistry.googleapis.com \
  containeranalysis.googleapis.com \
  ml.googleapis.com \
  dataflow.googleapis.com
  3. Add the Editor permission for your Cloud Build service account:
PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)")
CLOUD_BUILD_SERVICE_ACCOUNT="${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com"
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$CLOUD_BUILD_SERVICE_ACCOUNT \
  --role roles/editor

Click Check my progress to verify the objective. Add the Editor permission for a Cloud Build service account.

Task 2. Create an instance of AI Platform Pipelines

  1. In the Google Cloud Console, on the Navigation menu, scroll down to AI Platform and pin the section for easier access later in the lab.

The Google Cloud navigation menu, wherein the Pin icon is highlighted within the AI Platform option.

  2. In Cloud Shell, enter the following command to create the required GKE cluster:
gcloud container clusters create cluster-1 \
  --zone us-central1-a \
  --release-channel stable \
  --machine-type n1-standard-2 \
  --scopes=https://www.googleapis.com/auth/cloud-platform

This should take 2-3 minutes to complete. Wait for the cluster to finish before proceeding to the next step.

  3. While you wait for the cluster to be created, do one of the following:
    • On the Navigation menu, click Kubernetes Engine to view the cluster being created.
    • On the Navigation menu, click Compute Engine to see the individual VMs spinning up.

When the cluster is ready, Cloud Shell shows a status similar to the image below.

The Cloud Shell data, which includes STATUS: RUNNING.

  4. Return to the AI Platform grouping and click Pipelines.

The highlighted navigation path to the Pipelines option.

  5. Click New Instance.

The AI Platform Pipelines page, with the New Instance button highlighted.

  6. On the Kubeflow Pipelines page, click Configure.

The cluster you created in Step 2 will appear by default in the selection window.

The Deploy Kubeflow Pipelines page, wherein the Cluster field is highlighted; cluster-1 [us-central1-a]

  7. Scroll to the bottom of the page, accept the marketplace terms, and click Deploy.

You will see the individual services of KFP deployed to your GKE cluster. Wait for the deployment to finish before proceeding to the next task.

Click Check my progress to verify the objective. Create an instance of AI Platform Pipelines.

Task 3. Create an instance of Vertex AI Platform Notebooks

An instance of Vertex AI Platform Notebooks is used as a primary experimentation/development workbench. The instance is configured using a custom container image that includes all Python packages required for this lab.

  1. In Cloud Shell, create a folder in your home directory:
cd
mkdir tmp-workspace
cd tmp-workspace
  2. Create a requirements file with the Python packages to install in the custom image:
gsutil cp gs://cloud-training/OCBL203/requirements.txt .
  3. Create a Dockerfile that defines your custom container image:
gsutil cp gs://cloud-training/OCBL203/Dockerfile .
  4. Build the image and push it to your project's Container Registry:
IMAGE_NAME=kfp-dev
TAG=latest
IMAGE_URI="gcr.io/${PROJECT_ID}/${IMAGE_NAME}:${TAG}"
gcloud builds submit --timeout 15m --tag ${IMAGE_URI} .

Click Check my progress to verify the objective. Build the image and push it to your project's Container Registry.
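The Dockerfile you copied in step 3 (gs://cloud-training/OCBL203/Dockerfile) is the authoritative one for this lab. Purely as a hypothetical sketch, a custom notebook image of this kind typically extends a Deep Learning container base image and installs the extra requirements:

```dockerfile
# Hypothetical sketch -- not the lab's actual Dockerfile.
# The base image name and pip invocation are assumptions.
FROM gcr.io/deeplearning-platform-release/base-cpu
COPY requirements.txt .
RUN python -m pip install -U -r requirements.txt
```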

  5. Create an instance of Vertex AI Platform Notebooks:
ZONE=us-central1-a
INSTANCE_NAME=ai-notebook

If you want to use a different zone or instance name, replace us-central1-a and ai-notebook with values of your choice:

IMAGE_FAMILY="common-container"
IMAGE_PROJECT="deeplearning-platform-release"
INSTANCE_TYPE="n1-standard-4"
METADATA="proxy-mode=service_account,container=$IMAGE_URI"
gcloud compute instances create $INSTANCE_NAME \
  --zone=$ZONE \
  --image-family=$IMAGE_FAMILY \
  --machine-type=$INSTANCE_TYPE \
  --image-project=$IMAGE_PROJECT \
  --maintenance-policy=TERMINATE \
  --boot-disk-device-name=${INSTANCE_NAME}-disk \
  --boot-disk-size=100GB \
  --boot-disk-type=pd-ssd \
  --scopes=cloud-platform,userinfo-email \
  --metadata=$METADATA

This may take up to 5 minutes to complete.

  6. After five minutes, in the Cloud Console, on the Navigation menu, click Vertex AI > Workbench.

The Vertex AI Workbench instance can take 2 to 3 minutes to appear.

  7. Refresh the page. When the Include legacy instances checkbox appears, select it to see your instance.

The Workbench page, wherein the Refresh button is highlighted, along with the option Include legacy instance, which is enabled.

  8. Click the Open JupyterLab link.

Click Check my progress to verify the objective. Create an instance of Vertex AI Platform Notebooks.

Task 4. Clone the mlops-on-gcp repo within your Vertex AI Platform Notebooks instance

To clone the mlops-on-gcp notebook in your JupyterLab instance:

  1. In JupyterLab, click the Terminal icon to open a new terminal.

  2. At the command-line prompt, type in the following command and press Enter:

    git clone https://github.com/GoogleCloudPlatform/mlops-on-gcp

    Note: If the cloned repo does not appear in the JupyterLab UI, you can use the top menu instead: under Git > Clone a Repository, clone the repo (https://github.com/GoogleCloudPlatform/mlops-on-gcp) using the UI.

    Clone Repo dialog

  3. Confirm that you have cloned the repository by double clicking on the mlops-on-gcp directory and ensuring that you can see its contents. The files for all the Jupyter notebook-based labs throughout this course are available in this directory.

Click Check my progress to verify the objective. Clone the mlops-on-gcp repo within your Vertex AI Platform Notebooks instance.

Task 5. Navigate to the mlops-on-gcp notebook

Note: In order to perform all tasks, you need to read all explanations and follow the instructions carefully before running each cell. Some tasks may take 5-10 minutes to complete. Wait for each task to be completed before proceeding to the next one.
  1. In the notebook interface, navigate to mlops-on-gcp > on_demand > kfp-caip-sklearn > lab-01-caip-containers > exercises, and open lab-01.ipynb.

  2. In the notebook interface, click Edit > Clear All Outputs.

  3. Carefully read through the notebook instructions and fill in lines marked with #TODO where you need to complete the code.

Note: To run the current cell, click the cell and press SHIFT+ENTER. Other cell commands are listed in the notebook UI under Run.
  • Hints may also be provided for the tasks to guide you along. Highlight the text to read the hints (they are in white text).
  • If you need more help, navigate to mlops-on-gcp > on_demand > kfp-caip-sklearn > lab-01-caip-containers, and open lab-01.ipynb to display the complete solution.

Prepare the lab dataset

The pipeline ingests data from BigQuery. While executing the cells in the notebook, you set the BigQuery parameters, create a BigQuery dataset, and upload the Covertype CSV data into a table.

Click Check my progress to verify the objective. Prepare the lab dataset.

Click Check my progress to verify the objective. Create training and validation splits.

Click Check my progress to verify the objective. Develop a training application.

Click Check my progress to verify the objective. Submit an AI Platform hyperparameter tuning job.

Click Check my progress to verify the objective. Deploy the model to AI Platform Prediction.

Congratulations!

In this lab you learned how to develop a training application, package it as a Docker image, and run it on AI Platform Training.

End your lab

When you have completed your lab, click End Lab. Google Skills removes the resources you’ve used and cleans the account for you.

You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.

The number of stars indicates the following:

  • 1 star = Very dissatisfied
  • 2 stars = Dissatisfied
  • 3 stars = Neutral
  • 4 stars = Satisfied
  • 5 stars = Very satisfied

You can close the dialog box if you don't want to provide feedback.

For feedback, suggestions, or corrections, please use the Support tab.

Copyright 2026 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
