GSP223

Overview
Vertex AI is Google Cloud's unified platform for machine learning. The AutoML feature within Vertex AI simplifies training high-quality custom image recognition models without requiring deep ML expertise. After training, models are deployed to a managed Endpoint for real-time predictions via an easy-to-use API.
In this lab, you'll upload cloud images to Cloud Storage, create a Vertex AI Dataset from them, and use a pre-trained model (simulated) on a Vertex AI Endpoint to generate predictions.
Objectives
In this lab, you perform the following tasks:
- Set up the Vertex AI environment and a Cloud Storage bucket.
- Upload a labeled dataset to Cloud Storage.
- Create and inspect a Vertex AI Dataset.
- Generate predictions against a Vertex AI Endpoint.
Setup and requirements
Before you click the Start Lab button
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources are made available to you.
This hands-on lab lets you do the lab activities in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials you use to sign in and access Google Cloud for the duration of the lab.
To complete this lab, you need:
- Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito (recommended) or private browser window to run this lab. This prevents conflicts between your personal account and the student account, which could cause extra charges to be incurred to your personal account.
- Time to complete the lab—remember, once you start, you cannot pause a lab.
Note: Use only the student account for this lab. If you use a different Google Cloud account, you may incur charges to that account.
How to start your lab and sign in to the Google Cloud console
- Click the Start Lab button. If you need to pay for the lab, a dialog opens for you to select your payment method.
On the left is the Lab Details pane with the following:
- The Open Google Cloud console button
- Time remaining
- The temporary credentials that you must use for this lab
- Other information, if needed, to step through this lab
- Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).
The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Arrange the tabs in separate windows, side-by-side.
Note: If you see the Choose an account dialog, click Use Another Account.
- If necessary, copy the Username below and paste it into the Sign in dialog.
{{{user_0.username | "Username"}}}
You can also find the Username in the Lab Details pane.
- Click Next.
- Copy the Password below and paste it into the Welcome dialog.
{{{user_0.password | "Password"}}}
You can also find the Password in the Lab Details pane.
- Click Next.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.
Note: Using your own Google Cloud account for this lab may incur extra charges.
- Click through the subsequent pages:
- Accept the terms and conditions.
- Do not add recovery options or two-factor authentication (because this is a temporary account).
- Do not sign up for free trials.
After a few moments, the Google Cloud console opens in this tab.
Note: To access Google Cloud products and services, click the Navigation menu or type the service or product name in the Search field.
Activate Cloud Shell
Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.
- Click Activate Cloud Shell at the top of the Google Cloud console.
- Click through the following windows:
- Continue through the Cloud Shell information window.
- Authorize Cloud Shell to use your credentials to make Google Cloud API calls.
When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:
Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
- (Optional) You can list the active account name with this command:
gcloud auth list
- Click Authorize.
Output:
ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}
To set the active account, run:
$ gcloud config set account `ACCOUNT`
- (Optional) You can list the project ID with this command:
gcloud config list project
Output:
[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}
Note: For full documentation of gcloud, in Google Cloud, refer to the gcloud CLI overview guide.
Task 1. Set up the Vertex AI environment
You will enable the necessary APIs, access the Vertex AI console, and prepare your storage bucket.
Confirm that Vertex AI is enabled
The Vertex AI API is required for managing datasets, training, and deploying models.
- In the Google Cloud console, in the Navigation menu, select APIs & Services > Library.
- In the Search for APIs & services field, type Vertex AI API, then click Vertex AI API in the search results.
- Confirm that the Vertex AI API is enabled. If it is not, click Enable.
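Note: As an optional alternative to the console steps above, you can check and enable the API from Cloud Shell:
# Check whether the Vertex AI API is already enabled
gcloud services list --enabled | grep aiplatform
# Enable it if it is not listed
gcloud services enable aiplatform.googleapis.com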
Open the Vertex AI Dashboard
- In the Navigation menu, click Vertex AI.
Create a storage bucket
The storage bucket will hold your training images and the manifest file. The region must be supported by Vertex AI.
- In Cloud Shell, run the following command to create a storage bucket named {{{project_0.project_id | "PROJECT_ID"}}}-vcm:
gsutil mb -p $GOOGLE_CLOUD_PROJECT \
-c standard \
-l {{{project_0.default_region | Region}}} \
gs://$GOOGLE_CLOUD_PROJECT-vcm/
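Note: If you want to confirm the bucket's location and storage class, an optional check from Cloud Shell is:
gsutil ls -L -b gs://$GOOGLE_CLOUD_PROJECT-vcm/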
Click Check my progress to verify the objective.
Create a Cloud Storage bucket
Task 2. Upload training images to Cloud Storage
To train a model to classify cloud images, you need labeled training data so the model can develop an understanding of the image features associated with different types of clouds. In this example your model learns to classify three different types of clouds: cirrus, cumulus, and cumulonimbus.
To put the training images in your Cloud Storage bucket:
- In Cloud Shell, run the following command to create an environment variable with the name of your bucket:
export BUCKET=$GOOGLE_CLOUD_PROJECT-vcm
The training images are publicly available in a Cloud Storage bucket.
- Use the gsutil command-line utility for Cloud Storage to copy the training images from a public bucket into your new bucket:
gsutil -m cp -r gs://spls/gsp223/images/* gs://${BUCKET}
- To view the images you just copied into your bucket, in the Navigation menu, click Cloud Storage > Buckets, then click on your bucket name. You should see three folders corresponding to the cloud types.
If you click on the individual image files in each folder you can see the photos you'll use to train the model for each type of cloud.
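Note: As an optional alternative to browsing in the console, you can inspect the bucket from Cloud Shell:
# List the top-level folders (one per cloud type)
gsutil ls gs://${BUCKET}
# Count the copied image objects
gsutil ls gs://${BUCKET}/** | wc -l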
Click Check my progress to verify the objective.
Upload training images to Cloud Storage Bucket
Task 3. Create a dataset
Now that your training data is in Cloud Storage, you need a way for Vertex AI to access it. Typically, you'd create a CSV file where each row contains a URL to a training image and the associated label for that image.
For this lab, the CSV file has been created for you; you just need to update it with your bucket name.
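For reference, each row of the file pairs an image URI with its label (cirrus, cumulus, or cumulonimbus). The rows below are illustrative only, with hypothetical filenames; the provided data.csv uses the word placeholder where your bucket name belongs:
gs://placeholder/cirrus/example_01.jpg,cirrus
gs://placeholder/cumulus/example_02.jpg,cumulus
gs://placeholder/cumulonimbus/example_03.jpg,cumulonimbus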
- Copy the CSV file to your Cloud Shell instance:
gsutil cp gs://spls/gsp223/data.csv .
- Update the CSV file with your specific bucket name:
sed -i -e "s/placeholder/${BUCKET}/g" ./data.csv
- Upload the updated CSV file to your Cloud Storage bucket:
gsutil cp ./data.csv gs://${BUCKET}
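Note: Optionally, confirm that the substitution worked before importing:
# Show the first few rows; each URI should now point to your bucket
head -3 ./data.csv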
- Once that command completes, click the Refresh button at the top of the Storage browser. Confirm that you see the data.csv file in your bucket.
- In the Navigation menu, click Vertex AI > Datasets.
- Click Create.
- Set Dataset Name to clouds_vertex_ai.
- Select Image as the data type.
- Select Single-label classification as the objective.
Note: In your own projects, you may want to use multi-label classification if an image can have more than one label.
- For Region, select {{{project_0.default_region | Region}}}, then click Create.
Now you'll import data.
- Choose Select import files from Cloud Storage, then click Browse > {{{project_0.project_id | "PROJECT_ID"}}}-vcm > data.csv, and click Select.
- Click Continue and then Import.
Wait for the image import to complete; it should take 2 to 5 minutes.
Click Check my progress to verify the objective.
Create a Dataset
Task 4. Generate predictions
Since the model has been pre-trained and is assumed to be deployed, you'll now use the Vertex AI API (via a curl command) to get predictions.
Simulate Deployment (Pre-trained Model)
The next steps assume a pre-trained model is deployed to a Vertex AI Endpoint. In a production environment, after training, your model would be deployed to a Vertex AI Endpoint. The prediction request structure uses the model's resource ID or the endpoint's URL.
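For context, in a project with a live deployment you would typically look up the endpoint before calling it. The command below is a sketch of that lookup; it is not needed in this simulated lab and assumes at least one endpoint has been deployed:
# List deployed endpoints in the lab region (expects an existing deployment)
gcloud ai endpoints list --region={{{project_0.default_region | Region}}}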
Prepare the input images
- Download the two example images you'll use for prediction:
gsutil cp gs://spls/gsp223/examples/* .
- View the example input file for prediction (CLOUD1-JSON), which contains placeholder bytes:
cat CLOUD1-JSON
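For image classification endpoints, the request body generally follows the instances/parameters structure sketched below. This is an illustration of the expected shape, not the exact contents of CLOUD1-JSON; the base64 string is a placeholder:
{
  "instances": [
    { "content": "BASE64_ENCODED_IMAGE_BYTES" }
  ],
  "parameters": {
    "confidenceThreshold": 0.5,
    "maxPredictions": 5
  }
}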
Generate Predictions Against a Vertex AI Endpoint
Use the generic Vertex AI prediction API format, which requires the model to be deployed. The following commands use a placeholder model name (clouds_vertex_model) for illustration, assuming a deployed model exists.
Test 1: prediction for CLOUD1
- Set the input file to the first image:
INPUT_DATA_FILE=CLOUD1-JSON
- Simulate the Vertex AI prediction URL.
Since a live endpoint can't be automatically provided, the command structure reflects the Vertex AI API, which uses your Project ID and a Model ID. In a real-world scenario, you would first retrieve the actual Endpoint ID; the command here is conceptual.
- Set a placeholder MODEL_ID and the REGION:
export MODEL_ID='clouds_vertex_model'
export REGION='{{{project_0.default_region | Region}}}'
- Execute the (conceptual) Vertex AI prediction call:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
"https://${REGION}-aiplatform.googleapis.com/v1/projects/${GOOGLE_CLOUD_PROJECT}/locations/${REGION}/models/${MODEL_ID}:predict" \
-d "@${INPUT_DATA_FILE}" \
| jq > prediction1.txt
Expected Output (Conceptual): The model should predict this is a cirrus cloud with high confidence.
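You can review the saved response at any time, for example:
cat prediction1.txt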
Test 2: prediction for CLOUD2
- Set the input file to the second image:
INPUT_DATA_FILE=CLOUD2-JSON
- Execute the (conceptual) Vertex AI prediction call:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
"https://${REGION}-aiplatform.googleapis.com/v1/projects/${GOOGLE_CLOUD_PROJECT}/locations/${REGION}/models/${MODEL_ID}:predict" \
-d "@${INPUT_DATA_FILE}" \
| jq > prediction2.txt
Expected Output (Conceptual): The model should predict this is a cumulonimbus cloud with high confidence.
- Copy the .txt output files to your Cloud Storage bucket:
gsutil cp *.txt gs://${BUCKET}
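Note: Optionally, confirm that both prediction files are now in the bucket:
gsutil ls gs://${BUCKET}/prediction*.txt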
Click Check my progress to verify the objective.
Generate predictions
Congratulations!
In this lab you used Vertex AI to create an image dataset and generate predictions against a deployed model endpoint.
Next steps / Learn more
Google Cloud training and certification
...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated October 17, 2025
Lab Last Tested October 17, 2025
Copyright 2025 Google LLC. All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.