
Setup
- The lab creates a Google Cloud project and a set of resources for you to use for a limited period of time
- Labs are timed and cannot be paused. If you end a lab partway through, you must start it again.
- Click Start Lab in the top-left corner of the screen to begin
The Cloud Retail service and the Retail API enable customers to build end-to-end personalized recommendation systems without requiring a high level of expertise in machine learning, recommendation systems, or Google Cloud.
In this lab, you will explore the monitoring and management features available for the Retail service and examine some common issues and errors associated with importing and updating catalog and event data. The performance of the Recommendations AI and Product Search services depends on an error-free stream of user events, so it is important to monitor these services and create alerts that trigger when there are issues with your Retail catalog and user event data.
This lab uses a subset of the Google Merchant Center dataset for product catalog and user event data.
In this lab, you will learn how to complete the following tasks:
- Import product catalog and user event data into the Retail service
- Explore the Retail Dashboard
- Configure a service-managed alerting policy for user event uploads
- Examine data ingestion errors in Cloud Logging
- Use the Retail API to retrieve and update product catalog data
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
While you can operate Google Cloud remotely from your own machine, this lab uses both the Google Cloud Console and Cloud Shell, a command-line environment running in Google Cloud.
From the Cloud Console, click Activate Cloud Shell.
Here's what that one-time screen looks like:
It should only take a few moments to provision and connect to Cloud Shell.
Cloud Shell provides you with terminal access to a virtual machine hosted in the cloud. The virtual machine includes all the development tools that you'll need. It offers a persistent 5GB home directory and runs in Google Cloud, greatly enhancing network performance and authentication. Much, if not all, of your work in this lab can be done through the Cloud Console and Cloud Shell using only a browser.
Once connected to Cloud Shell, you should see that you are already authenticated and that the project is already set to your project ID.
Run the following command in Cloud Shell to confirm that you are authenticated:
Output:
To set the active account, run:
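The commands themselves are not shown on this page; the standard Cloud Shell authentication check (assumed boilerplate, verify against your lab) looks like this:

```shell
# List credentialed accounts; the active account is marked with an asterisk (*).
gcloud auth list

# If the lab account is not active, set it (replace ACCOUNT with your lab username).
gcloud config set account ACCOUNT
```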
Run the following command to confirm that you are using the correct project for this lab:
Output:
If the correct project is not listed, you can set it with this command:
Output:
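The commands referenced above are likewise not shown; assuming the standard gcloud configuration workflow, they are:

```shell
# Show the project currently configured in gcloud.
gcloud config list project

# If it is not the lab project, set it (replace PROJECT_ID with your lab project ID).
gcloud config set project PROJECT_ID
```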
Before you can begin using the Retail Recommendations AI or Retail Search APIs, you must enable the Retail API.
On the Navigation menu, click VIEW ALL PRODUCTS. In the Artificial Intelligence section, click Search for Retail.
Click Turn On API.
Click Continue and accept the data terms by clicking Accept.
Click Get Started.
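Alternatively, the API can be enabled from Cloud Shell. The service name below is the standard one for the Retail API; it is an assumption, since the lab itself uses the console:

```shell
# Enable the Retail API for the current project (service name assumed).
gcloud services enable retail.googleapis.com
```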
In this task, you will import product data into the catalog from a BigQuery table that uses the Retail products schema.
In the GCP Console, on the Navigation menu, click VIEW ALL PRODUCTS. In the Artificial Intelligence section, click Search for Retail > Data to open the Retail Data management page.
Make sure the Catalog tab is selected and click Import.
Configure the import parameters as follows to import the product catalog:
For BigQuery table, click Browse.
Enter products in the search box and click Search.
Select the radio button for the products table in the retail dataset.
Click Select.
You need to wait for a pop-up to appear with a message similar to the following:
When the import task is scheduled, you will also see the details of a gcloud scheduler command that you can use to schedule a regular data import task.
Click X to close the pop-up that tells you the import operation was successfully scheduled.
Click Cancel to close the import page and return to the Retail Data page to check the status of your catalog data import task.
In the Search for Retail navigation menu, click Data and then click Activity Status to monitor the progress of the import task.
It will take a minute or two for the status of the import task in the Product catalog import activity status section to change to Succeeded. A total of 1,268 items will have been imported.
In this task, you will import user event data from a JSON file stored in Cloud Storage.
In the GCP Console, on the Navigation menu, click VIEW ALL PRODUCTS. In the Artificial Intelligence section, click Search for Retail > Data to open the Retail Data management page.
Make sure the Event tab is selected and click Import.
Configure the import parameters as follows to import the user event data:
For Google Cloud Storage location, click the Browse button.
Navigate to the storage bucket containing the file recent_retail_events.json.
Click the filename to make sure it is selected.
Click Select.
Click Import.
You need to wait for a pop-up to appear with a message similar to the following:
When the import task is scheduled, you will also see the details of a gcloud scheduler command that you can use to schedule a regular event import task.
Wait for the import task to be scheduled and the gcloud scheduler command to be displayed.
Click X to close the pop-up that tells you the import operation was successfully scheduled.
Click Cancel to close the import page and return to the Retail Data page to check the status of your event data import task.
In the Search for Retail navigation menu, click Data and then click Activity Status to monitor the progress of the import task.
It will take a minute or two for the status of the import task in the User events import activity status section to change to Succeeded. Approximately 32,000 items will have been imported and 5 items will have failed.
In this task, you will explore the features in the Retail Dashboard and the information provided.
The Dashboard contains high-level status information and quick links for the following:
In this task, you will configure an alert that fires when user event uploads are interrupted. An interruption indicates that there is a major problem with the source of your user events (for example, your retail site) that should be investigated.
You will now create a service-managed alerting policy using the Retail console.
In the GCP Console, on the Navigation menu, click VIEW ALL PRODUCTS. In the Artificial Intelligence section, click Search for Retail > Monitoring to open the Retail Monitoring page.
Click Recommended Alerts.
Enable User events record reduction.
Click Get Code > Command Line.
This will display the gcloud command-line and REST API syntax that you can use to create an alert that fires if no Retail events have been uploaded for a period of 10 minutes. Note that this command does not include a notification channel; you will need to specify one if you want to use a command like this.
Click Close. You will use the console to set up the alert in the following steps.
In User events record reduction, click Notification Channels.
Click Manage Notification Channels.
This will open a new browser tab with the Cloud Monitoring Notification Channels page open.
Next to Emails, click Add New.
Enter your email and name details in the Create Email Channel dialog.
Click Save.
Close the Cloud Monitoring Notification Channels browser tab. This will return you to the Retail alert configuration tab.
Click the refresh icon next to Manage Notification Channels.
This will close the Manage Notification Channels pop-up.
In User events record reduction, click Notification Channels again.
Select the Email notification channel for the email alert channel you have just created.
Click OK.
Click Submit.
You will explore the information and controls available to you for Retail alerting policies in the Cloud Operations Monitoring console by opening the service-managed policy for user events record reduction that you created in the previous task.
In the GCP Console, on the Navigation menu, click VIEW ALL PRODUCTS. In the Observability section, click Monitoring > Overview to open the Cloud Operations Monitoring overview page.
Click Alerting.
Under policies, click User events record reduction to open your newly created Retail alert.
The Edit button is greyed out, and the mouse-over text indicates that you cannot edit a service-managed alerting policy.
Note that you can enable and disable the alert on this page, but you cannot change the alert parameters or the notification channel.
On the Monitoring menu, click Monitoring Settings.
Click Notification channels.
Note that you can manage your notification channels here, adding or removing alerting paths as required. Any additional notification channels created here will be available for Retail service-managed policies you create in the Retail console.
In this task, you will examine the data ingestion and other events in Cloud Logging.
In the GCP Console, on the Navigation menu, click VIEW ALL PRODUCTS. In the Observability section, click Logging to open the Cloud Logging page.
Click Logs Explorer.
Copy the following query string into the query edit dialog and click Run Query.
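The query string itself is not reproduced on this page. A query along these lines (hypothetical; adjust to your lab's instructions) filters Retail API audit logs for user event import errors:

```
resource.type="audited_resource"
resource.labels.service="retail.googleapis.com"
resource.labels.method="google.cloud.retail.v2.UserEventService.ImportUserEvents"
severity=ERROR
```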
If no events are displayed, click Extend time by: 1 hour and extend the time by a few hours until you can see the user event import errors that were triggered when you imported user event data.
Click the > icon to the left of one of the events to expand it.
Click jsonPayload.importPayload to expand the import payload data.
The jsonPayload.importPayload.gcsPath string contains the name of the Cloud Storage source file that had unjoined event data. This is the recent_retail_events.json file you imported in an earlier task.
The status.message string contains the full error details, including the unmatched productId strings that were not found in the catalog. This data will help you identify the specific events that failed and the root cause of the problem.
Click Edit Query.
Copy the following query string into the query edit dialog and click Run Query.
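This second query is also not shown. A broader hypothetical version drops the method filter and lowers the severity threshold to capture more Retail API events:

```
resource.type="audited_resource"
resource.labels.service="retail.googleapis.com"
severity>=WARNING
```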
This will show you a wider range of errors and more event data.
You can expand the jsonPayload.status property in these events to get the specific details that caused the event errors.
You will now use curl to make calls to the Retail API to retrieve and update product catalog data.
Create an environment variable to store the Project ID:
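The command is not shown; a standard way to capture the project ID from the active gcloud configuration is:

```shell
# Store the current project ID in an environment variable for later commands.
export PROJECT_ID=$(gcloud config get-value project)
echo $PROJECT_ID
```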
Create an IAM service account for controlled access to the Retail API:
If this succeeds, the updated project IAM policy, including the new binding, is displayed.
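The creation command is not shown on this page. A sketch under stated assumptions: the service account name retail-service-account is illustrative, and roles/retail.admin is assumed to be the granted role (add-iam-policy-binding prints the updated project policy, matching the description above):

```shell
# Create a service account for Retail API access (account name is illustrative).
gcloud iam service-accounts create retail-service-account

# Grant the service account access to the Retail API (role is an assumption).
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:retail-service-account@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role="roles/retail.admin"
```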
Creating a role binding on the service account for the lab user with the Service Account Token Creator role allows the lab user to use service account impersonation to safely generate limited duration authentication tokens for the service account. These tokens can then be used to interactively test access to APIs and services.
Create a role binding on the Retail API service account for your user account to permit impersonation:
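Assuming the illustrative service account name from the previous step, the binding described here would look like:

```shell
# Allow your lab user to impersonate the service account by granting the
# Service Account Token Creator role on the service account itself.
gcloud iam service-accounts add-iam-policy-binding \
  retail-service-account@${PROJECT_ID}.iam.gserviceaccount.com \
  --member="user:$(gcloud config get-value account)" \
  --role="roles/iam.serviceAccountTokenCreator"
```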
Generate a temporary access token for the Retail API:
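With the impersonation binding in place, a short-lived token can be generated as follows (service account name is the same illustrative one as above):

```shell
# Generate a short-lived access token by impersonating the service account.
export ACCESS_TOKEN=$(gcloud auth print-access-token \
  --impersonate-service-account=retail-service-account@${PROJECT_ID}.iam.gserviceaccount.com)
```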
You will use the product URL in the catalog to retrieve the product data.
Store sample user event JSON data in an environment variable:
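The JSON payload is not shown; the field values below are purely illustrative, not the lab's actual data:

```shell
# Store a sample user event JSON payload in an environment variable
# (eventType, visitorId, and product id are illustrative values).
export EVENT_JSON='{
  "eventType": "detail-page-view",
  "visitorId": "visitor-1",
  "productDetails": [{"product": {"id": "GGOEAAEC172013"}}]
}'
```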
Store the REST API URL for the product in your catalog using the Retail API in an environment variable:
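The export command is not shown; assuming the standard Retail v2 endpoint and an illustrative product ID, it would look like:

```shell
# The product ID is illustrative; use an ID that exists in your catalog.
export PRODUCT_ID=GGOEAAEC172013

# REST URL for the product in the default catalog and branch
# (standard Retail v2 endpoint, substituting the project and product IDs).
export PRODUCT_URL="https://retail.googleapis.com/v2/projects/${PROJECT_ID}/locations/global/catalogs/default_catalog/branches/default_branch/products/${PRODUCT_ID}"
```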
This is the REST API URL for the product in the Retail API, used to manage the product in the catalog. Note that the URL includes bash environment variable substitutions for the project ID and product ID.
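The retrieval command itself is not shown; a GET request using the token and URL variables from the previous steps would look like:

```shell
# Retrieve the product JSON with an authenticated GET request.
curl -X GET -H "Authorization: Bearer $ACCESS_TOKEN" "$PRODUCT_URL"
```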
The response returned will look like the following block of JSON.
Note that the product title is Android Iconic Sock.
This update payload makes a minor change to the product title.
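The update command is not shown; a PATCH request along these lines would change the title (the inline payload and the new title text are illustrative; the Retail products.patch method accepts an updateMask query parameter restricting which fields change):

```shell
# Update only the product title; updateMask limits the change to that field.
curl -X PATCH \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"title": "Android Iconic Sock (Updated)"}' \
  "${PRODUCT_URL}?updateMask=title"
```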
This will return the updated product JSON in the response showing the updated product title.
Congratulations, you've successfully explored the monitoring and management tools for the Retail API.