
Before you begin
- The lab creates a Google Cloud project and resources that you use for a limited period of time
- Labs are timed and cannot be paused. If you end a lab partway through, you must start over.
- Click Start Lab in the top-left corner of the screen to begin
In this lab, you will use Dataflow to collect traffic events from simulated traffic sensor data made available through Google Cloud Pub/Sub, and write them into a Bigtable table.
In this lab, you will perform the following tasks:
- Copy sample files to the training-vm home directory
- Simulate traffic sensor data into Pub/Sub
- Launch a Dataflow pipeline
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).
In the Google Cloud console, on the Navigation menu, select IAM & Admin > IAM.
Confirm that the default compute Service Account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Cloud Overview > Dashboard.
If the account is not present in IAM or does not have the editor role, follow the steps below to assign the required role:
1. In the Google Cloud console, on the Navigation menu, click Cloud Overview > Dashboard.
2. Copy the project number (e.g. 729328892908).
3. On the Navigation menu, select IAM & Admin > IAM.
4. At the top of the roles table, below View by Principals, click Grant Access.
5. For New principals, type {project-number}-compute@developer.gserviceaccount.com, replacing {project-number} with your project number.
6. For Role, select Project (or Basic) > Editor.
7. Click Save.
You will be running a sensor simulator from the training VM. There are several files and some setup of the environment required.
In the Console, on the Navigation menu, click Compute Engine > VM instances.
Locate the line with the instance called training-vm.
On the far right, under the Connect column, click on SSH to open a terminal window. Then click Connect.
In this lab, you will enter CLI commands on the training-vm.
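The exact command for this step was not captured in this extract. In the training-data-analyst labs, you typically first verify that the training files are in place (the /training directory is assumed here):

    ls /training

The listing should include the setup and simulator scripts used later in this lab.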
The setup is complete when the result of your list (ls) command output appears as in the image below. If the full listing does not appear, wait a few minutes and try again.
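Next, copy the sample code and set up the environment. The repository URL and script path below are assumed from the training-data-analyst labs rather than captured from this page:

    git clone https://github.com/GoogleCloudPlatform/training-data-analyst
    source /training/project_env.sh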
This script sets the $DEVSHELL_PROJECT_ID and $BUCKET environment variables.
Click Check my progress to verify the objective.
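To simulate traffic sensor data into Pub/Sub, the lab runs a simulator script on the training-vm. The script name is assumed from the training-data-analyst labs:

    /training/sensor_magic.sh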
This command will send 1 hour of data in 1 minute. Let the script continue to run in the current terminal.
Open a second SSH terminal connected to training-vm. The new terminal session will not have the required environment variables. Complete the next step to set these variables.
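In the new training-vm SSH terminal, run the same environment script again (path assumed as above):

    source /training/project_env.sh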
Click Check my progress to verify the objective.
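Next, examine the script that launches the Dataflow pipeline. The directory and script names below are assumed from the training-data-analyst repository:

    cd ~/training-data-analyst/courses/streaming/process/sandiego
    nano run_oncloud.sh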
What does the script do?
The script takes three required arguments: project id, bucket name, and classname, plus an optional fourth argument: options. In this part of the lab, you will use the --bigtable option, which directs the pipeline to write into Cloud Bigtable.
Press CTRL+X to exit.
Run the following commands to create the Bigtable instance:
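The commands themselves were not captured in this extract. In the training-data-analyst repository, a helper script (name assumed) wraps the gcloud bigtable instances create call:

    cd ~/training-data-analyst/courses/streaming/process/sandiego
    ./create_cbt.sh

Once the instance exists, launch the pipeline. The classname below (CurrentConditions) is assumed from the sample code:

    ./run_oncloud.sh $DEVSHELL_PROJECT_ID $BUCKET CurrentConditions --bigtable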
Example successful run:
Click Check my progress to verify the objective.
Return to the browser tab for Console. On the Navigation menu, click Dataflow and click on the new pipeline job. Confirm that the pipeline job is listed and verify that it is running without errors.
Find the write:cbt step in the pipeline graph, and click on the down arrow on the right to see the writer in action. Click on the given writer. Review the Bigtable Options in the Step summary.
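To examine the data written to Bigtable, the lab opens an HBase shell from the training-vm. The helper script and table name below are assumed from the training-data-analyst sample:

    cd ~/training-data-analyst/courses/streaming/process/sandiego/quickstart
    ./quickstart.sh

When the script completes, you will be at an HBase shell prompt. Scan a couple of rows:

    scan 'current_conditions', {'LIMIT' => 2}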
Repeat the 'scan' command until you see a list of rows returned:
Review the output. Notice each row is broken into column, timestamp, value combinations.
Run another query. This time look only at the lane:speed column, limit to 10 rows, and specify rowid patterns for start and end rows to scan over:
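A query of this shape would do it; the table name and the rowid pattern (Highway 15, direction S) are assumed from the sample data:

    scan 'current_conditions', {'LIMIT' => 10, STARTROW => '15#S#1', ENDROW => '15#S#999', COLUMN => 'lane:speed'}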
Review the output. Notice that you see 10 of the column, timestamp, value combinations, all of which correspond to Highway 15. Also notice that column is restricted to lane:speed.
Feel free to run other queries if you are familiar with the syntax. Once you're satisfied, enter quit to exit the shell. If prompted to confirm, enter Y.
On your Dataflow page in your Cloud Console, click on the pipeline job name.
Click Stop on the top menu bar. Select Cancel, and then click Stop Job.
Go back to the first SSH terminal with the publisher, and enter Ctrl+C to stop it.
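The cleanup also typically removes the Bigtable instance created earlier, using a helper script from the same directory (script name assumed):

    cd ~/training-data-analyst/courses/streaming/process/sandiego
    ./delete_cbt.sh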
In the BigQuery console, click on the three dots next to the demos dataset, and click Delete.
Type delete and then click Delete.
When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.