Configuring and Using Cloud Logging and Cloud Monitoring

Lab: 1 hour 15 minutes · 5 credits · Introductory
Note: This lab may include integrated AI tools to support your learning.

Overview

In this lab, you will learn common configurations and uses of both Cloud Logging and Cloud Monitoring.

You will learn how to view logs with filtering mechanisms, export logs to BigQuery sinks, and create logging metrics. You will also learn how to use Cloud Monitoring to view consumption metrics and create dashboards.

Objectives

In this lab, you will learn how to perform the following tasks:

  • View logs using a variety of filtering mechanisms.
  • Exclude log entries and disable log ingestion.
  • Export logs and run reports against exported logs.
  • Create and report on logging metrics.
  • Use Cloud Monitoring to monitor different Google Cloud projects.
  • Create a metrics dashboard.

Setup and requirements

For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.

  1. Sign in to Qwiklabs using an incognito window.

  2. Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
    There is no pause feature. You can restart if needed, but you have to start at the beginning.

  3. When ready, click Start lab.

  4. Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.

  5. Click Open Google Console.

  6. Click Use another account and copy/paste credentials for this lab into the prompts.
    If you use other credentials, you'll receive errors or incur charges.

  7. Accept the terms and skip the recovery resource page.

Activate Google Cloud Shell

Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.

Google Cloud Shell provides command-line access to your Google Cloud resources.

  1. In Cloud console, on the top right toolbar, click the Open Cloud Shell button.

    Highlighted Cloud Shell icon

  2. Click Continue.

It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. For example:

Project ID highlighted in the Cloud Shell Terminal

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  • You can list the active account name with this command:
gcloud auth list

Output:

Credentialed accounts:
- @.com (active)

Example output:

Credentialed accounts:
- google1623327_student@qwiklabs.net
  • You can list the project ID with this command:
gcloud config list project

Output:

[core] project =

Example output:

[core] project = qwiklabs-gcp-44776a13dea667a6

Note: Full documentation of gcloud is available in the gcloud CLI overview guide.

Task 1. View and filter logs in the first project

In this task, you view VM instance logs with simple filtering.

See which services are writing logs

  1. Ensure that you are on the Google Cloud Console homepage.

  2. Verify you are still working in project 1; the project ID in the Console's info panel should match Project ID 1 in your lab's connection details panel.

  3. In the Google Cloud console, in the Navigation menu (Navigation menu icon), click Logging > Logs Explorer.

If prompted, close the notification.

  4. On the left-hand panel, Fields will be visible. For Resource Type, you will see several Google Cloud services that are creating logs.

All of these services are writing log entries. Entries from all these logs appear on the right, in the Query results pane. You can also query for results from specific logs, or that match specific criteria.

View VM instance logs with simple filtering

  1. In the Log fields panel, for Resource Type, click VM Instance.

After you click this:

  • The contents of the Log fields panel change. You will see a new field named INSTANCE ID. It shows all the instance IDs of the VM instances that are writing log entries.
  • The Query box near the top of the page is populated with resource.type="gce_instance". This means that only entries from VM instances will be displayed.
  • The Query results pane also updates automatically—entries from VM Instances are the only logs displayed.
  2. In the Instance Id field, select one of the instance IDs. Logs for the associated VM instance appear in the Query results pane.

  3. Click inside the Query box. This now becomes editable.

  4. In the Query box, remove everything after line 1. You should see only line 1, which contains resource.type="gce_instance".

  5. Click Run query (located in the top-right corner). In the Query results, you should see entries from all VM instance logs.

  6. Note that the logs panel reverts to its previous state.

  7. Now click the All log names dropdown, select syslog, and then click Apply.

Entries from syslog appear in the Query results pane.

Note: You can also control which log entries are displayed by selecting the log severity and time windows.
  8. Turn on streaming logs by clicking Stream logs (top-right corner, above the Run query button).

The streamed logs are visible in the results pane.

  9. Stop log streaming by clicking Stop stream in the top-right corner.
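The field and dropdown selections above are shorthand for the Logging query language; the same kinds of filters can be typed directly into the Query box. As an illustrative sketch (PROJECT_ID and the timestamp are placeholders), a hand-written query restricting results to recent, warning-or-worse syslog entries might look like:

```
resource.type="gce_instance"
log_name="projects/PROJECT_ID/logs/syslog"
severity>=WARNING
timestamp>="2025-01-01T00:00:00Z"
```

Clauses on separate lines are implicitly ANDed together.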

Task 2. Use log exports

In this task, you configure and test log exports to BigQuery.

Cloud Logging retains log entries for 30 days. In most circumstances, you'll want to retain some log entries for an extended time (and possibly perform sophisticated reporting on the archived logs).

Google Cloud provides a mechanism to have all log entries ingested into Cloud Logging also written to one or more archival sinks.

Configure the export to BigQuery

  1. In the Google Cloud console, in the Navigation menu (Navigation menu icon), click Logging > Log router.

  2. Click Create sink.

  3. For the Sink name, type vm_logs and then click Next.

  4. For Select sink service, select BigQuery dataset.

  5. For Select BigQuery dataset, select Create new BigQuery dataset.

  6. For the Dataset ID, type project_logs, and click Create dataset.

  7. Click Next.

  8. In the Build inclusion filter list box, copy and paste resource.type="gce_instance"

  9. Click Next.

  10. Click Create sink. You will be taken to the Create log sink next steps page (a message at the top may say "Your log sink was successfully created. Data should be available soon.").

Note: You could also export log entries to Pub/Sub or Cloud Storage.

Exporting to Pub/Sub can be useful if you want to flow entries through an ETL process before storing them in a database (Logging > Pub/Sub > Dataflow > BigQuery/Bigtable).

Exporting to Cloud Storage will batch up entries and write them into Cloud Storage objects approximately every hour.
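As a sketch of the CLI equivalent (assuming the project_logs dataset already exists and substituting your project ID for the PROJECT_ID placeholder), the same sink could also be created with gcloud:

```shell
# Create a sink that routes VM instance log entries to the project_logs dataset.
gcloud logging sinks create vm_logs \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/project_logs \
  --log-filter='resource.type="gce_instance"'

# Unlike the console flow, gcloud does not automatically grant the sink's
# writer identity access to the dataset; look up that service account and
# grant it the BigQuery Data Editor role yourself.
gcloud logging sinks describe vm_logs --format='value(writerIdentity)'
```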

Configure HTTP load balancing exports to BigQuery

You will now create an export for the HTTP load balancing logs to BigQuery.

  1. From the left-hand navigation menu, select Log router to return to the service homepage.

  2. Click Create Sink.

  3. For the Sink name, type load_bal_logs and then click Next.

  4. For Select sink service, select BigQuery dataset.

  5. For Select BigQuery dataset, select project_logs. (You created this BigQuery dataset in the previous set of steps.)

  6. Click Next.

  7. In the Build inclusion filter list box, copy and paste resource.type="http_load_balancer"

  8. Click Next.

  9. Click Create sink.

  10. You will now be on the Create log sink next steps page for the log sink.

  11. From the left-hand navigation menu, select Log router to return to the service homepage.

The Log Router page appears, displaying a list of sinks (including the one you just created—load_bal_logs).

Investigate the exported log entries

  1. In the Google Cloud console, on the Navigation menu (Navigation menu icon), click BigQuery.

  2. The "Welcome to BigQuery in the Cloud Console" message box opens. This message box provides a link to the quickstart guide and lists UI updates.

  3. Click Done.

  4. In the left pane in the Explorer section, click the arrow next to your project (this starts with qwiklabs-gcp-xxx) and you should see a project_logs dataset revealed under it.

You will now verify that the BigQuery dataset has appropriate permissions to allow the export writer to store log entries.

  1. Click the three-dot menu ("View actions") next to the project_logs dataset and click Open.

  2. Then from the top-right hand corner of the console, click the Share dropdown and select Manage permissions.

  3. On the Share permissions page, you will see that your service accounts have the "BigQuery Data Editor" role.

  4. Close the share permissions panel.

  5. Expand the project_logs dataset to see the tables with your exported logs—you should see multiple tables (one for each type of log that's receiving entries).

  6. Click on the syslog_(1) table, then click Details to see the number of rows and other metadata.

Note: If the syslog_(1) table is not visible, wait a few minutes and refresh the browser.
  7. In the Details tab, under Table info, you will see the full table name in the Table ID field. Copy this table name.
Note: Because the log entries are streamed into BigQuery as they arrive in Cloud Logging, they are stored in a BigQuery streaming buffer. Roughly 24 hours after arriving in the buffer, they are moved into regular BigQuery storage. You can run queries against the table, and both the data in regular storage and the data in the buffer will be scanned.
  8. To see a subset of your table's fields, paste the query below into the query editor tab (replacing qwiklabs-gcp-xx.project_logs.syslog_xxxxx with the table name you copied in the previous step).
SELECT logName, resource.type, resource.labels.zone, resource.labels.project_id FROM `qwiklabs-gcp-xx.project_logs.syslog_xxxxx`
  9. Click Run.

Feel free to experiment with some other queries that might provide interesting insights.
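For instance, a grouped query (again substituting your copied table name for the placeholder) could summarize which zones are producing the most syslog entries:

```sql
-- Count exported syslog entries per zone, busiest zones first.
SELECT
  resource.labels.zone,
  COUNT(*) AS entry_count
FROM `qwiklabs-gcp-xx.project_logs.syslog_xxxxx`
GROUP BY resource.labels.zone
ORDER BY entry_count DESC
```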

Note: Cloud Logging exports incoming log entries before any decision is made about ingesting them into logging storage, so only new log entries are exported to the sink. You may therefore not see a syslog_(1) table if all the syslog entries were generated prior to the export.

Existing log entries already ingested into Cloud Logging can be extracted using commands like:

gcloud logging read "resource.type=gce_instance AND logName=projects/[PROJECT_ID]/logs/syslog AND textPayload:SyncAddress" --limit 10 --format json
Note: You have set up an export for all the log entries generated by all services in the project. You can also create aggregate exports, which export log entries generated across projects, grouped by billing account, folder, or organization.
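As a sketch, an aggregate sink could be created at the organization level with gcloud (ORG_ID, PROJECT_ID, and the sink name org_vm_logs are placeholders; --include-children makes the sink export entries from all child projects):

```shell
# Hypothetical organization-level aggregate sink (illustrative placeholders).
gcloud logging sinks create org_vm_logs \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/project_logs \
  --organization=ORG_ID \
  --include-children \
  --log-filter='resource.type="gce_instance"'
```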

Click Check my progress to verify the objective. Configure the export to BigQuery

Task 3. Create a logging metric

In this task, you create a counter-type logs-based metric named 403s. You configure this metric to filter and count syslog entries specifically from your Compute Engine instances.

  1. In the Google Cloud console, on the Navigation menu (Navigation menu icon), click Logging > Logs-based Metrics.
Note: If prompted, click Leave for unsaved work.
  2. Click Create metric.

  3. In the Log-based metric Editor, set Metric Type as Counter.

  4. In the Details section, set the Log-based metric name to 403s.

  5. For the Filter selection for Build filter, enter the following and replace PROJECT_ID with Project ID 1:

resource.type="gce_instance" log_name="projects/PROJECT_ID/logs/syslog"
  6. Leave all the other fields at their defaults.

  7. Click Create metric.

  8. You will make use of this metric in the dashboarding portion of the lab.
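If you prefer the command line, here is a sketch of the equivalent metric creation with gcloud (replace PROJECT_ID with Project ID 1; the description text is illustrative):

```shell
# Create the same counter-type logs-based metric from the CLI.
gcloud logging metrics create 403s \
  --description="Syslog entries from VM instances" \
  --log-filter='resource.type="gce_instance" AND log_name="projects/PROJECT_ID/logs/syslog"'
```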

Click Check my progress to verify the objective. Create a logging metric

Task 4. Create a monitoring dashboard

In this task, you switch to the second project created by Qwiklabs and set up a Monitoring workspace.

Switch projects

Note the Project shown in the upper left corner of the Google Cloud console. First, you’ll switch the console to use Project ID 2.

GCP project identifier

  1. Click the project name at the top of the Google Cloud console and click the All tab.

Google Cloud all projects tab

  2. Click the project that matches Project ID 2 from the Qwiklabs Connection Details.

  3. Click Open.

Create a Monitoring workspace

You will now set up a Monitoring workspace that's tied to your Google Cloud project. The following steps create a new account that has a free trial of Monitoring.

  1. In the Google Cloud console, in the Navigation menu (Navigation menu icon), click Monitoring > Overview.

  2. Wait for your workspace to be provisioned.

When the Monitoring dashboard opens, your workspace is ready.

Cloud Monitoring Dashboard

Now add the first project to your Cloud Monitoring workspace.

  1. In the left menu, click Settings, then click Metric Scope and click + Add projects.

  2. Click Select Projects.

  3. Select the checkbox next to your first project ID and click Select.

  4. Click Add projects.

Create a monitoring dashboard

  1. In the left pane, click Dashboards.

  2. Click Create Custom Dashboard.

  3. Replace the generic dashboard name at the top with Example Dashboard.

  4. Click + Add widget > Line.

  5. For Widget Title, enter in CPU Usage.

  6. Click Select a metric dropdown.

  7. Click Active to deselect it. The tick should disappear.

  8. From Popular Resources, select VM Instance > Instance > CPU usage. Make sure it's the one that follows the format: compute.googleapis.com/instance/cpu/usage_time.

  9. Click Apply.

  10. Now click Apply in the top-right corner.

  11. Click + Add widget > Line.

  12. For Widget Title, enter in Memory Utilization.

  13. Click Select a metric dropdown.

  14. Click Active to deselect it. The tick should disappear.

  15. From Popular Resources, select VM Instance > Memory > Memory utilization. Make sure it's the one that follows the format: agent.googleapis.com/memory/percent_used.

  16. Click Apply.

  17. Now click Apply in the top-right corner.

  18. Click + Add widget > Line.

  19. For Widget Title, enter in 403s.

  20. Click Select a metric dropdown.

  21. From Popular Resources, select VM Instance > Logs-based metrics > Logging/user/403s. Make sure it's the one that follows the format: logging.googleapis.com/user/403s.

Note: If you don't see Logging/user/403s, refresh your browser.
  22. Click Apply.

  23. Now click Apply in the top-right corner.

You should now see your three graphs populated: one for CPU usage, one for memory utilization, and one for 403s.

CPU Usage, Memory Utilization, and 403s graphs in the monitoring dashboard

You can now explore some other options by editing the charts such as Filter, Group By, and Aggregation.
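Dashboards can also be defined as code. As a sketch (field names follow the Cloud Monitoring Dashboards API; the single-widget JSON below is illustrative, not a copy of the dashboard you just built), a dashboard could be created from a JSON definition:

```shell
# Write a minimal single-widget dashboard definition (illustrative).
cat > dashboard.json <<'EOF'
{
  "displayName": "Example Dashboard (from JSON)",
  "gridLayout": {
    "widgets": [
      {
        "title": "CPU Usage",
        "xyChart": {
          "dataSets": [
            {
              "timeSeriesQuery": {
                "timeSeriesFilter": {
                  "filter": "metric.type=\"compute.googleapis.com/instance/cpu/usage_time\" resource.type=\"gce_instance\""
                }
              }
            }
          ]
        }
      }
    ]
  }
}
EOF

# Create the dashboard from the definition file (requires an authenticated gcloud).
gcloud monitoring dashboards create --config-from-file=dashboard.json
```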

Congratulations!

In this lab, you learned how to do the following:

  • View logs using a variety of filtering mechanisms.

  • Exclude log entries and disable log ingestion.

  • Export logs and run reports against exported logs.

  • Create and report on logging metrics.

  • Use Cloud Monitoring to monitor different Google Cloud projects.

  • Create a metrics dashboard.

End your lab

When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.

You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.

The number of stars indicates the following:

  • 1 star = Very dissatisfied
  • 2 stars = Dissatisfied
  • 3 stars = Neutral
  • 4 stars = Satisfied
  • 5 stars = Very satisfied

You can close the dialog box if you don't want to provide feedback.

For feedback, suggestions, or corrections, please use the Support tab.

Copyright 2025 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
