Reviews · Introduction to Cloud Composer 3

22,608 reviews

Alvindra R. · Reviewed 12 months ago

Michael G. · Reviewed 12 months ago

Arindam R. · Reviewed 12 months ago

Bacterie M. · Reviewed 12 months ago

Sreejith R. · Reviewed 12 months ago

Subham J. · Reviewed 12 months ago

Khôi P. · Reviewed 12 months ago

Robert Leandro R. · Reviewed 12 months ago

Kiran B. · Reviewed 12 months ago

Lucas N. · Reviewed 12 months ago

Good

Tahir P. · Reviewed 12 months ago

Revanth kumar G. · Reviewed 12 months ago

Arindam R. · Reviewed 12 months ago

Giacomo G. · Reviewed 12 months ago

Raghunandan P. · Reviewed 12 months ago

Rebeka S. · Reviewed 12 months ago

livin v. · Reviewed 12 months ago

Veena B. · Reviewed 12 months ago

default-hostname *** Reading remote logs from Cloud Logging.
[2024-09-12, 06:04:41 UTC] {local_task_job_runner.py:120} ▶ Pre task execution logs
[2024-09-12, 06:04:41 UTC] {dataproc.py:784} INFO - Creating cluster: composer-hadoop-tutorial-cluster-20240911
[2024-09-12, 06:04:41 UTC] {connection.py:274} WARNING - Connection schemes (type: google_cloud_platform) shall not contain '_' according to RFC3986.
[2024-09-12, 06:04:41 UTC] {base.py:84} INFO - Using connection ID 'google_cloud_default' for task execution.
[2024-09-12, 06:04:41 UTC] {credentials_provider.py:402} INFO - Getting connection using `google.auth.default()` since no explicit credentials are provided.
[2024-09-12, 06:08:47 UTC] {taskinstance.py:441} ▼ Post task execution logs
[2024-09-12, 06:08:47 UTC] {taskinstance.py:2907} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 76, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/grpc/_channel.py", line 1181, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/opt/python3.11/lib/python3.11/site-packages/grpc/_channel.py", line 1006, in _end_unary_response_blocking
    raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
  status = StatusCode.UNAVAILABLE
  details = "failed to connect to all addresses; last error: UNAVAILABLE: ipv4:199.36.153.9:443: Socket closed"
  debug_error_string = "UNKNOWN:Error received from peer {created_time:"2024-09-12T06:08:47.042203565+00:00", grpc_status:14, grpc_message:"failed to connect to all addresses; last error: UNAVAILABLE: ipv4:199.36.153.9:443: Socket closed"}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_core/retry/retry_unary.py", line 144, in retry_target
    result = target()
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_core/timeout.py", line 120, in func_with_timeout
    return func(*args, **kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
    raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.ServiceUnavailable: 503 failed to connect to all addresses; last error: UNAVAILABLE: ipv4:199.36.153.9:443: Socket closed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 465, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/taskinstance.py", line 432, in _execute_callable
    return execute_callable(context=context, **execute_callable_kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/baseoperator.py", line 400, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/providers/google/cloud/operators/dataproc.py", line 800, in execute
    operation = self._create_cluster(hook)
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/providers/google/cloud/operators/dataproc.py", line 700, in _create_cluster
    return hook.create_cluster(
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/providers/google/common/hooks/base_google.py", line 559, in inner_wrapper
    return func(self, *args, **kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/providers/google/cloud/hooks/dataproc.py", line 343, in create_cluster
    result = client.create_cluster(
  File "/opt/python3.11/lib/python3.11/site-packages/google/cloud/dataproc_v1/services/cluster_controller/client.py", line 872, in create_cluster
    response = rpc(
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
    return wrapped_func(*args, **kwargs)
  File "/opt/python3.11/lib/python3.11/site-packages/google/api_
[2024-09-12, 06:08:47 UTC] {taskinstance.py:1206} INFO - Marking task as UP_FOR_RETRY. dag_id=composer_hadoop_tutorial, task_id=create_dataproc_cluster, run_id=scheduled__2024-09-11T00:00:00+00:00, execution_date=20240911T000000, start_date=20240912T060441, end_date=20240912T060847
[2024-09-12, 06:08:47 UTC] {standard_task_runner.py:110} ERROR - Failed to execute job 6 for task create_dataproc_cluster (Timeout of 300.0s exceeded, last exception: 503 failed to connect to all addresses; last error: UNAVAILABLE: ipv4:199.36.153.9:443: Socket closed; 5755)
[2024-09-12, 06:08:47 UTC] {local_task_job_runner.py:240} INFO - Task exited with return code 1
[2024-09-12, 06:08:47 UTC] {warnings.py:110} WARNING - /opt/python3.11/lib/python3.11/site-packages/airflow/models/baseoperator.py:1308: AirflowProviderDeprecationWarning: Call to deprecated class DataprocSubmitHadoopJobOperator. (Please use `DataprocSubmitJobOperator` instead. You can use `generate_job` method to generate dictionary representing your job and use it with the new operator.)
  result = cls.__new__(cls)
[2024-09-12, 06:08:47 UTC] {taskinstance.py:3501} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2024-09-12, 06:08:47 UTC] {local_task_job_runner.py:222} ▲▲▲ Log group end
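The AirflowProviderDeprecationWarning near the end of this log suggests a migration from `DataprocSubmitHadoopJobOperator` to `DataprocSubmitJobOperator`, which takes the whole Dataproc job specification as a plain dict. A minimal sketch of what that dict could look like for the tutorial's wordcount job; the jar path and GCS paths below are illustrative placeholders, not values taken from the log:

```python
def build_hadoop_job(cluster_name, main_jar_uri, args):
    """Build the `job` dict to pass as DataprocSubmitJobOperator(job=...)."""
    return {
        "placement": {"cluster_name": cluster_name},
        "hadoop_job": {
            "main_jar_file_uri": main_jar_uri,
            "args": list(args),
        },
    }


# Cluster name taken from the log above; jar and bucket paths are placeholders.
job = build_hadoop_job(
    cluster_name="composer-hadoop-tutorial-cluster-20240911",
    main_jar_uri="file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar",
    args=["wordcount", "gs://example-input/rose.txt", "gs://example-bucket/wordcount"],
)
```

In the DAG, this dict would then be passed along with `region` and `project_id`, roughly `DataprocSubmitJobOperator(task_id="run_dataproc_hadoop", job=job, ...)`; check the Google provider documentation for the exact signature of the Airflow version in use.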

Soyoung L. · Reviewed 12 months ago

Withet P. · Reviewed 12 months ago

Broken DAG (dags/hadoop_tutorial.py): Traceback (most recent call last):
  File "/home/airflow/gcs/dags/hadoop_tutorial.py", line 29, in <module>
    models.Variable.get('gcs_bucket'), 'wordcount',
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/models/variable.py", line 143, in get
    raise KeyError(f"Variable {key} does not exist")
KeyError: 'Variable gcs_bucket does not exist'
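This "Broken DAG" error means the DAG file calls `Variable.get('gcs_bucket')` at parse time, before that Airflow Variable exists. One way out is to define the variables before the scheduler parses the DAG, for example via a JSON file loadable with `airflow variables import`. A sketch under assumptions: besides `gcs_bucket`, the `gcp_project` and `gce_zone` keys and all values below are placeholders, not taken from the error:

```python
import json

# Hypothetical sketch: write the Airflow Variables the tutorial DAG reads
# to a JSON file, so Variable.get('gcs_bucket') stops raising KeyError
# once the file is imported. All values are placeholders.
variables = {
    "gcs_bucket": "gs://example-composer-results-bucket",
    "gcp_project": "example-project-id",
    "gce_zone": "us-central1-a",
}

with open("variables.json", "w") as f:
    json.dump(variables, f, indent=2)
```

The file can be loaded with `airflow variables import variables.json`, or in Composer each key can be set through `gcloud composer environments run` (the exact `variables` subcommand syntax varies across Airflow versions). Alternatively, `Variable.get("gcs_bucket", default_var=None)` keeps the DAG parseable when the variable is missing, at the cost of deferring the failure.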

Matthew B. · Reviewed 12 months ago

Federico Javier C. · Reviewed 12 months ago

Google does not guarantee that published reviews were written by consumers who purchased or used the product. Reviews are not verified by Google.