Reviews of Serverless Data Processing with Dataflow: Writing an ETL Pipeline with Apache Beam and Dataflow (Python)

11,486 reviews

Unsure how to resolve this error when running the pipeline: WARNING:google.auth.compute_engine._metadata:Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: http.client transport only supports the http scheme, https is specified

Ivan L. · Reviewed 5 days ago

The lab fails to provision workers for the Dataflow jobs, even when the machine types are stated explicitly.

Iván Marcelo U. · Reviewed 5 days ago

Yan H. · Reviewed 5 days ago

Svetak S. · Reviewed 6 days ago

Francisco M. · Reviewed 6 days ago

Leonardo M. · Reviewed 6 days ago

Issue with the project id. There is no parent organization to select.

Vanitha H. · Reviewed 7 days ago

Mediocre.

Kasraoui W. · Reviewed 7 days ago

Igor d. · Reviewed 7 days ago

Ana N. · Reviewed 8 days ago

Having an error on the "Run your pipeline" task, even when using the solution code provided in the lab:

Traceback (most recent call last):
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/my_pipeline.py", line 108, in <module>
    run()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/my_pipeline.py", line 92, in run
    | 'ReadFromGCS' >> beam.io.ReadFromText(input)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 808, in __init__
    self._source = self._source_class(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 144, in __init__
    super().__init__(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 127, in __init__
    self._validate()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/options/value_provider.py", line 193, in _f
    return fnc(self, *args, **kwargs)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 190, in _validate
    match_result = FileSystems.match([pattern], limits=[1])[0]
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystems.py", line 240, in match
    return filesystem.match(patterns, limits)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystem.py", line 779, in match
    raise BeamIOError("Match operation failed", exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'gs://qwiklabs-gcp-02-c763640af21b/events.json': RefreshError(TransportError("Failed to retrieve https://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable. Last exception: HTTPSConnectionPool(host='metadata.google.internal', port=443): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)')))"))}

yumeng y. · Reviewed 8 days ago

The proper folders were not cloned for all the tasks.

Shaik S. · Reviewed 8 days ago

Sriyansh S. · Reviewed 8 days ago

Sriyansh S. · Reviewed 8 days ago

Luis Antonio C. · Reviewed 8 days ago

Jyoti S. · Reviewed 8 days ago

Luis Antonio C. · Reviewed 8 days ago

Luis Antonio C. · Reviewed 8 days ago

Could not get the pipeline to run due to multiple errors; even the solution code would not run. Very disappointing to be kicked out of the lab and lose progress before being able to troubleshoot. The errors were around: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017) on /computeMetadata/v1/instance/service-accounts/default/?recursive=true, and "Compute Engine Metadata server unavailable".

Mariette D. · Reviewed 9 days ago
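The CERTIFICATE_VERIFY_FAILED failures reported in several reviews here line up with a fix summary posted in a later review on this page, which attributes them to google-auth 2.44.0+ changing the metadata-server transport. A minimal sketch of that workaround follows; the GCE_METADATA_MTLS_MODE variable and its effect are taken from that review, not independently verified:

```shell
# Disable the mTLS metadata transport before launching the pipeline,
# as suggested in the reviewer's fix summary (not independently verified).
export GCE_METADATA_MTLS_MODE=none

# Confirm the variable is visible to child processes such as python3,
# which is where the google-auth metadata lookup runs.
python3 -c "import os; print(os.environ['GCE_METADATA_MTLS_MODE'])"
```

The export must happen in the same shell session that launches my_pipeline.py, or the Python process will not see it.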

Too complex; needs more direction on what to do.

Oscar O. · Reviewed 10 days ago

Daniela L. · Reviewed 10 days ago

Problem with the infrastructure (lack of space).

Gabriela C. · Reviewed 10 days ago

Luis Antonio C. · Reviewed 11 days ago

Could not complete Part 1, Task 6 ("Run the pipeline"), even using the provided solution code, due to the following error:

raise BeamIOError("Match operation failed", exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'gs://qwiklabs-gcp-00-f5855126f119/events.json': RefreshError(TransportError("Failed to retrieve https://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable. Last exception: HTTPSConnectionPool(host='metadata.google.internal', port=443): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)')))"))}

Lingmin M. · Reviewed 11 days ago

Qwiklabs Dataflow Lab Fix Summary

Problems:
- SSL/metadata auth failure: google-auth 2.44.0+ broke the default HTTP transport, causing CERTIFICATE_VERIFY_FAILED errors when the notebook tried to reach GCP's metadata server.
- Zone resource exhaustion: n1-standard-1 (Dataflow's default) was completely unavailable across all us-central1 zones for Qwiklabs accounts.
- Pipeline exits early: p.run() doesn't wait for completion, causing silent failures.
- argparse rejects extra flags: parse_args() blocks passing extra Beam arguments like --worker_machine_type.

Fixes:

1. Fix the SSL auth error:

export GCE_METADATA_MTLS_MODE=none

2. Use the e2-standard-2 machine type. Add to your run command:

--worker_machine_type=e2-standard-2

3. Fix the pipeline code in my_pipeline.py:

# Change this:
opts = parser.parse_args()
options = PipelineOptions()
p.run()

# To this:
opts, pipeline_args = parser.parse_known_args()
options = PipelineOptions(pipeline_args)
p.run().wait_until_finish()

4. Full working command:

cd $BASE_DIR
export PROJECT_ID=$(gcloud config get-value project)
export GCE_METADATA_MTLS_MODE=none
python3 my_pipeline.py \
  --project=${PROJECT_ID} \
  --region=us-central1 \
  --stagingLocation=gs://$PROJECT_ID/staging/ \
  --tempLocation=gs://$PROJECT_ID/temp/ \
  --runner=DataflowRunner \
  --worker_machine_type=e2-standard-2

5. For the Dataflow Template UI: under Optional Parameters, uncheck "Use default machine type", then set Series: E2 and Machine type: e2-standard-2.

David O. · Reviewed 11 days ago
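The argparse change in the fix summary above can be illustrated with a standard-library-only sketch: parse_known_args() returns the options the script defines plus a list of leftover flags (such as --worker_machine_type) that can then be handed to Beam's PipelineOptions, whereas parse_args() would reject them outright. The flag names here mirror the lab command; only --input is assumed to be a script-defined option.

```python
import argparse

# Parser for the options the lab script itself defines
# (--input is assumed here for illustration).
parser = argparse.ArgumentParser()
parser.add_argument('--input', default='gs://bucket/events.json')

# A command line mixing a script option with Beam-only flags.
argv = [
    '--input', 'gs://my-bucket/events.json',
    '--runner', 'DataflowRunner',
    '--worker_machine_type', 'e2-standard-2',
]

# parse_args(argv) would exit with "unrecognized arguments" on the Beam
# flags; parse_known_args(argv) returns them as leftovers instead, so
# they can be forwarded to PipelineOptions(pipeline_args).
opts, pipeline_args = parser.parse_known_args(argv)

print(opts.input)       # the flag the script knows about
print(pipeline_args)    # leftover flags for PipelineOptions
```

This is why step 3 above also changes PipelineOptions() to PipelineOptions(pipeline_args): without forwarding the leftovers, flags like --worker_machine_type never reach the Dataflow runner.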

We do not guarantee that published reviews come from consumers who have purchased or used the products. Google does not verify reviews.