Serverless Data Processing with Dataflow - Writing an ETL Pipeline using Apache Beam and Dataflow (Python) Reviews
11486 reviews
Unsure how to resolve this error when running the pipeline: WARNING:google.auth.compute_engine._metadata:Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: http.client transport only supports the http scheme, https is specified
Ivan L. · Reviewed 5 days ago
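If this warning escalates into the CERTIFICATE_VERIFY_FAILED failures reported elsewhere on this page, the fix summary further down blames the HTTPS transport in google-auth 2.44.0+ and suggests forcing the metadata client back to plain HTTP. A minimal sketch, assuming that same root cause applies here:

```bash
# Workaround taken from the fix summary below: disable mTLS mode for the
# metadata client so it falls back to the http scheme the transport supports.
export GCE_METADATA_MTLS_MODE=none
```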
The lab fails to provision workers for the Dataflow jobs, even when the machine type is stated explicitly.
Iván Marcelo U. · Reviewed 5 days ago
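For the provisioning failure described above: if the default n1-standard-1 workers are exhausted in us-central1 (as the fix summary below reports), asking Dataflow for a different machine type at submit time may let workers schedule. A sketch, assuming my_pipeline.py forwards extra flags to PipelineOptions (see the parse_known_args fix in the summary below):

```bash
# Request e2-standard-2 workers instead of the exhausted n1-standard-1
# default; project, staging, and temp flags omitted for brevity.
python3 my_pipeline.py \
  --runner=DataflowRunner \
  --region=us-central1 \
  --worker_machine_type=e2-standard-2
```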
Yan H. · Reviewed 5 days ago
Svetak S. · Reviewed 5 days ago
Francisco M. · Reviewed 6 days ago
Leonardo M. · Reviewed 6 days ago
Issue with the project ID: there is no parent organization to select.
Vanitha H. · Reviewed 7 days ago
Mediocre.
Kasraoui W. · Reviewed 7 days ago
Igor d. · Reviewed 7 days ago
Ana N. · Reviewed 7 days ago
Getting an error on the "Run your pipeline" task, even when using the solution code provided in the lab:

Traceback (most recent call last):
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/my_pipeline.py", line 108, in <module>
    run()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/my_pipeline.py", line 92, in run
    | 'ReadFromGCS' >> beam.io.ReadFromText(input)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 808, in __init__
    self._source = self._source_class(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 144, in __init__
    super().__init__(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 127, in __init__
    self._validate()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/options/value_provider.py", line 193, in _f
    return fnc(self, *args, **kwargs)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 190, in _validate
    match_result = FileSystems.match([pattern], limits=[1])[0]
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystems.py", line 240, in match
    return filesystem.match(patterns, limits)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/1_Basic_ETL/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystem.py", line 779, in match
    raise BeamIOError("Match operation failed", exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'gs://qwiklabs-gcp-02-c763640af21b/events.json': RefreshError(TransportError("Failed to retrieve https://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable. Last exception: HTTPSConnectionPool(host='metadata.google.internal', port=443): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)')))"))}
yumeng y. · Reviewed 7 days ago
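The BeamIOError above is raised before the job is even submitted: ReadFromText validates the gs://.../events.json pattern at pipeline construction time, which triggers a credential refresh against the metadata server, and that refresh is what fails with the SSL error. Two standard diagnostic commands (plain gcloud/gsutil tooling, not part of the lab) can confirm whether credentials and the input file are reachable at all:

```bash
# Check which account the notebook environment is authenticated as.
gcloud auth list

# Confirm the input file exists in the lab bucket (the bucket name is
# the per-lab project ID, so yours will differ from the one above).
gsutil ls gs://$(gcloud config get-value project)/events.json
```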
The proper folders were not cloned for all the tasks.
Shaik S. · Reviewed 8 days ago
Sriyansh S. · Reviewed 8 days ago
Sriyansh S. · Reviewed 8 days ago
Luis Antonio C. · Reviewed 8 days ago
Jyoti S. · Reviewed 8 days ago
Luis Antonio C. · Reviewed 8 days ago
Luis Antonio C. · Reviewed 8 days ago
Could not get the pipeline to run due to multiple errors; even the solution code would not run. Very disappointing to be kicked out of the lab and lose progress before being able to troubleshoot. The errors centered on: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017) for /computeMetadata/v1/instance/service-accounts/default/?recursive=true, and "Compute Engine Metadata server unavailable".
Mariette D. · Reviewed 9 days ago
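If the google-auth 2.44.0+ transport regression blamed in the fix summary below is what breaks the metadata lookup here, pinning the library to an earlier release inside the lab's virtualenv is another possible workaround; the version bound is an assumption taken from that summary, not verified independently:

```bash
# Pin google-auth below the release the fix summary identifies as the
# start of the broken HTTPS metadata transport (assumed cutoff: 2.44.0).
pip install 'google-auth<2.44.0'
```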
Too complex; needs more direction on what to do.
Oscar O. · Reviewed 10 days ago
Daniela L. · Reviewed 10 days ago
Problem with the infrastructure (lack of space).
Gabriela C. · Reviewed 10 days ago
Luis Antonio C. · Reviewed 11 days ago
Could not complete Part 1, Task 6 (run the pipeline), even using the provided solution code, due to the following error:

    raise BeamIOError("Match operation failed", exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'gs://qwiklabs-gcp-00-f5855126f119/events.json': RefreshError(TransportError("Failed to retrieve https://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable. Last exception: HTTPSConnectionPool(host='metadata.google.internal', port=443): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)')))"))}
Lingmin M. · Reviewed 11 days ago
Qwiklabs Dataflow Lab Fix Summary

Problems:
- SSL/Metadata auth failure: google-auth 2.44.0+ broke the default HTTP transport, causing CERTIFICATE_VERIFY_FAILED errors when the notebook tried to reach GCP's metadata server.
- Zone resource exhaustion: n1-standard-1 (Dataflow's default) was completely unavailable across all us-central1 zones for Qwiklabs accounts.
- Pipeline exits early: p.run() doesn't wait for completion, causing silent failures.
- argparse rejects extra flags: parse_args() blocks passing extra Beam arguments like --worker_machine_type.

Fixes:

1. Fix the SSL auth error:

```bash
export GCE_METADATA_MTLS_MODE=none
```

2. Use the e2-standard-2 machine type. Add to your run command:

```bash
--worker_machine_type=e2-standard-2
```

3. Fix the pipeline code in my_pipeline.py:

```python
# Change this:
opts = parser.parse_args()
options = PipelineOptions()
p.run()

# To this:
opts, pipeline_args = parser.parse_known_args()
options = PipelineOptions(pipeline_args)
p.run().wait_until_finish()
```

4. Full working command:

```bash
cd $BASE_DIR
export PROJECT_ID=$(gcloud config get-value project)
export GCE_METADATA_MTLS_MODE=none
python3 my_pipeline.py \
  --project=${PROJECT_ID} \
  --region=us-central1 \
  --stagingLocation=gs://$PROJECT_ID/staging/ \
  --tempLocation=gs://$PROJECT_ID/temp/ \
  --runner=DataflowRunner \
  --worker_machine_type=e2-standard-2
```

5. For the Dataflow Template UI: under Optional Parameters, uncheck "Use default machine type", then set Series to E2 and Machine type to e2-standard-2.
David O. · Reviewed 11 days ago
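To confirm the fixed pipeline actually ran to completion on Dataflow (rather than exiting early, as the summary above warns a bare p.run() can), the standard gcloud CLI can list recent jobs and their states:

```bash
# List recent Dataflow jobs in the lab region along with their states.
gcloud dataflow jobs list --region=us-central1 --status=all
```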