Serverless Data Processing with Dataflow - Batch Analytics Pipelines with Dataflow (Python) Reviews

6755 reviews

1. To get past the permission issue, I ran the command from Cloud Shell. 2. After Task 4, "my progress" still isn't green, despite the fact that everything finished successfully.

Michael G. · Reviewed about 4 years ago

It doesn't work. I follow the steps and it doesn't assign the Dataflow Worker role. I assigned the role manually, and now it has it; the script that runs the .sh files to create the bucket executes, and it doesn't work. Following the steps keeps producing lots of errors. The user traffic table was created in BigQuery, but it reports that step as unfinished without indicating what is missing. The steps were followed; the Dataflow Worker role was assigned manually because following the steps produces an error. Terrible labs; I would give them zero stars, but the tool doesn't allow it.

Leoncio O. · Reviewed about 4 years ago

It doesn't work. I follow the steps and it doesn't assign the Dataflow Worker role. I assigned the role manually, and now it has it; the script that runs the .sh files to create the bucket executes, and it doesn't work. Following the steps keeps producing lots of errors.

Leoncio O. · Reviewed about 4 years ago

The lab is a bit unclear about which commands should be run in the IDE versus Cloud Shell. In particular, in step 4 of Part 1, Task 1, we are asked to run the command below:

gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:${serviceAccount}" --role="roles/dataflow.worker"

However, this fails if run in the IDE, since the service account doesn't have the necessary permissions. I was able to work around this by running the command in Cloud Shell instead, but it would be good to update the lab text to make this requirement clear.

Robert L. · Reviewed about 4 years ago
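For readers hitting the same permission failure, the workaround described in the review above can be sketched as shell commands run in Cloud Shell (where you are authenticated as the student account) rather than in the lab IDE. The PROJECT_ID and serviceAccount values below are placeholders; the lab defines the real ones.

```shell
# Run in Cloud Shell, not the IDE terminal: the IDE runs as the lab's
# service account, which cannot modify the project's IAM policy.
# Placeholder values; substitute the ones defined earlier in the lab.
PROJECT_ID="qwiklabs-gcp-example"
serviceAccount="${PROJECT_ID}@${PROJECT_ID}.iam.gserviceaccount.com"

# Only attempt the binding when the gcloud CLI is available.
if command -v gcloud >/dev/null 2>&1; then
  # Grant the Dataflow Worker role to the service account, as the lab asks.
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${serviceAccount}" \
    --role="roles/dataflow.worker"
fi
```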

Eber D. · Reviewed about 4 years ago

Wasn't able to complete Task 3 as my Python class didn't match the JSON schema. Errors from Dataflow were hard to understand. Not really an issue with the lab, just a general Dataflow problem.

Robert L. · Reviewed about 4 years ago

It doesn't work. I follow the steps and it doesn't assign the Dataflow Worker role. I assigned the role manually, and now it has it; the script that runs the .sh files to create the bucket executes, and it doesn't work.

Leoncio O. · Reviewed about 4 years ago

Divya N. · Reviewed about 4 years ago

Yves E. · Reviewed about 4 years ago

Sascha D. · Reviewed about 4 years ago

It doesn't work. I follow the steps and it doesn't assign the Dataflow Worker role.

Leoncio O. · Reviewed about 4 years ago

The task doesn't match the solution. I have raised this as an error.

Tristan T. · Reviewed about 4 years ago

H F. · Reviewed about 4 years ago

Jean-Edouard R. · Reviewed about 4 years ago

Mihir D. · Reviewed about 4 years ago

Javier M. · Reviewed about 4 years ago

The Python version (3.9) installed on the Cloud Shell machine doesn't support the Apache Beam Python SDK.

Brayan Jair S. · Reviewed about 4 years ago

Yi D. · Reviewed about 4 years ago

Felix W. · Reviewed about 4 years ago

Alceu D. · Reviewed about 4 years ago

The Apache Beam labs are valid for Python 3.7, but Google Cloud's default shell starts with Python 3.9, so most of the Python files have to be amended to use an experiment option!

Krishna V. · Reviewed about 4 years ago
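Several reviews above describe a mismatch between Cloud Shell's Python 3.9 and the Python version the lab's Beam SDK pin supports. A minimal pre-flight check like the sketch below can surface that mismatch before a pipeline run fails with a harder-to-read error; the 3.7 requirement is taken from the reviews, and REQUIRED should be adjusted to whatever the lab's SDK version actually supports.

```python
import sys

# Assumed from the reviews: the lab's pinned Beam SDK targets Python 3.7.
REQUIRED = (3, 7)

def check_python(version_info=sys.version_info):
    """Return True when the interpreter's major.minor matches REQUIRED."""
    return tuple(version_info[:2]) == REQUIRED

if not check_python():
    # Warn early instead of letting the pipeline fail mid-run.
    print("Warning: this interpreter is Python %d.%d; the lab's Beam SDK "
          "targets %d.%d." % (sys.version_info[0], sys.version_info[1],
                              REQUIRED[0], REQUIRED[1]))
```

Running this at the top of each lab script (or once in Cloud Shell) makes the version conflict visible up front, rather than after Dataflow job submission.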

Priya Y. · Reviewed about 4 years ago

We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.