Serverless Data Processing with Dataflow: Building an ETL Pipeline Using Apache Beam and Dataflow (Python) reviews
11,245 reviews
achraf e. · Reviewed almost 2 years ago
Afonso P. · Reviewed almost 2 years ago
DJIBRILLA B. · Reviewed almost 2 years ago
Akhil N. · Reviewed almost 2 years ago
Unable to continue: the pipeline.py file is not accessible (403 Forbidden error).
Meenakshi M. · Reviewed almost 2 years ago
Alex N. · Reviewed almost 2 years ago
Alex N. · Reviewed almost 2 years ago
Lab instructions are unclear and really confusing. For example, while editing the .py files, they do not really explain where you have to add the transformations and so on.
Saravanakumar N. · Reviewed almost 2 years ago
Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED: Instance 'my-pipeline-1713806604483-04221023-0l4b-harness-jjft' creation failed: The zone 'projects/qwiklabs-gcp-01-e378a7e6bf47/zones/us-central1-a' does not have enough resources available to fulfill the request. Try a different zone, or try again later.
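As the quoted ZONE_RESOURCE_POOL_EXHAUSTED message suggests, one workaround is to launch the job in a different zone. A minimal sketch of assembling Dataflow launch flags with an alternate worker zone; the helper function, project ID, and zone choice are illustrative assumptions, not part of the lab:

```python
# Hypothetical helper (not from the lab): build the flag list for a
# Dataflow job, pinning worker VMs to a zone other than us-central1-a.
def dataflow_args(project, region="us-central1", worker_zone="us-central1-b"):
    return [
        "--runner=DataflowRunner",
        f"--project={project}",
        f"--region={region}",
        # Apache Beam's --worker_zone option controls where worker VMs start
        f"--worker_zone={worker_zone}",
    ]

args = dataflow_args("my-project")
```

These flags would then be passed to the pipeline script (e.g. `python3 my_pipeline.py` followed by the list above).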
William M. · Reviewed almost 2 years ago
William M. · Reviewed almost 2 years ago
No Dataflow workers available in us-central1-a, and no way to use a different zone. It worked earlier in the US day (UK).
Will N. · Reviewed almost 2 years ago
shehran s. · Reviewed almost 2 years ago
Nishana H. · Reviewed almost 2 years ago
ok
john m. · Reviewed almost 2 years ago
angel v. · Reviewed almost 2 years ago
Jan Š. · Reviewed almost 2 years ago
Amit G. · Reviewed almost 2 years ago
Shawn B. · Reviewed almost 2 years ago
Too much to accomplish in one lab. Please break it up into small sections. Also, please have a TRAINING PROFESSIONAL write the steps; there is TOO MUCH OVER-EXPLAINING!
Carl N. · Reviewed almost 2 years ago
Ông T. · Reviewed almost 2 years ago
Ivan S. · Reviewed almost 2 years ago
Poor: the generated event CSV file does not have the correct encoding, so the bucket contents cannot be JSON-parsed. In user_generator.py, change open("users_bq.txt", 'w') to open("users_bq.txt", 'w', encoding='utf-8') to fix it. The encoding parameter probably needs to be added in other places as well.
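The fix this reviewer describes, passing an explicit encoding to open(), can be sketched as follows. The record contents below are illustrative assumptions, not the lab's actual data:

```python
import json

# Illustrative record with a non-ASCII character, the kind that breaks
# when the file is written with a platform-default encoding (e.g. cp1252).
records = [{"name": "José Müller"}]

# Explicit UTF-8, as the reviewer suggests, makes the file readable as JSON.
with open("users_bq.txt", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec, ensure_ascii=False) + "\n")

# Reading back with the same encoding parses cleanly.
with open("users_bq.txt", encoding="utf-8") as f:
    parsed = [json.loads(line) for line in f]
```

Without encoding="utf-8", the write may succeed on one platform and produce bytes that a UTF-8 JSON parser rejects on another, which matches the symptom the reviewer reports.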
Nils D. · Reviewed almost 2 years ago
chirag l. · Reviewed almost 2 years ago
Safeer K. · Reviewed almost 2 years ago
Rafael T. · Reviewed almost 2 years ago
We do not guarantee that published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.