Serverless Data Processing with Dataflow: Creating an ETL Pipeline Using Apache Beam and Dataflow (Python) reviews
10,871 reviews
Martin T. · Reviewed over 1 year ago
achraf e. · Reviewed over 1 year ago
Afonso P. · Reviewed over 1 year ago
DJIBRILLA B. · Reviewed over 1 year ago
Akhil N. · Reviewed over 1 year ago
Unable to continue, as the pipeline.py file is not accessible (403 Forbidden error).
Meenakshi M. · Reviewed over 1 year ago
Alex N. · Reviewed over 1 year ago
Alex N. · Reviewed over 1 year ago
Lab instructions are not clear and really confusing. For example, while editing the .py files, it is not really explained where you have to add the transformations, and so on.
Saravanakumar N. · Reviewed over 1 year ago
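For anyone unsure where the transformations belong: below is a minimal sketch of the kind of pipeline.py file the lab has you edit, assuming a read-from-GCS / transform / write-to-BigQuery shape. The bucket, table, and field names are placeholders, not the lab's actual values; the point is only that the transforms the instructions ask you to add sit between the read step and the write step.

    # Minimal Apache Beam ETL sketch (placeholders, not the lab's exact code).
    import argparse
    import json
    import logging

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_json(line):
        # One "transformation": turn a JSON string into a dict (one BigQuery row).
        return json.loads(line)


    def run():
        parser = argparse.ArgumentParser()
        parser.add_argument('--input', default='gs://my-bucket/events.json')  # placeholder path
        parser.add_argument('--table', default='my-project:logs.logs')        # placeholder table
        known_args, pipeline_args = parser.parse_known_args()

        with beam.Pipeline(options=PipelineOptions(pipeline_args)) as p:
            (p
             | 'ReadFromGCS' >> beam.io.ReadFromText(known_args.input)
             # The transformations the lab asks you to add go here, between read and write:
             | 'ParseJson' >> beam.Map(parse_json)
             | 'WriteToBQ' >> beam.io.WriteToBigQuery(
                   known_args.table,
                   create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                   write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))


    if __name__ == '__main__':
        logging.basicConfig(level=logging.INFO)
        run()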
Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED: Instance 'my-pipeline-1713806604483-04221023-0l4b-harness-jjft' creation failed: The zone 'projects/qwiklabs-gcp-01-e378a7e6bf47/zones/us-central1-a' does not have enough resources available to fulfill the request. Try a different zone, or try again later.
William M. · Reviewed over 1 year ago
William M. · Reviewed over 1 year ago
No Dataflow workers available in us-central1-a, and no way to use a different zone. It worked earlier in the US day (UK).
Will N. · Reviewed over 1 year ago
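The two ZONE_RESOURCE_POOL_EXHAUSTED reports above mean that us-central1-a had no Compute Engine capacity when Dataflow tried to start its worker. When the project allows it, the job can usually be steered to another region or zone through the standard Dataflow pipeline options; a minimal sketch, with example region/zone values that you would replace with whatever your project permits:

    # Sketch: choosing where Dataflow workers start, via standard pipeline options.
    # Region and zone values below are examples only.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project-id',              # placeholder project ID
        region='us-west1',                    # pick a region with available capacity
        worker_zone='us-west1-b',             # optional: pin a specific zone
        temp_location='gs://my-bucket/tmp',   # placeholder staging bucket
    )
    # Equivalent command-line flags: --region=us-west1 --worker_zone=us-west1-b

Whether a different zone is actually usable depends on the lab project's quota and any region restrictions the lab applies, which is presumably why the reviewer above found no way to change it.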
shehran s. · Reviewed over 1 year ago
Nishana H. · Reviewed over 1 year ago
ok
john m. · Reviewed over 1 year ago
angel v. · Reviewed over 1 year ago
Jan Š. · Reviewed over 1 year ago
Amit G. · Reviewed over 1 year ago
Shawn B. · Reviewed over 1 year ago
Too much to accomplish in one lab. Please break it up into smaller sections. Also, please have a TRAINING PROFESSIONAL write the steps - TOO MUCH OVER-EXPLAINING!!!
Carl N. · Reviewed over 1 year ago
Ông T. · Reviewed over 1 year ago
Ivan S. · Reviewed over 1 year ago
Poor: the generated event CSV file does not have the correct encoding, so the files in the bucket cannot be JSON-parsed. In user_generator.py, change open("users_bq.txt", 'w') to open("users_bq.txt", 'w', encoding='utf-8') to fix it. The encoding parameter probably needs to be added in other places as well.
Nils D. · Reviewed over 1 year ago
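The fix described above is just passing an explicit encoding to Python's built-in open(), so the generated file is always UTF-8 regardless of the platform default. A minimal sketch of the edited write in user_generator.py; the surrounding lines are illustrative, and only the encoding='utf-8' argument is the reviewer's actual fix:

    # Sketch of the encoding fix suggested above for user_generator.py.
    import json

    users = [{"name": "José", "id": 1}]  # example record with non-ASCII characters

    # Before (platform-default encoding, which can break JSON parsing downstream):
    # with open("users_bq.txt", 'w') as f:

    # After: force UTF-8 so whatever reads the file can decode it consistently.
    with open("users_bq.txt", 'w', encoding='utf-8') as f:
        for user in users:
            f.write(json.dumps(user, ensure_ascii=False) + "\n")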
chirag l. · Reviewed over 1 year ago
Safeer K. · Reviewed over 1 year ago
We do not guarantee that the published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.