Serverless Data Analysis with Dataflow: Side Inputs (Python) reviews

41311 reviews

Walter W. · Reviewed over 2 years ago

Thamyaa A. · Reviewed over 2 years ago

Islam T. · Reviewed over 2 years ago

Xiao C. · Reviewed over 2 years ago

Shiva C. · Reviewed over 2 years ago

We keep getting this error:

JavaProjectsThatNeedHelp.py:163: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
  bigqcollection = p | 'ReadFromBQ' >> beam.io.Read(beam.io.BigQuerySource(project=project,query=get_java_query))
/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py:2485: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
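
For anyone seeing the same messages: these are deprecation warnings, not errors, and the pipeline should still run. If you want to silence the first warning, the read can be switched to the newer API. A minimal sketch, assuming placeholder values where the real `project`, `get_java_query`, and temp bucket come from the lab's JavaProjectsThatNeedHelp.py:

```python
import apache_beam as beam

# Placeholder values; the real project id and query live in the lab script
# and are not reproduced here.
project = 'your-project-id'                                        # hypothetical
get_java_query = 'SELECT content FROM [your_dataset.your_table]'   # hypothetical, legacy SQL

# ReadFromBigQuery exports via GCS, so a temp_location (or gcs_location) is needed.
with beam.Pipeline(argv=['--temp_location=gs://your-bucket/tmp/']) as p:  # hypothetical bucket
    # Deprecated form used in the lab script:
    #   beam.io.Read(beam.io.BigQuerySource(project=project, query=get_java_query))
    # Newer equivalent (Beam 2.25.0+). Both default to legacy SQL for query reads;
    # pass use_standard_sql=True only if the query is standard SQL.
    bigqcollection = p | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
        project=project, query=get_java_query)
```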

Waleed G. · Reviewed over 2 years ago

Jai C. · Reviewed over 2 years ago

Islam T. · Reviewed over 2 years ago

Tamas S. · Reviewed over 2 years ago

Andik A. · Reviewed over 2 years ago

Tamas S. · Reviewed over 2 years ago

Ashutosh D. · Reviewed over 2 years ago

Jagadeesh N. · Reviewed over 2 years ago

Matthieu C. · Reviewed over 2 years ago

Zana O. · Reviewed over 2 years ago

Allam V. · Reviewed over 2 years ago

Sudarsan S. · Reviewed over 2 years ago

Amine K. · Reviewed over 2 years ago

Ben S. · Reviewed over 2 years ago

OMKAR B. · Reviewed over 2 years ago

ok

Sovers S. · Reviewed over 2 years ago

gnanaarasan j. · Reviewed over 2 years ago

Executing the pipeline on the cloud (Task 4, step 4) often fails with a ZONE_RESOURCE_POOL_EXHAUSTED error, and the instructions don't account for this error.

The error can (probably) be worked around right away by changing the region/zone of the job: edit JavaProjectsThatNeedHelp.py, where the relevant parameter is `'--region=us-central1',` at around line 155, and if you want to pin the zone you can add another parameter beside it, `'--worker_zone=<zone>'` (the zone MUST be contained within the specified region; see the sketch below). However, changing to any region other than `us-central1` seems to prevent the lab from counting the objective as completed. Alternatively, you can just wait and try again later.

I really think this lab should check whether the pipeline has run successfully regardless of region, because an unlucky learner could hit the resource-pool-exhausted error several times in a row and potentially be locked out of the lab while trying to debug it. I ran the lab 3 times before succeeding in `us-central1`. People in other regions also seem to persistently hit a different error, `'us-central1' violates constraint 'constraints/gcp.resourceLocations'`. If the lab accounted for work being done in different regions (and included some guidance about these potential errors), both issues would be easy to resolve. Plenty of people are reporting the same problems in the reviews for this lab, in the Java version of the lab, in the Coursera forums for a course that uses this lab, and in a GitHub issue on the training-data-analyst repo.
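
For reference, a minimal sketch of what that edit might look like in the argv list the lab script builds. The flag values, bucket, job name, and example zone below are placeholders, not the lab's exact contents; only `--region` and `--worker_zone` are the flags discussed above:

```python
import apache_beam as beam

# Hypothetical excerpt modeled on the options list around line 155 of
# JavaProjectsThatNeedHelp.py; project/bucket/job names are placeholders.
argv = [
    '--project=your-project-id',                        # placeholder
    '--job_name=javahelpjob',                           # placeholder
    '--save_main_session',
    '--staging_location=gs://your-bucket/staging/',     # placeholder
    '--temp_location=gs://your-bucket/staging/',        # placeholder
    '--region=us-central1',        # switch region here if its resource pool is exhausted
    '--worker_zone=us-central1-f', # optional; must be a zone inside --region
    '--runner=DataflowRunner',
]
p = beam.Pipeline(argv=argv)
```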

Nicholas C. · Reviewed over 2 years ago

Maybe I did something wrong along the way, but even after clicking the final [Check my progress] button the task would not register as complete. The files had been created in the GCS bucket and were updated by both the local and cloud runs.
"error": { "code": 400, "message": "(5b4cea77a6d05a9d): 'us-central1' violates constraint 'constraints/gcp.resourceLocations' on the resource 'projects/qwiklabs-gcp-02-54459e55a7fd'.", "status": "FAILED_PRECONDITION"

Yuhei K. · Reviewed over 2 years ago

ayushi p. · Reviewed over 2 years ago

We do not guarantee that published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.