Serverless Data Analysis with Dataflow: Side Inputs (Python) Reviews

41311 reviews

Walter W. · Reviewed over 2 years ago

Thamyaa A. · Reviewed over 2 years ago

Islam T. · Reviewed over 2 years ago

Xiao C. · Reviewed over 2 years ago

Shiva C. · Reviewed over 2 years ago

We keep getting this error:

JavaProjectsThatNeedHelp.py:163: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
  bigqcollection = p | 'ReadFromBQ' >> beam.io.Read(beam.io.BigQuerySource(project=project, query=get_java_query))
/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py:2485: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported.
  temp_location = pcoll.pipeline.options.view_as(

Waleed G. · Reviewed over 2 years ago

Jai C. · Reviewed over 2 years ago

Islam T. · Reviewed over 2 years ago

Tamas S. · Reviewed over 2 years ago

Andik A. · Reviewed over 2 years ago

Tamas S. · Reviewed over 2 years ago

Ashutosh D. · Reviewed over 2 years ago

Jagadeesh N. · Reviewed over 2 years ago

Matthieu C. · Reviewed over 2 years ago

Zana O. · Reviewed over 2 years ago

Allam V. · Reviewed over 2 years ago

Sudarsan S. · Reviewed over 2 years ago

Amine K. · Reviewed over 2 years ago

Ben S. · Reviewed over 2 years ago

OMKAR B. · Reviewed over 2 years ago

ok

Sovers S. · Reviewed over 2 years ago

gnanaarasan j. · Reviewed over 2 years ago

Executing the pipeline on the cloud (Task 4, step 4) often fails with a ZONE_RESOURCE_POOL_EXHAUSTED error, and the instructions don't account for it. The error can (probably) be worked around right away by changing the region/zone of the job: edit JavaProjectsThatNeedHelp.py, where the relevant parameter is `'--region=us-central1',` at around line 155, and if you want to specify the zone you can add another parameter beside it called `'--worker_zone=<zone>'` (the zone MUST be contained within the specified region). However, changing to any region besides `us-central1` seems to prevent the lab from counting the objective as completed. Alternatively, you can just wait and try again another time.

I really think this lab should check whether the pipeline ran successfully regardless of region, because an unlucky learner could hit the resource-pool-exhausted error several times in a row and potentially be locked out of the lab while trying to debug it. I ran the lab 3 times before succeeding in `us-central1`.

It also seems that people in other regions are persistently hitting a different error: `'us-central1' violates constraint 'constraints/gcp.resourceLocations'`. If the lab accounted for work being done in different regions (and included some guidance about these potential errors), both issues would be easy to resolve. There are loads of people reporting the same issues in the reviews for this lab, the Java version of the lab, the Coursera forums for a Coursera course using this lab, and in a GitHub issue on the training-data-analyst repo.
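The workaround described above amounts to editing the pipeline's argument list. A hedged sketch of what the relevant entries might look like around line 155 of JavaProjectsThatNeedHelp.py (the project ID, job name, and surrounding arguments here are placeholders, not the lab's actual values):

```python
# Hypothetical excerpt of the Dataflow pipeline's command-line arguments.
# If us-central1 is resource-exhausted, change BOTH entries together:
# the worker zone MUST belong to the chosen region.
argv = [
    '--project=my-project-id',      # placeholder project ID
    '--job_name=examine-java',      # placeholder job name
    '--region=us-central1',         # try e.g. us-west1 on exhaustion
    '--worker_zone=us-central1-f',  # optional; must lie inside --region
    '--runner=DataflowRunner',
]

# Sanity check: a zone name is its region plus a letter suffix,
# so the zone string must start with the region string.
region = next(a.split('=', 1)[1] for a in argv if a.startswith('--region='))
zone = next(a.split('=', 1)[1] for a in argv if a.startswith('--worker_zone='))
assert zone.startswith(region + '-'), 'worker_zone must be in the chosen region'
```

Note that `--worker_zone` is the current name for the older deprecated `--zone` flag; if the zone is omitted, Dataflow picks one within the region.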

Nicholas C. · Reviewed over 2 years ago

Maybe I went about it the wrong way, but clicking [Check my progress] at the end never registered as complete. The files were created in the GCS bucket, and the pipeline ran both locally and on the cloud. "error": { "code": 400, "message": "(5b4cea77a6d05a9d): 'us-central1' violates constraint 'constraints/gcp.resourceLocations' on the resource 'projects/qwiklabs-gcp-02-54459e55a7fd'.", "status": "FAILED_PRECONDITION"

Yuhei K. · Reviewed over 2 years ago

ayushi p. · Reviewed over 2 years ago

We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.