Serverless Data Analysis with Dataflow: Side Inputs (Python) Reviews

41,311 reviews

Walter W. · Reviewed more than 2 years ago

Thamyaa A. · Reviewed more than 2 years ago

Islam T. · Reviewed more than 2 years ago

Xiao C. · Reviewed more than 2 years ago

Shiva C. · Reviewed more than 2 years ago

We keep getting this error:

JavaProjectsThatNeedHelp.py:163: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
bigqcollection = p | 'ReadFromBQ' >> beam.io.Read(beam.io.BigQuerySource(project=project,query=get_java_query))
/usr/local/lib/python3.9/dist-packages/apache_beam/io/gcp/bigquery.py:2485: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = pcoll.pipeline.options.view_as(
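[Editor's note] A minimal sketch of the migration the warning points to, assuming the read stays in JavaProjectsThatNeedHelp.py and the query is a legacy-SQL string; the project id and query below are hypothetical placeholders, not the lab's actual values.

```python
import apache_beam as beam

# Hypothetical placeholders standing in for the values the lab script constructs.
project = 'my-project-id'
get_java_query = 'SELECT content FROM [my-project-id:my_dataset.my_table]'

with beam.Pipeline() as p:
    # ReadFromBigQuery is the replacement the warning suggests for the
    # deprecated beam.io.Read(beam.io.BigQuerySource(...)) pattern.
    bigqcollection = (
        p
        | 'ReadFromBQ' >> beam.io.ReadFromBigQuery(
            project=project,
            query=get_java_query,
            use_standard_sql=False,  # assumption: the original query is legacy SQL
        )
    )
    # On Dataflow, ReadFromBigQuery also needs a GCS temp area, supplied either
    # via its gcs_location argument or the pipeline's --temp_location option.
```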

Waleed G. · Reviewed more than 2 years ago

Jai C. · Reviewed more than 2 years ago

Islam T. · Reviewed more than 2 years ago

Tamas S. · Reviewed more than 2 years ago

Andik A. · Reviewed more than 2 years ago

Tamas S. · Reviewed more than 2 years ago

Ashutosh D. · Reviewed more than 2 years ago

Jagadeesh N. · Reviewed more than 2 years ago

Matthieu C. · Reviewed more than 2 years ago

Zana O. · Reviewed more than 2 years ago

Allam V. · Reviewed more than 2 years ago

Sudarsan S. · Reviewed more than 2 years ago

Amine K. · Reviewed more than 2 years ago

Ben S. · Reviewed more than 2 years ago

OMKAR B. · Reviewed more than 2 years ago

ok

Sovers S. · Reviewed more than 2 years ago

gnanaarasan j. · Reviewed more than 2 years ago

Executing the pipeline on the cloud (Task 4, step 4) often fails with a ZONE_RESOURCE_POOL_EXHAUSTED error, and the instructions don't account for this. The error can (probably) be avoided right away by changing the region/zone of the job: in JavaProjectsThatNeedHelp.py the relevant parameter is `'--region=us-central1',` at around line 155, and if you want to pin the zone you can add another parameter alongside it, `'--worker_zone=<zone>'` (the zone must be contained within the specified region); see the sketch after this review. However, it seems that changing to any region besides `us-central1` prevents the lab from counting the objective as completed. Alternatively, you can just wait and try again later.

I really think this lab should check whether the pipeline has run successfully regardless of region, because an unlucky learner could hit the resource pool exhausted error several times in a row and potentially be locked out of the lab while trying to debug it. I ran the lab 3 times before succeeding in `us-central1`.

People in other regions also seem to be persistently hitting a different error, `'us-central1' violates constraint 'constraints/gcp.resourceLocations'`. If the lab accounted for work being done in different regions (and included some guidance about these potential errors), both issues would be easy to resolve. Plenty of people report the same problems in the reviews for this lab, the Java version of the lab, the Coursera forums for a course that uses this lab, and in a GitHub issue on the training-data-analyst repo.
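[Editor's note] A minimal sketch of the option change this review describes, assuming the lab script builds an argv-style option list for beam.Pipeline. The project id, job name, bucket, region, and zone below are placeholders, and the exact code around line 155 of JavaProjectsThatNeedHelp.py may differ.

```python
import apache_beam as beam

# Hypothetical option list mirroring the style the review describes;
# every value here is a placeholder to adapt to your own project and bucket.
argv = [
    '--project=my-project-id',
    '--job_name=javahelpjob',
    '--region=us-east1',                 # changed from us-central1
    '--worker_zone=us-east1-b',          # optional; must lie inside the chosen region
    '--temp_location=gs://my-bucket/tmp/',
    '--runner=DataflowRunner',
]

p = beam.Pipeline(argv=argv)
```

Note the review's caveat: at the time it was written, the progress check apparently only recognized jobs run in us-central1, so changing the region may avoid the resource error without satisfying the checker.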

Nicholas C. · Reviewed more than 2 years ago

Maybe I went about it the wrong way, but even after clicking the final [Check my progress] button the task never registered as complete. The files were created in the GCS bucket, and the updates ran both locally and in the cloud. "error": { "code": 400, "message": "(5b4cea77a6d05a9d): 'us-central1' violates constraint 'constraints/gcp.resourceLocations' on the resource 'projects/qwiklabs-gcp-02-54459e55a7fd'.", "status": "FAILED_PRECONDITION"

Yuhei K. · Reviewed more than 2 years ago

ayushi p. · Reviewed more than 2 years ago
