Migrating Apache Spark Jobs to Dataproc [PWDW] Reviews

3124 reviews

Roberto P. · Reviewed over 1 year ago

Rajat B. · Reviewed over 1 year ago

Deeksha T. · Reviewed over 1 year ago

Manisha S. · Reviewed over 1 year ago

mukul k. · Reviewed over 1 year ago

Augusto M. · Reviewed over 1 year ago

Bruno T. · Reviewed over 1 year ago

Jyotshna D. · Reviewed over 1 year ago

Sethupathi S. · Reviewed over 1 year ago

Jyotshna D. · Reviewed over 1 year ago

ashwini r. · Reviewed over 1 year ago

All good!

Ossie B. · Reviewed over 1 year ago

Gabriel A. · Reviewed over 1 year ago

Sayali J. · Reviewed over 1 year ago

Subhankar S. · Reviewed over 1 year ago

Ulan S. · Reviewed over 1 year ago

Nice

Ansari Mohammad Zeeshan M. · Reviewed over 1 year ago

AKHIL A. · Reviewed over 1 year ago

In Task 4, the last coding part, where you save the output to the bucket, doesn't work. Code:

    # save locally
    ax[0].get_figure().savefig('report.png')
    connections_by_protocol.to_csv("connections_by_protocol.csv")

    # upload to GCS
    bucket = gcs.Client().get_bucket(BUCKET)
    for blob in bucket.list_blobs(prefix='sparktobq/'):
        blob.delete()
    for fname in ['report.png', 'connections_by_protocol.csv']:
        bucket.blob('sparktobq/{}'.format(fname)).upload_from_filename(fname)

It gives the following error:

    TypeError: UploadBase.__init__() got an unexpected keyword argument 'checksum'
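[Editor's note] This `TypeError` is typically a version mismatch rather than a bug in the lab code itself: newer `google-cloud-storage` clients pass a `checksum` keyword down to `google-resumable-media`, and older `google-resumable-media` releases do not accept it. A possible workaround (an assumption, not confirmed by the lab authors) is to upgrade both packages in the notebook environment and restart the kernel before rerunning the cell:

```shell
# Hypothetical fix: bring google-resumable-media up to a release that
# accepts the 'checksum' keyword passed by newer google-cloud-storage
# clients, then restart the notebook kernel so the new versions load.
pip install --upgrade google-cloud-storage google-resumable-media
```

If upgrading is not possible in the lab environment, pinning `google-cloud-storage` to an older release that predates the `checksum` argument may also avoid the error.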

Kacper S. · Reviewed over 1 year ago

Bhanu Prasad T. · Reviewed over 1 year ago

Faizan K. · Reviewed over 1 year ago

Rakesh Kumar N. · Reviewed over 1 year ago

Yashashvi D. · Reviewed over 1 year ago

Leo C. · Reviewed over 1 year ago

Mikołaj N. · Reviewed over 1 year ago

We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.