Running Apache Spark jobs on Cloud Dataproc Reviews
46081 reviews
Mallikarjun S. · Reviewed almost 2 years ago
kishore m. · Reviewed almost 2 years ago
Leandro G. · Reviewed almost 2 years ago
Digvijay P. · Reviewed almost 2 years ago
Thomas G. · Reviewed almost 2 years ago
Olivier M. · Reviewed almost 2 years ago
Sotiria S. · Reviewed almost 2 years ago
Sushrut A. · Reviewed almost 2 years ago
Yuliia M. · Reviewed almost 2 years ago
Apporv D. · Reviewed almost 2 years ago
ok
Phoutthakone B. · Reviewed almost 2 years ago
There were some errors in the last part.
SERGIO ENRIQUE Y. · Reviewed almost 2 years ago
excellent experience
Diego C. · Reviewed almost 2 years ago
Luigino N. · Reviewed almost 2 years ago
Hello, I found an issue where I had to replace the name "sparktodp" with the correct name of my cluster ("cluster-e93e") in the command and script below:

export DP_STORAGE="gs://$(gcloud dataproc clusters describe sparktodp --region=us-east1 --format=json | jq -r '.config.configBucket')"

#!/bin/bash
gcloud dataproc jobs submit pyspark \
  --cluster sparktodp \
  --region us-east1 \
  spark_analysis.py \
  -- --bucket=$1
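A way to avoid this kind of mismatch is to put the cluster name in a variable instead of hardcoding it in each command. The sketch below is an assumption-based rework of the reviewer's script, not the lab's official version; the cluster name "cluster-e93e" and region us-east1 come from the review above, and the CLUSTER/REGION variable names are illustrative:

```shell
#!/bin/bash
# Parameterize the cluster name and region so the same script works for
# any cluster; override via environment variables or edit the defaults.
CLUSTER="${CLUSTER:-cluster-e93e}"
REGION="${REGION:-us-east1}"

# Look up the cluster's staging (config) bucket with gcloud + jq.
export DP_STORAGE="gs://$(gcloud dataproc clusters describe "$CLUSTER" \
  --region="$REGION" --format=json | jq -r '.config.configBucket')"

# Submit the PySpark job to the same cluster; $1 is the bucket argument
# passed through to spark_analysis.py.
gcloud dataproc jobs submit pyspark \
  --cluster "$CLUSTER" \
  --region "$REGION" \
  spark_analysis.py \
  -- --bucket="$1"
```

With this shape, renaming the cluster only requires changing one line (or exporting CLUSTER before running the script) instead of editing every command.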
Crhistian S. · Reviewed almost 2 years ago
Great learning!
Jitendra J. · Reviewed almost 2 years ago
Sandra C. · Reviewed almost 2 years ago
Himanshu S. · Reviewed almost 2 years ago
Nayan P. · Reviewed almost 2 years ago
Dominik F. · Reviewed almost 2 years ago
Pankaj G. · Reviewed almost 2 years ago
R S. · Reviewed almost 2 years ago
Foo Hoo G. · Reviewed almost 2 years ago
kavan q. · Reviewed almost 2 years ago
Miguel F. · Reviewed almost 2 years ago
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.