Reviews of "Loading BigQuery with Serverless for Apache Spark"
14076 reviews
I got this error when attempting the Spark job; I tried to solve it, but it was way over my knowledge level: Batch [2332b6b961d14bd399e0a93ccace27e9] submitted. ERROR: (gcloud.beta.dataproc.batches.submit.pyspark) Batch job is FAILED. Detail: Multiple Errors: - Failed to create batch autoscaling policy - Not authorized to requested resource. Running auto diagnostics on the batch. It may take few minutes before diagnostics output is available. Please check diagnostics output by running 'gcloud dataproc batches describe' command.
Rodrigo R. · Reviewed about 1 year ago
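For readers who hit the same failure, the error text itself points at the next diagnostic step. Below is a minimal sketch of how one might inspect the failed batch with the gcloud CLI; the batch ID is the one quoted in the review above, and the region is an assumption (substitute the region assigned by the lab).

```
# List recent serverless batches to find the failed one (region is an assumption).
gcloud dataproc batches list --region=us-central1

# Describe the batch quoted in the review to see its diagnostics output
# once the auto diagnostics have finished running.
gcloud dataproc batches describe 2332b6b961d14bd399e0a93ccace27e9 \
    --region=us-central1
```

The "Not authorized to requested resource" detail usually points at a missing IAM grant on the service account the batch runs as; the describe output shows which service account that is.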
Manas S. · Reviewed about 1 year ago
Sashrika A. · Reviewed about 1 year ago
Rushikesh N. · Reviewed about 1 year ago
Rutik M. · Reviewed about 1 year ago
Vimal R. · Reviewed about 1 year ago
Arturo Eduardo M. · Reviewed about 1 year ago
RANGA C. · Reviewed about 1 year ago
Marcos G. · Reviewed about 1 year ago
Javier C. · Reviewed about 1 year ago
Joel V. · Reviewed about 1 year ago
The lab takes a while to register that I've completed the step.
Adaeze A. · Reviewed about 1 year ago
Javier V. · Reviewed about 1 year ago
Lars L. · Reviewed about 1 year ago
Christopher M. · Reviewed about 1 year ago
Jorge M. · Reviewed about 1 year ago
Arnaud M. · Reviewed about 1 year ago
Arabind M. · Reviewed about 1 year ago
Noguchi M. · Reviewed about 1 year ago
Edward K. · Reviewed about 1 year ago
As a beginner in the world of Data Engineering on GCP, I struggle to understand the point of Dataproc in this lab. Also, I wanted to let you know that while running the Spark submit command I got the following error a couple of times; after the third try, it worked: ERROR: (gcloud.beta.dataproc.batches.submit.pyspark) Batch job is FAILED. Detail: Insufficient 'CPUS' quota. Requested 12.0, available 11.0. Your resource request exceeds your available quota. See https://cloud.google.com/compute/resource-usage. Use https://cloud.google.com/docs/quotas/view-manage#requesting_higher_quota to request additional quota. Running auto diagnostics on the batch. It may take few minutes before diagnostics output is available. Please check diagnostics output by running 'gcloud dataproc batches describe' command.
Andrea T. · Reviewed about 1 year ago
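The quota failure reported above is a regional Compute Engine CPU limit rather than a Spark problem, which is why a later retry succeeded once capacity freed up. Below is a hedged sketch of how one might check the quota and resubmit with a smaller footprint; the region, the gs:// script path, and the property values are assumptions, not lab-provided values.

```
# Inspect regional quotas; the CPUS entry is what the error message refers to.
# The region is an assumption -- use the region assigned in the lab.
gcloud compute regions describe us-central1 --format="yaml(quotas)"

# Resubmit asking for fewer cores so the request fits under the remaining quota.
# The script path is a placeholder and the property values are assumptions.
gcloud dataproc batches submit pyspark gs://your-bucket/your_script.py \
    --region=us-central1 \
    --properties=spark.driver.cores=4,spark.executor.cores=4,spark.executor.instances=2
```

Requesting a higher CPUS quota through the link in the error message is the other route if the workload genuinely needs the default sizing.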
Beini W. · Reviewed about 1 year ago
Ronald Alberto R. · Reviewed about 1 year ago
Antonio B. · Reviewed about 1 year ago
I see opportunities to improve this quick lab: 1) The lab is missing a step: granting permission to get the Dataproc cluster. I got the following error message in the VM while executing the batch job:
```
ERROR: (gcloud.beta.dataproc.batches.submit.pyspark) Batch job is FAILED. Detail: Multiple Errors:
 - Failed to fetch cluster for batch
 - Permission 'dataproc.clusters.get' denied on resource '//dataproc.googleapis.com/projects/qwiklabs-gcp-00-c467d66f6efc/regions/us-east4/clusters/srvls-batch-1b1a8482-374f-4c44-83d7-6bc417531bed' (or it may not exist).
```
Fortunately, I was able to figure it out through IAM permission configuration, but the lab does not provide that guidance. 2) It may be out of scope for a lab, but it lacks an explanation of the use cases in which this solution would be preferred over other ones.
Caio L. · Reviewed about 1 year ago
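The reviewer above resolved the dataproc.clusters.get error through IAM but doesn't describe the exact grant. The sketch below shows one plausible form of it; the role and member are assumptions (roles/dataproc.viewer carries dataproc.clusters.get, but the lab may intend a broader role such as roles/dataproc.editor, or a different principal).

```
# Grant the lab principal read access to Dataproc clusters so the batch's
# per-run cluster can be fetched. The project ID is the one quoted in the
# review; the member placeholder and the role choice are assumptions.
gcloud projects add-iam-policy-binding qwiklabs-gcp-00-c467d66f6efc \
    --member="user:STUDENT_EMAIL" \
    --role="roles/dataproc.viewer"
```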
We cannot guarantee that published reviews come from consumers who have purchased or used the product. Reviews are not verified by Google.