Reviews of Serverless Data Processing with Dataflow: Writing an ETL Pipeline with Apache Beam and Dataflow (Java)
9,166 reviews
Lab has an error: the Theia IDE URL doesn't open.
Ricardo A. · Reviewed about 1 month ago
Leonardo S. · Reviewed about 1 month ago
Second attempt: I couldn't complete the lab because the Dataflow jobs failed with the following message. My Dataflow labs often fail with these errors. Error message: "Startup of the worker pool in us-east1 failed to bring up any of the desired 1 workers. This is likely a quota issue or a Compute Engine stockout. The service will retry." Additionally, there are no instructions on how to open the IDE. An intro to the IDE would be nice. On my first attempt, the IDE URL didn't work for me, and I wasted a lot of time figuring out what "terminal" means. I actually did an SSH to the VM, but that introduced a whole other set of issues. This time around, the IDE URL worked, but it took me a while to get acquainted with it. I will now make a third attempt at the lab.
Sayed Fawad Ali S. · Reviewed about 1 month ago
Amara J. · Reviewed about 1 month ago
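Editor's note: the worker-pool stockout reported above can sometimes be worked around by relaunching the job in a different region. A minimal sketch, assuming a hypothetical launch_job command that wraps the lab's actual Maven/Dataflow launch (it is not part of the lab) and returns non-zero when no workers come up; the region list is also an assumption:

```shell
# Try a list of regions until one has Compute Engine worker capacity.
# `launch_job <region>` is a hypothetical stand-in for the lab's real
# launch command (e.g. mvn compile exec:java ... --region=<region>).
launch_in_first_available_region() {
  for region in us-central1 us-east1 us-west1; do
    if launch_job "$region"; then
      echo "launched in $region"
      return 0
    fi
  done
  echo "no region had worker capacity" >&2
  return 1
}
```

A loop like this only helps when the exhaustion is zonal or regional; a project-level quota problem will fail in every region the same way.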
IDE not opening
Muhammad A. · Reviewed about 1 month ago
IDE did not pop up
Robert A. · Reviewed about 1 month ago
Hiru S. · Reviewed about 1 month ago
Not very. It had a deprecated Java version and failed to allocate resources in the cloud environment.
Dale K. · Reviewed about 1 month ago
Complicated
Sergi F. · Reviewed about 1 month ago
During this lab, I repeatedly encountered ZONE_RESOURCE_POOL_EXHAUSTED errors when trying to launch Dataflow jobs. This occurred across multiple zones in both the us-east1 and us-west1 regions, making it impossible to complete the lab tasks that require Dataflow. This issue has occurred across multiple lab restarts and sessions. Please investigate resource availability in these regions for Qwiklabs projects.
Emerson M. · Reviewed about 1 month ago
Aubin C. · Reviewed about 1 month ago
Lab with an error in the link http://35.197.125.188:3000/#/home/project/training-data-analyst/quests/dataflow/ (the link does not work):
EDUARDO A. · Reviewed about 1 month ago
Leonardo S. · Reviewed about 1 month ago
Luis Antonio C. · Reviewed about 1 month ago
Sriyansh S. · Reviewed about 1 month ago
TARUN KUMAR S. · Reviewed about 1 month ago
Seems like a good lesson, but no workers available means it's not possible to go through it properly
Martin H. · Reviewed about 1 month ago
Sriyansh S. · Reviewed about 1 month ago
Sriyansh S. · Reviewed about 1 month ago
Aubin C. · Reviewed about 1 month ago
The first part of the lab was confusing. The lab gives an IDE URL, and this was the first time I had seen one, but the lab didn't explain how to use it. I had to work out the following on my own: find a VM instance in the project, SSH into it, and then run:

cd /home/theia-java-dataflow/training-data-analyst/quests/dataflow/

Maven was not installed, so I had to run:

sudo apt install maven

On running "source create_batch_sinks.sh", I got an error ("ERROR: (gcloud) Invalid choice: 'storage'."). I ran "sudo gcloud components update", but even that didn't work, so I had to rewrite the script to use gsutil:

echo "Creating pipeline sinks"
PROJECT_ID=$(gcloud config get-value project)
# GCS buckets
#TODO: Add try/catch for the first bucket since qwiklabs
#gcloud storage buckets create --location=US gs://$PROJECT_ID
gsutil mb -l US gs://$PROJECT_ID
#gcloud storage buckets create --location=US --default-storage-class="COLDLINE" gs://$PROJECT_ID-coldline
gsutil mb -l US -c COLDLINE gs://$PROJECT_ID-coldline

But several files contained the gcloud commands, and I could not change all of them to gsutil commands, so I installed a local version of gcloud with the following steps:

#Step0:
cd ~
#Step1:
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-linux-x86_64.tar.gz
#Step2:
tar -xf google-cloud-cli-linux-x86_64.tar.gz
#Step3:
sudo ./google-cloud-sdk/install.sh

I also got pip3 errors, so I had to install it with:

sudo apt-get update
sudo apt-get install python3-pip -y

Now I am almost at the end of the lab. I have executed "bash generate_batch_events.sh", but it is stuck at "Installing Packages". I don't think I will be able to complete the lab.
Sayed Fawad Ali S. · Reviewed about 1 month ago
Sriyansh S. · Reviewed about 1 month ago
Daniel A. · Reviewed about 1 month ago
Gustavo L. · Reviewed about 1 month ago
Ferdie O. · Reviewed about 1 month ago
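Editor's note: the gsutil fallback described in the review above can be wrapped in one helper so the lab's scripts work whether or not the installed gcloud supports the storage command group. A sketch under that assumption; create_bucket is a hypothetical helper, not part of the lab:

```shell
# Create a GCS bucket with `gcloud storage` when available, falling back
# to the older `gsutil mb` when gcloud is too old and reports
# "ERROR: (gcloud) Invalid choice: 'storage'.".
create_bucket() {
  bucket_uri="$1"
  storage_class="${2:-STANDARD}"
  if gcloud storage --help >/dev/null 2>&1; then
    gcloud storage buckets create --location=US \
      --default-storage-class="$storage_class" "$bucket_uri"
  else
    gsutil mb -l US -c "$storage_class" "$bucket_uri"
  fi
}
```

With a helper like this, scripts such as create_batch_sinks.sh would only need their bucket-creation lines swapped for create_bucket calls, instead of being rewritten per tool.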
We do not guarantee that published reviews come from consumers who have purchased or used the products. Google does not verify reviews.