Automate Data Capture at Scale with Document AI: Challenge Lab — reviews
14515 reviews
Great
Weeradate K. · Reviewed 10 days ago
Abhinandan K. · Reviewed 11 days ago
Sithu K. · Reviewed 11 days ago
Nisit P. · Reviewed 11 days ago
Nisit P. · Reviewed 11 days ago
Nisit P. · Reviewed 11 days ago
3474_WiraYeYint G. · Reviewed 11 days ago
Nachapat I. · Reviewed 11 days ago
Deepnita M. · Reviewed 11 days ago
Deepnita M. · Reviewed 11 days ago
Chanatda K. · Reviewed 11 days ago
Chanatda K. · Reviewed 11 days ago
Pratiksha T. · Reviewed 11 days ago
Chandu V. · Reviewed 12 days ago
Alok s. · Reviewed 12 days ago
Wisit S. · Reviewed 12 days ago
Bodin C. · Reviewed 12 days ago
Nice.
Thawon J. · Reviewed 12 days ago
Parkpoom L. · Reviewed 12 days ago
Tanaphol R. · Reviewed 13 days ago
Supawit S. · Reviewed 13 days ago
Titichaya V. · Reviewed 13 days ago
The scripts use Cloud Run functions gen 1, but the instructions say we need to deploy gen 2, so the code will not work.
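For context on the gen 1 vs. gen 2 mismatch this reviewer describes, a gen 2 deployment would look roughly like the sketch below. This is a hypothetical command, not the lab's actual one: the function name `process-invoices`, region, runtime, and entry point are placeholders to substitute with the lab's values; only the `--gen2` and `--trigger-bucket` flags are the point being illustrated.

```shell
# Hypothetical gen 2 deploy sketch -- names and region are placeholders.
PROJECT_ID=$(gcloud config get-value project)

gcloud functions deploy process-invoices \
  --gen2 \
  --region=us-central1 \
  --runtime=python310 \
  --entry-point=process_invoice \
  --source=. \
  --trigger-bucket="${PROJECT_ID}-input-invoices" \
  --env-vars-file=.env.yaml
```

For gen 2 functions, `--trigger-bucket` wires up an Eventarc trigger on object finalization, whereas the gen 1 style `--trigger-resource=gs://...` flag is rejected.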
Prachya K. · Reviewed 13 days ago
siddhi k. · Reviewed 13 days ago
To the Google Cloud Skills Boost / Lab Content Team,

I am writing to report that Task 4 and Task 5 of this lab are fundamentally broken and desperately need an immediate update. The provided instructions contain multiple critical errors that force students to spend hours debugging the lab's poorly maintained code rather than learning the actual GCP concepts. Here is a list of the critical bugs in the current lab instructions:

- Outdated deployment flags: the instructions use --trigger-resource=gs://..., which is invalid for Cloud Run Functions gen 2. It must be updated to --trigger-bucket.
- Missing service account (404 error): the deploy command includes --service-account=${PROJECT_ID}@appspot.gserviceaccount.com, but this default App Engine SA is NOT provisioned in the lab environment, causing instant deployment failure.
- Wrong bucket name (403 error): the deploy command references a bucket ending in -input-in, but the actual bucket we are instructed to create earlier ends in -input-invoices. This typo causes confusing Eventarc permission errors.
- Code crash on startup (health check failed): the main.py code tries to parse a TIMEOUT environment variable (int(os.environ.get('TIMEOUT'))), but the provided .env.yaml file does not include it. This causes a NoneType error that crashes the container during initial startup.
- Hardcoded project ID (gRPC Permission Denied): the main.py file has the string YourGCPProjectID hardcoded instead of dynamically pulling the student's project ID. This causes a gRPC Permission Denied error when the function tries to write data to BigQuery. I had to manually run a sed command to fix your provided source code.
- Flawed file copy command: in Task 5, the command gsutil -m cp -r gs://cloud-training/gsp367/* ... copies Python scripts and YAML files into the input bucket alongside the PDFs. This immediately crashes the Document AI processor, which expects only document files.
As a Data Engineer trying to upskill, I find it extremely frustrating to waste time troubleshooting outdated lab environments and untested code. Please fix these issues so future learners don't have to face the same terrible experience. Regards, Benz
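The startup crash this reviewer describes comes from `int(os.environ.get('TIMEOUT'))` raising a TypeError when the variable is unset, since `os.environ.get` returns None. A minimal defensive sketch is below; the variable name TIMEOUT is taken from the review, while the helper name and the 300-second fallback are illustrative assumptions, not the lab's code.

```python
import os

def get_timeout(default: int = 300) -> int:
    """Read TIMEOUT from the environment, falling back to a default.

    int(os.environ.get('TIMEOUT')) crashes with TypeError when the
    variable is unset (int(None) is invalid); this wrapper returns a
    fallback instead, so the container survives startup.
    """
    raw = os.environ.get("TIMEOUT")
    if raw is None:
        return default
    try:
        return int(raw)
    except ValueError:
        # Malformed value in .env.yaml: also fall back rather than crash.
        return default

# With TIMEOUT unset, the helper returns the fallback instead of crashing.
os.environ.pop("TIMEOUT", None)
print(get_timeout())  # → 300
```

A fix like this (or simply adding TIMEOUT to the provided .env.yaml) would avoid the failed health check the reviewer reports.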
Thanachit S. · Reviewed 13 days ago
We cannot certify that the published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.