Reviews for Automate Data Capture at Scale with Document AI: Challenge Lab
14513 reviews
Sithu K. · Reviewed 10 days ago
Nisit P. · Reviewed 11 days ago
3474_WiraYeYint G. · Reviewed 11 days ago
Nachapat I. · Reviewed 11 days ago
Deepnita M. · Reviewed 11 days ago
Chanatda K. · Reviewed 11 days ago
Pratiksha T. · Reviewed 11 days ago
Chandu V. · Reviewed 12 days ago
Alok s. · Reviewed 12 days ago
Wisit S. · Reviewed 12 days ago
Bodin C. · Reviewed 12 days ago
Nice.
Thawon J. · Reviewed 12 days ago
Parkpoom L. · Reviewed 12 days ago
Tanaphol R. · Reviewed 12 days ago
Supawit S. · Reviewed 12 days ago
Titichaya V. · Reviewed 13 days ago
The scripts use Cloud Run functions Gen 1, but the instructions say we need to deploy Gen 2, so the code will not work.
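A hedged sketch of what a Gen 2 deployment command could look like for this kind of lab (the function name, region, runtime, entry point, and bucket suffix below are placeholders, not values confirmed by the lab itself):

```shell
# Sketch only: every name here is an assumed placeholder.
# The key difference from a Gen 1 deploy is the --gen2 flag and the
# --trigger-bucket flag for the Cloud Storage event trigger.
PROJECT_ID="$(gcloud config get-value project)"

gcloud functions deploy process-invoices \
  --gen2 \
  --region=us-central1 \
  --runtime=python310 \
  --entry-point=process_invoice \
  --trigger-bucket="${PROJECT_ID}-input-invoices"
```

This is a CLI fragment that requires an authenticated lab project to actually run; it only illustrates which flags differ between the two generations.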
Prachya K. · Reviewed 13 days ago
siddhi k. · Reviewed 13 days ago
To the Google Cloud Skills Boost / Lab Content Team,

I am writing to report that Task 4 and Task 5 of this lab are fundamentally broken and desperately need an immediate update. The provided instructions contain multiple critical errors that force students to spend hours debugging the lab's poorly maintained code rather than learning the actual GCP concepts. Here is a list of the critical bugs in the current lab instructions:

- Outdated deployment flags: the instructions use --trigger-resource=gs://..., which is invalid for Cloud Run functions Gen 2. It must be updated to --trigger-bucket.
- Missing service account (404 error): the deploy command includes --service-account=${PROJECT_ID}@appspot.gserviceaccount.com, but this default App Engine SA is NOT provisioned in the lab environment, causing instant deployment failure.
- Wrong bucket name (403 error): the deploy command references a bucket ending in -input-in, but the actual bucket we are instructed to create earlier ends in -input-invoices. This typo causes confusing Eventarc permission errors.
- Code crash on startup (healthcheck failed): the main.py code tries to parse a TIMEOUT environment variable (int(os.environ.get('TIMEOUT'))), but the provided .env.yaml file does not include it. This causes a NoneType error that crashes the container during the initial startup.
- Hardcoded project ID (gRPC Permission Denied): the main.py file has the string YourGCPProjectID hardcoded instead of dynamically pulling the student's project ID. This causes a gRPC Permission Denied error when the function tries to write data to BigQuery. I had to manually run a sed command to fix your provided source code.
- Flawed file copy command: in Task 5, the command gsutil -m cp -r gs://cloud-training/gsp367/* ... copies Python scripts and YAML files into the input bucket alongside the PDFs. This immediately crashes the Document AI processor, which expects only document files.
As a Data Engineer trying to upskill, it is extremely frustrating to waste time troubleshooting outdated lab environments and untested code. Please fix these issues so future learners don't have to face the same terrible experience. Regards, Benz
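For learners hitting the same gRPC Permission Denied error before the lab is fixed, this is a minimal sketch of the sed workaround the review above describes. The file name main.py and the placeholder string YourGCPProjectID come from the review; the file contents and the project ID value below are hypothetical stand-ins for illustration:

```shell
# Hypothetical stand-in for the lab's main.py, containing the hardcoded
# placeholder the reviewer reports.
printf 'table_id = "YourGCPProjectID.invoice_parser_results.doc_ai_extracted_entities"\n' > main.py

# In a real lab shell this would come from: gcloud config get-value project
# Hardcoded here so the sketch is self-contained.
PROJECT_ID="qwiklabs-gcp-example"

# Replace every occurrence of the placeholder in place, as the review suggests.
sed -i "s/YourGCPProjectID/${PROJECT_ID}/g" main.py

cat main.py
```

After the substitution, main.py references the student's own project instead of the placeholder, so BigQuery writes are attempted against a project the function's credentials can actually access.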
Thanachit S. · Reviewed 13 days ago
Ark A. · Reviewed 14 days ago
Martin K. · Reviewed 15 days ago
We do not guarantee that published reviews come from consumers who have purchased or used the products. Google does not verify reviews.