Automate Data Capture at Scale with Document AI: Challenge Lab Reviews
14513 reviews
Sithu K. · Reviewed 10 days ago
Nisit P. · Reviewed 11 days ago
3474_WiraYeYint G. · Reviewed 11 days ago
Nachapat I. · Reviewed 11 days ago
Deepnita M. · Reviewed 11 days ago
Chanatda K. · Reviewed 11 days ago
Pratiksha T. · Reviewed 11 days ago
Chandu V. · Reviewed 12 days ago
Alok s. · Reviewed 12 days ago
Wisit S. · Reviewed 12 days ago
Bodin C. · Reviewed 12 days ago
Nice.
Thawon J. · Reviewed 12 days ago
Parkpoom L. · Reviewed 12 days ago
Tanaphol R. · Reviewed 12 days ago
Supawit S. · Reviewed 13 days ago
Titichaya V. · Reviewed 13 days ago
The scripts use Cloud Run functions gen 1, but the instructions say we need to deploy gen 2, so the code will not work.
Prachya K. · Reviewed 13 days ago
siddhi k. · Reviewed 13 days ago
To the Google Cloud Skills Boost / Lab Content Team,

I am writing to report that Task 4 and Task 5 of this lab are fundamentally broken and desperately need an immediate update. The provided instructions contain multiple critical errors that force students to spend hours debugging the lab's poorly maintained code rather than learning the actual GCP concepts. Here is a list of the critical bugs in the current lab instructions:

1. Outdated deployment flags: the instructions use --trigger-resource=gs://..., which is invalid for Cloud Run functions gen 2. It must be updated to --trigger-bucket.
2. Missing service account (404 error): the deploy command includes --service-account=${PROJECT_ID}@appspot.gserviceaccount.com, but this default App Engine SA is NOT provisioned in the lab environment, causing instant deployment failure.
3. Wrong bucket name (403 error): the deploy command references a bucket ending in -input-in, but the bucket we are instructed to create earlier ends in -input-invoices. This typo causes confusing Eventarc permission errors.
4. Code crash on startup (health check failed): main.py tries to parse a TIMEOUT environment variable (int(os.environ.get('TIMEOUT'))), but the provided .env.yaml file does not include it. This causes a NoneType error that crashes the container during initial startup.
5. Hardcoded project ID (gRPC Permission Denied): main.py has the string YourGCPProjectID hardcoded instead of dynamically pulling the student's project ID. This causes a gRPC Permission Denied error when the function tries to write data to BigQuery. I had to manually run a sed command to fix your provided source code.
6. Flawed file copy command: in Task 5, the command gsutil -m cp -r gs://cloud-training/gsp367/* ... copies Python scripts and YAML files into the input bucket alongside the PDFs. This immediately crashes the Document AI processor, which expects only document files.
As a Data Engineer trying to upskill, I find it extremely frustrating to waste time troubleshooting outdated lab environments and untested code. Please fix these issues so future learners don't have to face the same terrible experience. Regards, Benz
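[Editor's note] The workarounds the reviewer describes can be sketched as shell commands. This is a hedged sketch, not the lab's official fix: the function name (process-invoices), runtime, region, and TIMEOUT value are assumptions, while the bucket suffix (-input-invoices) and the sed fix follow the reviewer's description.

```shell
# Resolve the student's project ID instead of relying on a hardcoded value.
export PROJECT_ID=$(gcloud config get-value project)

# Fix the hardcoded YourGCPProjectID string in the provided main.py.
sed -i "s/YourGCPProjectID/${PROJECT_ID}/g" main.py

# Add the missing TIMEOUT variable to .env.yaml (the value is a guess).
echo "TIMEOUT: '300'" >> .env.yaml

# Deploy as gen 2 with the bucket trigger flag and the corrected bucket name;
# omit the unprovisioned App Engine service account flag entirely.
gcloud functions deploy process-invoices \
  --gen2 \
  --runtime=python39 \
  --region=us-central1 \
  --trigger-bucket="${PROJECT_ID}-input-invoices" \
  --env-vars-file=.env.yaml \
  --source=.

# Copy only the PDF invoices into the input bucket, not the scripts.
gsutil -m cp gs://cloud-training/gsp367/*.pdf "gs://${PROJECT_ID}-input-invoices/"
```

The key differences from the lab's commands are the --trigger-bucket flag (gen 2 replaces --trigger-resource), the dropped --service-account flag, and the narrowed *.pdf glob in the copy step.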
Thanachit S. · Reviewed 13 days ago
Ark A. · Reviewed 14 days ago
Martin K. · Reviewed 15 days ago
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.