Continuous Training Pipelines with Cloud Composer Reviews
2455 reviews
Pooja D. · Reviewed over 2 years ago
Rangan G. · Reviewed over 2 years ago
Thiago S. · Reviewed over 2 years ago
Follow these instructions to reach the end:
- Remember to disable and re-enable the Composer API.
- Wait until the Composer API is enabled again before continuing with the Cloud Shell commands (e.g. activation of GCP services).
- Use this command to create the environment:
    gcloud composer environments create demo-environment \
      --location $REGION \
      --python-version 3 \
      --image-version composer-1.20.8-airflow-1.10.15
- If the environment creation does not finish, go to the Composer tool, click the environment name (demo-environment), then find the scheduler heartbeat chart, click zoom in, and hit the refresh button at the top right several times.
- That's it.
Francesco N. · Reviewed over 2 years ago
Doesn't work because the environment doesn't finish setting up within the available time, and the customer service is terrible.
Francesco N. · Reviewed over 2 years ago
Instructions are missing regarding the service account.
Thusitha C. · Reviewed over 2 years ago
Andrew F. · Reviewed over 2 years ago
Arnab S. · Reviewed over 2 years ago
Mathanagopalan N. · Reviewed over 2 years ago
The Airflow UI is not very user-friendly.
Vinicius G. · Reviewed over 2 years ago
Mahesh G. · Reviewed over 2 years ago
Nobu Y. · Reviewed over 2 years ago
good
Shyam Prakash M. · Reviewed over 2 years ago
Vamshidhar C. · Reviewed over 2 years ago
Tommy T. · Reviewed over 2 years ago
Dominik K. · Reviewed over 2 years ago
This lab does not work
Widi E. · Reviewed over 2 years ago
The lab should be updated to reflect the new naming and versions.
Maria A. · Reviewed over 2 years ago
This code causes an error:
    gcloud composer environments storage data import \
      --source vars.json \
      --environment demo-environment \
      --location $REGION
Maybe something like this is needed instead:
    gcloud composer environments create demo-environment \
      --location $REGION \
      --python-version 3 \
      --image-version composer-1-airflow-2
At the final stage I got the following error:
    Broken DAG: [/home/airflow/gcs/dags/chicago_taxi_dag.py]
    Traceback (most recent call last):
      File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 408, in apply_defaults
        result = func(self, **kwargs, default_args=default_args)
      File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 756, in __init__
        raise AirflowException(
    airflow.exceptions.AirflowException: Invalid arguments were passed to PubSubPublishMessageOperator (task_id: publish_on_failed_check_task). Invalid arguments were: **kwargs: {'project': 'qwiklabs-gcp-01-b3b09229c6d5'}
Alexandr G. · Reviewed over 2 years ago
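On the Broken DAG error in the review above: the traceback suggests the lab's DAG passes a project argument in the Airflow 1.10 style, while the Airflow 2 Google provider's PubSubPublishMessageOperator expects project_id. A minimal sketch of the likely fix follows; the topic name and message payload are illustrative placeholders, not the lab's actual values.

    from airflow.providers.google.cloud.operators.pubsub import PubSubPublishMessageOperator

    # Sketch only: rename 'project' to 'project_id' for the Airflow 2 provider operator.
    # The topic and message below are placeholders, not the lab's real values.
    publish_on_failed_check_task = PubSubPublishMessageOperator(
        task_id="publish_on_failed_check_task",
        project_id="qwiklabs-gcp-01-b3b09229c6d5",            # previously passed as 'project'
        topic="chicago-taxi-pipeline-alerts",                 # placeholder topic name
        messages=[{"data": b"Data validation check failed"}],  # 'data' takes bytes in the provider operator
    )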
The gcloud composer environments create command needs an update.
Phillip M. · Reviewed over 2 years ago
doesn't work and the customer service is terrible
Francesco N. · Reviewed over 2 years ago
doesn't work
Francesco N. · Reviewed over 2 years ago
In the DAG writing task, TODO #2, the solution says task_id="bq_eval_data_task"; it should be task_id="bq_valid_data_task".
Sophie S. · Reviewed over 2 years ago
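Regarding the task_id correction above: a minimal sketch of what TODO #2 might look like with the corrected name, assuming the DAG targets the Airflow 1.10 image quoted in an earlier review. The SQL query and destination table here are placeholders, not the lab's actual values.

    from airflow.contrib.operators.bigquery_operator import BigQueryOperator

    # Sketch only: the corrected task_id is the point; the query and destination
    # table are illustrative placeholders, not the lab's real ones.
    bq_valid_data_task = BigQueryOperator(
        task_id="bq_valid_data_task",  # the solution text mistakenly says "bq_eval_data_task"
        sql="SELECT * FROM `bigquery-public-data.chicago_taxi_trips.taxi_trips` LIMIT 10000",  # placeholder query
        destination_dataset_table="demo_dataset.valid_data",  # placeholder table
        write_disposition="WRITE_TRUNCATE",
        use_legacy_sql=False,
    )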
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.