Continuous Training Pipelines with Cloud Composer Reviews
2455 reviews
Pooja D. · Reviewed more than 2 years ago
Rangan G. · Reviewed more than 2 years ago
Thiago S. · Reviewed more than 2 years ago
Follow these instructions to reach the end:
- Remember to disable and re-enable the Composer API.
- Wait until the Composer API is enabled again before continuing with the Cloud Shell commands (e.g. activation of GCP services).
- Use this command to create the environment:
gcloud composer environments create demo-environment \
  --location $REGION \
  --python-version 3 \
  --image-version composer-1.20.8-airflow-1.10.15
- If the environment creation does not finish, go to the Composer tool, click the environment name (demo-environment), find the scheduler heartbeat chart, click zoom in, and then refresh several times with the button at the top right.
- That's it.
Francesco N. · Reviewed more than 2 years ago
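For reference, the API toggle described in the review above can be scripted from Cloud Shell. This is a minimal sketch, assuming the lab relies on the standard composer.googleapis.com service, the active project is already configured, and no Composer environment exists yet:

# Disable and re-enable the Cloud Composer API (as the reviewer suggests)
# Note: disabling may be blocked if a Composer environment already exists in the project
gcloud services disable composer.googleapis.com
gcloud services enable composer.googleapis.com

# Confirm the API shows up as enabled again before running the remaining commands
gcloud services list --enabled | grep composer.googleapis.com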
Doesn't work because the environment doesn't finish setting up in the available time, and the customer service is terrible.
Francesco N. · Reviewed more than 2 years ago
Instructions regarding the service account are missing.
Thusitha C. · Reviewed more than 2 years ago
Andrew F. · Reviewed more than 2 years ago
Arnab S. · Reviewed more than 2 years ago
Mathanagopalan N. · Reviewed more than 2 years ago
The Airflow UI is not very user-friendly.
Vinicius G. · Reviewed more than 2 years ago
Mahesh G. · Reviewed more than 2 years ago
Nobu Y. · Reviewed more than 2 years ago
good
Shyam Prakash M. · Reviewed more than 2 years ago
Vamshidhar C. · Reviewed more than 2 years ago
Tommy T. · Reviewed more than 2 years ago
Dominik K. · Reviewed more than 2 years ago
This lab does not work
Widi E. · Reviewed more than 2 years ago
The lab should be updated to reflect the new names and versions.
Maria A. · Reviewed more than 2 years ago
This code causes an error:
gcloud composer environments storage data import \
  --source vars.json \
  --environment demo-environment \
  --location $REGION
Maybe something like this instead:
gcloud composer environments create demo-environment --location $REGION --python-version 3 --image-version composer-1-airflow-2
At the final stage I got the following error:
Broken DAG: [/home/airflow/gcs/dags/chicago_taxi_dag.py] Traceback (most recent call last):
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 408, in apply_defaults
    result = func(self, **kwargs, default_args=default_args)
  File "/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 756, in __init__
    raise AirflowException(
airflow.exceptions.AirflowException: Invalid arguments were passed to PubSubPublishMessageOperator (task_id: publish_on_failed_check_task). Invalid arguments were: **kwargs: {'project': 'qwiklabs-gcp-01-b3b09229c6d5'}
Alexandr G. · Reviewed more than 2 years ago
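The Broken DAG error quoted above is consistent with running an Airflow 1.10-style operator call on an Airflow 2 image: in the Google provider package, PubSubPublishMessageOperator expects project_id rather than project. Below is a minimal sketch of the likely fix; the task_id and project come from the traceback, while the topic and message values are placeholders rather than the lab's actual ones:

from airflow.providers.google.cloud.operators.pubsub import PubSubPublishMessageOperator

# Inside the DAG definition: Airflow 2 / Google provider uses project_id=, not project=
publish_on_failed_check_task = PubSubPublishMessageOperator(
    task_id="publish_on_failed_check_task",
    project_id="qwiklabs-gcp-01-b3b09229c6d5",  # lab project from the traceback
    topic="chicago-taxi-alerts",                # placeholder topic name
    messages=[{"data": b"data validation check failed"}],  # 'data' must be bytes in the provider operator
)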
The gcloud composer environments create script needs an update.
Phillip M. · Reviewed more than 2 years ago
Doesn't work because the environment doesn't finish setting up in the available time, and the customer service is terrible.
Francesco N. · Reviewed more than 2 years ago
doesn't work and the customer service is terrible
Francesco N. · Reviewed more than 2 years ago
doesn't work
Francesco N. · Reviewed more than 2 years ago
In the DAG writing task, TODO#2, the solution says task_id="bq_eval_data_task"; it should be task_id="bq_valid_data_task".
Sophie S. · Reviewed more than 2 years ago
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.