[DEPRECATED] Continuous Training with TensorFlow, PyTorch, XGBoost, and Scikit Learn Models with Kubeflow and AI Platform Pipelines Reviews
2101 reviews
杉山 阿. · Reviewed almost 5 years ago
Eric T. · Reviewed almost 5 years ago
Yo H. · Reviewed almost 5 years ago
Morten P. · Reviewed almost 5 years ago
What a waste of time.
Daniel L. · Reviewed almost 5 years ago
Matthew J. · Reviewed almost 5 years ago
Everything except the last task is perfect. During the last task, "Create pipeline runs", the creation of the validation dataset using BigQuery (BQ Eval Split) fails, saying the dataset "kfp_tmp_dataset" already exists. The creation of the training dataset worked fine. Here is the full trace:
INFO:root:Fetching latest pod metadata: continuous-training-with-multiple-frameworks-79msd-114173538
INFO:root:Start KFP context with ID: 35f5c028ecdc2905e040a8daf55e22b3
INFO:root:Creating dataset kfp_tmp_dataset
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/local/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/ml/kfp_component/launcher/__main__.py", line 45, in <module>
    main()
  File "/ml/kfp_component/launcher/__main__.py", line 42, in main
    launch(args.file_or_module, args.args)
  File "kfp_component/launcher/launcher.py", line 45, in launch
    return fire.Fire(module, command=args, name=module.__name__)
  File "/usr/local/lib/python2.7/site-packages/fire/core.py", line 127, in Fire
    component_trace = _Fire(component, args, context, name)
  File "/usr/local/lib/python2.7/site-packages/fire/core.py", line 366, in _Fire
    component, remaining_args)
  File "/usr/local/lib/python2.7/site-packages/fire/core.py", line 542, in _CallCallable
    result = fn(*varargs, **kwargs)
  File "kfp_component/google/bigquery/_query.py", line 63, in query
    dataset_location)
  File "kfp_component/google/bigquery/_query.py", line 98, in _prepare_dataset_ref
    dataset = _create_dataset(client, dataset_ref, dataset_location)
  File "kfp_component/google/bigquery/_query.py", line 110, in _create_dataset
    return client.create_dataset(dataset)
  File "/usr/local/lib/python2.7/site-packages/google/cloud/bigquery/client.py", line 341, in create_dataset
    api_response = self._connection.api_request(method="POST", path=path, data=data)
  File "/usr/local/lib/python2.7/site-packages/google/cloud/_http.py", line 319, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Conflict: 409 POST https://www.googleapis.com/bigquery/v2/projects/qwiklabs-gcp-01-ad03335814f1/datasets: Already Exists: Dataset qwiklabs-gcp-01-ad03335814f1:kfp_tmp_dataset
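[Editor's note] The 409 above happens because the eval-split step tries to create "kfp_tmp_dataset" after the training-split step has already created it. With the real google-cloud-bigquery client, passing `exists_ok=True` to `client.create_dataset()` makes the call idempotent. The sketch below is an assumption-laden illustration of that pattern using a tiny fake client, so it runs without GCP access; `FakeBigQueryClient` and its behavior are hypothetical stand-ins, not the lab's code.

```python
class Conflict(Exception):
    """Stand-in for google.api_core.exceptions.Conflict (HTTP 409)."""

class FakeBigQueryClient:
    """Hypothetical in-memory stand-in for google.cloud.bigquery.Client."""

    def __init__(self):
        self.datasets = {}

    def create_dataset(self, dataset_id, exists_ok=False):
        # Mirrors the real client: a second create raises Conflict (409)
        # unless exists_ok=True, in which case the existing dataset is reused.
        if dataset_id in self.datasets:
            if exists_ok:
                return self.datasets[dataset_id]
            raise Conflict(f"409 Already Exists: Dataset {dataset_id}")
        self.datasets[dataset_id] = {"id": dataset_id}
        return self.datasets[dataset_id]

client = FakeBigQueryClient()
client.create_dataset("kfp_tmp_dataset")                  # training split: creates it
client.create_dataset("kfp_tmp_dataset", exists_ok=True)  # eval split: no 409
```

A practical workaround inside the lab is simply deleting the leftover `kfp_tmp_dataset` in BigQuery before re-running the pipeline.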
Maosi C. · Reviewed almost 5 years ago
Yuehao P. · Reviewed almost 5 years ago
Broken lab yet again: pipeline creation from the console failed with the error "ERROR: unable to pull gcr.io//kfp-dev:latest."
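[Editor's note] The double slash in "gcr.io//kfp-dev:latest" suggests the project-ID segment of the image URI (`gcr.io/<PROJECT_ID>/<image>:<tag>`) was empty when the URI was composed, likely an unset project variable. A minimal sketch of a guard that catches this early; the function and parameter names are assumptions for illustration, not the lab's code.

```python
def kfp_image_uri(project_id: str, image: str = "kfp-dev", tag: str = "latest") -> str:
    """Compose a gcr.io image URI, failing fast if the project ID is empty."""
    if not project_id:
        # An empty project ID would silently yield "gcr.io//kfp-dev:latest".
        raise ValueError("project ID is empty; set it first, "
                         "e.g. via `gcloud config get-value project`")
    return f"gcr.io/{project_id}/{image}:{tag}"

print(kfp_image_uri("qwiklabs-gcp-01-ad03335814f1"))
# gcr.io/qwiklabs-gcp-01-ad03335814f1/kfp-dev:latest
```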
Wajeeh Ul H. · Reviewed almost 5 years ago
Saïd R. · Reviewed about 5 years ago
Rakesh J. · Reviewed about 5 years ago
Giuseppe R. · Reviewed about 5 years ago
Aishwarya D. · Reviewed about 5 years ago
gianpiero p. · Reviewed about 5 years ago
Stuart M. · Reviewed about 5 years ago
Esra D. · Reviewed about 5 years ago
Claudio I. · Reviewed about 5 years ago
Ivan N. · Reviewed about 5 years ago
Yeseul Y. · Reviewed about 5 years ago
Alejandro M. · Reviewed about 5 years ago
Oussama B. · Reviewed about 5 years ago
this did not work!!!!!!!!!!!!!!!
Antony S. · Reviewed about 5 years ago
Yuntae H. · Reviewed about 5 years ago
Romain B. · Reviewed about 5 years ago
Surachart O. · Reviewed about 5 years ago
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.