Machine Learning with TensorFlow and Vertex AI Reviews
14048 reviews
It's broken! It doesn't work!
Justin H. · Reviewed over 2 years ago
I could only get a score of 80%, as Task 6 kept failing due to the error listed below (Google Cloud Self-Paced Labs, Machine Learning with TensorFlow in Vertex AI, GSP273, Task 6):

gs://qwiklabs-gcp-00-905ba1094efe-dsongcp/ch9/trained_model/export/flights_20230726-210005/
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Endpoint for flights_xai-20230726-215121 already exists
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ENDPOINT_ID=6499675026667601920
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [7021920166275448832]... failed.
ERROR: (gcloud.beta.ai.models.upload) Error occurred in Explanation preprocessing. <class 'ValueError'> NodeDef mentions attr 'Tsegmentids' not in Op<name=SparseSegmentMean; signature=data:T, indices:Tidx, segment_ids:int32 -> output:T; attr=T:type,allowed=[DT_FLOAT, DT_DOUBLE]; attr=Tidx:type,default=DT_INT32,allowed=[DT_INT32, DT_INT64]>; NodeDef: {{node model_3/deep_inputs/arr_airport_lat_bucketized_X_arr_airport_lon_bucketized_X_dep_airport_lat_bucketized_X_dep_airport_lon_bucketized_embedding/arr_airport_lat_bucketized_X_arr_airport_lon_bucketized_X_dep_airport_lat_bucketized_X_dep_airport_lon_bucketized_embedding_weights/embedding_lookup_sparse}}. (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.)
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
MODEL_ID=
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.beta.ai.endpoints.deploy-model) could not parse resource []

The %%bash cell that failed (Cell In[42]) was:

# note TF_VERSION set in 1st cell, but ENDPOINT_NAME is being changed
# TF_VERSION=2-6
ENDPOINT_NAME=flights_xai
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
MODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}
EXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)
echo $EXPORT_PATH
# create the model endpoint for deploying the model
if [[ $(gcloud beta ai endpoints list --region=$REGION \
    --format='value(DISPLAY_NAME)' --filter=display_name=${ENDPOINT_NAME}) ]]; then
    echo "Endpoint for $MODEL_NAME already exists"
else
    # create model endpoint
    echo "Creating Endpoint for $MODEL_NAME"
    gcloud beta ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}
fi
ENDPOINT_ID=$(gcloud beta ai endpoints list --region=$REGION \
    --format='value(ENDPOINT_ID)' --filter=display_name=${ENDPOINT_NAME})
echo "ENDPOINT_ID=$ENDPOINT_ID"
# delete any existing models with this name
for MODEL_ID in $(gcloud beta ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME}); do
    echo "Deleting existing $MODEL_NAME ... $MODEL_ID"
    gcloud ai models delete --region=$REGION $MODEL_ID
done
# upload the model using the parameters: Docker container image, artifact URI, explanation method,
# explanation path count, and the explanation metadata JSON file `explanation-metadata.json`.
# Here, you keep the number of feature permutations at `10` when approximating the Shapley values for explanation.
gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
    --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
    --artifact-uri=$EXPORT_PATH \
    --explanation-method=sampled-shapley --explanation-path-count=10 --explanation-metadata-file=explanation-metadata.json
MODEL_ID=$(gcloud beta ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})
echo "MODEL_ID=$MODEL_ID"
# deploy the model to the endpoint
gcloud beta ai endpoints deploy-model $ENDPOINT_ID \
    --region=$REGION \
    --model=$MODEL_ID \
    --display-name=$MODEL_NAME \
    --machine-type=n1-standard-2 \
    --min-replica-count=1 \
    --max-replica-count=1 \
    --traffic-split=0=100

IPython (InteractiveShell.run_cell_magic calling ScriptMagics.shebang) then re-raised the failure as: CalledProcessError: command returned non-zero exit status 1.
Paul C. · Reviewed over 2 years ago
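The 'Tsegmentids' / NodeDef message quoted in the review above is the kind of error TensorFlow raises when a SavedModel written by a newer TensorFlow is loaded by an older runtime, here the tf2-cpu.2-6 prediction container. A minimal sketch of one thing to try, assuming the notebook's own TensorFlow produced the export and that a matching prebuilt prediction image is published (the 2-12 tag below is an example, not the lab's value):

# check which TensorFlow wrote the SavedModel (the notebook kernel's version)
python3 -c 'import tensorflow as tf; print(tf.__version__)'   # e.g. 2.12.0
# point the upload at a prediction container that matches that version
TF_VERSION=2-12   # example tag; substitute the version printed above, if such an image exists
gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
    --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
    --artifact-uri=$EXPORT_PATH \
    --explanation-method=sampled-shapley --explanation-path-count=10 \
    --explanation-metadata-file=explanation-metadata.json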
highly
Anna A. · Reviewed over 2 years ago
Felype d. · Reviewed over 2 years ago
The notebook instance could not be created in the us-central1 region because resources were not available in any zone of that region.
Sanjay S. · Reviewed over 2 years ago
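This review and several others report capacity errors when creating the notebook instance in us-central1. A hedged sketch of one possible workaround, assuming a user-managed notebooks instance; the instance name, image family, and machine type below are placeholders rather than the lab's actual settings, and the only point is switching --location to a zone that has capacity:

# placeholder instance name, image family, and machine type
gcloud notebooks instances create flights-notebook \
    --vm-image-project=deeplearning-platform-release \
    --vm-image-family=common-cpu-notebooks \
    --machine-type=n1-standard-4 \
    --location=us-central1-b   # if this zone is full, retry with -c, -f, or a zone in another region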
can't create notebook
Suppadate T. · Reviewed over 2 years ago
Could not create the notebook. The directions are outdated.
Jeongho J. · Reviewed over 2 years ago
Could not actually instantiate a notebook from the Coursera instructions. The error provided was no help.
Pat B. · Reviewed over 2 years ago
Resource issues
Gábor K. · Reviewed over 2 years ago
Santos B. · Reviewed over 2 years ago
David G. · Reviewed over 2 years ago
Full of errors. The environment cannot even run predetermined code. What a shame.
Rajesh R. · Reviewed over 2 years ago
Hard to get the resources needed to even start the notebook.
Viktor S. · Reviewed over 2 years ago
The code provided contains a lot of bugs and errors
Aziz B. · Reviewed over 2 years ago
Jacob P. · Reviewed over 2 years ago
Shantanu S. · Reviewed over 2 years ago
Panagiotis T. · Reviewed over 2 years ago
Unable to finish the last step of this lab: the same %%bash deployment cell quoted in the earlier review (ENDPOINT_NAME=flights_xai through gcloud beta ai endpoints deploy-model) failed with CalledProcessError: command returned non-zero exit status 1.
Hsin-Wen C. · Reviewed over 2 years ago
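The "could not parse resource []" seen in this review and the traceback quoted earlier is a downstream symptom: the model upload fails, so MODEL_ID is empty by the time deploy-model runs. A small defensive sketch, an assumption rather than the lab's code, that fails fast with a clearer message:

MODEL_ID=$(gcloud beta ai models list --region=$REGION \
    --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})
if [[ -z "$MODEL_ID" ]]; then
    # the upload did not produce a model, so there is nothing to deploy;
    # surface that instead of letting deploy-model choke on an empty resource name
    echo "Model upload failed; skipping endpoint deployment." >&2
    exit 1
fi
gcloud beta ai endpoints deploy-model $ENDPOINT_ID \
    --region=$REGION \
    --model=$MODEL_ID \
    --display-name=$MODEL_NAME \
    --machine-type=n1-standard-2 \
    --min-replica-count=1 --max-replica-count=1 \
    --traffic-split=0=100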
There are no resources available at the specified region to complete the lab.
José Luis G. · Reviewed over 2 years ago
Takashi I. · Reviewed over 2 years ago
I could not create the notebook instance: I got an error saying that not enough resources were available.
Davide S. · Reviewed over 2 years ago
The initial instructions for creating the notebook are out of date: the region you're told to use didn't work, and other steps are stale as well. It's more a series of things to paste in than anything instructive about what you're doing and why.
Llewellyn R. · Reviewed over 2 years ago
규보 임. · Reviewed over 2 years ago
Pablo B. · Reviewed over 2 years ago
We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.