Reviews for "Introducing the Keras Sequential API on Vertex AI Platform"
13,866 reviews
Barbara S. · Reviewed over 1 year ago
Sudhanshu P. · Reviewed over 1 year ago
Nalin G. · Reviewed over 1 year ago
Vaibhav N. · Reviewed over 1 year ago
There are issues with the versions of the commands in the lab.
nicholas m. · Reviewed over 1 year ago
Luccas R. · Reviewed over 1 year ago
from google.cloud import aiplatform can't import this library, throwing this error:

ImportError                               Traceback (most recent call last)
Cell In[29], line 1
----> 1 from google.cloud import aiplatform
      2 uploaded_model = aiplatform.Model.upload(
      3     display_name=MODEL_DISPLAYNAME,
      4     artifact_uri=f"gs://{BUCKET}/{MODEL_DISPLAYNAME}",
      5     serving_container_image_uri=SERVING_CONTAINER_IMAGE_URI,
      6 )

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/__init__.py:24
     19 from google.cloud.aiplatform import version as aiplatform_version
     21 __version__ = aiplatform_version.__version__
---> 24 from google.cloud.aiplatform import initializer
     26 from google.cloud.aiplatform.datasets import (
     27     ImageDataset,
     28     TabularDataset,
    (...)
     31     VideoDataset,
     32 )
     33 from google.cloud.aiplatform import explain

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/initializer.py:34
     32 from google.cloud.aiplatform.constants import base as constants
     33 from google.cloud.aiplatform import utils
---> 34 from google.cloud.aiplatform.metadata import metadata
     35 from google.cloud.aiplatform.utils import resource_manager_utils
     36 from google.cloud.aiplatform.tensorboard import tensorboard_resource

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/metadata/metadata.py:25
     22 from google.protobuf import timestamp_pb2
     24 from google.cloud.aiplatform import base
---> 25 from google.cloud.aiplatform import pipeline_jobs
     26 from google.cloud.aiplatform.compat.types import execution as gca_execution
     27 from google.cloud.aiplatform.metadata import constants

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/pipeline_jobs.py:31
     29 from google.cloud.aiplatform import utils
     30 from google.cloud.aiplatform.constants import pipeline as pipeline_constants
---> 31 from google.cloud.aiplatform.metadata import artifact
     32 from google.cloud.aiplatform.metadata import context
     33 from google.cloud.aiplatform.metadata import execution

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/metadata/artifact.py:25
     22 from google.auth import credentials as auth_credentials
     24 from google.cloud.aiplatform import base
---> 25 from google.cloud.aiplatform import models
     26 from google.cloud.aiplatform import utils
     27 from google.cloud.aiplatform.compat.types import artifact as gca_artifact

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/models.py:46
     44 from google.cloud.aiplatform import explain
     45 from google.cloud.aiplatform import initializer
---> 46 from google.cloud.aiplatform import jobs
     47 from google.cloud.aiplatform import models
     48 from google.cloud.aiplatform import utils

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/jobs.py:26
     23 import time
     25 from google.cloud import storage
---> 26 from google.cloud import bigquery
     28 from google.auth import credentials as auth_credentials
     29 from google.protobuf import duration_pb2  # type: ignore

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/__init__.py:35
     31 from google.cloud.bigquery import version as bigquery_version
     33 __version__ = bigquery_version.__version__
---> 35 from google.cloud.bigquery.client import Client
     36 from google.cloud.bigquery.dataset import AccessEntry
     37 from google.cloud.bigquery.dataset import Dataset

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/client.py:74
     72 from google.cloud.bigquery._helpers import _verify_job_config_type
     73 from google.cloud.bigquery._http import Connection
---> 74 from google.cloud.bigquery import _pandas_helpers
     75 from google.cloud.bigquery.dataset import Dataset
     76 from google.cloud.bigquery.dataset import DatasetListItem

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py:56
     52     return write(v) if notnull(v) else v
     54     return _to_wkb
---> 56 _to_wkb = _to_wkb()
     58 try:
     59     from google.cloud.bigquery_storage import ArrowSerializationOptions

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py:46, in _to_wkb()
     39 def _to_wkb():
     40     # Create a closure that:
     41     # - Adds a not-null check. This allows the returned function to
    (...)
     44     # - Caches the WKBWriter (and write method lookup :) )
     45     # - Avoids adding WKBWriter, lgeos, and notnull to the module namespace.
---> 46     from shapely.geos import WKBWriter, lgeos  # type: ignore
     48     write = WKBWriter(lgeos).write
     49     notnull = pandas.notnull

ImportError: cannot import name 'WKBWriter' from 'shapely.geos' (/opt/conda/lib/python3.9/site-packages/shapely/geos.py)
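The traceback bottoms out in google-cloud-bigquery's _pandas_helpers, which imports WKBWriter and lgeos from shapely.geos; Shapely 2.0 removed those internals, so an environment with Shapely 2.x installed breaks the whole aiplatform import chain. A minimal sketch of a workaround in a notebook cell, consistent with the pin another reviewer reports below (exact versions pulled in are environment-dependent, and the kernel restart is an assumption about the lab's JupyterLab setup, needed so the cached Shapely 2.x module is dropped):

# Pin Shapely to a 1.x release that still exposes shapely.geos.WKBWriter,
# and refresh the Vertex AI SDK alongside it.
!pip install -U google-cloud-aiplatform "shapely<2"

# Restart the kernel (Kernel > Restart Kernel in JupyterLab), then re-run:
from google.cloud import aiplatform
print(aiplatform.__version__)  # the import should now succeed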
Ülke E. · Reviewed over 1 year ago
There is a problem when importing packages: from google.cloud import aiplatform fails (even after restarting the kernel).
Omar R. V. · Reviewed over 1 year ago
Anish S. · Reviewed over 1 year ago
Had to install a couple of libraries, namely:
!pip install -U google-cloud-aiplatform "shapely<2"
!pip3 install google-cloud-pipeline-components --upgrade
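After running the installs above and restarting the kernel, the upload cell from the traceback earlier on this page should run as written. A quick sanity-check sketch (MODEL_DISPLAYNAME, BUCKET, and SERVING_CONTAINER_IMAGE_URI are the lab notebook's own variables, defined in earlier cells; this is not the lab's verbatim solution):

import shapely
print(shapely.__version__)  # expect a 1.x version after the "shapely<2" pin

from google.cloud import aiplatform
uploaded_model = aiplatform.Model.upload(
    display_name=MODEL_DISPLAYNAME,
    artifact_uri=f"gs://{BUCKET}/{MODEL_DISPLAYNAME}",
    serving_container_image_uri=SERVING_CONTAINER_IMAGE_URI,
)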
Etienne V. · Reviewed over 1 year ago
Luca F. · Reviewed over 1 year ago
Luca F. · Reviewed over 1 year ago
Lakshmidevi K. · Reviewed over 1 year ago
Kaita F. · Reviewed over 1 year ago
Ajay D. · Reviewed over 1 year ago
Yonggeun S. · Reviewed over 1 year ago
Yonggeun S. · Reviewed over 1 year ago
Useless lab. First, the dependencies don't work (I had an error importing "aiplatform"), and second, it wasn't clear what I was supposed to do. Please fix this lab or give me my 5 credits back; I need them.
LORENZO S. · Reviewed over 1 year ago
Alexander M. · Reviewed over 1 year ago
Mychael B. · Reviewed over 1 year ago
Jean Mith P. · Reviewed over 1 year ago
It took a lot of time to save the model, so I couldn't finish all the experiments in this lab.
Han S. · Reviewed over 1 year ago
Mushtaq Z. · Reviewed over 1 year ago
Reinhardt M. · Reviewed over 1 year ago
Ming H. · Reviewed over 1 year ago
We cannot guarantee that published reviews come from consumers who have purchased or used the product. Reviews are not verified by Google.