Reviews for "Using the Keras Sequential API with Vertex AI Platform"
13,866 reviews
Sobkowiak Barbara · Reviewed more than a year ago
Patil Sudhanshu · Reviewed more than a year ago
Giri Nalin · Reviewed more than a year ago
Nayak Vaibhav · Reviewed more than a year ago
There are issues with the version of the commands in the lab
malone nicholas · Reviewed more than a year ago
Rojas Becerra Luccas · Reviewed more than a year ago
from google.cloud import aiplatform — can't import this library; it throws this error:

ImportError                               Traceback (most recent call last)
Cell In[29], line 1
----> 1 from google.cloud import aiplatform
      2 uploaded_model = aiplatform.Model.upload(
      3     display_name=MODEL_DISPLAYNAME,
      4     artifact_uri=f"gs://{BUCKET}/{MODEL_DISPLAYNAME}",
      5     serving_container_image_uri=SERVING_CONTAINER_IMAGE_URI,
      6 )

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/__init__.py:24
---> 24 from google.cloud.aiplatform import initializer

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/initializer.py:34
---> 34 from google.cloud.aiplatform.metadata import metadata

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/metadata/metadata.py:25
---> 25 from google.cloud.aiplatform import pipeline_jobs

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/pipeline_jobs.py:31
---> 31 from google.cloud.aiplatform.metadata import artifact

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/metadata/artifact.py:25
---> 25 from google.cloud.aiplatform import models

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/models.py:46
---> 46 from google.cloud.aiplatform import jobs

File /opt/conda/lib/python3.9/site-packages/google/cloud/aiplatform/jobs.py:26
---> 26 from google.cloud import bigquery

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/__init__.py:35
---> 35 from google.cloud.bigquery.client import Client

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/client.py:74
---> 74 from google.cloud.bigquery import _pandas_helpers

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py:56
---> 56 _to_wkb = _to_wkb()

File /opt/conda/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py:46, in _to_wkb()
---> 46 from shapely.geos import WKBWriter, lgeos  # type: ignore

ImportError: cannot import name 'WKBWriter' from 'shapely.geos' (/opt/conda/lib/python3.9/site-packages/shapely/geos.py)
Eren Aktaş Ülke · Reviewed more than a year ago
There is a problem when importing packages: from google.cloud import aiplatform fails (even after restarting the kernel).
Valencia García Omar R. · Reviewed more than a year ago
Shah Anish · Reviewed more than a year ago
Had to install a couple of libraries, namely:
!pip install -U google-cloud-aiplatform "shapely<2"
!pip3 install google-cloud-pipeline-components --upgrade
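The "shapely<2" pin in the review above matters because shapely 2.x removed the shapely.geos module that the lab image's google-cloud-bigquery still imports (the WKBWriter error in the traceback above). A minimal pre-flight check is sketched below; the helper name needs_shapely_pin is hypothetical, not part of the lab:

```python
# Sketch: detect whether the installed shapely is too new for the lab's
# bundled google-cloud-bigquery (shapely 2.x removed shapely.geos.WKBWriter).
from importlib import metadata


def needs_shapely_pin(version: str) -> bool:
    """Return True if this shapely version is >= 2 and should be pinned below 2."""
    return int(version.split(".")[0]) >= 2


try:
    installed = metadata.version("shapely")
except metadata.PackageNotFoundError:
    installed = None  # shapely not installed; nothing to pin

if installed and needs_shapely_pin(installed):
    # The fix the reviewer used, then restart the notebook kernel:
    print('Run: pip install -U google-cloud-aiplatform "shapely<2"')
```

After downgrading, the kernel must be restarted so the already-imported shapely module is reloaded.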
VINCENT Etienne · Reviewed more than a year ago
Ferrari Fenini Luca · Reviewed more than a year ago
Ferrari Fenini Luca · Reviewed more than a year ago
Kandath Lakshmidevi · Reviewed more than a year ago
Furukawa Kaita · Reviewed more than a year ago
Dakhore Ajay · Reviewed more than a year ago
Song Yonggeun · Reviewed more than a year ago
Song Yonggeun · Reviewed more than a year ago
Useless lab. First, the dependencies don't work: I got an error importing "aiplatform". Second, it wasn't clear what I was supposed to do. Please fix this lab or give me my 5 credits back; I need them.
STORCHI LORENZO · Reviewed more than a year ago
Migunov Alexander · Reviewed more than a year ago
Brown Mychael · Reviewed more than a year ago
PHILIPPE Jean Mith · Reviewed more than a year ago
Saving the model took a long time, so I couldn't finish all the experiments in this lab.
Seyeong Han · Reviewed more than a year ago
Zahid Khan Mushtaq · Reviewed more than a year ago
Muehlhaeusser Reinhardt · Reviewed more than a year ago
Huang Ming · Reviewed more than a year ago
There is no guarantee that published reviews come from people who purchased or used the product. Google does not verify these reviews.