Configuring, Using, and Auditing VM Service Accounts and Scopes: Reviews

Apply your skills in the Google Cloud console


15,079 reviews

I was getting stuck on the last step. The service account had all required access, but I could not verify. The boot disk required in the lab was not available.

Zack V. · Reviewed about 1 year ago

Issue with student-01-e50d83278aae@bigquery-instance:~$ python3 query.py

A module that was compiled using NumPy 1.x cannot be run in NumPy 2.0.0 as it may crash. To support both 1.x and 2.x versions of NumPy, modules must be compiled with NumPy 2.0. Some module may need to rebuild instead e.g. with 'pybind11>=2.12'. If you are a user of the module, the easiest solution will be to downgrade to 'numpy<2' or try to upgrade the affected module. We expect that some modules will need time to support NumPy 2.

Bharatkumar P. · Reviewed about 1 year ago

I have a problem with "Please make a request to BigQuery public dataset with a 'bigquery-qwiklab' service account."

Edison D. · Reviewed about 1 year ago

Nataliia R. · Reviewed about 1 year ago

Dad p. · Reviewed about 1 year ago

Sai S. · Reviewed about 1 year ago

HAYYYSST

Jerald C. · Reviewed about 1 year ago

Zaki Zarkasih Al M. · Reviewed about 1 year ago

Jonas C. · Reviewed about 1 year ago

This lab is HORRENDOUS. The instructions are TRASH.

First, it says to use Debian 10. This is not even an option in GCP anymore; only 11 and 12 are.

The Python setup instructions are AWFUL. They could not actually have been tested. Installing all these packages system-wide as root is terrible practice; the instructions should have the user install venv and set up a virtual environment. Newer systems will refuse to install modules at the system level using pip, so these commands won't even work. For simplicity's sake, all of the pip install commands can be run on a single line. You don't need to make people copy/paste ten different lines.

And then, the worst part: this code won't even run, due to incompatibilities between libraries. NumPy < 2.0 is required but won't be installed by these instructions; it has to be specified manually using pip.

How do you expect people to learn when you provide such garbage instructions? These need to be updated. They clearly haven't been touched in years.

Chris M. · Reviewed about 1 year ago

Giovanni D. · Reviewed about 1 year ago

Debian 10 no longer exists; there is only Debian 11 or 12 to choose from.

Konrad S. · Reviewed about 1 year ago

pip3 commands not working

Jasim A. · Reviewed about 1 year ago

Tom K. · Reviewed about 1 year ago

The pip commands are not working as-is; you need to use sudo apt install python3.11-venv && python3 -m venv my-venv and then use my-venv/bin/pip install.
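The workaround this reviewer describes can be sketched end to end as follows. The environment name my-venv comes from the review itself; the package list and the numpy<2 pin are assumptions drawn from other reviews on this page, not from the lab instructions:

```shell
# Create an isolated virtual environment instead of installing system-wide,
# since newer Debian images refuse system-level pip installs
sudo apt install -y python3.11-venv
python3 -m venv my-venv

# Install the lab's dependencies inside the venv; pinning numpy below 2.0
# avoids the pyarrow/db_dtypes incompatibility other reviewers report
my-venv/bin/pip install "numpy<2" google-cloud-bigquery pandas pyarrow db-dtypes

# Run the lab script with the venv's interpreter
my-venv/bin/python3 query.py
```

Because everything lives under my-venv, this also sidesteps the root/system-wide installs the longer review above objects to.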

Aleksey B. · Reviewed about 1 year ago

Liane A. · Reviewed about 1 year ago

Eñaut Z. · Reviewed about 1 year ago

Can't choose a Debian 10 image, and many errors like this:

A module that was compiled using NumPy 1.x cannot be run in NumPy 2.0.0 as it may crash. To support both 1.x and 2.x versions of NumPy, modules must be compiled with NumPy 2.0. Some module may need to rebuild instead e.g. with 'pybind11>=2.12'. If you are a user of the module, the easiest solution will be to downgrade to 'numpy<2' or try to upgrade the affected module. We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/student-02-af562455dd2a/query.py", line 3, in <module>
    from google.cloud import bigquery
  File "/usr/local/lib/python3.9/dist-packages/google/cloud/bigquery/__init__.py", line 35, in <module>
    from google.cloud.bigquery.client import Client
  File "/usr/local/lib/python3.9/dist-packages/google/cloud/bigquery/_pandas_helpers.py", line 42, in <module>
    import db_dtypes  # type: ignore
  File "/usr/local/lib/python3.9/dist-packages/db_dtypes/__init__.py", line 27, in <module>
    import pyarrow
  File "/usr/local/lib/python3.9/dist-packages/pyarrow/__init__.py", line 63, in <module>
    import pyarrow.lib as _lib
AttributeError: _ARRAY_API not found
  File "pyarrow/lib.pyx", line 35, in init pyarrow.lib
ImportError: numpy.core.multiarray failed to import

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/student-02-af562455dd2a/query.py", line 23, in <module>
    print(client.query(query).to_dataframe())
  File "/usr/local/lib/python3.9/dist-packages/google/cloud/bigquery/table.py", line 2287, in to_dataframe
    _pandas_helpers.verify_pandas_imports()
  File "/usr/local/lib/python3.9/dist-packages/google/cloud/bigquery/_pandas_helpers.py", line 1024, in verify_pandas_imports
    raise ValueError(_NO_DB_TYPES_ERROR) from db_dtypes_import_exception
ValueError: Please install the 'db-dtypes' package to use this function
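Reading the trace above: pyarrow and db_dtypes were compiled against NumPy 1.x but are being loaded under NumPy 2.0 (the _ARRAY_API and numpy.core.multiarray errors), and the final ValueError reports that google-cloud-bigquery cannot use its db-dtypes integration. A minimal sketch of a fix, assuming pip3 installs are permitted on the lab VM:

```shell
# Downgrade NumPy below 2.0 so the prebuilt pyarrow/db_dtypes wheels can load,
# and make sure the db-dtypes package that to_dataframe() needs is installed
pip3 install "numpy<2" db-dtypes

# Re-run the lab script once compatible versions are in place
python3 query.py
```

On images where system-level pip installs are refused, the same pinned install would go inside a virtual environment instead, as another reviewer suggests.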

Lassina D. · Reviewed about 1 year ago

VIJAY K. · Reviewed about 1 year ago

Amira I. · Reviewed about 1 year ago

There is no Debian 10 boot image.

Bala s. · Reviewed about 1 year ago

Jason R. · Reviewed about 1 year ago

Deepesh W. · Reviewed about 1 year ago

COREY W. · Reviewed about 1 year ago

We do not guarantee that published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.