Reviews: TFX Standard Components Walkthrough

4,445 reviews

Could not get Python imports to work.

Abdul Z. · Reviewed more than 2 years ago

It still doesn't work! Now JupyterLab won't open.

Paul M. · Reviewed more than 2 years ago

The lab is not compatible with Python 3 anymore... the lab cannot be completed.

Tamás P. · Reviewed more than 2 years ago

Dariusz Ł. · Reviewed more than 2 years ago

The code does not run and is not compatible with the latest versions.

suvojyoti c. · Reviewed more than 2 years ago

Tanai A. · Reviewed more than 2 years ago

Error on `pip install tensorflow_model_analysis`: pip falls back to building numpy from source and the compile fails:

```
gcc: build/src.linux-x86_64-3.10/numpy/core/src/multiarray/scalartypes.c
numpy/core/src/multiarray/scalartypes.c.src: In function ‘float_arrtype_hash’:
numpy/core/src/multiarray/scalartypes.c.src:2967:27: error: incompatible type for argument 1 of ‘_Py_HashDouble’
 2967 |     return _Py_HashDouble((double) PyArrayScalar_VAL(obj, @name@));
In file included from /opt/conda/include/python3.10/Python.h:77,
                 from numpy/core/src/multiarray/scalartypes.c.src:3:
/opt/conda/include/python3.10/pyhash.h:10:38: note: expected ‘PyObject *’ {aka ‘struct _object *’} but argument is of type ‘double’
   10 | PyAPI_FUNC(Py_hash_t) _Py_HashDouble(PyObject *, double);
numpy/core/src/multiarray/scalartypes.c.src:2967:12: error: too few arguments to function ‘_Py_HashDouble’
 2967 |     return _Py_HashDouble((double) PyArrayScalar_VAL(obj, @name@));
/opt/conda/include/python3.10/pyhash.h:10:23: note: declared here
   10 | PyAPI_FUNC(Py_hash_t) _Py_HashDouble(PyObject *, double);
[the same ‘incompatible type’ / ‘too few arguments’ pair repeats for cfloat_arrtype_hash, longdouble_arrtype_hash, clongdouble_arrtype_hash, and half_arrtype_hash, followed by ‘control reaches end of non-void function’ warnings]
gcc: numpy/core/src/multiarray/scalarapi.c
gcc: numpy/core/src/multiarray/vdot.c
gcc: numpy/core/src/multiarray/nditer_pywrap.c
[further ‘gcc: <source file>’ lines for the remaining numpy/core sources]
error: Command "gcc -pthread -B /opt/conda/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /opt/conda/include -fPIC -O2 -isystem /opt/conda/include -fPIC -DNPY_INTERNAL_BUILD=1 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Ibuild/src.linux-x86_64-3.10/numpy/core/src/umath -Ibuild/src.linux-x86_64-3.10/numpy/core/src/npymath -Ibuild/src.linux-x86_64-3.10/numpy/core/src/common -Inumpy/core/include -Ibuild/src.linux-x86_64-3.10/numpy/core/include/numpy -Inumpy/core/src/common -Inumpy/core/src -Inumpy/core -Inumpy/core/src/npymath -Inumpy/core/src/multiarray -Inumpy/core/src/umath -Inumpy/core/src/npysort -I/opt/conda/include/python3.10 -Ibuild/src.linux-x86_64-3.10/numpy/core/src/common -Ibuild/src.linux-x86_64-3.10/numpy/core/src/npymath -c build/src.linux-x86_64-3.10/numpy/core/src/multiarray/scalartypes.c -o build/temp.linux-x86_64-3.10/build/src.linux-x86_64-3.10/numpy/core/src/multiarray/scalartypes.o -MMD -MF build/temp.linux-x86_64-3.10/build/src.linux-x86_64-3.10/numpy/core/src/multiarray/scalartypes.o.d" failed with exit status 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
```
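
A possible workaround, offered only as a sketch and not as part of the lab: the failure above comes from pip compiling an old numpy source distribution under Python 3.10, so upgrading pip and installing a numpy release that ships prebuilt Python 3.10 wheels before `tensorflow_model_analysis` may avoid the source build, assuming the notebook environment permits installs at all.

```
# Sketch only, not from the lab instructions; assumes a Python 3.10 kernel
# and permission to install packages into it.
pip install --upgrade pip
pip install --upgrade "numpy>=1.22"   # numpy releases with prebuilt Python 3.10 wheels
pip install tensorflow_model_analysis
```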

ChengCheng T. · Reviewed more than 2 years ago

The lab does not work with the current notebook settings.

Guendalina C. · Reviewed more than 2 years ago

Tamás P. · Reviewed more than 2 years ago

Faulty instructions: there is no AI Platform, and there is no "create instance" option on the Vertex AI Pipelines page. The TensorFlow Data Validation library is not installed, and permission to install it is denied.

Arslan A. · Reviewed more than 2 years ago

Arvindh R. · Reviewed more than 2 years ago

```
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[1], line 7
      4 import time
      6 import tensorflow as tf
----> 7 import tensorflow_data_validation as tfdv
```
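
Again only a sketch, not a verified fix: installing the missing package into the notebook kernel before running the import cell may resolve this error, assuming the environment permits pip installs (other reviews report that permission is denied).

```
# Hypothetical workaround: install TFDV into the active kernel,
# then restart the kernel and re-run the import cell.
%pip install --user tensorflow_data_validation
```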

Skandarajan R. · Reviewed more than 2 years ago

Carl M. · Reviewed more than 2 years ago

Please update the lab for the latest version of TensorFlow.

Arvindh R. · Reviewed more than 2 years ago

Federico F. · Reviewed more than 2 years ago

Carl M. · Reviewed more than 2 years ago

Mahdieh K. · Reviewed more than 2 years ago

It still doesn't work! After 45 minutes I received this error: "Error encountered while creating a cluster. Please try again or select an existing cluster. Failure Reason: Error while creating the new cluster - Error: Cluster not in running state."

Paul M. · Reviewed more than 2 years ago

Eduardo J. · Reviewed more than 2 years ago

I could not create any pipeline; it gave me an error. I spent 2 hours waiting for the cluster to be created, with no success.

Niloofar M. · Reviewed more than 2 years ago

Marco M. · Reviewed more than 2 years ago

Aleksei K. · Reviewed more than 2 years ago

The Kubeflow Pipelines deployment doesn't work. Creating "cluster-1" in zone "us-central1-a" doesn't finish.

Ivan R. · Reviewed more than 2 years ago

Ivan R. · Reviewed more than 2 years ago

Anas K. · Reviewed more than 2 years ago

Google does not verify that published reviews were written by consumers who have purchased or used the product. Reviews are not verified by Google.