Applying Contextual Bandits for Recommendations with Tensorflow and TF-Agents reviews
2539 reviews
Jakub L. · Reviewed more than a year ago
Could not get this one to work; many problems with the libraries and unclear directions.
Mika K. · Reviewed more than a year ago
Julius L. · Reviewed more than a year ago
Gonzalo L. · Reviewed more than a year ago
Wafin S. · Reviewed more than a year ago
Rajesh S. · Reviewed more than a year ago
Cannot load all the libraries ((
Denis K. · Reviewed more than a year ago
Alvaro R. · Reviewed more than a year ago
Shibu P. · Reviewed more than a year ago
ankit g. · Reviewed more than a year ago
Followed all the instructions. During the initial setup, running the commands

!pip install --user --quiet --upgrade --force-reinstall tensorflow tensorflow_probability tensorflow-io
!pip install tf_agents --quiet gast --upgrade

produced a series of pip warnings that the scripts wheel, pygmentize, f2py, normalizer, markdown-it, markdown_py, tensorboard, import_pb_to_tensorboard, saved_model_cli, tf_upgrade_v2, tflite_convert, toco, and toco_from_protos were installed in '/home/jupyter/.local/bin', which is not on PATH, followed by these errors:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
apache-beam 2.46.0 requires cloudpickle~=2.2.1, but you have cloudpickle 3.0.0 which is incompatible.
apache-beam 2.46.0 requires numpy<1.25.0,>=1.14.3, but you have numpy 1.26.4 which is incompatible.
apache-beam 2.46.0 requires protobuf<4,>3.12.2, but you have protobuf 4.25.3 which is incompatible.
google-api-core 1.34.1 requires protobuf!=3.20.0,!=3.20.1,!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<4.0.0dev,>=3.19.5, but you have protobuf 4.25.3 which is incompatible.
google-cloud-bigtable 1.7.3 requires protobuf<4.0.0dev, but you have protobuf 4.25.3 which is incompatible.
google-cloud-datastore 1.15.5 requires protobuf<4.0.0dev, but you have protobuf 4.25.3 which is incompatible.
google-cloud-language 1.3.2 requires protobuf<4.0.0dev, but you have protobuf 4.25.3 which is incompatible.
google-cloud-videointelligence 1.16.3 requires protobuf<4.0.0dev, but you have protobuf 4.25.3 which is incompatible.
numba 0.56.4 requires numpy<1.24,>=1.18, but you have numpy 1.26.4 which is incompatible.
tensorboardx 2.6 requires protobuf<4,>=3.8.0, but you have protobuf 4.25.3 which is incompatible.
tensorflow-metadata 0.14.0 requires protobuf<4,>=3.7, but you have protobuf 4.25.3 which is incompatible.
tensorflow-transform 0.14.0 requires absl-py<2,>=0.7, but you have absl-py 2.1.0 which is incompatible.
tensorflow-transform 0.14.0 requires protobuf<4,>=3.7, but you have protobuf 4.25.3 which is incompatible.
ydata-profiling 4.5.1 requires numpy<1.24,>=1.16.0, but you have numpy 1.26.4 which is incompatible.

The second install command prints the same resolver error again, listing the apache-beam, tensorflow-transform, and ydata-profiling conflicts a second time.

I restarted the kernel as the next code block suggested, and the first import cell also fails. After the usual startup notices (CUDA drivers not found so the GPU will not be used, a note that the binary is CPU-optimized for AVX2/FMA, and a TF-TRT warning that TensorRT could not be found), line 7 of the cell,

from tf_agents.bandits.agents import dropout_thompson_sampling_agent as dropout_ts_agent

raises an AttributeError. The traceback walks through tf_agents/__init__.py, tf_agents/agents/__init__.py, the behavioral cloning agent (which imports tensorflow_probability), then tensorflow_probability's substrates, experimental/bayesopt, and distributions packages, and finally stops in tensorflow_probability/python/layers/distribution_layer.py, where line 68 executes

tf.keras.__internal__.utils.register_symbolic_tensor_type(dtc._TensorCoercible)

and fails with:

AttributeError: module 'keras._tf_keras.keras' has no attribute '__internal__'

So basically the lab is not runnable, even though I haven't had any chance to insert any inputs of my own, so I guess there's some issue on a notebook/compatibility level.
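The AttributeError quoted above is the failure mode seen when tensorflow_probability code written for Keras 2 runs against the Keras 3 that newer TensorFlow builds expose as tf.keras. Below is a minimal sketch of a possible first notebook cell, assuming the root cause is that Keras 3 incompatibility: it installs the legacy tf-keras package and sets TF_USE_LEGACY_KERAS before TensorFlow is imported. The package list, the absence of version pins, and the single-cell layout are illustrative assumptions, not the lab's official setup.

# Sketch of a workaround cell, assuming the Keras 3 incompatibility is the root cause.
# Run the install line first, restart the kernel, then run the remaining lines.
!pip install --user --quiet tf-keras tensorflow_probability tf_agents

import os
os.environ["TF_USE_LEGACY_KERAS"] = "1"  # must be set before the first TensorFlow import

import tensorflow as tf
import tensorflow_probability as tfp
from tf_agents.bandits.agents import lin_ucb_agent  # one of the imports that failed in the review

print(tf.__version__, tfp.__version__)  # sanity check that the imports now succeed

If this works, the agent imports from the reviewer's cell (dropout_thompson_sampling_agent, lin_ucb_agent, linear_thompson_sampling_agent) should load without touching the rest of the notebook.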
Atanas V. · Reviewed more than a year ago
Harshit P. · Reviewed more than a year ago
Harshit P. · Reviewed more than a year ago
Matt S. · Reviewed more than a year ago
Mohammad S. · Reviewed more than a year ago
Sharan S. · Reviewed more than a year ago
Harold M. · Reviewed more than a year ago
2303C 5. · Reviewed more than a year ago
Aryan s. · Reviewed more than a year ago
Dependency issues
Riccardo F. · Reviewed more than a year ago
Quang V. · Reviewed more than a year ago
Quang V. · Reviewed more than a year ago
James C. · Reviewed more than a year ago
Guilherme M. · Reviewed more than a year ago
Carlos Felipe M. · Reviewed more than a year ago
We cannot certify that published reviews come from consumers who have purchased or used the products. Reviews are not verified by Google.