Reviews for "Build a DNN using the Keras Functional API"
10,215 reviews
Christopher D. · Reviewed 8 months ago
Martin L. · Reviewed 8 months ago
Odar M. · Reviewed 8 months ago
Harsh B. · Reviewed 8 months ago
Issues with defining features:

def features_and_labels(row_data):
    for unwanted_col in ['pickup_datetime', 'key']:
        row_data.pop(unwanted_col)
    label = row_data.pop(LABEL_COLUMN)
    return row_data, label  # features, label

# load the training data
def load_dataset(pattern, batch_size=1, mode=tf.estimator.ModeKeys.EVAL):
    dataset = (tf.data.experimental.make_csv_dataset(pattern, batch_size, CSV_COLUMNS, DEFAULTS)
               .map(features_and_labels))  # features, label
    if mode == tf.estimator.ModeKeys.TRAIN:
        dataset = dataset.shuffle(1000).repeat()
    dataset = dataset.prefetch(1)  # take advantage of multi-threading; 1=AUTOTUNE
    return dataset

Running this raises:

AttributeError                            Traceback (most recent call last)
Cell In[11], line 8
----> 8 def load_dataset(pattern, batch_size=1, mode=tf.estimator.ModeKeys.EVAL):
AttributeError: module 'tensorflow' has no attribute 'estimator'

Per Gemini, this worked: the tf.estimator module is outdated and no longer the recommended way to build models in TensorFlow. It has largely been superseded by Keras, which provides a more user-friendly and flexible API for building and training models. Here is how to update the code so it no longer depends on tf.estimator:

# ... (the features_and_labels function remains the same)

# Load the training data using tf.data.Dataset
def load_dataset(pattern, batch_size=1, mode='eval'):  # use plain strings for mode
    dataset = (tf.data.experimental.make_csv_dataset(pattern, batch_size, CSV_COLUMNS, DEFAULTS)
               .map(features_and_labels))  # features, label
    if mode == 'train':
        dataset = dataset.shuffle(1000).repeat()
    dataset = dataset.prefetch(1)  # take advantage of multi-threading; 1=AUTOTUNE
    return dataset

mode parameter: instead of tf.estimator.ModeKeys, plain strings such as 'train' and 'eval' represent the mode.
No tf.estimator: the dependency on tf.estimator is removed altogether.
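For context, here is a minimal end-to-end sketch of how a string-mode load_dataset like the one above could feed a DNN built with the Keras Functional API, the subject of this lab. The column names, defaults, file patterns, and layer sizes are illustrative assumptions rather than the lab's exact values, and an expand_dims step is added in the map so each CSV column matches an Input(shape=(1,)).

import tensorflow as tf

# Assumed taxi-fare-style schema; adjust to the actual CSV files
CSV_COLUMNS = ['fare_amount', 'pickup_datetime', 'pickup_longitude', 'pickup_latitude',
               'dropoff_longitude', 'dropoff_latitude', 'passenger_count', 'key']
LABEL_COLUMN = 'fare_amount'
DEFAULTS = [[0.0], ['na'], [0.0], [0.0], [0.0], [0.0], [0.0], ['na']]
INPUT_COLS = ['pickup_longitude', 'pickup_latitude',
              'dropoff_longitude', 'dropoff_latitude', 'passenger_count']

def features_and_labels(row_data):
    # drop columns the model should not see, then split off the label
    for unwanted_col in ['pickup_datetime', 'key']:
        row_data.pop(unwanted_col)
    label = row_data.pop(LABEL_COLUMN)
    # make_csv_dataset yields tensors of shape (batch,); add a feature axis so
    # each column matches the Input(shape=(1,)) layers defined below
    features = {name: tf.expand_dims(value, axis=-1) for name, value in row_data.items()}
    return features, label

def load_dataset(pattern, batch_size=32, mode='eval'):
    dataset = (tf.data.experimental.make_csv_dataset(pattern, batch_size, CSV_COLUMNS, DEFAULTS)
               .map(features_and_labels))
    if mode == 'train':
        dataset = dataset.shuffle(1000).repeat()
    return dataset.prefetch(1)

# Functional API: one Input per feature, concatenated into a small DNN regressor
inputs = {col: tf.keras.layers.Input(name=col, shape=(1,), dtype='float32')
          for col in INPUT_COLS}
x = tf.keras.layers.Concatenate()(list(inputs.values()))
x = tf.keras.layers.Dense(32, activation='relu')(x)
x = tf.keras.layers.Dense(8, activation='relu')(x)
output = tf.keras.layers.Dense(1, name='fare')(x)

model = tf.keras.Model(inputs=inputs, outputs=output)
model.compile(optimizer='adam', loss='mse',
              metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')])

# Hypothetical file patterns; point these at wherever the lab's CSVs live
# train_ds = load_dataset('./taxi-train*.csv', batch_size=32, mode='train')
# eval_ds  = load_dataset('./taxi-valid*.csv', batch_size=32, mode='eval')
# model.fit(train_ds, validation_data=eval_ds, epochs=5,
#           steps_per_epoch=100, validation_steps=10)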
Rod M. · Reviewed 8 months ago
Great Lab!
Francisco A. · Reviewed 8 months ago
Ioana B. · Reviewed 8 months ago
Thomas N. · Reviewed 8 months ago
David L. · Reviewed 8 months ago
Ronny d. · Reviewed 8 months ago
Anderson I. · Reviewed 8 months ago
good
Nikhitha K. · Reviewed 8 months ago
David O. · Reviewed 8 months ago
敬源 黃. · Reviewed 8 months ago
Harry M. · Reviewed 8 months ago
Sakshi Nagare .. · Reviewed 8 months ago
ok
Raul H. · Reviewed 8 months ago
Víctor P. · Reviewed 8 months ago
Paulo C. · Reviewed 8 months ago
Valeria A. · Reviewed 8 months ago
Pablo J. · Reviewed 8 months ago
Saulo R. · Reviewed 8 months ago
Xiomara O. · Reviewed 8 months ago
Paulo C. · Reviewed 8 months ago
We cannot guarantee that published reviews come from consumers who have purchased or used the product. Reviews are not verified by Google.