I have a class representing a model that is set up as follows:
class Model:
    def __init__(self):
        self.setup_graph()

    def setup_graph(self):
        # sets up the model
        ...

    def train(self, dataset):
        # dataset is a tf.data.Dataset iterator, from which I can get
        # tf.Tensor objects directly, which become part of the graph
        ...

    def predict(self, sample):
        # sample is a single NumPy array representing one sample,
        # which could be fed to a tf.placeholder using feed_dict
        ...
During training I want to take advantage of the efficiency of TensorFlow's tf.data.Dataset, but I still want to be able to get the model's output for a single sample. It seems to me that this requires recreating the graph for prediction. Is that true, or can I build a single TF graph that can be run either with input coming from a tf.data.Dataset or with a single sample I feed to a tf.placeholder?
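
To make the question concrete, here is roughly the kind of single-graph setup I have in mind: a minimal sketch in TF 1.x style using a feedable iterator (tf.data.Iterator.from_string_handle), where build_network and all of the shapes and names are made up by me for illustration. I don't know whether something like this is actually the intended way to do it:

import numpy as np
import tensorflow as tf

def build_network(x):
    # stand-in for the real model
    return tf.layers.dense(x, 1)

# training input pipeline (dummy data for illustration)
train_dataset = tf.data.Dataset.from_tensor_slices(
    np.random.rand(100, 3).astype(np.float32)).batch(8)

# prediction input: a placeholder wrapped in a one-element dataset
sample_ph = tf.placeholder(tf.float32, shape=[None, 3])
predict_dataset = tf.data.Dataset.from_tensors(sample_ph)

# feedable iterator: which pipeline feeds the graph is chosen per sess.run
handle = tf.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle, train_dataset.output_types, train_dataset.output_shapes)
features = iterator.get_next()
output = build_network(features)

train_iter = train_dataset.make_one_shot_iterator()
predict_iter = predict_dataset.make_initializable_iterator()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    train_handle = sess.run(train_iter.string_handle())
    predict_handle = sess.run(predict_iter.string_handle())

    # training-style run: input comes from the dataset
    print(sess.run(output, feed_dict={handle: train_handle}))

    # prediction-style run: input comes from the fed sample
    sample = np.random.rand(1, 3).astype(np.float32)
    sess.run(predict_iter.initializer, feed_dict={sample_ph: sample})
    print(sess.run(output, feed_dict={handle: predict_handle}))

Here the training run and the single-sample run share the same output tensor; only the iterator handle (and, for prediction, the initializer feed) changes. Is something along these lines the right approach, or is rebuilding the graph for prediction unavoidable?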