I don't know how to approach this. I have one graph, g1, which is used to extract features:
class FeatureExtractor:
    def __init__(self):
        self.g1 = ...
        ...

    def _build_model(self):
        ...

    def compute_features(self, new_in):
        out = self.session.run(self.compute_out(), feed_dict={self.in_: new_in})
And I want to use this graph inside another graph, g2, like so:
class LSTM:
    def __init__(self, feature_extractor):
        self.fe = feature_extractor

    def _build_model(self):
        ...
        signal = tf.map_fn(self.fe.compute_features, signal)
but I get this error:
TypeError: The value of a feed cannot be a tf.Tensor object. Acceptable feed values include Python scalars, strings, lists, numpy ndarrays, or TensorHandles.
because compute_features expects a NumPy array (or another feedable value) to feed into g1, but tf.map_fn hands it a symbolic tf.Tensor from g2 instead.
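To make the failure mode concrete, here is a minimal sketch using plain-Python stand-ins (the class and function names are hypothetical, not TensorFlow's): session.run only accepts concrete feed values, while tf.map_fn supplies a symbolic graph tensor, so the feed check raises the TypeError above.

```python
class SymbolicTensor:
    """Stand-in for a tf.Tensor: a graph node with no concrete value yet."""


def run_with_feed(value):
    """Stand-in for session.run's feed_dict validation."""
    if isinstance(value, SymbolicTensor):
        # Mirrors TensorFlow's rejection of symbolic tensors as feeds.
        raise TypeError("The value of a feed cannot be a tf.Tensor object.")
    return [x * 2 for x in value]


# Feeding a concrete value works:
print(run_with_feed([1, 2, 3]))

# Feeding what tf.map_fn passes (a symbolic tensor) fails:
try:
    run_with_feed(SymbolicTensor())
except TypeError as err:
    print("Raised:", err)
```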
Is there any way I could do this?