
I don't know how to approach this. I have one graph, g1, which is used to extract features.

class FeatureExtractor():
    def __init__(self):
        self.g1 = ...  # the feature-extraction graph
        ...
    def _build_model(self):
        ...
    def compute_features(self, new_in):
        return self.session.run(self.compute_out(), feed_dict={self.in_: new_in})

And I want to use this graph inside another graph g2 like so:

class LSTM():
    def __init__(self, feature_extractor):
        self.fe = feature_extractor
    def _build_model(self):
        ...
        signal = tf.map_fn(self.fe.compute_features, signal)

but I get this error:

TypeError: The value of a feed cannot be a tf.Tensor object. Acceptable feed values include Python scalars, strings, lists, numpy ndarrays, or TensorHandles.

because compute_features expects a NumPy array (or something similar) that it can feed into g1, but tf.map_fn passes it a symbolic tensor.

Is there any way I could do this?

Isaac

1 Answer


Tensors are symbolic objects, so they cannot be provided in a feed_dict. Evaluate the tensor first with sess.run(tensor_object) and feed the resulting value instead.

This is discussed in more detail here:

https://github.com/tensorflow/tensorflow/issues/3389
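A minimal sketch of what the answer suggests, using the TF1-compat API (the graphs, placeholders, and toy ops here are illustrative stand-ins, not the asker's actual models): run the first graph's session to materialize the features as a NumPy array, then feed that array into the second graph.

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graphs and sessions
tf.disable_eager_execution()

# Stand-in for g1, the feature-extraction graph
g1 = tf.Graph()
with g1.as_default():
    in_ = tf.placeholder(tf.float32, shape=[None, 4])
    feats = in_ * 2.0  # placeholder for the real feature computation
sess1 = tf.Session(graph=g1)

# Run g1 first: sess.run returns a NumPy array,
# which IS an acceptable feed value
raw = np.ones((3, 4), dtype=np.float32)
features = sess1.run(feats, feed_dict={in_: raw})

# Stand-in for g2, which consumes the materialized features
# rather than a symbolic tensor from another graph
g2 = tf.Graph()
with g2.as_default():
    feat_in = tf.placeholder(tf.float32, shape=[None, 4])
    out = tf.reduce_sum(feat_in)
sess2 = tf.Session(graph=g2)
result = sess2.run(out, feed_dict={feat_in: features})
```

This works when the feature extraction can happen before building g2; it does not help if, as in the question, compute_features must be called from inside g2's graph construction (e.g. inside tf.map_fn).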

Jiss Raphel
  • Thanks for the answer! I tried that too, but unfortunately I cannot run a session inside a graph, because there are some tensors inside that graph that need to be initialized – Isaac Apr 03 '18 at 18:04