Questions tagged [tf.keras]

[tf.keras] is TensorFlow's implementation of the Keras API specification. Use the tag for questions specific to this TensorFlow module. You might also add the tag [keras] to your question since it has the same API.


tf.keras gives you the power and flexibility of Keras within TensorFlow (see the docs for a lot of examples). The API of tf.keras is described here.

1494 questions
60 votes, 4 answers

WARNING:tensorflow:sample_weight modes were coerced from ... to ['...']

I am training an image classifier using .fit_generator() or .fit(), passing a dictionary to class_weight= as an argument. I never got errors in TF 1.x, but in 2.1 I get the following output when starting training: WARNING:tensorflow:sample_weight modes…
jorijnsmit · 3,893
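A minimal sketch of the setup that produces this warning in TF 2.1 (toy data and model, purely illustrative):

```python
import numpy as np
import tensorflow as tf

# Toy data and model, purely illustrative.
x = np.random.rand(100, 32).astype("float32")
y = np.random.randint(0, 2, size=(100,))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Passing class_weight to fit() is the setup that, in TF 2.1, can log
# "WARNING:tensorflow:sample_weight modes were coerced from ... to ['...']".
model.fit(x, y, epochs=2, class_weight={0: 1.0, 1: 3.0})
```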
44 votes, 18 answers

How to fix "AttributeError: module 'tensorflow' has no attribute 'get_default_graph'"?

I am trying to run some code to create an LSTM model, but I get an error: AttributeError: module 'tensorflow' has no attribute 'get_default_graph'. My code is as follows: from keras.models import Sequential model = Sequential() model.add(Dense(32,…
Alice · 513
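The usual resolution is to import Keras through TensorFlow rather than the standalone keras package; a minimal sketch, assuming a simple dense model:

```python
# Importing Keras through TensorFlow instead of the standalone `keras`
# package avoids the get_default_graph() mismatch under TF 2.x.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation="relu", input_dim=100))
model.add(Dense(1, activation="sigmoid"))
model.compile(optimizer="rmsprop", loss="binary_crossentropy", metrics=["accuracy"])
```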
33 votes, 2 answers

Custom TensorFlow Keras optimizer

Suppose I want to write a custom optimizer class that conforms to the tf.keras API (using TensorFlow version>=2.0). I am confused about the documented way to do this versus what's done in implementations. The documentation for…
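A minimal sketch of a custom optimizer against the TF 2.x tf.keras.optimizers.Optimizer (OptimizerV2) hooks; the class name and hyperparameters here are illustrative, and Keras 3 uses a different API:

```python
import tensorflow as tf

class SimpleSGD(tf.keras.optimizers.Optimizer):
    """Illustrative plain-SGD optimizer built on the TF 2.x OptimizerV2 hooks."""

    def __init__(self, learning_rate=0.01, name="SimpleSGD", **kwargs):
        super().__init__(name, **kwargs)
        self._set_hyper("learning_rate", learning_rate)

    def _resource_apply_dense(self, grad, var, apply_state=None):
        lr = self._get_hyper("learning_rate", var.dtype)
        return var.assign_sub(lr * grad)

    def _resource_apply_sparse(self, grad, var, indices, apply_state=None):
        lr = self._get_hyper("learning_rate", var.dtype)
        return self._resource_scatter_add(var, indices, -lr * grad)

    def get_config(self):
        config = super().get_config()
        config.update(
            {"learning_rate": self._serialize_hyperparameter("learning_rate")})
        return config
```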
19 votes, 3 answers

Should I use @tf.function for all functions?

An official tutorial on @tf.function says: To get peak performance and to make your model deployable anywhere, use tf.function to make graphs out of your programs. Thanks to AutoGraph, a surprising amount of Python code just works with …
problemofficer · 1,502
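A common pattern, sketched under the assumption of a simple classifier: decorate only the compute-heavy inner loop (the train step) rather than every Python helper:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(32,))])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

@tf.function  # graph-compile the hot train step; leave plain helpers in Python
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        loss = loss_fn(y, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```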
19 votes, 6 answers

model.summary() can't print output shape while using subclass model

These are two methods for creating a Keras model, but the output shapes in the summary results of the two methods differ. Obviously, the former prints more information and makes it easier to check the correctness of the network. import…
Gary · 533
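One workaround, sketched with a hypothetical subclassed model: build the model (or call it on a dummy batch) before summary(). Some TF versions still report output shapes as "multiple" for subclassed models; the functional API gives the fullest summary.

```python
import tensorflow as tf

class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = tf.keras.layers.Dense(64, activation="relu")
        self.dense2 = tf.keras.layers.Dense(10)

    def call(self, inputs):
        return self.dense2(self.dense1(inputs))

model = MyModel()
# A subclassed model has no known input shape until it is built,
# so build it (or call it on a dummy batch) before summary().
model.build(input_shape=(None, 32))
model.summary()
```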
18 votes, 2 answers

Save model every 10 epochs tensorflow.keras v2

I'm using Keras as a submodule of TensorFlow v2. I'm training my model using the fit_generator() method, and I want to save my model every 10 epochs. How can I achieve this? In Keras (not as a submodule of tf), I can give…
Nagabhushan S N · 4,063
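Since ModelCheckpoint's save_freq counts batches rather than epochs, one option is a small custom callback; a sketch with a hypothetical PeriodicCheckpoint class and path template:

```python
import tensorflow as tf

class PeriodicCheckpoint(tf.keras.callbacks.Callback):
    """Hypothetical callback: save the full model every `every` epochs."""

    def __init__(self, path_template, every=10):
        super().__init__()
        self.path_template = path_template
        self.every = every

    def on_epoch_end(self, epoch, logs=None):
        if (epoch + 1) % self.every == 0:
            self.model.save(self.path_template.format(epoch=epoch + 1))

# model.fit(..., callbacks=[PeriodicCheckpoint("model_{epoch:03d}.h5", every=10)])
```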
17 votes, 1 answer

Keras - Validation Loss and Accuracy stuck at 0

I am trying to train a simple 2-layer fully connected neural net for binary classification in TensorFlow Keras. I have split my data into training and validation sets with an 80-20 split using sklearn's train_test_split(). When I call…
Animesh Sinha · 585
17 votes, 4 answers

How to graph tf.keras model in Tensorflow-2.0?

I upgraded to Tensorflow 2.0 and there is no tf.summary.FileWriter("tf_graphs", sess.graph). I was looking through some other StackOverflow questions on this and they said to use tf.compat.v1.summary etc. Surely there must be a way to graph and…
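In TF 2.x the graph is usually written through the TensorBoard callback rather than tf.summary.FileWriter(..., sess.graph); a minimal sketch, assuming a compiled model and a tf_graphs log directory:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(32,))])
model.compile(optimizer="adam", loss="mse")

# The TensorBoard callback writes the graph during fit(); view it with
# `tensorboard --logdir tf_graphs` under the Graphs tab.
tb = tf.keras.callbacks.TensorBoard(log_dir="tf_graphs", write_graph=True)
# model.fit(x, y, epochs=1, callbacks=[tb])  # x, y: your training data
```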
17 votes, 1 answer

RNN in TensorFlow vs Keras, deprecation of tf.nn.dynamic_rnn()

My question is: are tf.nn.dynamic_rnn and keras.layers.RNN(cell) truly identical, as stated in the docs? I am planning on building an RNN; however, it seems that tf.nn.dynamic_rnn is deprecated in favour of Keras. In particular, it states…
GRS · 2,303
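A rough tf.keras counterpart of tf.nn.dynamic_rnn is wrapping a cell in tf.keras.layers.RNN; a sketch with illustrative shapes:

```python
import tensorflow as tf

# Wrap a cell in tf.keras.layers.RNN, which (like tf.nn.dynamic_rnn)
# unrolls dynamically over the time dimension.
inputs = tf.keras.Input(shape=(None, 16))        # (batch, time, features)
cell = tf.keras.layers.LSTMCell(32)
outputs, state_h, state_c = tf.keras.layers.RNN(
    cell, return_sequences=True, return_state=True)(inputs)
model = tf.keras.Model(inputs, [outputs, state_h, state_c])
```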
15 votes, 4 answers

What is meant by sequential model in Keras

I have recently started working with TensorFlow for deep learning. I found the statement model = tf.keras.models.Sequential() a bit different. I couldn't understand what it actually means, and are there any other models as well for deep learning? I worked …
Aadnan Farooq A · 500
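A short illustration of the distinction, with made-up shapes: a Sequential model is a plain stack of layers with one input and one output, while the functional API covers branching, multiple inputs, and multiple outputs:

```python
import tensorflow as tf

# Sequential: a plain stack of layers, one input, one output.
sequential = tf.keras.models.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Functional API: the general alternative (branches, multiple inputs/outputs).
inp = tf.keras.Input(shape=(20,))
hidden = tf.keras.layers.Dense(64, activation="relu")(inp)
out = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)
functional = tf.keras.Model(inp, out)
```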
14 votes, 1 answer

How to perform gradient accumulation WITH distributed training in TF 2.0 / 1.14.0-eager and custom training loop (gradient tape)?

Background: I have a model and I'm trying to port it to TF 2.0 to get some sweet eager execution, but I just can't seem to figure out how to do distributed training (4 GPUs) AND perform gradient accumulation at the same time. Problem: I need to be…
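A single-device sketch of gradient accumulation with GradientTape (the distributed part, e.g. wrapping this in strategy.scope()/strategy.run(), is left out here); the model, loss, and accumulation length are illustrative:

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(32,))])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
ACCUM_STEPS = 4  # illustrative accumulation length

# One non-trainable accumulator per trainable variable.
accum = [tf.Variable(tf.zeros_like(v), trainable=False)
         for v in model.trainable_variables]

@tf.function
def accumulate(x, y):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    for a, g in zip(accum, grads):
        a.assign_add(g / ACCUM_STEPS)
    return loss

@tf.function
def apply_and_reset():
    optimizer.apply_gradients(zip(accum, model.trainable_variables))
    for a in accum:
        a.assign(tf.zeros_like(a))
```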
13 votes, 2 answers

What is the difference between MaxPool and MaxPooling layers in Keras?

I just started working with Keras and noticed that there are two layers with very similar names for max pooling: MaxPool and MaxPooling. I was surprised that I couldn't find the difference between these two on Google; so I am wondering what the…
Ken · 459
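A quick check, assuming a recent TF 2.x install: MaxPool2D appears to be an alias for MaxPooling2D (likewise for the 1D/3D and AveragePooling variants), so the two behave identically:

```python
import tensorflow as tf

# Both names resolve to the same layer class.
print(tf.keras.layers.MaxPool2D is tf.keras.layers.MaxPooling2D)  # True

layer_a = tf.keras.layers.MaxPool2D(pool_size=2)
layer_b = tf.keras.layers.MaxPooling2D(pool_size=2)
```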
13 votes, 7 answers

Keras that does not support TensorFlow 2.0. We recommend using `tf.keras`, or alternatively, downgrading to TensorFlow 1.14

I am getting an error saying (Keras that does not support TensorFlow 2.0. We recommend using tf.keras, or alternatively, downgrading to TensorFlow 1.14.). Any recommendations? Thanks. import keras #For building the Neural Network layer by…
Dean · 187
13 votes, 2 answers

Running the Tensorflow 2.0 code gives 'ValueError: tf.function-decorated function tried to create variables on non-first call'. What am I doing wrong?

error_giving_notebook, non_problematic_notebook: as can be seen, I have used the tf.function decorator in the 'error_giving_notebook' and it throws a ValueError, while the same notebook without any changes except for removing the tf.function…
Gaurav Singh · 155
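The error typically means variables (e.g. layers) are being created inside the tf.function on later calls; a sketch of the usual fix, creating them once outside:

```python
import tensorflow as tf

# Problematic pattern: the layer (and its variables) would be created inside
# the tf.function on every retrace.
#
# @tf.function
# def apply(x):
#     dense = tf.keras.layers.Dense(10)  # variables created inside -> ValueError
#     return dense(x)

# Usual fix: create layers/variables once, outside, and only use them inside.
dense = tf.keras.layers.Dense(10)

@tf.function
def apply(x):
    return dense(x)
```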
13 votes, 4 answers

from_logits=True and from_logits=False get different training result for tf.losses.CategoricalCrossentropy for UNet

I am doing an image semantic segmentation job with UNet. If I set the softmax activation for the last layer like this: ... conv9 = Conv2D(n_classes, (3,3), padding = 'same')(conv9) conv10 = (Activation('softmax'))(conv9) model = Model(inputs,…
tidy · 3,887
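The two settings just need to stay consistent with the last layer: from_logits=True when there is no softmax, from_logits=False when the layer already applies softmax; a sketch with a hypothetical n_classes:

```python
import tensorflow as tf

n_classes = 4  # hypothetical number of segmentation classes

# Option A: no activation on the last layer; the loss applies softmax itself.
head_logits = tf.keras.layers.Conv2D(n_classes, (3, 3), padding="same")
loss_a = tf.keras.losses.CategoricalCrossentropy(from_logits=True)

# Option B: softmax on the last layer; the loss receives probabilities.
head_probs = tf.keras.layers.Conv2D(n_classes, (3, 3), padding="same",
                                    activation="softmax")
loss_b = tf.keras.losses.CategoricalCrossentropy(from_logits=False)
```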