
I'm training an image classifier using .fit_generator() or .fit() and passing a dictionary to the class_weight= argument.
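
Roughly, the relevant part of the setup looks like this (the model and data below are just stand-ins for my real ones):

import numpy as np
import tensorflow as tf

# toy stand-in for the actual image classifier
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
    tf.keras.layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

x_train = np.random.rand(100, 32, 32, 3).astype('float32')
y_train = np.random.randint(0, 2, size=(100,))

# passing a per-class weight dictionary is what triggers the warning
model.fit(x_train, y_train, class_weight={0: 1.0, 1: 2.0}, epochs=1)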

I never got errors in TF 1.x, but in 2.1 I get the following output when starting training:

WARNING:tensorflow:sample_weight modes were coerced from
  ...
    to  
  ['...']

What does it mean to coerce something from ... to ['...']?

The source for this warning in TensorFlow's repo is here; the comment placed there reads:

Attempt to coerce sample_weight_modes to the target structure. This implicitly depends on the fact that Model flattens outputs for its internal representation.

jorijnsmit

4 Answers


This seems like a bogus message. I get the same warning message after upgrading to TensorFlow 2.1, but I do not use any class weights or sample weights at all. I do use a generator that returns a tuple like this:

return inputs, targets

And now I just changed it to the following to make the warning go away:

return inputs, targets, [None]

I don't know if this is relevant, but my model uses 3 inputs, so my inputs variable is actually a list of 3 numpy arrays. targets is just a single numpy array.
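
As a rough sketch of what that looks like in context (the shapes below are made up; only the structure of the yielded tuple matters):

import numpy as np

def batch_generator(batch_size=32):
    # toy generator: three input arrays and one target array per batch
    while True:
        inputs = [np.random.rand(batch_size, 16),
                  np.random.rand(batch_size, 8),
                  np.random.rand(batch_size, 4)]
        targets = np.random.rand(batch_size, 1)
        # TF 2.1 workaround: yield an explicit (empty) sample-weight slot;
        # drop the third element again when moving to TF 2.2
        yield inputs, targets, [None]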

In any case, it's just a warning. The training works fine either way.

Edit for TensorFlow 2.2:

This bug seems to have been fixed in TensorFlow 2.2, which is great. However, the fix above will fail in TF 2.2, because it tries to get the shape of the sample weights, which obviously fails with AttributeError: 'NoneType' object has no attribute 'shape'. So undo the above fix when upgrading to 2.2.

jlh

I believe this is a bug in TensorFlow that occurs when you call model.compile() with the default parameter sample_weight_mode=None and then call model.fit() with sample_weight or class_weight specified.

From the TensorFlow repo:

  • fit() eventually calls _process_training_inputs()
  • _process_training_inputs() sets sample_weight_modes = [None] based on model.sample_weight_mode = None and then creates a DataAdapter with sample_weight_modes = [None]
  • the DataAdapter calls broadcast_sample_weight_modes() with sample_weight_modes = [None] during initialization
  • broadcast_sample_weight_modes() seems to expect sample_weight_modes = None but receives [None]
  • it asserts that [None] is a different structure from sample_weight / class_weight, overwrites it back to None by fitting to the structure of sample_weight / class_weight and outputs a warning

Warning aside, this has no effect on fit(), as sample_weight_modes in the DataAdapter is set back to None.

Note that the TensorFlow documentation states that sample_weight must be a NumPy array. If you call fit() with sample_weight.tolist() instead, you will not get a warning, but sample_weight is silently overwritten to None when _process_numpy_inputs() is called during preprocessing and receives an input of length greater than one.
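
As a rough sketch of that combination (the model and arrays here are invented for illustration):

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(8,))])

# sample_weight_mode is left at its default of None at compile time ...
model.compile(optimizer='adam', loss='mse')

x = np.random.rand(64, 8)
y = np.random.rand(64, 1)

# ... and per-sample weights are passed at fit() time as a NumPy array.
# Calling fit(..., sample_weight=sample_weight.tolist()) instead would avoid
# the warning, but the weights would then be silently dropped as noted above.
sample_weight = np.random.rand(64)

model.fit(x, y, sample_weight=sample_weight, epochs=1)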

Max
  • A very thorough explanation, thanks. Only thing I don't understand is that the warning describes `...` being coerced to `[...]`, whereas in your case `[None]` is coerced to `None`... – jorijnsmit Feb 19 '20 at 06:16

I have taken your Gist and installed TensorFlow 2.0 instead of TFA, and it worked without any such warning.

Here is the Gist of the complete code. The code for installing TensorFlow is shown below:

!pip install tensorflow==2.0

[Screenshot of the successful execution]

Update: This bug is fixed in Tensorflow Version 2.2.

Tensorflow Support
  • Thank you for your response. You are right, the warning message is not introduced until version `2.1.0rc0`. However, I'm afraid my question remains: "What does it mean to coerce something from `...` to `['...']`?" – jorijnsmit Dec 19 '19 at 10:41
  • I noticed that some probably unintended stuff happens when `sample_weight_mode=None` and `target_structure` is of type `dict`, `sample_weight_modes` is then `[None]` and the exception in `broadcast_sample_weight_modes` is caught due to the `dict`. Can this be considered as a bug? – Franz Knülle Dec 19 '19 at 15:18
  • @FranzKnülle could you elaborate on this in an answer? Not sure if it is the answer to my question but at least worth the bounty! – jorijnsmit Dec 21 '19 at 10:09
  • @jorijnsmit Did you find any solution to this warning? I've installed the '2.1.0' and also seeing same warning. – Snehal Jan 30 '20 at 22:28
  • Nope. Question keeps gathering views and upvotes but no answers. – jorijnsmit Jan 30 '20 at 22:29
  • I had a look and came to the same conclusion as @FranzKnülle - think it is a bug – gkennos Feb 06 '20 at 02:07
  • @gkennos: If you feel it is a bug, can you file a bug in the GitHub TensorFlow repository? – Tensorflow Support Feb 17 '20 at 05:00
  • It's definitely a bug, but it's now fixed in TensorFlow 2.2 – jlh Mar 23 '20 at 11:11

Instead of providing a dictionary

weights = {'0': 42.0, '1': 1.0}

I tried a list

weights = [42.0, 1.0]

and the warning disappeared.

0-_-0