Fix masked losses.

Masked losses with the default "auto" reduction were giving outputs that are
inconsistent with what you would get from the equivalent ragged input. Masked and ragged tensors are two different representations of the same data (whenever the data can be represented as ragged), so the resulting loss values should match.

Before this change, the `(input_type='masked', reduction='auto')` case failed (it did not match the ragged case).

Where this change updates expected values in existing tests, it is because I believe the old values were incorrect.
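The intended equivalence can be sketched in plain NumPy (the helper names are illustrative, and the per-example-mean-then-batch-mean reduction shown here is one plausible reading of the "auto" behavior, not the exact Keras internals):

```python
import numpy as np

def masked_mse(y_true, y_pred, mask):
    # Per-example mean squared error over the valid (unmasked) steps,
    # then a mean over the batch. Masked entries contribute nothing.
    per_example = []
    for t, p, m in zip(y_true, y_pred, mask):
        valid = m.astype(bool)
        per_example.append(np.mean((t[valid] - p[valid]) ** 2))
    return float(np.mean(per_example))

def ragged_mse(y_true_rows, y_pred_rows):
    # The same reduction applied directly to the ragged representation:
    # each row is its own variable-length example.
    per_example = [np.mean((np.asarray(t) - np.asarray(p)) ** 2)
                   for t, p in zip(y_true_rows, y_pred_rows)]
    return float(np.mean(per_example))

# Padded/masked view of a ragged batch: row 0 has 2 valid steps, row 1 has 3.
y_true = np.array([[1.0, 2.0, 0.0], [1.0, 2.0, 3.0]])
y_pred = np.array([[1.5, 2.5, 9.0], [0.0, 2.0, 3.0]])
mask   = np.array([[1, 1, 0], [1, 1, 1]])

# The same batch as explicit ragged rows.
ragged_true = [[1.0, 2.0], [1.0, 2.0, 3.0]]
ragged_pred = [[1.5, 2.5], [0.0, 2.0, 3.0]]

# The two representations describe the same data, so the losses agree.
assert masked_mse(y_true, y_pred, mask) == ragged_mse(ragged_true, ragged_pred)
```

If the mask were silently dropped, the padded garbage value `9.0` in row 0 would leak into the masked result and the two paths would diverge, which is the bug this change fixes.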

PiperOrigin-RevId: 493003876
Mark Daoust 2022-12-05 07:39:19 -08:00 committed by TensorFlower Gardener
parent 1657534078
commit 6543868155


@@ -14,14 +14,28 @@
 * Using functools.wraps on a function with different signature
 * Using functools.partial with an invalid tf.function input
-* `tfconfig.experimental.enable_mlir_graph_optimization`:
+* `tf.config.experimental.enable_mlir_graph_optimization`:
 * Experimental API removed.
-* `tfconfig.experimental.disable_mlir_graph_optimization`:
+* `tf.config.experimental.disable_mlir_graph_optimization`:
 * Experimental API removed.
+* `tf.keras`
+    * Improvements and fixes in Keras loss masking:
+        * Whether you represent a ragged tensor as a `tf.RaggedTensor` or using
+          [keras masking](https://www.tensorflow.org/guide/keras/masking_and_padding),
+          the returned loss values should be identical to each other.
+          In previous versions Keras may have silently ignored the mask.
+    * If you use masked losses with Keras the loss values may be different
+      in TensorFlow `2.12` compared to previous versions.
+        * In cases where the mask was previously ignored, you will now get
+          an error if you pass a mask with an incompatible shape.
 # Known Caveats
 * <CAVEATS REGARDING THE RELEASE (BUT NOT BREAKING CHANGES).>