It seems that with TensorFlow 2.7, the models that no longer export to TensorFlow Lite are:

- RNN-based models with a Luong attention decoder
- Relative Transformer models
I tried fixing the error with the RNN models by changing the output shape and attention shape in the decoding while loop back to `tf.TensorShape(())`, but that didn't fix it.
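For context, shape invariants like `tf.TensorShape(())` are passed to `tf.while_loop` to tell TensorFlow which dimensions may change between iterations. Below is a minimal, hypothetical sketch of such a loop (not the actual OpenNMT-tf decoding loop): the loop counter keeps a fixed scalar shape, while the accumulated outputs grow, so their leading dimension is declared unknown.

```python
import tensorflow as tf

def decode(max_steps=5):
    """Toy decoding loop illustrating tf.while_loop shape invariants."""
    step = tf.constant(0)
    # outputs grows by one element per iteration, so its first
    # dimension must be left unknown in the shape invariants.
    outputs = tf.zeros([0], dtype=tf.float32)
    step, outputs = tf.while_loop(
        lambda step, outputs: step < max_steps,
        lambda step, outputs: (
            step + 1,
            tf.concat([outputs, [tf.cast(step, tf.float32)]], axis=0),
        ),
        loop_vars=(step, outputs),
        shape_invariants=(
            tf.TensorShape(()),      # scalar counter: shape never changes
            tf.TensorShape([None]),  # outputs: dynamic first dimension
        ),
    )
    return outputs

print(decode().numpy())  # -> [0. 1. 2. 3. 4.]
```

Declaring the wrong invariant here (e.g. a fixed shape for a growing tensor) is exactly the kind of mismatch that surfaces as a shape error during export.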
I don't really know how to fix the error with the Relative Transformer models; it's a strange error.
The good thing about TensorFlow 2.7 is that quantization now works for Transformer models.
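For reference, post-training dynamic range quantization is enabled through the TensorFlow Lite converter's `optimizations` flag. The sketch below uses a small stand-in Keras model (not an actual OpenNMT-tf Transformer) just to show the conversion path:

```python
import tensorflow as tf

# Stand-in model; the real export would start from a trained Transformer.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optimize.DEFAULT enables post-training dynamic range quantization.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()  # serialized FlatBuffer bytes

print(len(tflite_model) > 0)
```

The resulting bytes can be written to a `.tflite` file and loaded with `tf.lite.Interpreter`.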
**guillaumekln** changed the title from *TensorFlow Lite tests failing with TensorFlow 2.7* to *TensorFlow Lite tests failing with TensorFlow 2.7+* on Mar 10, 2022.
The following test cases are failing with TensorFlow 2.7 and are currently disabled:
cc @gcervantes8