Batch norm layers not updated

I have noticed a really strange behaviour that appears to be a bug. Here is a Livebook that demonstrates it: debugging-batch-norm
The state of the batch_norm layers (mean and var) is never updated! The mean stays all 0 and the variance all 1, but only when the layer is on the "right path" of the network.
Example:
In this configuration, the batch_norm state is updated correctly:
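The original Livebook snippet is not reproduced here, so the following is a minimal sketch of such a working configuration (input name, shapes, and units are illustrative, not taken from the issue). A batch_norm placed in a single linear chain of layers has its running statistics updated during training:

```elixir
# A plain sequential model: batch_norm sits on the only path
# through the network, so its running mean/var get updated.
model =
  Axon.input("x", shape: {nil, 8})
  |> Axon.dense(4)
  |> Axon.batch_norm()
  |> Axon.dense(1)

data =
  Stream.repeatedly(fn ->
    {%{"x" => Nx.iota({2, 8}, type: :f32)}, Nx.iota({2, 1}, type: :f32)}
  end)
  |> Stream.take(5)

trained =
  model
  |> Axon.Loop.trainer(:mean_squared_error, :sgd)
  |> Axon.Loop.run(data, %{}, epochs: 1)

# After training, the "batch_norm_0" entry of the model state
# should contain a non-zero "mean" and non-one "var".
IO.inspect(trained)
```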
Now let's build an example where it's not:
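Again, the exact code from the Livebook is not shown here; a hedged sketch of the failing shape (names are illustrative) would be a model that combines two branches, with the batch_norm on the second ("right") argument of the combining layer:

```elixir
input = Axon.input("x", shape: {nil, 8})

# Left branch: plain dense layer.
left = Axon.dense(input, 4)

# Right branch: dense followed by batch_norm.
right =
  input
  |> Axon.dense(4)
  |> Axon.batch_norm()

# batch_norm is on the *right* path of Axon.add/2 — in this
# configuration its state (mean/var) stays at all 0 / all 1
# after training, which is the bug being reported.
model = Axon.add(left, right)
```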
Here the batch_norm_0 state is neither returned nor updated.
But when the batch_norm is "on the left of the network", it works:
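As above, this is a sketch rather than the Livebook's exact code: swapping the branches so the batch_norm feeds the first ("left") argument of the combining layer makes the state update normally, which suggests the bug is specific to how state is collected from the later branch:

```elixir
input = Axon.input("x", shape: {nil, 8})

# Now the batch_norm branch is the *left* argument of Axon.add/2.
left =
  input
  |> Axon.dense(4)
  |> Axon.batch_norm()

right = Axon.dense(input, 4)

# In this configuration the batch_norm_0 mean/var are updated
# during training as expected.
model = Axon.add(left, right)
```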
I would be happy to help fix this, but I'm not yet very familiar with the internals of Axon and have limited time 😊