If I understand the loss function in

Reducing-Network-Agnostophobia/MNIST/Mnist_Training.py
Line 69 in 581b3f0

correctly, the inputs to your loss function are the one-hot-encoded target values (`y_true`) and the estimated softmax probabilities (`y_pred`). From this `y_pred`, you compute the magnitude `pred` for each sample. However, this magnitude should be computed from the output of the layer before the logits, not from the softmax probabilities.

This might be the reason why there is very little difference between the Entropic Open-Set loss and the Objectosphere loss.
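For illustration, here is a minimal sketch of how the magnitude term could be computed from the penultimate features rather than from the softmax probabilities. This is not the repository's code: the function name `objectosphere_magnitude_loss`, the margin `xi`, the weight `lam`, and the assumption that unknown/background samples are encoded as an all-zero one-hot row are my own choices for the example.

```python
import tensorflow as tf

# Illustrative values only; the paper tunes these per dataset.
xi = 50.0   # target feature magnitude for known samples (assumed)
lam = 0.01  # weight of the magnitude term (assumed)

def objectosphere_magnitude_loss(y_true, features):
    """Magnitude penalty of the Objectosphere loss (sketch).

    y_true:   one-hot targets; unknown samples assumed to be an all-zero row.
    features: output of the layer *before* the logits, shape (batch, dim).
    """
    y_true = tf.cast(y_true, features.dtype)
    feat_norm = tf.norm(features, axis=1)                       # ||F(x)||
    is_known = tf.cast(tf.reduce_max(y_true, axis=1) > 0.5, features.dtype)
    # Known samples: push the feature magnitude above the margin xi.
    known_term = tf.square(tf.maximum(xi - feat_norm, 0.0))
    # Unknown samples: push the feature magnitude towards zero.
    unknown_term = tf.square(feat_norm)
    return lam * tf.reduce_mean(is_known * known_term
                                + (1.0 - is_known) * unknown_term)
```

With something like this, the Entropic Open-Set (cross-entropy) part would still act on `y_pred`, while the magnitude term acts on the deep features, which would require the model to expose the penultimate layer as a second output.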