
Objectosphere is computed on the softmax prediction, but not on the embedding layer #4

Open
siebenkopf opened this issue Jul 13, 2020 · 1 comment
Labels
enhancement New feature or request

Comments

@siebenkopf
Member

If I understand the loss function:

pred=K.sqrt(K.sum(K.square(y_pred),axis=1))

correctly, the inputs to your loss function are the one-hot-encoded target values (y_true) and the estimated softmax probabilities (y_pred). From this y_pred, you compute the magnitude pred for each sample. However, this magnitude should be computed from the output of the layer before the logits (the deep feature embedding), not from the softmax probabilities.

This might be the reason why there is very little difference between the Entropic Open-Set loss and the Objectosphere loss.
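For reference, a minimal NumPy sketch of the ring-loss term as described in the Objectosphere paper, computed on the deep features rather than the softmax output (the function name, the `is_known` flag, and the margin `xi` are illustrative, not taken from this repository):

```python
import numpy as np

def objectosphere_ring_loss(features, is_known, xi=10.0):
    """Ring-loss term computed on deep features (layer before the logits).

    features: (N, D) array of embeddings
    is_known: (N,) boolean array, True for samples of known classes
    xi:       target feature-magnitude margin (hyperparameter)
    """
    # Magnitude of each embedding, analogous to
    # pred = K.sqrt(K.sum(K.square(...), axis=1)) in the Keras code
    mag = np.sqrt(np.sum(np.square(features), axis=1))
    # Known samples: penalize magnitudes below the margin xi
    known_term = np.square(np.maximum(xi - mag, 0.0))
    # Unknown samples: drive the magnitude toward zero
    unknown_term = np.square(mag)
    return np.where(is_known, known_term, unknown_term).mean()
```

If this were instead fed the softmax probabilities, `mag` would be bounded near 1 for every sample, which would make the term nearly constant and explain the small observed difference from the Entropic Open-Set loss.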

@siebenkopf siebenkopf added the bug Something isn't working label Jul 13, 2020
@siebenkopf
Member Author

Sorry, I should always try to read the whole thing before writing a bug report. In

loss={'softmax': 'categorical_crossentropy','fc':ring_loss},

you clearly apply this loss to the fc layer, as expected.

You should consider renaming the y_pred parameter of the loss function to fc or similar to avoid confusion.
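The suggested rename could look like the sketch below (Keras binds the two loss arguments positionally, so renaming the parameter does not change behavior; NumPy stands in for the Keras backend ops here, and the body is reduced to just the magnitude computation quoted above):

```python
import numpy as np

def ring_loss(y_true, fc):
    # 'fc' renamed from 'y_pred': this argument receives the fc-layer
    # embedding, not softmax probabilities.
    mag = np.sqrt(np.sum(np.square(fc), axis=1))
    return np.mean(mag)
```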

@siebenkopf siebenkopf added enhancement New feature or request and removed bug Something isn't working labels Jul 13, 2020