
Is CEBRA invariant to permutations in the input channels? #182

Answered by stes
icarosadero asked this question in Q&A

Hi @icarosadero, you would not get an identical embedding, but one with very high consistency. So yes, the ordering of the input channels does not matter.

This applies only if you train and embed from scratch. The input layer itself is not permutation invariant: you cannot train a model, feed it permuted channels, and expect to get the same embedding.
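A minimal numpy sketch of this distinction, using a fixed linear map as a stand-in for a trained input layer (the variable names and setup are illustrative, not CEBRA's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 100, 8
X = rng.standard_normal((n_samples, n_channels))

# Stand-in for a trained input layer: fixed weights learned on X's channel order.
W = rng.standard_normal((n_channels, 2))

# Permute the input channels (a fixed non-identity permutation).
perm = np.roll(np.arange(n_channels), 1)
X_perm = X[:, perm]

# Feeding permuted channels into the SAME trained weights changes the output:
emb = X @ W
emb_perm = X_perm @ W
print(np.allclose(emb, emb_perm))  # False: the layer is not permutation invariant

# But a model retrained on the permuted data can recover an equivalent map
# (shown here exactly, by permuting the weight rows to match):
W_retrained = W[perm, :]
print(np.allclose(emb, X_perm @ W_retrained))  # True: same embedding
```

In practice a retrained CEBRA model would not reproduce the embedding exactly as in this linear toy, only up to the high consistency described above.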

Does that answer the Q?

Answer selected by icarosadero