Wrong training procedure? #237
Comments
Hmm, your pretrained model does not have weights for … Did you set …? Still, maybe I am wrong and …
However, as I wrote in #118 (comment), in the continued/restarted runs I used the first model as …
Any clarification is highly appreciated!
I think you are free to ignore these messages. I imagine this happens because somewhere during loading of the model, …
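For what it's worth, this kind of ignorable message can be reproduced without downloading anything: transformers reports checkpoint keys that the target architecture simply doesn't use. A minimal sketch (a tiny randomly initialized BERT with an MLM head, saved and then reloaded as a bare encoder; all sizes here are made up):

```python
import tempfile

from transformers import BertConfig, BertForMaskedLM, BertModel

# Tiny random model so nothing has to be downloaded; sizes are arbitrary.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
)
checkpoint = BertForMaskedLM(config)  # encoder plus MLM head

with tempfile.TemporaryDirectory() as ckpt_dir:
    checkpoint.save_pretrained(ckpt_dir)
    # Loading the checkpoint as a bare BertModel: the MLM-head weights are
    # present in the checkpoint but unused by this architecture, which is
    # exactly the situation the warning describes.
    model, info = BertModel.from_pretrained(ckpt_dir, output_loading_info=True)

# The head weights show up as "unexpected" keys and are simply dropped.
print(info["unexpected_keys"])
```

The encoder weights themselves load fine; only the extra head parameters are discarded, which is harmless if you never use that head.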
I have to admit that I am not particularly familiar with the underlying …
The example code you cited uses mean pooling on the token embeddings from the model's last transformer block. This doesn't require …
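That pooling step is easy to sketch in isolation. A toy illustration with numpy (made-up values; in the real code the inputs would be the model's last_hidden_state and attention_mask):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Mean-pool token embeddings, ignoring padding positions via the mask."""
    # (batch, seq_len) -> (batch, seq_len, 1) so the mask broadcasts
    # across the hidden dimension
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid division by zero
    return summed / counts

# Toy example: batch of 1, seq_len 3, hidden 2; the last token is padding
emb = np.array([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = np.array([[1, 1, 0]])
print(mean_pool(emb, mask))  # -> [[2. 3.]]
```

Note that the padded position (the large values) contributes nothing to the result, since the mask zeroes it out before summing.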
Closing this, feel free to re-open if you are still having issues.
I trained an extension of the model sentence-transformers/paraphrase-multilingual-mpnet-base-v2 (see #235). After training I used the script save_pretrained_hf.py in order to convert it to a HuggingFace Transformers-compatible format.
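I haven't looked at what save_pretrained_hf.py does internally, but a HuggingFace-compatible export ultimately boils down to save_pretrained, and any directory written that way can be reloaded with AutoModel.from_pretrained. A self-contained illustration of that round trip with a tiny randomly initialized model (all sizes made up, nothing downloaded):

```python
import os
import tempfile

from transformers import AutoModel, BertConfig, BertModel

# Tiny randomly initialized model (made-up sizes), purely to show the
# export/reload round trip in HF format.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
)
model = BertModel(config)

with tempfile.TemporaryDirectory() as export_dir:
    model.save_pretrained(export_dir)  # writes config.json plus the weights
    assert "config.json" in os.listdir(export_dir)
    # Any HF-format directory can then be reloaded generically:
    reloaded = AutoModel.from_pretrained(export_dir)
```

The config.json written alongside the weights is what lets AutoModel pick the right architecture when reloading.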
When I now run the example code for mean-pooling embeddings, I get the following warning (output_bs32_ep20_export is my exported model):
Any idea why this occurs? Is what the warning says true, or can I ignore it?