
Question about max_len when fine-tuning bge-reranker #1126

Open
abc123456cxx opened this issue Sep 27, 2024 · 1 comment
Comments

@abc123456cxx

I see that max_len is set to 512 in the training arguments. I would like to fine-tune bge_reranker_large so it can handle inputs up to 2k tokens. Is it enough to simply change max_len to 2k? After training I found that max_position_embeddings in the model config is still 512+2. If I want to fine-tune to a 2k length, which other parameter do I need to change?
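For anyone hitting the same question: the 512+2 limit comes from the base model itself, not from the training script. A minimal sketch to confirm this with the transformers library (assuming the Hugging Face checkpoint name BAAI/bge-reranker-large):

```python
from transformers import AutoConfig

# bge-reranker-large is XLM-RoBERTa based; its position embeddings are
# fixed at 512 (+2 special positions), independent of the max_len value
# passed to the fine-tuning script.
config = AutoConfig.from_pretrained("BAAI/bge-reranker-large")
print(config.max_position_embeddings)  # 514, i.e. 512 + 2
```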

@545999961
Collaborator

bge_reranker_large does not support a 2k length. You can fine-tune bge-reranker-v2-m3 instead.
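For reference, a minimal scoring sketch with bge-reranker-v2-m3 (built on bge-m3, which accepts much longer inputs), assuming the standard transformers cross-encoder usage; the query/passage strings are placeholders:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "BAAI/bge-reranker-v2-m3"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Query/passage pairs; passages can now be truncated at 2048 tokens
# rather than 512.
pairs = [["what is a reranker?", "A reranker scores query-passage pairs ..."]]

with torch.no_grad():
    inputs = tokenizer(
        pairs, padding=True, truncation=True,
        max_length=2048, return_tensors="pt",
    )
    scores = model(**inputs).logits.view(-1).float()
print(scores)
```

When fine-tuning this model, the max_len training argument mentioned above can then be raised to 2k without running into the position-embedding limit.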


2 participants