
can't load reranker-v2-minicpm-layerwise model #721

Open
AlexYoung757 opened this issue Apr 24, 2024 · 6 comments · May be fixed by #1059

AlexYoung757 commented Apr 24, 2024

When using code like this:

from FlagEmbedding import LayerWiseFlagLLMReranker

reranker = LayerWiseFlagLLMReranker('/path/bge-reranker-v2-minicpm-layerwise', use_fp16=True) 

score = reranker.compute_score(['query', 'passage'], cutoff_layers=[28]) # Adjusting 'cutoff_layers' to pick which layers are used for computing the score.
print(score)

Environment:

transformers==4.38.2
FlagEmbedding==1.2.9
sentence-transformers==2.6.1

The error is as follows:
Could not locate the configuration_minicpm_reranker.py inside BAAI/bge-reranker-v2-minicpm-layerwise.

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like BAAI/bge-reranker-v2-minicpm-layerwise is not the path to a directory containing a file named configuration_minicpm_reranker.py.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

But I have downloaded the minicpm-layerwise model and set the path. Does anyone have the same problem?
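Before constructing the reranker, it can help to confirm that the local directory actually contains the custom code files the layerwise model needs. This is a hypothetical pre-flight helper (not part of FlagEmbedding), checking for the files named in the error above:

```python
import os
import tempfile

# Files the layerwise reranker resolves via config.json's "auto_map";
# loading fails if they are missing from the local model directory.
REQUIRED = [
    "config.json",
    "configuration_minicpm_reranker.py",
    "modeling_minicpm_reranker.py",
]

def missing_files(model_dir):
    """Return the required files that are absent from model_dir."""
    return [f for f in REQUIRED if not os.path.isfile(os.path.join(model_dir, f))]

# Demo on a throwaway directory containing only config.json.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "config.json"), "w").close()
    print(missing_files(d))  # the two custom .py files are reported missing
```

If the helper reports missing files, copy them from the Hub repo into the local directory before loading.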

@ChengRuiLiang

I had the same problem

@545999961
Collaborator

If you download reranker-v2-minicpm-layerwise, you can load it with the following method:

  1. Make sure configuration_minicpm_reranker.py and modeling_minicpm_reranker.py are in /path/bge-reranker-v2-minicpm-layerwise.
  2. Modify the following part of config.json:
"auto_map": {
    "AutoConfig": "configuration_minicpm_reranker.LayerWiseMiniCPMConfig",
    "AutoModel": "modeling_minicpm_reranker.LayerWiseMiniCPMModel",
    "AutoModelForCausalLM": "modeling_minicpm_reranker.LayerWiseMiniCPMForCausalLM"
  },
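Step 2 above can be sketched as a small script (a hypothetical helper; the keys and class paths are taken verbatim from the snippet above) that rewrites auto_map in the local config.json:

```python
import json
import os
import tempfile

# "auto_map" entries pointing at the local module files instead of the Hub repo.
AUTO_MAP = {
    "AutoConfig": "configuration_minicpm_reranker.LayerWiseMiniCPMConfig",
    "AutoModel": "modeling_minicpm_reranker.LayerWiseMiniCPMModel",
    "AutoModelForCausalLM": "modeling_minicpm_reranker.LayerWiseMiniCPMForCausalLM",
}

def patch_auto_map(model_dir):
    """Load config.json from model_dir, overwrite auto_map, and save it back."""
    cfg_path = os.path.join(model_dir, "config.json")
    with open(cfg_path) as f:
        cfg = json.load(f)
    cfg["auto_map"] = AUTO_MAP
    with open(cfg_path, "w") as f:
        json.dump(cfg, f, indent=2)
    return cfg

# Demo on a throwaway directory with a minimal config.json.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "config.json"), "w") as f:
        json.dump({"model_type": "minicpm"}, f)
    print(patch_auto_map(d)["auto_map"]["AutoConfig"])
```

Run it against the real model directory in place of the temporary one; the rest of config.json is left untouched.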

@akkikiki

akkikiki commented May 5, 2024

@545999961 Why are we turning off direct download from Huggingface in Commit 65cd70d?
If we remove local_files_only=True it works, and allowing local files only seems quite inconvenient.

@545999961
Collaborator


I'm sorry, my negligence caused this problem; this part has been fixed now. Thank you for the reminder.

@akkikiki

akkikiki commented May 7, 2024

@545999961 Thanks a lot for the fix!

@Zhouziyi828

I still run into this problem. Because of a network certificate issue I can't download from huggingface, so I downloaded the model locally, but the error message shows it is still trying to download from huggingface. How do I solve this?

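For machines that cannot reach huggingface.co at all, huggingface_hub and transformers honor documented environment variables that disable network access entirely, so every file must resolve from disk. A minimal sketch, assuming the model directory is already complete locally:

```shell
# Offline-mode sketch. Assumptions: the model has already been fully downloaded
# to a local directory; HF_HUB_OFFLINE and TRANSFORMERS_OFFLINE are the documented
# huggingface_hub / transformers switches for disabling network access.
export HF_HUB_OFFLINE=1        # huggingface_hub: never try to reach huggingface.co
export TRANSFORMERS_OFFLINE=1  # transformers: resolve files from disk/cache only
echo "offline=$HF_HUB_OFFLINE"
```

With these set, pass the absolute local directory path to LayerWiseFlagLLMReranker; any remaining attempt to contact the Hub should then fail fast rather than hang on the certificate error.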
