
Help: 4060 Ti 16 GB deployment reports CUDA out of memory after loading the fourth checkpoint, no matter how max_split_size_mb is adjusted #1068
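(For reference, max_split_size_mb is a PyTorch CUDA caching-allocator option set through the PYTORCH_CUDA_ALLOC_CONF environment variable, e.g. PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128.)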

Closed · Answered by luguoj
luguoj asked this question in Q&A

Solved it... I was using cli_demo.py:

model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True, device_map="auto").eval()

Removing device_map="auto" from this line made it work directly, without needing to set max_split_size_mb at all.
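
A minimal sketch of the change, assuming cli_demo.py loads the model with the transformers AutoModel API and that MODEL_PATH points at a local checkpoint directory; the explicit .cuda() call is an assumption for a single-GPU setup, so adjust it if your script places the model elsewhere:

from transformers import AutoModel, AutoTokenizer

MODEL_PATH = "THUDM/chatglm3-6b"  # assumption: replace with your local model path

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)

# Before: device_map="auto" ran out of GPU memory while loading the fourth checkpoint shard.
# model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True, device_map="auto").eval()

# After: drop device_map="auto" and place the whole model on the single 16 GB GPU.
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True).cuda().eval()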

Answer selected by luguoj
Category: Q&A