Loading models from an S3 location instead of local path #3072
petrosbaltzis started this conversation in Ideas
Replies: 2 comments
- My opinion: you could wrap the S3 storage with something like JuiceFS, or use some other mounting technique, so that S3 becomes indistinguishable from a regular path, rather than having vLLM handle this compatibility itself. We use NAS or JuiceFS-wrapped S3 storage/Ceph, and it works fine with vLLM 😀 (see the sketch after the replies).
- Good point. Moved this to an issue #3090
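For illustration, here is a minimal sketch of the "make S3 look like a local path" idea from the first reply, using a plain boto3 download instead of a FUSE mount so it needs no extra filesystem tooling. The bucket name, prefix, and cache directory are made up; the point is that vLLM's `LLM(model=...)` only ever sees an ordinary local folder.

```python
import os

import boto3
from vllm import LLM


def sync_s3_prefix(bucket: str, prefix: str, local_dir: str) -> str:
    """Download every object under an S3 prefix into a local directory."""
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.endswith("/"):  # skip "directory" placeholder keys
                continue
            dest = os.path.join(local_dir, os.path.relpath(key, prefix))
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            s3.download_file(bucket, key, dest)
    return local_dir


# Hypothetical bucket/prefix; vLLM itself only ever sees a regular local path.
model_dir = sync_s3_prefix("my-model-bucket", "models/my-model/", "/tmp/my-model")
llm = LLM(model=model_dir)
```

A mount (s3fs, goofys, or JuiceFS as suggested above) achieves the same effect without copying the weights first, at the cost of running and maintaining the mount on every serving node.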
Original post (petrosbaltzis):

Hello,
The vLLM library can load the model and tokenizer either from a local folder or directly from Hugging Face.
I wonder whether this functionality could be extended to support S3 locations, so that when we initialize the API server we can pass the appropriate S3 location.
Petros
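To make the request concrete, here is a rough sketch of what a thin wrapper around the current entry points could do: resolve an `s3://` URI to a local cache with the AWS CLI's `aws s3 sync`, then start the OpenAI-compatible API server with the resolved path. The wrapper, bucket name, and cache location are hypothetical; native support in vLLM would presumably do something similar behind the `--model` argument.

```python
import subprocess
import sys
import tempfile


def resolve_model_path(model: str) -> str:
    """Download s3:// URIs to a local cache; pass other values through unchanged."""
    if not model.startswith("s3://"):
        return model  # local folder or Hugging Face model ID
    local_dir = tempfile.mkdtemp(prefix="vllm-s3-")
    # Copy the model artifacts out of S3 with the AWS CLI.
    subprocess.run(["aws", "s3", "sync", model, local_dir], check=True)
    return local_dir


if __name__ == "__main__":
    local_model = resolve_model_path("s3://my-model-bucket/models/my-model/")
    # Start the OpenAI-compatible API server against the resolved local path.
    subprocess.run(
        [sys.executable, "-m", "vllm.entrypoints.openai.api_server",
         "--model", local_model],
        check=True,
    )
```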