Replies: 1 comment
Right now it uses FastChat to hopefully find the right format. I have a PR open to use the HF chat templates: #1493
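For anyone curious what the HF chat-template approach looks like, here is a minimal sketch (the checkpoint name and messages are just illustrative, not taken from the PR): `tokenizer.apply_chat_template` renders OpenAI-style role/content dicts into the model's native prompt string, so the same conversion the server needs can be driven by the template shipped with the model.

```python
# Minimal sketch of the Hugging Face chat-template approach (transformers >= 4.34).
# The checkpoint name and messages below are only illustrative.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

messages = [
    {"role": "user", "content": "Hello! How are you?"},
    {"role": "assistant", "content": "I'm great, thanks for asking."},
    {"role": "user", "content": "Could you help me with a task?"},
]

# tokenize=False returns the rendered prompt string (for Llama-2 chat models
# this uses the [INST] ... [/INST] markers) instead of token ids.
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)
```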
Hello everyone,
I hope you're all doing well. I wanted to reach out about vLLM, specifically its OpenAI-compatible ("OpenAI Like") server. I'm currently exploring how vLLM behaves when used for OpenAI-style chat requests, and I'm particularly interested in whether vLLM can handle the translation from the OpenAI message format to, for example, the LLaMA message format. Example:

LLaMA format:

OpenAI format:

```python
prompt = [{"role":"assistant","content":"Hi"},{"role":"user","content":"Hello! How are you?"},{"role":"assistant","content":"I'm great, thanks for asking. Could you help me with a task?"}]
```
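In case it helps to see where the conversion happens: with vLLM's OpenAI-compatible server, the client only ever sends OpenAI-style messages and the server does the prompt formatting. A rough usage sketch, assuming a server is already running locally (launched with something like `python -m vllm.entrypoints.openai.api_server`); the host, port, and model name are placeholders:

```python
# Rough sketch: send OpenAI-format messages to a running vLLM
# OpenAI-compatible server; the server converts them to the model's
# native prompt format. Host, port, and model name are placeholders.
import requests

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello! How are you?"},
]

response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "meta-llama/Llama-2-7b-chat-hf",  # placeholder model name
        "messages": messages,
    },
)
print(response.json()["choices"][0]["message"]["content"])
```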