Model Options Support #288
Replies: 1 comment
I have some concerns about the Options-related logic. Currently, a (default) Options instance is created at startup and a (runtime) Options instance is created at runtime; some logic then builds the (native) Options from these two, using different merge methods.
I believe that hand-written merge logic is error-prone, especially when copying values from different Options types (e.g. ChatOptionsImpl/PortableFunctionCallingOptions/AzureOpenAiChatOptions) into the (native) Options (i.e. ChatCompletionsOptions). It is also impossible to clear a default value at runtime (i.e. to use a null value at runtime in place of a non-null default value). Can we reconsider how to support portable options (without merging) to avoid these issues? Thanks.
With the new ModelOptions mechanism you can set any model-specific configuration at startup or at runtime. For example: seed, model, temperature, logitBias, topP, … in the case of the OpenAiChatClient, or num-thread, top-k, num-gpu, … for the OllamaChatClient.
Use the auto-configuration options properties to set the default (i.e. start-up) options common to all model requests. Example for OpenAiChatClient:
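A minimal sketch of such default options, assuming the Spring AI auto-configuration property naming convention (`spring.ai.openai.chat.options.*`); the exact property keys and values here are illustrative:

```properties
# Assumed Spring AI auto-configuration properties (illustrative values)
spring.ai.openai.chat.options.model=gpt-4
spring.ai.openai.chat.options.temperature=0.7
spring.ai.openai.chat.options.topP=1.0
```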
Or use the dedicated options builders to create the options programmatically.
NOTE: The runtime options override the default ones.
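The override rule can be sketched in plain Java (a hypothetical illustration of the semantics, not the actual Spring AI merge code): non-null runtime values replace the start-up defaults. The sketch also shows the limitation raised in the question above, since a null runtime value cannot clear a non-null default.

```java
import java.util.HashMap;
import java.util.Map;

public class OptionsMerge {

    // Hypothetical merge: start from the defaults, then let every
    // non-null runtime value override the corresponding default.
    static Map<String, Object> merge(Map<String, Object> defaults, Map<String, Object> runtime) {
        Map<String, Object> merged = new HashMap<>(defaults);
        runtime.forEach((key, value) -> {
            if (value != null) { // a null runtime value leaves the default in place
                merged.put(key, value);
            }
        });
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> defaults = new HashMap<>();
        defaults.put("model", "gpt-3.5-turbo");
        defaults.put("temperature", 0.7);

        Map<String, Object> runtime = new HashMap<>();
        runtime.put("temperature", 0.2); // runtime override

        Map<String, Object> merged = merge(defaults, runtime);
        System.out.println(merged.get("model"));       // gpt-3.5-turbo (default kept)
        System.out.println(merged.get("temperature")); // 0.2 (runtime wins)
    }
}
```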
For more information, check the updated openai-chat and openai-embeddings documentation.
Gradually we will enable options support for all model clients.
The Azure OpenAI Chat/Embeddings Options PR is underway: #286, and the OllamaChatClient has partial support.
The following diagram illustrates the data (and processing) flow for all model types: