How to add functions for Ollama models #1531
-
I saw that 1.0.0-M3 added the possibility to register functions for a model, but I couldn't find how to "teach" the model: "hey buddy, you can use these functions". Here's my config:
```kotlin
@Configuration
class AiModelConfiguration(
    private val chatClientBuilder: ChatClient.Builder,
    @Value("\${app.prompt.system-message}") val systemMessage: String,
    private val ollamaApi: OllamaApi
) {

    @Bean
    fun chatMemory(): ChatMemory {
        return InMemoryChatMemory()
    }

    @Bean
    fun chatClient(): ChatClient {
        TODO("ADD TOOL(FUNCTION) CONTEXT TO MODEL")
        return chatClientBuilder
            .defaultSystem(systemMessage)
            .defaultAdvisors(
                MessageChatMemoryAdvisor(chatMemory(), DEFAULT_CHAT_MEMORY_CONVERSATION_ID, 5),
                SimpleLoggerAdvisor()
            )
            .build()
    }

    @Bean
    fun vectorStore(embeddingModel: EmbeddingModel): VectorStore {
        return SimpleVectorStore(embeddingModel)
    }
}
```
--------------
```kotlin
@Service
class AiDataProvider @Autowired constructor(
    @Qualifier("vectorStore") private val vectorStore: VectorStore
) {

    private val log = LoggerFactory.getLogger(AiDataProvider::class.java)

    init {
        log.info("Connected vector store: ${vectorStore.name}")
    }
}
```
-------------
```kotlin
@Configuration
class AiFunctionConfiguration {

    // The @Description annotation helps the model understand when to call the function
    @Bean
    @Description("test function for testing purpose")
    fun testFunction(aiDataProvider: AiDataProvider): java.util.function.Function<Any, Any> {
        return java.util.function.Function<Any, Any> {
        }
    }
}
```
---------------
```yaml
spring:
  ai:
    ollama:
      base-url: ${OLLAMA_BASE_URL:http://localhost:11434/}
      chat:
        options:
          model: ${OLLAMA_CHAT_MODEL:deepseek-coder-v2:16b}
          temperature: ${OLLAMA_CHAT_TEMPERATURE:0.5}
          functions: TODO()

app:
  prompt:
    system-message: ${APP_PROMPT_SYSTEM-MESSAGE:"TODO ADD PROMPT"}
```
Replies: 1 comment 5 replies
-
You can either pass the function name via the ChatClient API (example: https://github.com/ThomasVitale/llm-apps-java-spring-ai/blob/main/05-function-calling/function-calling-ollama/src/main/java/com/thomasvitale/ai/spring/ChatService.java) or via configuration properties. If you use properties, based on your example, you would configure the function name as follows, where "testFunction" is the name of the method representing the function you want to make available to the model.
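Sketched against the application.yml from the question (assuming the `functions` entry under `spring.ai.ollama.chat.options` accepts the bean name of the function definition), that would be:

```yaml
spring:
  ai:
    ollama:
      chat:
        options:
          # Bean name of the @Bean function definition to expose to the model
          functions: testFunction
```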
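For the ChatClient API route, here is a minimal Kotlin sketch, assuming the 1.0.0-M3 `functions(...)` request method and the beans from the question (the `ChatService` class name is made up for illustration):

```kotlin
import org.springframework.ai.chat.client.ChatClient
import org.springframework.stereotype.Service

@Service
class ChatService(private val chatClient: ChatClient) {

    // Expose the "testFunction" bean to the model for this request only.
    fun chat(question: String): String? =
        chatClient.prompt()
            .user(question)
            .functions("testFunction")
            .call()
            .content()
}
```

Alternatively, calling `.defaultFunctions("testFunction")` on the `ChatClient.Builder` (where the TODO sits in `AiModelConfiguration`) should register the function for every request built from that client.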