Replies: 1 comment
This is solved. I had to do the following:
My question is: what embedding model should I use for Ollama with PGVector? I'm getting the following error when testing embeddings for the first time. I believe PGVector is limited to 2,000 dimensions for indexed columns.
Caused by: org.postgresql.util.PSQLException: ERROR: expected 1536 dimensions, not 4096
This is my vector store.
The code is straight out of a Josh Long video, except Josh uses OpenAI, and I'm using Ollama running the Llama3.1 model. I know from testing that I'm getting the OllamaEmbeddingModel by default, which uses Mistral.
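For what it's worth, my reading of the error (which I haven't confirmed) is that the table the vector store created defaults to 1536 dimensions, the size of the OpenAI embeddings in Josh's example, while mistral and llama3.1 both return 4096-dimension vectors. Assuming the PGVector starter exposes a dimensions property, something like the line below would describe the mismatch, though since pgvector indexes cap out around 2,000 dimensions, matching the table to 4096 probably isn't the real fix:

```properties
# Hypothetical override, assuming the PGVector starter reads a dimensions property:
# this would make the table match llama3.1/mistral's 4096-dimension output, but
# pgvector indexes only go up to 2,000 dimensions, so switching to a smaller
# embedding model is probably the better route.
spring.ai.vectorstore.pgvector.dimensions=4096
```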
My application properties include spring.ai.ollama.embedding.enabled=true. I presume the fix is to add spring.ai.ollama.embedding.model set to some value other than `mistral`, but I don't know what to set it to.
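If I had to guess at concrete values (and I'm not certain of the exact property names, which seem to differ between Spring AI versions), it would look something like this, using one of Ollama's dedicated embedding models instead of the chat model:

```properties
# Assumed property names for a recent Spring AI release; older milestones
# used spring.ai.ollama.embedding.model rather than .options.model.
spring.ai.ollama.embedding.enabled=true
# nomic-embed-text is a dedicated embedding model served by Ollama (768 dimensions);
# mxbai-embed-large (1024 dimensions) is another common choice.
spring.ai.ollama.embedding.options.model=nomic-embed-text
# Keep the PGVector table in step with the model's output size, otherwise the
# "expected N dimensions" error comes back.
spring.ai.vectorstore.pgvector.dimensions=768
```

The model would still need to be pulled on the Ollama side (ollama pull nomic-embed-text), and a table already created as vector(1536) would have to be dropped or recreated before the new dimension takes effect.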