Just a simple experiment to see if I can get Postgres to work as a message queue and a vector DB, running alongside some open-source LLM :)
- Ollama Docker Image
```sh
# With a GPU (well, if you have one ;)):
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# CPU only:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
- A layer over pgmq: simple functions to add messages to and read messages from a queue
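A minimal sketch of what that pgmq layer could look like, assuming the `pgmq` extension is installed and a DB-API connection (e.g. psycopg) is passed in. The helper names `enqueue`/`dequeue` and the queue name `llm_jobs` are illustrative, not from the project itself:

```python
import json

# pgmq exposes a plain SQL API; these statements are parameterized for a
# DB-API cursor. Queue must be created once with: SELECT pgmq.create('llm_jobs');
SEND_SQL = "SELECT * FROM pgmq.send(%s, %s::jsonb);"
READ_SQL = "SELECT msg_id, message FROM pgmq.read(%s, %s, %s);"

def enqueue(conn, queue: str, payload: dict) -> int:
    """Add a JSON message to the queue; returns the new msg_id."""
    with conn.cursor() as cur:
        cur.execute(SEND_SQL, (queue, json.dumps(payload)))
        return cur.fetchone()[0]

def dequeue(conn, queue: str, vt_seconds: int = 30, limit: int = 1):
    """Read up to `limit` messages, hiding them from other readers
    for `vt_seconds` (pgmq's visibility timeout)."""
    with conn.cursor() as cur:
        cur.execute(READ_SQL, (queue, vt_seconds, limit))
        return cur.fetchall()
```

A consumer would call `dequeue`, process the message, and then `pgmq.delete` (or `pgmq.archive`) it so it isn't redelivered after the visibility timeout expires.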
- An interface over Llama as a `LlamaInstance` struct that can generate responses to prompts
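A rough sketch of such an interface against the Ollama container started above, using Ollama's `/api/generate` endpoint on its default port 11434. The class name mirrors `LlamaInstance` from the list; the default model name is an assumption:

```python
import json
import urllib.request

class LlamaInstance:
    """Thin wrapper around a local Ollama server (sketch, not the project's code)."""

    def __init__(self, model: str = "llama3", host: str = "http://localhost:11434"):
        self.model = model  # model name is an assumption; use whatever you pulled
        self.host = host

    def _payload(self, prompt: str) -> bytes:
        # stream=False makes Ollama return a single JSON object
        # instead of a stream of chunks.
        return json.dumps(
            {"model": self.model, "prompt": prompt, "stream": False}
        ).encode()

    def generate(self, prompt: str) -> str:
        req = urllib.request.Request(
            f"{self.host}/api/generate",
            data=self._payload(prompt),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]
```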
- Put the entire queueing system behind an async server
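One way the async piece might fit together, as a sketch only: an asyncio loop where workers pull prompts off a queue and hand them to the model. The pgmq/Ollama calls are stubbed behind a `generate` callable; all names here are illustrative:

```python
import asyncio

async def worker(name: str, queue: asyncio.Queue, generate) -> None:
    # Each worker pulls prompts off the queue and runs them through the model.
    while True:
        prompt = await queue.get()
        try:
            result = await generate(prompt)
            print(f"{name}: {result}")
        finally:
            queue.task_done()

async def serve(prompts, generate, n_workers: int = 2) -> None:
    queue: asyncio.Queue = asyncio.Queue()
    for p in prompts:
        queue.put_nowait(p)
    workers = [
        asyncio.create_task(worker(f"w{i}", queue, generate))
        for i in range(n_workers)
    ]
    await queue.join()   # wait until every queued prompt has been processed
    for w in workers:
        w.cancel()       # workers loop forever; cancel once the queue drains
```

In the real setup the `asyncio.Queue` would be replaced by polling the pgmq queue, so jobs survive restarts and multiple server processes can share the work.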
- Develop a frontend to interact with multiple bots