destrex271/llmqueue
LLMQueue

Just a simple experiment to see if I can get Postgres to work as a message queue and a vector DB alongside some open source LLM :)

What do you need to run this locally for now?

  • Ollama Docker Image
    # With a GPU (well, if you have one ;))
    docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
    # OR, CPU only
    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
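Once the container is up, Ollama exposes a REST API on port 11434, so you can talk to it without any extra tooling. A minimal Python sketch of a one-shot generation call against the `/api/generate` endpoint (the model name `llama3` and the helper names here are illustrative, not part of this repo):

```python
import json
from urllib import request

# Default endpoint from the docker run above
OLLAMA_URL = "http://localhost:11434"

def build_payload(prompt, model="llama3"):
    # stream=False asks Ollama to return one complete JSON object
    # instead of a stream of partial responses
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3"):
    """One-shot completion via Ollama's /api/generate endpoint."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swap `model` for whichever model you've pulled into the container (e.g. via `docker exec -it ollama ollama pull llama3`).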

Things added till now

  • A layer over pgmq: just simple functions to add data to and fetch data from a queue
  • An interface over Llama as a LlamaInstance struct that can generate responses to prompts
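The pgmq layer described above ultimately comes down to calling pgmq's SQL functions. A hedged sketch of what such a thin wrapper can look like, assuming pgmq's documented SQL API (`pgmq.create`, `pgmq.send`, `pgmq.read`); the helper names and the queue name are illustrative, not the repo's actual API. Each helper returns a `(sql, params)` pair you'd execute with any Postgres driver:

```python
import json

def create_queue(queue):
    # pgmq.create provisions the underlying queue table
    return ("SELECT pgmq.create(%s);", (queue,))

def send(queue, message):
    # pgmq.send enqueues a JSON message and returns its msg_id
    return ("SELECT * FROM pgmq.send(%s, %s);", (queue, json.dumps(message)))

def read(queue, vt=30):
    # pgmq.read returns up to 1 message and hides it from other
    # consumers for `vt` seconds (the visibility timeout)
    return ("SELECT * FROM pgmq.read(%s, %s, 1);", (queue, vt))
```

The visibility timeout is what makes pgmq usable as a work queue: a consumer that crashes mid-job simply lets the message reappear after `vt` seconds.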

TBD

  • Put the entire queueing system behind an async server
  • Develop a frontend to interact with multiple bots
