Describe the bug
I'm trying to use Ollama to load local models with MemGPT, but I'm encountering an AttributeError. The error message indicates that a NoneType object has no attribute model_endpoint_type. This error occurs when I attempt to create a MemGPT agent using the provided configuration.
Please describe your setup
How did you install memgpt?
I installed it using pip install pymemgpt.
Describe your setup
OS: Linux
Running memgpt via Jupyter Notebook
Additional context
I've tried different configurations and combinations, but the error persists. Below is the error traceback:
The error occurs at this line in the memgpt_agent.py file:
```python
if llm_config["model_endpoint_type"] in ["azure", "openai"] or llm_config["model_endpoint_type"] != config.default_llm_config.model_endpoint_type:
    # Additional code
```
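The `AttributeError` is consistent with `config.default_llm_config` being `None` when that line runs, so the attribute access on it fails. The following is a hypothetical sketch (the `Config`/`LLMConfig` stand-ins are not MemGPT's real classes) showing how a `None`-guard would avoid the crash:

```python
# Hypothetical sketch of the failing check in memgpt_agent.py, with a
# None-guard added. Config and LLMConfig are stand-ins, not MemGPT's classes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMConfig:
    model_endpoint_type: str

@dataclass
class Config:
    # None if no default config exists (e.g. `memgpt configure` was never run)
    default_llm_config: Optional[LLMConfig] = None

def needs_override(llm_config: dict, config: Config) -> bool:
    """Return True when the per-agent llm_config differs from the global default."""
    if config.default_llm_config is None:
        # Without this guard, .model_endpoint_type on None raises the AttributeError
        return True
    return (
        llm_config["model_endpoint_type"] in ["azure", "openai"]
        or llm_config["model_endpoint_type"] != config.default_llm_config.model_endpoint_type
    )
```

With the guard in place, an unset default simply forces the override path instead of crashing.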
I've followed the default example for using Ollama with MemGPT.
MemGPT Config
Here is the configuration I'm using:
```python
config_list = [
    {
        "model": "NULL",
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",
    }
]

config_list_memgpt = [
    {
        "preset": DEFAULT_PRESET,
        "model": "llama3.1",  # only required for Ollama, see: https://memgpt.readme.io/docs/ollama
        "context_window": 8192,  # the context window of your model (for Mistral 7B-based models, it's likely 8192)
        "model_wrapper": "chatml",  # chatml is the default wrapper
        "model_endpoint_type": "ollama",  # can use webui, ollama, llamacpp, etc.
        "model_endpoint": "http://localhost:11434",  # the IP address of your LLM backend
    },
]
```
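As a quick sanity check before handing the dict to MemGPT, the required keys from the config above can be verified with a standalone helper (a hypothetical snippet using the key names from this issue, not part of MemGPT's API):

```python
# Hypothetical sanity check for a config_list_memgpt entry; the key set is
# taken from the config in this issue, not from MemGPT's own validation.
REQUIRED_KEYS = {
    "preset", "model", "context_window",
    "model_wrapper", "model_endpoint_type", "model_endpoint",
}

def missing_keys(entry: dict) -> set:
    """Return which required keys are absent from a config_list_memgpt entry."""
    return REQUIRED_KEYS - entry.keys()

example = {
    "preset": "memgpt_chat",
    "model": "llama3.1",
    "context_window": 8192,
    "model_wrapper": "chatml",
    "model_endpoint_type": "ollama",
    "model_endpoint": "http://localhost:11434",
}
print(missing_keys(example))  # prints set() when nothing is missing
```

A non-empty result points at the key that was dropped, which is easy to do when several `"key": value` pairs share one line.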
Local LLM details
If you are trying to run MemGPT with local LLMs, please provide the following information:
The exact model you're trying to use: llama3.1
The local LLM backend you are using: Ollama
Your hardware for the local LLM backend: Local computer running Linux