
Wrong context_overflow_policy for LM Studio #1782

Open
auterak opened this issue Sep 24, 2024 · 2 comments

Comments

@auterak

auterak commented Sep 24, 2024

Description
Letta seems to be creating requests for LM Studio with context_overflow_policy set to 0:

"lmstudio": {
    "context_overflow_policy": 0
  },

Expected values seem to be 'stopAtLimit' | 'truncateMiddle' | 'rollingWindow' as seen in the error from LM Studio:

2024-09-24 20:14:35 [ERROR]
Field with key llm.prediction.contextOverflowPolicy does not satisfy the schema:[
  {
    "expected": "'stopAtLimit' | 'truncateMiddle' | 'rollingWindow'",
    "received": "number",
    "code": "invalid_type",
    "path": [],
    "message": "Expected 'stopAtLimit' | 'truncateMiddle' | 'rollingWindow', received number"
  }
]. Error Data: n/a, Additional Data: n/a
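The schema check that produces this error can be illustrated roughly like so (a sketch for clarity, not LM Studio's actual validation code; the allowed values and error fields are taken from the log above):

```python
# Rough illustration of the validation LM Studio applies to
# llm.prediction.contextOverflowPolicy (not its real implementation).
ALLOWED = ("stopAtLimit", "truncateMiddle", "rollingWindow")

def check_policy(value):
    """Return an error dict if value is not one of the allowed
    string policies, mimicking the schema error in the log."""
    if not isinstance(value, str) or value not in ALLOWED:
        return {
            "code": "invalid_type",
            "expected": " | ".join(repr(v) for v in ALLOWED),
            "received": type(value).__name__,
        }
    return None

# The numeric 0 that Letta sends fails the check:
err = check_policy(0)
print(err["code"], err["received"])  # invalid_type int
```

A string value such as "stopAtLimit" passes the same check, which is why the error singles out "received": "number".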

Setup

  • Running Letta via Docker following instructions at https://docs.letta.com/docker.
  • LM Studio 0.3.2, server set to
    • stop at limit
    • empty template
    • context 8192
@Dante0710
There is a script called api.py in my anaconda3/envs/myenv/lib/site-packages/memgpt/local_llm/lmstudio/api.py, and it has a few lines that say this:

        # This controls how LM Studio handles context overflow
        # In MemGPT we handle this ourselves, so this should be disabled
        # "context_overflow_policy": 0,
        "lmstudio": {"context_overflow_policy": 0},  # 0 = stop at limit

I deleted those lines, saved the file, and it started working.
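If you would rather keep the setting explicit than delete the lines, a minimal sketch of a helper that builds the settings block with a valid string policy (the function name is hypothetical and the exact payload shape used by api.py may differ; the key name and allowed values come from the LM Studio error above):

```python
# Allowed values per the LM Studio 0.3.x schema error quoted earlier.
ALLOWED_POLICIES = {"stopAtLimit", "truncateMiddle", "rollingWindow"}

def lmstudio_settings(policy: str = "stopAtLimit") -> dict:
    """Build the lmstudio settings block with a string policy
    instead of the numeric 0 that LM Studio rejects."""
    if policy not in ALLOWED_POLICIES:
        raise ValueError(
            f"context_overflow_policy must be one of {sorted(ALLOWED_POLICIES)}"
        )
    return {"lmstudio": {"context_overflow_policy": policy}}

print(lmstudio_settings()["lmstudio"]["context_overflow_policy"])  # stopAtLimit
```

"stopAtLimit" matches the "stop at limit" server setting from the issue description; since MemGPT manages context overflow itself, any of the three values should only matter as a fallback.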

@auterak

auterak commented Oct 2, 2024

Yeah, I found that, but then I had some problems getting the Docker environment to work in development mode (some config files were missing) and haven't had time to play with it further.
