Good day,

It seems that the Perplexity settings in `mods --settings` are not up to date with the current Perplexity API. According to their docs, Perplexity has switched to Llama 3.1 models, but mods still appears to be configured for Llama 3. For instance, the allowed context is only 8K tokens rather than the 128K context supported by Llama 3.1.

Please let me know if we can resolve this, or if I'm misunderstanding how the settings are configured.
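For reference, a minimal sketch of what an updated Perplexity entry in the mods settings YAML could look like. The schema keys (`apis`, `base-url`, `api-key-env`, `models`, `aliases`, `max-input-chars`) are from mods' existing config format; the model names are taken from Perplexity's Llama 3.1 docs, and the `max-input-chars` value is only an estimate for a 128K-token window (roughly 4 characters per token), not an official figure:

```yaml
# Hypothetical update to the Perplexity section of the mods settings
# (opened via `mods --settings`). Model names follow Perplexity's
# Llama 3.1 docs; max-input-chars is a rough estimate for a
# 128K-token context, not an official value.
apis:
  perplexity:
    base-url: https://api.perplexity.ai
    api-key-env: PERPLEXITY_API_KEY
    models:
      llama-3.1-sonar-small-128k-online:
        aliases: ["llama3.1-sonar-small"]
        max-input-chars: 392000
      llama-3.1-sonar-large-128k-online:
        aliases: ["llama3.1-sonar-large"]
        max-input-chars: 392000
      llama-3.1-70b-instruct:
        aliases: ["llama3.1-70b"]
        max-input-chars: 392000
```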