fix: add requirement to LlamaIndex instrumentation import #110
Conversation
literalai/client.py
Outdated
    @@ -25,6 +24,12 @@
     )
     from literalai.observability.thread import ThreadContextManager, thread_decorator

    +from literalai.requirements import check_all_requirements
    +
    +LLAMA_INDEX_REQUIREMENT = ["llama-index>=0.10.58"]
I am surprised mypy is not complaining, since the import is conditional and we use this function in the code. Why not move this logic into the client function?
I was very surprised too.
Much better to move it inside the method itself.
Side note: I noticed the openai & mistralai instrumentations make sure they only instrument once, with logic like:

    global is_mistralai_instrumented
    if is_mistralai_instrumented:
        return

I wonder whether we should do the same for LlamaIndex.
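A minimal sketch of what that could look like, combining the once-only guard with the requirement check moved inside the method as discussed above. The flag name is_llamaindex_instrumented, the method body, and the check_all_requirements call shape are assumptions for illustration, not the code from this PR:

    # Hypothetical module-level guard, modeled on the mistralai/openai pattern quoted above.
    is_llamaindex_instrumented = False

    class LiteralClient:
        def instrument_llamaindex(self):
            global is_llamaindex_instrumented
            if is_llamaindex_instrumented:
                return
            # Import lazily so the module loads cleanly when llama-index is absent.
            from literalai.requirements import check_all_requirements

            LLAMA_INDEX_REQUIREMENT = ["llama-index>=0.10.58"]
            if not check_all_requirements(LLAMA_INDEX_REQUIREMENT):
                raise Exception(
                    f"LlamaIndex instrumentation requirements not satisfied: {LLAMA_INDEX_REQUIREMENT}"
                )
            from literalai.instrumentation.llamaindex import instrument_llamaindex

            instrument_llamaindex(self)
            is_llamaindex_instrumented = True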
Not sure, I think the LlamaIndex one is global by default?
f"LlamaIndex instrumentation requirements not satisfied: {LLAMA_INDEX_REQUIREMENT}" | ||
) | ||
from literalai.instrumentation.llamaindex import instrument_llamaindex | ||
|
Did you check that everything is still working? We could alias the import to avoid two functions with the exact same name.
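A minimal sketch of the aliasing being suggested, assuming the client method that wraps the helper is also named instrument_llamaindex; the alias name is illustrative:

    # Hypothetical alias so the imported helper no longer shares a name
    # with the client method that calls it.
    from literalai.instrumentation.llamaindex import (
        instrument_llamaindex as instrument_llamaindex_impl,
    )

    # Inside LiteralClient.instrument_llamaindex, the helper can then be
    # invoked without shadowing the method's own name:
    # instrument_llamaindex_impl(self)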
I didn't try instrumenting LlamaIndex, just the regular Literal client.
I'll run a couple of cookbooks and alias the import; that's neater.
To test:

    pip install .
    python
    >>> from literalai import LiteralClient
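A slightly fuller smoke test, assuming the client exposes the instrument_llamaindex method this PR touches and accepts an api_key keyword; both are assumptions here:

    # Hypothetical end-to-end check: construct the client, then trigger the
    # conditional import added in this PR.
    from literalai import LiteralClient

    client = LiteralClient(api_key="...")  # placeholder key
    client.instrument_llamaindex()  # expected to raise if llama-index>=0.10.58 is missing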