
OpenAI dependency #19

Open
DanielChico opened this issue Jul 9, 2024 · 1 comment

Comments

@DanielChico

Hello, I am facing this issue:
```
File "/app/app/src/router/__init__.py", line 5, in <module>
    from .gateway.router import router as gateway_router
File "/app/app/src/router/gateway/router.py", line 4, in <module>
    from routellm.controller import Controller
File "/usr/local/lib/python3.12/site-packages/routellm/controller.py", line 10, in <module>
    from routellm.routers.routers import ROUTER_CLS
File "/usr/local/lib/python3.12/site-packages/routellm/routers/routers.py", line 17, in <module>
    from routellm.routers.matrix_factorization.model import MODEL_IDS, MFModel
File "/usr/local/lib/python3.12/site-packages/routellm/routers/matrix_factorization/model.py", line 4, in <module>
    from routellm.routers.similarity_weighted.utils import OPENAI_CLIENT
File "/usr/local/lib/python3.12/site-packages/routellm/routers/similarity_weighted/utils.py", line 11, in <module>
    OPENAI_CLIENT = OpenAI()
                    ^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
Of course, I can set the environment variable, but I don't want to depend on it. It would be preferable to have a way to set the base_url, model, and api_key for this client as well.

DanielChico changed the title from "OpenAI dependance" to "OpenAI dependency" on Jul 9, 2024
@iojw
Collaborator

iojw commented Jul 9, 2024

Thank you for raising this! We are aware of this and are actively looking into it. Two points:

  1. Currently, OpenAI's client is required for generating embeddings for the mf and sw_ranking routers, but not for the classifier-based routers. So if you use, for example, the bert router and no OpenAI models, you can set the OpenAI key to a dummy value as a temporary workaround. I'll work on fixing this so that the key isn't needed when you don't require embeddings.

  2. The above is just for generating embeddings for mf and sw_ranking. You can set the base URL and API key for the actual LLM calls in the controller or server config.
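The dummy-key workaround in point 1 can be sketched as follows. This is only a sketch under the stated assumption (a classifier-based router such as `bert`, where the module-level `OpenAI()` client is constructed at import time but never actually called); the `"sk-dummy"` value is a placeholder, not a real key:

```python
import os

# Per the traceback above, routellm constructs OpenAI() at import time,
# so the key must be in the environment BEFORE routellm is imported.
# Assumption: no embedding-based router (mf, sw_ranking) is used, so
# this placeholder value is never sent to OpenAI.
os.environ["OPENAI_API_KEY"] = "sk-dummy"

# Only after setting the variable is it safe to import routellm, e.g.:
# from routellm.controller import Controller
```

Note that setting the variable after importing `routellm` would be too late, since the error is raised during the import itself.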

I understand this can be confusing; we'll try to make it clearer. Let me know if you have any questions!
