OpenAIError: The api_key client option must be set #38
Comments
I tried with a clean conda environment; same result.
Are you using any IDEs?
Set it as an environment variable by running this in your terminal:
Yes, please try setting the OpenAI API key before running.
@DmitriyG228, you should revoke the key that you pasted above.
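The terminal command implied by the comment above is not quoted in the page, but it would be something like the following (the key value is a placeholder, not a real key):

```shell
# Export the key in the current shell session before launching Python,
# so it is present in the environment when routellm is imported.
export OPENAI_API_KEY="sk-your-key-here"
```

Note that `export` only affects the current shell session; add the line to your shell profile to make it persistent.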
While running the basic example, I get this error:
```python
import os
from routellm.controller import Controller
os.environ["OPENAI_API_KEY"] = 'my api'
client = Controller(
    routers=["mf"],
    strong_model="gpt-4o",
    weak_model="gpt-4o-mini",
)
```
```
---------------------------------------------------------------------------
OpenAIError                               Traceback (most recent call last)
Cell In[1], line 2
      1 import os
----> 2 from routellm.controller import Controller
      4 os.environ["OPENAI_API_KEY"] = 'sk-********'  # key redacted
      7 client = Controller(
      8     routers=["mf"],
      9     strong_model="gpt-4o",
     10     weak_model="gpt-4o-mini",
     11 )
File ~/playground/agent_2906/0507/RouteLLM/routellm/controller.py:10
      7 from litellm import acompletion, completion
      8 from tqdm import tqdm
---> 10 from routellm.routers.routers import ROUTER_CLS
     12 # Default config for routers augmented using golden label data from GPT-4.
     13 # This is exactly the same as config.example.yaml.
     14 GPT_4_AUGMENTED_CONFIG = {
     15     "sw_ranking": {
     16         "arena_battle_datasets": [
    (...)
     27     "mf": {"checkpoint_path": "routellm/mf_gpt4_augmented"},
     28 }
File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/routers.py:17
     12 from routellm.routers.causal_llm.llm_utils import (
     13     load_prompt_format,
     14     to_openai_api_messages,
     15 )
     16 from routellm.routers.causal_llm.model import CausalLLMClassifier
---> 17 from routellm.routers.matrix_factorization.model import MODEL_IDS, MFModel
     18 from routellm.routers.similarity_weighted.utils import (
     19     OPENAI_CLIENT,
     20     compute_elo_mle_with_tie,
     21     compute_tiers,
     22     preprocess_battles,
     23 )
     26 def no_parallel(cls):
File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/matrix_factorization/model.py:4
      1 import torch
      2 from huggingface_hub import PyTorchModelHubMixin
----> 4 from routellm.routers.similarity_weighted.utils import OPENAI_CLIENT
      6 MODEL_IDS = {
      7     "RWKV-4-Raven-14B": 0,
      8     "alpaca-13b": 1,
    (...)
     70     "zephyr-7b-beta": 63,
     71 }
     74 class MFModel(torch.nn.Module, PyTorchModelHubMixin):
File ~/playground/agent_2906/0507/RouteLLM/routellm/routers/similarity_weighted/utils.py:11
      8 from sklearn.linear_model import LogisticRegression
     10 choices = ["A", "B", "C", "D"]
---> 11 OPENAI_CLIENT = OpenAI()
     14 def compute_tiers(model_ratings, num_tiers):
     15     n = len(model_ratings)
File ~/anaconda3/envs/langchain/lib/python3.11/site-packages/openai/_client.py:105, in OpenAI.__init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
    103     api_key = os.environ.get("OPENAI_API_KEY")
    104 if api_key is None:
--> 105     raise OpenAIError(
    106         "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
    107     )
    108 self.api_key = api_key
    110 if organization is None:
OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
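The traceback points at the actual cause: `routellm/routers/similarity_weighted/utils.py` runs `OPENAI_CLIENT = OpenAI()` at module import time, so `OPENAI_API_KEY` must already be in the environment before `from routellm.controller import Controller` executes; setting it on the line after the import is too late. A minimal stand-in illustrating the pattern (the `simulated_module_import` helper is hypothetical, mimicking what the import chain does):

```python
import os

def simulated_module_import():
    # Mimics OPENAI_CLIENT = OpenAI() running at import time:
    # the key is read from the environment immediately.
    api_key = os.environ.get("OPENAI_API_KEY")
    if api_key is None:
        raise RuntimeError("The api_key client option must be set")
    return api_key

# Setting the key only *after* the "import" reproduces the error:
os.environ.pop("OPENAI_API_KEY", None)
try:
    simulated_module_import()
    failed = False
except RuntimeError:
    failed = True

# Exporting the key *before* the "import" succeeds:
os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # placeholder, not a real key
key = simulated_module_import()
```

So the fix for the snippet in this issue is simply to move the `os.environ["OPENAI_API_KEY"] = ...` line above the `from routellm.controller import Controller` import (or export the variable in the shell before starting Python).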