
Ability to run backend in Langgraph cloud #76

Open
dividor opened this issue Oct 30, 2024 · 4 comments

dividor commented Oct 30, 2024

I love the FastAPI server, but I'd actually like to run the backend in LangGraph Cloud, as this is quite convenient for rapid prototyping. LangGraph Cloud is in beta and limited to one deployment, but it would be cool to have this working for when they do open it up.

The good news is that the backend runs nicely in LangGraph Cloud: I connected the repo, it picked up langgraph.json, and everything just worked. I tested in LangGraph Cloud, all fine and dandy. Yay!

Now I would like to connect the Streamlit app to the cloud version. I can get it to authenticate if I add X-Api-Key to the headers in client.py ...

        if self.auth_secret:
            headers["Authorization"] = f"Bearer {self.auth_secret}"
            headers["X-Api-Key"] = self.auth_secret

Then in my .env set ...

AGENT_URL=https://
AUTH_SECRET=

I know it's at least authenticating, because previously I got a different error, but it's not quite working, as I'm not sure I have the right URL path for the Streamlit code ...

Traceback (most recent call last):
  File "/Users/matthewharris/Desktop/git/pia-researcher/.venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 85, in exec_func_with_error_handling
    result = func()
             ^^^^^^
  File "/Users/matthewharris/Desktop/git/pia-researcher/.venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 576, in code_to_exec
    exec(code, module.__dict__)
  File "/Users/matthewharris/Desktop/git/pia-researcher/src/streamlit_app.py", line 310, in <module>
    asyncio.run(main())
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/matthewharris/Desktop/git/pia-researcher/src/streamlit_app.py", line 148, in main
    await draw_messages(stream, is_new=True)
  File "/Users/matthewharris/Desktop/git/pia-researcher/src/streamlit_app.py", line 196, in draw_messages
    while msg := await anext(messages_agen, None):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/matthewharris/Desktop/git/pia-researcher/src/client/client.py", line 200, in astream
    raise Exception(f"Error: {response.status_code} - {response.text}")
                                                       ^^^^^^^^^^^^^
  File "/Users/matthewharris/Desktop/git/pia-researcher/.venv/lib/python3.11/site-packages/httpx/_models.py", line 574, in text
    content = self.content
              ^^^^^^^^^^^^
  File "/Users/matthewharris/Desktop/git/pia-researcher/.venv/lib/python3.11/site-packages/httpx/_models.py", line 568, in content
    raise ResponseNotRead()
httpx.ResponseNotRead: Attempted to access streaming response content, without having called `read()`.

These are the AGENT_URL values I have tried:

AGENT_URL=https://<MY DEPLOYMENT STUFF>.default.us.langgraph.app
AGENT_URL=https://<MY DEPLOYMENT STUFF>.default.us.langgraph.app/runs/
AGENT_URL=https://<MY DEPLOYMENT STUFF>.default.us.langgraph.app/runs/stream

I have another app - terrible, agent-service-toolkit is MUCH better - which is able to call LangGraph Cloud from Streamlit, so I know it can work.

Any thoughts please on what the URL should be, or if there is something else I could be doing?

Thanks!


dividor commented Oct 30, 2024

LangGraph API reference, in case it's useful: https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html


dividor commented Oct 30, 2024

Here is how I get streaming from the deployed LangGraph Cloud backend in a different (Chainlit) app I have ...

import json

import chainlit as cl
import requests

# parse_sse, build_report, LANGGRAPH_APP_URL and LANGCHAIN_API_KEY are
# defined elsewhere in this app.

async def main(message: cl.Message):
    response = requests.post(
        LANGGRAPH_APP_URL,
        headers={
            "Content-Type": "application/json",
            "X-Api-Key": LANGCHAIN_API_KEY,
        },
        json={
            "assistant_id": "agent",

            "input": {
                "input": message.content,
                "chat_history": []
            },
            "metadata": {},
            "config": {"configurable": {}},
            "multitask_strategy": "reject",
            "stream_mode": ["values"],
        },
        stream=True
    )

    try:
        response.raise_for_status()

        msg = cl.Message(content="")
        await msg.send()

        message_strings = {}
        for line in response.iter_lines():
            if line:
                line = line.decode('utf-8')
                if line.startswith("event:"):
                    event_type = line.split(":", 1)[1].strip()
                elif line.startswith("data:"):
                    data = parse_sse(line)
                    if data and "intermediate_steps" in data:
                        for message in data["intermediate_steps"]:
                            if "tool" in message:
                                message_str = ""
                                if message["tool"] == "rag_search":
                                    message_str = f"Searching **{message['tool_input']['data_source']}** for '*{message['tool_input']['query']}*' \n"
                                    if message_str not in message_strings:
                                        await msg.stream_token(message_str)
                                elif message["tool"] == "search_gao_recommendations":
                                    print(message)
                                    message_str = f"Searching **GAO Recommendations** for '*{message['tool_input']['query']}*' \n"
                                    if message_str not in message_strings:
                                        await msg.stream_token(message_str)
                                elif message["tool"] == "final_answer":
                                    message_str = build_report(message['tool_input'])
                                    if message_str not in message_strings:
                                        message_strings[message_str] = message_str
                                        await msg.stream_token(message_str)   
                                #else:
                                #    await msg.stream_token(str(message))

                                message_strings[message_str] = message_str

                    else:
                        prettified = json.dumps(data, indent=2)
                        #await msg.stream_token(f"```json\n{prettified}\n```\n")
                else:
                    if 'heartbeat' not in line:
                        await msg.stream_token(f"{line}\n")
                    else:
                        await msg.stream_token(f"Processing results ...\n")

        await msg.update()

    except requests.exceptions.HTTPError as http_err:
        await cl.Message(content=f"HTTP error occurred: {http_err}").send()
    except requests.exceptions.RequestException as req_err:
        await cl.Message(content=f"Request error occurred: {req_err}").send()
    except ValueError:
        await cl.Message(content="Error: Invalid JSON response").send()

The URL here is the deployment URL I get from LangGraph, with /runs/stream appended.

Note that this API is the same one exposed by running the LangGraph CLI's langgraph test command.

JoshuaC215 (Owner) commented

Hey @dividor - Thanks for writing this up and your investigation.

I agree it would be useful to have this compatibility. I haven't personally played with LangGraph Cloud and don't have a great use case for it right now.

Since the request body you shared looks different from the format agent-service-toolkit uses, I'd guess it will need some more changes, and universal compatibility may not be possible.

My suggestion: you would probably need to make changes in client.py to send the request format LangGraph Cloud expects and to parse the response accordingly. You might want to copy the existing client and make a new version (feel free to post a PR if you get it working!).

You could try to get it working with src/run_client.py first rather than using the Streamlit app to keep it simple. Once that works, I'd expect that dropping the updated client into the Streamlit app would also work.

If you or anyone else can get a roughly working version of the client posted as a PR, I could probably help from there with cleanup, further testing and inclusion 😄 Also if my situation changes maybe it'll become a priority for me in the future.


dividor commented Nov 1, 2024

Thanks Joshua, I realized after writing this up that it's a bit outside the wheelhouse. I deployed to Azure instead, which is a better option anyway. I'll have a think on the LangGraph Cloud bit, but I likely won't get to it, so it's perhaps best to close this issue.

Onwards and upwards!
