
Commit

Merge pull request #207 from sirji-ai/create-and-sync-assistant-actions
Create and sync assistant actions
V-R-Dighe authored Jul 9, 2024
2 parents 0599e1c + 904f47f commit 27cbd09
Showing 46 changed files with 1,558 additions and 661 deletions.
24 changes: 3 additions & 21 deletions README.md
@@ -27,27 +27,9 @@

Sirji is a framework designed to build and run custom AI agents for your everyday development tasks.

Sirji has two main product components: Sirji Studio and Sirji VS Code Extension.
An agent in the Sirji framework is a modular AI component designed to perform a specific task based on custom pseudocode. [Here](./docs/How%20to%20Write%20an%20Agent.md) is a guide for writing your own custom agent.

### Sirji Studio

We have built the framework that allows the developer community to build custom agents simply by writing pseudocode in plain English.

Custom agents help capture and convey the developer's code-writing style and domain knowledge to Sirji. They perform a specific task based on custom pseudocode. The community can create a custom agent either by modifying an existing agent or by writing an entirely new one with different pseudocode.

[Here](./docs/How%20to%20Write%20Agent%20Pseudocode.md) is a guide for writing agent pseudocode.

### Sirji VS Code Extension

We have built and released the [VS Code Extension](https://marketplace.visualstudio.com/items?itemName=TrueSparrow.sirji) on the Visual Studio Marketplace.

This extension has the following features implemented:
- Interactive chat interface allows users to submit their problem statements and give feedback to Sirji.
- Messaging protocol implements the allowed response templates for messages exchanged between various agents.
- Orchestrator enables requirement gathering, recipe selection, and recipe execution by invoking agents.
- Executor makes these functionalities accessible to the agents: file system access, search, find & replace, insert text in project files, install packages, execute commands, run code, and run test cases.
- Agent Sessions provide the ability to invoke an agent with a fresh LLM conversation or continue an existing LLM conversation.
- Logs and Token Usage Summary are displayed alongside the interactive chat interface.
[Here](./docs/Sirji%20Studio.md) is a guide to organize and share your custom agents.

## Installation

@@ -59,7 +41,7 @@ Make sure you have installed all of the following prerequisites on your machine:
- Python (>= 3.10) - Make sure `python --version` runs without error.
- tee command - Make sure `which tee` runs without error.

-Also, you will need an OpenAI API key to access the GPT-4o model.
+For LLM inference, you will need an API key from at least one of OpenAI, Anthropic, or DeepSeek.

## Demo Video

59 changes: 0 additions & 59 deletions agents/README.md
@@ -28,7 +28,6 @@
`sirji-agents` is a PyPI package that implements the following components of the Sirji AI agentic framework:
- **Orchestrator**: The Orchestrator is the central component in the Sirji framework, responsible for managing the flow and execution of tasks across different agents.
- **Generic Agent**: Runtime-composable class providing the agent functionality as per the pseudocode provided in the agent.yml file.
- **Research Agent**: Utilizes RAG (Retrieval-Augmented Generation) and gets trained on URLs and search terms.

By default, it utilizes:
- OpenAI Chat Completions API
@@ -69,7 +68,6 @@ Ensure that the following environment variables are set:
export SIRJI_PROJECT="Absolute folder path for Sirji to use as its project folder."
export SIRJI_INSTALLATION_DIR='Absolute path of the Sirji installation directory.'
export SIRJI_RUN_PATH='Folder path containing run related logs, etc.'
-export SIRJI_OPENAI_API_KEY='OpenAI API key for Chat Completions API and Assistants API'
+export SIRJI_MODEL_PROVIDER='Model Provider to be used for LLM inference. Defaults to "openai".'
+export SIRJI_MODEL='Model to be used for LLM inference. Defaults to "gpt-4o".'
+export SIRJI_MODEL_PROVIDER_API_KEY='API key to be used for LLM inference.'
Expand Down Expand Up @@ -161,63 +159,6 @@ message_str = "***\nFROM: ORCHESTRATOR\nTO: CODER\nACTION: INVOKE_AGENT\nSUMMARY
response_message, history, prompt_tokens, completion_tokens = agent.message(message_str, history)
```
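The framed message format shown above (a `***` delimiter followed by `KEY: value` header lines such as `FROM`, `TO`, `ACTION`, and `SUMMARY`) can be parsed in a few lines. This is an illustrative sketch, not the actual sirji-messages parser:

```python
# Minimal sketch (hypothetical helper, not part of sirji-messages) that
# extracts the header fields from a framed Sirji message string.
def parse_message_headers(message_str):
    headers = {}
    for line in message_str.strip().splitlines():
        if line.strip() == "***":
            continue  # skip the frame delimiter
        if ": " in line:
            key, value = line.split(": ", 1)
            headers[key.strip()] = value.strip()
    return headers

msg = "***\nFROM: ORCHESTRATOR\nTO: CODER\nACTION: INVOKE_AGENT\nSUMMARY: Invoking the coder agent"
print(parse_message_headers(msg)["ACTION"])  # → INVOKE_AGENT
```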

### Research Agent

The Research Agent utilizes RAG (Retrieval-Augmented Generation) and gets trained on URLs and search terms.

### Initialization

```python
from sirji_agents import ResearchAgent

# Initialize Researcher without assistant ID
researcher = ResearchAgent('openai_assistant', 'openai_assistant')

# init_payload fetched from researcher object should be persisted
init_payload = researcher.init_payload

# Initialize Researcher with assistant ID
researcher = ResearchAgent('openai_assistant', 'openai_assistant', init_payload)
```

Some example message handling usages are given below.

#### Train using URL

```python
from sirji_messages import MessageFactory, ActionEnum

message_class = MessageFactory[ActionEnum.TRAIN_USING_URL.name]

body = {
"URL": "https://www.w3schools.com/python/python_json.asp"
}

message_str = message_class().generate({
"from_agent_id": "Id of the agent, who is invoking the action",
"summary": "{{Display a concise summary to the user, describing the action using the present continuous tense.}}",
"body": body
})

researcher.message(message_str)
```

#### Infer

```python
from sirji_messages import MessageFactory, ActionEnum

message_class = MessageFactory[ActionEnum.INFER.name]
infer_query = "What is the capital of India?"
message_str = message_class().generate({
"from_agent_id": "Id of the agent, who is invoking the action",
"summary": "{{Display a concise summary to the user, describing the action using the present continuous tense.}}",
"body": infer_query
})

response, total_tokens = researcher.message(message_str)
```

## For Contributors

1. Fork and clone the repository.
6 changes: 3 additions & 3 deletions agents/requirements.txt
@@ -1,4 +1,4 @@
-sirji-messages==0.0.28
-sirji-tools==0.0.14
-openai==1.14.1
+sirji-messages==0.0.29
+sirji-tools==0.0.15
+openai==1.35.7
anthropic==0.29.0
2 changes: 1 addition & 1 deletion agents/setup.py
@@ -2,7 +2,7 @@

setup(
name='sirji-agents',
-version='0.0.40',
+version='0.0.41',
author='Sirji',
description='Orchestrator, Generic Agent, and Research Agent components of the Sirji AI agentic framework.',
license='MIT',
5 changes: 3 additions & 2 deletions agents/sirji_agents/__init__.py
@@ -1,9 +1,10 @@
-from .researcher import ResearchAgent
+from .researcher import ResearchAgent, CleanupFactory
from .llm.orchestrator import Orchestrator
from .llm.generic import GenericAgent

__all__ = [
'ResearchAgent',
'Orchestrator',
-'GenericAgent'
+'GenericAgent',
+'CleanupFactory'
]
3 changes: 2 additions & 1 deletion agents/sirji_agents/researcher/__init__.py
@@ -1,3 +1,4 @@
from .researcher import ResearchAgent
+from .cleanup.factory import CleanupFactory

-__all__ = ['ResearchAgent']
+__all__ = ['ResearchAgent', 'CleanupFactory']
Empty file.
31 changes: 31 additions & 0 deletions agents/sirji_agents/researcher/cleanup/base.py
@@ -0,0 +1,31 @@
from abc import ABC, abstractmethod


class CleanupBase(ABC):

    @abstractmethod
    def delete_assistant(self, assistant_id):
        """
        Deletes the assistant.
        :param assistant_id: The ID of the assistant to delete.
        """
        pass

    @abstractmethod
    def delete_vector_store(self, vector_store_id):
        """
        Deletes the vector store.
        :param vector_store_id: The ID of the vector store to delete.
        """
        pass

    @abstractmethod
    def delete_file(self, file_id):
        """
        Deletes the file.
        :param file_id: The ID of the file to delete.
        """
        pass
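As a quick illustration of the interface, here is a hypothetical in-memory implementation, the kind of stub that could back unit tests without touching a real provider. The base class is repeated so the sketch is self-contained; none of this is part of sirji-agents:

```python
from abc import ABC, abstractmethod

# CleanupBase, repeated from above so this sketch is self-contained.
class CleanupBase(ABC):
    @abstractmethod
    def delete_assistant(self, assistant_id):
        pass

    @abstractmethod
    def delete_vector_store(self, vector_store_id):
        pass

    @abstractmethod
    def delete_file(self, file_path):
        pass


# Hypothetical in-memory implementation for testing callers of the
# cleanup interface. Each delete returns the removed record, or None
# when the ID is unknown.
class InMemoryCleanup(CleanupBase):
    def __init__(self):
        self.assistants = {"asst_demo": {}}
        self.vector_stores = {"vs_demo": {}}
        self.files = {"file_demo": {}}

    def delete_assistant(self, assistant_id):
        return self.assistants.pop(assistant_id, None)

    def delete_vector_store(self, vector_store_id):
        return self.vector_stores.pop(vector_store_id, None)

    def delete_file(self, file_path):
        return self.files.pop(file_path, None)


cleanup = InMemoryCleanup()
cleanup.delete_assistant("asst_demo")
print(cleanup.assistants)  # → {}
```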
14 changes: 14 additions & 0 deletions agents/sirji_agents/researcher/cleanup/factory.py
@@ -0,0 +1,14 @@
from .openai_cleanup import OpenAICleanup
import os


class CleanupFactory:
    @classmethod
    def get_instance(cls):
        # Fall back to the documented default provider ("openai") when
        # SIRJI_MODEL_PROVIDER is unset, instead of raising AttributeError on None.
        provider_name = os.environ.get('SIRJI_MODEL_PROVIDER', 'openai').lower()

        if provider_name == "openai":
            return OpenAICleanup()
        else:
            raise ValueError("Unsupported provider: {}".format(provider_name))
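Both this factory and `EmbeddingsFactory` below dispatch on the `SIRJI_MODEL_PROVIDER` environment variable, so the pattern can be sketched self-containedly. The stub class here is hypothetical, and the fallback to `"openai"` mirrors the default documented in the agents README:

```python
import os

# Hypothetical stand-in for a provider-specific cleanup class.
class FakeOpenAICleanup:
    provider = "openai"

def get_cleanup_instance():
    # Read the provider from the environment, treating an unset variable
    # as the documented default ("openai") rather than failing on None.
    provider_name = os.environ.get("SIRJI_MODEL_PROVIDER", "openai").lower()
    if provider_name == "openai":
        return FakeOpenAICleanup()
    raise ValueError("Unsupported provider: {}".format(provider_name))

os.environ.pop("SIRJI_MODEL_PROVIDER", None)  # simulate an unset variable
print(get_cleanup_instance().provider)  # → openai
```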
52 changes: 52 additions & 0 deletions agents/sirji_agents/researcher/cleanup/openai_cleanup.py
@@ -0,0 +1,52 @@
import os
from openai import OpenAI

from sirji_tools.logger import create_logger

from .base import CleanupBase

class OpenAICleanup(CleanupBase):
    def __init__(self):
        self.logger = create_logger("researcher.log", "debug")
        api_key = os.environ.get("SIRJI_MODEL_PROVIDER_API_KEY")

        if api_key is None:
            raise ValueError(
                "OpenAI API key is not set as an environment variable")

        # Initialize OpenAI client
        self.client = OpenAI(api_key=api_key)
        self.logger.info("Completed initializing OpenAI client")

    def delete_assistant(self, assistant_id):
        self.logger.info("Deleting assistant")
        try:
            response = self.client.beta.assistants.delete(assistant_id)
            self.logger.info(response)
        except Exception as e:
            self.logger.error(e)

    def delete_vector_store(self, vector_store_id):
        self.logger.info("Deleting vector store")
        try:
            response = self.client.beta.vector_stores.delete(
                vector_store_id=vector_store_id
            )
            self.logger.info(response)
        except Exception as e:
            self.logger.error(e)

    def delete_file(self, file_id):
        self.logger.info("Deleting file")
        try:
            response = self.client.files.delete(file_id)
            self.logger.info(response)
        except Exception as e:
            self.logger.error(e)
8 changes: 6 additions & 2 deletions agents/sirji_agents/researcher/embeddings/factory.py
@@ -1,10 +1,14 @@
from .openai_assistant import OpenAIAssistantEmbeddings
+import os


class EmbeddingsFactory:
    @classmethod
-    def get_instance(cls, embeddings_type, init_payload):
-        if embeddings_type == "openai_assistant":
+    def get_instance(cls, init_payload):
+        provider_name = os.environ.get('SIRJI_MODEL_PROVIDER', 'openai').lower()
+
+        if provider_name == "openai":
            return OpenAIAssistantEmbeddings(init_payload)
        else:
            raise ValueError("Unsupported embeddings_type.")
