
Commit

Add docs
congminh1254 committed Aug 23, 2024
1 parent e60dd21 commit 95dccd7
Showing 1 changed file with 40 additions and 4 deletions.
44 changes: 40 additions & 4 deletions docs/ai.md
AI allows you to send an intelligence request to supported large language models and to receive an answer based on the provided prompt and items.
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->

- [AI](#ai)
  - [Send AI request](#send-ai-request)
  - [Send AI text generation request](#send-ai-text-generation-request)
  - [Get AI agent default configuration](#get-ai-agent-default-configuration)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->

Send AI request
------------------------

To send an AI request to the supported large language models, call the
[`ai.ask(body, options?, callback?)`](http://opensource.box.com/box-node-sdk/jsdoc/AIManager.html#ask) method with the prompt and items. The `items` parameter is a list of items to be processed by the LLM, often files. The `prompt` is the question provided by the client to be answered by the LLM; its length is limited to 10,000 characters. The `mode` specifies whether this request is for a single item or for multiple items. If you select `single_item_qa`, the `items` array can contain only one element; selecting `multiple_item_qa` allows you to provide up to 25 items.

<!-- sample post_ai_ask -->
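A minimal sketch of an `ask` call; the file ID, prompt, and abbreviated response below are illustrative:

```js
client.ai.ask({
	prompt: 'What is the capital of France?',
	items: [
		{
			type: 'file',
			id: '12345'
		}
	],
	mode: 'single_item_qa'
}).then(response => {
	/* response -> {
		"answer": "Paris is the capital of France.",
		...
	} */
});
```
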
Send AI text generation request
------------------------

To send an AI text generation request to the supported large language models, call the
[`ai.textGen(body, options?, callback?)`](http://opensource.box.com/box-node-sdk/jsdoc/AIManager.html#textGen) method with the prompt, items, and dialogue history. The `dialogue_history` parameter is the history of prompts and answers previously passed to the LLM; it provides additional context to the LLM when generating the response. The `items` parameter is a list of items to be processed by the LLM, often files. The `prompt` is the question provided by the client to be answered by the LLM; its length is limited to 10,000 characters.

<!-- sample post_ai_text_gen -->
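A minimal sketch of a `textGen` call; the file ID, prompt, dialogue history, and abbreviated response below are illustrative:

```js
client.ai.textGen({
	prompt: 'Write an email to a client about the importance of public APIs.',
	items: [
		{
			type: 'file',
			id: '12345'
		}
	],
	dialogue_history: [
		{
			prompt: 'Make my email about public APIs sound more professional.',
			answer: 'Here is a more professional draft of your email about public APIs.',
			created_at: '2024-08-01T10:00:00Z'
		}
	]
}).then(response => {
	/* response -> {
		"answer": "...",
		...
	} */
});
```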


Get AI agent default configuration
------------------------

To get an AI agent default configuration, call the [`ai.getDefaultAiAgent(options?, callback?)`](http://opensource.box.com/box-node-sdk/jsdoc/AIManager.html#getDefaultAiAgent) method. The `mode` parameter filters the agent configuration to be returned; it can be either `ask` or `text_gen`. The `language` parameter specifies the ISO language code for which to return the agent configuration; if the language is not supported, the default agent configuration is returned. The `model` parameter specifies the model for which the default agent configuration should be returned.

<!-- sample get_ai_agent_default -->
```js
client.ai.getDefaultAiAgent({
	mode: 'ask',
	language: 'en',
	model: 'openai__gpt_3_5_turbo'
}).then(response => {
	/* response -> {
		"type": "ai_agent_ask",
		"basic_text": {
			"llm_endpoint_params": {
				"type": "openai_params",
				"frequency_penalty": 1.5,
				"presence_penalty": 1.5,
				"stop": "<|im_end|>",
				"temperature": 0,
				"top_p": 1
			},
			"model": "openai__gpt_3_5_turbo",
			"num_tokens_for_completion": 8400,
			"prompt_template": "It is `{current_date}`, and I have $8000 and want to spend a week in the Azores. What should I see?",
			"system_message": "You are a helpful travel assistant specialized in budget travel"
		},
		...
	} */
});
```
