feat: Add Langchain integration (#120)
tomfrenken authored Sep 19, 2024
1 parent 04950bd commit d3f5d1c
Showing 28 changed files with 1,200 additions and 13 deletions.
11 changes: 11 additions & 0 deletions README.md
@@ -11,6 +11,7 @@ Integrate chat completion into your business applications with SAP Cloud SDK for
- [@sap-ai-sdk/ai-api](#sap-ai-sdkai-api)
- [@sap-ai-sdk/foundation-models](#sap-ai-sdkfoundation-models)
- [@sap-ai-sdk/orchestration](#sap-ai-sdkorchestration)
- [@sap-ai-sdk/langchain](#sap-ai-sdklangchain)
- [SAP Cloud SDK for AI Sample Project](#sap-cloud-sdk-for-ai-sample-project)
- [Support, Feedback, Contribution](#support-feedback-contribution)
- [Security / Disclosure](#security--disclosure)
@@ -56,6 +57,16 @@ This package incorporates generative AI foundation models into your AI activities
$ npm install @sap-ai-sdk/foundation-models
```

### @sap-ai-sdk/langchain

This package provides LangChain model clients, built on top of the foundation model clients of the SAP Cloud SDK for AI.

#### Installation

```
$ npm install @sap-ai-sdk/langchain
```

## SAP Cloud SDK for AI Sample Project

We have created a sample project demonstrating how to use the different clients of the SAP Cloud SDK for AI for TypeScript/JavaScript. The [project README](./sample-code/README.md) outlines the setup needed to build and run it locally.
6 changes: 6 additions & 0 deletions eslint.config.js
@@ -34,6 +34,12 @@ export default [
      'jsdoc/require-jsdoc': 'off'
    }
  },
  {
    files: ['packages/langchain/**/*.ts'],
    rules: {
      'import/no-internal-modules': 'off'
    }
  },
  {
    files: ['packages/foundation-models/src/azure-openai/client/inference/schema/*.ts'],
    rules: {
1 change: 1 addition & 0 deletions package.json
@@ -22,6 +22,7 @@
"foundation-models": "pnpm -F=@sap-ai-sdk/foundation-models",
"orchestration": "pnpm -F=@sap-ai-sdk/orchestration",
"core": "pnpm -F=@sap-ai-sdk/core",
"langchain": "pnpm -F=@sap-ai-sdk/langchain",
"e2e-tests": "pnpm -F=@sap-ai-sdk/e2e-tests",
"type-tests": "pnpm -F=@sap-ai-sdk/type-tests",
"smoke-tests": "pnpm -F=@sap-ai-sdk/smoke-tests",
121 changes: 121 additions & 0 deletions packages/langchain/README.md
@@ -0,0 +1,121 @@
# @sap-ai-sdk/langchain

This package provides LangChain model clients, built on top of the foundation model clients of the SAP Cloud SDK for AI.

## Table of Contents

1. [Installation](#installation)
2. [Pre-requisites](#pre-requisites)
3. [Usage](#usage)
   - [Client Initialization](#client-initialization)
   - [Chat Clients](#chat-clients)
   - [Embedding Clients](#embedding-clients)
4. [Support, Feedback, Contribution](#support-feedback-contribution)
5. [License](#license)

## Installation

```
$ npm install @sap-ai-sdk/langchain
```

## Pre-requisites

- [Enable the AI Core service in SAP BTP](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/initial-setup).
- Bind the service to your application.
- Ensure the project is configured with Node.js v20 or higher, along with native ESM support.
- For testing your application locally (see the sketch after this list):
  - Download a service key for your AI Core service instance.
  - Create a `.env` file in the root of your directory.
  - Add an entry `AICORE_SERVICE_KEY='<content-of-service-key>'`.
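
A minimal sketch of such a local setup, assuming the `dotenv` package (not part of this SDK) is used to load the `.env` file before any client is created:

```ts
// Local-testing sketch (assumption: `dotenv` is installed as a dev dependency).
// Loads AICORE_SERVICE_KEY from the .env file into process.env, where it is
// available for local testing as described in the pre-requisites above.
import 'dotenv/config';

if (!process.env.AICORE_SERVICE_KEY) {
  throw new Error(
    'AICORE_SERVICE_KEY is not set - see the pre-requisites above.'
  );
}
```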

## Usage

This package provides both chat and embedding clients, currently supporting Azure OpenAI.
All clients comply with [LangChain's interface](https://js.langchain.com/docs/introduction).

### Client Initialization

To initialize a client, provide the model name:

```ts
import {
  AzureOpenAiChatClient,
  AzureOpenAiEmbeddingClient
} from '@sap-ai-sdk/langchain';

// For a chat client
const chatClient = new AzureOpenAiChatClient({ modelName: 'gpt-4o' });
// For an embedding client
const embeddingClient = new AzureOpenAiEmbeddingClient({
  modelName: 'text-embedding-ada-002'
});
```

In addition to the default parameters of the model vendor (e.g. OpenAI) and LangChain, you can set additional parameters to narrow down the search for the model deployment you want to use:

```ts
const chatClient = new AzureOpenAiChatClient({
  modelName: 'gpt-4o',
  modelVersion: '24-07-2021',
  resourceGroup: 'my-resource-group'
});
```

### Chat Clients

The chat clients allow you to interact with Azure OpenAI chat models, accessible via the generative AI hub of SAP AI Core.
To invoke the client, you only have to pass a prompt:

```ts
const response = await chatClient.invoke("What's the capital of France?");
```
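
The result is a standard LangChain message object, so the generated text can be read from its `content` property:

```ts
// `invoke` resolves to a LangChain message; `content` holds the reply text.
console.log(response.content);
```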

#### Advanced Example with Templating and Output Parsing

```ts
import { AzureOpenAiChatClient } from '@sap-ai-sdk/langchain';
import { StringOutputParser } from '@langchain/core/output_parsers';
import { ChatPromptTemplate } from '@langchain/core/prompts';

const client = new AzureOpenAiChatClient({ modelName: 'gpt-35-turbo' });
const promptTemplate = ChatPromptTemplate.fromMessages([
  ['system', 'Answer the following in {language}:'],
  ['user', '{text}']
]);
const parser = new StringOutputParser();
const llmChain = promptTemplate.pipe(client).pipe(parser);
const response = await llmChain.invoke({
  language: 'german',
  text: 'What is the capital of France?'
});
```
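
Because the chain ends with a `StringOutputParser`, the resolved `response` is a plain string rather than a message object:

```ts
// The parser extracts the string content from the model's message output.
console.log(response);
```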

### Embedding Clients

Embedding clients allow embedding either text or documents (represented as arrays of strings).

#### Embed Text

```ts
const embeddedText = await embeddingClient.embedQuery(
  'Paris is the capital of France.'
);
```

#### Embed Documents

```ts
const embeddedDocument = await embeddingClient.embedDocuments([
  'Page 1: Paris is the capital of France.',
  'Page 2: It is a beautiful city.'
]);
```
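
In line with LangChain's `Embeddings` interface, `embedQuery` resolves to a single vector (`number[]`) and `embedDocuments` to one vector per input string (`number[][]`), for example:

```ts
// Sketch of inspecting the results: one vector for the query,
// one vector per page for the documents.
console.log(`Query vector dimension: ${embeddedText.length}`);
console.log(`Number of document vectors: ${embeddedDocument.length}`);
```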

## Support, Feedback, Contribution

This project is open to feature requests/suggestions, bug reports etc. via [GitHub issues](https://github.com/SAP/ai-sdk-js/issues).

Contribution and feedback are encouraged and always welcome. For more information about how to contribute, the project structure, as well as additional contribution information, see our [Contribution Guidelines](https://github.com/SAP/ai-sdk-js/blob/main/CONTRIBUTING.md).

## License

The SAP Cloud SDK for AI is released under the [Apache License Version 2.0](http://www.apache.org/licenses/).
3 changes: 3 additions & 0 deletions packages/langchain/internal.d.ts
@@ -0,0 +1,3 @@
// eslint-disable-next-line import/no-internal-modules
export * from './dist/internal.js';
//# sourceMappingURL=internal.d.ts.map
2 changes: 2 additions & 0 deletions packages/langchain/internal.js

Some generated files are not rendered by default.

5 changes: 5 additions & 0 deletions packages/langchain/jest.config.mjs
@@ -0,0 +1,5 @@
import config from '../../jest.config.mjs';
export default {
  ...config,
  displayName: 'langchain',
};
38 changes: 38 additions & 0 deletions packages/langchain/package.json
@@ -0,0 +1,38 @@
{
  "name": "@sap-ai-sdk/langchain",
  "version": "0.1.0",
  "description": "LangChain clients based on the @sap-ai-sdk",
  "license": "Apache-2.0",
  "keywords": [
    "sap-ai-sdk",
    "langchain"
  ],
  "type": "module",
  "main": "./dist/index.js",
  "types": "./dist/index.d.ts",
  "files": [
    "dist/**/*.js",
    "dist/**/*.js.map",
    "dist/**/*.d.ts",
    "dist/**/*.d.ts.map",
    "internal.js",
    "internal.d.ts"
  ],
  "scripts": {
    "compile": "tsc",
    "compile:cjs": "tsc -p tsconfig.cjs.json",
    "test": "NODE_OPTIONS=--experimental-vm-modules jest",
    "lint": "eslint \"**/*.ts\" && prettier . --config ../../.prettierrc --ignore-path ../../.prettierignore -c",
    "lint:fix": "eslint \"**/*.ts\" --fix && prettier . --config ../../.prettierrc --ignore-path ../../.prettierignore -w --log-level error",
    "check:public-api": "node --loader ts-node/esm ../../scripts/check-public-api-cli.ts"
  },
  "dependencies": {
    "@sap-ai-sdk/ai-api": "workspace:^",
    "@sap-ai-sdk/foundation-models": "workspace:^",
    "@langchain/core": "0.3.1",
    "zod-to-json-schema": "^3.23.2"
  },
  "devDependencies": {
    "typescript": "^5.5.4"
  }
}
9 changes: 9 additions & 0 deletions packages/langchain/src/index.ts
@@ -0,0 +1,9 @@
export {
  AzureOpenAiChatClient,
  AzureOpenAiEmbeddingClient
} from './openai/index.js';
export type {
  AzureOpenAiChatModelParams,
  AzureOpenAiEmbeddingModelParams,
  AzureOpenAiChatCallOptions
} from './openai/index.js';
1 change: 1 addition & 0 deletions packages/langchain/src/internal.ts
@@ -0,0 +1 @@
export * from './openai/index.js';
50 changes: 50 additions & 0 deletions packages/langchain/src/openai/__snapshots__/util.test.ts.snap
@@ -0,0 +1,50 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP

exports[`Mapping Functions should parse an OpenAI response to a (LangChain) chat response 1`] = `
{
  "generations": [
    {
      "generationInfo": {
        "finish_reason": "stop",
        "function_call": undefined,
        "index": 0,
        "tool_calls": undefined,
      },
      "message": {
        "id": [
          "langchain_core",
          "messages",
          "AIMessage",
        ],
        "kwargs": {
          "additional_kwargs": {
            "finish_reason": "stop",
            "function_call": undefined,
            "index": 0,
            "tool_call_id": "",
            "tool_calls": undefined,
          },
          "content": "The deepest place on Earth is located in the Western Pacific Ocean and is known as the Mariana Trench.",
          "invalid_tool_calls": [],
          "response_metadata": {},
          "tool_calls": [],
        },
        "lc": 1,
        "type": "constructor",
      },
      "text": "The deepest place on Earth is located in the Western Pacific Ocean and is known as the Mariana Trench.",
    },
  ],
  "llmOutput": {
    "created": 1725457796,
    "id": "chatcmpl-A3kgOwg9B6j87n0IkoCFCUCxRSwQZ",
    "model": "gpt-4-32k",
    "object": "chat.completion",
    "tokenUsage": {
      "completionTokens": 22,
      "promptTokens": 15,
      "totalTokens": 37,
    },
  },
}
`;
80 changes: 80 additions & 0 deletions packages/langchain/src/openai/chat.ts
@@ -0,0 +1,80 @@
import { CallbackManagerForLLMRun } from '@langchain/core/callbacks/manager';
import { BaseMessage } from '@langchain/core/messages';
import type { ChatResult } from '@langchain/core/outputs';
import { AzureOpenAiChatClient as AzureOpenAiChatClientBase } from '@sap-ai-sdk/foundation-models';
import { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { AzureOpenAiChatModel } from '@sap-ai-sdk/core';
import { mapLangchainToAiClient, mapOutputToChatResult } from './util.js';
import type {
  AzureOpenAiChatCallOptions,
  AzureOpenAiChatModelParams
} from './types.js';

/**
 * LangChain chat client for Azure OpenAI consumption on SAP BTP.
 */
export class AzureOpenAiChatClient
  extends BaseChatModel<AzureOpenAiChatCallOptions>
  implements AzureOpenAiChatModelParams
{
  modelName: AzureOpenAiChatModel;
  modelVersion?: string;
  resourceGroup?: string;
  temperature?: number;
  top_p?: number;
  logit_bias?: Record<string, unknown>;
  user?: string;
  n?: number;
  presence_penalty?: number;
  frequency_penalty?: number;
  stop?: string | string[];
  max_tokens?: number;
  private openAiChatClient: AzureOpenAiChatClientBase;

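  // Create the underlying foundation-model client and mirror the given model
  // parameters onto this LangChain model instance.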
  constructor(fields: AzureOpenAiChatModelParams) {
    super(fields);
    this.openAiChatClient = new AzureOpenAiChatClientBase(fields);
    this.modelName = fields.modelName;
    this.modelVersion = fields.modelVersion;
    this.resourceGroup = fields.resourceGroup;
    this.temperature = fields.temperature;
    this.top_p = fields.top_p;
    this.logit_bias = fields.logit_bias;
    this.user = fields.user;
    this.n = fields.n;
    this.stop = fields.stop;
    this.presence_penalty = fields.presence_penalty;
    this.frequency_penalty = fields.frequency_penalty;
    this.max_tokens = fields.max_tokens;
  }

  _llmType(): string {
    return 'azure_openai';
  }

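  /**
   * Forwards the LangChain messages to the foundation-model chat client and
   * maps its response back to a LangChain `ChatResult`.
   */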
  override async _generate(
    messages: BaseMessage[],
    options: typeof this.ParsedCallOptions,
    runManager?: CallbackManagerForLLMRun
  ): Promise<ChatResult> {
    const res = await this.caller.callWithOptions(
      {
        signal: options.signal
      },
      () =>
        this.openAiChatClient.run(
          mapLangchainToAiClient(this, options, messages),
          options.requestConfig
        )
    );

    const content = res.getContent();

    // we currently do not support streaming
    await runManager?.handleLLMNewToken(
      typeof content === 'string' ? content : ''
    );

    return mapOutputToChatResult(res.data);
  }
}