Merge pull request #91 from Mirascope/release/v0.2.4
Release/v0.2.4
willbakst authored Feb 23, 2024
2 parents 9ba6c06 + 1b40c21 commit 7789f4c
Showing 3 changed files with 20 additions and 42 deletions.
37 changes: 11 additions & 26 deletions docs/README.md
@@ -98,39 +98,14 @@ print(prompt.messages)
* **Convenience**: Tooling that is **clean**, **elegant**, and **delightful**, and that **you don't need to maintain**.
* **Open**: Dedication to building **open-source tools** you can use with **your choice of LLM**.

## Requirements

**Pydantic** is the only strict requirement; it is included automatically during installation.

The Prompt CLI and LLM Convenience Wrappers have additional requirements, which you can opt in to if you're using those features.

## Installation

Install Mirascope and start building with LLMs in minutes.

```sh
$ pip install mirascope
```

This will install the `mirascope` package along with `pydantic`.

To include extra dependencies, run:

```sh
$ pip install mirascope[cli] # Prompt CLI
$ pip install mirascope[openai] # LLM Convenience Wrappers
$ pip install mirascope[all] # All Extras
```

<details>
<summary>For those using zsh, you'll need to escape brackets:</summary>

```sh
$ pip install mirascope\[all\]
```

</details>

## 🚨 Warning: Strong Opinion 🚨

Prompt Engineering is engineering. Beyond basic illustrative examples, prompting quickly becomes complex. Separating prompts from the engineering workflow only limits what you can build with LLMs. We firmly believe that prompts are far more than "just f-strings" and thus require purpose-built developer tools that make writing complex prompts as easy as possible.
@@ -284,12 +259,18 @@ print(prompt)
<p>Since the `Prompt`'s `str` method uses the template, the above will work as expected.</p>
</details>
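
For instance, a minimal prompt might look like the following sketch (the template text and `name` attribute are illustrative, not from the docs):

```python
from mirascope import Prompt


class GreetingsPrompt(Prompt):
    """Hello! It's nice to meet you, {name}."""

    name: str


prompt = GreetingsPrompt(name="William Bakst")
# str(prompt) formats the docstring template with the prompt's attributes.
print(str(prompt))  # Hello! It's nice to meet you, William Bakst.
# messages is assumed to default to a single user message.
print(prompt.messages)
```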

### Additional Examples

If you're using our LLM convenience wrappers, you'll need to get an API key to use the model of your choice. [OpenAI Account Setup](https://platform.openai.com/docs/quickstart/account-setup) will walk you through how to get your OpenAI API key. You can follow similar steps for the provider of your choice (e.g. [Anyscale](https://docs.anyscale.com/) or [Together](https://docs.together.ai/reference/authentication-1)), or you can use a raw client for non-OpenAI models, like [Mistral](https://docs.mistral.ai/).
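
For example, one way to make your key available to the examples below is through an environment variable (a sketch; `YOUR_API_KEY` is a placeholder, and `OPENAI_API_KEY` is the variable the OpenAI client reads by default):

```python
import os

# Set the key before constructing any chat model or client.
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"
```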

Because the `Prompt` class is built on top of `BaseModel`, prompts easily integrate with tools like [FastAPI](https://fastapi.tiangolo.com):

<details>
<summary>FastAPI Example</summary>

```python
import os

from fastapi import FastAPI
from mirascope import OpenAIChat

@@ -313,6 +294,8 @@ You can also use the `Prompt` class with whichever LLM you want to use:
<summary>Mistral Example</summary>

```python
import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

@@ -339,6 +322,8 @@ chat_response = client.chat(
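```

Since the diff collapses the middle of this example, here is a hedged end-to-end sketch of what it might look like (the prompt template and the `mistral-tiny` model name are illustrative assumptions):

```python
import os

from mirascope import Prompt
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage


class GreetingsPrompt(Prompt):
    """Hello! It's nice to meet you, {name}."""

    name: str


prompt = GreetingsPrompt(name="William Bakst")
client = MistralClient(api_key=os.environ["MISTRAL_API_KEY"])
chat_response = client.chat(
    model="mistral-tiny",
    messages=[ChatMessage(role="user", content=str(prompt))],
)
print(chat_response.choices[0].message.content)
```
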
<summary>OpenAI Example</summary>

```python
import os

from openai import OpenAI

from prompts import GreetingsPrompt
```
6 changes: 3 additions & 3 deletions docs/concepts/llm_convenience_wrappers.md
@@ -15,7 +15,7 @@ Mirascope provides convenience wrappers around the OpenAI client to make writing

### Create

You can initialize an [`OpenAIChat`](../api/chat/models.md#mirascope.chat.models.OpenAIChat) instance and call [`create`](../api/chat/models.md#mirascope.chat.models.OpenAIChat.create) to generate an [`OpenAIChatCompletion`](../api/chat/types.md#mirascope.chat.types.OpenAIChatCompletion):
You can initialize an [`OpenAIChat`](../api/chat/models/openai_chat.md#mirascope.chat.models.openai_chat.OpenAIChat) instance and call [`create`](../api/chat/models/openai_chat.md#mirascope.chat.models.openai_chat.OpenAIChat.create) to generate an [`OpenAIChatCompletion`](../api/chat/types.md#mirascope.chat.types.OpenAIChatCompletion):

```python
from mirascope import OpenAIChat, Prompt
@@ -104,7 +104,7 @@ recipe = recipe_by_chef_using("apples", "japanese")
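```

A complete version of the `create` flow might look like the following sketch (the `GreetingsPrompt` template is illustrative, and `OpenAIChat()` is assumed to read `OPENAI_API_KEY` from the environment and use its default model):

```python
import os

from mirascope import OpenAIChat, Prompt

os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"


class GreetingsPrompt(Prompt):
    """Hello! It's nice to meet you, {name}."""

    name: str


chat = OpenAIChat()
completion = chat.create(GreetingsPrompt(name="William Bakst"))
# completion is an OpenAIChatCompletion; printing it is assumed to print
# the generated message content.
print(completion)
```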

### Streaming

You can use the [`stream`](../api/chat/models.md#mirascope.chat.models.OpenAIChat.stream) method to stream a response. All this does is set `stream=True` and provide the [`OpenAIChatCompletionChunk`](../api/chat/types.md#mirascope.chat.types.OpenAIChatCompletionChunk) convenience wrappers around the response chunks.
You can use the [`stream`](../api/chat/models/openai_chat.md#mirascope.chat.models.openai_chat.OpenAIChat.stream) method to stream a response. All this does is set `stream=True` and provide the [`OpenAIChatCompletionChunk`](../api/chat/types.md#mirascope.chat.types.OpenAIChatCompletionChunk) convenience wrappers around the response chunks.

```python
chat = OpenAIChat()
@@ -132,7 +132,7 @@ chunk.content # original.choices[0].delta.content
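```

Here is a hedged sketch of a full streaming loop, reusing the illustrative `GreetingsPrompt` from the sketch above:

```python
from mirascope import OpenAIChat

chat = OpenAIChat()
stream = chat.stream(GreetingsPrompt(name="William Bakst"))
for chunk in stream:
    # Each chunk is an OpenAIChatCompletionChunk, whose `content` property
    # wraps original.choices[0].delta.content.
    print(chunk.content, end="", flush=True)
```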

### Extraction

Often you want to extract structured information into a format like JSON. The [`extract`](../api/chat/models.md#mirascope.chat.models.OpenAIChat.extract) method makes this extremely easy by extracting the information into a Pydantic `BaseModel` schema that you define:
Often you want to extract structured information into a format like JSON. The [`extract`](../api/chat/models/openai_chat.md#mirascope.chat.models.openai_chat.OpenAIChat.extract) method makes this extremely easy by extracting the information into a Pydantic `BaseModel` schema that you define:

```python
from mirascope import OpenAIChat
```
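
A complete extraction might look like the following sketch (the `Book` schema and the `extract(schema, prompt)` argument order are assumptions based on the surrounding docs):

```python
from pydantic import BaseModel

from mirascope import OpenAIChat


class Book(BaseModel):
    title: str
    author: str


chat = OpenAIChat()
book = chat.extract(Book, "The Name of the Wind is a novel by Patrick Rothfuss.")
print(book)  # expected: title='The Name of the Wind' author='Patrick Rothfuss'
```
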
19 changes: 6 additions & 13 deletions pyproject.toml
@@ -1,7 +1,7 @@
[tool.poetry]
name = "mirascope"
version = "0.2.3"
description = "The most pythonic LLM application building experience"
version = "0.2.4"
description = "LLM toolkit for lightning-fast, high-quality development"
license = "MIT"
authors = [
"William Bakst <[email protected]>",
@@ -17,17 +17,10 @@ mirascope = 'mirascope.cli.commands:app'
[tool.poetry.dependencies]
python = ">=3.9,<3.13"
pydantic = "^2.0.2"

# A list of optional dependencies that are required for certain features
typer = { version = "^0.9.0", optional = true, extras = ["all"] }
Jinja2 = { version = "^3.1.3", optional = true }
openai = { version = "^1.6.0", optional = true }
docstring-parser = { version = "^0.15", optional = true }

[tool.poetry.extras]
cli = ["typer", "Jinja2"]
openai = ["openai", "docstring-parser"]
all = ["typer", "Jinja2", "openai", "docstring-parser"]
typer = { version = "^0.9.0", extras = ["all"] }
Jinja2 = "^3.1.3"
openai = "^1.6.0"
docstring-parser = "^0.15"

[tool.poetry.group.dev.dependencies]
mypy = "^1.6.1"
