Metadata-Version: 2.3
Name: opperai
Version: 0.6.2
Summary: Opper Python client
Project-URL: Homepage, https://opper.ai
Project-URL: Documentation, https://docs.opper.ai
Project-URL: Platform, https://platform.opper.ai
Author-email: Opper <opper@opper.ai>
License-File: LICENSE
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Requires-Dist: httpx
Requires-Dist: httpx-sse
Requires-Dist: pydantic
Provides-Extra: test
Requires-Dist: jsonschema; extra == 'test'
Requires-Dist: pytest; extra == 'test'
Requires-Dist: pytest-asyncio; extra == 'test'
Requires-Dist: pytest-cov; extra == 'test'
Requires-Dist: vcrpy; extra == 'test'
Description-Content-Type: text/markdown

# Opper Python SDK

This is the Opper Python SDK. See below to get started, and the [docs](https://docs.opper.ai) for more information. The SDK has built-in documentation and examples in function docstrings, which should be visible in your code editor as you use the functions.

## Install

```bash
pip install opperai
```

## Configuration

### Environment variable

- `OPPER_API_KEY` is read by the SDK if no `api_key` is provided to the `Client` object.
- `OPPER_PROJECT` is attached to traces and can be used for filtering in the Opper UI.
- `OPPER_DEFAULT_MODEL` defines the model used by functions created with the `fn` decorator.

When using the `fn` decorator, the SDK client is automatically initialized with the `OPPER_API_KEY` environment variable.
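If you prefer, the same configuration can be set from Python before the client is created. A minimal sketch (the values below are placeholders, not real credentials):

```python
import os

# Equivalent to exporting these variables in your shell before running your script.
# Placeholder values; replace with your own key and project name.
os.environ["OPPER_API_KEY"] = "your-api-key"
os.environ["OPPER_PROJECT"] = "my-project"
os.environ["OPPER_DEFAULT_MODEL"] = "openai/gpt-4o"
```

Set these before constructing `Opper`, `AsyncOpper`, or using `fn`, since the SDK reads them at initialization.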

## Using the `fn` decorator

```python
from opperai import fn

@fn()
def translate(text: str, target_language: str) -> str:
    """Translate text to a target language."""


print(translate("Hello", "fr"))
# "Bonjour"
```

The `fn` decorator automatically creates an Opper function ready to be called like any other function in your code. It's no different from any ordinary Python function!

### Using the `fn` decorator with images as inputs

```python
from opperai import fn
from opperai.types import ImageContent
from pydantic import BaseModel
from typing import List

class Word(BaseModel):
    letters: List[str]

@fn(model="openai/gpt-4o")
def extract_letters(image: ImageContent) -> Word:
    """given an image extract the word it represents"""

letters = extract_letters(
    ImageContent.from_path("tests/fixtures/images/letters.png"),
)

print(letters)
```

Note: you need to select a model that can handle images as input; see [models](https://docs.opper.ai/functions/models).

## Calling functions

To call a function you created at [https://platform.opper.ai](https://platform.opper.ai) you can use the following code:


```python
from opperai import Opper
from opperai.types import Message

opper = Opper()

function = opper.functions.create(
    "jokes/tell", 
    instructions="given a topic tell a joke",
)

response = function.chat(
    messages=[Message(role="user", content="topic: python")]
)

print(response)
```

## Async function calling

```python
import asyncio
from opperai import AsyncOpper
from opperai.types import Message

async def main():
    opper = AsyncOpper()

    function = await opper.functions.create(
        "jokes/tell", 
        instructions="given a topic tell a joke",
    )
    
    response = await function.chat(
        messages=[Message(role="user", content="topic: python")],
    )

    print(response)

if __name__ == "__main__":
    asyncio.run(main())
```

## Streaming responses

```python
from opperai import Opper
from opperai.types import Message

opper = Opper()

function = opper.functions.create(
    "jokes/tell", 
    instructions="given a topic tell a joke",
    description="tell a joke",
)

response = function.chat(
    messages=[Message(role="user", content="topic: python")],
    stream=True
)

for delta in response.deltas:
    print(delta, end="", flush=True)
```
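The accumulation pattern above can be sketched without the SDK. Here `fake_deltas` is a hypothetical stand-in for `response.deltas`, which yields text chunks as they arrive; a common pattern is to print each chunk while also collecting the full response:

```python
def fake_deltas():
    # Hypothetical stand-in for response.deltas: yields text chunks in order.
    yield from ["Why do Python devs ", "wear glasses? ", "Because they can't C."]

# Print chunks as they arrive while collecting them for later use.
parts = []
for delta in fake_deltas():
    print(delta, end="", flush=True)
    parts.append(delta)
full_text = "".join(parts)
print()
```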

## Async streaming responses

```python
import asyncio
from opperai import AsyncOpper
from opperai.types import Message


async def main():
    opper = AsyncOpper()
    
    function = await opper.functions.create(
        "jokes/tell", 
        instructions="given a topic tell a joke",
        description="tell a joke",
    )

    response = await function.chat(
        messages=[Message(role="user", content="topic: python")],
        stream=True,
    )

    async for delta in response.deltas:
        print(delta, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())
```

## Indexes

```python
from opperai import Opper
from opperai.types import Document, Filter

opper = Opper()

index = opper.indexes.create("my-index")

index.upload_file("file.txt")

index.add(Document(key="key1", content="Hello world 1", metadata={"score": 0}))
index.add(Document(key="key1", content="Hello world 1", metadata={"score": 1}))
index.add(Document(key="key2", content="Hello world 3", metadata={"score": 0}))

response = index.query("Hello")
print(response)

response = index.query("Hello", filters=[Filter(key="score", operation="=", value="1")])
print(response)
```
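Conceptually, the filtered query keeps only documents whose metadata satisfies the filter. A local sketch with plain dictionaries (an illustration of the matching idea, not the SDK's actual implementation):

```python
# Documents as plain dicts, mirroring the Document objects above.
docs = [
    {"key": "key1", "content": "Hello world 1", "metadata": {"score": 1}},
    {"key": "key2", "content": "Hello world 3", "metadata": {"score": 0}},
]

def matches(doc, key, value):
    # Mirrors a Filter(key=..., operation="=", value=...) equality check.
    return doc["metadata"].get(key) == value

hits = [d for d in docs if matches(d, "score", 1)]
print([d["key"] for d in hits])  # ['key1']
```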

# Examples

See more examples in our [documentation](https://docs.opper.ai).
