
Quickstart

Requirements

  • Python 3.11 or newer
  • An API key for whichever LLM provider you choose to use.

Installation

Direct Usage

In your pip install, include the extras for the providers you plan to use, or use the all extra to install support for every provider that any-llm supports.

pip install any-llm-sdk[mistral]  # For Mistral provider
pip install any-llm-sdk[ollama]   # For Ollama provider
# install multiple providers
pip install any-llm-sdk[mistral,ollama]
# or install support for all providers
pip install any-llm-sdk[all]

Library Integration

If you're integrating any-llm into your own library that others will use, you only need to install the base package:

pip install any-llm-sdk

In this scenario, the end users of your library will be responsible for installing the appropriate provider dependencies when they want to use specific providers. any-llm is designed so that you'll only encounter exceptions at runtime if you try to use a provider without having the required dependencies installed.

Those exceptions will clearly describe what needs to be installed to resolve the issue.

Make sure you have the appropriate API key environment variable set for your provider. Alternatively, you could use the api_key parameter when making a completion call instead of setting an environment variable.

export MISTRAL_API_KEY="YOUR_KEY_HERE"  # or OPENAI_API_KEY, etc
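
As a sketch of the alternative, the api_key parameter can be passed directly on the call; the placeholder key below is illustrative:

```python
from any_llm import completion

# Pass the key explicitly instead of relying on the environment variable
response = completion(
    model="mistral/mistral-small-latest",
    messages=[{"role": "user", "content": "Hello!"}],
    api_key="YOUR_KEY_HERE",
)
```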

Basic Usage

completion and acompletion use a unified interface across all providers.

The provider_id portion of the model string must be one of the provider ids supported by any-llm. The model_id portion is passed directly to the provider: to find out which model ids a provider offers, refer to that provider's documentation.

from any_llm import completion
import os

# Make sure you have the appropriate environment variable set
assert os.environ.get('MISTRAL_API_KEY')

model = "mistral/mistral-small-latest" # <provider_id>/<model_id>
# Basic completion
response = completion(
    model=model,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
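
The async variant, acompletion, takes the same arguments and returns the same response shape; a minimal sketch (assuming MISTRAL_API_KEY is set, as above):

```python
import asyncio

from any_llm import acompletion

async def main():
    # Same unified interface as completion, but awaitable
    response = await acompletion(
        model="mistral/mistral-small-latest",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```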

In the script above, switching to an Ollama-hosted Mistral model (assuming you have Ollama installed and running) is as easy as updating the model string to specify the ollama provider and Ollama's model syntax for Mistral:

model="ollama/mistral-small3.2:latest"

Streaming

For the providers that support streaming, you can enable it by passing stream=True:

output = ""
for chunk in completion(
    model=model,
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True
):
    chunk_content = chunk.choices[0].delta.content or ""
    print(chunk_content)
    output += chunk_content

Embeddings

embedding and aembedding allow you to create vector embeddings from text using the same unified interface across providers.

Not all providers support embeddings; check the provider documentation to see which ones do.

from any_llm import embedding
model = "openai/text-embedding-3-small"
result = embedding(
    model=model,
    inputs="Hello, world!" # can be either string or list of strings
)

# Access the embedding vector
embedding_vector = result.data[0].embedding
print(f"Embedding vector length: {len(embedding_vector)}")
print(f"Tokens used: {result.usage.total_tokens}")
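
Since inputs also accepts a list of strings, batching several texts into one call is a small change; a sketch, again assuming an OpenAI key is set:

```python
from any_llm import embedding

result = embedding(
    model="openai/text-embedding-3-small",
    inputs=["Hello, world!", "Goodbye, world!"],  # one embedding per input string
)
print(len(result.data))  # one entry per input
```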

Tools

any-llm supports tool calling for providers that support it. You can pass a list of tools where each tool is either:

  1. Python callable - Functions with proper docstrings and type annotations
  2. OpenAI Format tool dict - Already in OpenAI tool format

from any_llm import completion

def get_weather(location: str, unit: str = "F") -> str:
    """Get weather information for a location.

    Args:
        location: The city or location to get weather for
        unit: Temperature unit, either 'C' or 'F'
    """
    return f"Weather in {location} is sunny and 75{unit}!"

response = completion(
    model="mistral/mistral-small-latest",
    messages=[{"role": "user", "content": "What's the weather in Pittsburgh PA?"}],
    tools=[get_weather]
)

any-llm automatically converts your Python functions to the OpenAI tools format. Functions must have:

  • A docstring describing what the function does
  • Type annotations for all parameters
  • A return type annotation
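
For comparison, the same tool written by hand as an OpenAI-format dict (option 2 above) would look roughly like this; the schema shown is the standard OpenAI function-tool shape:

```python
# The get_weather tool from above, expressed as an OpenAI-format tool dict
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather information for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city or location to get weather for",
                },
                "unit": {
                    "type": "string",
                    "enum": ["C", "F"],
                    "description": "Temperature unit, either 'C' or 'F'",
                },
            },
            "required": ["location"],
        },
    },
}
```

You would then pass it via tools=[weather_tool], exactly as with the callable.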