aisuite: Unified Interface for Generative AI Providers
aisuite is a lightweight Python library designed to simplify interactions with multiple Generative AI providers through a single, consistent API. By abstracting away the complexities of individual SDKs—such as authentication, parameter mappings, and response formats—it allows developers to write portable code that can seamlessly switch between providers like OpenAI, Anthropic, Google, Hugging Face, AWS, Cohere, Mistral, and Ollama. This library is particularly useful for building applications like chatbots, agentic systems, or experimental setups where flexibility across models is key, without the overhead of managing provider-specific details.
Core Philosophy and Design
Inspired by the familiar OpenAI API structure, aisuite ensures a low learning curve, enabling developers to focus on application logic rather than API integrations. It's not a heavy agents framework but provides lightweight abstractions for creating standalone agents, tool calling, and multi-turn interactions. The modular architecture makes it extensible, allowing easy addition of new providers via plugin-style implementations.
Key benefits include:
- Portability: Write code once and run it across providers by simply changing the model prefix (e.g., `openai:gpt-4o` to `anthropic:claude-3-5-sonnet`).
- Simplicity: Handles authentication via environment variables or config, and standardizes requests/responses.
- Efficiency: Minimal dependencies; install only what you need for specific providers.
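The provider-prefix convention behind that portability can be illustrated with a small sketch. The split-on-first-colon rule mirrors how the prefixed model strings above are structured; the helper name here is ours for illustration, not part of the aisuite API:

```python
def split_model_id(model: str) -> tuple[str, str]:
    """Split a 'provider:model' identifier into its two parts.

    Only the first colon delimits the provider, so model names that
    themselves contain colons (e.g. Ollama tags) stay intact.
    """
    provider, _, name = model.partition(":")
    return provider, name

print(split_model_id("openai:gpt-4o"))               # ('openai', 'gpt-4o')
print(split_model_id("anthropic:claude-3-5-sonnet"))  # ('anthropic', 'claude-3-5-sonnet')
```

Because only the prefix changes, the rest of the request stays identical across providers.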
Installation and Quick Start
Install the base library with `pip install aisuite`. For provider support, use `pip install 'aisuite[anthropic]'` or `pip install 'aisuite[all]'` for everything. Set API keys as environment variables (e.g., `OPENAI_API_KEY`), then create a client:
```python
import aisuite as ai

client = ai.Client()

response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.7
)
print(response.choices[0].message.content)
```

This generates completions in a provider-agnostic way, with support for core parameters like temperature, max_tokens, and tools.
Advanced Features: Tool Calling and Agents
aisuite excels in tool integration, offering two modes:
Manual Tool Handling
Pass tools in OpenAI-style JSON schema, and handle execution yourself for full control:
```python
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather",
        "parameters": {...}
    }
}]

response = client.chat.completions.create(model="...", messages=[...], tools=tools)
```

Automatic Tool Execution
For agentic flows, pass Python functions directly and set max_turns for automated loops:
```python
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in NYC?"}],
    tools=[get_weather],
    max_turns=2
)
```

This handles calling, execution, and feeding results back to the model until completion or the turn limit is reached.
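Conceptually, the automatic mode runs a loop like the sketch below. The `fake_model` stub and the loop structure are ours for illustration only; aisuite's real implementation differs in detail, but the call-execute-feed-back cycle is the same idea:

```python
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

def run_tool_loop(model_fn, messages, tools, max_turns):
    """Illustrative loop: query the model, execute any requested tool,
    append the result, and stop at a final answer or the turn limit."""
    registry = {fn.__name__: fn for fn in tools}
    for _ in range(max_turns):
        reply = model_fn(messages)
        if "tool_call" not in reply:      # final answer, stop early
            return reply["content"]
        call = reply["tool_call"]
        result = registry[call["name"]](**call["arguments"])
        messages.append({"role": "tool", "content": result})
    return None  # turn limit reached without a final answer

# A stub standing in for the LLM: first requests the tool, then answers.
def fake_model(messages):
    if messages[-1]["role"] != "tool":
        return {"tool_call": {"name": "get_weather", "arguments": {"city": "NYC"}}}
    return {"content": f"The weather: {messages[-1]['content']}"}

answer = run_tool_loop(fake_model,
                       [{"role": "user", "content": "Weather in NYC?"}],
                       [get_weather], max_turns=2)
print(answer)  # The weather: Sunny in NYC
```

The `max_turns` bound is what keeps an agentic loop from running indefinitely if the model keeps requesting tools.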
MCP Integration for External Tools
Support for the Model Context Protocol (MCP) allows secure connections to external resources like filesystems or databases. Install with `pip install 'aisuite[mcp]'` and use config dicts or MCPClient for seamless tool exposure:
```python
response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=[{"role": "user", "content": "List files"}],
    tools=[{
        "type": "mcp",
        "name": "filesystem",
        "command": "npx",
        "args": ["...", "/path"]
    }],
    max_turns=3
)
```

This enables LLMs to interact with real-world data sources without custom boilerplate.
Extending and Contributing
Add providers by creating `<provider>_provider.py` files following the BaseProvider pattern. The library is open source under the MIT license, with a welcoming community on Discord. Check the examples directory for notebooks on chat completions, tools, and MCP.
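As a rough sketch of the plugin pattern (the class and method names below are assumptions for illustration, not aisuite's exact interface), a new provider mostly amounts to translating OpenAI-style chat messages into the vendor's native request shape:

```python
class ExampleProvider:
    """Hypothetical 'example' provider: maps OpenAI-style chat messages
    to a made-up native payload. A real provider would follow aisuite's
    base provider pattern and invoke the vendor SDK here."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def build_request(self, model: str, messages: list, **kwargs) -> dict:
        # Translate the portable request into the vendor's expected shape.
        return {
            "model": model,
            "prompt": [{"speaker": m["role"], "text": m["content"]}
                       for m in messages],
            "options": kwargs,
        }

provider = ExampleProvider(api_key="dummy")
payload = provider.build_request("example-1",
                                 [{"role": "user", "content": "Hi"}],
                                 temperature=0.5)
print(payload["prompt"][0])  # {'speaker': 'user', 'text': 'Hi'}
```

Keeping all translation logic inside one such class per provider is what lets the rest of the library stay provider-agnostic.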
In summary, aisuite streamlines multi-provider AI development, making it ideal for rapid prototyping and production apps that leverage diverse LLMs efficiently.
