LiteLLM allows you to interact with 100+ LLMs seamlessly using a consistent OpenAI-compatible format, either through their Python SDK or their proxy server.
Starting from LiteLLM v1.48.12, you can:
- Log LLM calls to Literal AI and evaluate your LLM or prompt performance
- Create multi-step traces with Literal AI decorators
- Bind Prompt Templates directly to LiteLLM calls
Prerequisites
Ensure you have the literalai package installed:

```shell
pip install literalai litellm
```
Quick Start
```python
import os

import litellm

# Literal AI and OpenAI credentials
os.environ["LITERAL_API_KEY"] = ""
os.environ["OPENAI_API_KEY"] = ""
# Send events to Literal AI one at a time instead of batching them
os.environ["LITERAL_BATCH_SIZE"] = "1"

# Log successful and failed LLM calls to Literal AI
litellm.success_callback = ["literalai"]
litellm.failure_callback = ["literalai"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
)
```
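The response follows the OpenAI format regardless of the underlying model, so you can read the completion text the usual way:

```python
# OpenAI-compatible response shape, identical across all LiteLLM models
print(response.choices[0].message.content)
```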
Multi Step Traces
This integration is compatible with the Literal AI SDK decorators, enabling conversation and agent tracing:
```python
import os

import litellm
from literalai import LiteralClient

os.environ["LITERAL_API_KEY"] = ""
os.environ["OPENAI_API_KEY"] = ""
os.environ["LITERAL_BATCH_SIZE"] = "1"

# Log call inputs, successes, and failures to Literal AI
litellm.input_callback = ["literalai"]
litellm.success_callback = ["literalai"]
litellm.failure_callback = ["literalai"]

literalai_client = LiteralClient()

@literalai_client.run
def my_agent(question: str):
    # Attach the LLM call to the current step so it is nested inside the trace
    response = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
        metadata={"literalai_parent_id": literalai_client.get_current_step().id},
    )
    return response

my_agent("Hello world")

# Send any buffered events to Literal AI before the process exits
literalai_client.flush()
```
Learn more about Literal AI logging capabilities.
Bind a Generation to its Prompt Template
This integration works out of the box with prompts managed on Literal AI: an LLM generation created from a managed prompt is automatically bound to its template.
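Below is a minimal sketch using the Literal AI SDK's `get_prompt` and `format_messages` helpers; the prompt name "Default" is a placeholder for a template that exists in your own project:

```python
import os

import litellm
from literalai import LiteralClient

os.environ["LITERAL_API_KEY"] = ""
os.environ["OPENAI_API_KEY"] = ""

litellm.success_callback = ["literalai"]

literalai_client = LiteralClient()

# Assumption: a prompt template named "Default" exists in your Literal AI project
prompt = literalai_client.api.get_prompt(name="Default")
messages = prompt.format_messages()  # render the template into chat messages

# Since the messages come from a managed prompt, the logged generation
# is linked to that template on Literal AI
response = litellm.completion(
    model="gpt-4o-mini",
    messages=messages,
)

literalai_client.flush()
```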
Learn more about Prompt Management on Literal AI.
OpenAI Proxy Usage
If you are using the LiteLLM proxy, you can use the Literal AI OpenAI instrumentation to log your calls.
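To try this locally, you can start a proxy with the LiteLLM CLI (a minimal sketch; the model and port are placeholders for your own proxy configuration):

```shell
litellm --model gpt-4o-mini --port 4000
```

Then point the OpenAI client at the proxy and instrument it with Literal AI: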
```python
from literalai import LiteralClient
from openai import OpenAI

# Point the OpenAI client at the LiteLLM proxy; the proxy handles provider credentials
client = OpenAI(
    api_key="anything",
    base_url="http://0.0.0.0:4000"
)

# Instrument the OpenAI client so every call is logged to Literal AI
literalai_client = LiteralClient(api_key="")
literalai_client.instrument_openai()

settings = {
    "model": "gpt-4o-mini",
    "temperature": 0,
}

response = client.chat.completions.create(
    messages=[
        {
            "role": "system",
            "content": "You are a helpful bot, you always reply in Spanish"
        },
        {
            "role": "user",
            "content": "Hello, how are you?"
        }
    ],
    **settings
)
```