LiteLLM
LiteLLM lets you interact with 100+ LLMs through a consistent, OpenAI-compatible format, using either its Python SDK or its proxy server.
Starting from LiteLLM v1.48.12, you can:
- Log LLM calls to Literal AI and evaluate your LLM or prompt performance
- Create multi-step traces with Literal AI decorators
- Bind Prompt Templates directly to LiteLLM calls
Prerequisites
Ensure you have the literalai package installed:
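For example, with pip (you will also need litellm itself):

```shell
pip install literalai litellm
```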
Quick Start
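A minimal sketch of logging a LiteLLM call to Literal AI through LiteLLM's callback mechanism; the "literalai" callback name comes from the LiteLLM integration, while the model and placeholder keys are illustrative:

```python
import os
import litellm

# Credentials: Literal AI key for logging, provider key for the model below
os.environ["LITERAL_API_KEY"] = "lsk_..."  # placeholder
os.environ["OPENAI_API_KEY"] = "sk-..."    # placeholder

# Route successful and failed LLM calls to Literal AI
litellm.success_callback = ["literalai"]
litellm.failure_callback = ["literalai"]

response = litellm.completion(
    model="gpt-4o-mini",  # any LiteLLM-supported model works here
    messages=[{"role": "user", "content": "Hello 👋"}],
)
print(response.choices[0].message.content)
```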
Multi Step Traces
This integration is compatible with the Literal AI SDK decorators, enabling conversation and agent tracing.
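As a sketch, assuming the Literal AI SDK's step and thread decorators alongside the LiteLLM callback; the literalai_parent_id metadata key is how the integration nests a generation under the current step (assumed from the integration docs), and the agent logic is illustrative:

```python
import litellm
from literalai import LiteralClient

literalai_client = LiteralClient()  # reads LITERAL_API_KEY from the environment

litellm.success_callback = ["literalai"]
litellm.failure_callback = ["literalai"]

@literalai_client.step(type="run")
def my_agent(question: str) -> str:
    # Nest this generation under the current step of the trace
    response = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
        metadata={"literalai_parent_id": literalai_client.get_current_step().id},
    )
    return response.choices[0].message.content

with literalai_client.thread(name="Demo conversation"):
    print(my_agent("What can you help me with?"))

# Flush pending logs before the process exits (not needed in a long-running server)
literalai_client.flush()
```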
Learn more about Literal AI logging capabilities.
Bind a Generation to its Prompt Template
This integration works out of the box with prompts managed on Literal AI: a specific LLM generation is bound to the prompt template it was created from.
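A sketch of the flow, assuming the Literal AI SDK's get_prompt and format_messages helpers; the prompt name and template variable are placeholders, and the binding is expected to happen automatically when the messages come from a managed prompt:

```python
import litellm
from literalai import LiteralClient

litellm.success_callback = ["literalai"]

literalai_client = LiteralClient()

# Fetch a prompt template managed on Literal AI ("movie-critic" is a placeholder name)
prompt = literalai_client.api.get_prompt(name="movie-critic")

# Render the template; keyword arguments fill template variables (hypothetical variable)
messages = prompt.format_messages(user_query="Was Inception any good?")

# The resulting generation is logged and bound to its prompt template
response = litellm.completion(
    model="gpt-4o-mini",
    messages=messages,
)
```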
Learn more about Prompt Management on Literal AI.
OpenAI Proxy Usage
If you are using the LiteLLM proxy, you can use the Literal AI OpenAI instrumentation to log your calls.
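A minimal sketch, assuming a LiteLLM proxy running locally; the proxy URL and model name are placeholders that must match your proxy configuration, and instrument_openai comes from the Literal AI SDK:

```python
import os
from literalai import LiteralClient
from openai import OpenAI

# Point the OpenAI client at your LiteLLM proxy (URL is a placeholder)
client = OpenAI(
    api_key="anything",  # the proxy manages the real provider credentials
    base_url="http://0.0.0.0:4000",
)

literalai_client = LiteralClient(api_key=os.environ["LITERAL_API_KEY"])
literalai_client.instrument_openai()  # log all calls made with the OpenAI client

response = client.chat.completions.create(
    model="gpt-4o-mini",  # must match a model configured on your proxy
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```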