Logging LLM Generations, Agent Runs or Conversation Threads

1. Install the Literal AI package

pip install literalai
2. Instantiate the Literal AI client

import os
from literalai import LiteralClient

# Not compatible with gunicorn's --preload flag
# The api_key argument can be omitted: by default the client reads LITERAL_API_KEY from the environment
literalai_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))
3. Get your Literal AI API Key

Go to your project page and click on the Settings tab. You will find your API key in the API Key section. Copy it and expose it to your application as the LITERAL_API_KEY environment variable used in the snippet above.

4. Start logging LLM generations and agent runs

Literal AI provides two complementary ways to capture these logs, combined in the example after this list:

  1. low-level SDKs in Python and TypeScript
  2. integrations with LLM providers (OpenAI, etc.) and AI frameworks (LangChain, LlamaIndex, Vercel AI SDK, etc.)
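
As a rough sketch of how the two fit together with the Python SDK: instrument_openai() logs each OpenAI call as a generation, a function decorated as a step of type "run" groups those generations into an agent run, and a thread context groups runs into a conversation thread. The model name, thread name, and query below are placeholders, and decorator and method names may differ between SDK versions, so treat this as an illustration rather than a reference.

import os

from literalai import LiteralClient
from openai import OpenAI

literalai_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))

# Integration: every OpenAI call made after this line is logged as a generation
literalai_client.instrument_openai()

openai_client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Low-level SDK: decorate your agent logic so it is logged as a run
@literalai_client.step(type="run")
def my_agent(user_query: str) -> str:
    completion = openai_client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": user_query}],
    )
    return completion.choices[0].message.content

# Low-level SDK: group the run into a conversation thread
with literalai_client.thread(name="Demo thread"):
    print(my_agent("What does Literal AI log?"))

# Flush buffered events before the script exits
literalai_client.flush_and_stop()

Once the script finishes, the thread, its run, and the underlying generation should appear in your Literal AI project.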