Literal AI supports OpenLLMetry and its integrations. You can use the OpenLLMetry integration to log your LLM, AI framework, and RAG pipeline calls to Literal AI. To enable the instrumentation, initialize the Literal AI client:
```python
import os

from literalai import AsyncLiteralClient

lai = AsyncLiteralClient(api_key=os.environ["LITERAL_API_KEY"])
lai.initialize()
```
Here is an example of how to log a regular OpenAI chat completion:
```python
import asyncio
import os

from literalai import AsyncLiteralClient
from openai import AsyncOpenAI

lai = AsyncLiteralClient(api_key=os.environ["LITERAL_API_KEY"])
lai.initialize()


# Run a regular OpenAI chat completion inside a Literal AI thread
@lai.thread()
async def run_openai_chat_completion():
    # Attach tags and metadata to the logged thread
    lai.set_properties(
        tags=["hello", "world"],
        metadata={"foo": 1},
    )
    oai = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
    await oai.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello, how are you?"}],
    )


asyncio.run(run_openai_chat_completion())
```
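Because OpenLLMetry also instruments AI frameworks such as LangChain, the same pattern should carry over to framework calls. The sketch below is a minimal, untested example that assumes the `langchain-openai` package is installed and picked up by the instrumentation; the function name `run_langchain_chat` and the tag/metadata values are illustrative.

```python
import asyncio
import os

from langchain_openai import ChatOpenAI
from literalai import AsyncLiteralClient

lai = AsyncLiteralClient(api_key=os.environ["LITERAL_API_KEY"])
lai.initialize()


# Illustrative sketch: a LangChain chat model call wrapped in a
# Literal AI thread. OpenLLMetry's LangChain instrumentation is
# assumed to capture the underlying LLM call.
@lai.thread()
async def run_langchain_chat():
    lai.set_properties(tags=["langchain"], metadata={"example": True})
    llm = ChatOpenAI(model="gpt-4o", api_key=os.environ["OPENAI_API_KEY"])
    await llm.ainvoke("Hello, how are you?")


asyncio.run(run_langchain_chat())
```

If the framework call is captured the same way as the direct OpenAI completion above, RAG pipelines built on instrumented frameworks should log to Literal AI with the same initialization and decorator pattern.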