You can use the Literal AI platform to instrument OpenAI API calls. This allows you to track and monitor your application's OpenAI API usage and replay calls in the Prompt Playground.
The OpenAI instrumentation supports completions, chat completions, and image generation. It handles both regular and streamed responses.
```python
import os

from literalai import LiteralClient

"""
You need to call the `instrument_openai` method from the Literal AI client to
enable the integration. Call it before any OpenAI API call.
"""

literalai_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))
literalai_client.instrument_openai()

# Now you can use the OpenAI API as you normally would
```
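Streamed responses are instrumented in the same way: the generation is logged to Literal AI once the stream has been fully consumed. A minimal sketch (assuming valid `LITERAL_API_KEY` and `OPENAI_API_KEY` environment variables) might look like:

```python
import os

from literalai import LiteralClient
from openai import OpenAI

literalai_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))
literalai_client.instrument_openai()

openai_client = OpenAI()

# With stream=True, the response arrives as incremental chunks; the
# instrumentation assembles them into a single logged generation.
stream = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="")
```

Nothing else changes in your calling code; consuming the stream is what triggers the log.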
You can use Threads and Runs on top of the OpenAI API to create structured and organized logs.
```python
import os

from literalai import LiteralClient
from openai import OpenAI

openai_client = OpenAI()

literalai_client = LiteralClient(api_key=os.getenv("LITERAL_API_KEY"))
literalai_client.instrument_openai()


@literalai_client.step(type="run")
def my_assistant(user_query: str):
    completion = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "user",
                "content": user_query,
            }
        ],
    )
    return completion.choices[0].message.content


def main():
    with literalai_client.thread(name="Example") as thread:
        initial_user_query = "Hello, how are you?"
        my_assistant(initial_user_query)

        follow_up_query = "Follow up query"
        my_assistant(follow_up_query)


main()

# Network requests by the SDK are performed asynchronously.
# Invoke flush to guarantee the completion of all requests prior to the process termination.
# WARNING: If you run a continuous server, you should not use this method.
literalai_client.flush()
```
You can add tags and metadata to the Generations created by the instrumentation.
```python
completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        # ...
    ],
    # Only tags are supported right now on the Python client
    literalai_tags=["tag1", "tag2"],
    literalai_metadata={"key": "value"},
)
```
You can sync your OpenAI Assistant threads with Literal AI in a few lines of code.
```typescript
import OpenAI from 'openai';

import { LiteralClient, User } from '@literalai/client';

const openai = new OpenAI();

const literalAiClient = new LiteralClient({
  apiKey: process.env['LITERAL_API_KEY'],
});
const syncer = literalAiClient.openai(openai).assistant.syncer;

async function main() {
  // You can sync a thread at any moment. We recommend syncing it once you get a `completed` run status.
  const threadId = 'THREAD_ID_TO_SYNC';

  // Optional: Add/update a user on the thread. Use any unique identifier you like.
  const user = new User({ identifier: 'willy', metadata: { name: 'Willy' } });

  await syncer.syncThread(threadId, user);
}

main();
```