Client
BaseLiteralClient
Base class for LiteralClient and AsyncLiteralClient.
Example:
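Since this is the base class shared by the two concrete clients, an example typically instantiates one of them. A minimal sketch, assuming the LITERAL_API_KEY environment variable is set:

```python
from literalai import LiteralClient, AsyncLiteralClient

client = LiteralClient()              # synchronous client
async_client = AsyncLiteralClient()   # asynchronous client
```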
Attributes:
instrument_openai
Instruments the OpenAI SDK so that all LLM calls are logged to Literal AI.
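A minimal sketch, assuming the openai package is installed and LITERAL_API_KEY is set; the model name is illustrative. Instrumentation is applied once, before any OpenAI calls:

```python
from literalai import LiteralClient
from openai import OpenAI

literal_client = LiteralClient()
literal_client.instrument_openai()  # patch the OpenAI SDK once, at startup

openai_client = OpenAI()
# This call is now logged as a generation on Literal AI.
completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
```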
instrument_mistralai
Instruments the Mistral AI SDK so that all LLM calls are logged to Literal AI.
instrument_llamaindex
Instruments the LlamaIndex framework so that all RAG & LLM calls are logged to Literal AI.
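The Mistral AI and LlamaIndex instrumentations follow the same pattern as instrument_openai: call them once at startup, before the instrumented SDKs are used. A sketch:

```python
from literalai import LiteralClient

client = LiteralClient()
client.instrument_mistralai()   # log Mistral AI SDK calls
client.instrument_llamaindex()  # log LlamaIndex RAG & LLM calls
```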
langchain_callback
Creates a LangChain callback handler that logs all LLM calls to Literal AI.
Arguments:
Returns: The LangChain callback handler, to be passed to your chains or models.
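A minimal sketch, assuming the langchain-openai package is installed; the model name is illustrative:

```python
from literalai import LiteralClient
from langchain_openai import ChatOpenAI

client = LiteralClient()
cb = client.langchain_callback()

llm = ChatOpenAI(model="gpt-4o-mini")
# Every LLM call made with this callback attached is logged to Literal AI.
llm.invoke("Say hello", config={"callbacks": [cb]})
```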
start_step
Creates a step and starts it in the current context. To log it on Literal AI, call .end() on the step.
This is used to create Agent steps. For conversational messages, use message instead.
Arguments:
Returns: The started step; call .end() on it to log it on Literal AI.
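A minimal sketch; the step name, type, and output field are illustrative assumptions, and the exact keyword arguments accepted by start_step may differ:

```python
from literalai import LiteralClient

client = LiteralClient()

step = client.start_step(name="plan", type="tool")  # hypothetical name/type
try:
    step.output = {"content": "tool result"}  # assumed field; record what the step produced
finally:
    step.end()  # sends the step to Literal AI
```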
get_current_step
Gets the current step from the context.
get_current_thread
Gets the current thread from the context.
get_current_root_run
Gets the current root run from the context.
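These accessors are useful inside instrumented code to reach whatever is active in the context. A sketch, assuming they return None when nothing is active:

```python
step = client.get_current_step()      # None when no step is active
thread = client.get_current_thread()  # None when no thread is active
if step is not None:
    print(step.id)
```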
reset_context
Resets the context, forgetting active steps & setting current thread to None.
flush_and_stop
Sends all threads and steps to the Literal AI API and waits synchronously for every call to complete.
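Typically called once, right before the process exits, so that queued threads and steps are not lost:

```python
from literalai import LiteralClient

client = LiteralClient()

# ... application code that logs threads and steps ...

client.flush_and_stop()  # blocks until every pending API call has completed
```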
LiteralClient
Synchronous client for interacting with the Literal AI API.
Example:
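A minimal sketch, assuming LITERAL_API_KEY is set; the thread name and message contents are illustrative:

```python
from literalai import LiteralClient

client = LiteralClient()

with client.thread(name="Support conversation"):
    client.message(content="Hello!", type="user_message")
    client.message(content="Hi, how can I help?", type="assistant_message")

client.flush_and_stop()
```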
AsyncLiteralClient
Asynchronous client for interacting with the Literal AI API.
Example:
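A minimal sketch, assuming LITERAL_API_KEY is set; with the async client, calls to the underlying API (client.api.*) are awaited. The create_thread call is an assumption shown for illustration:

```python
import asyncio
from literalai import AsyncLiteralClient

client = AsyncLiteralClient()

async def main():
    # Assumed API helper; shown to illustrate awaiting client.api calls.
    thread = await client.api.create_thread(name="Support conversation")
    print(thread.id)
    client.flush_and_stop()  # shared with the sync client via the base class

asyncio.run(main())
```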