The Llama Index integration lets you monitor your RAG pipelines with a single line of code.

The Llama Index integration already supports LLM tracing, so you should not use it in conjunction with other LLM provider integrations such as OpenAI.

The Llama Index integration in the Python SDK is compatible with Llama Index version 0.10.58 and later.

Each Thread results in the following tree on Literal AI:

A Llama Index RAG thread on Literal AI