Important Notice: Literal AI will be discontinued, with the service remaining available until October 31st, 2025. This guide will help you migrate to alternative solutions.
Chainlit is a great tool for building internal and external AI chatbots and copilots. We’re very thankful for the community that has built on top of it. In 2024, we also built Literal AI, an LLMOps product. Both products achieved notable milestones:
Chainlit: Used by Stability AI, Microsoft, Nvidia and many others
Literal AI: Adopted by various companies for LLM observability
However, despite enabling great AI projects, we weren’t able to differentiate enough in these competitive markets to build sustainable revenue.
After careful consideration, we’ve made the difficult decision to discontinue Literal AI (the service remains available until October 31st, 2025) and are actively seeking a new maintainer to take over Chainlit. These decisions were not taken lightly, but we believe this is the best way forward. While this marks the end of our direct involvement with these products, we’re excited about what comes next.
1. Export all your data from Literal AI using the Export Data functionality. This includes datasets, experiments, and prompts. (A short sketch for inspecting the exported files locally follows these steps.)
2. Choose an alternative and upload your datasets. Once you have exported your data, pick an alternative solution and upload your assets to it.
3. Update your application’s logging mechanism to point to the new solution you chose. Prefer an alternative that supports OpenTelemetry: most AI frameworks now offer OpenTelemetry-based observability, and most LLMOps platforms can ingest it. (A minimal OpenTelemetry setup sketch also follows these steps.)
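If you want to sanity-check the export from step 1 before re-uploading it elsewhere, a minimal local sketch like the one below can help. It assumes the export was saved as JSON files in a local folder; the folder name and file layout are placeholders, not part of the Export Data format.

```python
# Minimal sketch: summarize exported Literal AI files before re-uploading them.
# Assumes the export produced JSON files in a local directory (names are hypothetical).
import json
from pathlib import Path

EXPORT_DIR = Path("literal_export")  # wherever you saved the exported files

for path in sorted(EXPORT_DIR.glob("*.json")):
    with path.open() as f:
        records = json.load(f)
    # Print a quick count per file so you can verify the export looks complete
    count = len(records) if isinstance(records, list) else 1
    print(f"{path.name}: {count} record(s)")
```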
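For step 3, here is a minimal sketch of pointing OpenTelemetry traces at a new backend, assuming a Python application and a platform that accepts OTLP over HTTP. The endpoint URL, authorization header, service name, and span attributes are placeholders to replace with whatever your chosen platform documents.

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Point the OTLP exporter at the backend you chose (endpoint and headers are placeholders).
exporter = OTLPSpanExporter(
    endpoint="https://your-observability-backend.example.com/v1/traces",
    headers={"Authorization": "Bearer <your-api-key>"},
)

# Register a tracer provider that batches and exports spans to the new backend.
provider = TracerProvider(resource=Resource.create({"service.name": "my-llm-app"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-llm-app")

# Spans emitted by your instrumented frameworks (or created manually) now flow to the new backend.
with tracer.start_as_current_span("llm.generation") as span:
    span.set_attribute("llm.model", "gpt-4o")  # example attribute
```

Because OTLP is a standard wire format, switching platforms later usually only requires changing the endpoint and credentials rather than re-instrumenting your application.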
Literal AI will remain fully operational until October 31st, 2025. You have ample time to plan and execute your migration.
Will I lose my data if I don't migrate?
Yes, all data will be permanently deleted when the service is discontinued. Please ensure you export all important data before October 31st, 2025.
What happens to Chainlit?
We’re actively seeking a new maintainer for Chainlit. The project will continue under new leadership to serve the community.
We apologize for any inconvenience this transition may cause and appreciate your understanding. The LLMOps ecosystem has grown significantly, and these alternative solutions will serve you well as you continue building amazing AI applications.