Vercel AI SDK
This integration lets you add observability and monitoring to an LLM application built on Vercel's AI SDK with minimal changes. Instrumentation is available for the SDK's two main methods: `generateText` and `streamText`.
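As a minimal sketch of the setup (assuming the Literal AI TypeScript client exposes an `instrumentation.vercel.instrument` wrapper, and using the OpenAI provider from `@ai-sdk/openai`), wrapping `generateText` could look like this:

```ts
import { LiteralClient } from '@literalai/client';
import { generateText as baseGenerateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Assumes LITERAL_API_KEY is set in the environment.
const literalClient = new LiteralClient();

// Assumed wrapper: instruments the Vercel AI SDK method so that every call
// is logged as a generation on Literal AI.
const generateText = literalClient.instrumentation.vercel.instrument(baseGenerateText);

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Why is the sky blue?',
});
```

The same wrapping applies to `streamText`; the instrumented function is then used exactly like the original one.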
With Threads and Runs
In most cases, you will want to keep track of the different generations from your application by grouping them into Threads or Runs. This is especially useful when you want to understand the context in which a generation was made, or when you want to compare different generations.
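For instance, here is a sketch assuming the client exposes a `thread(...).wrap(...)` helper to group instrumented calls under a single Thread (helper names are assumptions, not guaranteed API):

```ts
// Group the generations made inside the callback under one Thread,
// so they appear together with their conversational context on Literal AI.
await literalClient.thread({ name: 'User conversation' }).wrap(async () => {
  const { text: answer } = await generateText({
    model: openai('gpt-4o-mini'),
    prompt: 'Summarize the last user message.',
  });
  return answer;
});
```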
With Metadata, Tags and Step IDs
Using our Vercel AI SDK integration, you can pass metadata, tags and a step ID at the generation level. These values will be automatically added to the generation when it is logged on Literal AI.
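A sketch under the assumption that the instrumented call accepts extra `literalaiMetadata`, `literalaiTags`, and `literalaiStepId` options (option names assumed for illustration):

```ts
import { randomUUID } from 'node:crypto';

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Draft a welcome email.',
  // Assumed option names: these values would be attached to the
  // generation when it is logged on Literal AI.
  literalaiMetadata: { userPlan: 'pro' },
  literalaiTags: ['onboarding', 'email'],
  literalaiStepId: randomUUID(),
});
```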
Cookbooks
You can find more involved examples in our Cookbooks repository:
- This chatbot uses the Vercel AI SDK's `useChat` hook in the frontend
- This example uses the Vercel AI SDK integration in the backend