You can use the Literal AI platform to instrument Mistral AI API calls. This allows you to track and monitor the usage of Mistral AI API calls in your application and replay them in the Prompt Playground.
The Mistral AI instrumentation supports sync, async, streamed and regular responses!
```python
import json
import os

from mistralai import Mistral, UserMessage
from literalai import LiteralClient

literalai_client = LiteralClient(api_key="")
literalai_client.instrument_mistralai()

client = Mistral(api_key="")

model = "mistral-large-latest"
messages = [
    {
        "role": "user",
        "content": "What is the best French cheese?",
    }
]

# With streaming
stream_response = client.chat.stream(
    model=model,
    messages=messages,
)

for chunk in stream_response:
    print(chunk.data.choices[0].delta.content)

# Now you can use the Mistral AI API as you normally would
```
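For a regular (non-streaming) response under the same instrumentation, `client.chat.complete` can be used instead of `client.chat.stream`. A minimal sketch, assuming API keys are provided via the `LITERAL_API_KEY` and `MISTRAL_API_KEY` environment variables (the snippet above passes them inline):

```python
import os

from mistralai import Mistral
from literalai import LiteralClient

# Instrument Mistral AI calls so they are logged to Literal AI
literalai_client = LiteralClient(api_key=os.environ["LITERAL_API_KEY"])
literalai_client.instrument_mistralai()

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Regular (non-streaming) chat completion — tracked by Literal AI
# just like the streamed call above
chat_response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "What is the best French cheese?"}],
)
print(chat_response.choices[0].message.content)
```

The async variants (`client.chat.complete_async` and `client.chat.stream_async`) are instrumented the same way.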