Monitor all LLM providers through LangChain's unified `init_chat_model` interface:
```python
from langchain.chat_models import init_chat_model
from langchain_core.runnables import RunnableConfig
from literalai import LiteralClient

literalai_client = LiteralClient()

gpt_4o = init_chat_model("gpt-4o", model_provider="openai", temperature=0)
claude_sonnet = init_chat_model("claude-3-5-sonnet-20241022", model_provider="anthropic", temperature=0)

# Literal AI callback
cb = literalai_client.langchain_callback()

# Invoke each model, passing the callback via the run configuration
gpt_4o.invoke("what's your name", config=RunnableConfig(callbacks=[cb]))
claude_sonnet.invoke("what's your name", config=RunnableConfig(callbacks=[cb]))

# Ensure all pending events are sent to Literal AI before exiting
literalai_client.flush()
```
You can thus monitor Anthropic, Mistral AI, Cohere, and many other LLM providers. The full list of supported providers is in the LangChain documentation.