0.0.612

  • Removed the runtime dependency on llama-index; it is now only required during instrumentation.

0.0.611

  • Adapted the Mistral AI integration to account for changes in the Mistral AI Python client
  • Added an option to send a root run ID when creating a Step
  • The update_prompt_ab_testing API replaces the old promote_prompt and extends it to support A/B testing
  • Tags and metadata arguments passed to LangChain now appear in logs
  • Fixed the event processor for the case where the batch is not filled every X seconds
Starting with 0.0.611, you need to have Literal AI version 0.0.617-beta or above.
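The A/B testing extension above implies choosing between prompt versions according to rollout weights. As a minimal sketch of that idea (not the SDK's actual implementation; the function name and the version-to-rollout mapping are assumptions for illustration), weighted selection could look like:

```python
import random

def pick_prompt_version(rollouts, rng=random):
    """Pick a prompt version according to rollout weights.

    `rollouts` maps a version number to its rollout weight,
    e.g. {1: 25, 2: 75} routes ~25% of traffic to version 1.
    """
    versions = list(rollouts)
    weights = [rollouts[v] for v in versions]
    # random.choices performs a weighted draw; k=1 returns a single version
    return rng.choices(versions, weights=weights, k=1)[0]
```

A version with weight 0 is never selected, so promoting a single version is the special case `{new_version: 100}`.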

0.0.610

  • Adapted the LlamaIndex integration to the new instrumentation API
  • Added an option to send metadata when creating a Generation
Starting with 0.0.610, you need to have Literal AI version 0.0.615-beta or above.
The LlamaIndex integration is compatible with llama-index versions 0.10.58 and above.

0.0.608

  • LangChain integration improvements:
    • run logs now have a properly nested structure
    • improved variable serialization for complex objects

0.0.607

  • Added Mistral AI Instrumentation

0.0.606

Fixes

  • JSON parse errors are now logged

0.0.605

Fixes

  • HTTP calls now follow redirects

0.0.604

Improvements

  • Enhanced error handling

0.0.603

Fixes

  • The prompt version now defaults to None

0.0.602

Fixes

  • Flush no longer waits for the internal batch to be empty
  • Added repr to classes
  • Fixed an error with the participant identifier

0.0.601

Improvements

  • Stripped bytes from steps

New Features

  • Added the get_steps API

0.0.600

Improvements

  • Changed the default batch size from 1 to 5

New Features

  • Rename literal_ to literalai_

Fixes

  • Made params optional

0.0.509

Deprecations

  • format() is deprecated; use format_messages() instead.
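To illustrate the message-level formatting that format_messages() performs, here is a minimal sketch of substituting variables into a list of template messages. The mustache-style `{{placeholder}}` syntax and the message dict shape are assumptions for illustration, not the SDK's exact behavior:

```python
def format_messages(template_messages, **variables):
    """Substitute {{name}} placeholders in each message's content.

    Returns new message dicts; the templates are left untouched.
    """
    formatted = []
    for msg in template_messages:
        content = msg["content"]
        for name, value in variables.items():
            content = content.replace("{{" + name + "}}", str(value))
        formatted.append({**msg, "content": content})
    return formatted
```

Formatting per message, rather than flattening to a single string as format() did, keeps the role structure intact for chat-based models.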

New Features

  • Add support for tags with OpenAI Instrumentation

0.0.508

Deprecations

  • create_prompt() is deprecated; use get_or_create_prompt() instead.

New Features

  • get_or_create_prompt(): retrieves an existing Prompt or creates a new one.

    A `Prompt` is fully defined by its `name`, `template_messages`, `settings` and tools.
    If a prompt already exists for the given arguments, it is returned.
    Otherwise, a new prompt is created.
    
    Args:
        name (str): The name of the prompt to retrieve or create.
        template_messages (List[GenerationMessage]): A list of template messages for the prompt.
        settings (Optional[Dict]): Optional settings for the prompt.
    
    Returns:
        Prompt: The prompt that was retrieved or created.
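The get-or-create semantics described above can be sketched with a toy in-memory store. This is purely illustrative (the `PromptStore` class and its keying scheme are assumptions, not the SDK's implementation), showing why calling with identical arguments returns the existing prompt:

```python
class PromptStore:
    """Toy in-memory store illustrating get-or-create semantics."""

    def __init__(self):
        self._prompts = {}

    def get_or_create_prompt(self, name, template_messages, settings=None):
        # A prompt is identified by the full set of defining arguments,
        # so any change to the template or settings yields a new prompt.
        key = (name, repr(template_messages), repr(settings))
        if key not in self._prompts:
            self._prompts[key] = {
                "name": name,
                "template_messages": template_messages,
                "settings": settings,
            }
        return self._prompts[key]
```

Calling twice with the same arguments returns the same prompt object; changing any defining argument creates a distinct one.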