Releases: dimagi/open-chat-studio-docs

Weekly Release 2026.05.04

05 May 10:12
d14feeb


New Features

  • Added Email as a messaging channel. Users can now communicate with chatbots via email — inbound messages are received through a webhook, routed to the correct chatbot, and replied to in a threaded email conversation. This feature is gated behind the flag_email_channel feature flag. The email channel also supports bidirectional file attachments. Inbound attachments (PDFs, CSVs, images, etc.) are saved and made available to the LLM or pipeline; oversized or blocked file types are surfaced as inline notes so the bot can explain the rejection. Outbound files produced by the pipeline (e.g. via add_file_attachment() in a Python node) are sent as MIME attachments in the same threaded reply. (Channels docs)
  • Trace detail pages now include a Performance Metrics section showing LLM turn count, tool call count, total tokens, input tokens, and output tokens for each pipeline execution. Metrics are collected across all LLM nodes (Router and LLM nodes).
  • Added Voyage AI as a local embedding provider. Teams can now configure a Voyage AI service provider to embed and semantically search documents in collections using models such as voyage-4-large, voyage-4, and voyage-3.5. (LLM Providers docs)
  • LLM evaluators can now automatically tag sessions or messages based on evaluation results. A new Tag Rules section on the evaluator form lets you define conditions (equals a value, or falls within a numeric range) that apply a tag when matched. Applied tags are recorded in an audit log and shown in the Applied Tags column on the run results page. (Evaluators docs)
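The outbound-attachment flow for the email channel can be sketched as a Python node. `add_file_attachment()` is named in the release note, but its exact signature is not documented here; the `(bytes, filename=...)` shape below is an assumption, and the helper is stubbed so the sketch runs outside the sandbox.

```python
# Sketch of a Python node that replies with a generated CSV attachment.
attachments = []

def add_file_attachment(data: bytes, filename: str) -> None:
    """Stand-in for the sandbox-provided helper (assumed signature)."""
    attachments.append((filename, data))

def main(input: str, **kwargs) -> str:
    # Build a small CSV in memory and attach it to the outgoing reply.
    rows = ["date,count", "2026-05-01,12", "2026-05-02,8"]
    add_file_attachment("\n".join(rows).encode("utf-8"), filename="report.csv")
    return "Here is your report as a CSV attachment."

print(main("send me this week's report"))
```

On the email channel, the attached file would be delivered as a MIME attachment in the threaded reply.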

Weekly Release 2026.04.27

27 Apr 10:52
bee3405


New Features

  • Added intron.io as a Text-to-Speech (TTS) provider. Teams can configure an intron provider with an API key to access 90 voices across 45 African and international accents (male and female), including Afrikaans, Hausa, Igbo, Kinyarwanda, Swahili, Yoruba, Zulu, and more.
  • Pipeline structure and event trigger data are now provided as context to the chat widget, so you can ask the assistant questions about the pipeline you are currently viewing.
  • Evaluation datasets now support session-level evaluation, allowing evaluators to judge an entire conversation holistically rather than individual message pairs. When creating a dataset, choose "Message level" (existing behavior) or "Session level" (new). Session-level datasets can be populated by cloning sessions, and incompatible evaluators are automatically filtered out when configuring evaluations.
  • Annotation exports (CSV and JSONL) now include three additional fields: session_id (the UUID of the linked session), flagged (whether the item is flagged), and flagged_reason (the list of flag entries). Flagged items with no annotations are also included in the export.
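The three new export fields can be consumed directly from the JSONL output. In this sketch, `session_id`, `flagged`, and `flagged_reason` are the documented fields; the sample records and any other keys are illustrative only.

```python
import json

# Hypothetical annotation-export JSONL lines.
lines = [
    '{"session_id": "session-a", "flagged": true, "flagged_reason": ["needs review"], "annotations": []}',
    '{"session_id": "session-b", "flagged": false, "flagged_reason": [], "annotations": [{"label": "ok"}]}',
]

# Collect the flagged sessions along with their flag reasons.
flagged = {
    rec["session_id"]: rec["flagged_reason"]
    for rec in map(json.loads, lines)
    if rec["flagged"]
}
print(flagged)
```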

Bug Fixes

  • Fixed a SyntaxError in the evaluator form that prevented the variable autocomplete feature from loading correctly.
  • Fixed a TypeError on mobile Safari that prevented the trends chart from rendering when the chatbot table was dynamically loaded.
  • Fixed an error that occurred when rapidly removing filters in the UI.
  • Fixed an API error that occurred when serializing sessions whose external_id contained a dot character (e.g., Slack-format identifiers). The sessions API endpoint now correctly handles all external_id formats.

Weekly Release 2026.04.13

14 Apr 05:48
48e064d


New Features

  • Added date range and participant filters to the annotation queue view, making it easier to find specific sessions.
  • Python Node code in pipelines can now call end_session() to programmatically end the current session, enabling more complex and deterministic session-ending logic.
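The `end_session()` helper can be combined with ordinary control flow for deterministic session endings. `end_session()` is named in the release note; it is stubbed here so the sketch runs outside the sandbox.

```python
# Sketch of deterministic session-ending logic in a Python node.
ended = {"value": False}

def end_session() -> None:
    """Stand-in for the sandbox-provided helper."""
    ended["value"] = True

def main(input: str, **kwargs) -> str:
    # End the session when the participant sends an explicit stop word.
    if input.strip().lower() in {"stop", "quit", "goodbye"}:
        end_session()
        return "Thanks for chatting. This session is now closed."
    return "Got it. Say 'stop' when you are done."

print(main("stop"))
```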

Bug Fixes

  • Fixed an issue where document indexing failed for certain markdown files containing multi-byte UTF-8 characters.

Weekly Release 2026.04.06

07 Apr 06:32
59d7774


New Features

  • Added the ability to import sessions from an existing evaluation dataset into an annotation queue.
  • Added three session selection modes when adding sessions to an annotation queue: Selected only (default, hand-pick via checkboxes), All matching filters (bulk-add every session matching the current filter), and Sample (add a random percentage of matching sessions using a configurable slider). A confirmation modal is shown for bulk operations.

Weekly Release 2026.03.30

30 Mar 09:34
d3fec0c


New Features

  • When uploading files to a media collection, the upload view now indicates which channels cannot send each file. Hovering over a channel shows the reason why the file cannot be sent.
  • Added ElevenLabs as a speech service provider, supporting text-to-speech (TTS) and speech-to-text (STT). Providers can sync voices from the ElevenLabs catalog, and custom voices created in ElevenLabs are automatically synced to Open Chat Studio.
  • The Meta Cloud API WhatsApp provider now supports media messages. Users can send and receive images, videos, audio, and documents through WhatsApp channels.
  • The Meta Cloud API WhatsApp provider now supports template messages as a fallback when the 24-hour service window has expired. When a bot cannot send a message due to an expired window, it automatically sends a pre-configured WhatsApp template instead of silently dropping the message.

Improvements

  • The default timeout for Custom Action HTTP calls has been increased from 10 seconds to 30 seconds to better accommodate complex or slow external services.

Bug Fixes

  • Fixed an issue where chat poll API responses could not generate correct URLs due to missing request context in the response serializer.
  • Fixed an authentication error that occurred when an invalid chatbot_id was provided in API requests.
  • Fixed an error that could occur when displaying file sizes for files with no recorded content size.
  • Fixed an issue where timeout triggers stopped firing after publishing a new experiment version. Sessions created before the publish were silently excluded from timeout detection.

Weekly Release 2026.03.23

23 Mar 13:59
42cd214


New Features

  • Python Node code in pipelines can now use print() to capture debug and diagnostic output. Printed output is collected and visible as console data in the node's trace span, including in Langfuse.
  • The Meta Cloud API WhatsApp provider now supports voice messages in addition to text messages.
  • Excel and Word document attachments are now automatically converted to text before being sent to the LLM, enabling these file types to be processed in conversations alongside PDFs and images.
  • Added support for Meta Cloud API as a new WhatsApp messaging provider, enabling direct integration with the WhatsApp Business Platform without requiring a third-party intermediary. Configure it using your WhatsApp Business Account ID, System User Access Token, App Secret, and Webhook Verify Token.
  • Added Set Session State Key and Get Session State built-in tools that allow LLM nodes to read and write data from the session state during a conversation.
  • Added Append to Session State and Increment Session State Counter built-in tools, mirroring the existing participant data tools for managing lists and counters in session state.
  • Session CSV exports now include a Session State column containing the data stored in the session_state field, making it easier to inspect pipeline state alongside conversation history.
  • Session detail views now display participant data as of the latest trace, with a timestamp note. AI messages that triggered participant data changes show a diff icon — click it to see a color-coded popover of what was added, removed, or modified.
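The `print()` support above can be used for lightweight debugging inside a Python node; a minimal sketch, where the printed lines are collected as console data on the node's trace span rather than sent to the participant:

```python
def main(input: str, **kwargs) -> str:
    # These prints appear in the trace span's console output, not in the chat.
    print(f"received {len(input)} characters")
    word_count = len(input.split())
    print("word count:", word_count)
    return f"You sent {word_count} words."

print(main("hello there"))
```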

Bug Fixes

  • Fixed an issue where pipelines triggered by events (e.g. conversation end, timeout) silently discarded updates to participant data, session state, and session tags. These state changes are now correctly persisted.

Weekly Release 2026.03.16

16 Mar 09:14
7136513


New Features

  • Added an Annotation Reviewer team role that grants scoped access to annotation queues. Users with this role can view and annotate queues they are assigned to, but cannot manage queues, add sessions, export results, or access other parts of the app.
  • Participant data changes are now tracked per trace. The trace detail page shows a color-coded summary of what data was added, removed, or modified during each conversation turn, and CSV exports include a Participant Data column with the data snapshot at each message.
  • The Send Email pipeline node's subject and recipient fields now support Jinja2 templates, and a new optional body field also accepts Jinja2 templates — the same variables available in the Render Template node. Existing pipelines are unaffected.
  • Natural language filtering is now available to all users. Type plain-English queries (e.g., "sessions from last week excluding WhatsApp") on the session, message, trace, participant, and notification tables and click ✨ Generate to automatically build filter rows.
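The Send Email node's Jinja2 support can be illustrated with a template like the one below. The variable names (`participant_data`, `session`) are assumptions for illustration; use the same variables your Render Template nodes already expose.

```jinja
Subject: Session summary for {{ participant_data.name }}

Hi {{ participant_data.name }},

Your session that started on {{ session.started_at }} has ended.
{% if session.tags %}Tags applied: {{ session.tags | join(", ") }}{% endif %}
```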

Weekly Release 2026.03.02

09 Mar 09:06
dc32119


New Features

  • Tracing is now available to all users — no feature flag required. View and debug conversation traces directly from the session detail page.
  • Natural language filter input added to session and message tables. Users can type plain-English queries (e.g., "sessions from last week excluding WhatsApp") and click ✨ Generate to automatically create filter rows. This feature is in beta and can be enabled by team admins from the team feature flags page.
  • LLM deprecation notifications: Teams now receive in-app notifications when LLM models are deprecated or removed.

Weekly Release 2026.02.23

23 Feb 14:10
b2f9b61


New Features

  • Added support for the Claude Sonnet 4.6 model with adaptive thinking. It replaces Claude Sonnet 4.5 as the default Anthropic model.
  • Document source sync logs are now accessible directly from the Collections page via a "View Sync Logs" button, allowing users to inspect sync history, file counts (added/updated/removed), duration, and error details without leaving the page.
  • Added notification events that alert you when something noteworthy happens in your system, including failures in custom actions (health checks, API calls), chat operations (pipeline execution, LLM errors, tool failures), media handling (audio synthesis and transcription), and message delivery (platform-specific failures).

Weekly Release 2026.02.16

16 Feb 09:16
3dd053d


New Features

  • Python nodes can now attach files fetched via HTTP to AI response messages using the new attach_file_from_response() helper and response_bytes field on HTTP responses. (Documentation)
  • Added http global to Python sandbox for making HTTP requests with security guardrails including SSRF prevention, request/response size limits, timeout clamping, automatic retries, and authentication provider integration. (Documentation)
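The two features above compose naturally: fetch a file with the `http` global, then attach it to the reply. Both helpers are sandbox-provided; their exact shapes are not documented here, so this sketch stubs them with assumed interfaces (`status_code`, `response_bytes`, and a `(response, filename=...)` signature) so it runs standalone.

```python
# Sketch of fetching a file over HTTP and attaching it to the AI response.
class FakeResponse:
    """Stand-in for the sandbox HTTP response (assumed fields)."""
    status_code = 200
    response_bytes = b"%PDF-1.7 fake report"

class FakeHttp:
    """Stand-in for the sandbox-provided http global."""
    def get(self, url, **kwargs):
        return FakeResponse()

http = FakeHttp()
attached = []

def attach_file_from_response(response, filename):
    """Stand-in for the sandbox-provided helper (assumed signature)."""
    attached.append((filename, response.response_bytes))

def main(input: str, **kwargs) -> str:
    resp = http.get("https://example.com/report.pdf", timeout=10)
    if resp.status_code == 200:
        attach_file_from_response(resp, filename="report.pdf")
        return "Attached the latest report."
    return "Could not fetch the report."

print(main("get report"))
```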

Improvements

  • Authentication provider names in Python node HTTP requests are now case-insensitive, allowing auth="My-Provider" and auth="my-provider" to match the same provider.