5 changes: 3 additions & 2 deletions skills/dotnet-microsoft-agent-framework/SKILL.md
@@ -1,6 +1,6 @@
---
name: dotnet-microsoft-agent-framework
version: "1.5.1"
category: "AI"
description: "Build .NET AI agents and multi-agent workflows with Microsoft Agent Framework using the right agent type, threads, tools, workflows, hosting protocols, and enterprise guardrails."
compatibility: "Requires preview-era Microsoft Agent Framework packages and a .NET application that truly needs agentic or workflow orchestration."
@@ -54,6 +54,7 @@ flowchart LR
- `AgentResponse` and `AgentResponseUpdate` are not just text containers. They can include tool calls, tool results, structured output, reasoning-like updates, and response metadata.
- `ChatClientAgent` is the safest default when you already have an `IChatClient` and do not need a hosted-agent service.
- `Workflow` is an explicit graph of executors and edges. Use it when the control flow must stay inspectable, typed, resumable, or human-steerable.
- `AgentWorkflowBuilder` provides high-level factory methods such as `BuildConcurrent` for common agent orchestration patterns. Use it when you need concurrent or sequential agent pipelines without writing custom executor classes.
- Hosting layers such as OpenAI-compatible HTTP, A2A, and AG-UI are adapters over your in-process agent or workflow. They do not replace the core architecture choice.
- Durable agents are a hosting and persistence decision for Azure Functions. They are not the default answer for ordinary app-level orchestration.

@@ -65,7 +66,7 @@ flowchart LR
| OpenAI-style future-facing APIs, background responses, or richer response state | Responses-based agent | Better fit for new OpenAI-compatible integrations |
| Simple client-managed chat history | Chat Completions agent | Keeps request/response simple |
| Service-hosted agents and service-owned threads/tools | Azure AI Foundry Agent or other hosted agent | Managed runtime is the requirement |
| Typed multi-step orchestration | `Workflow` or `AgentWorkflowBuilder` helpers | Control flow stays explicit and testable; use `BuildConcurrent` for agent fan-out/fan-in |
| Week-long or failure-resilient Azure execution | Durable agent on Azure Functions | Durable Task gives replay and persisted state |
| Agent-to-agent interoperability | A2A hosting or A2A proxy agent | This is protocol-level delegation, not local inference |
| Browser or web UI protocol integration | AG-UI | Designed for remote UI sync and approval flows |
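The `AgentWorkflowBuilder` row above can be sketched as follows. This is a minimal illustration, not a definitive implementation: it assumes the preview `Microsoft.Agents.AI.Workflows` package, that `BuildConcurrent` accepts a collection of agents as described in this skill, and that `writer` and `reviewer` are pre-built `AIAgent` instances.

```csharp
using Microsoft.Agents.AI;
using Microsoft.Agents.AI.Workflows;

// Fan the same input out to both agents and collect their responses
// (fan-out/fan-in) without writing custom executor classes.
// `writer` and `reviewer` are assumed to be existing AIAgent instances.
Workflow workflow = AgentWorkflowBuilder.BuildConcurrent(new[] { writer, reviewer });
```

Compared to hand-building a `Workflow` graph, the factory method trades fine-grained control over executors and edges for brevity in the common concurrent and sequential patterns.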
@@ -5,7 +5,7 @@ zone_pivot_groups: programming-languages
author: sergeymenshykh
ms.topic: reference
ms.author: semenshi
ms.date: 03/17/2026
ms.service: agent-framework
---

@@ -14,14 +14,15 @@ ms.service: agent-framework
The Microsoft Agent Framework supports background responses for long-running operations. This feature lets an agent start processing a request and return a continuation token that can be used to poll for results or resume interrupted streams.

> [!TIP]
> For a complete working example, see the [Background Responses sample](https://github.com/microsoft/agent-framework/blob/main/dotnet/samples/02-agents/Agents/Agent_Step14_BackgroundResponses/Program.cs).

## When to Use Background Responses

Background responses are particularly useful for:
- Complex reasoning tasks that require significant processing time
- Operations that may be interrupted by network issues or client timeouts
- Scenarios where you want to start a long-running task and check back later for results
- Long-running tasks that also invoke function tools during background processing

## How Background Responses Work

@@ -56,29 +57,26 @@ For non-streaming scenarios, when you initially run an agent, it may or may not

```csharp
AIAgent agent = new AzureOpenAIClient(
    new Uri(endpoint),
    new DefaultAzureCredential())
    .GetResponsesClient(deploymentName)
    .AsAIAgent();

AgentRunOptions options = new() { AllowBackgroundResponses = true };

AgentSession session = await agent.CreateSessionAsync();

// Get initial response - may return with or without a continuation token
AgentResponse response = await agent.RunAsync("Write a very long novel about otters in space.", session, options);

// Continue to poll until the final response is received
while (response.ContinuationToken is { } token)
{
    // Wait before polling again.
    await Task.Delay(TimeSpan.FromSeconds(2));

    options.ContinuationToken = token;
    response = await agent.RunAsync(session, options);
}

Console.WriteLine(response.Text);
@@ -91,40 +89,41 @@ Console.WriteLine(response.Text);
- If a continuation token is returned, the agent has started a background process that requires polling
- Use the continuation token from the previous response in subsequent polling calls
- When `ContinuationToken` is `null`, the operation is complete
- Use `AgentSession` (via `CreateSessionAsync()`) to hold conversation context instead of `AgentThread`

## Streaming Background Responses

In streaming scenarios, background responses behave much like regular streaming responses: the agent streams all updates back to consumers in real time. The key difference is that if the original stream is interrupted, agents support stream resumption through continuation tokens. Each update includes a continuation token that captures the current state, so the stream can be resumed from exactly where it left off by passing that token to a subsequent streaming call:

```csharp
AIAgent agent = new AzureOpenAIClient(
    new Uri(endpoint),
    new DefaultAzureCredential())
    .GetResponsesClient(deploymentName)
    .AsAIAgent();

AgentRunOptions options = new() { AllowBackgroundResponses = true };

AgentSession session = await agent.CreateSessionAsync();

AgentResponseUpdate? lastReceivedUpdate = null;

await foreach (AgentResponseUpdate update in agent.RunStreamingAsync("Write a very long novel about otters in space.", session, options))
{
    Console.Write(update.Text);

    lastReceivedUpdate = update;

    // Simulate connection loss after the first piece of content is received.
    // Guard against updates that carry no text (e.g. tool-call or metadata updates).
    if (!string.IsNullOrEmpty(update.Text))
    {
        break;
    }
}

// Resume from the interruption point captured by the continuation token
options.ContinuationToken = lastReceivedUpdate?.ContinuationToken;
await foreach (AgentResponseUpdate update in agent.RunStreamingAsync(session, options))
{
Console.Write(update.Text);
}
@@ -136,6 +135,53 @@ await foreach (var update in agent.RunStreamingAsync(thread, options))
- Store the continuation token from the last received update before interruption
- Use the stored continuation token to resume the stream from the interruption point

## Background Responses with Tools and State Persistence

Background responses also support function calling: the agent can invoke registered tools while it processes in the background. Combined with session serialization, this lets you persist agent state between polling cycles and restore it in a new process or after a restart.

> [!TIP]
> For a complete working example, see the [Background Responses with Tools and Persistence sample](https://github.com/microsoft/agent-framework/blob/main/dotnet/samples/02-agents/Agents/Agent_Step10_BackgroundResponsesWithToolsAndPersistence/Program.cs).

```csharp
AIAgent agent = new AzureOpenAIClient(
new Uri(endpoint),
new DefaultAzureCredential())
.GetResponsesClient(deploymentName)
.AsAIAgent(
name: "SpaceNovelWriter",
instructions: "You are a space novel writer. Always research relevant facts before writing.",
tools: [AIFunctionFactory.Create(ResearchSpaceFactsAsync), AIFunctionFactory.Create(GenerateCharacterProfilesAsync)]);

AgentRunOptions options = new() { AllowBackgroundResponses = true };
AgentSession session = await agent.CreateSessionAsync();

AgentResponse response = await agent.RunAsync("Write a very long novel about astronauts exploring an uncharted galaxy.", session, options);

while (response.ContinuationToken is not null)
{
    // Persist session and continuation token to durable storage
    await PersistAgentState(agent, session, response.ContinuationToken);

    await Task.Delay(TimeSpan.FromSeconds(10));

    // Restore state (e.g. after a process restart)
    var (restoredSession, continuationToken) = await RestoreAgentState(agent);

    // Keep working with the restored session so the next persist call
    // saves the state that the latest run actually used.
    session = restoredSession;

    options.ContinuationToken = continuationToken;
    response = await agent.RunAsync(session, options);
}

Console.WriteLine(response.Text);
```

### Key Points for Tools and Persistence:

- Tools registered via `AIFunctionFactory.Create(...)` are called normally during background operations
- Use `agent.SerializeSessionAsync(session)` to persist the session to a `JsonElement`
- Use `agent.DeserializeSessionAsync(serializedSession)` to restore a session from storage
- Use `AgentAbstractionsJsonUtilities.DefaultOptions` when serializing `ResponseContinuationToken` directly
- Persisting state enables recovery from process restarts and server-side recycling between polling cycles
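The `PersistAgentState` and `RestoreAgentState` helpers called in the example above are not framework APIs. A minimal file-based sketch, assuming the serialization methods listed in these key points (the file name and JSON shape are illustrative assumptions), might look like:

```csharp
using System.Text.Json;

// Sketch only: persists the session plus continuation token to a local file.
static async Task PersistAgentState(AIAgent agent, AgentSession session, ResponseContinuationToken token)
{
    JsonElement serializedSession = await agent.SerializeSessionAsync(session);
    string json = JsonSerializer.Serialize(new
    {
        Session = serializedSession,
        Token = JsonSerializer.SerializeToElement(token, AgentAbstractionsJsonUtilities.DefaultOptions)
    });
    await File.WriteAllTextAsync("agent-state.json", json);
}

// Sketch only: restores the session and continuation token from the same file.
static async Task<(AgentSession Session, ResponseContinuationToken? Token)> RestoreAgentState(AIAgent agent)
{
    using JsonDocument doc = JsonDocument.Parse(await File.ReadAllTextAsync("agent-state.json"));
    AgentSession session = await agent.DeserializeSessionAsync(doc.RootElement.GetProperty("Session"));
    ResponseContinuationToken? token = doc.RootElement.GetProperty("Token")
        .Deserialize<ResponseContinuationToken>(AgentAbstractionsJsonUtilities.DefaultOptions);
    return (session, token);
}
```

In production you would write to durable storage (blob, database) keyed by a conversation or job identifier rather than a local file.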

::: zone-end

::: zone pivot="programming-language-python"
@@ -152,13 +198,15 @@ When working with background responses, consider the following best practices:
- **Implement appropriate polling intervals** to avoid overwhelming the service
- **Use exponential backoff** for polling intervals if the operation is taking longer than expected
- **Always check for `null` continuation tokens** to determine when processing is complete
- **Consider storing continuation tokens and session state persistently** for operations that may span user sessions or process restarts
- **Use `DefaultAzureCredential` carefully in production**: it is convenient for development but uses credential fallback chains; prefer `ManagedIdentityCredential` or a specific credential in production to avoid latency and security risks
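The exponential backoff advice above can be sketched against the polling loop from the non-streaming example (this fragment assumes `agent`, `session`, `options`, and an initial `response` from that example):

```csharp
// Sketch only: double the polling interval after each poll, capped at 30 seconds.
TimeSpan delay = TimeSpan.FromSeconds(2);
TimeSpan maxDelay = TimeSpan.FromSeconds(30);

while (response.ContinuationToken is { } token)
{
    await Task.Delay(delay);
    delay = TimeSpan.FromTicks(Math.Min(delay.Ticks * 2, maxDelay.Ticks));

    options.ContinuationToken = token;
    response = await agent.RunAsync(session, options);
}
```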

## Limitations and Considerations

- Background responses are dependent on the underlying AI service supporting long-running operations
- Currently only agents using the OpenAI Responses API (`GetResponsesClient`) support background responses
- Network interruptions or client restarts may require special handling to persist continuation tokens
- Function tools registered with `AIFunctionFactory` are supported during background operations

## Next steps
