A real-time AI agent chat using the Embabel Agent Framework and Atmosphere. Embabel agents handle planning, tool calling, and orchestration; Atmosphere streams the agent events (progress, tool calls, streaming text) to the browser over WebSocket in real time.
An Embabel `@Agent` class defines the agent's behavior:

```java
@Agent(name = "chat-assistant",
       description = "A helpful chat assistant that answers user questions")
public class ChatAssistantAgent {

    @Action(description = "Answer the user's question")
    public String answer(String userMessage) {
        return "Answer the following question clearly and concisely: " + userMessage;
    }
}
```

A `@ManagedService` endpoint at `/atmosphere/embabel-chat`:
- Client sends a prompt via WebSocket
- The `@Message` handler delegates to `AgentRunner`, which creates a `StreamingSession`
- `AgentRunner` looks up the `chat-assistant` agent on the Embabel `AgentPlatform`
- Calls `agentPlatform.runAgentFrom()` with an `AtmosphereOutputChannel`, so agent events stream to the browser
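The core idea in those steps is an output channel that forwards each agent event to the client as it happens. The following is a minimal, framework-free sketch of that pattern; `AgentEvent`, `OutputChannel`, and `runAgent` are simplified stand-ins for illustration, not the real Embabel or Atmosphere types:

```java
import java.util.ArrayList;
import java.util.List;

public class ChannelBridgeSketch {

    // An agent emits typed events as it runs (progress, tool calls, text chunks).
    public record AgentEvent(String type, String payload) {}

    // The channel the agent writes to. The real AtmosphereOutputChannel would
    // push each event over the WebSocket instead of into a list.
    public interface OutputChannel {
        void emit(AgentEvent event);
    }

    // A fake "agent run" that reports progress, then streams two text chunks.
    public static void runAgent(String userMessage, OutputChannel channel) {
        channel.emit(new AgentEvent("progress", "planning"));
        channel.emit(new AgentEvent("text", "Answering: "));
        channel.emit(new AgentEvent("text", userMessage));
        channel.emit(new AgentEvent("progress", "done"));
    }

    public static void main(String[] args) {
        List<String> wire = new ArrayList<>(); // stands in for the WebSocket
        OutputChannel channel = e -> wire.add(e.type() + ":" + e.payload());

        runAgent("What is Atmosphere?", channel);

        wire.forEach(System.out::println);
        // progress:planning
        // text:Answering: 
        // text:What is Atmosphere?
        // progress:done
    }
}
```

Because the agent only sees the `OutputChannel` interface, the same agent code can stream to a WebSocket, a test harness, or a log without changes.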
```java
var session = StreamingSessions.start(resource);

var agent = agentPlatform.agents().stream()
        .filter(a -> "chat-assistant".equals(a.getName()))
        .findFirst().orElseThrow();

var agentRequest = new AgentRequest("chat-assistant", channel -> {
    var options = ProcessOptions.DEFAULT.withOutputChannel(channel);
    agentPlatform.runAgentFrom(agent, options, Map.of("userMessage", userMessage));
    return Unit.INSTANCE;
});

Thread.startVirtualThread(() -> ADAPTER.stream(agentRequest, session));
```

Same streaming UI as the other AI samples: connects over WebSocket, renders streaming text and progress events.
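The last line of that snippet runs the blocking adapter on a virtual thread while the session drains events to the client. The same producer/consumer handoff can be sketched without the frameworks, using a `BlockingQueue` as a stand-in for the streaming session (requires Java 21+ for virtual threads; the names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class VirtualThreadStreamSketch {

    static final String END = "<end>"; // sentinel marking end of stream

    public static void main(String[] args) throws InterruptedException {
        // Stand-in for the StreamingSession: a queue the handler drains.
        BlockingQueue<String> session = new LinkedBlockingQueue<>();

        // The adapter blocks for the whole agent run, so it goes on a cheap
        // virtual thread instead of tying up a request thread.
        Thread producer = Thread.startVirtualThread(() -> {
            for (String chunk : List.of("Hello", ", ", "world")) {
                session.add(chunk);
            }
            session.add(END);
        });

        // Consumer side: drain chunks as they arrive (the real code would
        // write each one to the WebSocket).
        List<String> received = new ArrayList<>();
        for (String chunk = session.take(); !chunk.equals(END); chunk = session.take()) {
            received.add(chunk);
        }
        producer.join();

        System.out.println(String.join("", received)); // Hello, world
    }
}
```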
The Embabel platform auto-configures via Spring AI. Set your LLM API key:
```shell
export OPENAI_API_KEY=sk-...

# Or use any OpenAI-compatible provider:
export LLM_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
export LLM_API_KEY=AIza...
export LLM_MODEL=gemini-2.5-flash
```

Run the sample:

```shell
./mvnw spring-boot:run -pl samples/spring-boot-embabel-chat
```

Open http://localhost:8082 in your browser.
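The sample's `LlmConfig` is not shown here; purely as an illustration, variable resolution of this shape might back it. The precedence (explicit `LLM_*` variables win, `OPENAI_API_KEY` is the fallback) and the defaults below are assumptions for the sketch, not the sample's documented behavior:

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch; the sample's real LlmConfig may differ.
public class LlmEnvSketch {

    public record LlmSettings(String apiKey, String baseUrl, String model) {}

    // Resolve settings from an environment map (System.getenv() in practice).
    // Assumed precedence: LLM_API_KEY wins, OPENAI_API_KEY is the fallback;
    // base URL and model defaults are placeholders, not the sample's values.
    public static LlmSettings resolve(Map<String, String> env) {
        String apiKey = Optional.ofNullable(env.get("LLM_API_KEY"))
                .orElse(env.get("OPENAI_API_KEY"));
        String baseUrl = env.getOrDefault("LLM_BASE_URL", "https://api.openai.com/v1");
        String model = env.getOrDefault("LLM_MODEL", "gpt-4o-mini");
        return new LlmSettings(apiKey, baseUrl, model);
    }

    public static void main(String[] args) {
        var settings = resolve(Map.of(
                "LLM_BASE_URL", "https://generativelanguage.googleapis.com/v1beta/openai",
                "LLM_API_KEY", "AIza-example",
                "LLM_MODEL", "gemini-2.5-flash"));
        System.out.println(settings.model()); // gemini-2.5-flash
    }
}
```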
```
spring-boot-embabel-chat/
├── pom.xml
└── src/main/
    ├── java/.../embabelchat/
    │   ├── EmbabelChatApplication.java   # Spring Boot entry point
    │   ├── EmbabelChat.java              # @ManagedService endpoint
    │   ├── ChatAssistantAgent.java       # @Agent definition
    │   ├── AgentRunner.java              # AgentPlatform bridge
    │   └── LlmConfig.java                # Configuration
    └── resources/
        ├── application.yml
        └── static/
            ├── index.html
            └── assets/                   # Bundled atmosphere.js client
```
- AI / LLM Streaming Guide
- AI chat sample — built-in LLM client
- LangChain4j sample — LangChain4j adapter