Atmosphere

The transport-agnostic real-time framework for the JVM.
WebSocket, SSE, Long-Polling, gRPC, MCP — one API, any transport.



Atmosphere was built on one idea: your application code shouldn't care how the client is connected. Write to a Broadcaster, and the framework delivers to every subscriber — whether they're on a WebSocket, an SSE stream, a long-polling loop, a gRPC channel, or an MCP session. The transport is pluggable and transparent.

The two core abstractions are Broadcaster (a named pub/sub channel) and AtmosphereResource (a single connection). Additional modules — rooms, AI/LLM streaming, clustering, observability — build on top of these.
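The two abstractions can be sketched directly against the core API. A minimal sketch, assuming a `BroadcasterFactory` obtained from the runtime (e.g. via `AtmosphereConfig`); the `NewsService` class and the `/news` channel name are illustrative, not part of the framework:

```java
import org.atmosphere.cpr.AtmosphereResource;
import org.atmosphere.cpr.Broadcaster;
import org.atmosphere.cpr.BroadcasterFactory;

// Illustrative service: publishes to a named channel without knowing
// which transport each subscriber happens to be using.
public class NewsService {

    private final BroadcasterFactory factory;

    public NewsService(BroadcasterFactory factory) {
        this.factory = factory;
    }

    // Subscribe a connection (WebSocket, SSE, long-polling, ...) to the channel.
    public void subscribe(AtmosphereResource r) {
        Broadcaster channel = factory.lookup("/news", true); // create if absent
        channel.addAtmosphereResource(r);
    }

    // Deliver a message to every subscriber; the framework chooses the
    // right wire protocol per connection.
    public void publish(String message) {
        factory.lookup("/news", true).broadcast(message);
    }
}
```

The same `Broadcaster` instance fans out to every registered `AtmosphereResource`, which is what makes the transport pluggable and transparent to application code.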

Generate a Project

jbang generator/AtmosphereInit.java --name my-app --handler ai-chat --ai builtin --tools
cd my-app && ./mvnw spring-boot:run

Generates a ready-to-run Spring Boot project with your choice of handler (chat, ai-chat, mcp-server), AI framework, and optional @AiTool methods. See generator/README.md for all options.

Quick Start

<dependency>
    <groupId>org.atmosphere</groupId>
    <artifactId>atmosphere-runtime</artifactId>
    <version>4.0.10</version>
</dependency>
@ManagedService(path = "/chat")
public class Chat {

    @Ready
    public void onReady(AtmosphereResource r) {
        // r could be WebSocket, SSE, Long-Polling, gRPC, or MCP — doesn't matter
        log.info("{} connected via {}", r.uuid(), r.transport());
    }

    @Message(encoders = JacksonEncoder.class, decoders = JacksonDecoder.class)
    public ChatMessage onMessage(ChatMessage message) {
        // Return value is broadcast to all subscribers
        return message;
    }
}

What's New in 4.0 (full list)

Atmosphere applies the same philosophy to AI: your code shouldn't care which AI framework is on the classpath. Tools (@AiTool), conversation memory, guardrails, multi-backend routing, metrics, and observability are declared once with Atmosphere annotations and automatically bridged to Spring AI, LangChain4j, Google ADK, or Embabel at runtime. Per-endpoint model selection, auto-detected persistence (Redis/SQLite), and broadcast filter auto-registration round out the platform.

@AiEndpoint(path = "/ai/chat",
            systemPrompt = "You are a helpful assistant",
            conversationMemory = true,
            tools = AssistantTools.class)
public class AiChat {

    @Prompt
    public void onPrompt(String message, StreamingSession session) {
        session.stream(message);  // auto-detects the AI framework from the classpath
    }
}

Tools are declared with @AiTool — framework-agnostic, portable across all backends:

public class AssistantTools {

    @AiTool(name = "get_weather", description = "Get weather for a city")
    public String getWeather(@Param("city") String city) {
        return weatherService.lookup(city);
    }
}

Swap the AI backend by changing one Maven dependency — no tool code changes:

| Backend | Dependency | Bridged via |
| --- | --- | --- |
| Built-in (Gemini/OpenAI/Ollama) | atmosphere-ai | direct |
| Spring AI | atmosphere-spring-ai | SpringAiToolBridge |
| LangChain4j | atmosphere-langchain4j | LangChain4jToolBridge |
| Google ADK | atmosphere-adk | AdkToolBridge |
| Embabel | atmosphere-embabel | EmbabelAiSupport |
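In practice the swap is a single `pom.xml` edit. A sketch for the LangChain4j variant — the version shown assumes the adapter tracks atmosphere-runtime 4.0.10; verify against Maven Central:

```xml
<!-- Replace the built-in backend with LangChain4j; @AiTool code is unchanged. -->
<dependency>
    <groupId>org.atmosphere</groupId>
    <artifactId>atmosphere-langchain4j</artifactId>
    <version>4.0.10</version> <!-- assumed; check Maven Central for the adapter's release -->
</dependency>
```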

See spring-boot-ai-tools for the full tool-calling sample, spring-boot-ai-classroom for multi-persona conversation memory, and expo-client for React Native/Expo mobile chat. Four official framework samples have been forked and augmented with Atmosphere streaming: LangChain4j tools, Spring AI routing, Embabel horoscope, and ADK tools.

CLI-powered LLM backend

Already have a Claude Code, Copilot, Cursor, or Gemini CLI license? Embacle turns any CLI tool into an OpenAI-compatible LLM provider — no separate API key required.

LLM_BASE_URL=http://localhost:3000/v1 LLM_MODEL=copilot:claude-sonnet-4.6 LLM_API_KEY=not-needed \
  ./mvnw spring-boot:run -pl samples/spring-boot-ai-classroom

MCP — expose tools to AI agents

@McpServer(name = "my-tools", path = "/atmosphere/mcp")
public class MyTools {

    @McpTool(name = "ask_ai", description = "Ask AI and stream the answer to browsers")
    public String askAi(
            @McpParam(name = "question") String question,
            @McpParam(name = "topic") String topic,
            StreamingSession session) {
        session.stream(question);  // tokens broadcast to all clients on the topic
        return "streaming to " + topic;
    }
}

Modules

Core

| Module | Artifact | What it does |
| --- | --- | --- |
| Runtime | atmosphere-runtime | WebSocket, SSE, Long-Polling (Servlet 6.0+) |
| gRPC | atmosphere-grpc | Bidirectional streaming transport (grpc-java 1.71) |
| Rooms | built into runtime | Room management with join/leave and presence |

AI

| Module | Artifact | What it does |
| --- | --- | --- |
| AI core | atmosphere-ai | AiSupport SPI, @AiEndpoint, filters, routing, conversation memory |
| Spring AI | atmosphere-spring-ai | Adapter for Spring AI ChatClient |
| LangChain4j | atmosphere-langchain4j | Adapter for LangChain4j StreamingChatLanguageModel |
| Google ADK | atmosphere-adk | Adapter for Google ADK Runner |
| Embabel | atmosphere-embabel | Adapter for Embabel AgentPlatform |
| MCP server | atmosphere-mcp | Model Context Protocol server over WebSocket |

Cloud

| Module | Artifact | What it does |
| --- | --- | --- |
| Redis | atmosphere-redis | Cross-node broadcasting via Redis pub/sub |
| Kafka | atmosphere-kafka | Cross-node broadcasting via Kafka |
| Durable sessions | atmosphere-durable-sessions | Session persistence across restarts (SQLite / Redis) |

Extensions

| Module | Artifact | What it does |
| --- | --- | --- |
| Spring Boot | atmosphere-spring-boot-starter | Auto-configuration for Spring Boot 4.0+ |
| Quarkus | atmosphere-quarkus-extension | Build-time processing for Quarkus 3.21+ |
| Kotlin DSL | atmosphere-kotlin | Builder API and coroutine extensions |
| atmosphere.js | atmosphere.js (npm) | Browser & React Native client with React, Vue, Svelte, and RN hooks |
| wAsync | atmosphere-wasync | Async Java client — WebSocket, SSE, long-polling, gRPC |
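From the Java side, the wAsync client can exercise the `/chat` endpoint from the Quick Start. A minimal sketch, assuming a server running on `localhost:8080`; exact builder and listener signatures may vary between wAsync versions:

```java
import org.atmosphere.wasync.Client;
import org.atmosphere.wasync.ClientFactory;
import org.atmosphere.wasync.Event;
import org.atmosphere.wasync.Function;
import org.atmosphere.wasync.Request;
import org.atmosphere.wasync.RequestBuilder;
import org.atmosphere.wasync.Socket;

public class ChatClient {

    public static void main(String[] args) throws Exception {
        Client client = ClientFactory.getDefault().newClient();

        // Prefer WebSocket, fall back to SSE, then long-polling —
        // the server-side @ManagedService code is identical either way.
        RequestBuilder<?> request = client.newRequestBuilder()
                .method(Request.METHOD.GET)
                .uri("http://localhost:8080/chat")
                .transport(Request.TRANSPORT.WEBSOCKET)
                .transport(Request.TRANSPORT.SSE)
                .transport(Request.TRANSPORT.LONG_POLLING);

        Socket socket = client.create();
        socket.on(Event.MESSAGE, new Function<String>() {
            @Override
            public void on(String message) {
                System.out.println("received: " + message);
            }
        })
        .open(request.build())
        .fire("hello from wAsync");
    }
}
```

Listing several `transport(...)` calls declares a fallback chain, mirroring the transport negotiation atmosphere.js performs in the browser.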

Requirements

| Java | Spring Boot | Quarkus |
| --- | --- | --- |
| 21+ | 4.0.2+ | 3.21+ |

JDK 21 virtual threads are used by default.

Documentation

Commercial Support

Available via Async-IO.org

License

Apache 2.0 — Copyright 2008-2026 Async-IO.org