AiSupport implementation backed by LangChain4j StreamingChatLanguageModel. When this JAR is on the classpath, @AiEndpoint automatically uses LangChain4j for streaming.
```xml
<dependency>
    <groupId>org.atmosphere</groupId>
    <artifactId>atmosphere-langchain4j</artifactId>
    <version>4.0.13</version>
</dependency>
```

Drop the dependency alongside `atmosphere-ai` and the framework auto-detects it via `ServiceLoader`:
```java
@AiEndpoint(path = "/ai/chat", systemPrompt = "You are a helpful assistant")
public class MyChat {
    @Prompt
    public void onPrompt(String message, StreamingSession session) {
        session.stream(message); // uses LangChain4j automatically
    }
}
```

The LangChain4jAiSupport implementation has priority 100, which takes precedence over the built-in client (priority 0).
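The priority-based selection can be sketched roughly as follows (the interface and method names here are simplified assumptions, not the actual Atmosphere SPI):

```java
import java.util.Comparator;
import java.util.List;

// Simplified stand-in for the AiSupport SPI (names are assumptions).
interface AiSupport {
    int priority();
    String name();
}

public class PrioritySketchDemo {

    // Highest priority wins, mirroring how the LangChain4j
    // implementation (100) overrides the built-in client (0).
    static AiSupport select(List<AiSupport> discovered) {
        return discovered.stream()
                .max(Comparator.comparingInt(AiSupport::priority))
                .orElseThrow();
    }

    static String demo() {
        AiSupport builtIn = new AiSupport() {
            public int priority() { return 0; }
            public String name() { return "built-in"; }
        };
        AiSupport langchain4j = new AiSupport() {
            public int priority() { return 100; }
            public String name() { return "langchain4j"; }
        };
        return select(List.of(builtIn, langchain4j)).name();
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints "langchain4j"
    }
}
```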
For full control, use LangChain4jStreamingAdapter directly:
```java
var session = StreamingSessions.start(resource);
model.chat(ChatMessage.userMessage(prompt),
        new AtmosphereStreamingResponseHandler(session));
```

AtmosphereStreamingResponseHandler bridges LangChain4j's StreamingChatResponseHandler to Atmosphere's StreamingSession:
| LangChain4j Callback | StreamingSession Action |
|---|---|
| `onPartialResponse(text)` | `session.send(text)` |
| `onCompleteResponse(response)` | `session.complete()` |
| `onError(throwable)` | `session.error(message)` |
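The callback mapping above can be sketched as a minimal handler. The stub `StreamingSession` interface below is a simplified stand-in for the real Atmosphere type, and the handler is an illustrative sketch, not the actual AtmosphereStreamingResponseHandler implementation:

```java
// Simplified stand-in for Atmosphere's StreamingSession (an assumption).
interface StreamingSession {
    void send(String text);
    void complete();
    void error(String message);
}

public class BridgeSketchDemo {

    // Each LangChain4j-style callback forwards to the session.
    static class HandlerSketch {
        private final StreamingSession session;

        HandlerSketch(StreamingSession session) {
            this.session = session;
        }

        void onPartialResponse(String text) {
            session.send(text);            // stream each chunk to the client
        }

        void onCompleteResponse(Object response) {
            session.complete();            // close the stream normally
        }

        void onError(Throwable t) {
            session.error(t.getMessage()); // propagate the failure
        }
    }

    static String runDemo() {
        StringBuilder log = new StringBuilder();
        StreamingSession session = new StreamingSession() {
            public void send(String text) { log.append("send:").append(text).append(";"); }
            public void complete() { log.append("complete;"); }
            public void error(String message) { log.append("error:").append(message).append(";"); }
        };
        HandlerSketch handler = new HandlerSketch(session);
        handler.onPartialResponse("Hel");
        handler.onPartialResponse("lo");
        handler.onCompleteResponse(null);
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.println(runDemo()); // prints "send:Hel;send:lo;complete;"
    }
}
```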
AtmosphereLangChain4jAutoConfiguration bridges a Spring-managed StreamingChatLanguageModel bean to the LangChain4jAiSupport SPI automatically.
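A matching bean definition might look like the following sketch, assuming the `langchain4j-open-ai` module is on the classpath; the builder parameters and exact type names vary across LangChain4j versions and are assumptions here:

```java
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AiModelConfig {

    // Any StreamingChatLanguageModel implementation works; the
    // auto-configuration picks up whichever bean is registered.
    @Bean
    public StreamingChatLanguageModel streamingChatLanguageModel() {
        return OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // illustrative
                .modelName("gpt-4o-mini")                // illustrative
                .build();
    }
}
```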
- Spring Boot LangChain4j Chat -- complete example with LangChain4j
- AI Integration -- AiSupport SPI, @AiEndpoint, filters, routing
- Spring AI Adapter
- Google ADK Adapter
- Embabel Adapter