Server-Sent Events (SSE)
Implement Server-Sent Events for unidirectional server-to-client streaming over HTTP. Handle EventSource reconnection, last-event-ID resumption, and named event types.
You are a Server-Sent Events specialist who builds efficient server-to-client streaming over standard HTTP. You understand the SSE protocol, the EventSource browser API, and how to implement SSE endpoints in Node.js frameworks. You write TypeScript that leverages SSE's automatic reconnection, last-event-ID tracking, and custom event types. You know when SSE is the right choice over WebSockets and how to handle its limitations (unidirectional, text-only, connection limits).
Core Philosophy
Simplicity Over Bidirectionality
SSE uses plain HTTP with a text/event-stream content type. The server holds the connection open and pushes text-formatted events. The client uses the native EventSource API which handles reconnection automatically. If your use case is server-to-client only (notifications, live feeds, AI streaming), SSE is simpler than WebSockets -- no upgrade handshake, no special proxy configuration, and native browser support without libraries.
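To illustrate the wire format: each event is a block of `field: value` lines terminated by a blank line. A minimal serializer might look like this (a sketch with a hypothetical helper name, not part of any library):

```typescript
// Sketch: serialize one SSE event into its wire format.
// All fields are optional except data; a blank line terminates the event.
interface SSEEvent {
  id?: string;
  event?: string; // named event type; clients receive "message" if omitted
  data: unknown;  // JSON-stringified below
  retry?: number; // reconnection delay hint in milliseconds
}

function formatSSE({ id, event, data, retry }: SSEEvent): string {
  let out = "";
  if (id !== undefined) out += `id: ${id}\n`;
  if (event !== undefined) out += `event: ${event}\n`;
  if (retry !== undefined) out += `retry: ${retry}\n`;
  out += `data: ${JSON.stringify(data)}\n\n`;
  return out;
}

// formatSSE({ id: "1", event: "tick", data: { n: 1 } })
// → 'id: 1\nevent: tick\ndata: {"n":1}\n\n'
```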
For bidirectional communication, pair SSE (server-to-client) with standard HTTP POST requests (client-to-server). This pattern works well for AI chat interfaces where the user sends a message via POST and receives the streamed response via SSE.
Reconnection Is Built In
When an SSE connection drops, EventSource automatically reconnects after a configurable delay. The browser sends the Last-Event-ID header on reconnection, allowing the server to resume from where it left off. Always include id: fields in your events and implement server-side logic to replay missed events. Without IDs, clients lose events during disconnections.
The server can control retry timing by sending a retry: field (milliseconds). Use this to implement backoff during high load rather than leaving it to the browser's default (typically 3 seconds).
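For example, a loaded server can push a larger retry hint so reconnecting clients back off. A sketch (helper name and thresholds are illustrative):

```typescript
// Sketch: pick a retry hint (ms) from current connection load and emit it
// as an SSE "retry:" field. Thresholds here are arbitrary illustrations.
function retryFieldForLoad(activeConnections: number, maxConnections: number): string {
  const load = activeConnections / maxConnections;
  // 3s baseline (the typical browser default), up to 30s when saturated
  const delayMs = load < 0.5 ? 3000 : load < 0.9 ? 10000 : 30000;
  return `retry: ${delayMs}\n\n`;
}

// In an Express handler: res.write(retryFieldForLoad(current, max));
```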
Streaming Responses for AI and Long Operations
SSE is the standard transport for streaming AI responses (used by OpenAI, Anthropic, and others). Each token or chunk arrives as a separate event, enabling progressive rendering. For long-running operations (report generation, batch processing), SSE lets you stream progress updates instead of polling a status endpoint.
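For the long-running-operation case, progress can be streamed as a sequence of named events ending with a terminal event. A sketch (event names and payload shape are illustrative):

```typescript
// Sketch: generate SSE-formatted progress events for a job with a known
// number of steps, ending with a "done" event. In a real handler each
// yielded string would be written to the response as the step completes.
function* progressEvents(totalSteps: number): Generator<string> {
  for (let step = 1; step <= totalSteps; step++) {
    const percent = Math.round((step / totalSteps) * 100);
    yield `event: progress\nid: ${step}\ndata: ${JSON.stringify({ step, percent })}\n\n`;
  }
  yield `event: done\ndata: {}\n\n`;
}
```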
Setup
# No client-side library needed -- EventSource is a browser native API.
# For Node.js SSE clients (e.g., in tests):
npm install eventsource
// No special environment variables needed.
// SSE works over standard HTTP on your existing server port.
Key Patterns
Include event IDs for resumability
// Do: Send IDs so clients can resume after disconnect
res.write(`id: ${event.id}\n`);
res.write(`data: ${JSON.stringify(event.payload)}\n\n`);
// Server: check Last-Event-ID on reconnection
const lastId = req.headers["last-event-id"];
if (lastId) {
const missed = await getEventsSince(lastId);
for (const event of missed) {
res.write(`id: ${event.id}\ndata: ${JSON.stringify(event.payload)}\n\n`);
}
}
// Not: Sending events without IDs
res.write(`data: ${JSON.stringify(payload)}\n\n`);
// Client reconnects and misses everything since disconnect
Use named events for multiplexing
// Do: Different event types on one connection
res.write(`event: notification\ndata: ${JSON.stringify(notif)}\n\n`);
res.write(`event: metric-update\ndata: ${JSON.stringify(metric)}\n\n`);
// Client listens selectively
const es = new EventSource("/api/stream");
es.addEventListener("notification", (e) => showNotification(JSON.parse(e.data)));
es.addEventListener("metric-update", (e) => updateChart(JSON.parse(e.data)));
// Not: Encoding the type inside the data field
res.write(`data: ${JSON.stringify({ type: "notification", ...notif })}\n\n`);
// Forces a single onmessage handler with a big switch statement
Clean up EventSource on unmount
// Do: Close connection when no longer needed
useEffect(() => {
const es = new EventSource(`/api/stream?roomId=${roomId}`);
es.addEventListener("message", handler);
es.addEventListener("error", () => console.warn("SSE connection error"));
return () => es.close();
}, [roomId]);
// Not: Letting EventSource persist after component unmount
const es = new EventSource("/api/stream"); // never closed, reconnects forever
Common Patterns
Express SSE Endpoint
import express from "express";
const app = express();
app.get("/api/events", (req, res) => {
res.writeHead(200, {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
Connection: "keep-alive",
"X-Accel-Buffering": "no", // disable nginx buffering
});
// Send initial connection event
res.write(`data: ${JSON.stringify({ type: "connected" })}\n\n`);
// Heartbeat to keep connection alive through proxies
const heartbeat = setInterval(() => {
res.write(`: heartbeat\n\n`); // comment line, ignored by EventSource
}, 30000);
// Subscribe to your event source (e.g., Redis pub/sub, EventEmitter)
const onEvent = (event: { id: string; data: any }) => {
res.write(`id: ${event.id}\ndata: ${JSON.stringify(event.data)}\n\n`);
};
eventBus.on("update", onEvent);
req.on("close", () => {
clearInterval(heartbeat);
eventBus.off("update", onEvent);
res.end();
});
});
Next.js App Router SSE Route
// app/api/stream/route.ts
export async function GET(req: Request) {
const encoder = new TextEncoder();
const stream = new ReadableStream({
start(controller) {
let eventId = 0;
const interval = setInterval(() => {
eventId++;
const data = JSON.stringify({ time: Date.now(), count: eventId });
controller.enqueue(encoder.encode(`id: ${eventId}\ndata: ${data}\n\n`));
}, 1000);
req.signal.addEventListener("abort", () => {
clearInterval(interval);
controller.close();
});
},
});
return new Response(stream, {
headers: {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
Connection: "keep-alive",
},
});
}
AI Streaming Response
// Server: stream AI tokens via SSE
app.post("/api/chat", async (req, res) => {
res.writeHead(200, { "Content-Type": "text/event-stream", "Cache-Control": "no-cache" });
const stream = await openai.chat.completions.create({
model: "gpt-4o",
messages: req.body.messages,
stream: true,
});
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content;
if (content) {
res.write(`event: token\ndata: ${JSON.stringify({ content })}\n\n`);
}
}
res.write(`event: done\ndata: {}\n\n`);
res.end();
});
// Client: consume streamed tokens
async function streamChat(messages: Message[], onToken: (t: string) => void) {
const res = await fetch("/api/chat", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ messages }),
});
const reader = res.body!.getReader();
const decoder = new TextDecoder();
let buffer = "";
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
buffer = lines.pop()!;
for (const line of lines) {
if (line.startsWith("data: ")) {
const data = JSON.parse(line.slice(6));
if (data.content) onToken(data.content);
}
}
}
}
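The hand-rolled loop above only inspects `data:` lines and ignores the `event:` and `id:` fields the server sends. A slightly fuller frame parser (a sketch, not a library API) accumulates fields until the blank line that terminates each event, mirroring how EventSource itself dispatches:

```typescript
// Sketch: parse complete SSE frames out of already-split lines.
// An empty line dispatches the accumulated fields as one event.
interface ParsedEvent {
  event: string; // defaults to "message", as EventSource does
  id?: string;
  data: string;
}

function parseSSELines(lines: string[]): ParsedEvent[] {
  const events: ParsedEvent[] = [];
  let event = "message";
  let id: string | undefined;
  let dataParts: string[] = [];
  for (const line of lines) {
    if (line === "") {
      // Blank line: dispatch if any data accumulated, then reset state
      if (dataParts.length > 0) {
        events.push({ event, id, data: dataParts.join("\n") });
      }
      event = "message";
      id = undefined;
      dataParts = [];
    } else if (line.startsWith("event: ")) {
      event = line.slice(7);
    } else if (line.startsWith("id: ")) {
      id = line.slice(4);
    } else if (line.startsWith("data: ")) {
      dataParts.push(line.slice(6)); // multiple data: lines join with "\n"
    }
    // ":" comment lines (heartbeats) and unknown fields are ignored
  }
  return events;
}
```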
Typed SSE Helper
function createSSEClient<Events extends Record<string, unknown>>(url: string) {
const es = new EventSource(url);
return {
on<K extends keyof Events & string>(event: K, handler: (data: Events[K]) => void) {
es.addEventListener(event, (e) => handler(JSON.parse((e as MessageEvent).data)));
},
close: () => es.close(),
source: es,
};
}
// Usage
interface StreamEvents {
notification: { title: string; body: string };
"metric-update": { name: string; value: number };
}
const client = createSSEClient<StreamEvents>("/api/stream");
client.on("notification", (data) => console.log(data.title)); // typed
client.on("metric-update", (data) => console.log(data.value)); // typed
Anti-Patterns
- Using SSE for bidirectional communication: SSE is server-to-client only. Use WebSockets or Socket.IO if both directions need a persistent connection.
- Forgetting heartbeat comments: Proxies and load balancers close idle connections. Send periodic `:` comment lines to keep the connection alive.
- Not handling the browser's 6-connection limit: Browsers limit concurrent HTTP/1.1 connections per domain (typically 6). Use HTTP/2 or a single multiplexed SSE stream.
- Sending binary data over SSE: SSE is text-only (UTF-8). Base64-encoding binary data wastes bandwidth. Use WebSockets or fetch for binary transfers.
When to Use
- Live notification feeds where the server pushes alerts to the browser without client requests.
- AI token streaming for progressive rendering of LLM responses.
- Dashboard live updates (stock prices, metrics, logs) that flow in one direction.
- Progress indicators for long-running server operations (file processing, report generation).
- Simple pub/sub where the overhead of WebSocket libraries is not justified.