
Building Real-Time Features with Socket.IO and Node.js

How I implemented real-time synchronization across distributed services in a logistics management system — patterns, pitfalls, and production lessons.

Building BatailLog — a logistics management system for a French 5PL construction provider — required real-time synchronization across multiple services. Here's the architecture I used.

The Use Case

A 5PL logistics company manages relationships between multiple parties: suppliers, transporters, construction sites, and the company itself. When an order status changes, everyone needs to know instantly. Polling was out of the question — too much load, too much delay.

Socket.IO with a pub/sub pattern was the answer.

Architecture

Client (React) ←→ API Gateway ←→ Socket.IO Server
                                      ↕
                               Redis Pub/Sub
                                      ↕
                           Microservices (Orders, Inventory, Sites)

Each microservice publishes events to Redis. The Socket.IO server subscribes and fans out to connected clients.

Socket.IO Server Setup

import { createServer } from "http"
import { Server } from "socket.io"
import { createClient } from "redis"
import { createAdapter } from "@socket.io/redis-adapter"
 
const pubClient = createClient({ url: process.env.REDIS_URL })
const subClient = pubClient.duplicate()
 
await Promise.all([pubClient.connect(), subClient.connect()])
 
const httpServer = createServer()
const io = new Server(httpServer, {
  cors: { origin: process.env.FRONTEND_URL, credentials: true },
  adapter: createAdapter(pubClient, subClient),
})
 
// Authenticate connections (verifyToken is the app's JWT verification helper)
io.use(async (socket, next) => {
  const token = socket.handshake.auth.token
  try {
    const payload = verifyToken(token)
    socket.data.userId = payload.userId
    socket.data.companyId = payload.companyId
    next()
  } catch {
    next(new Error("Unauthorized"))
  }
})
 
io.on("connection", (socket) => {
  // Join company room — scoped broadcasts
  socket.join(`company:${socket.data.companyId}`)
 
  socket.on("subscribe:order", (orderId: string) => {
    socket.join(`order:${orderId}`)
  })
 
  socket.on("unsubscribe:order", (orderId: string) => {
    socket.leave(`order:${orderId}`)
  })
})

Publishing Events From Microservices

Each microservice publishes to Redis when something changes:

// orders-service/src/events.ts
export async function publishOrderUpdate(order: Order) {
  await redisClient.publish(
    "order:updated",
    JSON.stringify({
      orderId: order.id,
      companyId: order.companyId,
      status: order.status,
      updatedAt: order.updatedAt,
    })
  )
}

The Socket.IO server subscribes and forwards:

subClient.subscribe("order:updated", (message) => {
  const data = JSON.parse(message)
  
  // Broadcast to everyone in this company
  io.to(`company:${data.companyId}`).emit("order:updated", data)
  
  // Also broadcast to anyone watching this specific order
  io.to(`order:${data.orderId}`).emit("order:updated", data)
})
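One detail worth guarding: `JSON.parse` on a raw Redis message throws on malformed payloads and happily passes through missing fields, and an uncaught throw here takes down the subscriber callback. A minimal validation sketch — `parseOrderEvent`, the `OrderUpdatedEvent` interface, and the field checks are my own names, not part of the original code:

```typescript
interface OrderUpdatedEvent {
  orderId: string
  companyId: string
  status: string
  updatedAt: string
}

// Parse and validate a raw Redis message; return null instead of throwing
// so a bad payload can't crash the subscriber callback.
export function parseOrderEvent(message: string): OrderUpdatedEvent | null {
  try {
    const data = JSON.parse(message)
    if (
      typeof data.orderId === "string" &&
      typeof data.companyId === "string" &&
      typeof data.status === "string" &&
      typeof data.updatedAt === "string"
    ) {
      return data
    }
    return null
  } catch {
    return null
  }
}
```

The subscriber then broadcasts only when `parseOrderEvent` returns a non-null event, and logs the rest.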

Client Side (React)

// hooks/useSocket.ts
import { useEffect, useState } from "react"
import { io, Socket } from "socket.io-client"
 
export function useSocket(token: string) {
  // useState (not useRef) so consumers re-render once the socket connects —
  // a ref assignment inside the effect would leave callers stuck with null
  const [socket, setSocket] = useState<Socket | null>(null)
 
  useEffect(() => {
    const s = io(process.env.NEXT_PUBLIC_WS_URL!, {
      auth: { token },
      reconnectionDelay: 1000,
      reconnectionDelayMax: 5000,
    })
    setSocket(s)
 
    return () => {
      s.disconnect()
    }
  }, [token])
 
  return socket
}
 
// hooks/useOrderUpdates.ts
export function useOrderUpdates(orderId: string) {
  const socket = useSocket(useAuthToken())
  const queryClient = useQueryClient()
 
  useEffect(() => {
    if (!socket) return
 
    socket.emit("subscribe:order", orderId)
 
    const onUpdate = (data: { status: string; updatedAt: string }) => {
      // Patch the cached query with the server-pushed update
      queryClient.setQueryData(["order", orderId], (old: Order) => ({
        ...old,
        status: data.status,
        updatedAt: data.updatedAt,
      }))
    }
    socket.on("order:updated", onUpdate)
 
    return () => {
      socket.emit("unsubscribe:order", orderId)
      // Remove only this handler — a bare off("order:updated")
      // would strip every other component's listener too
      socket.off("order:updated", onUpdate)
    }
  }, [socket, orderId, queryClient])
}

Handling Reconnections

The biggest production issue: clients lose connection (mobile users, flaky networks). When they reconnect, they may have missed events. I solve this with a "catch-up" request on reconnect:

socket.on("connect", async () => {
  // Fetch missed updates since the last event we applied.
  // For this to work, applyUpdate must write each event's timestamp
  // back to localStorage under "lastEventTimestamp".
  const lastSeen = localStorage.getItem("lastEventTimestamp")
  if (lastSeen) {
    const missed = await api.get(`/events?since=${lastSeen}`)
    missed.forEach(applyUpdate)
  }
})
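The server half of the catch-up endpoint isn't shown above; at its core it's just a timestamp filter over an event log. A minimal sketch, assuming events carry ISO-8601 `updatedAt` strings — `eventsSince` and `StoredEvent` are hypothetical names, and a real version would query the database rather than filter in memory:

```typescript
interface StoredEvent {
  orderId: string
  status: string
  updatedAt: string // ISO-8601 timestamp
}

// Return every logged event strictly newer than the client's
// last-seen timestamp, oldest first, so they can be replayed in order.
export function eventsSince(log: StoredEvent[], since: string): StoredEvent[] {
  const cutoff = Date.parse(since)
  return log
    .filter((e) => Date.parse(e.updatedAt) > cutoff)
    .sort((a, b) => Date.parse(a.updatedAt) - Date.parse(b.updatedAt))
}
```

The route handler then becomes a thin wrapper: read `since` from the query string, call `eventsSince`, return JSON.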

Production Lessons

  1. Always use rooms — broadcasting to all connected clients is almost never what you want. Scope everything to a company, user, or entity room.

  2. Redis adapter is mandatory at scale — the in-memory adapter breaks as soon as you have more than one server instance.

  3. Monitor connection counts — Socket.IO connections are stateful. Set maxHttpBufferSize and pingTimeout appropriately.

  4. Rate-limit event emissions — a client that emits 100 events/second can DoS your server. Throttle on the server side.
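Point 4 takes only a few lines with Socket.IO's per-socket incoming-packet middleware. A sketch using a fixed-window counter — `createThrottle` and the limits are illustrative assumptions; a token bucket or a Redis-backed limiter works just as well:

```typescript
// Hypothetical per-socket throttle: allow at most `limit` events
// per fixed window of `windowMs` milliseconds.
export function createThrottle(limit: number, windowMs: number) {
  let windowStart = 0
  let count = 0
  return function allow(now: number): boolean {
    if (now - windowStart >= windowMs) {
      // New window: reset the counter
      windowStart = now
      count = 0
    }
    count += 1
    return count <= limit
  }
}

// Wiring it up (sketch):
// io.on("connection", (socket) => {
//   const allow = createThrottle(20, 1000) // 20 events/second per socket
//   socket.use((_packet, next) => {
//     allow(Date.now()) ? next() : next(new Error("rate limited"))
//   })
// })
```

A fixed window lets a burst straddle the boundary (up to 2× the limit across two adjacent windows), which is usually acceptable for abuse protection; switch to a sliding window if that matters.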

Real-time features turn a good app into a great one. The complexity is worth it.