React Native · Socket.IO · Node.js · MongoDB · ReD Sposi

The Group Chat That Runs During the Wedding

11 April 2026


There's a fundamental difference between ReD Sposi's betting game and the chat.

The betting game is asynchronous. The questions sit there, guests answer when they want, the leaderboard updates in real time but there's no urgency. If the server goes down for ten minutes during the week before the wedding, nobody notices.

The chat is different. It's live. Guests use it on the day itself, during the reception, while I'm in a suit doing the rounds of the tables. If something breaks that day, I can't open a laptop. Not physically, not practically, not emotionally.

This constraint — it must work autonomously in a precise time window, with no possibility of intervention — was the primary design criterion.

Base Architecture

The chat is a Socket.IO room. All authenticated guests join the same room on login; messages are emitted by a client, bounce off the server, and arrive at everyone else connected.

// server: join room and broadcast
// (socket.data.guestId and guestName are populated during authentication)
io.on("connection", (socket) => {
  const guestId = socket.data.guestId;

  socket.join("wedding-chat");

  socket.on("message:send", async (data) => {
    const message = await Message.create({
      guestId,
      text: data.text,
      createdAt: new Date(),
    });

    io.to("wedding-chat").emit("message:new", {
      _id: message._id,
      guestId,
      guestName: socket.data.guestName,
      text: message.text,
      createdAt: message.createdAt,
    });
  });
});

Nothing exotic. The complexity isn't in the architecture — it's in the edge cases.

Persistence: Late Arrivals See History

If a guest opens the chat at 10pm during the dancing, they need to see the messages from the previous hours — not enter a chat that looks empty.

Messages are saved to MongoDB on every send. When a client connects to the chat, it makes a REST request to load the last N messages before subscribing to Socket.IO events.

// client: load history then listen
const loadChat = async () => {
  const history = await api.get("/messages?limit=50");
  setMessages(history.data);

  socket.on("message:new", (msg) => {
    setMessages((prev) => [...prev, msg]);
  });
};

The order matters: history via REST first, then Socket.IO listening. If you do it the other way, you risk a race condition where a message arrives via socket before the history has loaded, and when the history arrives it overwrites the live message.
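The race can also be defused by making the merge idempotent: deduplicate by `_id` when appending, so a message that arrives both in the history snapshot and via the socket renders once. A minimal sketch, with a hypothetical `mergeMessages` helper that isn't in the app's code:

```javascript
// Hypothetical helper: append incoming messages, skipping any _id
// already present, then keep the list sorted by createdAt.
// With this, history-vs-live ordering stops being able to duplicate
// or clobber messages.
function mergeMessages(prev, incoming) {
  const seen = new Set(prev.map((m) => m._id));
  const fresh = incoming.filter((m) => !seen.has(m._id));
  return [...prev, ...fresh].sort(
    (a, b) => new Date(a.createdAt) - new Date(b.createdAt)
  );
}

// usage inside the listener:
// socket.on("message:new", (msg) =>
//   setMessages((prev) => mergeMessages(prev, [msg])));
```

The same helper works for the history load itself (`mergeMessages([], history.data)`), which keeps one code path for every way messages can enter the client.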

Reconnection: Guests Move Around

A wedding reception isn't an office. Guests move between the hall, garden, and dance floor. Connections drop — poor 4G signal in certain corners of the venue, the venue's WiFi struggling under load, phones going to sleep.

Socket.IO handles reconnection automatically, but you have to manage what happens when a client reconnects after an absence. Messages that arrived in the meantime aren't re-emitted — the client needs to fetch them.

I solved this by saving the timestamp of the last received message on the client, and on reconnect requesting messages after that timestamp:

socket.on("connect", async () => {
  if (lastMessageAt) {
    const missed = await api.get(`/messages?after=${lastMessageAt}`);
    if (missed.data.length > 0) {
      setMessages((prev) => [...prev, ...missed.data]);
    }
  }
});

It's not pure real-time — there's a window of a few seconds where a message might not appear — but for a low-frequency group chat it's more than sufficient.
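The snippet above assumes `lastMessageAt` is kept current. One way to track it, sketched here with an assumed helper name, relies on the server emitting UTC ISO-8601 timestamps, which compare chronologically as plain strings:

```javascript
// Hypothetical tracker for the newest message timestamp this client
// has seen. Feeds the `?after=` query on reconnect.
let lastMessageAt = null;

// Call with every batch of rendered messages (history load, live
// message, or reconnect catch-up). Returns the current high-water mark.
function trackLatest(messages) {
  for (const m of messages) {
    if (!lastMessageAt || m.createdAt > lastMessageAt) {
      lastMessageAt = m.createdAt;
    }
  }
  return lastMessageAt;
}
```

String comparison is only safe because ISO-8601 UTC timestamps of the same precision sort lexicographically in time order; if the server emitted mixed formats, comparing `Date` objects would be the safer choice.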

Push Notifications and Socket.IO: Coexistence

Guests aren't all on the chat screen all the time. They're in the app looking at the gallery, placing bets, or have their phone in their pocket.

The problem: Socket.IO only reaches the client if it's connected and in the foreground. If the app is in the background, the OS suspends the connection.

The solution is to duplicate the notification: when a message arrives, the server emits the Socket.IO event and sends a push notification to all guests who aren't actively connected to the chat.

socket.on("message:send", async (data) => {
  const message = await Message.create({ ... });

  // broadcast to all connected
  io.to("wedding-chat").emit("message:new", message);

  // push to those not in chat
  const connectedIds = getConnectedGuestIds("wedding-chat");
  const offlineGuests = await Guest.find({
    _id: { $nin: connectedIds },
    pushToken: { $exists: true },
  });

  await sendPushNotifications(offlineGuests, {
    title: socket.data.guestName, // guestName lives on the socket, not on the Message document
    body: message.text,
    data: { screen: "chat" },
  });
});

getConnectedGuestIds tracks who has an active socket in the room. It's not precise to the millisecond — there can be a small overlap where someone receives both the push and the message via socket — but that's acceptable.
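The post doesn't show `getConnectedGuestIds`; one plausible implementation, with all names assumed, is an in-memory registry updated on join and disconnect. Counting sockets per guest matters because one guest can briefly hold two connections during a reconnect overlap:

```javascript
// Hypothetical registry behind getConnectedGuestIds.
// Maps guestId -> number of open sockets for that guest.
const roomGuests = new Map();

// call after socket.join("wedding-chat")
function guestJoined(guestId) {
  roomGuests.set(guestId, (roomGuests.get(guestId) || 0) + 1);
}

// call in the socket's "disconnect" handler
function guestLeft(guestId) {
  const count = (roomGuests.get(guestId) || 0) - 1;
  if (count <= 0) roomGuests.delete(guestId);
  else roomGuests.set(guestId, count);
}

// a guest counts as connected while at least one socket is open
function getConnectedGuestIds() {
  return [...roomGuests.keys()];
}
```

Socket.IO v4 also offers `io.in("wedding-chat").fetchSockets()` as a query-on-demand alternative, reading `socket.data.guestId` off each result; the registry approach just avoids that round trip on every message.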

The Social Context That Simplifies Everything

Building a chat for people who all know each other removes entire layers of complexity.

No moderation. No report/block. No anonymity. No spam. The guest's name is real — every message is signed with the name I assigned them when creating the invite code. If someone writes something inappropriate — a theoretical scenario, but one I've accounted for anyway — I already know who they are.

This means the chat is essentially a pub/sub system with persistence and an already-complete authentication layer. No moderation algorithms, no aggressive rate limiting, no blocking system.

Social complexity is handled by the real-world context, not by the software.

What I Was Most Worried About

The highest-risk moment wasn't some obscure technical failure — it was peak hour.

Imagine: the ceremony ends, everyone takes out their phone, thirty people open the app simultaneously, each loads the message history, each establishes a Socket.IO connection, and within five minutes they start sending messages in rapid succession.

I did a minimal load test — not automated, just me opening the app from six different devices simultaneously — and the server held up without issues. For a few hundred guests on a single Node.js VPS with Socket.IO, the volume is never a real problem.

The real risk was the database: many parallel writes to MongoDB in a short time window. I added an index on createdAt for history queries, and verified that writes wouldn't become a bottleneck with explain(). They weren't.
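For reference, a sketch of that index and check in `mongosh` (collection name assumed). A single ascending index on `createdAt` serves both the history load and the reconnect range query, since MongoDB can walk an index in either direction:

```javascript
// mongosh sketch — collection name "messages" is an assumption.
// One index backs both the history load (sort by createdAt, limit N)
// and the reconnect catch-up (createdAt > lastMessageAt).
db.messages.createIndex({ createdAt: 1 });

// verify the range query uses an index scan (IXSCAN),
// not a full collection scan (COLLSCAN)
db.messages
  .find({ createdAt: { $gt: ISODate("2026-08-02T11:00:00Z") } })
  .sort({ createdAt: 1 })
  .explain("executionStats");
```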

Being the Groom and the Developer

There's something strange about building a system that will run at a moment when you can't monitor it.

For every other product I've built, I'm always reachable. If Barba Studio has a problem, I open the laptop. If the accounting system stops working, I connect to the server. Being present is always possible even if not always immediate.

August 2nd isn't like that. That time window — from eleven in the morning to midnight — is inaccessible as a developer. It's accessible only as a groom.

Building with this awareness pushes you toward a kind of simplicity different from the economical kind. You don't simplify because you want to finish faster. You simplify because every extra piece is a piece that could break when you can't intervene. Simplicity becomes a form of respect for the day itself.

The chat works. I've tested it. August comes anyway.