
Battle Questions — Competitive Trivia Gaming Platform

Web App · SaaS
React · Node.js · WebSockets · Redis · PostgreSQL · AWS

10K+ Simultaneous Battles
<50ms Game Engine Latency
75K+ Daily Active Users

Overview

Battle Questions is a competitive trivia gaming platform where players compete in real-time 1v1 matches and tournament brackets, answering questions under time pressure with live scoring. At peak, 10K+ battles run simultaneously with 75K+ daily active users. I led backend engineering for the game engine, matchmaking system, and anti-cheat infrastructure.

The Challenge

Trivia games have a precision requirement most web apps don't: timing is a gameplay mechanic. A 500ms difference in when two players receive a question or have their answer registered is a fairness problem, not just a performance problem. In a competitive game, latency asymmetry between players is cheating — even if unintentional.

The anti-cheat requirement added another layer: players would naturally try to exploit any deterministic timing advantage, look up answers, or use automated bots. The system needed to detect and prevent all three without degrading the experience for honest players.

Architecture & Technical Decisions

Game Engine in Redis

Each battle was managed as a Redis Hash: a lightweight, in-memory state machine. The game engine server consumed events (answer submitted, timer expired, player disconnected) and executed state transitions as atomic Lua scripts, which Redis runs without interleaving, eliminating race conditions when two events arrive simultaneously. Game state for 10K concurrent battles fit comfortably in Redis memory (<2GB).

  • Redis Hash per battle: player IDs, current question index, scores, timestamps
  • Lua scripts for atomic state transitions (answer validation, score increment, advance-question)
  • Game engine events published to Redis streams for async processing (analytics, leaderboard updates)
  • Battle state persisted to PostgreSQL only on completion — no per-event DB writes during gameplay
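The transition logic above can be sketched as a pure function over the battle state. This is a minimal in-memory illustration of what the Redis Lua script would do atomically; the field names (`phase`, `answered`, `scores`, `correctIndex`) are illustrative assumptions, not the production schema.

```javascript
// In-memory sketch of the per-battle state transition that the production
// system runs as an atomic Redis Lua script. Because the Lua script executes
// without interleaving, the whole function below is effectively one step.
function applyAnswer(battle, playerId, answerIndex, serverTs) {
  // Reject answers once the battle has advanced past the question phase.
  if (battle.phase !== 'question') return { accepted: false, reason: 'not-accepting' };
  // Server-authoritative deadline check (client timings are never trusted).
  if (serverTs > battle.deadlineTs) return { accepted: false, reason: 'too-late' };
  // Each player may answer a question exactly once.
  if (battle.answered[playerId] !== undefined) return { accepted: false, reason: 'duplicate' };

  battle.answered[playerId] = answerIndex;
  const correct = answerIndex === battle.correctIndex;
  if (correct) battle.scores[playerId] += 1;

  // Advance the question once every player has answered.
  if (Object.keys(battle.answered).length === battle.players.length) {
    battle.questionIndex += 1;
    battle.answered = {};
    battle.phase = 'between';
  }
  return { accepted: true, correct };
}
```

Keeping the transition a single atomic unit is what makes the "no per-event DB writes" model safe: there is never a half-applied answer for the async consumers on the Redis stream to observe.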

Timing Fairness Protocol

To ensure both players in a battle received questions at the same moment, the server used a synchronized delivery protocol: the question was pre-loaded into Redis at T=0, and a scheduled broadcast (driven by Redis keyspace notifications plus setTimeout scheduling) fired simultaneously to both WebSocket connections. Answer timestamps were validated server-side against the broadcast timestamp; client-reported timings were rejected entirely.

  • Server-authoritative timestamps: all timing measured server-side from broadcast moment
  • Pre-loaded question delivery: question in Redis before broadcast, eliminating DB lookup latency from the hot path
  • Network jitter compensation: 50ms grace window added to answer deadlines
  • Sub-50ms p99 latency from question broadcast to client receipt (measured via WebSocket ping tracking)
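The deadline rule described above reduces to a small server-side check. A minimal sketch, assuming a 50ms grace constant and hypothetical parameter names; all timestamps are server clock readings, never client-reported values.

```javascript
// Server-authoritative answer-timing validation, per the fairness protocol:
// an answer is valid only if it arrives within the question's time limit
// plus a fixed network-jitter grace window. Names here are illustrative.
const GRACE_MS = 50; // jitter grace window from the protocol above

// broadcastTs: server time the question was pushed to both sockets
// arrivalTs:   server time the answer frame was received
// timeLimitMs: per-question answer window
function validateAnswerTiming(broadcastTs, arrivalTs, timeLimitMs) {
  const elapsed = arrivalTs - broadcastTs;
  // An answer "arriving" before the broadcast indicates clock games or replay.
  if (elapsed < 0) return { valid: false, reason: 'before-broadcast' };
  if (elapsed > timeLimitMs + GRACE_MS) return { valid: false, reason: 'past-deadline' };
  return { valid: true, elapsedMs: elapsed };
}
```

Because both endpoints of the measurement (broadcast and arrival) are stamped by the same server clock, latency asymmetry between the two players' devices cannot skew the scoring window.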

Anti-Cheat Systems

Three layers of anti-cheat protected competitive integrity. First, answer timing analysis: legitimate human response times follow a distribution; answers arriving in <150ms or with machine-regular intervals triggered flagging. Second, answer pattern analysis: a player answering correctly on questions across wildly different difficulty levels at consistent speed was statistically improbable. Third, device fingerprinting to detect multiple accounts from the same device/network.

  • Statistical timing analysis: responses <150ms or with coefficient of variation <0.05 flagged
  • Elo-adjusted accuracy tracking: suspicious accuracy relative to rating triggers review
  • Progressive penalty system: warning → temp ban → permanent ban
  • Shadow mode for confirmed cheaters: they play normally but can only match against other flagged accounts
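The statistical timing check in the first bullet can be sketched as follows. The 150ms and 0.05 thresholds come from the text; the function shape, the minimum-sample guard, and the use of population variance are illustrative assumptions.

```javascript
// Sketch of the timing-anomaly check: flag a session whose response times
// are implausibly fast (mean < 150ms) or machine-regular (coefficient of
// variation < 0.05). Human response times are noisy; bots are not.
function flagTimingAnomaly(responseTimesMs) {
  const n = responseTimesMs.length;
  // Too few samples to say anything statistically meaningful.
  if (n < 5) return { flagged: false, reason: 'insufficient-samples' };

  const mean = responseTimesMs.reduce((a, b) => a + b, 0) / n;
  const variance = responseTimesMs.reduce((a, t) => a + (t - mean) ** 2, 0) / n;
  const cv = Math.sqrt(variance) / mean; // coefficient of variation (sd / mean)

  if (mean < 150) return { flagged: true, reason: 'too-fast', mean };
  if (cv < 0.05) return { flagged: true, reason: 'machine-regular', cv };
  return { flagged: false, mean, cv };
}
```

Flags feed the progressive penalty pipeline rather than triggering bans directly, which keeps false positives from punishing unusually fast honest players.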

Results

  • 10K+ simultaneous battles sustained during peak hours without performance degradation
  • Game engine latency: <50ms p99 from answer submission to score update
  • 75K+ daily active users with 38-minute average session length
  • Cheat detection: 2.1% of accounts flagged in first month, reduced to 0.4% after shadow mode deterrence
  • Zero timing-related fairness complaints from competitive tournament participants
  • AWS infrastructure cost: $0.0003 per battle (Redis efficiency vs. DB-per-event approach)

What I Learned

Game backends have a different performance contract than typical web APIs. A 200ms response on a CRUD endpoint is perfectly acceptable; a 200ms delay in a trivia game is a lost round and a frustrated user. This project pushed me to think about latency at the millisecond level, not because the infrastructure demanded it, but because the product's fairness contract did. Redis Lua scripts, pre-loading, and server-authoritative timing aren't over-engineering for a trivia game; they're the minimum viable correctness.
