
How to Design an Optimized Feedback Form Backend for Gaming Applications

As a fresher backend developer, I was recently given the task of implementing a feedback form system for a gaming application. Sounds simple, but as I started designing it, I realized it’s much more than just storing some questions and answers. The key challenge was showing the right feedback form to the right user at the right time, based on in-game triggers like “XP Increased” or “Game Won”.

This blog is a deep dive into how I approached the problem, the data structures I used, the performance issues I faced, and the optimization strategies I explored, all explained in a hands-on, “I’m building this” tone.

The Problem Statement

Here’s the use case I was solving:

- Admins create feedback forms (title, description, questions) that are only active between a start time and an end time.
- Each form is linked to one or more in-game triggers, such as “XP Increased” or “Game Won”.
- When a player’s triggers match a form’s triggers while that form is active, the app should show that form to that player.

My Initial Data Design (MongoDB)

I went with MongoDB as my NoSQL choice since forms could be dynamic and didn’t need complex joins or transactions.

form_collection (Feedback Forms)

{
  "_id": ObjectId,
  "title": "Post-Game Feedback",
  "description": "Tell us about your experience!",
  "created_at": ISODate(),
  "status": true,
  "start_time": ISODate("2025-06-25T00:00:00Z"),
  "end_time": ISODate("2025-06-30T23:59:00Z"),
  "questions": [
    {
      "question_id": ObjectId,
      "question_text": "Did you enjoy the game?",
      "question_type": "multiple-choice",
      "options": ["Yes", "No", "Maybe"]
    }
  ],
  "triggers": [ObjectId("trigger1"), ObjectId("trigger2")]
}
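
For context, here’s a minimal sketch of how an admin-side flow could create a document with this shape. The variable names (gameWonTriggerId, xpIncreasedTriggerId) are my own illustration, not part of the actual admin panel.

// Sketch: inserting a form document shaped like the one above
// (the trigger ObjectIds are assumed to exist already)
const { ObjectId } = require('mongodb');

await db.collection('form_collection').insertOne({
  title: "Post-Game Feedback",
  description: "Tell us about your experience!",
  created_at: new Date(),
  status: true,
  start_time: new Date("2025-06-25T00:00:00Z"),
  end_time: new Date("2025-06-30T23:59:00Z"),
  questions: [
    {
      question_id: new ObjectId(),
      question_text: "Did you enjoy the game?",
      question_type: "multiple-choice",
      options: ["Yes", "No", "Maybe"]
    }
  ],
  triggers: [gameWonTriggerId, xpIncreasedTriggerId]
});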

user_collection (Users & Triggers)

{
  "_id": ObjectId,
  "username": "player123",
  "email": "player123@example.com",
  "triggers": [
    {
      "trigger_id": ObjectId("trigger1"),
      "triggered_at": ISODate("2025-06-26T14:10:00Z")
    },
    {
      "trigger_id": ObjectId("trigger2"),
      "triggered_at": ISODate("2025-06-26T14:15:00Z")
    }
  ]
}
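
Both of the flows below lean on a couple of indexes. These are my assumptions about what the queries need, not something the requirements prescribed:

// Assumed indexes for the two flows described next
await db.collection('form_collection').createIndex({ triggers: 1, start_time: 1, end_time: 1 });
await db.collection('user_collection').createIndex({ "triggers.trigger_id": 1 });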

The Two Backend Flows I Built

Recording a Trigger Event

When a player achieves something (like winning a game), we store the trigger in the user document.

// Node.js + MongoDB example
await db.collection('user_collection').updateOne(
  { _id: userId },
  { $addToSet: { triggers: { trigger_id: newTriggerId, triggered_at: new Date() } } }
);
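
Exposed over HTTP, the same update looks roughly like this. The route path and request body shape are hypothetical; only the updateOne call comes from the design above.

// Express sketch: the game client reports a trigger for a user
const express = require('express');
const { ObjectId } = require('mongodb');

const app = express();
app.use(express.json());

app.post('/users/:userId/triggers', async (req, res) => {
  const userId = new ObjectId(req.params.userId);
  const triggerId = new ObjectId(req.body.triggerId); // e.g. the "Game Won" trigger

  await db.collection('user_collection').updateOne(
    { _id: userId },
    { $addToSet: { triggers: { trigger_id: triggerId, triggered_at: new Date() } } }
  );

  res.status(204).end();
});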

Fetching Eligible Feedback Forms

Now comes the tricky part: fetching forms that match the user’s current triggers and are active right now.

My First Attempt

const currentTime = new Date();
const userTriggers = user.triggers.map(t => t.trigger_id);

const forms = await db.collection('form_collection').find({
  start_time: { $lte: currentTime },
  end_time: { $gte: currentTime },
  triggers: { $all: userTriggers }
}).toArray();

This worked in dev, but I realized this approach is not optimized for high traffic or large datasets. Two key problems: every request runs this query against the whole form collection, and the cost of matching each form’s trigger array against the user’s growing trigger list keeps rising as traffic and data grow.
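
A quick way to see that cost (just a debugging sketch, not part of the final design) is to ask MongoDB for the query plan and compare how many documents it examined versus how many it returned:

// Inspect the execution stats of the eligibility query
const plan = await db.collection('form_collection').find({
  start_time: { $lte: currentTime },
  end_time: { $gte: currentTime },
  triggers: { $all: userTriggers }
}).explain("executionStats");

console.log(plan.executionStats.totalDocsExamined, plan.executionStats.nReturned);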

My Alternative Approach

Then I thought: what if I precompute the trigger combinations during form creation?

So when an admin creates a form with triggers [A, B], I generate a key like:

"TriggerA-TriggerB" : [Form1, Form2]

I Store This in a Key-Value Store
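
Here’s a sketch of that write path. I’m assuming Redis as the key-value store and sorting the trigger IDs so that [A, B] and [B, A] produce the same key; both choices are my assumptions, not requirements.

// Sketch: precompute the trigger-combination key when a form is created
const { createClient } = require('redis');
const redis = createClient();
await redis.connect();

function buildTriggerKey(triggerIds) {
  return triggerIds.map(String).sort().join('-'); // e.g. "TriggerA-TriggerB"
}

async function onFormCreated(form) {
  const key = buildTriggerKey(form.triggers);
  await redis.sAdd(`forms:${key}`, String(form._id));
}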

Then during API runtime, I build the same key from the user’s current triggers, look it up in the key-value store, and only check the time window for the handful of forms that come back.
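
A read-path sketch under the same assumptions (a Redis set per key, buildTriggerKey from the previous snippet):

// Sketch: resolve eligible forms with one key lookup plus a time-window check
async function getEligibleForms(user) {
  const key = buildTriggerKey(user.triggers.map(t => t.trigger_id));
  const formIds = await redis.sMembers(`forms:${key}`);
  if (formIds.length === 0) return [];

  const now = new Date();
  return db.collection('form_collection').find({
    _id: { $in: formIds.map(id => new ObjectId(id)) },
    start_time: { $lte: now },
    end_time: { $gte: now }
  }).toArray();
}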

Image Word Coding

If I imagine the system as an image word, this is what it would say:

"TRIGGER SNAPSHOT" + "FORM TIMING WINDOW" => "SHOW FEEDBACK"

It’s like I’m freezing a frame of the user’s game state and quickly checking if any form fits perfectly into that snapshot.
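In code, that frozen frame boils down to one small check (illustrative only, reusing buildTriggerKey from earlier):

// Show a form only if the trigger snapshot matches and the form is inside its window
function shouldShowForm(userTriggerKey, form, now = new Date()) {
  const formTriggerKey = buildTriggerKey(form.triggers);
  const inWindow = form.start_time <= now && now <= form.end_time;
  return userTriggerKey === formTriggerKey && inWindow;
}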

Pros and Cons of Precomputing

Pros:

- Reads become a single key lookup, so API responses stay fast even under heavy traffic.
- The heavy work moves to form creation (the write side), which happens far less often than form fetches.

Cons:

- The number of trigger combinations can grow quickly, which means more keys to generate and maintain.
- Whenever a form or its triggers change, the precomputed keys have to be updated too.

Other Ideas I Explored

Inverted Index Table

Instead of precomputing combinations, store:

TriggerA → [Form1, Form3]
TriggerB → [Form2, Form3]

Then at runtime, intersect the lists of forms for each of the user’s triggers.
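
A small sketch of that intersection in plain JavaScript, using the example lists above:

// Keep only the forms that appear in the list of every trigger the user has fired
function intersectFormLists(listsByTrigger) {
  const lists = Object.values(listsByTrigger);
  if (lists.length === 0) return [];
  return lists.reduce((acc, list) => acc.filter(form => list.includes(form)));
}

// intersectFormLists({ TriggerA: ["Form1", "Form3"], TriggerB: ["Form2", "Form3"] })
// => ["Form3"]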

Materialized Views (in SQL)

Batch-generate views of “eligible forms per user” based on their triggers. But I had already committed to MongoDB rather than SQL, and the views would need refreshing every time a new trigger event arrived, so I set this idea aside.

Graph DB

Model the relationships in Neo4j: users connect to the triggers they have fired, and triggers connect to the forms they activate.

Use Cypher queries to traverse from a user, through their triggers, to the eligible forms. But I dropped this since it felt like overkill for the problem.

Final Thoughts

This project was one of my first real experiences tackling a backend design challenge, and it taught me a lot. I realized that it’s not enough to simply make a system work; it also needs to scale efficiently. One key takeaway was that data modeling isn’t just about how data is stored; it plays a critical role in how the system performs under load. I also discovered the power of shifting complexity from the read side to the write side, for example by using precomputed data to speed up API responses.

Key-value databases turned out to be incredibly useful for achieving low-latency lookups in scenarios like this. Of course, I’m still learning, and I know there’s room for improvement in this design. If you’ve worked on similar features or have ideas for making it better, I’d genuinely love to hear your thoughts and experiences.
