March 11, 2026

Your Sprint Review Attendance Is Your Alignment Score

You spend 45 minutes demoing your design implementation. Half the engineering team has their cameras off. The PM asks three questions you already answered in the Figma file. Nobody mentions the edge case Sarah spent two days solving. The sprint review ends. Nothing changes in the next sprint.

This isn't a meeting problem. It's an alignment signal, and it's been compounding for at least two sprints before today.

When developers disengage from sprint reviews, the instinct is to treat it as a motivation issue. Alex schedules a retrospective about the sprint review. Marcus tweaks the agenda. Someone suggests making it more interactive. Those interventions target the symptom. The signal developers are sending is about the structure of the feedback loop itself: it isn't surfacing information they need, and they've learned through experience that showing up doesn't change what they build next.

The Scrum.org community forum documents this pattern explicitly. In a thread titled 'Developers Show Less Interest During Sprint Reviews,' practitioners describe developers who view demos as a waste of time — not because they're lazy, but because the ceremony doesn't connect to anything that affects their work. One Scrum Master noted that developers wanted the Product Owner and stakeholders to handle the review and just relay the feedback, effectively requesting to opt out of a ceremony they no longer trusted to route useful information back to them.

That preference for indirect feedback isn't apathy. It's a rational response to a broken loop.

Developer disengagement from sprint reviews typically surfaces one of three structural breakdowns. First, the demo doesn't reflect what they built. When Sarah's sprint review demo presents the product as designed in Figma rather than as implemented by engineering, developers are watching a version of their own work that doesn't exist. Second, the review isn't surfacing information they need. A sprint review where Marcus asks surface-level questions teaches developers that attendance delivers no new signal. Third, the feedback loop doesn't reach them anyway. In many teams, feedback from sprint reviews travels through the Product Owner before reaching developers. By the time it arrives, it's been filtered, prioritised, or deferred.

The attendance problem you observe in today's sprint review is the output of decisions made two or three sprints ago. The feedback loop broke before the cameras started turning off.

This is why treating sprint review attendance as an engagement metric misframes the problem. Engagement is a lagging signal; it reflects accumulated experience. What attendance actually measures is alignment. When developers no longer believe that attending the sprint review improves the quality of the decisions they make next sprint, disengagement isn't irrational. It's accurate.

Flowtrace's 2025 State of Meetings report found that 67% of meetings are considered unproductive by the executives who run them. But sprint reviews aren't generic meetings. They're the only ceremony in Scrum specifically designed to inspect the increment and adapt the product backlog. When that ceremony degrades into a demo watched by disengaged attendees, the product's feedback loop has been severed.

The useful diagnostic question isn't: how do we get developers back in the room? It's: what would a developer need to gain from attending that they can't get any other way?

If sprint review attendance functions as an alignment indicator, it can be read diagnostically. Track who's not attending, not just how many. If the same developers consistently opt out, the disengagement isn't random. It reflects which parts of the team have lost faith in the feedback loop. Ask what information is exclusive to the sprint review. If everything discussed in the review is available before or after, the ceremony isn't generating new signal. Watch the gap between questions asked and answers incorporated. If stakeholder feedback from the review never produces a change in the next sprint's backlog, the team is inspecting without adapting.

The cameras turn off one by one across two or three sprints until someone notices that attendance has dropped and frames it as a team engagement problem. It's not. It's your alignment score. The developers who aren't in the room have already calculated that attending doesn't improve what they build. That calculation is the signal worth reading, before it becomes a retro item.

Start by asking the developers who've stopped attending one question: what would need to be true about the sprint review for it to be worth your time? The answers will map the alignment gap more precisely than any attendance tracker.