Few are those who see with their own eyes and feel with their own hearts.
Note: For more on SAFe Scrum, see the additional Framework articles in the Scrum series, including SAFe Scrum, Scrum Master/Team Coach, Iterations, Iteration Planning, Iteration Goals, and Iteration Retrospective.
The Iteration Review is a regular SAFe Scrum event where the team inspects the iteration increment, assesses progress, and adjusts the team backlog.
The iteration review is the second to last event of the iteration. It provides a way to regularly gather immediate, contextual feedback from the team and its stakeholders. The iteration review offers several benefits:
- It brings closure to the iteration timebox
- It allows team members to demonstrate their contributions and to take some satisfaction and pride in their work
- It provides an opportunity for the team to receive feedback to improve the solution under development
- It shows the results of the latest system increment to help determine future work
An iteration review is where the team demos a working, tested increment. No slides are needed; the focus is on the working solution rather than a presentation. The team and stakeholders review the iteration's accomplishments and, based on this information, collaborate on what to do next. The Team Backlog may also be adjusted to address new opportunities.
Inputs and Outputs of the Iteration Review
Inputs to the iteration review include:
- Iteration goals and PI Objectives
- The team’s increment deployed to a staging environment (or production environment where appropriate)
- A brief list of work to be demoed to prepare people for what they are about to see
A successful iteration review event delivers the following outputs:
- Feedback on the increment and progress toward the iteration goals and broader PI Objectives
- Adjusted Team Backlog based on feedback
- Identification of risks and impediments
The preparation for the iteration review begins during Iteration Planning, where teams start thinking about how they will demo the committed Stories. ‘Beginning with the end in mind’ facilitates iteration planning and alignment, fostering a more thorough understanding of the functionality needed ahead of iteration execution.
The PO starts the iteration review by discussing the iteration goals and their status. The review proceeds with a walk-through of all the committed stories. Teams demonstrate the significant new behavior and knowledge gained from the iteration’s completed stories, Spikes, Refactors, and Nonfunctional Requirements (NFRs). The demos should be part of a working, tested system—preferably in a staging environment closely resembling production. Spikes and NFRs can be demonstrated via a presentation of findings if the functionality lacks a user interface. Stakeholders provide feedback on the stories that the team demoed, which is the primary goal of the review.
After the demo, the team reflects on the stories it did not complete and why it could not finish them. This discussion usually results in discovering impediments or risks, false assumptions, changing priorities, estimating inaccuracies, over-commitment, or other problems with Team Flow. These findings often lead to further study in the Iteration Retrospective and may result in improvements that support better planning and execution going forward. Figure 1 shows an iteration review in action.
The team reflects on how well it did within the iteration and determines its progress toward its Team PI objectives. It finishes the event by refining the Team Backlog, based on the feedback received, before the next iteration planning event.
Attendees at the iteration review include:
- The Product Owner (PO)
- Scrum Master/Team Coach
- All team members and relevant subject matter experts
- Stakeholders, which may also include members of other teams or trains
The timebox for the event is a maximum of 90 minutes for a two-week iteration. Figure 2 shows an example agenda and a description of each item.
The Scrum Master/Team Coach or PO typically facilitates the iteration review for the team, ensuring they stay within the agreed event agenda and timebox. Descriptions of each example agenda item follow:
- Review iteration goals – Discuss the status of each iteration goal. Teams may also review PI objectives for a broader context.
- Demonstrate completed stories – The iteration review proceeds with a walk-through and demonstration of each completed story (as well as spikes, NFRs, and any other work finished by the team). Demos should show progress toward iteration goals and PI objectives through solution changes, test scenarios, or a prototype representing the user’s environment. Spikes can be demoed as a presentation of findings or learning. The team and stakeholders should ask questions and provide constructive feedback.
- Reflect on any incomplete stories – Next, the team should reflect on missed iteration goals and stories they did not complete to identify opportunities for future improvement. This discussion usually results in discovering impediments or risks, false assumptions, changing priorities, estimating inaccuracies, or over-commitment.
- Identify risks and impediments – After the demo and reflecting on any incomplete stories, the team identifies new risks or dependencies that might impact achieving the PI objectives. Teams often use the ROAM (Resolved, Owned, Accepted, Mitigated) process to address the risks as needed.
- Refine the Team Backlog – Based on stakeholder feedback, the team can refine their backlog to reflect any adjustments before the next Iteration Planning event.
Below are some tips for running a successful iteration review event:
- Limit preparation to less than two hours
- Timebox the event to about 90 minutes
- Minimize the use of slides; the iteration review is intended to garner feedback on working, tested system components
- Verify that completed stories meet the definition of done (DoD)
- Demonstrate incomplete stories if enough functionality is available to get feedback
- Encourage constructive feedback and celebrate the team’s accomplishments
- If a significant stakeholder cannot attend, the PO should follow up to report progress and get feedback
Learn More
- Leffingwell, Dean. Agile Software Requirements: Lean Requirements Practices for Teams, Programs, and the Enterprise. Addison-Wesley, 2011.
- Leffingwell, Dean. Scaling Software Agility: Best Practices for Large Enterprises. Addison-Wesley, 2007.
Last update: 23 November 2022