Posted in Scrum Events

Sprint Review Real Examples

Pixar (3)

Pixar holds regular meetings every few months, called Braintrust Meetings, to inspect progress on a film, talk through what works and what doesn’t, and decide how to adapt the film to improve it (i.e. similar to a Sprint Review).


‘put smart passionate people in a room together, charge them with identifying and solving problems, and encourage them to be candid with one another’

(1, pages 86-87)

Structure (1, page 94)

  • Film screening 
  • Lunch together in a conference room (to gather thoughts) 
  • The director gives a summary of where they think they are
  • Feedback begins

Feedback (page 93)

  • Braintrust notes, then, are intended to bring the true causes of problems to the surface – not to demand a specific remedy
  • The director (a role that mirrors the Product Owner) is free to change whatever they want based on the feedback
  • Feedback is given as constructive criticism (page 103) 
    • “The writing in this scene isn’t good enough” (just criticism) 
    • “Don’t you want people to walk out of the theater and be quoting those lines?” (more of a challenge and shows you want the same things) 

References

  1. Creativity, Inc. by Ed Catmull

Posted in Scrum Add-ons

Sprint Review ‘Science Fair’ style

Interactive increment demonstrations


Motivation

We have multiple teams delivering towards the same business goals, but they often weren’t fully aware of what the other teams were achieving and missed the opportunity to ask questions.

Research

Articles

  • The Nexus Framework describes a similar idea, though demonstrating by feature developed rather than by team (1, page 67)
  • A quick summary is also available on scrum.org

Starting Point

  • Each team was running its own end-of-iteration/release reviews.
  • Teams were unsure of the approach – especially those who had worked on features they didn’t feel had a ‘wow’ factor
  • Some team members said they felt they had already completed a review of their work for the iteration or release and were not confident any further review would add value

Trial Method

  • Each team had a ‘stall’ at the Science Fair (we had 6 teams in total holding stalls)
  • Stalls were set out around the edges of the room with enough space for people to wander about and stand around each stall
  • All stakeholders from all teams and some in-business users were invited
  • A number of ten-minute rounds were set up to allow the teams to also rotate and see each other’s stalls
  • An introduction was created to explain the format and summarise the features delivered over the agreed interval

Results

Notes from setup

  • We decided to schedule these at regular intervals that suited the iteration or release cycles of all of the teams involved, as they work to different cadences
  • Setup time in the room was important so that when people entered we were ready to go
  • Talking through the delivered features at the start felt like a waste of time, as everyone could then talk through them with each team anyway. It took time away from actual conversations, so for the next one we decided to drop it and instead bring our objectives wall into the room for people to refer to if they wished.

Stall Activity

  • Each team had someone viewing their demo at almost every ‘round’, as we called them
  • Each team felt they got value out of it, as they were able to have more in-depth conversations and ask more questions about other teams’ features than they usually feel able to in team-specific reviews
  • The teams who were concerned about repetition of reviews and about their stall not having exciting enough features received ample interest, questions, and feedback, enough for us to repeat this Science Fair format again
  • Stakeholders and in-business users who would usually only attend the review sessions for specific features broadened their knowledge of the work of other teams

Lessons Learned

  • Introduction to the features at the start is unnecessary – this has now been replaced with bringing the feature board into the room
  • Worthwhile start to improving the cross-team knowledge sharing and communication. It did highlight how difficult it is for each team to keep up with and understand the work of 5 other teams whilst also maintaining their own work.
  • Requests were made by all to make the event more ‘jazzy’ and exciting to attend, and to extend the invitation to more people. Biscuits have been suggested
  • This review format allowed a different type of conversation from a solo team review, which I believe was because there were fewer people at any one time, so questions of more personal interest seemed appropriate. This is why I don’t believe it felt repetitive
  • There are more people within the business interested in the features than you might immediately think

Extensions to try

  • Invite more people and advertise as an event around the business for whoever wants to attend.
  • Consider whether making it a competition for the best stall would create a brighter atmosphere.

References

  1. Nexus Framework by Kurt Bittner
