Be More Productive

Get the most from your processes now!

Sprint Commitments

your exit criteria are my intake criteria


Unless commitment is made, there are only promises and hopes… but no plans.

— Peter Drucker

Commitment is a good thing, but we should choose wisely what we commit to. An unfortunate anti-pattern is to treat the concept of Sprint Commitment as asking the development team to make a commitment to complete all the stories that get started in a Sprint: to “live up to their estimate”. A better choice would be a commitment to the full rigor of done and delivered.


Merriam-Webster’s definition of commitment is “an agreement or pledge to do something in the future”. Considering that Scrum asks stakeholders to trust the team, it’s not unreasonable that stakeholders ask for a commitment in return. The question is, to what?

The Scrum Guide provides unambiguous guidance that the goal of a Sprint is to produce a potentially releasable increment of “Done” product at the end of each Sprint.

Fushimi Inari Taisha, Kyōto-shi, Japan

If the something we’re after is a potentially releasable increment of completely done work, then the way that we craft our Definition of Done will be an important aspect of how we determine if the team lived up to its commitment. A common anti-pattern is applying DoD criteria to stories, but not to the Sprint increment as a whole.

Evidence of the kind of commitment we’re after would be a Sprint Review Change Log Entry, listing features delivered, environment or configuration settings introduced or modified, API or service interaction changes, changes to cache management, redirects, routing, security headers, changes in release process, and so on. In short, a comprehensive list of changes relating to the code, testing, deployment and documentation of the system as a whole.
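
As a rough sketch, such an entry could even be kept as structured data alongside the code and rendered as the review agenda. The field names and the example entry below are purely illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChangeLogEntry:
    """A Sprint Review Change Log Entry; the fields mirror the list above."""
    sprint: str
    features_delivered: List[str] = field(default_factory=list)      # stories, with IDs for traceability
    config_changes: List[str] = field(default_factory=list)          # environment or configuration settings
    api_changes: List[str] = field(default_factory=list)             # API or service interaction changes
    infrastructure_changes: List[str] = field(default_factory=list)  # caching, redirects, routing, security headers
    release_process_changes: List[str] = field(default_factory=list)

    def render(self) -> str:
        """Render the entry as plain text suitable for the Sprint Review agenda."""
        sections = [
            ("Features delivered", self.features_delivered),
            ("Environment / configuration", self.config_changes),
            ("API / service interactions", self.api_changes),
            ("Caching, routing, security", self.infrastructure_changes),
            ("Release process", self.release_process_changes),
        ]
        lines = [f"Change Log: {self.sprint}"]
        for title, items in sections:
            lines.append(f"{title}:")
            lines.extend(f"  - {item}" for item in (items or ["(no changes)"]))
        return "\n".join(lines)

# Hypothetical example entry:
print(ChangeLogEntry(
    sprint="Sprint 42",
    features_delivered=["STORY-101: self-service password reset"],
    config_changes=["SESSION_TIMEOUT raised to 30 minutes"],
).render())
```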


Change Log as Sprint Review Agenda

A proposed Change Log Entry should serve as the agenda for the Sprint Review meeting. The objective of the Sprint Review is for the Product Owner to approve the features in the Sprint release candidate, or to remand some stories for additional work. There should be no shame in stories not making the cut. Having all of the work accepted every Sprint is hardly a good sign; it would more likely be an indication of deficient release standards.

Posting the Change Log Entry (revised to reflect what was accepted) and releasing the approved features should be done immediately following Sprint Review, within the closing Sprint, not deferred to the next Sprint.

The Release Change Log Entry can provide brief yet hard proof of commitment fulfilled, including traceability back to the stories and acceptance criteria that drove the work through the Sprint.


A Dress Rehearsal

Every Sprint should end with a delivery which is no less robust than a production release. Treat every Sprint as a dress rehearsal, so that when the curtain goes up on opening night, you’ll be ready.

While the Sprint End Release need not go to a production environment, it should still follow production-grade protocols. If it is inconvenient to make the release after the Sprint Review and before the Sprint closes, then resolving that impediment should take priority in the next Sprint over working on more features.

For organizations that must adhere to a formal Software Development Life Cycle process, the Sprint Change Log Record should contain all of the information needed to comply, stating what you set out to do, what you did, and how you verified it.


Trust But Verify

The essence of “done” comes down to some verifiably complete increment of functionality. We can look at each of those three components:

  • Verifiably — equates to the quality of the work,
  • Complete — equates to preparedness of delivery, and
  • Increment of functionality — equates to features that are meaningful from a customer perspective.

Verifiable

Code quality and fitness for purpose must be verified against integrated code, not on feature branches. Of course testing should also be done by developers on feature branches before integration, but the state of being verifiably done can only be said to apply to integrated code in a stable environment. A stable environment means at least one step downstream from the integration environment.

Each Increment is additive to all prior Increments and thoroughly tested, ensuring that all Increments work together

— Scrum Guide

Environment settings, system configuration, service integrations and feature toggle settings are as important to verify as the code itself. There’s little value in trying to verify those aspects of a system on a feature branch. Verification of the Sprint Product must occur in a stable environment.
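
As a minimal sketch of the rule above, with made-up environment names, the promotion pipeline can be treated as an ordered list, and verification only counts when it happens downstream of integration:

```python
# Hypothetical promotion pipeline; the environment names are illustrative only.
PIPELINE = ["feature-branch", "integration", "stable", "showcase", "production"]

def counts_as_verification(environment: str) -> bool:
    """Verification only counts at least one step downstream of integration."""
    return PIPELINE.index(environment) > PIPELINE.index("integration")

assert counts_as_verification("stable")              # integrated code in a stable environment
assert not counts_as_verification("feature-branch")  # developer testing, but not verifiably done
```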


Completely Done

Preparedness for delivery means that all aspects of configuration are committed to code, the increment is tagged for release, and all aspects of releasing the increment to a subsequent environment are either fully automated or detailed in a checklist of manual steps, with that checklist also committed with the source code. No aspect of release may reside in someone’s head.

Development Teams deliver an Increment of product functionality every Sprint. This Increment is useable, so a Product Owner may choose to immediately release it.

— Scrum Guide

Releasing doesn’t have to mean releasing to a production environment; releasing to a highly stable “Showcase” environment serves the same purpose. Issues that arise in post-Sprint environments are tracked separately from routine feature demand as a measure of quality.
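
A minimal sketch of how the preparedness criteria above might be checked automatically; the tag naming, the checklist filename, and the reliance on git here are assumptions for illustration, not a prescription:

```python
import pathlib
import subprocess

def release_problems(repo: pathlib.Path, tag: str,
                     checklist: str = "RELEASE_CHECKLIST.md") -> list:
    """Return a list of problems blocking the Sprint-end release; empty means ready."""
    problems = []

    def git(*args):
        return subprocess.run(["git", "-C", str(repo), *args],
                              capture_output=True, text=True, check=True).stdout

    # The increment must be tagged for release.
    if tag not in git("tag", "--list").split():
        problems.append(f"increment is not tagged as {tag!r}")

    # Configuration lives in version control, not in a working copy or someone's head.
    if git("status", "--porcelain").strip():
        problems.append("uncommitted changes: configuration is not fully committed to code")

    # Release steps are automated, or written down in a checklist kept with the source.
    if not (repo / checklist).exists():
        problems.append(f"no {checklist} committed alongside the source")

    return problems

if __name__ == "__main__":
    for problem in release_problems(pathlib.Path("."), tag="sprint-42"):
        print("NOT READY:", problem)
```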


Increment of Functionality

Stakeholders should be able to understand the goals of a Sprint. There will always be work that is technical in nature and not of particular interest to stakeholders, but that is the exception. The Sprint Goal will typically consist of a set of features which are meaningful to customers.

If stakeholders routinely can’t make sense of the features being presented in Sprint Review without a lot of contextual explanation, that may be a sign that the composition of what’s being delivered doesn’t pass muster as useful from a business perspective. Just make sure that there’s clear business value to what goes into the Sprint Review.

Work delivered in an increment does not need to be feature complete, so long as what is delivered is coherent to stakeholders.

The essence of what makes Scrum work is leaving some degree of negotiability of the work in the hands of the Development Team.

The Product Owner sets priorities, but the number of items selected from the Product Backlog for the Sprint is solely up to the Development Team. Only the Development Team can assess what it can accomplish over the upcoming Sprint.

— Scrum Guide

Work that is verifiably done but lacking essential criteria that make it comprehensible to stakeholders should not be delivered in an increment, but held back until some minimally viable state of the feature is verifiably done.

As a general rule, the customer or stakeholder should understand the increment of functionality being delivered; you shouldn’t have to do a lot of explaining about what it is and isn’t.


Releasable Increments of Verifiably Done Product

The concept of “Completely Done” trips up some people, who misinterpret it as meaning that all aspects of a given story specification must be complete, thereby missing the more important concept of the work being in a deliverable state.

The Development Team consists of professionals who do the work of delivering a potentially releasable Increment of “Done” product at the end of each Sprint. A “Done” increment is required at the Sprint Review.

— Scrum Guide

Even when the specific requirements for production release are non-negotiable, Scrum provides for negotiability of features into increments. There is no need to deliver features completely in any one increment, but everything included in an increment must always be fully baked as a deliverable.


Scope, Time and Cost

Project Management teaches the discipline of the Iron Triangle of Scope, Time and Cost. Scrum breaks the project into increments in order to allow adjustments to the Iron Triangle constraints over iterations. This approach has the benefit of allowing us to focus on the work within the iteration, while imposing a different set of constraints.


Verifiable — Complete — Increment

The constraint that prescribes a Scrum Sprint Goal is a verifiably complete increment of work. You can think of these constraints as the Iron Triangle of Scrum. The difference is that two of the three sides, Verifiable and Complete, are fixed. Verifiable and Complete are non-negotiable in Scrum, but the composition of the work-in-progress that actually goes into the Sprint Product increment is always negotiable.

The verifiable aspect equates to the quality of the work. The completeness aspect equates to preparedness for delivery, while the Sprint Product equates to the increment of functionality that is a meaningful set of features from a customer perspective.


It's All About the Releases

Pay no attention to that man behind the curtain!

The Sprint Product increment is a release candidate. It’s unrealistic and counter-productive to think that all the work in progress in a Sprint would somehow get whipped into a verifiably releasable state by the end of the Sprint.

Rolling work over into the next Sprint is often preferable to succumbing to the immediate gratification of squeezing in Pull Requests at the very end of the Sprint to “bag the points”. When work that lands at the end of the Sprint is rolled over instead of released, then the next Sprint starts with that work in verification: instead of rushed integration, it will enjoy a full Sprint to settle any details. Sometimes work needs to be expedited, but if it’s not the exception, then we have other problems.

In Scrum, a Sprint Product containing one line of verifiably done code is preferable to any set of features that is not verifiably done. We never give ground on verifiably done. That’s why it’s called Scrum: move the ball down the field, never give up ground.

The Sprint Goal can be any coherence that causes the Development Team to work together rather than on separate initiatives.

— Scrum Guide


Ancient History

The Scrum Guide sought to correct a common misconception about Sprint Commitment a decade ago:

Development Teams do not commit to completing the work planned during a Sprint Planning Meeting. The Development Team creates a forecast of work it believes will be done, but that forecast will change as more becomes known throughout the Sprint.

— Scrum Guide Revisions — 2010

Insisting on a concept of Sprint Commitment as the obligation of the Development Team to make their estimate equal their output takes away the team’s options to adapt scope within the Sprint, and thereby forces a compromise on the standard of verifiably done.


The Scrum Dictionary sorts the question out for us:

  1. The Team commits to its Sprint Goal and to doing its due diligence in order to have all completed Stories get to “Done”. The Team does not commit to doing all the Stories in the Sprint Backlog.
  2. As defined in the Scrum Guide, the Team commits to accomplishing the Sprint Goal during the Sprint.
  3. In original Scrum, the Team commits to completing the Sprint Backlog during the Sprint; the Scrum Guide deprecated this definition.


People get tripped up by equating Sprint estimates with Sprint commitments because of the need to improve stakeholder confidence. Teams that place an undue emphasis on proxy metrics such as the estimate burn-down do so at the expense of productivity.

Certainly stakeholder confidence is a first-order objective, since we won’t get by long without the backing of the people who are funding and otherwise providing for the team’s continued existence. The point is that working software is the most reliable way of earning and keeping stakeholder trust.

The best way to predict the future is to create it.

— Peter Drucker

Scrum has always endorsed working software as the primary measure of success, but somehow it has become a practice of getting tickets done, rather than one of getting releases out. The gold standard in software is a production grade release. Try making a commitment to that as your Sprint Goal.


Photo Credits

Tim Foster — "Fushimi Inari Taisha, Kyōto-shi, Japan" (Unsplash)

Martijn Baudoin — "Tangled Wires" (Unsplash)

Logan Liu — "Triangles" (Unsplash)



pattern language

Let's agree to define productivity in terms of throughput. We can debate the meaning of productivity in terms of additional measurements of the business value of delivered work, but as Eliyahu Goldratt pointed out in his critique of the Balanced Scorecard, there is a virtue in simplicity. Throughput doesn’t answer all our questions about business value, but it is a sufficient metric for the context of evaluating the relationship of practices with productivity.