A common confusion around agile requirements concerns the difference between Acceptance Criteria and the Definition of Done. In this post I highlight the differences and suggest a few practical ways of understanding and expanding your Definition of Done.
In a subsequent post I will address Acceptance Criteria.
The Dry Definition
The Definition of Done is the set of terms that define what must and should (or must not and should not) be performed in order to declare any single requirement complete. As such, the Definition of Done is a kind of governing contract between the team and the Product Owner, reflecting the current working standards shared by the team's developers and the Product Owner.
Acceptance Criteria are a set of conditions that, when accomplished, establish a specific requirement as complete. As such, the Acceptance Criteria enable validating that all desired aspects of a specific requirement are done.
Why is Definition of Done important?
Compared to traditional software development processes, agility calls for a high level of transparency and discipline.
If, in older processes, we had a "safety net" to catch the outcomes of poor standards, in an agile work-mode, we want to gradually and continually increase our standards so that such "safety nets" are not required anymore.
Such a "safety net" is important when we:
- Postpone or ignore the risks
- Underestimate complexity
- Are over-optimistic
- Automatically assume our process is good or even improving
The Definition of Done creates an opportunity to make our standards visible - often painfully - and hence incrementally improve them.
Who owns the Definition of Done?
In Scrum terms, the Definition of Done belongs to the team and is 'owned' by the Product Owner.
It belongs to the team because the team is responsible for its definition, enactment and evolution.
It is 'owned' by the Product Owner because he or she is responsible for saying, for any given requirement, whether it is Done or not.
So far so good. But...
What happens when the DoD is not enacted? Or someone overrides it?
In Scrum terms, it is the responsibility of the Scrum Master to reflect the current standards of the team, and to facilitate their continued improvement - or lack thereof.
And oh my, what an impractical statement that is! What on earth does it mean - to reflect and to facilitate?
I mean - how often do you meet teams whose members say "We want to try out Cucumber/Jenkins/Pair-programming/..." and the Product Owner replies: "Oh, go ahead, just drop some content my dears, and I'm sure everything will be great"?
If that were commonplace, Scrum Masters would be out of a job, Scrum would be a relic of the past, and the Definition of Done would be as transparent as, well, air.
What in practice does it mean to exercise continuous improvement through the team's Definition of Done?
Let's get practical!
Be the change
The best way to improve the standards is by doing it yourself. Write better test automation, refactor regularly, never ever ever bypass the build server or send a DLL directly, write INVESTed and SMART user stories, pair with someone - whatever you do, push it one notch up. Especially if you are the Scrum Master.
Have a visible DoD statement
Hang your Definition of Done in the team room or area. If you are the Scrum Master, you may hold a series of meetings on the DoD, or otherwise facilitate the discussion in the team. Right after the daily standup may be a good time for this.
Waste Sorting Bins
Hang A4 or A3 posters with the 8 types of waste (the Lean wastes: Defects, Overproduction, Waiting, Non-utilized talent, Transportation, Inventory, Motion, Extra processing), and place a post-it underneath the relevant one whenever waste is found. During the retrospective, discuss the week's post-its and add whichever are relevant to the DoD.
Hear no evil, see no evil
When the Product Owner ignores the Definition of Done, and the developers are not 'putting their foot down', there is little chance of improving standards. It is a covert coalition that leads to maintaining poor standards.
A possible action is to have a conversation with the Product Owner about the results, and to run a 5-Whys to analyze potential root causes. When root causes are found, reflect on the implications of accepting requirements (stories) that do not meet the standards. Then offer, during the Sprint Review, to keep working on the requirement until it meets the standard - for example, zero defects on a story.
"In God we trust, all others must bring data" (Prof. Deming quote)
Use this when quality does not improve: for example, recurring defects, spaghetti code, or untestable software modules.
The first thing to do before trying to solve problems is to know they exist. If the teammates do not start measuring on their own, the Scrum Master starts a measurement to make the Product Owner and team aware. Here are a few examples:
Post the number of defects per week near the Scrum board and show it on a chart; run static code analysis on the codebase; work with teammates to rank the testability of modules; and so on.
Then bring these data to the retrospective.
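As a minimal sketch of the first measurement (the data and function name here are illustrative, not from any specific tracking tool), counting defect reports per ISO week for such a chart could look like:

```python
from collections import Counter
from datetime import date

def defects_per_week(defect_dates):
    """Group defect report dates into (ISO year, ISO week) buckets."""
    counts = Counter(d.isocalendar()[:2] for d in defect_dates)
    return dict(sorted(counts.items()))

# Illustrative defect report dates
reports = [date(2024, 3, 4), date(2024, 3, 6), date(2024, 3, 14)]
print(defects_per_week(reports))  # {(2024, 10): 2, (2024, 11): 1}
```

A weekly tally like this, kept next to the Scrum board, is usually enough to start the conversation; the chart matters more than the tooling.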
What's your trick?
How does your team raise the bar on your Definition of Done?
Please share your ideas for identifying, visualizing and enacting your Definition of Done.