The Definition of Done: What Product Managers Need to Know

Are we there yet? That's a difficult question to answer when no one agrees on where exactly "there" is. In an Agile world full of cloud-based solutions, there's no shrink-wrapped container full of widgets that signifies completion, and there's always the option to ship code that is far from a "finished" product. That's why agreeing on a "Definition of Done" is critical to reaching consensus on when projects, initiatives, and features are actually complete.

It all starts with a common vocabulary—if people aren’t speaking the same language there’s ample room for confusion, frustration, and mixed signals. To avoid this scenario, product teams should take the time to work with their engineering and testing counterparts to agree on what qualifies as “done” in different cases.

To get everyone on the same page, here's a quick guide to this staple of agile product management.

Defining the definition of done

The Definition of Done is an agreed-upon set of items that must be completed before a project or user story can be considered complete. It is applied consistently and serves as the official gate that moves a work item from "in progress" to "done."

While the particulars vary from organization to organization, a typical definition of done consists of a checklist containing items such as:

  • Code is peer-reviewed
  • Code is checked in
  • Code is deployed to test environment
  • Code/feature passes regression testing
  • Code/feature passes smoke testing
  • Code is documented
  • Help documentation is updated
  • Feature is OK’d by stakeholders

Different companies and development groups will come up with their own variants, but they all tie back to the same ideal: the code does what it's supposed to and doesn't break anything else. Running every feature, release, and sprint through these steps matters above all because it ensures consistent quality and completeness.

There should also be an element of transparency since everything can be tied back to that done-ness checklist. If a release or feature hasn’t checked off all the boxes, then it can’t move forward and everyone knows why.
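To make that gate concrete, here's a minimal sketch of how a team might model a done-ness checklist in code. The `WorkItem` class and method names are hypothetical, purely for illustration; in practice, teams usually track this in their project-management or CI tooling rather than in application code.

```python
# Hypothetical model of a Definition of Done gate. The criterion names
# mirror the sample checklist above; real teams typically enforce this
# in project-management or CI tooling instead of application code.

DEFINITION_OF_DONE = [
    "Code is peer-reviewed",
    "Code is checked in",
    "Code is deployed to test environment",
    "Code/feature passes regression testing",
    "Code/feature passes smoke testing",
    "Code is documented",
    "Help documentation is updated",
    "Feature is OK'd by stakeholders",
]

class WorkItem:
    # A feature, story, or bug fix moving toward "done."

    def __init__(self, title: str):
        self.title = title
        self.completed: set[str] = set()

    def check_off(self, criterion: str) -> None:
        if criterion not in DEFINITION_OF_DONE:
            raise ValueError(f"Unknown criterion: {criterion!r}")
        self.completed.add(criterion)

    def is_done(self) -> bool:
        # The gate: every criterion must be checked off before the
        # item can move from "in progress" to "done."
        return self.completed == set(DEFINITION_OF_DONE)

    def missing(self) -> list[str]:
        # Transparency: anyone can see exactly which boxes are unticked.
        return [c for c in DEFINITION_OF_DONE if c not in self.completed]
```

If `is_done()` returns False, `missing()` spells out exactly why the item can't move forward, which is the transparency the checklist is meant to provide.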

Who defines done?

The engineering organization typically takes the lead in crafting the Definition of Done, since much of it exists to guarantee that things work well and meet basic technical requirements. The effort might be led by the Scrum Master or the head of engineering.

But agreeing on what qualifies as "done" should be a collaborative exercise. Without input and approval from product, quality assurance, and other stakeholders, there won't be widespread acceptance that something is actually done rather than engineering simply saying it is.

“Think about all of the tasks that must be done to put the story into production. Be imaginative and include everything, even tasks that might be out of your team control,” says product development consultant Luís Gonçalves. From that ideal vision of “done,” the team can whittle the list down to a more realistic definition.

Putting it into practice

Defining done is a timesaver in the long run since it reduces unnecessary revisions later on. When the code meets the definition, everyone has assurances that it is ready for prime time.

“The definition of done (DoD) is when all conditions, or acceptance criteria, that a software product must satisfy are met and ready to be accepted by a user, customer, team, or consuming system,” says Derek Huether of ALM Platforms. “We must meet the definition of done to ensure quality. It lowers rework, by preventing user stories that don’t meet the definition from being promoted to higher-level environments. It will prevent features that don’t meet the definition from being delivered to the customer or user.”

Once the definition is in place, it applies to everything, ensuring consistency and quality.

“These rules apply to every single work item that goes through our task boards, so long as it involves code. Whether it’s a large user story with multiple dependencies or a tiny bug fix, the person doing the work is expected to run through these checklists,” says Danny Smith of CharlieHR. “That doesn’t mean that everything on the checklists has to be ticked off for every work item, though — a tiny technical improvement is unlikely to need a marketing email written about it, for example. It does mean that everything in the checklist must be considered for every work item. We trust our engineers to use their judgment.”
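Continuing the hypothetical sketch above, one way to model that judgment call is to give each checklist item three states instead of two: an item isn't done until every criterion has been either ticked off or deliberately marked not applicable. (These names are illustrative, not CharlieHR's actual tooling.)

```python
from enum import Enum

class CriterionState(Enum):
    PENDING = "pending"          # not yet considered
    DONE = "done"                # ticked off
    NOT_APPLICABLE = "n/a"       # considered and judged irrelevant

def item_is_done(states: dict[str, CriterionState]) -> bool:
    # Every criterion must be *considered*: ticked off or explicitly
    # ruled out. Nothing may be left pending.
    return all(s is not CriterionState.PENDING for s in states.values())
```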

Why product managers should care about the definition of done

Leaving whether something is "done" open to interpretation can cause conflict and misunderstandings, and can lead to negative user experiences and revenue impacts, which is a good reason to settle on the criteria before the sprint ever begins. Sharing a common vision of the end result is a good starting point for any project, and agreeing on the gates a feature must pass through to reach completion creates a consensus of expectations.

Not giving every single project its own measure of "done" is also a big time-saver that lets people focus on innovation and execution instead of definition, so investing a little time in creating a baseline understanding of what "done" means is a worthy endeavor. With the ambiguity removed, everyone can concentrate on their core responsibilities instead of arguing later in the process about fitness for release.

And although a feature might appear done on the surface, if the technical team hasn't dotted the i's and crossed the t's behind the scenes, they will keep circling back to those "done" projects to clean things up and address open issues.

“Incomplete work has a nasty habit of mounting up, and without visibility of how much effort truly remains, the deficit can quickly get out of hand,” says Ian Mitchell of proAgile. “The tyranny of work which is nearly done, but not really done, can put a team in servitude to technical debt.”

Definition of done vs. acceptance criteria

If you’re beginning to wonder why this is a product management issue and not a quality control topic for the technical team, that’s in part due to the difference between a general Definition of Done and the specific acceptance criteria for a particular user story.

The DoD is applied universally (with a few exceptions) to everything the engineering organization ships. A product management "OK" might be one of the items on the checklist, but the definition itself is fairly generic.

Acceptance criteria, however, are unique to the user story or feature in question. They should be defined by product management, with input from the technical team on any specific use cases or parameters that must be met before the item can be green-lit as done.
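In code terms, and still assuming the hypothetical sketch from earlier (including its `DEFINITION_OF_DONE` list), the distinction amounts to two separate gates: one universal checklist shared by every work item, plus criteria unique to the story. A story is only complete when it clears both.

```python
def story_is_complete(
    dod_checked: set[str],
    acceptance_checked: set[str],
    acceptance_criteria: set[str],
) -> bool:
    # Gate 1: the universal Definition of Done, identical for every item.
    dod_met = set(DEFINITION_OF_DONE) <= dod_checked
    # Gate 2: acceptance criteria written by product management for
    # this specific user story.
    acceptance_met = acceptance_criteria <= acceptance_checked
    return dod_met and acceptance_met
```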

Since the DoD applies to everything, product management should review the definition and confirm it is comprehensive enough. However, ownership and management of the definition don't necessarily need to rest with product management. As long as product is satisfied that "done" items pass the tests spelled out in the DoD, it can largely leave the definition alone.

But even a shipped product or feature can hardly be considered done in the eyes of product management.

“For a product manager, you’re not done with a product (or feature) until you’ve put it out to pasture,” says Adam Sigel of Hometap. “Once it’s launched, you begin the long tail of customer support, price changes, bug fixes, and compatibility updates. Once you’re done supporting it, it’s time to sunset it. Then, and only then, are you done with a product.”

Where to begin

The defining process shouldn't happen in a vacuum; it should be a collaboration between stakeholders and the people actually doing the work. Whether it starts with brainstorming or a straw man suggested by the technical team, there should be ample opportunity for comment and unanimous support for the final product.

Assigning an owner to each criterion is also a good idea, as the owner can be the arbiter if there's a disagreement about a particular item's ability to check that box. This reinforces consistency and removes any doubt from the equation.

And like all well-intentioned methodologies, a Definition of Done should be as simple and short as possible. The idea is to create consistent quality and not bureaucratic hurdles that slow things down unnecessarily.

“The DoD is a contract between the product owner and the team, so it’s tempting to want to fit as many items in the DoD as possible in order to ensure the quality of the product. But this can backfire,” says Yves Riel of Okapya. “When teams are confronted with too many DoD items, they either work only on a subset or try and fail to do all of them, eliminating the value of establishing the DoD in the first place.”

Your mileage may vary

The Definition of Done primarily deals with code and its fitness for release. But as a product team, you're definitely not done when something ships, so you'll need to create your own definition that extends much further into the product's lifecycle.

Metric-based goals such as adoption, usage, retention, or revenue could be the signifiers that a feature is "done," or it could be as simple as the requesting customer agreeing that it meets their requirements. And given that user feedback and analytics (not to mention UX feedback or changes in business models) may drive additional development, the engineering team should be prepared to revisit items it previously deemed "done."