User:BDavis (WMF)/Notes/Definition of Done

Definition of Done is a concept used in Scrum to help a team make more robust software.

Definition of Done
The definition of Done serves as a contract between the members of the development team, the organization and its clients to ensure that a high-quality product with minimal defects is designed, developed, deployed and supported by the organization. Any increment meeting the definition of Done should be considered potentially deliverable to the customer.

This list taken as a whole looks pretty daunting. It turns out that producing production-ready software for a high-traffic web site is hard work. It is such hard work that it takes a group of well-trained individuals working as a team to complete properly. This list is a recipe that can and should be used by the team to ensure that they produce an increment that is worthy of their combined energy. When used properly it will increase the reputation and worth of the team, their product and the organization.

This particular definition of Done is written from the perspective of a cross-functional team responsible for implementing features in a product. It does not include Done criteria for the operations or support teams that will maintain the deployed software or assist customers in its use. It does, however, include deliverables that must be produced by the development team to support those additional teams.

Done with Grooming a Story
A groomed story is clear, feasible and testable.


 * Business Goal described
 * Why will we build this?


 * Acceptance criteria defined
 * What will it do?


 * Tasks identified
 * How will we do it?


 * Story points estimated
 * What will it cost?

It may take several iterations to achieve this level of clarity. In fact, anything that can be groomed quickly is necessarily trivial: it may still take significant time to implement, but only as a variation on work that has already been done and is understood by the whole team.

Themes, Epics and large stories will need to be decomposed into smaller parts. This must happen recursively until the smallest parts are describable using the criteria established above.

Spikes or other research may need to be done to remove uncertainty about new tech or legacy impact. These are stories in their own right and should be treated as such. R&D must be a traceable expense and is just as important as the final product/feature.

Done with a Story

 * Everything from "Done with Grooming a Story"
 * A story must be groomed before it can be implemented.


 * Design complete
 * Design is not one-size-fits-all. Some stories must have UML and detailed functional descriptions. Others will only need a statement of "do this just like we always do an X feature." The level of design required should be determined during grooming by the team.


 * Design artifacts in wiki/bugzilla/other
 * Design isn't complete until its tangible artifacts are available to the team and the business.


 * Design reviewed by peers
 * Similar to a code review, the design should get a once-over by at least one tangentially involved party to ensure that the level of detail is appropriate to the story and that the proposed implementation makes sense.


 * Code complete
 * All code for the story has been written.


 * Unit tests written
 * Unit tests have been written to verify that the code works at the most basic level. This can be done via TDD or code-then-test as best suits the team and the story.
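
To make the "most basic level" concrete, here is a minimal sketch of a unit test. The product itself is PHP/MediaWiki, but the idea is language-neutral; Python's unittest is used here for illustration only, and slugify() is a hypothetical helper invented for this example, not part of the actual product.

```python
# Minimal unit-test sketch (works the same whether written via TDD
# or code-then-test). Run with: python -m unittest this_module
import unittest

def slugify(title):
    """Convert a page title to a URL-friendly slug (hypothetical helper)."""
    return title.strip().lower().replace(" ", "_")

class SlugifyTest(unittest.TestCase):
    def test_spaces_become_underscores(self):
        self.assertEqual(slugify("Definition of Done"), "definition_of_done")

    def test_surrounding_whitespace_is_dropped(self):
        self.assertEqual(slugify("  Sprint  "), "sprint")
```

Each test verifies one observable behavior, which keeps failures easy to diagnose when the story's code changes later.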


 * All code checked into version control
 * Feature code and tests are committed to version control.


 * All unit tests passing
 * Unit tests are passing in all testable environments.


 * Automated code checks passing
 * PHPCS, lint and other common automated code quality measurements are passing according to the organization's definition of passing.
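
As an illustration of the pass/fail gate these tools provide, here is a hedged sketch of the simplest possible "lint": a syntax check. The real pipeline runs PHPCS and lint against PHP code; this Python stand-in only shows the shape of an automated check that either passes or blocks the story.

```python
# Hypothetical minimal "lint": does the source even parse?
# Real checks (PHPCS, php -l) add style and quality rules on top of this.
import ast

def passes_syntax_check(source):
    """Return True if the source parses cleanly, False otherwise."""
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False

print(passes_syntax_check("x = 1"))    # True
print(passes_syntax_check("x = = 1"))  # False
```

The organization's "definition of passing" then amounts to choosing which rule sets a commit must clear before it counts as Done.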


 * CI tests passing
 * Automated tests in the continuous integration environment are passing.


 * Peer code review completed
 * A code review has been completed involving at least one tangentially involved party.


 * Material defects from code review addressed
 * All questions and defects raised in the code review have been addressed.


 * All acceptance tests (manual and automated) identified, written and passing.
 * Given/When/Then style or other detailed acceptance tests for the story have been written and verified either with automated tests or manual testing. Automated tests are preferred as they do not increase the overall manual testing load of the product.
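
A Given/When/Then acceptance test maps directly onto an xUnit-style test body. The sketch below uses Python's unittest purely to show the structure; the Wallet class is hypothetical and not part of the actual product.

```python
# Given/When/Then acceptance-test sketch.
# Run with: python -m unittest this_module
import unittest

class Wallet:
    """Hypothetical domain object used only for this illustration."""
    def __init__(self, balance=0):
        self.balance = balance

    def spend(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

class WalletAcceptanceTest(unittest.TestCase):
    def test_spending_reduces_balance(self):
        # Given a wallet holding 100 units
        wallet = Wallet(balance=100)
        # When 30 units are spent
        wallet.spend(30)
        # Then 70 units remain
        self.assertEqual(wallet.balance, 70)

    def test_overspending_is_rejected(self):
        # Given a wallet holding 10 units
        wallet = Wallet(balance=10)
        # When more than the balance is spent, Then the spend is refused
        with self.assertRaises(ValueError):
            wallet.spend(50)
```

Because each clause of the acceptance criterion becomes a line of the test, a passing suite doubles as evidence that the story's criteria were met.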


 * Help/documentation updated
 * "Just enough" help and documentation has been produced so that the feature can be used by clients, maintained by developers and supported by customer service and operations.


 * Release notes updated
 * Deliverable artifacts and deployment procedures have been documented.

Done with a Sprint

 * Everything from "Done with a Story"
 * All stories in the sprint must be done (or returned to the backlog) for the Sprint to be done.


 * Released to beta/integration environment
 * The deliverables identified in the release notes for the Sprint must be deployed in the beta/integration environment.


 * Demoed in beta/integration environment (UAT)
 * The demonstration of the increment to Product Owner and other Stakeholders must be performed from the beta/integration environment.


 * Approved by Stakeholders
 * The increment must be approved following UAT.


 * CI/automated tests passing
 * All automated tests against the product must be passing.


 * Integration tests passing
 * Manual integration tests for the product must be passing.


 * Regression tests passing
 * Manual regression tests for the product must be passing.


 * Code coverage for automated tests meets acceptable guidelines.
 * Code coverage measurements for unit tests must be within acceptable ranges.


 * Performance tests passing
 * Performance/scaling tests must return results within acceptable ranges.


 * Diagrams/documentation updated to match final state
 * Documentation for design, implementation, deployment, support and use must be updated to match the completed increment.


 * Bugs closed or pushed into backlog
 * Defects identified in UAT, QA and development must be resolved or appended to the backlog for Product Owner triage.


 * Unfinished stories pushed into backlog
 * Any work in the sprint which does not meet this definition of done will be returned to the backlog. The Sprint isn't done as long as any non-done issues are associated with it.

Done with a QA/Staging Release

 * Everything from "Done with a Sprint"
 * All Sprints that are to be included in the release must be Done.


 * Operations guide updated and approved by Ops
 * The support documentation delivered to Ops via the wiki must be updated and those updates must be approved (UAT) by the Operations team.


 * Automated tests passing
 * All automated tests available for the QA/Staging environment must be passing.

Done with a Production Release

 * Everything from "Done with a QA/Staging Release"
 * A successful QA/Staging release is a prerequisite for a Production release.


 * Stress/Load tests passing
 * Stress/Load testing in the QA/Staging environment must return results within acceptable ranges.


 * Network/Component diagrams updated
 * Documentation for design, implementation, deployment, support and use must be updated to match the proposed release.