“Done”. “Done Done”. When is a feature really done?

At Telerik, just like at every other software development shop, we have struggled with the definition of "Done". Over the years we've had many definitions of "Done" among teams and individuals:

"It's done, but it's not tested" (Testing is left for the next sprint).

"Yeah, it's working well. There are a few blocking issues but we'll fix that later" (Bug fixing is always a "tail" task and is never budgeted as part of the v1 of a "Done" feature).

"Sure, it's 100% done but it doesn't have any UI yet" (Missing core deliverables of the feature like a UI for an end-user feature and defining done as having the API done).

"It's kinda ugly but it works, right?" (Zero focus on usability).

"It works on my machine so it's done" (Doesn't take into account the differences between the local and target deployment environment).

"Yeah, the API is a bit awkward and customers won't get it but it works" (The feature does not work in a way the customer expects it to work).

I am sure that most of you have experienced frustration when everyone's definition of "Done" is very different. In addition to the tension it creates, it has many cascading effects on planning, as well as on trust between team members and with customers. After some analysis, we identified the real culprit - everyone interpreted "Done" only from their own perspective. There was no good flow from "Done" on the personal level to "Done" from the customer's point of view.

Seeing this happen, we came to the conclusion that we needed a more uniform definition of "Done" that we could push across the company. We sat down and tried to find the common denominator. While we knew there would be variations from product to product, we came up with the following generic checklist items to try to define "Done" from a product team's perspective:

Acceptance tests identified and written. The feature must cover the use cases/stories for the customer. Everything starts from here - picking the correct minimal set of features to accomplish the use cases is the real key as it drives true customer value. You need to think about the problem you are solving - the feature by itself is not providing value to the customer.

All of the must-have components of features are implemented. Every feature should deliver a complete benefit to the customer. For example, often teams would complete 9 out of 10 tasks for a feature and would miss a key task that allows the feature to provide true benefit to the customer.

All the code and documentation artifacts for the feature have undergone review and have been moved from the feature branches to the main trunk in source control.

Feature is covered with appropriate amounts of unit and integration tests and you have a "green" automated build that compiles and runs these tests.

Feature is covered with automated UI tests to prevent future regressions caused by factors that are often outside of our control (e.g. browser updates, library upgrades, etc.).

Feature is tested in a near-real environment; for example, forum software should be tested with 1 million threads and hundreds of simultaneous users (the definition of "near-real" depends, of course, on the feature, scope and target).

All public APIs have a real-world example; if an example cannot be thought of, the API should not be public. A public-facing API without an example is only a regression risk.

Release notes (internal) and/or a video have been posted so that colleagues are aware of the feature and implementation.

It's live in the test environment and there's a build or URL where you can see the feature live.

There are no blocking bugs that hinder the adequate usage of the software.

There are examples and documentation on how to properly use the feature.

And so on…
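To make the "acceptance tests" and "green automated build" items above concrete, here is a minimal sketch in Python. Everything in it is hypothetical - the "discount code" feature, the function names and the values are illustrative only, not Telerik code. The point is that the tests are written up front from the customer's use cases and run on every build, so the build is "green" only when the customer-facing behavior is verified:

```python
# Hypothetical example: acceptance tests for an imaginary "discount code"
# feature. Names and values are illustrative, not from any real product.

def apply_discount(price: float, code: str) -> float:
    """Apply a discount code to a price; unknown codes leave it unchanged."""
    discounts = {"SPRING10": 0.10, "VIP25": 0.25}
    return round(price * (1 - discounts.get(code, 0.0)), 2)

# Acceptance tests: identified and written before the feature is "Done",
# then run as part of every automated build.
def test_known_code_reduces_price():
    assert apply_discount(100.0, "SPRING10") == 90.0

def test_unknown_code_is_ignored():
    assert apply_discount(100.0, "BOGUS") == 100.0

if __name__ == "__main__":
    test_known_code_reduces_price()
    test_unknown_code_is_ignored()
    print("all acceptance tests passed")
```

In practice these would live in a test runner (NUnit, pytest, etc.) wired into the build, so a red test blocks the feature from being called "Done".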

It was a long list of prescriptive guidance. Each team picked the items that made the most sense to them, but there was an element of commonality: every item is designed to ensure that nothing gets in front of a user of our products with known critical bugs, or without meeting the acceptance criteria set for it. This is the primary reason for a definition of done. It ensures everyone shares a common understanding of what it means to be done, and it ensures that the "done" stuff that gets to a user works as intended.

The above established the "framework", but we found another problem - teams were not always conscious of the entire flow of a feature, from the time it is conceived to the time a customer can start using it. What was the obvious way to solve the problem? To extend the definition of "Done" further. We came up with our own definition of "Done Done" to capture the notion that "Done" is really done only when a customer can benefit from the feature:

It's shipped. Only shipped features are really done. Even the nicest and most tested and complete feature is not truly done if it's not in the hands of customers.

It's discoverable. This starts much earlier and there are many artifacts that need to be produced such as marketing collateral, website copy, newsletters, promotion through social media, docs, SDK, examples, etc. It's a team effort, not just an engineering effort and everyone needs to prepare the stuff that will allow customers to come across the feature. A feature is useless if customers don't know about its existence.

There are mechanisms to capture feedback and/or telemetry, so we can understand whether the feature is really used, whether it's relevant for customers and where it needs improvement, so that we can run future cycles on it.
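As a rough sketch of the feedback/telemetry item - assuming a hypothetical in-process usage counter rather than any real telemetry product - recording per-feature usage is what makes it possible to tell which shipped features customers actually touch and which need another cycle:

```python
# Hypothetical sketch of feature-usage telemetry. The class and feature
# names are illustrative; a real system would ship events to a backend.
from collections import Counter
from datetime import datetime, timezone

class FeatureTelemetry:
    def __init__(self):
        self.events = []          # raw event log
        self.usage = Counter()    # per-feature usage counts

    def record(self, feature: str, user_id: str) -> None:
        """Log one use of a feature by one user."""
        self.events.append({
            "feature": feature,
            "user": user_id,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.usage[feature] += 1

    def unused(self, shipped_features) -> set:
        """Shipped features no customer has touched yet."""
        return set(shipped_features) - set(self.usage)

t = FeatureTelemetry()
t.record("export-to-pdf", "user-1")
t.record("export-to-pdf", "user-2")
print(t.usage["export-to-pdf"])                  # 2
print(t.unused(["export-to-pdf", "dark-mode"]))  # {'dark-mode'}
```

The `unused` check is the interesting part: a feature that is "Done Done" on paper but never appears in the usage counts is a signal that discoverability, not engineering, is the gap.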

After creating this comprehensive guideline, we went back to the drawing board and tried to summarize the long bulleted definition of "Done Done" in a single sentence, to make sure we clearly identified the core problem we are solving and to make the basic premise easy for everyone to understand:

"Each product group at Telerik is empowered to define "Done" in a way that ensures that every feature and every release meets a minimum state of completeness that allows customers to find, properly use and genuinely appreciate the new release of our software."

Today each team has its own definitions of "Done" all through the lifecycle but the common denominator is that it culminates in "Done" from the perspective of our customers. It might sound like a small change of focus/perception but it has had a tremendous impact on results, from customer loyalty and happiness, to reduced support costs due to higher quality.

15 Comments

Steve Smith
29 Mar 2012

Definition of Done is an important part of agile software development. I think it's best when each team is able to define it for themselves, but I also think it needs to be centered around value to the customer. It sounds like your approach does a good job of capturing both of these goals, and more importantly, that you're seeing positive results from making these definitions explicit. There are some good resources, especially in the comments, on this topic here as well: http://www.allaboutagile.com/definition-of-done-10-point-checklist/

Vassil, great post and great timing. I was exactly in the middle of reviewing what "done" means for my team, and I've come to some of the same conclusions. "Done" is different for every person, and this is due to the "delegation" nature of work. I think good teams should be aware of everyone's responsibility in the team and everyone's contribution to the greater "done done". If someone's doing dev and another is doing release management, the team is "done done" when both are "done". I'd like to find ways to improve coherence in my team, and I'll start by better defining what "done done" is and how each team player fits into that overall goal. Thanks for the post; it would be great to hear how things improve, by how much, and your thoughts on what has contributed the most.

Francis
30 Mar 2012

Thanks for this post. As a team member, I always had my own definition of the word "Done", even when the team leader hadn't verified it, but in later years I have learnt to be careful with that word and have redefined it in terms of team coordination and customer benefit, even when the team leader certifies it.

Nice post. The definition of done is one of the most critical aspects of agile development. There should never be ambiguity around whether a story is finished or not. I'm happy to see that you allow teams to derive their own definitions of done. I agree that a universal definition is not achievable or desirable.

Thanks for the comments, guys. @Radi - as in many other cases, the big change and positive impact did not come from fixing a single big issue. My experience has been that whether it's a non-performing product or an organizational issue, there are a thousand small issues that need a thousand small solutions. I haven't seen quick wins for bigger problems. My advice has always been to load up on patience and work on the sequencing - start with the things that are easiest to fix as well as the most problematic ones. Then iterate until you tame the problem. The above blog post is a short piece that might sound like a standard problem with a standard solution, but it took us years to successfully reform our practices. Even today it is a work in progress, as every day we are learning new things.

Greg
02 Apr 2012

We went through a lot of this years ago. Ultimately we started using the term "Done, done" and actually defined what that meant: it's not "Done, done" until the following requisites are met. After that, it wasn't an issue.

My principle has been to throw a rope around a project and declare what it is to be done. The idea is to deliver a product. I once saw a friend lose the chance to be a millionaire with a product he was creating, because his programmer could never deliver a working version of the code. He kept seeing features in other products and would insist on putting them in, even if it meant rewriting major sections of the program. If the programmer had been working for me, I'd have insisted on a set of features that must be delivered, and once he had done that, he could add all the other features. I ask my clients to give me their wildest fantasies and then we scale them back to what can be delivered; knowing where the client ultimately wants to get to makes it a lot easier to structure the code to include those features. I use the example of a builder: "Don't tell me you want a wet bar on the other side of the room after I've laid the concrete slab floor." That usually gets the point across. A project is done when you can deliver it. After that, you can add on the features and documentation that are needed. Just make sure you keep all your design notes, in whatever format, so you can do so. No point in making work for something that will never be needed.

@Curt - excellent points! That's what most people miss - "Done" and perfect are not the same. If you decide to chase perfection, you will keep rewriting your software and you will never ship. That's why you put boundaries in place: you define the minimum acceptable threshold the team has to meet, and you ship. If you want to make it even better, you do another cycle.

I agree with Curt and Vassil above. Done really is about quality of delivery, not necessarily quantity. A calculator can be "done" in v1 once it supports add, multiply, subtract and divide, consistently, 100% of the time, to an end user's expectations. Could it do more? Certainly. In the next release.

Good article. But I'm wondering about the definition of "Done" for some "community edition" software tools (I won't say which, so as not to shame the guilty). Some tools (libraries) that I've downloaded reference files that exist only on the developer's PC!

@roberto - I can't comment with anything concrete without knowing what product(s) you are referring to. Community Edition or paid version, the quality level should be the same. Every product that bears the Telerik name has to meet certain standards. If it does not, it's a problem and we need to understand why something slipped and fix it. In any case, don't treat the article as gospel. Its intent was to share what we are doing and why, rather than to promote the notion that we are perfect. We try to be, but we've had our fair share of mishaps. It's mainly through those painful experiences that we've improved our internal processes.

R. Anders
11 Apr 2012

Great article, Vassil! I've always believed that the definition of done is the most important part of any organization. Do you believe this is so for Telerik's native ecosystem?