This post originated from an RSS feed registered with Agile Buzz
by Dave Churchville.
Original Post: Are Functional Specs Redundant?
Feed Title: Agile Project Planning
Feed URL: http://feeds2.feedburner.com/AgileProjectPlanning
Feed Description: Thoughts on agile project planning, Extreme programming, and other software development topics
Should an agile project team write functional specifications? What about design documents?
The drumroll please....
And the answer is a resounding "It depends!"
For most agile methods, a feature begins as a "story" as told by a customer, and captured in some format, possibly an index card, spreadsheet, or more sophisticated planning tool. This story is considered a placeholder for the detailed discussions that will take place between the developers and the customer when implementing that story.
But in a situation without an onsite customer and frequent contact, there is certainly some value in capturing a bit more detail about the story. This can take the form of acceptance tests, which are basically the set of things that must be true in order for your story to be considered "done."
However, in many organizations, there is a separate QA or test team that is expected to write test cases, run regression tests, and be able to trace their tests back to a set of documented requirements. The QA team often needs formal documentation in order to keep this all up to date. Now, some people would argue that they shouldn't be doing this, but for various reasons (regulatory or contractual, primarily) this is sometimes a real world requirement.
So for this situation, a set of detailed specifications is needed before any test cases are written, and the tests must map back to requirements. This typically means that business rules, user interface nuances, and other details must be discussed in the requirements, the specifications, and the test cases.
But doesn't this violate the Don't Repeat Yourself (DRY) principle? Why can't we skip the specifications, and just write acceptance tests? And given a full set of acceptance tests, slowly work towards automating as many as possible?
Well, one way of looking at this is that you've got increasing levels of detail and rigor as you work from Requirements to Specifications to Tests.
Requirement: The system shall accept orders from the web
Specification: The web data entry form will collect the user's name, credit card number, address, etc.
Acceptance Test: Login and go to the order entry page. Enter a credit card number. The system should validate it, and save the order to the database.
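To make the "increasing rigor" idea concrete, here's a minimal sketch of that acceptance test as automated code. The `OrderSystem` class, `validate_card`, and `place_order` are hypothetical names invented for illustration; a real suite would drive the actual web UI or service API rather than an in-memory stub.

```python
class OrderSystem:
    """In-memory stand-in for the order entry system under test."""

    def __init__(self):
        self.orders = []  # stands in for the order database

    def validate_card(self, card_number):
        # Luhn checksum, the standard credit card number check
        digits = [int(d) for d in card_number if d.isdigit()]
        if len(digits) < 13:
            return False
        checksum = 0
        for i, d in enumerate(reversed(digits)):
            if i % 2 == 1:
                d *= 2
                if d > 9:
                    d -= 9
            checksum += d
        return checksum % 10 == 0

    def place_order(self, card_number, items):
        if not self.validate_card(card_number):
            raise ValueError("invalid card")
        self.orders.append({"card": card_number, "items": items})


# Acceptance test: enter a valid card number, the system validates it
# and saves the order.
system = OrderSystem()
system.place_order("4111 1111 1111 1111", ["widget"])
assert len(system.orders) == 1  # order was saved

# And the flip side: an invalid card number is rejected, nothing saved.
try:
    system.place_order("1234", ["widget"])
    assert False, "invalid card should have been rejected"
except ValueError:
    pass
assert len(system.orders) == 1
```

Note how the test pins down details (what "validate" means, what happens on failure) that the one-line requirement and the specification leave open.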
So when trying to communicate with other team members, you'd pick the level of detail appropriate to the task. Acceptance tests, especially the automated kind, are great for developers to exercise the system, but the format that's useful to developers may be difficult for stakeholders to understand.
The key is to know your audience. If you've got remote customers, you may need to communicate with documents, so here a functional specification has a lot of value. Similarly, if you're using an offshore development team, it's going to be even more important to write down the specifications in detail if you hope to get close to what you want.
On the other hand, a small team with onsite customers, and a commitment to automated acceptance tests might not need as much documentation.
Rule of thumb: Before writing any document, ask yourself "Who is this for, and how does it help them?" If you can't answer, or the answer is "No one, and it doesn't", then don't write it. If you have a clear need, go ahead and create the document, but be sure to have a strategy for keeping it up to date, or discarding it when it's past its useful lifetime.