Last week I challenged Agile Tool Vendors, and the post got a fair bit of attention: I got replies from Rally (How Agile is Rally?), Danube (How Agile is Danube?), and MicroTool (How We Use Agile to Develop Our Tools).
In addition, there were several comments with replies. Robert Dempsey:
CEO and Founder of Atlantic Dominion Solutions. We developed Scrum'd: http://www.scrumd.com. A great post and an excellent challenge. To answer your questions:
# Your Definition of Done
We define done for each user story. Acceptance criteria include the behavior a user can expect from the app when using it, the workflow for the feature, that it needs to be tested (and how it will be tested), any validations that occur, and that any documentation for the feature is updated.
# Do you use TDD? Or at least Unit Testing?
We do full TDD for all of our stories. We didn't start out that way; however, now that we have a full test suite, we keep it that way. Also, if a bug is found, a test is written to reproduce it before the code is fixed.
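To make that bug-first rule concrete, here is a minimal sketch of the flow in JUnit. The Story class, its fields, and the scenario are all hypothetical, invented for illustration; this is not Scrum'd's actual code:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical domain class standing in for whatever the real app uses.
class Story {
    private int remainingHours;

    Story(int estimateHours) {
        this.remainingHours = estimateHours;
    }

    void logWork(int hours) {
        // The eventual fix: clamp at zero instead of letting the estimate go negative.
        remainingHours = Math.max(0, remainingHours - hours);
    }

    int remainingHours() {
        return remainingHours;
    }
}

public class StoryEstimateRegressionTest {

    // Written *before* the fix: this test reproduced the reported bug
    // (the remaining estimate went negative); the clamp above then made it pass.
    @Test
    public void remainingEstimateNeverGoesBelowZero() {
        Story story = new Story(8); // 8 hours estimated
        story.logWork(10);          // reproduce the bug: log more work than remains
        assertEquals(0, story.remainingHours());
    }
}
```

The point is the ordering: the test fails against the buggy behavior first, and the fix is whatever makes it pass without breaking the rest of the suite.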
# What do you do for Acceptance Testing?
We have staging servers set up for internal testing. We also let a few select customers into our staging servers, and I show (in person) some of our current and potential clients the features to get their feedback as well. In addition, we use Get Satisfaction for our support site and ask commenters who suggested features we've implemented to check out the release and see if it's what they wanted.
# How often do you release?
This depends. We aim for monthly releases. Sometimes it takes longer, but typically it's less. The release schedule really depends on the importance of features to our user community.
# What did you learn in your last retrospective?
You don't always get it right the first time; however, anything you do release needs to work correctly. Also, continuous integration is a huge help as your product gets bigger, and of course testing is an absolute necessity. We would rather delay a release to ensure quality than ship something that doesn't work.
Marcin Niebudek (creator of TinyPM):
I like challenges, so...
Our DoD for stories:
- code meeting the minimal requirements committed
- unit tests passing
- feature working under Firefox, Chrome, and Opera
- feature working without any significant failures under IE
- feature tested and accepted by at least one team member who did not implement the story
Our DoD for release:
- WAR and .exe distributions built successfully
- all unit tests passing
- all functional bugs fixed
- all UI bugs fixed, or small UI bugs (mostly in IE) at least scheduled to be fixed
- installation and upgrade documentation updated
- installation tested on Windows and Linux
- new features list updated on the product website
Do we use TDD? Well, I would not say so, because our coverage is not 100%, which means some of our code (glue code) is not unit tested. But yes, we do unit testing in a "test first" style for all domain and business logic code, and we are shifting toward a BDD style right now.
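The shift from "test first" to BDD style that Marcin mentions is largely a change in how tests are named and structured rather than in tooling. Here is a minimal JUnit sketch, with a hypothetical Iteration class that is not TinyPM's real model, showing the same rule expressed both ways:

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

// Hypothetical domain object; TinyPM's real model will differ.
class Iteration {
    private final int capacity;
    private int committed;

    Iteration(int capacityPoints) {
        this.capacity = capacityPoints;
    }

    boolean commit(int storyPoints) {
        if (committed + storyPoints > capacity) {
            return false; // over capacity: reject the story
        }
        committed += storyPoints;
        return true;
    }
}

public class IterationTest {

    // Classic test-first style: the test name states the rule being driven out.
    @Test
    public void commitRejectsStoriesBeyondCapacity() {
        Iteration iteration = new Iteration(10);
        assertTrue(iteration.commit(8));
        assertFalse(iteration.commit(5)); // would exceed capacity
    }

    // The same rule with BDD-flavoured given/when/then naming,
    // which is the stylistic shift described above.
    @Test
    public void givenNearlyFullIteration_whenCommittingTooLargeStory_thenItIsRejected() {
        Iteration iteration = new Iteration(10); // given an iteration with capacity 10
        iteration.commit(8);                     // and 8 points already committed
        boolean accepted = iteration.commit(5);  // when a 5-point story is committed
        assertFalse(accepted);                   // then it is rejected
    }
}
```

Same assertions, same domain code; the BDD version just reads as a specification of behavior rather than a check of a method.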
As our Product Owner is also one of the active developers, we tend to verify features for acceptance within the team, since we have a strong, shared vision of the product (see also the acceptance criterion in our DoD for stories). It's a low-ceremony process (as are all our processes in the team).
How often do we release? Every 2-3 months.
What have we learned from our last retrospective? That we need to shift to feature branches so that we can release small stories more often while the big ones are still in progress.
Do you see any flaws or warning signs in what I've posted here? Go on, say it... we're always happy to improve...
It would be fun to hear from Target Process, the ThoughtWorks Mingle team, VersionOne, AgileBuddy, and anyone else I might have missed.