
Agile Buzz Forum
Built it in

Simon Baker

Posts: 1022
Nickname: sjb140470
Registered: Jan, 2006

Simon Baker is an independent consultant, agile coach and scrum master
Built it in Posted: Aug 18, 2006 11:20 AM

This post originated from an RSS feed registered with Agile Buzz by Simon Baker.
Original Post: Built it in
Feed Title: Agile In Action
Feed URL: http://feeds.feedburner.com/AgileInAction
Feed Description: Energized Work's blog.

It's worth being test-driven and working from the outside in, starting with automated acceptance tests. And it's worth getting done within the iteration, which means performing any supplementary testing, such as ad-hoc testing of each user story, inside the iteration itself. This means that testing responsibilities and skills are an intrinsic part of the team and are colocated.

Why?

There are many reasons. Here are some…

1. You build quality in rather than inspecting for it afterwards. And building it in produces a higher standard of quality than trying to retrofit it later.

2. By not having trailer-hitched testing, the lifecycle is shortened, enabling you to deliver more quickly.

3. Performing ad-hoc testing within the iteration means you avoid the tail-end crunch that usually results in cutting the amount of testing, slipping the delivery date, or both.

4. Testing within the iteration and resolving defects as they occur keeps the defect count low.

5. Keeping the defect count low means that defects can be managed more collaboratively within iterations. Write the defects on index cards and use face-to-face communication to shepherd them through to resolution. This is a more effective way to get defects resolved. Bug tracking systems provide an auditing process that is required when you have defects ping-ponging between separate development and test teams. By not having trailer-hitched testing, you can eliminate this waste.

6. Automated acceptance tests provide less expensive regression testing. They are quicker and more exhaustive than manual regression testing. (That's not to say that you won't require some eyeballing using ad-hoc testing, but this can be minimised and focused on the things that humans do better than machines.)

7. Creating automated tests in a test-driven fashion means your automated regression tests grow in sync with the code.

8. Automated acceptance tests can be run regularly through continuous integration, which means regressed code can be caught early and fixed promptly. This reduces the feedback time for defect resolution.

9. Automated regression testing through continuous integration, executed automatically on every code check-in, gives you confidence to make changes and move forward quickly. (See the sketch after this list.)
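
To make points 7 to 9 concrete, here is a minimal sketch of the kind of test a test-driven team accumulates and runs on every check-in. It uses plain JUnit 4; the DiscountCalculator class and its discountFor method are hypothetical, invented only for illustration.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Written before the production code exists: it fails first and drives the
    // implementation. Once green, it becomes part of the regression suite that
    // the continuous integration build executes on every check-in.
    public class DiscountCalculatorTest {

        @Test
        public void largeOrdersGetAFivePercentDiscount() {
            assertEquals(25.00, new DiscountCalculator().discountFor(500.00), 0.001);
        }

        @Test
        public void smallOrdersGetNoDiscount() {
            assertEquals(0.00, new DiscountCalculator().discountFor(100.00), 0.001);
        }
    }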

Using FIT or FitNesse…

10. Automated acceptance tests are easily understood by business people because they're written in plain English using business domain language and are formatted in tables (see the example after this list).

11. Designing automated acceptance tests facilitates the collaboration between the customer, the developers and the testers necessary to reveal the details of the user stories.

12. Automated acceptance tests provide executable requirements.

13. Automated acceptance tests are less ambiguous because they capture the details of a requirement as concrete examples of required behaviour rather than as rules stated in a requirements specification document, which can be interpreted differently by readers.

14. Automated acceptance tests provide a single source for requirements and acceptance tests, thereby eliminating the duplication and divergence between a product requirements specification and the acceptance test cases.

15. Because they're executable, automated acceptance tests either pass or fail. There is no interpretation of results requiring a tester to determine whether the system satisfied an ambiguous requirement statement in the specification document.
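
As an illustration of points 10 to 15, here is a minimal FIT sketch, reusing the same hypothetical DiscountCalculator from the earlier JUnit example. The table is what the customer reads and writes; the fixture is the thin glue that makes it executable. FIT's ColumnFixture binds each plain column to a public field and each column ending in "()" to a public method (FitNesse also accepts a trailing "?"), then colours each cell green or red depending on whether the expected and actual values match, so there is nothing to interpret.

    // The acceptance test table, as the business sees it:
    //
    //   |eg.Discount              |
    //   |order total |discount()  |
    //   |100.00      |0.00        |
    //   |500.00      |25.00       |

    package eg;

    import fit.ColumnFixture;

    // Hypothetical fixture: FIT maps the "order total" column to the orderTotal
    // field and the "discount()" column to the discount() method, comparing the
    // returned value with the expected cell on each row.
    public class Discount extends ColumnFixture {
        public double orderTotal;

        public double discount() {
            return new DiscountCalculator().discountFor(orderTotal);
        }
    }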

