The Artima Developer Community

Agile Buzz Forum
Integration Testing: The Story Continues

Simon Baker

Posts: 1022
Nickname: sjb140470
Registered: Jan, 2006

Simon Baker is an independent consultant, agile coach and scrum master.
Integration Testing: The Story Continues Posted: Feb 1, 2010 10:27 AM

This post originated from an RSS feed registered with Agile Buzz by Simon Baker.
Original Post: Integration Testing: The Story Continues
Feed Title: Agile In Action
Feed URL: http://feeds.feedburner.com/AgileInAction
Feed Description: Energized Work's blog.

Over the past few months I've been reading the 'Integration Tests Are A Scam' series of articles by J.B. Rainsberger and following some of the responses to it, such as this one by Steve Freeman. I put in my 2 cents a few days ago, which I've reproduced here:
Interesting series of articles & comments. I also read Steve Freeman’s article in response to the same topic. It’s got me thinking about how we work and I thought I’d take the time to describe it here.

You define an integration test as “… any test whose result (pass or fail) depends on the correctness of the implementation of more than one piece of non-trivial behavior.” We have many components that exhibit such non-trivial behaviour in the products we create, many of which are not developed by us, and we have integration tests to verify they work. I’m not just talking about 3rd-party libraries and frameworks here, I’m referring to the whole system: caching layers, load balancers, DNS servers, CDNs, virtualization etc. When we build software it only becomes a product or service for our users when it has been deployed into a suitable environment; an environment that typically contains more than just the software we have written and packaged. Since our users’ experience and perception of quality result from their interaction with a deployed instance of the whole system, not just their interaction with the software at a unit level, we have come to value end-to-end integration testing. I believe there’s merit in testing these components in concert and will attempt to clarify what kind of integration testing I’m talking about.
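
To make that concrete, here's a minimal sketch of the kind of end-to-end check I mean: a test that exercises the deployed system through its front door, so a pass depends on DNS, the load balancer, the caches and at least one application node all working together. The URL and the use of plain JUnit 4 are my assumptions for illustration, not a prescription.

    import static org.junit.Assert.assertEquals;

    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    public class DeployedSystemSmokeTest {

        // Hypothetical front-door address of a scaled-down replica of production.
        private static final String BASE_URL = "http://test-env.example.com";

        @Test
        public void homePageIsServedThroughTheWholeStack() throws Exception {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(BASE_URL + "/").openConnection();
            connection.setRequestMethod("GET");
            connection.setConnectTimeout(5000);
            connection.setReadTimeout(5000);

            // A 200 here means the whole chain (DNS, load balancing, app
            // server) is wired together, not just one unit in isolation.
            assertEquals(200, connection.getResponseCode());
        }
    }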

For a given piece of functionality we write an executable acceptance test in human-readable form (for web projects we typically use some domain-specific extensions to Selenium; for services we have used FIT and its ilk; sometimes we roll our own if there’s nothing expressive enough available). We run it against a deployed version of the application (usually local, though not always), which typically has a running web/application server and database. The test fails. We determine what endpoint needs to be created/enhanced and then we switch context down into unit-test land. A typical scenario would involve enhancing a unit test for the URL mappings, adding one for the controller, then one for any additional service, domain object etc. When we’re happy and have tested and designed each of the required units, we jump back up a level and get our acceptance test to progress further. The customer steers the development effort as he sees vertical ‘slices’ of functionality emerge.

The acceptance test is added to a suite for that functional area. The continuous build system will then execute that test against a fully deployed (but scaled-down) replica of the production environment, with hardware load balancer, VLANs, multiple nodes (session affinity) and so forth. Any additional environmental monitoring (e.g. Nagios alerting) is also done as part of this development effort and is deployed into the test environment along with the updated code.
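
To illustrate the two levels of this cycle, here's a hedged sketch (the element ids, class names and balance value are invented for the example; the real tests use our domain-specific Selenium extensions): an outer acceptance test driving a deployed instance through Selenium WebDriver, and an inner unit test designing the controller against a hand-rolled service stub.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    // Outer loop: a human-readable scenario run against a deployed instance.
    class ViewBalanceAcceptanceTest {

        @Test
        public void customerCanSeeTheirBalance() {
            WebDriver browser = new FirefoxDriver();
            try {
                browser.get("http://localhost:8080/account/42"); // deployed app
                assertEquals("100.00",
                        browser.findElement(By.id("balance")).getText());
            } finally {
                browser.quit();
            }
        }
    }

    // Inner loop: when the acceptance test fails, drop down and design the
    // next unit it needs, e.g. a controller tested against a stubbed service.
    interface AccountService {
        String balanceFor(int accountId);
    }

    class AccountController {
        private final AccountService service;

        AccountController(AccountService service) {
            this.service = service;
        }

        String balanceFor(int accountId) {
            return service.balanceFor(accountId);
        }
    }

    class AccountControllerTest {

        @Test
        public void rendersTheBalanceSuppliedByTheService() {
            AccountService stubService = new AccountService() {
                public String balanceFor(int accountId) {
                    return "100.00";
                }
            };
            AccountController controller = new AccountController(stubService);
            assertEquals("100.00", controller.balanceFor(42));
        }
    }

Once the units pass, we run the acceptance test again and see how much further it gets.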

Setting up the infrastructure to do this kind of testing takes investment, both initial and ongoing. The continuous build needs to be highly ‘parallelized’ so you get feedback from a check-in in 10 minutes or less (we’re heavy users of virtualization, usually VMware or OpenVZ). The individual acceptance test suites need to be kept small enough to run quickly before check-in.
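
For the suites themselves, plain JUnit suites are one way to keep each functional area's acceptance tests grouped so the build server can farm the suites out to separate (virtualized) agents. The class names below are placeholders I've invented for the sketch.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    @RunWith(Suite.class)
    @Suite.SuiteClasses({
            AccountsAcceptanceSuite.ViewBalance.class,
            AccountsAcceptanceSuite.TransferFunds.class
    })
    public class AccountsAcceptanceSuite {

        // Placeholders standing in for real Selenium-driven scenarios; the
        // point is the grouping, which lets CI run this suite and, say, a
        // CheckoutAcceptanceSuite on different nodes in parallel.
        public static class ViewBalance {
            @Test public void balanceIsShown() { assertTrue(true); }
        }

        public static class TransferFunds {
            @Test public void fundsMove() { assertTrue(true); }
        }
    }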

Benefits of this approach

  • The continuous context-switch between acceptance test and unit test is key to our staying focused on delivering what the customer actually wants.

  • The customer has multiple feedback points that he can learn from and use to steer the development effort.

  • It confirms that the whole system works together – networking, DNS, load balancing, automated deployment, session handling, database replication etc.

  • We create additional ‘non-functional’ acceptance tests that automatically exercise other aspects of the system such as fail-over and recovery (see the sketch after this list).

  • Upgrades to parts of the system (switches, load balancers, web caches, library versions, database server versions etc.) can be tested in a known and controlled way.
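
As an example of the fail-over point above, a ‘non-functional’ acceptance test can stop a node and assert the service stays up. This is only a sketch: env-ctl is a hypothetical control script standing in for whatever the environment actually provides (a VM API, an SSH wrapper etc.), and the URL is invented.

    import static org.junit.Assert.assertEquals;

    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    public class FailoverAcceptanceTest {

        @Test
        public void serviceSurvivesLossOfOneApplicationNode() throws Exception {
            run("env-ctl", "stop", "app-node-1"); // hypothetical control script
            try {
                // The load balancer should route around the stopped node, so
                // a request through the front door must still succeed.
                HttpURLConnection connection = (HttpURLConnection)
                        new URL("http://test-env.example.com/").openConnection();
                assertEquals(200, connection.getResponseCode());
            } finally {
                run("env-ctl", "start", "app-node-1");
            }
        }

        private static void run(String... command) throws Exception {
            // Shell out to whatever the environment provides for node control.
            Runtime.getRuntime().exec(command).waitFor();
        }
    }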

We’ve caught a number of integration-related issues using this approach (a few examples: broken database failover due to missing primary keys, captcha validation not working due to a web cache not behaving correctly, data not persisting because one database server had the wrong locale) and stopped them before they reached our users. We have used the feedback as a basis for improving our products and their delivery at a system level.

OK this reply has now become far too long :-/ It would of course be good to discuss this in person sometime :)

J.B.'s taken the time out to respond and it seems that there's a lot of common ground. Maybe there's a language problem here in developer land? Do we need some clear common definitions in this area?

Read: Integration Testing: The Story Continues

