Wednesday, June 21, 2006

Those pesky nightly tests...

It's come up a few times over the years: how do I make sure my coverage is good, and that we are testing everything we should, all the time?  Or at least in time?  Is there even enough time to test all we want to?  The questions just keep coming, over and over.

I've actually ended up doing this a few different ways over the years, and I can't guarantee any one of them will work for anyone else; I am not exactly sure they always worked for me either.  There was always more analysis we needed to do.

  1. Jump in there and do it.  Oh yes, we jumped headfirst into making some nightly tests; anything was good so long as it tested SOMETHING.  This was the worst way, and it was only done because the Engineering VP wanted something in place; mostly, I think, it was a way to tell the Board that we had automation or nightly testing in place.  Has this topic ever been covered in an in-flight magazine?  While it was a good exercise in getting QA involved with the Nightly Tests (we previously had none), and it gave us some time to do some scripting, we actually got more out of the scripts used for our regular Test Phase.  This is the worst way to do it, since you are basically just doing anything for the sake of doing something.  I think we ended up calling these the Exploratory Nightlies: no plan, no review of technology, just seat-of-the-pants Nightly Testing.
  2. Use what Dev had.  Developers had already written some tests, in Perl and a few other Unit Test frameworks, so we had a good setup to start from.  Tests were written, established, and covered the areas of code that needed it, since they were always written to cover bug fixes.  There were Unit Tests as well, but those were short to run; the Nightly Tests ran longer, so they became the place to put all the longer-running tests.  This worked fine, as the reporting mechanism was also automated and plugged into much of the Dev and IT infrastructure already in place.  The problem was that if new tests were added that did not quite fit the Perl model the Nightly Framework was written in, reporting would cover an entire suite of tests as one success or failure (see the first sketch after this list).  When we added JUnit, it also caused issues with both the reporting and with environments, which were not always translated down to the individual test; sometimes this meant Ant and JUnit being started by Perl.  It was a framework that worked for a time, but it was not really scalable as we added new applications and languages to the product.
  3. Where was that Matrix again?  Ah yes, the Test Matrix.  Each item of functionality had a test point and a test case, so we wrote up something to cover it.  This was a way to get the coverage we wanted, and we could say that if a single test failed we knew what work was being done, and if a whole section of tests failed we knew a checkin was bad.  So we would take note of it and spend more time later testing that area (the second sketch after this list shows the idea).  Technology here was less important; we tried to make sure everything in the Matrix could be covered with the technology we chose, but the fit mattered more.  We still ended up with a blend of technologies, but that seems to be the norm.  It became easier to manage, though, in the sense that we could match tests with cases; how this will go in the long run I can't tell yet.
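
To make the suite-level reporting problem from item 2 concrete, here is a minimal sketch in Perl; the suite names, build files, and layout are all made up for illustration, not our actual framework.  The harness only ever sees Ant's exit code, so a hundred JUnit tests collapse into one PASS or FAIL:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical sketch of the granularity problem: the Perl harness
    # shells out to Ant to run a whole JUnit suite, but only sees Ant's
    # exit code, so an entire suite reports as a single pass or fail.
    my %suites = (
        'checkout-tests' => 'ant -f checkout/build.xml test',
        'billing-tests'  => 'ant -f billing/build.xml test',
    );

    for my $name (sort keys %suites) {
        my $status = system($suites{$name});   # blocks until Ant finishes
        # One bit of information per suite: which individual JUnit test
        # failed, and why, is buried in Ant's output and never reported.
        my $result = ($status == 0) ? 'PASS' : 'FAIL';
        print "$name: $result\n";
    }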
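
And for the Matrix from item 3, a sketch of the mapping itself; again, the feature names, case IDs, and script paths are invented, just to show the shape of the thing:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical test matrix: each piece of functionality maps to named
    # test cases, and each case maps to the script that covers it, so a
    # failure can be traced back to the functionality it guards.
    my %matrix = (
        'login'    => { 'TC-101' => 'nightly/login_basic.pl',
                        'TC-102' => 'nightly/login_expired_password.pl' },
        'checkout' => { 'TC-201' => 'nightly/checkout_single_item.pl' },
    );

    for my $feature (sort keys %matrix) {
        my $cases  = $matrix{$feature};
        my $failed = grep { system("perl $_") != 0 } values %$cases;
        # If every case under one feature fails, suspect a bad checkin
        # in that area rather than individual test problems.
        print "$feature: ",
              $failed ? "$failed case(s) FAILED" : 'all cases passed', "\n";
    }
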
I have actually never considered buying anything, though it's been tantalizing.  The issue I have with buying a product is that I tend to work on products that are a mix of technologies, or that already have hooks in them for Unit Tests.  QA takes those hooks and expands on them; it also lets us program our tests and know what works, and we can keep an eye on the framework itself and make sure it's working, so that a failure is actually from the product and not from the framework, which is often a cause of issues (a small self-check like the sketch below helps here).  Homegrown systems are also more manageable, because there is experience in building them, so you have more intimate knowledge of what the framework is doing, rather than just learning a scripting language.  I'm not knocking testing tools; I like some of them.  But I've had better success, even with some of the massive failures I have seen, in having a DIY framework in house.  It's not a part-time effort, though: it means dedicated time and resources, and the ability and will to take time to learn new things and new concepts.  It's not for the faint of heart, but it's worth the effort.
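
As a for-instance on the framework-watching point, a harness can run a trivial known-good test before anything else; the script name here is an assumption, just to show the idea:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical self-check: before blaming the product, run a trivial
    # test that is known to pass.  If even that fails, the framework or
    # the environment is broken and the night's failures are suspect.
    sub framework_ok {
        my $status = system('perl nightly/smoke_noop.pl');  # assumed trivial test
        return $status == 0;
    }

    unless (framework_ok()) {
        print "FRAMEWORK FAILURE: skipping run, results would be meaningless\n";
        exit 1;
    }
    print "Framework sanity check passed; running the real tests...\n";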
