Wednesday, March 25, 2009

The First 24

Another subject that has come up in company meetings is the concept of the First 24, something we have all encountered when using a new product, a new piece of software, or even a new car.  Those first 24 hours make or break a User's perspective of a product; if you can't win the User over in that time, you may never win them over at all.  As the saying goes, you only have one chance to make a first impression.

Consider the time it takes to install a product.  If the install process is onerous, complex, or unintelligible in parts, Users won't proceed; the install process is really the first contact a User has with a product.  Forget all that fancy packaging; it's there to get eyes on a rack in the store, and with a lot of software being sold online these days it doesn't matter as much as reviews do.  If Users can't install your product, bad reviews will follow.  If you follow the Amazon model and have a rating system, most people are not going to buy software that has bad reviews, and future Users who do their research online will look for comments from other people.  If there are numerous comments about installation issues, or people looking for advice and help on forums, that information will be there and can be the first negative a potential Customer sees.

Once past the install, is the Out of the Box experience adequate?  Is the product usable from the get-go, or is there a need for Customization?  When I use new software I often like to just get in and check it out; my tester's instincts kick in and I do a post-install review to see whether everything is in place and working.  Can I start the product?  Can I access the help screens?  Am I going to get a license or registration notification right away?  These are things I want to know.  I may even start clicking around on screens and checking out menus to see what is where, and whether it's understandable to me, before I've gone into the manual or tutorials in depth.  Problems at this stage will probably engender a support call, or a search on forums or through an internet search engine to see if someone else has the same issue.

If a problem can be fixed easily or just needs an update, I'm usually forgiving; after all, I work with Windows and software, there are always updates to be had, and that's normal.  But what if the software has a nasty crash?  That tends to be irritating, and if it can't be fixed, or fixing it takes a large number of steps, my view of the product goes down.  I shouldn't need to go through a lot of configuration to get something working, unless it is a complex product and I know that up front, or I have engaged a consultant to help me out.  If I have gotten past installation and the first use, then I'm going to be happy and have a positive view of the product.  This is a good test to run in addition to any others; a good User Acceptance Test will incorporate these kinds of checks to make sure the product meets your Users' expectations and makes their first 24 a good experience.
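In fact, those first-use checks are easy to fold into an automated post-install smoke test.  Here is a minimal sketch of what I mean; the install path, file names, and --version flag are hypothetical placeholders, not any particular product's layout:

```python
# A minimal post-install smoke test sketch, assuming a hypothetical
# product layout; swap in whatever your installer actually lays down.
import os
import subprocess
import unittest

INSTALL_DIR = r"C:\Program Files\AcmeProduct"      # hypothetical install path
EXE_PATH = os.path.join(INSTALL_DIR, "acme.exe")   # hypothetical executable

class PostInstallSmokeTest(unittest.TestCase):
    def test_files_in_place(self):
        # Is everything in place and working after the installer finishes?
        for name in ("acme.exe", "help.chm", "license.txt"):
            path = os.path.join(INSTALL_DIR, name)
            self.assertTrue(os.path.exists(path), "missing after install: " + name)

    def test_product_starts_cleanly(self):
        # Can I start the product without an immediate license or
        # registration complaint?  (--version is a hypothetical flag.)
        result = subprocess.run([EXE_PATH, "--version"],
                                capture_output=True, text=True, timeout=30)
        self.assertEqual(result.returncode, 0, result.stderr)
        self.assertNotIn("license", result.stderr.lower())

if __name__ == "__main__":
    unittest.main()
```

Run right after the installer finishes, a handful of checks like these catch exactly the problems that would otherwise turn into a User's first support call.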

I mentioned a car earlier, so let me close with the best example I can come up with.  Comparing software to a car is not a perfect analogy, but the steps one goes through to purchase a car and to purchase software are much the same.  When you pick up your new car you want it clean and ready for you, maybe with a full tank of gas, but there should still be that new car smell inside, and the mileage should be close to what it was on the test drive, or to what was needed to get the car off the delivery truck.  When you sit in that car for the first time as its owner and drive it off the lot, you expect it to be smooth, smell nice, and run well all the way home.  The first 24 in your car should be a good-smelling, smooth, happy experience that makes you feel good about putting down all that money, and the first 24 with software should be just as nice, only without the new car smell.

(Sorry to disappoint you Jack Bauer fans, but yes, it was intentional that I not mention him at all; well, until now, to mention that I did not mention him.)

Monthly Topics

It's June, which means it's time for another monthly topic.  Even though the group likes having the QA Talks every month, I find it hard to get people to come up with something they want to talk about, or even to do a presentation.  So most months it falls to me to make something.  Last month I did Regular Expressions, since there is more Perl being used and a few others on the team are learning how to write scripts for the automation framework.  So either I am better at this than I think (I doubt it), or no one else wants to do the work.

I usually need to be inspired to write a topic; it's not easy to just get up and present for 20 minutes, and I try to keep people's attention, so it's tough to make good technical topics that are enjoyable.  I'm not sure yet what this month will be about, but having read some more lectures by Yifa, a Buddhist nun associated with my wife's temple, I may have something on how to break down tests.  Funny how we find inspiration in unusual places.

I'll probably come back and add to this, but topics I have done or have thought about include:
  • Test Heuristics - DONE!
  • Agile & Scrum - done but always needs more
  • Basic Debugging - DONE!
  • Driver Debugging - DONE!
  • What is Quality? - DONE!
  • Regular Expressions - DONE!
  • Find your Buddha (seeking Test Nirvana) - DONE!
  • Test Frameworks
  • Database Test Techniques
  • Ruby
  • QTP Testing - DONE!
  • Accessing Data Query Methods in Web Services API - DONE!
  • Blink - Thinking Without Thinking (Overview of the Book by Malcolm Gladwell) - DONE!
  • DNS and product usage - DONE!
  • Product Installer - DONE!
  • Virtualization - DONE!
That's all for now; I may update the list later on.

Wednesday, March 11, 2009

Automation is not easy, nor should it be

If you've ever been in an organization that starts to build some process around itself, you will discover that in testing, someone, at some point, wrote a script.  That script became the beginning of other scripts, which eventually came under the control of a master script, or were all bundled into a suite.  When that bundle, or suite, was run time and time again, and the results were known and used, you suddenly had the good beginnings of a Test Harness.
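To make that evolution concrete, here is a minimal sketch of the "master script" stage; our scripts happen to be Perl, but the idea is the same in any language, and the test module names below are hypothetical:

```python
# A minimal "master script" sketch: standalone test scripts bundled into
# one suite whose results are recorded run after run.  The module names
# are hypothetical placeholders for scripts that already exist.
import unittest

def build_suite():
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    # Each of these began life as a single script someone wrote once.
    for module in ("test_install", "test_login", "test_reports"):
        suite.addTests(loader.loadTestsFromName(module))
    return suite

if __name__ == "__main__":
    result = unittest.TextTestRunner(verbosity=2).run(build_suite())
    # Known, repeatable results are what turn a pile of scripts into
    # the beginnings of a Test Harness.
    raise SystemExit(0 if result.wasSuccessful() else 1)
```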

It can take time to develop, but that's what those scripts are: development.  Script Development, with all the process and care that goes into the code that comes under test; so too should those scripts be vetted and reviewed by someone else.  It's not "just for QA" when the scripts are doing the work of validating someone else's code; this is something that has become a tool being used for a purpose, and that purpose is validation.  When you validate something there needs to be a story to back it up; someone has to be able to stand up in front of the crowd and proudly say "this is my script and here is what it does!"  This invites all kinds of questions, discussion, and debate, and it does happen, but you know this is not a bad thing.  Discussion and debate are good not just for the Requirements and the Product, but for those little things that help the product get out the door, because that script is code, and code needs reviewing and commenting, and that is work that needs to be done.

Automating work takes time, dedication, patience, and practice.  It's not easy to build a framework that will let you run tests, especially a framework where, when a test case fails, you can say it's because the product has a problem and not the automation.  Automation is hard work because you need to develop something that will aid you in your goal of running tests and validating the product, but it's also a project in its own right and needs its own schedule, set of tasks, care, and review.  Just going out and writing a script is fine, if all you want or need is a script that does X and never Y, but at some point there is a need to do X, Y, Z, and all the other letters before them.  It's not easy, and it should not be, because a badly written and implemented Automation system does no one any good.  It doesn't all need to come out at once, either; it can be done in stages, just like a real project, because after all, that is what it is.
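One way to keep "the product has a problem" separate from "the automation has a problem" is to make the harness report its own failures on a different channel from product failures.  A minimal sketch, where connect_to_product and the host name are hypothetical stand-ins for whatever setup your harness actually does:

```python
# Sketch of keeping automation failures distinct from product failures,
# so a red test result always points at the product, not the harness.
import socket
import unittest

class AutomationError(Exception):
    """Raised when the harness itself, not the product, is at fault."""

def connect_to_product(host, port=443):
    # Hypothetical setup step; a failure here is the automation's problem.
    try:
        return socket.create_connection((host, port), timeout=5)
    except OSError as exc:
        raise AutomationError("could not reach %s: %s" % (host, exc))

class ProductTest(unittest.TestCase):
    def setUp(self):
        # If setup fails, the case is skipped and flagged as a harness
        # issue instead of being counted as a product bug.
        try:
            self.conn = connect_to_product("qa-lab-01.example.com")
        except AutomationError as exc:
            self.skipTest("automation problem, not a product bug: %s" % exc)

    def tearDown(self):
        if getattr(self, "conn", None):
            self.conn.close()

    def test_product_answers(self):
        # Only product behavior is asserted here; if we got this far,
        # a failure really does belong to the product.
        self.assertIsNotNone(self.conn.getpeername())

if __name__ == "__main__":
    unittest.main()
```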

I've done this already, and it's the background for what I have here.  When I came to my current place of employment there were a lot of scripts written; there was even a beginning stab at automation, done by someone else who was here.  We worked together and extended the automation scripts and built a nice little test harness.  It didn't test everything, but it tested a good piece of functionality consistently and routinely, and it gave results.  So we knew where things were, we had discussions about where we wanted the automation to go, saw what was lacking, talked about how to fill in the gaps, and brought Developers into the discussion to get their feedback.

The other person left and I took over.  I kept up with what we had discussed, but now that I was driving, I was talking about the automation and fewer and fewer people were talking back; eventually it was just me.  Then the product took a new direction and I said to myself, "hey me!  Let's get back on board, do this right, and fill in those gaps!"  I had been patching the system for a while, working around scripts that were breaking because they were being used in ways we never intended; in between the testing, maintaining the automation had become a part-time job of its own.  During the break while the new direction was being created, I came up with a full-blown plan for what we needed and how we were going to get there, brought people into the conversation who were excited about it, and we made a staged release plan.

Now we are building some cool stuff.  People are talking about it, other groups in the company find it cool and think it's a nice idea and want to talk about it, and I am feeling reenergized in what I am doing, because we will have some really nice stuff when this is all done.

About a year from now.