Wednesday, October 23, 2013

No Software is Perfect, but the ACA was not the best

"If you believe the doctors, nothing is wholesome; if you believe the
theologians, nothing is innocent; if you believe the military, nothing is safe."
Lord Salisbury

To paraphrase: "if you believe a software deployment never has problems, nothing has ever been shipped."

Originally I started this post thinking about deployments in general, but the release of the ACA web site (portal, whatever it really is) put on very public display something I deal with on a consistent basis.  Anyone, and I mean ANYONE, who has worked in software has been involved with or seen a botched deployment, either on code shipped to customers or, as is more often the case these days, on code released to a live site.  Things generally go well, and the scale of problems varies, but problems are always there.  To mitigate them we test, we check, and we exercise the code in an environment that simulates production.  There are too many test types to mention, but in any of them actually USING the code or application will surface issues of some kind, at the very least.  I will say something that anyone working in tech knows: No Software is Perfect.

I was listening to On Point on NPR; I support Public Radio and typically I am in agreement with most of what Tom Ashbrook says, but this time I sadly think he was out of his depth.  Sure it looks bad, but this is a political hot potato, and anything that could go wrong was a candidate for hyperbole.  Was the rollout of the government web site bad?  Oh yes.  Just from the comments I wonder what sort of process they ran it through, but doing this for a living I know how hard it is and what it takes.  My older relatives, who barely use computers, would probably have a different perspective.  Yet I also know people who have worked on government projects, and looking back at many of the government's weapons procurement and development programs, none of this should be a surprise.  The US Government still does a large amount of work in a waterfall process: they start down a road and keep driving, filling up the tank as it empties so the work continues, and many times no one is navigating, so the project just keeps driving along.

Still, after hashing through many of the reports over the past two weeks, here is what I see as the major issues.  I see the same things on many projects, some even today, and no one has died yet.

  • No Testing, or whatever testing there was seems to have been minimal.  If the site was crashing as people said, I have to ask what sort of Load Testing was done.  I don't even want to bring up Security testing.
  • Coding done by Government Contract.  It seems like a big waterfall of a project with a process to match, and it seems there was little Acceptance testing done until the end, when it was deployed.
  • Outsourcing, not a bad thing in itself, but was the company that actually did the work the best choice?  From some of what I had seen, few traditional software companies wanted in on this, and the company that did take it on seems to have a spotty track record at best.
  • No Communication or Transparency.  This was a big project (I use "big" loosely, since most Web Applications and sites don't have millions of lines of code, but it is big in terms of visibility), and at that size it seemed much of the status was held close by the leaders.  Sure, it's a political liability, but when you aren't transparent people make up their own rationales.
  • Tech Surge, oh yeah we all know throwing more people and money at something gets it done faster!  Right?
  • Were the people really the "best and brightest"?  And if they are only coming in now, why would they want to refactor someone else's code?  Seems a waste of a skill set to me.  If they are only coming in now, then who was working on this before?
  • Bureaucracy.  'nuff said.

Rocky rollouts are the norm, but maybe this time people who actually run projects, or who are on projects they feel are going wrong, can point to the ACA web site rollout and say, "let's not be like them."

Maybe, just maybe, then, someone will listen.

Monday, September 16, 2013

Reading about Quality

On occasion I read books on Quality that are basically throwbacks, or what I like to think of as historical documents.  Recently I was going through one of Juran's books on Quality to see what I might learn and apply from what Quality Control and Process Design were all about before software.  This was partly inspired by a recent book I read on Toyota and its evolution of Lean Manufacturing, leading up to Kanban and its current practices; I also found out some interesting things about Toyota cars and how they are designed.  Considering I drive a Highlander and my wife drives a Camry, it was cool stuff!

So while reading through Juran's book I noticed the following formula:

Quality = (Frequency of deficiencies) / (Opportunity for deficiencies)

I wondered how I might use this in my own arena.  Dealing with web sites that see hundreds or thousands of visitors a day, there is no direct linkage here.  Just toying around with ideas, I thought I might use Visitors from our web analytics as the Opportunity for deficiencies, and some count of contacts to our Support Organization as the Frequency of deficiencies.  I'm no statistician, so I am not sure this will work at all, but as a mental exercise I would like to try it out and see whether it gives me an idea of what our Quality is.
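
As a first pass, here is a minimal sketch of what that calculation might look like in PowerShell.  The numbers are invented purely for illustration; the real ones would come from the web analytics reports and the Support Organization's contact counts.

# Hypothetical sketch of Juran's ratio applied to a web site.
# The figures below are made up; in practice they would come from
# web analytics (visits) and the support contact system (deficiencies).
$monthlyVisits   = 45000    # Opportunity for deficiencies (assumed)
$supportContacts = 180      # Frequency of deficiencies (assumed)

$deficiencyRatio = $supportContacts / $monthlyVisits
"Deficiencies per visit: " + $deficiencyRatio
"Deficiencies per 1,000 visits: " + ($deficiencyRatio * 1000)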

Typically I am against just making up metrics, but in some cases I think it's worthwhile to give a number to Business Users, as long as that number CAN be representative and provide a USEFUL idea of what is going on.

Friday, September 6, 2013

Who am I reading?

Like many others out on the interweb, I tend to read certain blogs or newsfeeds on a consistent basis.  I do have a focus on Testing and Quality but I have a few other divergent interests, some align and some don't.

Without further ado, my list of favorites includes:
Enjoy, if you are not already doing so!

Tuesday, September 3, 2013

Working with CSV and PowerShell for Link Checkers

One of the current tools in my toolbox is Xenu Link Sleuth, which does a great job of scanning through our site and finding links that are broken or need updating.  While the report it gives is good to review afterwards, it is difficult to send out to others, so to make the data useful to my business users I needed something more readable.  Since PowerShell is something we use often, it seemed an easy choice to write a script that reads through the exported data and generates a useful report.

# Crawl Report (path to the tab-delimited crawl report exported from Xenu)
$crawlReport = Import-Csv "" -Delimiter "`t"
$crawlList = @()
# Page Map (path to the tab-delimited page map export)
$siteMap = Import-Csv "" -Delimiter "`t"
$siteList = @()
# Links List
[string] $brokenLinksList = "BrokenLinksList.txt"
[string] $brokenLinksOverview = "BrokenLinksList.csv"
# Clean up from last time
if (Test-Path $brokenLinksList) {
  Remove-Item $brokenLinksList -Force
}
if (Test-Path $brokenLinksOverview) {
  Remove-Item $brokenLinksOverview -Force
}
 
"Just some initial information on what we are dealing with."
"Crawl Report is " + $crawlReport.Length + " lines."
"Site Map is " + $siteMap.Length + " lines."
# Check each Address that has a Status-Code of 404, add it to an array we will
# use to get its origin from the Site Map
foreach ($row in $crawlReport) {
  if ($row.StatusCode -eq "404") {
    $crawlList += $row.Address
  }
}
"We now have " + $crawlList.Count + " broken links to deal with."
# Check in the Site Map for the Address as a Link to Page
foreach ($link in $crawlList) {
  foreach ($line in $siteMap) {
    if ($link -eq $line.LinkToPage) {
      # Pull these together to make it simpler to review later on
      $data = $link + "," + $line.LinkToPage + "," + $line.OriginPage
      $siteList += $data
    }
  }
}
"Overall there are " + $siteList.Count + " broken links to fix."
# Output the Address, Link To Page and Origin Page to a separate file
# since these were put together as CSV's earlier let's just make an
# exported CSV file for ease of use later on
"BrokenLink,LinkToPage,OriginPage" >> $brokenLinksList
foreach ($entry in $siteList) {
  $entry >> $brokenLinksList
}
# Now that we have everything in one text file, read it back in and save it as a proper CSV
Import-Csv $brokenLinksList | Export-Csv $brokenLinksOverview -NoTypeInformation

This puts the data into a CSV file that can be filtered and sorted in Excel, a format that is much more useful for my business users.
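
If I want a quick look myself before sending it on, the same file can be read back into PowerShell.  A minimal sketch, using the column names from the header written above:

# Read the exported overview back in and group the broken links by the
# page they appear on, so the worst offending pages show up first
Import-Csv "BrokenLinksList.csv" |
    Group-Object OriginPage |
    Sort-Object Count -Descending |
    Select-Object Count, Name |
    Format-Table -AutoSize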

Friday, August 2, 2013

Operational Quality

To make sure our production site is running properly, twice a week I do a check on the live site.  Mostly I want to see that the site is responding to search crawls, since our content needs to be available for Users who come looking for specific topics.  I check that current content is being crawled and indexed; we don't want a stale index, and since the crawls run incrementally each night, I verify the index is up to date.

Coming up with a name for this was hard at first, since it's not really maintenance, but it's also not an Acceptance Test in the strict definition of the word (if there is one in Software Testing), since the site is live and there are no changes.  I asked on the Software Quality Assurance and Testing Stack Exchange and received the great suggestion of Operational Quality from a fellow poster.

So what does this comprise?

  • I reuse existing SpecFlow and Selenium Web Driver tests in my Test Framework to query against the index and make sure I get results, checking the pages for terms I know need to be in the result set
  • Query a backup of the SharePoint content database (I don't want to interfere with production) for what has been added recently, and look for those pages in the index
  • Log in to the SharePoint farm Administration server and check the Search Service Application for Crawl Log errors.  There are typically a few access-denied errors that turn out to be noise, and I have become accustomed to what to ignore over time (isn't that always the case?)
  • Run some search queries on topics I know are in the site content, or that I just want to be sure still come up (a rough sketch of this kind of check follows the list)
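
That last check is easy to script as well.  Here is a minimal sketch of the idea; the site URL, query string format, and search terms are placeholders rather than our real ones, so treat it as an illustration only:

# Hypothetical spot check: run a few search queries against the live site
# and confirm that a term we expect shows up in the results page.
$searchPage = "http://www.example.com/search/results.aspx"   # placeholder URL
$checks = @(
    @{ Query = "warranty registration"; Expect = "warranty" },
    @{ Query = "product manuals";       Expect = "manual" }
)
foreach ($check in $checks) {
    $url = $searchPage + "?k=" + [uri]::EscapeDataString($check.Query)
    $response = Invoke-WebRequest -Uri $url -UseBasicParsing
    if ($response.Content -match $check.Expect) {
        $check.Query + " : results look good"
    }
    else {
        $check.Query + " : expected term not found, needs a manual look"
    }
}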

Twice a week I get a calendar reminder to do the work, so that I set aside time for it.  I have made it easier for myself over time, in the spirit of continual improvement, to the point where it probably takes more time to log in to all the servers I need to check than to do the actual work.

Monday, July 29, 2013

Dealing with Redirects

We use a lot of redirects in my current environment.  I mean A LOT.  Currently we have 150.  While that doesn't seem like a big number, each of these is loaded into the Web Server's memory each time it starts.  Every 6 months or so I like to have these reviewed to see which ones are still valid, and this is difficult: we have a tracking system for requests and we store the files in source control, but there is little visibility into which redirects are actually in use.  Which led to my recent foray into PowerShell and XML.

I was asked recently for an updated list of the redirects for review.  Nothing new; again, I do this every 6 months, mostly because some redirects are requested for a specific marketing campaign or a scheduled event, and we have a few left over from when we updated our web site in July 2011.  Those ones exist to point people who may have had links to information on the old site to the locations on the new site, and even those age and need to be removed at some point.  Still, getting the information wasn't simple, since I needed to collate it from multiple sources, until last week when I had some spare multi-task time in the afternoon and thought, "hey, I can do this in PowerShell!"

So I sat down and pulled the XML into a PowerShell variable so I could look at it:
 [xml]$redirectXML = Get-Content $redirectFile

From there I looked at the structure I was getting, and while looking through the nodes I noticed that the one that was going to give me trouble was system.webServer, since in PowerShell the dot notation normally means you are stepping into another node.  Why Microsoft gives the element a name with a dot in it I don't know, but this stumped me for a while, until, while reading an answer to something else, I saw that node names like that just need quote marks around them.  So, following the nodes down to the rewrite map entries I wanted, I came up with:
$rules = $redirectXML.configuration."system.webServer".rewrite.rewriteMaps.rewriteMap.add

This gave me the first part, and then it was a simple matter to pull out the global rules as well:
$redirectUrl = $redirectXML.configuration."system.webServer".rewrite.globalRules.rule

This helped a lot, so what I ended up doing was pulling these together and outputting them into a tab-delimited file so I could pull them into Excel.  In the end I came up with a fairly good script that extracted what I wanted; it does need cleanup, and I would like to output directly into an Excel worksheet, but that is for later.  My basic script ended up being:


# Get the values that we will need from the command line
# So far I only need the path and name of the redirect file, the
# ApplicationHost.Config file
Param([string]$redirectFile)

# Constants that will be necessary
[string]$redirListDir = $null
[string]$redirectList = "redirectList.csv"

# Check that we have a file to pull redirects from
if (($redirectFile -eq $null) -or ($redirectFile -eq "")) {
    "Please pass in the redirect file to review. Make sure it's the path, including the name, "
    "of the applicationHost.config that needs to be checked. Try something like "
    ".\redirectReview.ps1 C:\filelocation\applicationHost.config"
    "Thank you!"
    exit
}

# Check that we have a place to put the file at the end
if (($redirListDir -eq $null) -or ($redirListDir -eq "")) {
    if (Test-Path -Path "d:\temp") {
        $redirListDir = "d:\temp"
        "Using D:\Temp"
    }
    elseif (Test-Path -Path "c:\temp") {
        $redirListDir = "c:\temp"
        "Using C:\Temp"
    }
    else {
        "Cannot find a directory D:\temp or C:\temp. Create one and try again.`n"
        exit
    }
}
$redirectListFile = Join-Path -Path $redirListDir -ChildPath $redirectList

if (Test-Path $redirectListFile) {
    Remove-Item $redirectListFile -Force
}

# Open the file and put it in an XML format
[xml]$redirectXML = Get-Content $redirectFile

# Review the XML tags
# One of my old pointers
# $redirectXML.configuration."system.webServer".rewrite.rewriteMaps

"Getting the Rewrite Maps"
# This gets all the rewrite map redirects
$rules = $redirectXML.configuration."system.webServer".rewrite.rewriteMaps.rewriteMap.add

"Now for the Global Rules"
# This gets all the rules that have been added
# Trying some different match formats that did not work
# $inputUrl = $redirectXML.configuration."system.webServer".rewrite.globalRules.rule | % {$_.match}
# $redirectUrl = $redirectXML.configuration."system.webServer".rewrite.globalRules.rule | % {$_.action}

$redirectUrl = $redirectXML.configuration."system.webServer".rewrite.globalRules.rule

"Creating the output file"
# Output the collected values into a tab-delimited file that can be reviewed in Excel
$rules | % {
    $a = $_.key + "`t" + $_.value
    $a >> $redirectListFile
}
$redirectUrl | % {
    $b1 = ($_.match).OuterXML
    $b2 = ($_.action).OuterXML
    $b = $b1 + "`t" + $b2
    $b >> $redirectListFile
}
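
Running it is just a matter of pointing it at a copy of the config file, as in the help text above.  A quick usage sketch, with placeholder paths:

# Example run against a copy of the config file (placeholder path)
.\redirectReview.ps1 C:\filelocation\applicationHost.config

# The script drops redirectList.csv into D:\temp or C:\temp; the columns
# are tab separated, so it opens directly in Excel, or it can be read back
# into PowerShell with generic column names for a quick look
Import-Csv C:\temp\redirectList.csv -Delimiter "`t" -Header Source,Target | Select-Object -First 10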

Monday, July 8, 2013

Starting my QA journey over again

It's been a while, and even though I have many older posts, I decided to change venues.

Let's see how long this one lasts.