Monday, September 16, 2013
On occasion I read books on Quality that are basically throwbacks, or what I like to think of as historical documents. Recently I was going through one of Juran's books on Quality to see what I might learn and apply from what Quality Control and Process Design looked like before software. This was partly inspired by a book I recently read on Toyota and its evolution of Lean Manufacturing, leading up to Kanban and its current practices; I also found out some interesting things about Toyota cars and how they are designed. Considering I drive a Highlander and my wife a Camry, it was cool stuff!
So while reading through Juran's book I noticed the following formula:
Quality = (Frequency of deficiencies) / (Opportunity for deficiencies)
I wondered how I might use this in my own arena. Dealing with web sites that see hundreds or thousands of visitors a day, there is no direct linkage here. So, just toying around with ideas, I thought I might use Visitors from our web analytics as the Opportunity and some count of contacts to our Support Organization as the Frequency. I'm no statistician, so I'm not sure this will work at all, but as a mental exercise I would like to try it out and see whether I can get something useful from it, some idea of what our Quality is.
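To make that concrete, here is a minimal sketch of the arithmetic in PowerShell. The numbers are made up, and treating every visit as one opportunity for a deficiency is just my working assumption:
# A rough sketch with made-up numbers: each visit is treated as one
# opportunity for a deficiency, and support contacts about site problems
# stand in for the deficiencies actually observed
$monthlyVisitors = 30000
$supportContacts = 45
$deficiencyRate = $supportContacts / $monthlyVisitors
"Deficiency rate: " + ($deficiencyRate * 10000) + " per 10,000 visits."
Whether that ratio means anything depends entirely on whether support contacts really track site deficiencies, which is exactly the question I want to explore.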
Typically I am against just making up metrics, but in some cases I think it's fair to give Business Users a representative number, provided that number CAN be representative and provide a USEFUL idea of what is going on.
Friday, September 6, 2013
Who am I reading?
Like many others out on the interweb, I tend to read certain blogs and newsfeeds on a consistent basis. I do have a focus on Testing and Quality, but I have a few other divergent interests; some align and some don't.
Without further ado, my list of favorites includes:
- The All Things Quality blog by Joe Strazzere
- Len DiMaggio's mostly Software Testing
- Angry Weasel by Alan Page
- Gojko Adzic and his thought provoking blog
- The other Michael Bolton
- The Boston 1775 history blog, which I find thoughtful; as a lover of history, this is great for me!
- The SharePoint Stack Exchange
- and the SQA Stack Exchange
Tuesday, September 3, 2013
Working with CSV and PowerShell for Link Checkers
One of the current tools in my toolbox is Xenu Link Sleuth, which does a great job of scanning through our site and finding links that are broken or need updating. While the report it gives is good to review afterwards, it is difficult to send out to others, so to make the data useful to my business users I needed something more readable. Since PowerShell is something we use often, it seemed an easy choice to write something that reads through the exported data and generates a useful report.
# Crawl Report (tab-separated export from Xenu; fill in your own path)
$crawlReport = Import-Csv "" -Delimiter "`t"
$crawlList = @()
# Page Map (tab-separated export; fill in your own path)
$siteMap = Import-Csv "" -Delimiter "`t"
$siteList = @()
# Links List
[string] $brokenLinksList = "BrokenLinksList.txt"
[string] $brokenLinksOverview = "BrokenLinksList.csv"
# Clean up from last time
if (Test-Path $brokenLinksList) {
    Remove-Item $brokenLinksList -Force
}
if (Test-Path $brokenLinksOverview) {
    Remove-Item $brokenLinksOverview -Force
}
"Just some initial information on what we are dealing with."
"Crawl Report is " + $crawlReport.Length + " lines."
"Site Map is " + $siteMap.Length + " lines."
# Check each Address that has a Status-Code of 404 and add it to an array
# we will use to look up its origin in the Site Map
foreach ($row in $crawlReport) {
    if ($row.StatusCode -eq "404") {
        $crawlList += $row.Address
    }
}
"We now have " + $crawlList.Count + " broken links to deal with."
# Check in the Site Map for the Address as a Link to Page
foreach ($link in $crawlList) {
    foreach ($line in $siteMap) {
        if ($link -eq $line.LinkToPage) {
            # Pull these together to make it simpler to review later on
            $data = $link + "," + $line.LinkToPage + "," + $line.OriginPage
            $siteList += $data
        }
    }
}
"Overall there are " + $siteList.Count + " broken links to fix."
# Output the Address, Link To Page, and Origin Page to a separate file;
# since these were put together as CSVs earlier, let's just make an
# exported CSV file for ease of use later on
"BrokenLink,LinkToPage,OriginPage" >> $brokenLinksList
foreach ($entry in $siteList) {
    $entry >> $brokenLinksList
}
# Now that we have everything in one file, read it back and export it as a
# proper CSV (piping the file name itself into Export-Csv would only export
# the string, not the file's contents)
Import-Csv $brokenLinksList | Export-Csv $brokenLinksOverview -NoTypeInformation
This produces a CSV file that can be filtered and sorted in Excel, which is a far more useful format for my business users.
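As one example of that slicing and dicing, here is a quick sketch that groups the broken links by origin page, so the pages needing the most attention float to the top. The column name matches the header the script writes above:
# Sketch: count broken links per origin page from the exported CSV
Import-Csv "BrokenLinksList.csv" |
    Group-Object OriginPage |
    Sort-Object Count -Descending |
    Select-Object Count, Name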