17.3.09

A framework for quality / maturity analysis

I was looking through some old files recently and stumbled across a "quality" framework I wrote that actually works quite well as a pre-engagement "maturity" framework, one that can help you decide whether a particular organization or company is ready to consider outsourcing any portion of its engineering or product development work.  It's a long list of questions and concerns, but I figured it might be useful to someone, so the framework is posted in its entirety below.

As always, while I have found this framework to be useful, your mileage may vary.




Notes on using this framework

This framework can be employed through technical "interviews" with key engineering team staff.  Primary targets for interviews are:
  • Development engineering management
  • Development engineering staff (small sample set)
  • QA engineering management
  • QA engineering staff (small sample set)
  • Functional area managers for any other engineering disciplines

This framework is intended to get at core quality concerns primarily through "gut feel".  This approach is admittedly insufficient for comparative analysis.

Project methodology

Regardless of the software development methodology used (waterfall, agile, XP, scrum, etc.) there are several common constructs that usually have a significant and direct impact on overall quality.  These items should be considered in any analysis of product quality engineering practices.

Process definition and documentation

  • Is the software development methodology well understood by the engineers building the software?
  • Is the understanding consistent across the team?  (as measured by correlation between various interviewees)
  • Has the team implemented projects using this process before?
  • Is the process well documented? 

Clarity and purity of role
  • Are roles defined for team members?  (for example, "QA Engineer" vs. "Development Engineer")
  • What are the roles?
  • Are the roles well documented and understood by all parties?  (as measured by correlation between various interviewees)
  • Do individual team members serve more than one role-based function?  Do they do so simultaneously?

Process phases
  • Are process phases formally defined? (for example, Requirements, Design, Implementation)
  • Are entry and exit criteria defined and documented?
  • Are formal process phase gate reviews performed?  
  • Are results of phase gate reviews documented and shared with the full team?
  • What is the theoretical impact of a failed phase gate review?
  • What is the real-world impact of a failed phase gate review?

Documentation
  • What project artifacts (internal documentation such as Requirements, Use Cases, Specifications) are produced?
  • Are project artifacts common across projects (i.e., are they standardized as part of the development methodology)?
  • Are project artifacts stored in a common repository under version control?
  • What change control methodology is used for project artifacts?
  • What traceability methodology is used to track dependencies among various project artifacts?

Project complexity
  • Has the team implemented and successfully delivered a project of this size and scope before?


Development engineering methodology

Quality engineering starts well upstream of "QA".  The following common aspects of software engineering can have a significant impact on product quality.

Repository and Build
  • Is the software source stored in a source control system?  If so, what repository?
  • Does the repository allow highly granular version control?
  • Does the product / project in question build every day?  More often?
  • Is the build fully automated, and does it proceed from "start" without any human intervention?
  • What is the outcome of a failed build?
  • Do the repository and build system allow 100% identical build results (i.e., can any previous build number be reproduced from the repository without human intervention)?
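
The reproducibility question above can be checked mechanically.  A minimal sketch, assuming Python and a hypothetical layout in which two independent runs of the same build number land in separate output directories (the paths are invented for illustration):

    # Verify that two independent builds of the same tagged revision produce
    # bit-identical artifacts. Paths and output layout are hypothetical.
    import hashlib
    from pathlib import Path

    def digest_tree(root: Path) -> dict:
        """Map each file's path (relative to root) to its SHA-256 digest."""
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()
        }

    build_a = digest_tree(Path("out/build-1234-run-a"))
    build_b = digest_tree(Path("out/build-1234-run-b"))

    if build_a == build_b:
        print("Builds are bit-identical: build 1234 is reproducible.")
    else:
        differing = {name for name in build_a.keys() | build_b.keys()
                     if build_a.get(name) != build_b.get(name)}
        print(f"{len(differing)} artifact(s) differ: {sorted(differing)}")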

Code Inspection
  • Is peer inspection of software source code performed?
  • If so, how often?  (all source, only new source, only tricky source, etc.)
  • Are code inspections required prior to check-in, or simply suggested?
  • Are code inspection artifacts produced? (code inspection forms, etc.)
  • What is the outcome of a "failed" code inspection?
  • Are defect reports entered against code inspection "failures"?

Developer Unit Test
  • What (if any) unit test framework is used?
  • Do developers follow a "test first, then code" methodology?  (a minimal sketch follows this list)
  • How are unit tests run (automatically, manually, ad hoc)?
  • How often are unit tests run?
  • What is the result of a failed unit test?
  • Are defect reports entered against failed unit tests?
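
To illustrate the "test first, then code" question above, here is a minimal sketch using Python's built-in unittest module.  The parse_version() function and its expected behavior are invented for illustration; in a test-first workflow the test class is written (and fails) before the function body exists.

    import unittest

    def parse_version(text):
        """Hypothetical function under test: parse 'major.minor.patch'."""
        major, minor, patch = (int(part) for part in text.split("."))
        return major, minor, patch

    class ParseVersionTest(unittest.TestCase):
        def test_parses_three_components(self):
            self.assertEqual(parse_version("2.10.3"), (2, 10, 3))

        def test_rejects_garbage(self):
            # Anything that is not three dot-separated integers should raise.
            with self.assertRaises(ValueError):
                parse_version("not-a-version")

    if __name__ == "__main__":
        unittest.main()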

Developer Integration Test
  • Are developers required to build and test the project / product prior to checking in source?
  • What is the result of a failed integration test?
  • Are defect reports entered against failed integration tests?

Build Yield
  • Is build yield measured through the life of the project?  (i.e., X attempts to build, Y successful builds, breaks for reason Z, etc.; a sketch follows this list)
  • If so, how often does the build break?
  • What is the outcome of a failed build?
  • Is there a quality feedback loop from build failures (i.e., is root cause analysis performed and the underlying cause corrected)?
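
Build yield itself is simple bookkeeping.  A minimal sketch, assuming Python and an invented record format (most build or CI systems can export something equivalent):

    from collections import Counter

    # Hypothetical build records exported from the build system.
    builds = [
        {"id": 101, "ok": True},
        {"id": 102, "ok": False, "reason": "compile error"},
        {"id": 103, "ok": True},
        {"id": 104, "ok": False, "reason": "flaky integration test"},
        {"id": 105, "ok": True},
    ]

    yield_pct = 100.0 * sum(b["ok"] for b in builds) / len(builds)
    break_reasons = Counter(b["reason"] for b in builds if not b["ok"])

    print(f"Build yield: {yield_pct:.0f}% over {len(builds)} attempts")
    for reason, count in break_reasons.most_common():
        print(f"  broke {count}x: {reason}")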

Code Coverage Tools
  • Are code coverage tools in use?
  • If so, where and when is code coverage measured?
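
Even without a dedicated tool, a rough line-coverage picture is easy to capture.  A sketch using Python's standard-library trace module; the absolute() function and the single "test" call are invented for illustration, and a real team would use a purpose-built coverage tool instead.

    import trace

    def absolute(x):
        if x < 0:
            return -x      # never exercised by the call below, so reported as uncovered
        return x

    tracer = trace.Trace(count=True, trace=False)
    tracer.runfunc(absolute, 5)      # the "test": exercises only the positive branch

    results = tracer.results()
    # Writes per-module .cover files and prints a coverage summary.
    results.write_results(show_missing=True, summary=True, coverdir=".")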

Advanced Diagnostic Tools
  • Are advanced diagnostic tools in use?  (for example, Bounds Checker?)
  • Who uses the tools, and how often?

Development Environment
  • Does development engineering have its own systems to develop and test the product in question?
  • Does development engineering have autonomy within this environment?
  • Is the development engineering team measured on product quality?

Quality assurance methodology

While quality engineering is significantly influenced by upstream processes and events, as discussed in the previous two sections, the execution within the QA organization is of primary importance to this audit framework. 

General QA Methodologies
  • Is "QA" a separate function?
  • How empowered is QA?  
  • Does the QA team have a separate reporting structure?
  • Does QA perform all, some or none of the "testing"?
  • What is the ratio of development engineers to QA engineers?
  • Does QA staff participate in product design?  
  • Does QA staff participate in project phase gate reviews (if applicable)?

Overall testing approach
  • What phases or test levels does the QA function define? (for example, black box, integration, performance, beta)

Task decomposition
  • How does QA decompose tasks in order to completely test all components, sub-systems and functions?  (for example, by use case, by feature, by architectural component, etc.)
  • Is this task decomposition reviewed outside of the QA team?

Smoke Test
  • Is each build tested for "happy path" to ensure basic functionality?
  • Are the smoke test "test cases" documented and well understood?
  • Does the smoke test run automatically?  (a sketch follows this list)
  • Is smoke test yield tracked through the life of the project?
  • What is the result of a failed smoke test?
  • Are defect reports entered against failed smoke tests?
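
A smoke test does not need to be elaborate to be valuable.  Here is a rough sketch of a scripted "happy path" check, assuming Python; the host, port, endpoints, and expected response fragments are all hypothetical.

    import sys
    import urllib.request

    # (check name, URL on the build under test, fragment expected in the response)
    CHECKS = [
        ("service is up",       "http://build-under-test:8080/health",  b"ok"),
        ("login page renders",  "http://build-under-test:8080/login",   b"<form"),
        ("version is reported", "http://build-under-test:8080/version", b"1."),
    ]

    failures = 0
    for name, url, expected in CHECKS:
        try:
            body = urllib.request.urlopen(url, timeout=5).read()
            status = "PASS" if expected in body else "FAIL"
        except OSError as exc:
            status = f"FAIL ({exc})"
        if status != "PASS":
            failures += 1
        print(f"{status}  {name}")

    sys.exit(1 if failures else 0)   # non-zero exit marks the build as failing smoke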

Correctness testing
  • Irrespective of task decomposition, is the test strategy documented?
  • Is this test strategy reviewed outside of QA?
  • Are all individual test cases documented and reviewed?
  • Are validation methodologies and data sets documented along with the test cases?
  • Is any attempt made to "normalize" test cases (i.e., one function per test case, or "about an hour of setup and verification")?
  • Who authors test cases?
  • Who reviews the test cases for correctness?
  • What approach is taken in the presence of complex combinatorial features (for example, testing a function on multiple operating systems)?  (see the sketch after this list)
  • Are test data sets reproducible?
  • Are test results reproducible?
  • Is there a feedback loop for defects found outside of documented test cases?
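
On the combinatorial question, the simplest starting point is to enumerate the full test matrix and then decide how to prune it (pairwise selection, risk-based selection, etc.).  A small sketch in Python; the dimensions and values are invented examples.

    from itertools import product

    operating_systems = ["Windows Server 2003", "RHEL 4", "Solaris 10"]
    databases = ["Oracle 10g", "SQL Server 2005"]
    locales = ["en_US", "ja_JP", "de_DE"]

    # Full cross-product: grows multiplicatively, which is exactly why a
    # deliberate reduction strategy is needed.
    matrix = list(product(operating_systems, databases, locales))

    print(f"{len(matrix)} configurations in the full cross-product")
    for os_name, db, locale in matrix[:3]:
        print(f"  e.g. {os_name} / {db} / {locale}")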

White / black box testing
  • What verification methods are used across the product / project testing?
  • Are verification methods reviewed as part of test case review?

Function coverage
  • What estimated level of "function coverage" (i.e., use cases, features) is achieved during a given QA test cycle?
  • What methodology was used to arrive at this estimate?
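
One crude way to arrive at such an estimate is to map test cases onto an enumerated list of use cases and compute the fraction that have at least one test.  A sketch with invented data:

    # Hypothetical use-case inventory and the subset with mapped test cases.
    use_cases = {"create account", "login", "reset password", "delete account", "export report"}
    use_cases_with_tests = {"create account", "login", "reset password"}

    covered = use_cases & use_cases_with_tests
    coverage_pct = 100.0 * len(covered) / len(use_cases)
    print(f"Function coverage estimate: {coverage_pct:.0f}% ({len(covered)}/{len(use_cases)} use cases)")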


System (integration) testing
  • Is System Testing viewed as a separate function from general correctness testing?
  • Is the full system tested "as deployed", or are only sub-components tested?
  • Are negative "scenario tests" performed?  (for example, pulling a disk while the server is performing some operation)

Load testing
  • Is the full (integrated) system or product tested in a variety of "over-clocked" scenarios?
  • Are system components isolated and tested under "over-clocked" scenarios?

Performance testing
  • Is a distinction made between load testing and performance testing?
  • Is performance testing done on full-scale systems?  
  • If not, what analysis has been done to facilitate scaled down performance measurement?
  • How are performance requirements and scenarios determined?
  • Is the performance testing modeled after customer deployments?

Network Simulation
  • If the product / project has any network layer dependencies, is any modeling done to determine possible network-layer perturbations?
  • Is any testing done using WAN / LAN simulation software?

Beta / EFT Testing
  • Are customer beta programs or "early field trials" part of the QA process?
  • If so, how many customers participate?
  • If so, what is the typical duration of the Beta?
  • If so, what is the typical number of defects discovered during Beta?
  • If so, what is the typical response to defects discovered in Beta phase?

Pre-release Regression Testing
  • What methodology is employed to ensure validity of test results through the "end-game" of a release?
  • How much of the final "gold candidate" is tested pre-release?

Defect Tracking & Measurement
  • Is a defect tracking tool (for example, Bugzilla, MKS Integrity Manager, etc.) used?  If so, what flavor?
  • Is the defect workflow documented and well understood by all members of the team?
  • Is the defect schema well documented and well understood by all members of the team?
  • In which project phase are defects tracked?
  • Can defects be expunged (permanently deleted) from the system?
  • Who gets to close defects?
  • What defect metrics are tracked through the course of a project?  (see the sketch after this list)
  • Are defect trends for current projects compared against those of historical projects?
  • How many open defect reports are there?
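
Several of the basic defect metrics can be computed directly from an export of the defect tracking tool.  A sketch, assuming Python; the record format and the defects themselves are invented.

    from collections import Counter
    from datetime import date

    defects = [
        {"id": 1, "severity": "critical", "opened": date(2009, 1, 5),  "closed": date(2009, 1, 9)},
        {"id": 2, "severity": "major",    "opened": date(2009, 1, 12), "closed": None},
        {"id": 3, "severity": "minor",    "opened": date(2009, 2, 2),  "closed": date(2009, 2, 20)},
        {"id": 4, "severity": "major",    "opened": date(2009, 2, 16), "closed": None},
    ]

    open_defects = [d for d in defects if d["closed"] is None]
    print(f"Open defect reports: {len(open_defects)}")
    print("Open by severity:", dict(Counter(d["severity"] for d in open_defects)))

    days_to_close = [(d["closed"] - d["opened"]).days for d in defects if d["closed"]]
    print(f"Mean days to close: {sum(days_to_close) / len(days_to_close):.1f}")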

Defect Verification
  • How are defects "committed" for fix?
  • How are defects verified for fix?
  • How are defects analyzed and verified for potential "collateral damage"?

Statistical Methods
  • Are any statistical methods in place to extrapolate product quality (and hence customer satisfaction) from software defect data?
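
One simple example of such a method is the Lincoln-Petersen "capture-recapture" estimate: if two groups test the same product independently, the overlap in what they find suggests the size of the total defect population.  A sketch with invented counts:

    # Capture-recapture estimate of total (found + not yet found) defects.
    found_by_group_a = 40      # defects found by test group A
    found_by_group_b = 35      # defects found by test group B
    found_by_both = 20         # defects found independently by both groups

    estimated_total = found_by_group_a * found_by_group_b / found_by_both
    found_so_far = found_by_group_a + found_by_group_b - found_by_both

    print(f"Estimated total defects:     {estimated_total:.0f}")
    print(f"Found so far:                {found_so_far}")
    print(f"Estimated remaining defects: {estimated_total - found_so_far:.0f}")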

Test case management
  • How are test cases managed?  
  • Are they under change control?

Test Environment 
  • Does the QA team have its own systems to develop and test the product in question?
  • Does the QA team have autonomy within this environment?

Empowerment
  • Can QA stop a release?  If so, how and why?
  • Has QA stopped releases?  If so, how and why?

Customer Experience

Overall quality is best determined by overall customer satisfaction.  These parts of the framework attempt to get at the customer experience:

  • What is the size of customer base?
  • What is the largest single deployed instance?
  • What is the diversity of customer base?  (range of sizes, market verticals, etc.)
  • What is the average number of tech support calls per customer per time period?
  • What is the average number of defects reported per customer (over some time period)?
  • What is the "work flow" for customer-reported defects?
  • Is the customer-reported defect work flow well understood and documented?

Special considerations

Product Complexity
  • How many lines of code in the product?
  • Are there any measurements of the complexity of the product?  (Rose, etc.)
  • How many components or systems comprise the "system as deployed"?

Engineering Team Scalability
  • How big is the engineering team?  
  • How many releases (and of what scale) does the team deliver per quarter or year?
  • How many releases / projects does the team work on at any given point?
  • Are there any significant staffing or resource bottlenecks?

Sustaining Engineering Methodology 
  • How are customer "issues" handled?  
  • Is there a dedicated "sustaining" team?  If so, what disciplines are involved?  How many people in each respective discipline?
  • What is the support model?
  • What is the triage / prioritization model?
  • How fast, after a fix has been coded, can a "hot fix" be released?
  • What level of QA is done on bug-fix releases?
  • How much of the product / project must be built and released in order to deliver a bug fix?