Metrics that measure UAT Readiness, Cost of Quality, etc.

A small debate about bugs over lunch turned into a longer discussion: how do we ensure the team is delivering the right quality, the business is getting value, the cost of quality is gradually optimized, and test coverage is giving us confidence? We all agreed on the need to add checkpoints. Each checkpoint produces useful metrics that can help identify symptoms of non-conformance in quality, budget, scope, or time. Here are the metrics I prefer to use, depending on the development model, the availability of data, and the specific information needed. I believe no one should create a metric that they or others cannot use in decision making. Please let me know if you would like to add more.

=================================
Pre-UAT/Production Metrics
=================================

Test Coverage
Unit test coverage
Functional/non-functional test coverage
Requirement/Acceptance Criteria traceability
Executed tests vs. pending tests
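For illustration, here is a minimal Python sketch of how these coverage numbers could be computed from a test-management export; the record fields (requirement_id, status) are my own assumptions, not any specific tool's schema:

    # Hypothetical test records, e.g. exported from a test management tool.
    tests = [
        {"id": "T1", "requirement_id": "R1", "status": "executed"},
        {"id": "T2", "requirement_id": "R1", "status": "pending"},
        {"id": "T3", "requirement_id": "R2", "status": "executed"},
    ]
    requirements = {"R1", "R2", "R3"}

    # Executed vs. pending tests.
    executed = sum(1 for t in tests if t["status"] == "executed")
    print(f"Execution coverage: {executed / len(tests):.0%}")

    # Requirement traceability: share of requirements with at least one test.
    traced = {t["requirement_id"] for t in tests}
    print(f"Requirement traceability: {len(traced & requirements) / len(requirements):.0%}")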

Defect Analysis
Defect Density, feature-wise
Defect Density per feature/requirement size
Defect Escape Rate
Bug Find/Close Trends
Bug Reopen Trends
Bug Aging
User Story Stability
Test Case Effectiveness
Browser-wise Bug Distribution
Root Cause Analysis
Component-wise Stability
Productivity Metrics
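Most of these reduce to simple arithmetic. As a rough sketch (all counts invented): defect density is defects divided by size, and escape rate is the share of defects found after a phase:

    # Defect density per feature: defects found / feature size.
    # Size units (story points here) are an assumption; KLOC works the same way.
    features = {
        "checkout": {"defects": 12, "size_points": 20},
        "search":   {"defects": 3,  "size_points": 15},
    }
    for name, f in features.items():
        print(f"{name}: {f['defects'] / f['size_points']:.2f} defects/point")

    # Defect escape rate: defects that slipped past a phase vs. all defects.
    found_in_test, found_after_test = 40, 8
    print(f"Escape rate: {found_after_test / (found_in_test + found_after_test):.0%}")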

Test Automation
Automation Execution Results
Automation Execution, build-wise
Automation Execution, sprint-wise
Automation Progress, week-wise
Automation Script Maintenance
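A small sketch of a build-wise pass-rate trend; the build labels and counts are invented for illustration:

    # Automation results per build: (passed, failed, skipped) tuples.
    builds = {
        "build-101": (180, 12, 8),
        "build-102": (188, 6, 6),
        "build-103": (192, 4, 4),
    }
    for build, (passed, failed, skipped) in builds.items():
        total = passed + failed + skipped
        print(f"{build}: pass rate {passed / total:.0%} ({failed} failures)")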

UAT Readiness
Status of Stories/Requirements
Test execution coverage and status
Priority 1 and 2 defect status
Requirement/component/feature stability
Regression Status
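These checks lend themselves to a simple go/no-go readiness gate. A sketch with assumed thresholds (all stories accepted, 95% of tests passing, zero open P1/P2 defects) that each team would tune to its own context:

    # Hypothetical UAT readiness gate; thresholds are assumptions, not standards.
    def uat_ready(stories_done, stories_total, tests_passed, tests_total, open_p1_p2):
        story_ok = stories_done == stories_total          # all stories accepted
        exec_ok = tests_passed / tests_total >= 0.95      # >= 95% tests passing
        defect_ok = open_p1_p2 == 0                       # no open P1/P2 defects
        return story_ok and exec_ok and defect_ok

    print(uat_ready(stories_done=42, stories_total=42,
                    tests_passed=310, tests_total=320, open_p1_p2=0))  # True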

Productivity
Number of deployment re-runs in the test environment
Bug reopen trend

=================================
Post-Production Metrics
=================================

Customer Satisfaction
Production defect slippage
Number of product/release rollbacks (quality issues)
Product enhancement requests after each production release (requirement completeness)
Maintenance fixes per year/release (technical debt)
Number of emails/calls to customer service (usability issues)
Number of training sessions for end users/customers (usability issues)

Defect Analysis
Production defect slippage per production release
Production defect slippage per requirement size
UAT defect slippage per UAT release
UAT defect slippage vs. requirement size
Root cause analysis (requirement/code/test/configuration)
Pre-UAT/UAT/Production bug ratios
Total defects found in production in a year
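The phase-wise ratios above are simple proportions; a sketch with illustrative counts:

    # Illustrative counts of defects found in each phase.
    bugs = {"pre_uat": 120, "uat": 25, "production": 5}
    total = sum(bugs.values())
    for phase, count in bugs.items():
        print(f"{phase}: {count / total:.0%} of all defects")

    # UAT slippage per release: UAT defects / number of UAT releases.
    print(f"UAT slippage per release: {bugs['uat'] / 4:.1f}")  # assuming 4 releases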

Defect Cost
Business loss per defect
Cost of workarounds
Brand image impact
Cost of fighting/settling legal cases
Ratio of maintenance to enhancements

Product Reliability
Availability, actual vs. expected
Mean time between failures (MTBF)
Mean time to repair (MTTR)
Reliability ratio (MTBF/MTTR)
Production defect slippage
Number of product/release rollbacks (quality issues)
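MTBF, MTTR, and availability are related by the standard formula availability = MTBF / (MTBF + MTTR). A quick sketch with invented outage data:

    # Invented outage log: (uptime_hours_before_failure, repair_hours).
    incidents = [(700, 2.0), (650, 1.5), (720, 0.5)]

    mtbf = sum(up for up, _ in incidents) / len(incidents)    # mean time between failures
    mttr = sum(rep for _, rep in incidents) / len(incidents)  # mean time to repair

    print(f"MTBF: {mtbf:.0f} h, MTTR: {mttr:.2f} h")
    print(f"Reliability ratio (MTBF/MTTR): {mtbf / mttr:.0f}")
    print(f"Availability: {mtbf / (mtbf + mttr):.3%}")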

Deployment Quality
Number of production deployment re-runs (deployment issues)
Ratio of failed deployments to total deployment attempts

Turnaround Time Metrics
Turnaround time for production issues (severity-wise)
Turnaround time for enhancement development (major/minor)
Planned vs. actual turnaround time for production issues (major/minor)

Cost of Quality
Cost of preparing test strategy/plans
Cost of test development, test management, and defect tracking
Cost of test execution and defect reporting
Cost of static validations such as reviews, walkthroughs, and inspections
Cost of analysis, debugging, fixing, and retesting
Cost of tools
Cost of training
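These buckets map onto the classic prevention/appraisal/failure split of cost of quality; a sketch that totals them (all amounts invented):

    # Invented cost figures grouped by the classic cost-of-quality categories.
    coq = {
        "prevention": {"test strategy/plans": 5000, "training": 3000},
        "appraisal":  {"test development": 12000, "execution/reporting": 9000,
                       "reviews/inspections": 4000, "tools": 6000},
        "failure":    {"analysis/debug/fix/retest": 15000},
    }
    total = sum(sum(items.values()) for items in coq.values())
    for category, items in coq.items():
        share = sum(items.values()) / total
        print(f"{category}: {share:.0%} of total quality cost")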

44 seconds to check if you are an Agile team

Does having Daily Stand-up, Planning, and Retrospective ceremonies make a team Agile?

Just spend a few seconds responding to these 11 quick checkpoints to find out whether you are an Agile team. If your response is NO to any question, you may want to reconsider your claim of being Agile. (A small sketch for tallying the responses follows the checklist.)

Quick Check Items (your response: Yes/No)

1. Do the team members have direct access to the Product Owner?
2. Do the team members know the team's velocity?
3. Do the team members estimate User Stories without influence from supervisors?
4. Do the team members help each other?
5. Do the team members respect the viewpoints and skills of other team members?
6. Do the team members have the courage to admit mistakes without fear?
7. Is the iteration (sprint) length under six weeks?
8. Does the team track and improve on the action items identified in the Retrospective meeting?
9. Does the team use various burn-down charts?
10. Is the team empowered to push back on scope changes during the iteration?
11. Are items in the Product Backlog prioritized by their business value?
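As mentioned above, tallying the responses is easy to automate; a tiny sketch where any "No" flags the claim:

    # Responses to the 11 checks above; values here are just an example.
    responses = {q: "Yes" for q in range(1, 12)}
    responses[7] = "No"  # e.g. sprints longer than six weeks

    failed = [q for q, answer in responses.items() if answer != "Yes"]
    if failed:
        print(f"Reconsider your claim of being Agile (check items: {failed})")
    else:
        print("All 11 checks passed.")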

These 11 questions look at your existing process from different perspectives to evaluate agility. Leave your comments if you have experience working with Agile teams.