1. Planning (Product Definition Phase)
1.1. Requirements definition begins.
1.2. High-level Test Plan (includes multiple test cycles, automation, integration).
1.3. Quality Assurance Plan (quality goals, beta criteria, etc.).
1.4. Identify when reviews will be held.
1.5. Defect tracking and problem reporting procedures.
1.6. Identify automation tools and begin automation planning.
1.7. Identify acceptance criteria (entrance/exit) for QA and users.
1.8. Identify application testing databases.
1.9. Identify measurement criteria, e.g. defect quantities, severity levels, and defect origin.
1.10. Identify metrics for the project (traceability, code coverage, defect metrics, ROI, etc.).
1.11. Begin the overall testing project schedule (time, resources, etc.).
1.12. Requisite: Review Product Definition Document
    1.12.1. QA input to the document as part of the Process Improvement Project.
    1.12.2. Help determine scope issues based on features of the product.
    1.12.3. Approximately 5-10 hours per month.
1.13. Plan management of all test cases in a database, both manual and automated.

2. Analysis (External Document Phase)
2.1. Develop the functional validation matrix based on business requirements.
2.2. Develop the test case format, with time estimates and priority assignments (a minimal record sketch follows Section 3).
2.3. Develop test cycle matrices and timelines.
2.4. Begin writing test cases based on the functional validation matrix.
2.5. Map baseline data to test cases, and test cases to business requirements.
2.6. Identify test cases to automate.
2.7. Automation team begins to set up variable files, GUI maps, and high-level scripts in Certify, WR, QTP, etc.
2.8. Set up TRACK and AutoAdviser for tracking components of the automated system.
2.9. Define the areas for stress and performance testing.
2.10. Begin development of the baseline database per test case data requirements.
2.11. Define procedures for baseline data maintenance, i.e. backup, restore, and validation.
2.12. Begin planning the number of test cycles required for the project, including regression testing.
2.13. Begin review of documentation, i.e. functional design, business requirements, product specifications, product externals, etc.
2.14. Review test environments and the test lab, both front end and back end.
2.15. Prepare to use the McCabe tool to support development in white-box testing and code complexity analysis.
2.16. Set up Requisite and start inputting documents.
2.17. Requisite: Review Externals Document
    2.17.1. QA input to the document as part of the Process Improvement Project.
    2.17.2. Start writing test cases from Action Response Pair Groups.
    2.17.3. Start developing metrics based on the estimated number of test cases, the time to execute each case, and whether it is automatable.
    2.17.4. Define baseline data for each test case.
    2.17.5. Approximately 25 hours per month.

3. Design (Architecture Document Phase)
3.1. Revise the Test Plan based on changes.
3.2. Revise test cycle matrices and timelines.
3.3. Verify that the Test Plan and test cases are in a database or other tool.
3.4. Revise the functional matrix.
3.5. Continue writing test cases and add new ones based on changes.
3.6. Develop risk assessment criteria.
3.7. Formalize details for automated testing and multi-user testing.
3.8. Select the set of test cases to automate and begin scripting them.
3.9. Formalize details for stress and performance testing.
3.10. Finalize test cycles (number of test cases per cycle, based on time estimates per test case and priority).
3.11. Finalize the Test Plan.
3.12. (Estimate resources to support development in unit testing.)
3.13. Requisite: Review Architecture Document
    3.13.1. QA input to the document as part of the Process Improvement Project.
    3.13.2. Identifies the actual components or modules that development will code.
    3.13.3. Unit testing standards defined here: pass/fail criteria, etc.
    3.13.4. Unit testing reports: what they will look like, for both white-box and black-box testing, including inputs/outputs and all decision points.
    3.13.5. List of modules that will be unit tested.
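Items 2.1-2.5 and 2.17.3 call for a functional validation matrix that ties each test case to the business requirements it covers, with a priority, a time estimate, and an automation flag. The sketch below is one minimal way to represent such a record and check requirement coverage; the field names, IDs, and values are illustrative assumptions, not anything prescribed by this plan.

```python
# Minimal sketch of a functional-validation-matrix record: each test case
# is traced to the business requirements it covers, carries a priority and
# time estimate (used to size test cycles), and flags whether it is a
# candidate for automation. Field names and IDs are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestCase:
    case_id: str                      # e.g. "TC-0042"
    requirement_ids: List[str]        # business requirements covered
    description: str
    priority: int                     # 1 = must run every cycle
    estimate_minutes: int             # feeds cycle/timeline planning
    automatable: bool = False
    baseline_data: List[str] = field(default_factory=list)  # baseline records used

def coverage_gaps(requirements: List[str], cases: List[TestCase]) -> List[str]:
    """Requirements with no test case traced to them (traceability check)."""
    covered = {r for c in cases for r in c.requirement_ids}
    return [r for r in requirements if r not in covered]

# Example: one requirement covered, one gap reported.
cases = [TestCase("TC-0001", ["BR-010"], "Login with valid user",
                  priority=1, estimate_minutes=10, automatable=True)]
print(coverage_gaps(["BR-010", "BR-011"], cases))   # -> ['BR-011']
```

Keeping the matrix in a simple structure like this (or the equivalent database table) also supports the cycle planning in 2.12 and 3.10, since cycle sizes can be summed directly from the priority and estimate fields.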
4. Construction (Unit Testing Phase)
4.1. Complete all plans (manual and automated).
4.2. Complete test cycle matrices and timelines.
4.3. Complete all test cases (manual and automated).
4.4. Complete scripting of the first set of automated test cases (data-driven, optimization).
4.5. Complete plans for stress and performance testing.
4.6. Begin stress and performance testing.
4.7. McCabe tool support: supply metrics.
4.8. Test the automated testing system and fix bugs.
4.9. (Support development in unit testing.)
4.10. Run the QA acceptance test suite to certify that the software is ready to turn over to QA.

5. Test Cycle(s) / Bug Fixes (Re-Testing / System Testing Phase)
5.1. Test Cycle I: run the first set of test cases (front end and back end).
5.2. Report bugs.
5.3. Bug verification (ongoing activity).
5.4. Revise test cases as required.
5.5. Add test cases as required.
5.6. Test Cycle II.
5.7. Test Cycle III.

6. Final Testing and Implementation (Code Freeze Phase)
6.1. Execute all front-end test cases, manual and automated.
6.2. Execute all back-end test cases, manual and automated.
6.3. Execute all stress and performance tests.
6.4. Provide ongoing defect tracking metrics.
6.5. Provide ongoing complexity and design metrics.
6.6. Update estimates for test cases and test plans.
6.7. Document test cycles and regression testing, and update accordingly.

7. Post Implementation
7.1. Post-implementation evaluation meeting to review the entire project (lessons learned).
7.2. Prepare the final Defect Report and associated metrics (a minimal tally sketch follows this outline).
7.3. Identify strategies to prevent similar problems in future projects.
7.4. Create a plan, with goals and milestones, for improving processes.
7.5. McCabe tools: produce final reports and analysis.
7.6. Automation team: 1) review test cases to evaluate which additional cases should be automated for regression testing, 2) clean up automated test cases and variables, and 3) review the process of integrating results from automated testing with results from manual testing.
7.7. Test lab and testing environment: clean up the test environment, tag and archive tests and data for the release, restore test machines to baseline, etc.
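Items 1.9, 6.4, and 7.2 call for defect metrics broken out by quantity, severity, and origin. The sketch below is one minimal way to tally those counts from a defect-tracker export; the file name and the "severity"/"origin" column names are assumptions for illustration, not the schema of any particular tracking tool.

```python
# Minimal sketch: tally defects by severity and by origin from a
# defect-tracker CSV export. The column names "severity" and "origin"
# and the file name are assumptions; substitute whatever the tracking
# tool actually exports.
import csv
from collections import Counter

def defect_metrics(csv_path: str):
    by_severity, by_origin = Counter(), Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            by_severity[row["severity"]] += 1
            by_origin[row["origin"]] += 1   # e.g. requirements, design, code
    return by_severity, by_origin

if __name__ == "__main__":
    sev, org = defect_metrics("defects.csv")
    print("Defects by severity:", dict(sev))
    print("Defects by origin:  ", dict(org))
```

The same counts, produced per test cycle rather than once at the end, also serve the ongoing reporting in 6.4 and the lessons-learned review in 7.1.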