Abstract
This article discusses Automated Software Testing as a proposed solution to the ever-increasing testing problem. The proposal is backed by a background discussion of the current testing problem, supported by results of a recent IDT survey.
Automated Software Testing refers to the "Application and implementation of software technology to allow for automation throughout the entire software testing lifecycle (STL) with the goal to improve the STL efficiencies and effectiveness."
We discuss the importance of Automated Software Testing as part of the system engineering lifecycle and describe the return on investment (ROI) of some efforts undertaken thus far, as well as other benefits. Additionally, this article covers some automated software testing pitfalls to avoid and how to accomplish successful automated software testing.

Problem: Too much time is spent on software testing!
Too much time is spent on software testing. As software programs increase in complexity, testing times have only increased. As stated by Hailpern and Santhanam: "... debugging, testing, and verification activities can easily range from 50 to 75 percent of the total development cost."1
One recent testing improvement initiative is the establishment of a task force to improve development test and evaluation. A "Memorandum for Chairman, Defense Science Board" with the subject "Terms of Reference – Defense Science Board (DSB) Task Force on Development Test and Evaluation (DT&E)" states that "approximately 50% of programs entering Initial Operational Test and Evaluation (IOT&E) in recent years have not been evaluated as Operationally Effective or Operationally Suitable." Because of these findings, the memorandum, dated 2007, requested that the "DSB establish a task force to examine T&E roles and responsibilities, policy and practices, and recommend changes that may contribute to improved success in IOT&E along with quicker delivery of improved capability and sustainability to Warfighters."
Evidence of another improvement initiative appeared in the Washington Post, June 3, 2007: "The IRS has launched an initiative to enhance and expand current testing by integrating industry best testing practices to gain efficiencies that improve our overall testing processes."
The outcome of a recent software testing survey2 conducted by IDT, LLC3 supports the findings of long software test timelines and high testing time percentages relative to the rest of the software engineering lifecycle. The survey's goal was to determine software testing related issues in order to derive needed solutions, while reaching as many software testers (across a wide demographic) as possible: it was sent to tens of thousands of test engineers, posted on Quality Assurance (QA) user sites, and advertised on various Government tech sites. So far, there are nearly 280 responses from all over the world: 74% of the responses are from the U.S. and 26% are from other countries, such as India, Pakistan, Canada, South Africa, and China, as well as Europe. More than 50% of survey respondents work for organizations of 1,000 or more employees.
The survey contained various software testing related questions; in particular, the responses to "Time currently spent on testing in relationship to overall software development lifecycle" are listed in Table 1: almost 50% state that 30-50% of time is spent on software testing in relation to the overall software development lifecycle, and nearly 25% state that more than 50% of time is spent on it.

Automated Software Testing as Part of the System Engineering Lifecycle
Automated Software Testing success increases when it is implemented as part of the system engineering lifecycle. This includes developer involvement: starting with automated unit testing and integration testing, then building on those initial tests to automate the system testing. Additionally, automated testing as part of the system engineering lifecycle requires stakeholder understanding of what Automated Software Testing entails. Developers need to keep application testability in mind when developing software. They need to understand, for example, how a change in a GUI control/widget implementation could affect existing automated scripts, or how logging is required for test results evaluation. Project managers need to include Automated Software Testing efforts in schedules and budgets. Test managers need to hire qualified Automated Software Testing personnel, and so forth. Figure 1 shows the Automated Testing Lifecycle that parallels the system engineering lifecycle.4
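The "start with automated unit testing" step above can be sketched with Python's built-in unittest framework. The Calculator class here is a hypothetical unit under test introduced purely for illustration; it does not come from the article:

```python
import unittest


class Calculator:
    """Hypothetical unit under test (illustration only)."""

    def add(self, a, b):
        return a + b


class TestCalculator(unittest.TestCase):
    """Automated unit test a developer would write early in the lifecycle."""

    def test_add(self):
        # Verify a simple, repeatable expectation with no manual steps.
        self.assertEqual(Calculator().add(2, 3), 5)


if __name__ == "__main__":
    # exit=False lets the script continue after the test run.
    unittest.main(exit=False, verbosity=2)
```

Tests like this can later be wired into an automated build verification process, so each build reruns them without manual effort.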
Automated Software Testing can be effectively applied to all software testing phases that run in parallel to the system engineering lifecycle: for example, developing an automated requirements traceability matrix (RTM) via a Requirements Management System during the requirements phase; automated build verification processes that include automated unit tests during the development phase; and defect tracking, test status reporting, and metrics collection during the testing phase.
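The automated requirements traceability idea can be sketched as a small script that maps requirement IDs to automated tests and flags untraced requirements. All requirement IDs and test names below are hypothetical sample data, not taken from the article or any particular Requirements Management System:

```python
# Hypothetical RTM sketch: which requirements are exercised by which tests?
requirements = {
    "REQ-001": "User login",
    "REQ-002": "Password reset",
    "REQ-003": "Audit logging",
}

# Mapping from automated test name to the requirements it covers (sample data).
test_coverage = {
    "test_login_succeeds": ["REQ-001"],
    "test_reset_email_sent": ["REQ-002"],
}

# Collect every requirement referenced by at least one automated test.
covered = {req for reqs in test_coverage.values() for req in reqs}

# Requirements with no automated test are traceability gaps to report.
untraced = [rid for rid in requirements if rid not in covered]

for rid in untraced:
    print(f"UNTRACED: {rid} - {requirements[rid]}")
```

In practice this mapping would be exported from a Requirements Management System rather than hard-coded, but the gap report is the essence of an automated RTM.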