Automated Software Testing Life Cycle: Part II

Posted by Lisa Corkren in Automation, Automated Software Testing Life Cycle

The automated testing life cycle has many phases. Recently, I wrote an article on the first phase, which covered scoping the project and the Proof of Concept (POC). That phase involves gathering all the information necessary to evaluate the time constraints and the processes that the testing team or client wants implemented in a project.

The next phase is construction and verification. Among all the steps of software testing, specification documents are a necessary part of the equation. In this phase, test plans and test scripts are created. The development team writes detailed specifications and starts coding the application, so this is the time to perform static testing of the specifications and code through reviews. The following sections describe common review techniques used during construction and verification.

Inspections: Inspections involve a team led by a trained moderator (not the author of the document). The team formally reviews the documents and work products during various phases of the product development life cycle. The benefit of this process is that the experience and knowledge of peers produce a higher level of quality. Bugs discovered during the review are documented and communicated to the next level so they can be addressed. (Graham, 2007)

Walkthroughs: A walkthrough is a step-by-step presentation that is less formal than an inspection. During the walkthrough meeting, the author introduces the material to the participants to familiarize them with the application under test. Even though walkthroughs can help find potential bugs, they are mostly used to gather information, establish an understanding of the content, and further communication. (Graham, 2007)

Buddy Checks: Buddy checks are a form of proofreading, the simplest type of review activity used to find bugs in a work product during verification. A person with a similar background goes through the documents prepared by the author to find any mistakes or bugs the author could not find on their own.

Verification: Verification activities include requirement specification verification, functional design verification, internal/system design verification, and code verification. Each activity makes sure that the product is developed the right way and that every requirement, specification, design, and piece of code is verified. (Graham, 2007)

Construction: Once static testing has determined the project is ready to move forward, we need to evaluate which test cases can be automated.

The following questions serve as a guideline to help determine what should be automated:

  • Does this test have the potential for modularity and portability?
  • Is this test part of regression or build testing?
  • Does this test cover the most critical feature paths?
  • Does this test cover high risk areas?
  • Is this test expensive to perform manually?
  • Is this test part of a performance test?
  • Are there timing-critical components or dependencies that must be automated?
  • Does the test cover the most complex area?
  • Does the test require many data combinations using the same steps (see the sketch after this list)?
  • Are the expected results constant?
  • Is the test outcome analysis overly time-consuming?
  • Does the test need to be verified on multiple software platforms?
  • Does the test automation ROI look promising and meet organizational criteria?
  • What are the clients' testing tools?
  • Which application interface can be used for testing (web, GUI)?
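
To make the data-combination question concrete, here is a minimal sketch of a data-driven test using pytest's parametrize feature. The LoginResult type, the login_to_app stub, and the credential data are hypothetical placeholders standing in for the real application under test, not part of any specific tool or project.

    import pytest
    from collections import namedtuple

    # Hypothetical result type and login driver standing in for the real
    # application under test (which might be driven through a GUI or web tool).
    LoginResult = namedtuple("LoginResult", "success")

    def login_to_app(username, password):
        # Illustrative stub: only one credential pair is treated as valid.
        return LoginResult(success=(username == "standard_user" and password == "valid_pass"))

    # Many data combinations, one set of steps: a strong automation candidate.
    CREDENTIALS = [
        ("standard_user", "valid_pass", True),
        ("standard_user", "wrong_pass", False),
        ("locked_user",   "valid_pass", False),
    ]

    @pytest.mark.parametrize("username, password, should_succeed", CREDENTIALS)
    def test_login_combinations(username, password, should_succeed):
        assert login_to_app(username, password).success == should_succeed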

When asking these questions, we will probably conclude that not every test should be automated; evaluating the list above will help us choose the tests that bring the greatest ROI (a rough ROI example follows below). Construction and verification will help your team understand the goals of the software, build confidence in the application, and raise the morale of the testing team. Choosing to automate tests that meet these guidelines will save resources as well as time.
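
To make the ROI question concrete, here is a rough back-of-the-envelope sketch. All of the figures are illustrative assumptions rather than measurements from a real project; the idea is simply that automation pays off once its one-time scripting cost plus its per-run cost falls below the cumulative cost of running the same test manually.

    # Rough, hypothetical ROI check for a single candidate test.
    # All figures below are illustrative assumptions, not real measurements.
    hours_to_automate   = 8.0    # one-time cost to script and stabilize the test
    manual_run_hours    = 0.5    # effort to execute the test by hand once
    automated_run_hours = 0.05   # effort to trigger and review one automated run
    planned_runs        = 40     # expected executions over the release cycle

    manual_cost    = planned_runs * manual_run_hours
    automated_cost = hours_to_automate + planned_runs * automated_run_hours

    print(f"Manual: {manual_cost:.1f} h, Automated: {automated_cost:.1f} h")
    # Manual: 20.0 h, Automated: 10.0 h -> automating this test pays off here.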


About the Author: Lisa Corkren is a technical lead for DeRisk IT Inc. who specializes in automated testing strategies, project management, and SmartBear's TestComplete. She has worked on numerous projects involving both manual and automated testing.

Note: DeRisk IT is now known as DeRisk QA.