QA Testing Process
We currently use the following process for testing:

1. Write a test plan (currently stored in Microsoft Test Manager)
2. Execute the test plan
3. Once the tested functionality is promoted to Production, automate the test plan using Borland SilkTest for future regression testing

Question: Is this a typical process for a shop that uses automation? As the original test plan changes (due to future modifications), do you continue to maintain both the manual test plan and the automated one?

Thanks!
It is probably different for each organization/team/product/process, but here is a typical process I have used (starting from new functionality):

1. Identify the key test parameters, such as oracles, surfaces (variables like platform, inputs, outputs, etc.), and risk areas.
2. Explore the function under scrutiny using multiple manual test sessions.
3. Develop automation to make testing that function faster, or identify functions that would make a good addition to a fully automated smoke test suite.
4. Integrate the automation into the regression test automation suite, either as smoke tests or as independent computer-assisted test modules that help set up or check specific areas of the application. This step is done as part of the weekly or biweekly cycle while the work is still fresh.
5. Document only enough to identify which automation to use with each regression test area. The function of each script is documented as comments in the script itself.

The key here is that regression automation is going to break on a regular basis, so a smart mixture of automated and manual tests for regression is best: as useful as possible while still maintainable. The scope of the automation will work itself out over time based on the time you have available for implementation and maintenance.

Hope that helps!
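To make the "document the script's function as comments" point concrete, here is a minimal sketch of what one smoke test module might look like. Everything in it is hypothetical: the module name, the `FakeApp` stand-in, and the credentials are illustrative only, not part of the original answer (a real suite would drive the actual product, e.g. through SilkTest or an API client).

```python
# smoke_auth.py -- illustrative smoke test module (all names hypothetical).
# Per the step above, the script's function is documented here as comments:
# this module covers the "Authentication" regression area with fast, shallow
# checks -- a known-good login must succeed, a bad one must be rejected --
# which is what makes it suitable for a fully automated smoke suite.


class FakeApp:
    """Stand-in for the application under test, used only for this sketch."""

    def __init__(self):
        self._users = {"qa_user": "s3cret"}

    def login(self, user, password):
        return self._users.get(user) == password


def smoke_test_authentication(app):
    """Return True if the authentication surface passes its smoke checks."""
    if not app.login("qa_user", "s3cret"):
        return False  # known-good credentials must work
    if app.login("qa_user", "wrong"):
        return False  # bad credentials must be rejected
    return True
```

Because the module's purpose is stated in its header comments, the external documentation only needs to map the "Authentication" regression area to `smoke_auth.py`, which keeps the paperwork from duplicating the manual test plan.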