How to Organize Tests for Software That's Already Developed?



  • I've managed tests for a project that I was with since its infancy. I used Microsoft Test Manager to tie tests to requirements and organize them in folders, and it went well. Now, however, I'm being tasked with organizing tests for a program that has been in production for a decade and is much more complex. The regression suite we have is just a list of almost 7,000 steps in an Excel sheet, not even broken into individual tests. I'm looking for any advice on how to manage this project: I want to organize the test cases without wasting a lot of time. Thoughts?

    Something else to consider: I've also been tasked with automating the majority of these tests (using TestComplete). Organizing the existing regression tests would be faster than automating them, but automation will naturally result in organization. Should I propose focusing on well-documented automation instead of keeping a large manual regression suite?



  • As the other answers have said, your first task is to organize the list. There are several reasons you want to do this:

    - The items may not be accurate. This is ridiculously common: the people who execute the steps know enough about the system to adjust for flaws in the list and act accordingly.
    - The items may not be relevant. This is another common problem with long manual regression suites. They tend not to be updated, so they specify actions or features the application no longer supports.
    - There's a lot of repetition. Chances are a document like your 7,000-step spreadsheet grew organically, and its lack of organization makes it a serious pain to find out whether a condition is already covered, so the test gets added again. It has also probably never been reorganized, so there will be a whole lot of similar actions that could be handled more effectively as a group.
    - It's the easiest way to figure out what's in it. I'm speaking from experience here: with something like this, the easiest way to work out what's there is to go through it, grouping and organizing as you go.
    - The tests may not be suited for automation. Good candidates for automation and existing manual regression test cases are not the same thing. There may be very little overlap (and that's a separate problem).

    This is going to be a major time sink no matter what you do, but organizing the list of test cases will give you a feel for the problem areas of the application (because I guarantee your spreadsheet grew by adding reported bugs to the list) and the areas where automated regression is best targeted. It will also let you identify redundancies and tests that are no longer relevant or valid. You can script part of that first pass; see the triage sketch after this post.

    The structure you use to organize the tests can then be echoed in your automation (there's a sketch of that below as well), and you'll start that project with a much clearer idea of what's needed, which will let you target your automation much more cleanly. TestComplete does support manual test suites if you wish to use them. If the project team has a lifecycle management tool in place (Team Foundation Server with Microsoft Test Manager, QAComplete with TestComplete, HP ALM, or a home-grown setup), I'd use that, unless the "tool" in place is more spreadsheets. If they don't have one, consider something like TestLink as your organizing tool; to be honest, even a set of smaller spreadsheets on a network share would be better than what you have right now. Regardless of the tool you choose, you're still going to want that list tamed before you start automating.
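
    A first pass over a sheet that size is worth scripting before anyone reads it line by line. Below is a minimal triage sketch in Python using openpyxl; the file name regression_steps.xlsx, and the assumption that each step sits in column A of the first worksheet, are mine rather than anything from the question, so adapt them to the real layout.

```python
# Minimal triage sketch for a large regression-step spreadsheet.
# Assumptions (not from the original post): the workbook is named
# "regression_steps.xlsx" and each step occupies column A of the
# first worksheet, one step per row.
import re
from collections import Counter, defaultdict

from openpyxl import load_workbook


def normalize(step: str) -> str:
    """Lower-case, collapse whitespace, and drop punctuation so that
    near-identical steps compare equal."""
    step = re.sub(r"\s+", " ", step.lower().strip())
    return re.sub(r"[^\w\s]", "", step)


wb = load_workbook("regression_steps.xlsx", read_only=True)
ws = wb.active
steps = [str(row[0].value)
         for row in ws.iter_rows(max_col=1)
         if row[0].value is not None]

# 1. Repetition: the same normalized text appearing many times is a
#    strong hint the sheet grew by copy-paste.
counts = Counter(normalize(s) for s in steps)
repeated = {text: n for text, n in counts.items() if n > 1}
print(f"{len(steps)} steps, {len(repeated)} distinct steps repeated")

# 2. Crude grouping by leading verb ("click", "verify", "enter", ...)
#    gives a first feel for what kinds of actions dominate the list.
groups = defaultdict(list)
for s in steps:
    norm = normalize(s)
    verb = norm.split(" ", 1)[0] if norm else "(blank)"
    groups[verb].append(s)

for verb, items in sorted(groups.items(), key=lambda kv: -len(kv[1]))[:10]:
    print(f"{verb:<12} {len(items):>5} steps")
```

    The numbers this prints won't organize anything by themselves, but they show where the duplication lives and which action groups are biggest, which is exactly where the manual grouping effort pays off first.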
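
    Once the grouping exists, it can be echoed one-for-one in the automation project. The sketch below shows what that might look like in TestComplete's Python scripting, where Log, TestedApps, and Aliases are globals the runtime injects into script units; every name under them here (OrderEntry, LoginForm, the control aliases) is a hypothetical Name Mapping entry, not anything from the question.

```python
# Sketch only: "OrderEntry", "LoginForm", "MainForm" and the control
# aliases are hypothetical Name Mapping entries. Log, TestedApps and
# Aliases are provided by the TestComplete runtime, so no imports.

def test_orderentry_login():
    """Regression group: Order Entry > Login.

    Mirrors the matching section of the reorganized spreadsheet, so a
    failure here points straight back to the manual steps it covers.
    """
    TestedApps.OrderEntry.Run()            # launch the app under test
    login = Aliases.OrderEntry.LoginForm   # mapped login dialog
    login.UserName.Keys("qa_user")         # type into the mapped fields
    login.Password.Keys("secret")
    login.OKButton.Click()

    # WaitAliasChild returns a stub whose Exists is False on timeout
    # instead of raising, which keeps the check non-fatal.
    if Aliases.OrderEntry.WaitAliasChild("MainForm", 10000).Exists:
        Log.Checkpoint("Login succeeded")
    else:
        Log.Error("Main form did not appear after login")
```

    One script unit (or keyword-test folder) per spreadsheet section keeps the mapping between manual and automated coverage obvious, so "is this already automated?" has the same answer in both places.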


