Should you test during regression testing, or just verify/check?



  • When conducting regression tests, the goal is to find out whether the new release affects existing functionality of the application. Is it enough just to check/verify the application, or should one also test it by trying to break it? By definition, verification just confirms that something is working. When testing, however, we try to break things by feeding in unexpected inputs (boundary values, for example) and observing the behavior of the system. Automated regression tests are examples of checking/verification, since we assert that existing features work, such as clicking a button and submitting text. But is it then enough during regression testing to just run the automated regression checks, or should one also try to break the system by probing its robustness?
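
    To make the distinction concrete, here is a minimal pytest sketch; the submit_comment function and its 280-character limit are invented purely for illustration. The first test is a check that re-asserts known behavior, while the parametrized test probes boundary values and unexpected inputs.

```python
# A minimal sketch of "checking" vs "testing", using pytest.
# submit_comment is a hypothetical stand-in for an existing "submit a text" feature;
# its rules (non-empty, at most 280 characters) are made up for this example.
import pytest


def submit_comment(text):
    """Hypothetical feature under test."""
    if not isinstance(text, str):
        raise TypeError("text must be a string")
    trimmed = text.strip()
    if not trimmed:
        raise ValueError("text must not be empty")
    if len(trimmed) > 280:
        raise ValueError("text is too long")
    return trimmed


def test_submit_comment_still_works():
    # Checking/verification: assert the known, expected behavior still holds.
    assert submit_comment("hello") == "hello"


@pytest.mark.parametrize("bad_input", ["", "   ", "x" * 281, None])
def test_submit_comment_rejects_unexpected_input(bad_input):
    # Testing: boundary values and unexpected inputs, trying to break the feature.
    with pytest.raises((ValueError, TypeError)):
        submit_comment(bad_input)
```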



  • Yes, by definition 'regression' is all about making sure that new changes are not causing any side effects in existing functionality.

    Having said that, this is what we follow as part of regression:

    Manually:

    • Test all the impacted areas as part of the task (feature/bug) itself. We talk to the respective devs to find out all of the 'impacted areas' (areas impacted by the changes made in this task) and include them in our testing.
    • So, in a way, we do a quick regression as part of testing that task itself.
    • Create a test case sheet covering all of the possible cases (see the small sketch after this list for one way a row could be represented).
    • Test cases are categorized into: Smoke, Regression, Automation not required.
    • Smoke tests - Basic Add/Edit/Delete operations.
    • Regression tests - In-depth testing, focused on breaking the application with all possible scenarios.
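
    The test case sheet itself is maintained outside the code, but purely as an illustration, a row and its category could be represented like this (the field names and example entries below are invented; only the three categories come from the list above):

```python
# Illustrative only: a possible shape for test case sheet rows and their categories.
from dataclasses import dataclass

CATEGORIES = ("Smoke", "Regression", "Automation not required")


@dataclass
class TestCase:
    title: str
    notes: str
    category: str  # one of CATEGORIES


sheet = [
    TestCase("Add an item", "Basic create with valid data", "Smoke"),
    TestCase("Edit an item", "Basic update of an existing item", "Smoke"),
    TestCase("Delete an item", "Basic delete of an existing item", "Smoke"),
    TestCase("Add an item with an extremely long name", "Boundary/negative input", "Regression"),
    TestCase("Layout check on small screens", "Manual visual inspection", "Automation not required"),
]

# The 'Smoke' subset is what gets automated first (see the Automation steps below).
smoke_cases = [tc for tc in sheet if tc.category == "Smoke"]
```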

    Automation:

    • Once the first round of manual testing is completed and the build is stable, we start automating the feature.
    • First we automate the smoke tests so that, once they are ready, we can run them on every build to make sure the build is stable and can be used for further testing. We maintain a single smoke test file which contains all the tests marked as 'smoke' for every feature.
    • Once smoke test automation is completed, we start automating the 'regression' tests from the test case sheet.
    • Once the regression tests are automated, we include them in the regression suite, which we run nightly (see the sketch after this list for one way the tagging and suite selection could look).
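
    As a rough illustration of how the tagging and suite selection could look (pytest markers are used here as an example; the feature code and test names are made up, and this is a sketch rather than our actual setup):

```python
# Sketch: smoke vs. regression tests distinguished by pytest markers.
# The markers would be registered in pytest.ini, e.g.:
#   [pytest]
#   markers =
#       smoke: basic Add/Edit/Delete checks, run on every build
#       regression: in-depth scenarios, run in the nightly suite
import pytest

_items = {}  # stand-in for the application under test


def add_item(key, value):
    """Hypothetical 'Add' feature."""
    _items[key] = value
    return value


@pytest.mark.smoke
def test_add_item_basic():
    # Smoke: a basic Add operation, asserted on every build.
    assert add_item("name", "value") == "value"


@pytest.mark.regression
@pytest.mark.parametrize("key", ["", "x" * 10_000, None])
def test_add_item_with_unusual_keys(key):
    # Regression: in-depth scenarios that try to break the feature.
    assert add_item(key, "value") == "value"
```

    The smoke suite would then be selected on every build with 'pytest -m smoke', and the nightly regression job would run 'pytest -m regression'.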

    Long story short: for us, regression is not only about checking for side effects but also about running tests that try to break the application with all possible combinations/scenarios.


