Vulnerability regression testing in an agile environment



  • Starting point: Our test team uses several different solutions to test our websites for vulnerabilities in different ways.

    Naturally, we also want to retest previously found bugs that are tracked in the backlog once a fix has been deployed.

    Tooling: We use OWASP ZAP and Wapiti to perform vulnerability testing, each running in a Docker environment.

    We write all of our test cases ourselves, at the Python level.
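    To make the idea concrete, here is a minimal sketch (not the poster's actual code) of what such a Python-level regression check could look like: it parses a ZAP JSON report and asserts that vulnerabilities already marked as fixed in the backlog have not reappeared. The FIXED_ALERTS mapping and the sample report are made-up illustrations; the report keys follow ZAP's JSON report format.

    ```python
    import json

    # Hypothetical backlog export: (alert name, URL) pairs that were fixed earlier.
    FIXED_ALERTS = {
        ("X-Content-Type-Options Header Missing", "https://example.test/login"),
    }

    def regressions(report_json: str) -> set:
        """Return fixed alerts that show up again in a new ZAP JSON report."""
        report = json.loads(report_json)
        seen = set()
        for site in report.get("site", []):
            for alert in site.get("alerts", []):
                for inst in alert.get("instances", []):
                    seen.add((alert.get("name"), inst.get("uri")))
        return FIXED_ALERTS & seen

    # Sample report in which the previously fixed alert has come back.
    sample = json.dumps({
        "site": [{
            "alerts": [{
                "name": "X-Content-Type-Options Header Missing",
                "instances": [{"uri": "https://example.test/login"}],
            }]
        }]
    })
    assert regressions(sample) == FIXED_ALERTS
    ```

    A check like this can run after every scan, so a reappearing vulnerability fails the pipeline just like any other regression.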

    The questions:

    1. How did you guys set up your vulnerability regression testing?
    2. When and how do you deploy the regression tests?
    3. How did you use the regression tests in the Definition of Ready and Done respectively?
    4. Have you found a way to use OWASP ZAP appropriately in a regression pipeline as well?

    The OWASP ZAP team itself has also been working on this topic since June/July 2021, which I would like to point out here as well.

    We actually have a Google Summer of Code project for retesting vulnerabilities found by ZAP scans.

    The blog https://pranavsaxena17.github.io/GSoC-with-ZAP/ is a bit light, but hopefully the student will update it soon. In any case, the project is progressing well.

    Here is an overview of the possible test approaches in the area of security testing.

    [Image: Quick glance at types of Security Testing]




  • How did you guys set up your vulnerability regression testing?

    I have used static analysis tools like HCL AppScan and SonarQube. These run against your source code and can be set up as "checks" in GitHub, Azure DevOps, etc.

    When and how do you deploy the regression tests?

    You can use a dynamic scanning tool like SortSite or ZAP.
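    As a hedged sketch of how a ZAP scan might be wired into a pipeline step: ZAP's Docker images ship a packaged baseline scan (zap-baseline.py) whose -t and -J flags select the target and a JSON report file. The target URL, working directory, and report name below are placeholders.

    ```python
    def zap_baseline_cmd(target_url: str, workdir: str,
                         report: str = "zap-report.json") -> list:
        """Build the docker argv for a ZAP baseline scan writing a JSON report."""
        return [
            "docker", "run", "--rm",
            # Mount the working directory so the report lands on the host.
            "-v", f"{workdir}:/zap/wrk:rw",
            "-t", "owasp/zap2docker-stable",
            "zap-baseline.py",
            "-t", target_url,   # target to scan
            "-J", report,       # JSON report file name
        ]
    ```

    In a CI step this list would be passed to subprocess.run(); the script exits non-zero when alerts at or above the configured threshold are found, which fails the pipeline stage.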

    How did you use the regression tests in the Definition of Ready and Done respectively?

    We tied scan results to a "release" (stored in a shared directory) and scanned periodically throughout the sprint, including on the last commit before release. My government contracts required that we send the results before they would deploy the code.

    Have you found a way to use Owasp ZAP appropriately in a regression pipeline as well?

    ZAP has a CLI. I have only run these scans manually, after the automated tests pass and confirm the code is stable enough, since the scans can take some time. You can run anything via CI/CD and shell commands, so if there is a results-parsing plugin for your tooling, you are all set. I know that if the results are in JUnit (XML) format, there is likely a plugin. I have done this with JMeter test results output in JUnit format, which were published in Azure DevOps.
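    As a small sketch of that last point: scan findings can be emitted as JUnit XML so a CI plugin (Azure DevOps, Jenkins, etc.) can publish them like ordinary test results. The findings list below is made-up sample data, not real scanner output.

    ```python
    import xml.etree.ElementTree as ET

    def findings_to_junit(findings):
        """One <testcase> per finding; still-open findings become <failure>s."""
        suite = ET.Element("testsuite", name="vulnerability-regression",
                           tests=str(len(findings)))
        for f in findings:
            case = ET.SubElement(suite, "testcase",
                                 classname=f["url"], name=f["name"])
            if f["open"]:
                fail = ET.SubElement(case, "failure", message=f["name"])
                fail.text = f"{f['name']} still present at {f['url']}"
        return ET.tostring(suite, encoding="unicode")

    xml = findings_to_junit([
        {"name": "Reflected XSS", "url": "/search", "open": True},
        {"name": "Missing CSP header", "url": "/", "open": False},
    ])
    assert xml.count("<failure") == 1
    ```

    The resulting file can be handed to whatever publishes JUnit results in your pipeline, so security regressions appear alongside functional test failures.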


