Is your QA Team effective?
I am finding that many of the QA people I have encountered are more verifiers than software-breaking experts.
What I mean by verifiers is that they step through all the scenarios provided, basically walking through the application and ensuring that it does what it is supposed to.
What I mean by breakers is that they verify, but they also diligently seek out scenarios that break the software and uncover defects.
Are your findings similar?
Of historical note on breaking software: there was a team inside IBM in the '80s called the Black Team. They had a culture of saying they had "succeeded" when they broke the software, to encourage the identification of defects. They considered their work a "failure" when they failed to find or identify any faults in the software. On the other hand, the outcome of their "failures" was great, reliable software...
And the book: "How to Break Software: A Practical Guide to Testing" by James Whittaker
emmalee
In many instances I'd be delighted if QA did that. My experience is that, if QA exists at all, it often consists of poorly paid agency recruits with the barest understanding of the business. Doing QA well is hard, just as programming is hard, yet it is rarely resourced or managed as the crucial part of the development process that it is.
On the other hand, the best QA team I worked with had 20+ PC ghost images and knew what to do with them; they were DB ninjas who could also communicate with end users. There was a testbed of machines with widely ranging hardware. The lead was strong and knew when to push and when to let it go; she knew the customers, and she knew how developers thought. We soon learned that our standards of "good enough" were not the same as the customers'. Unfortunately the product was still a shabby heap of junk, but that's business; it rarely crashed and the customers loved it.
I'm actually on a testing team, and a lot of what we do is verification and software breaking, but that's only part of it. We also maintain automated tests of the products so that bugs are caught earlier in the development cycle, and we run large scalability tests in cooperation with development to push the limits of our product and find the areas that can be updated to allow for further scalability in the future.
I think we're being pretty effective in our goal of getting the best product possible into the hands of our users, and simple verification is an important part of that process.
Also, I think that in a lot of ways QA is limited by the resources they are given. We're lucky enough to have a lot of access to development tools and large-scale virtual environments (as well as decent hardware-based machines), and we're given enough freedom to actually experiment with the product.
Some of the testing is, by necessity, a "run these test cases" type of deal, but that's only one portion.
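To make the verification/breaking distinction concrete, an automated regression suite like the one described above can encode both kinds of checks. A minimal sketch in Python's unittest, using a hypothetical `apply_discount` function as a stand-in for real product code:

```python
import unittest


def apply_discount(price: float, percent: float) -> float:
    """Hypothetical product function: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class DiscountRegressionTests(unittest.TestCase):
    # Verification: the happy path does what the spec says.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    # Breaking: probe the edges where defects tend to hide.
    def test_zero_and_full_discount(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)
        self.assertEqual(apply_discount(80.0, 100), 0.0)

    def test_out_of_range_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)


if __name__ == "__main__":
    unittest.main()
```

Suites like this run on every build, so a regression in previously working code is caught in the same cycle that introduced it rather than by a manual pass later.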
I believe our QA team is effective. But we don't employ testers; we employ QA Analysts who:
Write their own test cases, utilizing the Business Requirements and Technical Design documents provided for the project. QA is involved as early in the development cycle as possible.
Have intimate knowledge of the business.
Know the difference between unit testing, functional testing, systems integration testing, regression testing, etc. In other words, they've put effort into studying QA methodologies.
Regularly perform regression testing on the system. This ensures that new projects have not introduced errors in working code.
Have the respect of the development team, along with salaries which are comparable to development salaries. Because QA has the respect of the development team, developers are quick to ask QA for opinions on some code changes, especially because QA has a better general knowledge of the business overall than a particular developer, who is almost always specialized in a certain area.
A QA team composed of manual testers who work from developer-provided test cases and lack a grasp of QA methodology or the business domain may not be as effective.