How to test your tests without having the system under test?
emmalee
Problem

We and the devs are an Agile team, and we finally managed to work in parallel on development and testing within an iteration. While the developers were working on new features, we were automating test scenarios, preparing test data, implementing fixtures, and talking to the developers about exposing the interfaces those fixtures needed. So when the developers delivered the implemented features, we already had automated tests ready to run. That was very encouraging. However, we found some tests failing because of errors in the tests themselves, such as:

- NullPointerExceptions in fixtures
- configuration required by the tested component that the test did not provide
- small differences in the naming of expected results (the devs named things differently)

Since the devs delivered the feature on the last day of the iteration, we didn't have much time to fix our tests, and the stories were carried over to the next iteration. What are the ways to address this problem?

Proposed solutions

1. Make the devs deliver stories for testing more quickly, so we have time to verify our tests as well.
2. Test our tests before getting the feature from the devs. Some problems could indeed be avoided by unit testing the fixtures, but others would require mocking the system/feature. That's a lot more work on our side.
3. We focus on automating integration tests. Those are quite high-level tests, but maybe we are still too tightly coupled to the interfaces the devs expose to us, and should instead focus on less-coupled end-to-end tests?
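To illustrate solution 2, here is a minimal sketch of "testing the tests" before the real feature exists: the fixture is run against a hand-rolled stub of the interface the devs agreed to expose, which flushes out fixture bugs (NPEs, missing setup) days before delivery. All names here (`PaymentService`, `PaymentFixture`, `StubPaymentService`) are hypothetical, not from the original post.

```java
import java.util.HashMap;
import java.util.Map;

// The interface agreed with the devs, defined before the feature exists.
interface PaymentService {
    String charge(String accountId, int amountCents);
}

// Minimal stub standing in for the not-yet-delivered implementation.
class StubPaymentService implements PaymentService {
    @Override
    public String charge(String accountId, int amountCents) {
        if (accountId == null) {
            throw new NullPointerException("accountId");
        }
        return "OK-" + accountId + "-" + amountCents;
    }
}

// The test fixture whose own bugs (NPEs, missing configuration) we want to catch early.
class PaymentFixture {
    private final PaymentService service;
    private final Map<String, String> results = new HashMap<>();

    PaymentFixture(PaymentService service) {
        this.service = service;
    }

    void chargeAll(String... accountIds) {
        for (String id : accountIds) {
            results.put(id, service.charge(id, 100));
        }
    }

    Map<String, String> results() {
        return results;
    }
}

public class FixtureSelfTest {
    static void check(boolean condition, String message) {
        if (!condition) {
            throw new AssertionError(message);
        }
    }

    public static void main(String[] args) {
        // Exercising the fixture against the stub verifies the fixture itself
        // without the real system under test.
        PaymentFixture fixture = new PaymentFixture(new StubPaymentService());
        fixture.chargeAll("acct-1", "acct-2");
        check(fixture.results().size() == 2, "two results expected");
        check("OK-acct-1-100".equals(fixture.results().get("acct-1")), "acct-1 charged");
        System.out.println("fixture self-test passed");
    }
}
```

The stub only needs enough behaviour to exercise the fixture's paths; the naming-convention mismatches the post mentions would still need the agreed interface contract to be written down with the devs.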
Although it is an old question, I would like to offer my thoughts for future readers. I was in a similar situation numerous times; in fact, it is a very common situation for test engineers working in Agile. Our journey went roughly like this:

1. We all, as a team, recognised the issue. Dev and test agreed, and we planned to move a large number of tests from the low/middle (integration) level to the unit level, aiming for only one E2E UI test per user story.
2. We focused on designing more black-box automated tests that do not rely on internals. This holds at any level of testing.
3. We made sure (through code review) that all the boundary/edge cases were fully covered in unit tests. In unit tests, we also found we needed to focus on testing public interfaces rather than the internals directly.
4. We (testers) worked more closely with the devs and helped them develop, test, and review unit tests. During this period we moved closer to TDD (though not exactly). At this point, we also started asking ourselves: are we developers or testers?
5. After a few iterations, we ended up with the majority of tests as unit tests, fewer integration tests, and very few E2E tests. At that point, as a team, we found ourselves in a far better situation and were overall satisfied. (Happy ending!)

Above all (and I think this is what helped most), we had to blur the boundary between who is a "developer" and who is a "tester" in the team. I believe project nirvana comes when quality becomes a mindset and culture in the team, not just the job of an individual "tester".
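Point 3 above, covering boundary/edge cases at the unit level through the public interface only, can be sketched as follows. The `Discount` class and its tier thresholds are invented for illustration; the point is that each boundary is tested on both sides, black-box, without touching internals.

```java
class Discount {
    // Public interface: discount percentage for an order total in cents.
    static int percentFor(int totalCents) {
        if (totalCents < 0) {
            throw new IllegalArgumentException("negative total");
        }
        if (totalCents >= 10_000) {
            return 10; // 10% from 100.00 upward
        }
        if (totalCents >= 5_000) {
            return 5;  // 5% from 50.00 upward
        }
        return 0;
    }
}

public class DiscountBoundaryTest {
    static void check(boolean condition, String message) {
        if (!condition) {
            throw new AssertionError(message);
        }
    }

    public static void main(String[] args) {
        // Each tier boundary is exercised on both sides via the public method only.
        check(Discount.percentFor(0) == 0, "zero total");
        check(Discount.percentFor(4_999) == 0, "just below 5% tier");
        check(Discount.percentFor(5_000) == 5, "5% tier boundary");
        check(Discount.percentFor(9_999) == 5, "just below 10% tier");
        check(Discount.percentFor(10_000) == 10, "10% tier boundary");
        System.out.println("boundary tests passed");
    }
}
```

Because these tests depend only on the public method's contract, they survive internal refactoring, which is exactly why this kind of coverage belongs at the unit level rather than in E2E tests.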