How to automate functional regression tests through database data comparisons?
I have a Java batch job application that reads input data from an Oracle database, does a lot of calculations, handles many edge cases, and outputs results back into different database tables.
The calculations should always return the same results on my subset samples regardless of the application code changes.
Therefore, my idea is to run the batch jobs automatically on each new application version and compare the result data with expected results stored in some file.
The resulting test dataset averages about 1,000,000 rows across roughly 20 tables with about 10 columns each. I would like to check the results on the test environment and compare them with expected data stored in CSV or some other format (ignoring IDs and timestamps).
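To make the comparison concrete, here is a minimal sketch of the ignore-columns idea: rows are reduced to their stable columns and compared as multisets, so row order and volatile values (IDs, timestamps) don't matter. The column names and in-memory rows are hypothetical stand-ins for a JDBC export and a parsed expected-results CSV.

```java
import java.util.*;

// Sketch: compare actual table rows against expected rows while
// ignoring volatile columns. All names below (ID, CREATED_AT, AMOUNT)
// are hypothetical examples, not from any specific schema.
public class ResultComparator {

    /** Drop ignored columns, then compare the remaining cells as multisets. */
    static List<String> diff(List<Map<String, String>> expected,
                             List<Map<String, String>> actual,
                             Set<String> ignoredColumns) {
        List<String> problems = new ArrayList<>();
        Map<List<String>, Integer> expectedCounts = countRows(expected, ignoredColumns);
        Map<List<String>, Integer> actualCounts = countRows(actual, ignoredColumns);
        for (var e : expectedCounts.entrySet()) {
            int got = actualCounts.getOrDefault(e.getKey(), 0);
            if (got != e.getValue()) {
                problems.add("expected " + e.getValue() + "x " + e.getKey() + ", got " + got);
            }
        }
        for (var a : actualCounts.entrySet()) {
            if (!expectedCounts.containsKey(a.getKey())) {
                problems.add("unexpected row " + a.getKey());
            }
        }
        return problems;
    }

    private static Map<List<String>, Integer> countRows(List<Map<String, String>> rows,
                                                        Set<String> ignored) {
        Map<List<String>, Integer> counts = new HashMap<>();
        for (Map<String, String> row : rows) {
            List<String> key = new ArrayList<>();
            // Sort column names so the key is independent of map iteration order.
            new TreeMap<>(row).forEach((col, val) -> {
                if (!ignored.contains(col)) key.add(col + "=" + val);
            });
            counts.merge(key, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        Set<String> ignored = Set.of("ID", "CREATED_AT");
        List<Map<String, String>> expected = List.of(
            Map.of("ID", "1", "CREATED_AT", "2020-01-01", "AMOUNT", "42.00"));
        List<Map<String, String>> actual = List.of(
            Map.of("ID", "99", "CREATED_AT", "2024-06-30", "AMOUNT", "42.00"));
        // Prints [] — the rows match once IDs and timestamps are ignored.
        System.out.println(diff(expected, actual, ignored));
    }
}
```

For a million rows this multiset approach keeps memory proportional to the number of distinct rows and avoids any dependence on `ORDER BY` in the export query; sorting both sides and streaming the comparison would work too if memory becomes a concern.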
What would be the best approach to automate checking correctness of this resulting data?
Is there a framework suitable for this task, or should I write something completely custom? I've looked into DbUnit and Database Rider but I am unsure whether they are the right choice for this job. They seem to focus more on data generation, which I already have, and data cleanup, which I don't really need afterwards.
> Resulting test dataset size is on average 1 000 000 rows in about 20 tables which have about 10 columns each.
You are dealing with two different concerns:
- The calculations are performed by Interactors manipulating Entities. To check these calculations, you can use any unit-testing library to exercise these objects against your test data set. Nothing related to databases is done here.
- The mapping of Entities into the database is done in the Entity Gateways. Here you can use database-specific tools, but you don't need your large dataset, since the point is only the mapping, not the calculations.
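The first concern can be sketched as a plain unit test: the calculation runs on plain objects with no database at all. `InterestCalculator` and `Account` below are hypothetical stand-ins for your own Interactors and Entities.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Sketch, under assumed names: a pure calculation exercised directly
// on an Entity, with no database in sight.
public class CalculationTest {

    /** Hypothetical Entity: plain data, no persistence concerns. */
    record Account(BigDecimal balance) {}

    /** Hypothetical Interactor: a pure calculation, trivially unit-testable. */
    static class InterestCalculator {
        BigDecimal yearlyInterest(Account account, BigDecimal rate) {
            return account.balance().multiply(rate).setScale(2, RoundingMode.HALF_UP);
        }
    }

    public static void main(String[] args) {
        InterestCalculator calculator = new InterestCalculator();
        BigDecimal interest = calculator.yearlyInterest(
            new Account(new BigDecimal("1000.00")), new BigDecimal("0.05"));
        // 1000.00 * 0.05 = 50.00
        if (!interest.equals(new BigDecimal("50.00"))) {
            throw new AssertionError("expected 50.00, got " + interest);
        }
        System.out.println("calculation test passed");
    }
}
```

Because the Interactor depends only on Entities, these tests run in milliseconds and can cover the edge cases exhaustively, while the small Gateway tests only need to prove that a handful of rows round-trip through the mapping correctly.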