Typically, a vast volume of data and information is used to run projects. Applications under test require a substantial amount of data to carry out tests, particularly for testing techniques such as Boundary Value Analysis and Equivalence Partitioning. Volume (lots of test data!) is an additional testing prerequisite, especially in verticals such as Banking and Financial Services, where historical and future-dated data play a key role in methodical application testing.
The business impact of non-compliance with test data guidelines is characteristically ignored when testing an application in the QA environment. Applications running in test environments do not carry any legal or practical implications. In live environments, however, data-related failures can cost organizations a significant amount.
A few such scenarios are described in detail below.
Typical Business Problems
For example, consider the scenario where a new bank is being launched for retail customers. Here, after the development stage, testing would span various units within the product life cycle, such as Credit Card Bill Payments, Payments and Transfers, Fraud Checking, Remote Deposit Capture and Cheque Deposits. Plenty of pre-existing test data is required for testing these units so that edge-case scenarios can be covered. A few examples below illustrate such circumstances.
Validation of interest capitalization in a leap year (February 29th) is a scenario that must be confirmed before rolling the banking system out into production. In real-time testing, it may take years to reach this state, depending on where you are in calendar terms, since a leap year occurs only once every four years. It would be a serious problem if leap-year interest capitalization were not validated and then actually failed in production. These are precisely the edge cases that should not be ignored during testing.
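To make the leap-year case concrete, the sketch below is illustrative only: the actual/365 day-count convention and the function name are assumptions, not details of any particular banking system. It accrues daily interest across a full year and shows why a test run that never passes through February 29th makes one fewer daily posting.

```python
from datetime import date, timedelta

def daily_interest_accrual(balance: float, annual_rate: float,
                           start: date, end: date) -> float:
    """Accrue simple daily interest from start (inclusive) to end
    (exclusive), assuming an actual/365 day-count convention."""
    total = 0.0
    current = start
    while current < end:
        total += balance * annual_rate / 365
        current += timedelta(days=1)
    return total

# 2024 is a leap year: a full-year run passes through Feb 29th and
# makes 366 daily postings, versus 365 postings for 2023.
interest_2023 = daily_interest_accrual(10_000, 0.05, date(2023, 1, 1), date(2024, 1, 1))
interest_2024 = daily_interest_accrual(10_000, 0.05, date(2024, 1, 1), date(2025, 1, 1))
```

A test suite seeded only with non-leap-year data would never exercise the extra posting, which is exactly the gap described above.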
Reporting Regulatory Requirements
Certain government policies require your financial system to retain a set number of past years' worth of data and either discard or archive the rest. These are strict guidelines that may carry additional financial implications (tax audits, etc.), and they need to be tested against exacting specifications. With a new system, however, there may be a shortage of test data, which hinders effective testing. This further intensifies the importance of managing financial information successfully.
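As a minimal sketch of the kind of retention rule such a test would exercise (the seven-year window, field names, and function name are all illustrative assumptions, not drawn from any regulation), records could be partitioned around a cutoff date:

```python
from datetime import date, timedelta

# Hypothetical rule: keep the last 7 years of records live, archive the rest.
RETENTION_YEARS = 7

def partition_by_retention(records, today):
    """Split records into (live, archive) buckets around the cutoff date.

    The cutoff is approximated with 365-day years, which sidesteps the
    Feb 29th edge case of subtracting whole years from today's date.
    """
    cutoff = today - timedelta(days=365 * RETENTION_YEARS)
    live = [r for r in records if r["posted_on"] >= cutoff]
    archive = [r for r in records if r["posted_on"] < cutoff]
    return live, archive

records = [
    {"id": 1, "posted_on": date(2010, 3, 15)},  # old enough to archive
    {"id": 2, "posted_on": date(2024, 6, 1)},   # recent, stays live
]
live, archive = partition_by_retention(records, today=date(2025, 1, 1))
```

Without seeded records on both sides of the cutoff, a test of this rule proves nothing, which is the test data shortage the paragraph above describes.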
To test these circumstances, we need scrupulous preparation of test data in the system; without it, we face the following problems.
- Low Volume Testing – With a near-empty database, the test environment will perform better than the production environment; this is a misleading way to test the application, as we cannot imitate the real-time environment or surface the corresponding defects.
- Back-dated Testing – A crucial capability when testing data-intensive applications is the ability to go back in time. With a 'fill as we go' approach, we run the risk of not being able to test back-dated scenarios, which causes noteworthy test-coverage risks.
- Integration (Upstream/Downstream) – Testing interfaces also poses plenty of blind spots: test data should propagate across interfaces. For example, bill payment information should exist not only in the Core Banking System (CBS) under test, but also in the interfacing payment provider (third-party bill payment channel), in order to avoid account reconciliation issues that would lead to inaccurate accounting reports.
- Time Travel – Certain test scenarios require the user to travel back and forth in time (that is, moving the system date backwards or forwards) to accommodate date-related flows. For example, validating a loan amortization scenario in which auto debits are verified for each month.
Date movements are complicated, as they may disturb the synchronization between multiple interfaces, and they are not straightforward to coordinate.
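One alternative to moving the real system date is a simulated clock that the test harness advances one period at a time. The sketch below (all names, rates, and figures are illustrative assumptions) generates a standard fixed-payment amortization schedule together with the debit dates a harness would step through:

```python
from datetime import date

def amortization_schedule(principal: float, annual_rate: float, months: int):
    """Fixed-payment amortization: returns (month, payment, remaining_balance)."""
    r = annual_rate / 12
    payment = principal * r / (1 - (1 + r) ** -months)
    balance = principal
    rows = []
    for m in range(1, months + 1):
        interest = balance * r
        balance -= payment - interest
        rows.append((m, round(payment, 2), round(balance, 2)))
    return rows

# A 12-month loan starting Jan 2024: instead of winding the system date,
# the harness iterates these simulated due dates and checks each auto debit.
schedule = amortization_schedule(12_000, 0.06, 12)
start = date(2024, 1, 1)
due_dates = [date(start.year + m // 12, m % 12 + 1, 1)
             for m in range(1, len(schedule) + 1)]
```

This keeps every interface on the same simulated timeline, avoiding the synchronization problems that real date movement creates.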
There is no single solution that covers all test data requirements. However, a few strategies can be deployed to mitigate the impact and help improve overall test coverage.