Automated Regression Testing

bunchukokoy
Regular Participant
Posts: 190
Joined: Thu Dec 03, 2009 8:47 am
OLAP Product: IBM Cognos TM1
Version: 10.2.2.x
Excel Version: 2010
Location: Singapore

Automated Regression Testing

Post by bunchukokoy » Sun Jul 29, 2018 3:38 pm

Hi TM1 Experts,

I hope everyone's still confident with TM1 :)

My organization is currently implementing innovations to improve our IT processes, and our TM1 systems are subject to this initiative. Specifically, we're looking at how to automate regression testing of our systems. I'm now gathering information and ideas on how other organizations or teams have approached this.

Looking at automated regression testing, I am aware of the following:

1. Automated testing will not totally replace the manual testing performed by the users.
2. All T.I. processes executing successfully does not mean the logic written within them is correct.
3. Cube data, object security and the UI process flow must still be tested by the users.
4. The automated testing itself will not be 100% reliable, as it is also a program. Users must still test.
5. There are other myths about automated testing.


One approach I can think of is a side-by-side comparison of cube data against business data, or of cube data against T.I.-calculated data, using the AsciiOutput function to dump any records with discrepancies. That is only one idea, though...
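Outside of TM1, that side-by-side idea can be sketched in Python over two exported files. This is only an illustration, assuming a hypothetical export layout where every column except the last holds a dimension element and the last column holds the cell value (the real AsciiOutput layout depends on how the T.I. process is written):

```python
import csv

def compare_exports(path_a, path_b, out_path, tolerance=0.01):
    """Compare two cube exports keyed on their dimension elements,
    writing any rows whose values differ by more than the tolerance."""
    def load(path):
        data = {}
        with open(path, newline="") as f:
            for row in csv.reader(f):
                # All columns except the last are dimension elements;
                # the last column is the cell value.
                data[tuple(row[:-1])] = float(row[-1])
        return data

    a, b = load(path_a), load(path_b)
    discrepancies = []
    for key in sorted(set(a) | set(b)):
        va, vb = a.get(key, 0.0), b.get(key, 0.0)
        if abs(va - vb) > tolerance:
            discrepancies.append((key, va, vb))
    # Dump only the mismatching records, much like an AsciiOutput
    # of discrepancies, with the difference in the final column.
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        for key, va, vb in discrepancies:
            writer.writerow(list(key) + [va, vb, va - vb])
    return discrepancies
```

A missing cell on either side is treated as zero here; whether that is the right convention depends on how your exports handle empty cells.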

May I know how you've done this in your organization or any idea on how to approach this?
Did you use other tools?
What principles did you apply, and what limitations did you encounter?

Thank you in advance!

Bunch

paulsimon
MVP
Posts: 635
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Automated Regression Testing

Post by paulsimon » Sun Jul 29, 2018 6:42 pm

Hi

The title of your question is regression testing; however, the detail seems to be about wider testing.

For me, regression testing is typically about comparing a system before and after a change to verify that the only difference is what was expected, and that no side effect has affected another part of the system. First, this implies that you have two non-production environments where you can hold the data static for long enough to carry out the tests. Do you have the environments for that?

You can do regression testing by exporting and comparing cube data, but I am not sure how much that really proves. Even with a simple TM1 system that has no user input, and that, let's say, loads from a General Ledger overnight and reports the data, at the very least you would need to run the overnight processes on both the old and new versions of the system to prove that they deliver the same results. However, the regression testing should clearly tell you that something has changed; otherwise the change that was introduced has had no effect. It is not always easy to work out which area of the system will now deliver different results and what those different results should be. In some cases you can simulate the calculations in Excel, but in others that is not feasible due to the volumes involved.

This sort of regression testing is feasible if it is a pure reporting system but somewhat harder to organise if it is a typical TM1 system that involves some form of user input and workflow, since the results can depend on the timing.

If the changes are to reports, views, or subsets, then cube data comparison won't spot issues.

We have automated regression testing to compare the output of one report with another and to compare selected cube views. However, most of our checks are at the cube rather than the report level. It is only really worth doing regression testing on a report if you know that you have changed that report. If you check the underlying data, that is probably going to give a lot more coverage than one report.

You mention comparison to other systems but I would say that this is outside the area of pure regression testing.

Where you can get more benefit from automated report comparison is where you can take a report from an independent system such as the General Ledger, produce the same report from TM1, and then use the automated comparison feature. This is something that I wrote in VBA a while ago. It does a good job, but the figures need to be in exactly the same cells on both reports.
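The cell-by-cell idea behind that VBA routine can be sketched outside of Excel too. Here is a minimal, hypothetical Python version that compares two reports saved as CSV grids, under the same constraint Paul mentions: the figures must sit in exactly the same cells on both reports.

```python
import csv

def compare_report_grids(path_a, path_b):
    """Cell-by-cell comparison of two reports saved as CSV grids.
    Returns (row, col, value_a, value_b) for every mismatching cell.
    Assumes the figures sit in exactly the same cells on both reports."""
    def load(path):
        with open(path, newline="") as f:
            return list(csv.reader(f))

    grid_a, grid_b = load(path_a), load(path_b)
    mismatches = []
    rows = max(len(grid_a), len(grid_b))
    for r in range(rows):
        row_a = grid_a[r] if r < len(grid_a) else []
        row_b = grid_b[r] if r < len(grid_b) else []
        cols = max(len(row_a), len(row_b))
        for c in range(cols):
            # A cell missing on one side counts as empty, so ragged
            # grids still surface as mismatches rather than crashing.
            va = row_a[c] if c < len(row_a) else ""
            vb = row_b[c] if c < len(row_b) else ""
            if va != vb:
                mismatches.append((r, c, va, vb))
    return mismatches
```

This does exact string comparison; a real report comparison would probably parse numbers and apply a rounding tolerance, as in the cube-export sketch above.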

Most accountants will introduce some degree of self-checking into their reports, e.g. summing a column to check the value on the consolidation from TM1.

One of the cornerstones of any testing mechanism like this is that you first need to be able to automate the promotion of objects. For example, if you are introducing changes into the live environment by manually editing dimensions, then in my view there is no way to test this, since you cannot guarantee that the change is the same as the one made in the dev environment. You therefore need a way to promote a related set of TM1 objects from one environment to another so that tests can be carried out.

We have an overnight promotion mechanism that copies in objects and can run processes to complete the promotion, e.g. a new dimension build process might be promoted, but it then needs to be run to take effect. We combine this with a routine that copies Production back to a test environment so that we can simulate what will happen when the release is applied to Production and carry out tests on that. Of course there will be some companies where you will not be allowed access to production data, but that is rare for TM1 systems. In that case you would need a reliable set of test data with good coverage that you can copy back.

We have some checks built into the system and other integrity checks that are carried out each morning by the TM1 Admins, since the interpretation of the results can require a degree of experience.

As an example of checks built into the system, we collect Trial Balance data from numerous partner organisations. This is all governed by custom built workflow with various checks, so that when they submit, the Local TB has to balance, there have to be mappings in place from the Local Chart of Accounts to the Universal Chart of Accounts, the mappings must be to valid codes for their organisation, etc. If their data does not meet these conditions then they can never advance beyond the Contribute stage. Central Accounts have screens to monitor the workflow and can see anyone who has not submitted to the agreed timescales.

When hierarchies are rebuilt at the back end, we carry out validation processes to ensure, for example, that in each alternate hierarchy every element has a single parent, to detect any double counting.
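The single-parent check is simple to state: within one alternate hierarchy, any element that rolls up to more than one parent will be counted more than once at the top. A minimal sketch of that validation, assuming you can export the hierarchy as (parent, child) pairs, might look like this in Python:

```python
from collections import defaultdict

def find_multi_parent_elements(edges):
    """Given (parent, child) pairs from one alternate hierarchy,
    return the children that roll up to more than one parent --
    each of these would be double counted at the consolidation."""
    parents = defaultdict(set)
    for parent, child in edges:
        parents[child].add(parent)
    # Only elements with two or more distinct parents are a problem
    # within a single alternate hierarchy.
    return {child: sorted(p) for child, p in parents.items() if len(p) > 1}
```

In practice the equivalent check inside TM1 would loop over the dimension with T.I. functions such as ELPARN, but the logic is the same.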

Our daily checks spreadsheet ensures that the figures at the top of the various alternate hierarchies match those on the default All elements consolidation to ensure that nothing has been left out.

It automatically lists any TM1ProcessError files generated in the last x days to prompt review of these.
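Listing recent TM1ProcessError files is easy to automate by filename and modification time. A minimal sketch, assuming the server writes files named `TM1ProcessError*.log` into its logging directory (the exact naming can vary by version):

```python
import os
import time

def recent_process_errors(log_dir, days=7):
    """List TM1ProcessError files in the server's logging directory
    modified within the last `days` days, newest first."""
    cutoff = time.time() - days * 86400
    hits = []
    for name in os.listdir(log_dir):
        # Filename convention assumed here; adjust to match your server.
        if name.startswith("TM1ProcessError") and name.endswith(".log"):
            path = os.path.join(log_dir, name)
            mtime = os.path.getmtime(path)
            if mtime >= cutoff:
                hits.append((mtime, path))
    return [path for mtime, path in sorted(hits, reverse=True)]
```

The resulting list can then be dropped into the daily checks spreadsheet or emailed to the admins.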

It includes checks such as the values on one cube matching those on another, although, depending on where we are in the month, there can be legitimate reasons why they won't match, hence the need for some interpretation.

We have automated checks for some calculations, e.g. performing the same calculation in Excel for a sample of cells and comparing it to the TM1 output.
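Sampled calculation checks like that usually need a tolerance, since the independently recalculated figure and the TM1 output can differ by rounding. A small illustrative helper, assuming each sample is a (cell label, independently calculated value, TM1 value) triple:

```python
import math

def check_sample_calcs(samples, rel_tol=1e-6, abs_tol=0.005):
    """Compare independently recalculated values against TM1 output
    for a sample of cells; return the cells that fail the tolerance."""
    failures = []
    for cell, expected, actual in samples:
        # math.isclose passes if the values agree within either the
        # relative or the absolute tolerance, whichever is looser.
        if not math.isclose(expected, actual, rel_tol=rel_tol, abs_tol=abs_tol):
            failures.append((cell, expected, actual))
    return failures
```

The tolerances here are placeholders; an absolute tolerance of half a cent suits currency values, but the right settings depend on the measure being checked.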

Regards

Paul Simon


Re: Automated Regression Testing

Post by bunchukokoy » Wed Aug 01, 2018 5:32 pm

Hi Paul,

Thank you for sharing your ideas and your set-up. Our TM1 systems are highly user input-driven with heavy calculations.
And the current situation is that even the SAP reports the key users are using to validate the results in TM1 have no one-to-one mapping to the corresponding TM1 reports. So basically those SAP reports are the closest independent source for comparison, but the structure is still different.

You're right: if the cubes are mostly calculations with little user input, and an independent comparison source is available, a side-by-side data comparison is easy.

Thanks again Sir!
