Automated testing is a key factor in successful digital transformation projects. It eliminates human error and repetitive manual work, making the process far more efficient. But how exactly does it work in the context of a mainframe migration project?

At Astadia, we perform automated application modernization. This means we migrate an entire application, typically from the mainframe to Linux or Windows. At the same time, we modernize the underlying architecture: we move legacy VSAM files to a relational database and convert programs from COBOL, Natural, or IDMS to Java or C#. In short, just about everything changes.

Needless to say, testing is an essential part of this process.

And just like we have tools to automate the migration itself – the data conversion, the source code transformation – we also have tools to automate the testing.

How Does Automated Testing Work?

We mentioned earlier that in a migration project just about everything changes: the programming language, the database, the operating system. There is, however, one big exception – the behavior of the application, which must remain 100% identical to the original.

The behavior of the application can be observed, for instance, in the screen output of an online application or in the listings produced by batch jobs. In general, given the same INPUT, the migrated application should produce the same OUTPUT as the original application.

This is what we call a “like for like” migration: a very complex project, but one where “success” is straightforward to define - functional equivalence, meaning the migrated application behaves 100% the same as the original.

Automated Testing Tools

This is where Astadia’s automated testing tools come into play. They are used to prove or disprove functional equivalence.

Here are the main steps of the process:

  1. We record the behavior of the original application – input and corresponding output – for each online screen and batch job.
  2. We replay the recorded inputs onto the migrated application.

The same input should yield the same output. If we detect even the slightest difference between the original response and the replayed response, we haven’t achieved functional equivalence yet.

Recording the original application is rather straightforward. Users simply use the original application, typically on a reference environment. The test tool captures the network interactions and automatically converts them into a test scenario. That same test scenario can then be replayed against the target environment.
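
To make the idea concrete, here is a minimal sketch of how a recorded scenario might be replayed and checked, assuming a simplified format in which each step pairs a captured input with the output the original application produced for it. The class, record, and method names are illustrative only and do not reflect Astadia’s actual tooling.

```java
import java.util.List;
import java.util.Objects;
import java.util.function.Function;

/**
 * Minimal sketch of a record-and-replay equivalence check, assuming a
 * hypothetical scenario format: each step pairs a recorded input with the
 * output the original application produced for it.
 */
public class ReplayCheck {

    /** One recorded interaction: what was sent, and what the original system answered. */
    record Step(String input, String recordedOutput) {}

    /**
     * Replays every recorded input against the migrated application and
     * reports the first mismatch, if any. The migrated application is
     * abstracted here as a function from input to output (for example, a
     * screen transaction or a batch step behind an adapter).
     */
    static boolean isFunctionallyEquivalent(List<Step> scenario,
                                            Function<String, String> migratedApp) {
        for (Step step : scenario) {
            String replayedOutput = migratedApp.apply(step.input());
            if (!Objects.equals(step.recordedOutput(), replayedOutput)) {
                System.out.printf("Mismatch for input %s:%n  recorded: %s%n  replayed: %s%n",
                        step.input(), step.recordedOutput(), replayedOutput);
                return false; // the slightest difference breaks functional equivalence
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Tiny recorded scenario: inputs and the outputs captured on the original system.
        List<Step> scenario = List.of(
                new Step("CUST 0001", "NAME: SMITH  BALANCE: 120.50"),
                new Step("CUST 0002", "NAME: JONES  BALANCE:  75.00"));

        // Stand-in for the migrated application; in practice this would drive
        // the converted code on the target environment.
        Function<String, String> migratedApp = input ->
                input.equals("CUST 0001") ? "NAME: SMITH  BALANCE: 120.50"
                                          : "NAME: JONES  BALANCE:  75.00";

        System.out.println("Functionally equivalent: "
                + isFunctionallyEquivalent(scenario, migratedApp));
    }
}
```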

No Human Testers

The best part is that, while the recording needs to happen just once, we can replay it automatically, 10 times, 100 times, 1,000 times, without requiring human testers.

The migration project now becomes an iterative process: we run the conversion tools, then run the testing tools to find any potential issues. We fix the issues, rerun the conversion tools, rerun the testing tools, and analyze the results.
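
As an illustration, the sketch below shows the shape of that loop. The ConversionTool and TestingTool interfaces are hypothetical stand-ins for the real migration and testing tools; none of these names come from Astadia’s actual products.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch of the iterative convert-and-retest cycle, using
 * hypothetical hooks for the conversion and testing tools.
 */
public class MigrationCycle {

    interface ConversionTool { void convert(); }               // reconverts data and source code
    interface TestingTool { List<String> runAllScenarios(); }  // replays all scenarios, returns mismatches

    static void iterate(ConversionTool converter, TestingTool tester, int maxIterations) {
        for (int i = 1; i <= maxIterations; i++) {
            converter.convert();                                // rerun the conversion tools
            List<String> mismatches = tester.runAllScenarios(); // rerun the full regression suite
            System.out.printf("Iteration %d: %d mismatching scenario(s)%n", i, mismatches.size());
            if (mismatches.isEmpty()) {
                System.out.println("Functional equivalence reached.");
                return;
            }
            // Mismatches are analyzed and fixed before the next automated cycle.
        }
    }

    public static void main(String[] args) {
        // Toy stand-ins: pretend the first two cycles still surface defects.
        List<String> openDefects = new ArrayList<>(List.of("screen CUST01", "batch job PAYRUN"));
        iterate(() -> System.out.println("converting application..."),
                () -> {
                    List<String> snapshot = List.copyOf(openDefects);
                    if (!openDefects.isEmpty()) openDefects.remove(0); // one defect fixed per cycle
                    return snapshot;
                },
                5);
    }
}
```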

Because the entire process is carried out by software, execution is extremely fast and turnaround times are very short. The entire application can be reconverted and retested in a matter of hours. And since we retest everything with each cycle, we can rest assured that we don’t introduce regressions.

In conclusion, this is quite a powerful setup. With each new iteration we can measure progress in black and white. And everyone can see the project evolve toward the ultimate goal: a migrated application that behaves exactly like the original.

Schedule a demo to learn how automated testing can be used for your transformation project.

