Most organizations I talk to are busy executing their cloud strategy, planning or migrating their portfolio to a cloud vendor, or putting the finishing touches on that strategy. There are stragglers, of course; many are mainframe shops that believed they couldn't leverage everything the cloud has to offer.
We now know this isn't the case, and Astadia's mainframe-to-cloud reference architecture papers can help guide mainframe shops toward realizing the full potential of cloud computing. But what comes next? What's the next big change for companies with legacy systems to keep those systems working in harmony with their contemporary peers?
One of the latest technological developments rapidly gaining adoption is containers.
Although containers have been around since about 2000, they weren’t very easy to use and few organizations exploited them or saw their benefits.
It wasn't until recently that open-source and other projects made containers easy and practical enough for widespread use.
A container is a stand-alone, executable package of software that includes everything needed to run it: application code, runtime, system tools and libraries, etc.
Containers are similar to virtual machines, except they do not bundle a full operating system. Instead, they share the operating system kernel of the machine on which they are deployed (which may itself be a VM).
This difference makes containers easier to manage from both a development and deployment perspective.
This matters in a DevOps world of continuous application delivery, where agility, portability and cost savings are the focus.
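To make the idea concrete, here is a minimal, hypothetical Dockerfile sketch. Everything the application needs (runtime, libraries, code) is declared in one place and packaged into a single portable image; the image and file names are illustrative only.

```dockerfile
# Hypothetical example: packaging an application and all of its
# dependencies into one self-contained, portable image.
FROM python:3.12-slim              # base layer supplies the runtime
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # library dependencies baked in
COPY . .                           # application code
CMD ["python", "main.py"]          # the process the container runs

# Build and run (the image name "myapp" is arbitrary):
#   docker build -t myapp .
#   docker run --rm myapp
```

Because the image carries its own dependencies and only borrows the host's kernel, the same image runs unchanged on a developer laptop, a test server, or a cloud VM.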
At a minimum, you could put a legacy application into a container that is easily portable from one promotional environment to the next, realizing cost savings by cutting the time required for deployment. Better still, you could undertake an initiative to split your legacy applications into multiple containers, making development and deployment even easier. That would also let you carve independent microservices out of your legacy applications that could be shared, possibly reducing the amount of code you need to maintain.
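A split like that might be described with a composition file along these lines. This is a hedged sketch only; the service and image names are hypothetical, not an Astadia recommendation.

```yaml
# Hypothetical docker-compose.yml: a legacy application split into
# separately deployable containers.
services:
  legacy-batch:                      # the remaining monolith, containerized as-is
    image: myorg/legacy-batch:1.0
    depends_on:
      - customer-api
  customer-api:                      # logic extracted from the monolith
    image: myorg/customer-api:1.0    # and shared as an independent microservice
    ports:
      - "8080:8080"
```

Each service can now be versioned, scaled and redeployed on its own schedule, which is where most of the development and deployment savings come from.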
Of course, IT shops running COBOL are probably thinking, as they did with cloud computing, that this is another innovation available only to the most modern programming languages. Not true. Major COBOL vendors such as Micro Focus, which spends about $60 million a year on COBOL R&D, are no doubt hard at work providing the support COBOL needs to exploit containers as efficiently and effectively as any modern programming language.
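In fact, nothing about the container model is language-specific. As an illustration only, here is a hypothetical Dockerfile that compiles and runs a COBOL program using the open-source GnuCOBOL compiler as a stand-in (commercial tool chains provide their own images; the package name may vary by distribution, and `hello.cbl` is an assumed source file).

```dockerfile
# Hypothetical sketch: a COBOL program in a container, built with
# the open-source GnuCOBOL compiler (cobc).
FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y gnucobol \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY hello.cbl .
RUN cobc -x -o hello hello.cbl     # -x: produce an executable
CMD ["./hello"]
```

The resulting image deploys through exactly the same pipeline as any Java or Python container.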
I'll be watching this closely and will write more on this exciting development in future posts.
Everybody agrees that automation is key to mainframe modernization regardless of the selected transformation approach, particularly in replatforming or refactoring projects. Not everybody, however, has the vision to automate the journey all the way.
The COBOL talent shortage is a major driver of mainframe modernization, and every IT leader must decide what will happen to their COBOL programmers in such a scenario.
Get in touch with our experts and find out how Astadia's range of tools and experience can support your team. Contact us now.