
Technology is constantly and rapidly evolving, especially with recent advances in AI and Large Language Models, which are encouraging engineers to find novel approaches to solving some of the most pressing business, societal and environmental problems we face today.
While these data science innovations are exciting and promising, this isn't the first or last IT 'revolution'. However cutting-edge these advancements are when they emerge, most will eventually slide down the adoption curve as further innovation appears, and that's not a bad thing if appreciated and approached in the right way.
It's tempting to see older applications, or the technologies they're built upon, as 'legacy'. That label can carry a negative connotation of 'no longer fit for purpose' or 'a blocker to evolution'. The reality can be very different.
In this blog, I’d like to explore the challenges many organisations face in maintaining and modernising applications as they age. Key to any modernisation effort is to keep business and client value at the heart of planning, design and decision making. Working with ‘mature’ software can also be vital for engineers’ learning and development, empowering growth through a different kind of problem solving.
The Challenges of Software Maturity
Software that has been invested in and developed over multiple decades accrues complexity that can't simply be swapped out for commercial off-the-shelf (COTS) products. The business requirements it was designed around are often now so broad, and so difficult to reverse-engineer, that understanding and replicating them would be a huge undertaking. Investment on that scale would consume significant time and resource merely to replicate existing functionality that is already stable and fulfils business and user needs. These factors mean there is not always an adequate business case for fully rebuilding or replacing certain applications.
A clear example of this is the continued use of COBOL in global banking. COBOL is a programming language first designed in 1959 that formed much of the foundation of the business, financial and administrative systems developed right up to the early 2000s. Although still under active development, with the latest stable version released in 2023, COBOL can without a doubt be considered a 'legacy' technology. Despite this, it won't be going anywhere any time soon, simply because of how deeply it is embedded in critical business applications.
Placing Value at the Heart of Investment Decisions
Assessing application modernisation strategies against business value falls on a scale. In the context of cloud modernisation, Amazon Web Services' '7 Rs Model' (an extension of Gartner's 5 Rs) is a valuable tool for weighing business value against modernisation effort: its options run from Retire and Retain, through Rehost, Relocate, Repurchase and Replatform, up to a full Refactor.
The example of COBOL-based banking applications would fall into the 'Retain' category, while a recent modernisation project in collaboration with the National Police Coordination Centre would be classed as 'Replatform'. In this project, we moved the core infrastructure of an application up the cloud maturity ladder, from VMware-based hypervisors hosted on-premises to Infrastructure as a Service (IaaS) in AWS.
While some aspects of this modernisation were 'lift-and-shift', certain server-based components (e.g. email sending, document generation and storage, monitoring and analytics capabilities) could be rebuilt entirely using cloud-native technologies such as AWS Lambda, reaping the associated rewards of lower cost and easier maintenance. Further improvements such as blue/green deployment strategies and autoscaling enabled zero-downtime deployments at a much higher cadence with lower risk, while 'warm-standby' failover instances and multiple availability zones brought significant availability improvements. A sketch of what such a serverless rebuild can look like follows below.
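As a purely illustrative sketch (the class name, event shape and sender address here are hypothetical, not taken from the project), a server-based email component like the one described above can shrink to a single stateless AWS Lambda handler calling Amazon SES:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import software.amazon.awssdk.services.ses.SesClient;
import software.amazon.awssdk.services.ses.model.SendEmailRequest;

import java.util.Map;

// Hypothetical replacement for a server-based email component:
// the whole 'service' becomes one function invoked on demand.
public class EmailSenderHandler implements RequestHandler<Map<String, String>, String> {

    // The SDK client is created once per container and reused across invocations
    private final SesClient ses = SesClient.create();

    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        // The event fields ("to", "subject", "body") are assumptions; shape them to suit the caller
        SendEmailRequest request = SendEmailRequest.builder()
                .source("noreply@example.org")
                .destination(d -> d.toAddresses(event.get("to")))
                .message(m -> m
                        .subject(s -> s.data(event.get("subject")))
                        .body(b -> b.text(t -> t.data(event.get("body")))))
                .build();
        return ses.sendEmail(request).messageId();
    }
}
```

With no server to patch or keep running idle, cost tracks actual usage and maintenance shrinks to the function itself, which is where the lower-cost, easier-maintenance rewards mentioned above come from.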
There are also techniques that can be employed at the application level to help offset an aging codebase. One is leveraging APIs to preserve business-logic-heavy components that would be uneconomic to rebuild, while allowing new front-end or interface layers to be built on top of them. Although this approach technically falls under the costliest 'Refactor' category, limiting changes to a certain application tier or set of classes/services can help prioritise value. For example, completely picking apart a monolithic application to rebuild it from the ground up might not be viable; encapsulating its controller and service methods behind exposed RESTful API methods might be, as the sketch below illustrates.
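As a minimal, hypothetical sketch (the class and method names are invented for illustration, and Spring is assumed purely as an example framework), the legacy service is left untouched while a thin REST facade exposes it to new front ends:

```java
import java.math.BigDecimal;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Stand-in for the existing, business-logic-heavy service we don't want to rewrite
interface LegacyAccountService {
    BigDecimal calculateBalance(long accountId);
}

// Thin facade: new front ends talk REST; the legacy logic stays exactly as it is
@RestController
@RequestMapping("/api/accounts")
class AccountApiController {

    private final LegacyAccountService accounts;

    AccountApiController(LegacyAccountService accounts) {
        this.accounts = accounts;
    }

    // GET /api/accounts/{id}/balance delegates straight to the untouched service method
    @GetMapping("/{id}/balance")
    BigDecimal getBalance(@PathVariable long id) {
        return accounts.calculateBalance(id);
    }
}
```

The design choice is deliberate: the controller contains no business logic of its own, so the risk of the change is confined to the new interface tier while the proven code keeps doing the work.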
Impact on Learning
I'm fortunate to have had many opportunities to work on greenfield projects where modern technology stacks build upon the latest best-practice architectures. While these are often the projects everyone looks forward to joining, my deepest learning experiences have come from finding a unique workaround to a constraint of an older technology stack, or tracking down a bug written into a piece of software over ten years ago by a completely different team. It teaches a very different kind of problem solving: relying on outdated documentation and ancient forum posts, or delving into the nuances of a technology that would usually be abstracted away from a developer working on something fresh.
My growth as an engineer has been built on those experiences. Even if the team never encounters that specific problem again, capturing and sharing the approach that solved it is an extremely valuable lesson to take away and, more importantly, to pass on.
Just because a product or solution is aging doesn't mean it must deteriorate or can't continue to evolve. The important thing is recognising where the value resides and sustaining it through continuous improvement and modernisation. I also hope this offers a perspective that helps reframe how an early-career engineer might feel when onboarded onto a 'mature' codebase. It comes with its challenges, but there are also valuable learning opportunities to be grasped.