
From Legacy to Longevity: A Structured Path to IBM i Transformation

Discover how to preserve business-critical logic, reduce technical debt, and enable future development - without abandoning decades of investment or disrupting daily operations.
Niels Liisberg, Chief Innovation Officer - IBM Champion
August 28, 2025

The primary reason for initiating a transformation project often lies in demographic factors, such as the aging workforce and the need to secure future expertise. At the same time, it is essential to protect the significant investments made in software development over the past decades. The existing solution already supports the business’s needs far more effectively than a standard “off-the-shelf” alternative would, unless substantial and costly customizations are introduced. Therefore, both safeguarding business continuity and preserving the value of past investments are critical considerations when undertaking a transformation project.

 

Bridging the Old and the New

Transitioning from an IT architecture designed 20 to 30 years ago can often feel like untangling a Gordian knot. The challenge lies in reconciling legacy design principles and solutions with the methodologies and technologies underpinning modern IT strategies.

 

A successful transformation requires both respect for the existing system and a structured approach to adopting contemporary practices. Striking this balance is, in fact, the most critical challenge - and the key determinant of success - in any modernization initiative.

 

"The challenge isn’t rewriting everything. It’s knowing what to keep - and how to evolve it."

 

Our Approach

At System & Method, we have spent several years developing tools designed to address the technical challenges of IT modernization. These tools enable seamless integration of existing solutions with future requirements, leveraging modern technologies and development principles.

 

In practical terms, we deploy a software package that enables the integration of legacy “green/black screen” environments with modern, web-based frameworks. Initially, users gain access to all existing functions and applications through a redesigned, user-friendly interface. For some organizations, this modernization alone delivers significant value, as it improves productivity and provides a familiar experience aligned with today’s IT expectations.

 

However, we refer to this first phase as a “lipstick-on-a-pig” approach, since it does not address underlying demographic challenges. The existing codebase remains unchanged - the applications are neither converted nor rewritten, but continue to run exactly as before within the legacy environment.

Nonetheless, this modernization step is a critical enabler for the broader transformation journey. Without it, developing a completely new system in parallel would leave the organization without tangible progress for an extended period, undermining confidence and momentum during the transition.

 

"You don’t modernize by replacing what you have. You modernize by making it work where the future is headed."

 

By first modernizing the current solution - specifically by implementing a new user interface while maintaining existing functionality - the organization can focus on transforming the highest-priority sub-areas of the system. The primary objective of this transformation is to ensure long-term maintainability by enabling newly recruited resources to work on the solution without requiring prior knowledge of the existing platform or codebase. This approach effectively reduces dependency on the current, highly skilled - but potentially retiring - IT developers.

 

Furthermore, the transformation process can progress without strict time constraints, allowing new components to be introduced gradually. Each part can be deployed as soon as it meets the updated requirements and addresses both current needs and previously identified areas for improvement.

 

Modernization vs. Transformation

It is essential to clearly distinguish between modernization and transformation, as they represent two distinct processes in terms of scope, duration, and objectives.

 

  • Modernization is typically measured in months. It focuses on improving the user experience by introducing a modern interface while keeping the underlying applications and architecture intact. This phase requires minimal analysis - only a basic understanding of how the current menu system operates and the location of applications.

  • Transformation, on the other hand, is a multi-year process. It involves a comprehensive analysis of the entire system landscape, including mapping dependencies between applications, identifying mission-critical components, distinguishing between essential and non-essential maintenance modules, and determining which parts should be decommissioned, reimplemented, or replaced - especially in cases where source code is missing.

 

"Modernization brings usability. Transformation brings sustainability. The difference defines your future readiness."

 

Over the years, inadequacies often accumulate: features implemented for convenience but challenging to maintain, workflows that evolved in unintended ways, or decisions once considered optimal that later introduced complexity. This buildup of “development debt” must be identified and visualized during the transformation process. Much of this knowledge resides with the current IT developers, who best understand where the system’s pain points lie.

 

Furthermore, the cultural landscape on the IBM i platform has traditionally enabled the rapid deployment of solutions, sometimes overnight - an approach that modern agile teams could learn from. However, this culture has also resulted in low levels of documentation, and in some cases, none at all. As a result, the source code often serves as the primary documentation, making it crucial to gather insights directly from both users and developers to build an accurate understanding of the system before embarking on the transformation.

Modernization

The modernization phase focuses on installing and configuring the IceBreak/IceCap software package, which provides all the essential components required for user management, role-based access control, menu management, and processing of 5250 applications.

 

To integrate IceBreak/IceCap with the existing environment, an integration module must be developed for the current menu system. Once this step is completed, users will be able to access and operate all their familiar applications and functions within a modernized interface.

 

"A new interface isn’t just lipstick - when designed right, it serves as a bridge to deeper architectural changes."

 

In some cases, minor adjustments may be required - either by modifying existing applications or extending IceBreak/IceCap - to ensure full compatibility and optimal performance.

 

Key steps in the modernization process:

  1. Set up the working environment - Prepare the infrastructure and ensure prerequisites are in place.

  2. Install IceBreak/IceCap - Deploy the software package and configure core components.

  3. Develop menu integration - Connect the existing menu system with IceBreak/IceCap for seamless navigation.

  4. Conduct user testing - Validate functionality and gather feedback from end users.

  5. Implement necessary adaptations - Address compatibility issues or enhance features as required.

  6. Commissioning - Roll out the modernized environment for full-scale use.

Transformation 

The transformation process is a comprehensive, multi-phase journey designed to modernize not just the user interface but the entire system architecture and business processes. It is structured into four key phases:

  1. Analyze - Conduct a thorough assessment of the existing systems, dependencies, and business processes to identify areas for improvement. Identify mission-critical components, obsolete modules, and opportunities for optimization.

  2. Modulation - Define a future-ready architecture by segmenting the system into manageable, modular components. Establish priorities, set transformation goals, and create a clear implementation roadmap.

  3. Execution - Implement the transformation plan by developing, migrating, or replacing system components as needed. Introduce modern technologies and best practices to achieve higher scalability, maintainability, and efficiency.

  4. Evaluation - Review outcomes against defined objectives, validate performance improvements, and ensure alignment with business needs. Gather feedback to refine future transformation activities.

1. Analyze

The analysis phase establishes the foundation for the entire transformation process. Its purpose is to gain a complete understanding of the organization’s business environment, existing systems, available skills, and potential collaboration partners. At the same time, it defines a baseline, providing a measurable and factual starting point from which progress and success can be evaluated.

 

Defining a baseline

Defining this baseline involves visualizing the company’s functional areas and understanding which business domains they represent. On the technical side, it involves mapping all active and discontinued systems, modules, and functions, as well as quantifying key metrics such as the number of programs, applications, users, customers, suppliers, and external interfaces. Establishing this overview ensures a shared understanding of the current state and enables effective planning for the transformation journey.

 

Identify “misuse” cases

As part of the analysis, it is equally vital to identify so-called “misuse” cases - areas where the system is being used in ways that diverge from its original design. These are often minor workarounds that seem harmless to daily users and developers but can create significant complications when redesigning or re-implementing processes. For example, in some systems, entering "***" in a customer’s address might indicate bankruptcy, highlighting a missing workflow feature that needs to be formally addressed in the new architecture. Documenting such scenarios ensures they are properly handled in the transformation process.
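To make such a convention explicit in the new architecture, a small adapter can translate the legacy sentinel into a first-class status that the redesigned workflow can act on. The sketch below is purely illustrative; the enum and method names are invented for this example and are not part of IceBreak/IceCap:

```java
// Hypothetical sketch: turning an implicit "***" address convention into an
// explicit, documented customer status in the new domain model.
public class LegacyConventions {

    public enum CustomerStatus { ACTIVE, BANKRUPT }

    /** Maps the legacy sentinel to an explicit status the new workflow can handle. */
    public static CustomerStatus statusFromLegacyAddress(String addressLine) {
        if (addressLine != null && addressLine.trim().equals("***")) {
            return CustomerStatus.BANKRUPT;   // undocumented workaround made explicit
        }
        return CustomerStatus.ACTIVE;
    }

    public static void main(String[] args) {
        System.out.println(statusFromLegacyAddress("***"));        // BANKRUPT
        System.out.println(statusFromLegacyAddress("Main St. 1")); // ACTIVE
    }
}
```

Capturing each misuse case as a small, testable translation rule like this also documents the convention for developers who never worked with the green-screen original.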

 

Uncovering pain points

Another key aspect of the analysis involves identifying the pain points within the current setup. These can be technical, functional, or organizational, and may include both known issues and anticipated challenges. For instance, some companies maintain multiple departmental databases that would be better consolidated into a shared database; conversely, in other cases a distributed model may be preferable for scalability. Understanding these pain points helps prioritize what must be addressed early in the transformation.

 

Set future goals and assumptions

The analysis also defines the goals and assumptions for the future system. It documents how the current system operates, identifies desired capabilities that are currently missing, and highlights any limitations imposed by dependencies, such as external business partners, purchased secondary solutions, or immutable third-party systems. Legal and compliance considerations, such as GDPR and data security, are also assessed, along with potential risks. Where relevant, a risk assessment and even an exit strategy may be prepared to safeguard business continuity.

 

Maintaining a Decision Manifest

Throughout this phase, a decision manifest is maintained. This document serves as a record of key issues, possible solutions, and the reasoning behind final choices. While it may initially be a list of existing technologies, future technological ambitions, dependencies, and integrations, it evolves continuously as decisions are made. Far from being bureaucratic, the decision manifest promotes transparency and enables future stakeholders to understand precisely why specific paths were chosen and others dismissed.

 

Create a Walking Skeleton

A significant milestone in the analysis phase is defining the walking skeleton. This is a minimal, end-to-end implementation that incorporates all architectural components, from the database and security framework to deployment pipelines and the user interface. It serves as the foundation for the transformation process, similar to a Minimum Viable Product (MVP), but with a broader scope, ensuring that the entire system structure is sound before additional functionality is layered on top. System & Method will propose a technology stack suited to the organization’s needs, drawing on insights from similar projects and adapting recommendations where necessary.

 

Living Documentation

The information gathered during this phase must be treated as living documentation rather than static reports. By maintaining the analysis and decision manifest in collaborative tools such as SharePoint, Confluence, or Git-based platforms, all stakeholders can access, update, and reference the evolving knowledge base throughout the project.

 

The First Milestone

The analysis phase concludes with Approval - Milestone #1. At this point, the project framework is finalized, expectations are aligned, and deliverables are validated with stakeholders. This milestone ensures that everyone shares a common understanding of the path forward, setting the stage for the subsequent phases of Modulation, Execution, and Evaluation.

2. Modulation

The modulation phase focuses on building the Walking Skeleton, a minimal yet complete framework that lays the foundation for the future system. This phase transforms the insights and decisions made during the analysis phase into a tangible architecture. It involves selecting the appropriate technologies, defining the security model, setting up user and role management, and establishing the development, testing, and DevOps environments.

 

Walking Skeleton

At the heart of the modulation phase is the construction of the Walking Skeleton, which serves as the initial, functional prototype of the future system. Based on the decisions made during the analysis phase, the first service is implemented, and the chosen technology stack is stress-tested under realistic conditions.

 

This process is inherently iterative: the Walking Skeleton is designed, developed, tested, and refined until the desired structure and functionality are achieved. Throughout this phase, the decision manifest is continuously updated with finalized choices regarding the user interface, microservices, development languages, software stack, DevOps tools, and security architecture. Each decision is aligned with the organization’s business requirements, user needs, and long-term goals.

 

User Interface

For IceBreak/IceCap, the recommended approach is to use a web-based framework to deliver a modern and flexible user experience. By default, the solution leverages ExtJS, a component library optimized for building complex ERP applications. This enables users to access the system via a standard web browser or through a dedicated desktop application, depending on preference.

The architecture is designed to be framework-agnostic, allowing organizations to develop applications using popular alternatives such as React, Vue, or Angular. Regardless of the framework chosen, the solution ensures a unified and consistent look and feel across all applications.

 

Microservices

System & Method strongly recommends adopting a microservices architecture for implementing business logic within the new system. Microservices are small, independent services designed to perform a single function efficiently, robustly, and scalably.

 

This approach enables the organization to add, modify, or replace individual components without disrupting the overall system, reducing downtime and improving agility. However, transitioning from a traditional monolithic design to a microservices-based architecture requires careful planning and, in some cases, compromise, as the paradigms differ significantly.

 

Development Language

Selecting the right development language is critical for ensuring long-term maintainability, scalability, and developer availability. System & Method supports multiple languages depending on the task, but for building microservices on IBM i, we typically recommend Java, Node.js, or Python.

 

RPG remains a strong candidate for certain scenarios due to its high performance and close integration with the database. However, for ERP-related applications, Java is the safest choice. With over three decades of stability, widespread adoption, and ongoing support in educational institutions worldwide, Java provides access to a large pool of qualified developers.

 

System & Method recommends using Java, combined with the Spring Boot framework, to simplify the development of microservices and cloud-native applications. Notably, IceBreak supports running Spring Boot services alongside RPG services, enabling a homogeneous service layer built on a heterogeneous codebase.
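To illustrate just how small a single-purpose service can be, here is a JDK-only sketch of a service endpoint using the built-in com.sun.net.httpserver package. This is a stand-in for illustration only: a real deployment would use Spring Boot or IceBreak as described above, and the /customer/ping endpoint and JSON payload are invented for this example:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal single-purpose service endpoint (illustrative sketch, not a
// production pattern - Spring Boot would normally supply this plumbing).
public class PingService {

    /** Pure function producing the response body, testable without a network. */
    static String pingBody() {
        return "{\"service\":\"customer\",\"status\":\"ok\"}";
    }

    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/customer/ping", exchange -> {
            byte[] body = pingBody().getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();   // passing port 0 lets the OS pick a free port
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer s = start(0);
        System.out.println("Listening on port " + s.getAddress().getPort());
        s.stop(0);
    }
}
```

The point of the sketch is the shape, not the framework: one endpoint, one responsibility, independently deployable and replaceable.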

 

Software Stack

A software stack defines the technologies and layers required to deliver a complete solution, from the user interface through business logic to secure data storage. Transformation projects typically involve integrating multiple technologies, including microservices, SQL-based data access, and static web content.

 

In some cases, stored procedures are used to decouple legacy business logic from modern applications, thereby facilitating seamless integration. Since SQL is widely taught and understood by both experienced developers and new graduates, this approach can help bridge demographic challenges and accelerate knowledge transfer.
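One way to realize this decoupling is to place an interface between modern callers and the legacy logic, so the same contract can be served first by a stored procedure and later by a reimplemented service. The sketch below uses invented names and an in-memory map as a stand-in for the database; in production the provider would call the procedure via JDBC:

```java
import java.math.BigDecimal;
import java.util.Map;

// Sketch of decoupling via an interface: callers depend on the contract,
// not on whether the balance comes from a legacy stored procedure or a
// new microservice. All names are illustrative, not an existing API.
public class DecouplingSketch {

    /** The contract modern code programs against. */
    interface DebtorBalanceProvider {
        BigDecimal balanceFor(String customerNo);
    }

    /** In production this would invoke the legacy logic via JDBC, e.g.
     *  conn.prepareCall("{call LEGACY.GET_BALANCE(?, ?)}"). */
    static class StoredProcedureProvider implements DebtorBalanceProvider {
        private final Map<String, BigDecimal> fakeDb;  // stands in for the database
        StoredProcedureProvider(Map<String, BigDecimal> fakeDb) { this.fakeDb = fakeDb; }
        public BigDecimal balanceFor(String customerNo) {
            return fakeDb.getOrDefault(customerNo, BigDecimal.ZERO);
        }
    }

    public static void main(String[] args) {
        DebtorBalanceProvider provider =
            new StoredProcedureProvider(Map.of("10001", new BigDecimal("2500.00")));
        System.out.println(provider.balanceFor("10001")); // 2500.00
    }
}
```

When the underlying logic is eventually rewritten, only a new implementation of the interface is swapped in; no caller changes.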

 

DevOps

Modern systems demand efficient, continuous integration and delivery pipelines. The modulation phase includes selecting and establishing an appropriate DevOps environment to enable seamless collaboration between development and operations teams.

 

Implementing DevOps practices ensures that code is developed, tested, deployed, and maintained efficiently. Meanwhile, CI/CD tools enhance reliability by automating testing and delivery processes.

 

This creates an iterative, feedback-driven development cycle where enhancements and optimizations are continuously incorporated into the solution.

 

Security

While legacy systems often had limited security requirements, modern applications demand significantly stronger protections. During this phase, the security model is defined, tested, and documented. Security now encompasses data protection, service authentication, user access control, and compliance with regulations such as GDPR. The finalized security architecture is captured in the decision manifest to guide subsequent development phases.

 

User, Role, and Menu Management

In the legacy environment, user and role information was handled directly by the original system. In the new service-oriented architecture, this functionality is redesigned and extended to ensure compatibility with modern security standards and microservice-based systems. The new architecture supports fine-grained access control and better integration between applications.
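At its core, a fine-grained model of this kind reduces to mapping roles onto the functions they grant, and checking a user's roles against a requested function. The following is a minimal, hypothetical sketch; the role and function names are invented:

```java
import java.util.Map;
import java.util.Set;

// Minimal sketch of fine-grained, role-based access control for menu
// entries and service functions. Names are illustrative only.
public class AccessControl {

    private final Map<String, Set<String>> roleToFunctions;

    public AccessControl(Map<String, Set<String>> roleToFunctions) {
        this.roleToFunctions = roleToFunctions;
    }

    /** A user may invoke a function if any of their roles grants it. */
    public boolean mayInvoke(Set<String> userRoles, String function) {
        return userRoles.stream()
                .map(role -> roleToFunctions.getOrDefault(role, Set.of()))
                .anyMatch(granted -> granted.contains(function));
    }

    public static void main(String[] args) {
        AccessControl ac = new AccessControl(Map.of(
            "ACCOUNTING", Set.of("postInvoice", "viewDebtor"),
            "SALES",      Set.of("viewDebtor")));
        System.out.println(ac.mayInvoke(Set.of("SALES"), "viewDebtor"));  // true
        System.out.println(ac.mayInvoke(Set.of("SALES"), "postInvoice")); // false
    }
}
```

Because the check is a pure function of roles and grants, the same model can back both menu visibility in the new interface and authorization in individual microservices.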

 

The Second Milestone

The modulation phase concludes with Milestone #2, where the Walking Skeleton is formally approved. At this stage, the foundational architecture is fully operational, although still minimal; it provides structure without yet offering full functionality.

 

Outstanding questions from the analysis phase are resolved, decisions are validated against business goals, and the updated decision manifest is finalized. Approval at this stage signals readiness to proceed to the execution phase, where the whole system begins to take shape.

 

3. Execution

After completing the modulation phase, the technological foundation is in place, the architecture is defined, and the development approach is clarified. At this point, the execution phase begins, where the Walking Skeleton is gradually expanded into a fully functional system. The goal is to “add flesh to the skeleton” by implementing core functionalities, starting with the development of a Minimum Viable Product (MVP).

 

Execution is an iterative process. Development, testing, deployment, and feedback cycles are repeated continuously to ensure a stable and scalable result. However, before development can truly gain momentum, several prerequisites must be completed, including establishing the production environment and, where required, performing domain segregation. Once these foundational tasks are finalized, a reliable project estimate and delivery schedule can be determined.

 

Minimum Viable Product (MVP)

The MVP represents the first fully functional component of the future system, serving as a proving ground for the chosen technology stack. The component is carefully selected to ensure that all layers of the architecture are exercised - from the database to the user interface - using modern tools and frameworks.

 

The source code is managed in a Git repository, and the selected framework is applied consistently throughout the implementation. Deployment and testing are handled through the CI/CD pipelines defined in the DevOps environment, ensuring the entire process is automated, repeatable, and reliable. This approach establishes a rhythm of iterative development where enhancements can be delivered quickly and consistently without disrupting ongoing operations.

 

Production Environment

While a stable production environment typically exists within the current system, transitioning to a service-oriented architecture introduces new requirements. Modern services may still rely on existing data but demand a dedicated environment for running services and handling infrastructure components, including logging and tracing for end-to-end traceability.

 

In the traditional ILE environment, applications and data often coexist within a few libraries. In contrast, microservices typically operate in a Java-based environment, running on the Integrated File System (IFS) within IBM i or on a separate partition of the Power Server. Since Java services are generally more resource-intensive than their ILE counterparts, resource allocation, infrastructure scaling, and deployment strategies must be carefully planned during this phase.

 

Domain Segregation

A significant step in the execution phase is the implementation of domain segregation, which involves restructuring the system to align data and functionality with business contexts rather than technical dependencies.

 

Traditional ERP systems were often designed using functional dependencies, where as much information as possible is stored under a single key. For example, all customer-related data - personal, financial, and marketing information - might reside in a single table linked to the customer number.

 

In contrast, modern domain-driven design (DDD) applies the principle of Separation of Concerns (SoC). Data is segmented into context-specific domains. A customer viewed from an accounting perspective is treated as a debtor, whereas the same customer in a marketing context is stored in a CRM system. Each domain manages its own subset of data, with interfaces connecting them where necessary.

 

Domain segregation has two key benefits:

  1. It reduces interdependencies, allowing domains to evolve independently without unintended ripple effects.

  2. It simplifies cognitive complexity, making systems easier to understand, develop, and maintain.
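The debtor/CRM split described above can be sketched as context-specific views derived from the single legacy record. All type and field names here are illustrative, not taken from any actual schema:

```java
// Sketch of domain segregation: one monolithic customer record split into
// context-specific views (accounting sees a debtor, marketing a CRM contact).
public class DomainSegregation {

    /** Legacy shape: everything keyed by customer number in one record. */
    record LegacyCustomer(String customerNo, String name,
                          String creditTerms, String campaignSegment) {}

    /** The accounting domain's view of the customer. */
    record Debtor(String customerNo, String name, String creditTerms) {}

    /** The marketing domain's view of the same customer. */
    record CrmContact(String customerNo, String name, String campaignSegment) {}

    static Debtor toDebtor(LegacyCustomer c) {
        return new Debtor(c.customerNo(), c.name(), c.creditTerms());
    }

    static CrmContact toCrmContact(LegacyCustomer c) {
        return new CrmContact(c.customerNo(), c.name(), c.campaignSegment());
    }

    public static void main(String[] args) {
        LegacyCustomer c = new LegacyCustomer("10001", "Acme A/S", "NET30", "RETAIL");
        System.out.println(toDebtor(c));
        System.out.println(toCrmContact(c));
    }
}
```

Each domain then owns only the fields it needs, and the mapping functions become the explicit interfaces between domains.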

 

System & Method has developed a methodology called evidence-based domain segregation to analyze and efficiently restructure existing systems. Whether and to what extent this method is applied depends on the current system’s complexity and its adherence to functional dependency principles.

 

Parallel Sub-Projects

The MVP, production environment setup, and domain segregation are treated as independent sub-projects. Each can be executed in parallel, but all must be finalized before producing the final project estimate. This approach accelerates development while ensuring the system is built on a robust and scalable foundation.

 

Estimation

Once domains are clearly defined, interfaces are documented, and the size and complexity of each domain are understood - including the number of tables, applications, and integrations - a comprehensive project estimate can be produced. This estimate guides resource allocation, scheduling, and prioritization for the remainder of the execution process.

 

The Third Milestone

The execution phase concludes with Milestone #3. At this point, the MVP must be fully operational and actively used in day-to-day business operations. All deployment processes should run seamlessly through the CI/CD pipelines established in the DevOps environment, eliminating the need for manual deployment steps entirely.

 

Approval at this stage validates that the development approach is practical, the architecture is functioning as intended, and users are satisfied with the delivered solution. It also confirms readiness to proceed with expanding functionality and scaling the system to full production use.

 

4. Evaluation

The evaluation phase focuses on assessing the outcomes of the execution phase and preparing the organization for a smooth transition from a project-based initiative to a permanent development and operational model. By reviewing results, formalizing documentation, and defining team structures, this phase lays the groundwork for scaling and sustaining the transformation process.

 

Purpose and Documentation

The primary objective of the evaluation is to ensure that the work completed during execution can be handed over seamlessly to the organization’s permanent development and operations teams. To achieve this, comprehensive documentation must be finalized.

 

This includes not only details of what was delivered during the execution phase but also clear descriptions of how applications are to be developed, tested, deployed, and maintained going forward. Establishing structured documentation ensures that new team members, external partners, and operational staff can work effectively without relying on historical knowledge of the legacy system.

 

Team Formation

Based on the estimates produced during the execution phase, the evaluation process identifies the team structure required to drive the remainder of the transformation. This involves determining:

  • Which roles and competencies are essential.

  • How many internal resources are needed.

  • Whether to recruit and train new IT staff.

  • To what extent external specialists or consulting partners are required.

By clarifying these needs early, the organization can ensure it has the right mix of skills to maintain momentum and achieve transformation goals.

 

Scaling the Project

With the team established, a refined estimate of the overall transformation timeline can be produced. Scaling involves balancing speed, quality, and resource utilization so that the project progresses efficiently without compromising stability.

 

Scaling also includes a technical dimension. As the system expands, future hardware and infrastructure requirements must be assessed to ensure performance, scalability, and reliability as new components are introduced and additional domains are transformed.

 

"You don't have to choose between stability and innovation. The Sitemule approach unites both - through a structured, milestone-driven transformation process."

 

The Fourth Milestone

The evaluation phase concludes with Milestone #4, which formalizes the transition from a transformation project into an ongoing development and operations framework.

 

At this stage, the organization achieves clarity on:

  • Which domains are most critical and thus represent the highest priority for transformation.

  • Which domains hold the most significant potential for optimization.

  • Which parts of the system can remain untouched without further changes.

 

This milestone establishes a strategic foundation for planning the remainder of the transformation journey, ensuring business continuity, operational readiness, and scalability.

The Transformation Process

The transformation journey is a structured, multi-phase process designed to modernize legacy IT systems, improve maintainability, and enable scalable, future-ready architectures. While modernization provides immediate usability improvements, transformation addresses deeper structural, organizational, and technical challenges to ensure long-term sustainability.

 

Strategic Benefits

By following this structured, milestone-driven approach, organizations achieve:

  • Business continuity while transitioning from legacy systems to modern architectures.

  • Reduced dependency on scarce, specialized legacy knowledge.

  • Faster delivery cycles enabled by CI/CD and microservices.

  • Improved scalability through domain-driven design and service-oriented architectures.

  • Future-proofing with modern technologies, frameworks, and security models.

 

This methodology strikes a balance between stability, innovation, and agility, ensuring transformation occurs in manageable, controlled phases while delivering incremental value throughout the process.

 

