Legacy mainframe systems power some of the world's most critical operations — from bank transactions to government records. Yet modernizing these systems has long been slow, costly, and risky. AI agents are changing that. This guide shows you exactly how to use AI agents to plan, execute, and validate a mainframe migration — step by step.
Step-by-Step Guide to Using AI Agents for Legacy Mainframe Migration
Why This Topic Matters in 2026
The numbers are hard to ignore. The legacy modernization market hit $24.98 billion in 2025 and is on track to reach $56.87 billion by 2030. Yet 70% of Fortune 500 companies still run software that is more than two decades old. An estimated 200 billion lines of COBOL code still process transactions at banks, insurance companies, and government agencies every single day.
The problem is real: the developers who built these systems are retiring. Maintenance costs keep rising. And every year a company delays migration, the technical debt compounds.
AI agents are the first technology to make mainframe migration genuinely faster. AWS Transform — launched in May 2025 as the world's first agentic AI service for enterprise transformation — has already analyzed an estimated 1.1 billion lines of code and saved customers more than 810,000 hours of manual effort. BMW Group cut its testing time by 75% using AI-powered migration tools. Air Canada reduced expected time and cost for a major migration by 80%. One financial services company reduced a 700–800 hour migration effort by 40% using generative AI agents.
This is not a future promise. It is happening now.
What Are AI Agents in the Context of Mainframe Migration?
An AI agent is a software program that can plan, reason, and take actions to complete a specific task — without a human doing each step manually. In mainframe migration, agents are specialized. Each agent handles one part of the migration pipeline.
Think of it like a factory production line. Each worker (agent) does one job well. An orchestration layer manages the flow between them. Compare this to using a general AI assistant like GitHub Copilot in your IDE — that is a conversation. The multi-agent approach is a production line.
Here is how the agent types break down:
| Agent Type | What It Does |
|---|---|
| Code Analyzer Agent | Scans COBOL or PL/I files, maps structure and logic |
| Data Source Agent | Identifies VSAM, DB2, IMS, and file-based data sources |
| Dependency Optimizer Agent | Finds third-party libraries and flags modern replacements |
| Business Logic Extractor Agent | Pulls out core business rules from legacy code |
| Documentation Agent | Auto-generates technical documentation for undocumented systems |
| Code Converter Agent | Transforms COBOL into modern Java or other target languages |
| Test Planning Agent | Creates test plans, data collection scripts, and automation scripts |
| Activity Analysis Agent | Analyzes runtime data to identify what code is actually used |
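The production-line idea can be sketched as a simple orchestration loop. This is an illustrative sketch only: the agent classes, the `run` interface, and the shared-context pattern are assumptions for explanation, not any vendor's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class MigrationContext:
    """Shared state that flows down the agent pipeline."""
    source_files: list
    artifacts: dict = field(default_factory=dict)

class Agent:
    """One station on the production line."""
    def run(self, ctx: MigrationContext) -> None:
        raise NotImplementedError

class CodeAnalyzerAgent(Agent):
    def run(self, ctx):
        # Placeholder: a real agent would parse COBOL/PL/I sources here.
        ctx.artifacts["loc"] = sum(len(f.splitlines()) for f in ctx.source_files)

class DocumentationAgent(Agent):
    def run(self, ctx):
        # Consumes the analyzer's output, produces a (toy) doc artifact.
        ctx.artifacts["doc"] = f"{ctx.artifacts['loc']} lines analyzed"

def run_pipeline(agents, ctx):
    """Run each agent in order; each one enriches the shared context."""
    for agent in agents:
        agent.run(ctx)
    return ctx.artifacts

ctx = MigrationContext(source_files=["IDENTIFICATION DIVISION.\nPROGRAM-ID. PAYROLL."])
run_pipeline([CodeAnalyzerAgent(), DocumentationAgent()], ctx)
```

The design point is that agents communicate through shared artifacts rather than direct calls, which is what lets an orchestration layer reorder, retry, or parallelize them.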
Step-by-Step Guide: Using AI Agents for Mainframe Migration
Step 1 — Assess Your Current Mainframe Environment
Before any agent touches your code, you need a clear picture of what you have.
Run a code discovery job as the mandatory first step. Tools like AWS Transform for Mainframe and Microsoft's Azure Migrate can automate this. The Code Analyzer Agent scans your source files and reports:
| Metric | Why It Matters |
|---|---|
| Total lines of code | Sets scope and timeline expectations |
| Code complexity score | Flags high-risk areas for human review |
| Language distribution (COBOL, PL/I, JCL, etc.) | Determines which agent pipelines to activate |
| Number of programs and copybooks | Reveals reuse patterns and dependency chains |
| Missing or undocumented components | Surfaces gaps before they become blockers |
This step alone can reduce your assessment timeline from several months to a matter of days.
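A first-pass discovery scan is straightforward to sketch. The extension-to-language mapping below is an illustrative subset, not a complete inventory of mainframe file types, and real tools parse content rather than trusting extensions.

```python
from collections import Counter
from pathlib import Path

# Illustrative mapping of file extensions to mainframe languages.
LANGUAGE_BY_EXT = {
    ".cbl": "COBOL", ".cob": "COBOL",
    ".pli": "PL/I", ".jcl": "JCL", ".cpy": "Copybook",
}

def discover(root: str) -> dict:
    """Walk a source tree and report basic Step 1 metrics."""
    total_loc = 0
    languages = Counter()
    for path in Path(root).rglob("*"):
        lang = LANGUAGE_BY_EXT.get(path.suffix.lower())
        if lang:
            total_loc += len(path.read_text(errors="ignore").splitlines())
            languages[lang] += 1
    return {"total_loc": total_loc, "files_by_language": dict(languages)}
```

Even this toy version answers the first two questions in the table above (scope and language distribution) before any conversion work begins.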
Step 2 — Choose Your Migration Pattern
Not every mainframe workload should be migrated the same way. The industry uses an "8 R" framework to define migration strategies. AI agents can help you select the right one for each workload.
| Pattern | Description | Best For |
|---|---|---|
| Rehost (Lift & Shift) | Move as-is to cloud | Low-complexity batch jobs |
| Replatform | Move with minimal code changes | COBOL to managed runtime |
| Refactor | Restructure code without changing function | Modernizing business logic |
| Rearchitect | Redesign the application from scratch | High-value, high-complexity apps |
| Rebuild | Write new code that replaces the old system | End-of-life legacy apps |
| Retire | Decommission systems no longer needed | Duplicate or obsolete workloads |
| Retain | Keep as-is for now | Systems with no viable migration path |
| Relocate | Move to cloud without modification | Infrastructure-only moves |
The Activity Analysis Agent is especially useful here. It reads runtime data to show you which batch jobs and online transactions are actually used — and which ones can be retired without migrating at all.
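Pattern selection can be expressed as a scoring heuristic. The thresholds and scales below (1 to 10) are invented for illustration; a real planning agent would weigh far more signals, and the final call belongs to architects, not the heuristic.

```python
def suggest_pattern(complexity: int, business_value: int, usage_events: int) -> str:
    """Toy heuristic mapping workload attributes (1-10 scales, plus a
    runtime usage count) to one of the 8 R patterns. Thresholds are
    illustrative assumptions, not a vendor recommendation."""
    if usage_events == 0:
        return "Retire"       # Activity analysis shows the code is dead
    if complexity <= 3 and business_value <= 3:
        return "Rehost"       # Simple, low-value: lift and shift
    if complexity <= 5:
        return "Replatform"   # Moderate: move with minimal code changes
    if business_value >= 8:
        return "Rearchitect"  # High value justifies a redesign
    return "Refactor"         # Default: restructure, keep behavior
```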
Step 3 — Extract and Document Business Logic
This is where AI agents deliver some of their biggest value. Legacy mainframe systems often have no current documentation. Institutional knowledge lives only in the heads of retiring developers.
The Business Logic Extractor Agent reads your COBOL code and produces structured output showing:
- What the program does in plain language
- What business rules govern key decisions
- What data it reads and writes
- What other programs it calls
The Documentation Agent then takes this output and builds human-readable technical specs. Systems that previously had zero documentation can be fully documented in days rather than months.
| Before AI Agents | After AI Agents |
|---|---|
| No documentation exists | Full technical specs auto-generated |
| Business rules buried in code | Rules extracted and structured |
| Knowledge held by retiring devs | Knowledge captured in documentation |
| Manual review of millions of lines | Automated analysis in hours |
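Extraction agents of this kind typically wrap a model call behind a structured prompt and validate the response. The sketch below assumes a caller-supplied `call_llm` function and an invented prompt; it is a pattern illustration, not the interface of any specific product.

```python
import json

# Hypothetical extraction prompt; real agents use far richer instructions.
EXTRACTION_PROMPT = """You are a Business Logic Extractor Agent.
Read the COBOL program below and return JSON with keys:
summary, business_rules, reads, writes, calls.

{source}
"""

def extract_business_logic(cobol_source: str, call_llm) -> dict:
    """call_llm is whatever model client you use; it must accept a prompt
    string and return a JSON string."""
    raw = call_llm(EXTRACTION_PROMPT.format(source=cobol_source))
    spec = json.loads(raw)
    # Guarantee every required section exists, even if the model omits one.
    for key in ("summary", "business_rules", "reads", "writes", "calls"):
        spec.setdefault(key, [])
    return spec
```

The validation step matters: downstream agents (documentation, test planning) consume this output, so a missing key should be caught here, not three stages later.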
Step 4 — Plan Your Migration Waves
Large mainframe estates cannot be migrated all at once. You need a wave plan — a phased schedule that groups applications by dependency, risk, and business priority.
A Migration Planning Agent can process unstructured inputs like existing documents, email threads, and business requirement files to build a wave plan automatically. It applies business context to the technical dependency map produced in Step 1.
A good wave plan looks like this:
| Wave | Focus | Risk Level | Timeline |
|---|---|---|---|
| Wave 1 | Low-complexity batch jobs, retired workloads | Low | Weeks 1–8 |
| Wave 2 | Core business logic, moderate complexity | Medium | Weeks 9–20 |
| Wave 3 | Mission-critical transaction systems | High | Weeks 21–40 |
| Wave 4 | Final cutover and decommission | Very High | Weeks 41–52 |
The exact timing depends on your codebase size and target environment. AWS Transform has helped customers cut modernization timelines from multi-year projects to months.
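At its core, wave planning is dependency layering: nothing migrates before the applications it depends on. A minimal sketch of that layering, ignoring risk and business priority for clarity:

```python
def plan_waves(dependencies: dict) -> list:
    """Group applications into waves so each app migrates only after
    everything it depends on. dependencies maps app -> set of apps it calls."""
    migrated, waves = set(), []
    remaining = set(dependencies)
    while remaining:
        # An app is ready when all of its dependencies are already migrated.
        wave = {a for a in remaining if dependencies[a] <= migrated}
        if not wave:
            raise ValueError("Circular dependency: needs manual review")
        waves.append(sorted(wave))
        migrated |= wave
        remaining -= wave
    return waves
```

A real planning agent layers risk scores and business context on top of this ordering, but the dependency constraint is the non-negotiable part: violating it is what causes mid-migration outages.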
Step 5 — Run the Code Conversion
This is the heart of the mainframe migration process. The Code Converter Agent transforms your legacy code — typically COBOL or PL/I — into a modern target language like Java.
Microsoft has integrated GPT-4 into GitHub Copilot specifically to support this conversion. AWS Transform uses a multi-agent architecture built on nearly two decades of AWS mainframe migration expertise.
Key conversion tasks the agents handle:
- COBOL program structure to Java class mapping
- Copybook to Java object translation
- JCL job stream conversion to cloud-native equivalents
- VSAM file access to relational or NoSQL database operations
- Batch job orchestration to modern workflow engines
The Dependency Optimizer Agent runs in parallel, scanning third-party COBOL libraries and flagging where modern open-source equivalents can replace proprietary mainframe components.
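The human-in-the-loop gate described below can be made concrete with a small dispatch wrapper. Both `convert` (the conversion agent call) and `compiles` (a compile check on the generated Java) are injected placeholders here, not real tool APIs.

```python
def convert_program(cobol_source: str, convert, compiles) -> dict:
    """Run one program through a converter agent with a review gate.

    convert:  callable that sends COBOL to your conversion agent
              and returns Java source (assumption, not a real API).
    compiles: callable that compile-checks the result, e.g. by
              shelling out to javac.
    """
    java_source = convert(cobol_source)
    if compiles(java_source):
        return {"status": "converted", "output": java_source}
    # Anything the pipeline cannot verify goes to a developer.
    return {"status": "needs_human_review", "output": java_source}
```

The point of the gate is routing, not judgment: automated checks decide only which programs a human must look at, never which ones are correct.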
Important: Full automation is not yet possible for every codebase. Humans must stay in the loop for validation. Each COBOL codebase is unique. The AI agents dramatically accelerate the work — they do not eliminate the need for experienced developers.
Step 6 — Automate Testing
Testing has traditionally consumed up to half of total mainframe migration project timelines. AI agents attack this bottleneck directly.
The Test Planning Agent automatically generates:
| Test Artifact | What It Covers |
|---|---|
| Test plans | Coverage strategy for converted applications |
| Test data collection scripts | Pulls representative data from mainframe sources |
| Regression test automation scripts | Validates that converted code matches original behavior |
| Business rule validation tests | Checks that extracted logic produces correct outputs |
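The regression tests in the table above reduce to one operation repeated at scale: compare legacy output against converted output record by record. A minimal sketch of that comparison, assuming both systems export their results as keyed records:

```python
def regression_diff(legacy_records, converted_records, key):
    """Compare output datasets from the legacy and converted systems.
    Returns the keys of records that differ or exist on only one side."""
    legacy = {r[key]: r for r in legacy_records}
    converted = {r[key]: r for r in converted_records}
    mismatches = []
    for k in legacy.keys() | converted.keys():
        if legacy.get(k) != converted.get(k):
            mismatches.append(k)
    return sorted(mismatches)
```

In practice the comparison also needs tolerance rules (packed-decimal rounding, date formats, trailing spaces), which is exactly the kind of repetitive casework the Test Planning Agent generates.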
BMW Group used AI-powered test automation during its mainframe migration and achieved a 75% reduction in testing time and a 60% increase in test coverage simultaneously. That combination — faster and more thorough — was not achievable with manual testing approaches.
Step 7 — Migrate Your Data
Code migration without data migration is incomplete. Mainframe data typically lives in VSAM files, IMS databases, or DB2 instances — all of which need to move to cloud-native storage.
The Data Source Agent maps every data source in your codebase, including:
- Access patterns (read, write, read-write frequency)
- Data structures and field definitions
- Relationships between data sources and programs
- Volume and growth estimates
For VSAM data specifically, AWS has developed hands-on migration patterns that move records to Amazon DynamoDB using generative AI to handle denormalization: restructuring flat, fixed-width mainframe records and their related lookups into self-contained items that read efficiently in a cloud database.
| Mainframe Data Source | Cloud Target Options |
|---|---|
| VSAM files | Amazon DynamoDB, Amazon S3 |
| DB2 | Amazon Aurora, Amazon RDS |
| IMS | Amazon DynamoDB, relational DB |
| Flat files | Amazon S3, data lake |
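The first mechanical step in any VSAM move is parsing fixed-width records into key-value items. The copybook layout below is hypothetical, invented for illustration, but the parsing pattern is the standard one:

```python
# Field layout from a hypothetical copybook: (name, start offset, length).
LAYOUT = [("account_id", 0, 8), ("cust_name", 8, 20), ("balance", 28, 9)]

def parse_vsam_record(record: str) -> dict:
    """Turn a fixed-width VSAM record into a key-value item suitable
    for a document or NoSQL store such as DynamoDB."""
    item = {}
    for name, start, length in LAYOUT:
        item[name] = record[start:start + length].strip()
    # COBOL amounts like PIC 9(7)V99 store an implied decimal point,
    # so divide by 100 to recover the actual value.
    item["balance"] = int(item["balance"]) / 100
    return item
```

The hard part that AI assists with is not this parsing but deciding the target shape: which related records to fold into a single item, and which access patterns drive the partition key.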
Step 8 — Validate and Cut Over
The final step is validation and production cutover. This means running the migrated system in parallel with the mainframe for a defined period, comparing outputs, and confirming correctness before switching off the legacy system.
AI agents support this phase through:
- Automated output comparison between mainframe and cloud systems
- Exception flagging when outputs diverge
- Performance benchmarking against mainframe SLAs
- Rollback planning if issues are detected
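The automated output comparison above often starts with end-of-day control totals per batch job. A minimal sketch, assuming both systems can report totals keyed by job name:

```python
def parallel_run_check(mainframe_totals: dict, cloud_totals: dict, tolerance=0.0):
    """Compare end-of-day control totals from both systems during the
    parallel run. Returns (ok, exceptions); any divergence blocks cutover."""
    exceptions = {}
    for job, expected in mainframe_totals.items():
        actual = cloud_totals.get(job)
        if actual is None or abs(actual - expected) > tolerance:
            exceptions[job] = {"mainframe": expected, "cloud": actual}
    return (len(exceptions) == 0, exceptions)
```

A `tolerance` of zero is the right default for financial totals; relax it only for fields where rounding differences are documented and accepted.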
Do not skip the parallel run phase. The risk of cutting over without parallel validation is significant, especially for transaction-processing systems where errors can have direct financial consequences.
Real-World Results: What AI Agents Have Delivered
These are documented outcomes from modernization projects that used AI agents:
| Organization | What They Did | Result |
|---|---|---|
| BMW Group | Mainframe migration with AI testing | 75% faster testing, 60% more test coverage |
| Air Canada | Node.js runtime modernization across thousands of Lambda functions | 80% reduction in time and cost |
| QAD | Legacy ERP code migration with AWS Transform | 60–70% productivity gains, weeks reduced to days |
| Top 15 global insurer (McKinsey case study) | GenAI code modernization and testing | 50%+ improvement in efficiency and speed |
| FinTech company (McKinsey case study) | 20,000 lines of COBOL migration | 40% reduction in estimated effort |
Common Mistakes to Avoid
| Mistake | Why It's a Problem | What to Do Instead |
|---|---|---|
| Skipping the assessment phase | Hidden dependencies surface mid-migration and cause delays | Always run code discovery first |
| Expecting full automation | Every COBOL codebase is unique; agents cannot handle every edge case | Keep humans in the loop for validation |
| Migrating everything at once | Risk accumulates when too many systems change simultaneously | Use a phased wave approach |
| Neglecting data migration | Migrated code with no data access is non-functional | Plan data and code migration together |
| Skipping parallel validation | Errors in converted code may not appear until production | Always run both systems in parallel before cutover |
| Ignoring retired workloads | Teams waste effort migrating code that is no longer used | Use Activity Analysis Agents to identify dead code first |
Choosing the Right AI Tool for Your Mainframe Migration
The market has several strong options in 2026:
| Tool | Vendor | Best For |
|---|---|---|
| AWS Transform for Mainframe | Amazon Web Services | IBM z/OS to Java, large-scale COBOL migration |
| GitHub Copilot (with AI agents) | Microsoft / GitHub | COBOL-to-Java, .NET modernization, IDE-integrated work |
| Azure Migrate + Azure Accelerate | Microsoft | End-to-end cloud migration with dedicated enterprise support |
| COBOL Migration Factory (open source) | Microsoft / Bankdata | Customizable agent pipelines, self-hosted |
| Amazon Q Developer | AWS | .NET and Java modernization, code transformation |
All of the major cloud providers now have agentic AI migration capabilities. More than 75% of enterprises are using AI as part of their modernization strategy in 2026, according to DreamFactory research.
Tips for Getting the Best Results
Start with a pilot. Pick a low-complexity batch application for your first AI-assisted migration. This lets your team learn the tooling and workflow before tackling mission-critical systems.
Pair COBOL experts with modern developers. COBOL experts bring irreplaceable domain knowledge. Modern developers bring architecture expertise. AI agents bring pattern recognition at scale. All three are needed.
Document as you go. Use the Documentation Agent output to build a knowledge base. This reduces your dependency on any single team member and creates an asset for future maintenance.
Measure everything. Track lines of code analyzed, agent hours saved versus manual hours, test coverage before and after, and defect rates in the converted code. Data helps you justify ongoing investment.
Plan for compliance. In regulated industries like banking, insurance, and healthcare, compliance requirements must be built into the migration design from the start — not added after.
Conclusion
AI agents have fundamentally changed what is possible in legacy mainframe migration. Projects that once took three to five years can now be completed in months. Testing that consumed half a project's timeline can now be automated. Business logic buried in millions of lines of undocumented COBOL can now be extracted and mapped in days.
The technology is real, the results are documented, and the tools are generally available today. The biggest risk in 2026 is not starting. Every quarter of delay widens the gap between organizations that are modernizing and those that are not.
Use this guide as your starting framework. Run the discovery step first, choose your migration pattern, and let the agents handle the heavy lifting — while your team stays in the loop for the decisions that matter.
