
EA Maturity Assessment from Level 2 to 5

  • Writer: Anand Nerurkar
  • Oct 25
  • 6 min read

🏗 Realistic EA Maturity Assessment Flow (Level 2 → 5) with CXO Presentation

Step 1: Discovery & Planning

Objective: Define scope, domains, stakeholders, and assessment plan.

Activities & Tools:

  1. Identify EA domains: Business, Application, Data, Technology, Security, Governance, Tools, People.

  2. Conduct discovery workshops with stakeholders (Business, IT, Security, Compliance).

    • Tools: MS Teams / Zoom, Miro / MURAL

  3. Identify sources of evidence: CMDB, EA repositories (LeanIX / Orbus / Mega HOPEX), Confluence, dashboards.

  4. Prepare assessment plan: domain coverage, interviews, surveys, artifact verification.

    • Tools: Excel / MS Project, Confluence

  5. Request tool access: CMDB, EA repository, dashboards.

Output:

  • Assessment Plan & Timeline

  • Stakeholder List & Tool Access Requests
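
As a lightweight illustration, the assessment plan above can also be captured as structured data rather than only a spreadsheet, so it can be versioned alongside other EA artifacts. This is a minimal Python sketch; the domain names, stakeholder roles, and evidence sources are illustrative placeholders, not a fixed template.

```python
# Minimal sketch: represent the Step 1 assessment plan as structured data
# so it can be version-controlled with other EA artifacts.
# All domain names, roles, and evidence sources below are illustrative.
from dataclasses import dataclass

@dataclass
class DomainPlan:
    domain: str                  # e.g. "Data"
    stakeholders: list[str]      # roles to interview
    evidence_sources: list[str]  # where artifacts live
    interview_weeks: int = 2     # planned elapsed time for interviews

assessment_plan = [
    DomainPlan("Business", ["CIO", "Head of Strategy"], ["Confluence capability maps"]),
    DomainPlan("Data", ["CDO", "Data Stewards"], ["Collibra catalog", "DQ scorecards"]),
    DomainPlan("Security", ["CISO"], ["ServiceNow GRC", "Splunk audit logs"]),
]

for plan in assessment_plan:
    print(f"{plan.domain}: interview {', '.join(plan.stakeholders)}; "
          f"verify evidence in {', '.join(plan.evidence_sources)}")
```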

Step 2: Stakeholder Interviews & Workshops

Objective: Collect qualitative insights and validate current practices.

Activities & Tools:

  • Schedule meetings/workshops per domain.

  • Conduct structured interviews to assess: current processes, governance adherence, challenges.

  • Capture evidence: notes, screenshots, Confluence logs.

  • Validate current-state diagrams in workshops.

    • Tools: Miro / MURAL, Confluence, SharePoint

Identify Stakeholders and Define Interview Objectives

| Stakeholder | Objective of Discussion | Example Questions |
| --- | --- | --- |
| CTO / CIO | Understand enterprise vision, digital strategy, and architectural priorities | How do you measure the success of the technology strategy? How often are architecture decisions reviewed centrally? How is alignment ensured between IT and business? |
| CISO | Understand security governance and compliance posture | How are security policies enforced across projects? Is threat modeling done during the design phase? Is there an automated compliance framework? |
| Head of Delivery | Assess delivery maturity and architecture alignment | How is architecture compliance ensured during delivery? Are architects embedded in agile squads? How is technical debt tracked or reported? |
| Domain Architects / Solution Architects | Evaluate process adoption, standards, and tool usage | Are reference architectures followed? Is documentation standardized? How do you handle architecture exceptions? |
| Head of Data / CDO | Understand data strategy and governance | Is there a master data repository? How do you manage lineage and data ownership? How is data quality measured? |

Output:

  • Stakeholder interview notes

  • Workshop outputs

  • Evidence log per domain


Step 3: Domain-Specific Questionnaires / Surveys

Objective: Capture quantitative evidence for Level 2 → 3 transition.

Use a Structured EA Maturity Questionnaire

You prepare a domain-wise assessment sheet with rating scales (1–5). Each level represents a clear maturity description.

Example – Architecture Governance

| Parameter | Description | Maturity Level (1–5) | Example Evidence |
| --- | --- | --- | --- |
| EA Governance Board | Is there a formal Architecture Review Board (ARB)? | 1 – No ARB; 3 – Ad hoc ARB; 5 – Enterprise-wide ARB with decision log | ARB minutes, review templates |
| Architecture Standards | Are standards published and enforced? | 1 – Not defined; 3 – Defined but not enforced; 5 – Enforced via tooling | Architecture standards doc |
| Exception Management | Is there a defined exception process? | 1 – No process; 3 – Manual; 5 – Automated with traceability | Exception register, Confluence |
| KPI Tracking | Are EA KPIs monitored regularly? | 1 – None; 3 – Quarterly review; 5 – Continuous dashboard | EA scorecard, dashboard |

Each stakeholder rates the domain based on their area. You record observations and qualitative notes alongside the scores.
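
A minimal sketch of how those ratings can be rolled up into preliminary domain scores is shown below. It assumes the responses are exported from MS Forms / Excel into a flat table; the column names and CSV layout are illustrative, not a fixed template.

```python
# Minimal sketch: aggregate 1-5 questionnaire ratings into per-domain maturity
# scores. Column names and the table layout are assumptions, not a standard.
import pandas as pd

# One row per (respondent, domain, parameter) rating collected via MS Forms / Excel.
responses = pd.DataFrame([
    {"domain": "Governance", "parameter": "EA Governance Board",     "rating": 3},
    {"domain": "Governance", "parameter": "Architecture Standards",  "rating": 2},
    {"domain": "Governance", "parameter": "Exception Management",    "rating": 2},
    {"domain": "Data",       "parameter": "Data Lineage",            "rating": 2},
    {"domain": "Data",       "parameter": "Data Quality",            "rating": 3},
])

# Average the ratings per domain to get a preliminary maturity score.
domain_scores = responses.groupby("domain")["rating"].mean().round(1)
print(domain_scores)
# Qualitative notes are kept separately and attached as evidence per domain.
```

These preliminary scores are then validated against artifacts in Step 4 before they are treated as final.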


Sample Domain Questions

| Domain | Questions | Tool / Evidence |
| --- | --- | --- |
| Business | Capability maps documented? Linked to KPIs? Standardized processes across LOBs? | MS Forms / Qualtrics, Confluence |
| Application | Service catalog complete? Interfaces standardized? Microservices documented? | LeanIX / ServiceNow CMDB, MS Forms |
| Data | Metadata catalog available? Data lineage tracked? Data quality enforced? | Collibra / Informatica, MS Forms |
| Technology | Infra diagrams complete? IaC templates used? Monitoring integrated? | Azure DevOps, Terraform, Power BI |
| Security | Policies enforced? Audit logs available? DevSecOps implemented? | ServiceNow GRC, Splunk / ELK |
| Governance | ARB meetings conducted? Decisions logged? Standards enforced? | Confluence, SharePoint |
| Tools | EA repository adoption active? Versioning maintained? | LeanIX / Mega HOPEX |
| People | EA roles defined? Architects trained? Architecture Guild / CoP active? | Confluence, MS Teams, Miro |

Output:

  • Domain dataset

  • Preliminary maturity scores

Step 4: Artifact Verification & Repository Scan

Objective: Validate artifacts for evidence-based scoring.

Activities & Tools:

  • Scan EA repositories / CMDB (ServiceNow, LeanIX, Orbus, Mega HOPEX)

  • Verify artifacts per domain: capability maps, service catalog, data lineage, infra diagrams, audit logs, governance artifacts, training logs

  • Capture screenshots, PDFs, dashboards as evidence

  • Tools: LeanIX, Mega HOPEX, ServiceNow, Power BI / Tableau, Confluence / SharePoint

Validate Responses with Artifact Reviews

Once responses are collected, validate them against the actual deliverables (artifacts).

| Domain | Artifact Examples | What to Look For |
| --- | --- | --- |
| Governance | EA Charter, ARB minutes, exception logs | Existence of governance body and process adherence |
| Business Architecture | Business capability maps, process models | Mapping of business functions to applications |
| Application Architecture | Application inventory, dependency diagrams | Portfolio visibility, redundancy, reuse |
| Data Architecture | Data catalog, lineage diagrams, DQ scorecards | Data ownership, quality metrics, lineage completeness |
| Technology Architecture | Cloud reference architectures, CI/CD templates | Standardization, automation, resilience |
| Security Architecture | IAM policies, threat models, compliance reports | Evidence of Zero Trust, least privilege, and control monitoring |

🧠 Example: If the Head of Delivery says “All projects go through ARB”, you validate this by checking for ARB minutes, review templates, and sign-offs in Confluence or ServiceNow. If the evidence is missing, you downgrade the maturity score.
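
Parts of this validation can be scripted. The sketch below queries a ServiceNow table through the standard REST Table API to check whether a project actually has logged ARB reviews; the instance URL, table name ("u_arb_review"), and field names are assumptions and must be mapped to however ARB decisions are recorded in your CMDB/GRC setup.

```python
# Minimal sketch of evidence checking: pull ARB review records for a project
# from ServiceNow via the REST Table API. Instance URL, table name, and
# field names are hypothetical placeholders.
import requests

SN_INSTANCE = "https://yourinstance.service-now.com"   # assumption
AUTH = ("assessor", "********")                        # use a read-only account

def arb_reviews_for(project_id: str) -> list[dict]:
    resp = requests.get(
        f"{SN_INSTANCE}/api/now/table/u_arb_review",
        params={"sysparm_query": f"project={project_id}", "sysparm_limit": 50},
        auth=AUTH,
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("result", [])

reviews = arb_reviews_for("PRJ0012345")
# No records found -> the "all projects go through ARB" claim is not evidenced,
# so the governance score is downgraded and the gap is logged.
print(f"ARB review records found: {len(reviews)}")
```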

Output:

  • Verified Artifact List

  • Evidence Repository

Step 5: Scoring, Heatmap & Gap Analysis (Level 2 → 3)

Objective: Quantify maturity, identify gaps, and define improvement plans.

Activities & Tools:

  • Score each domain: 1 → Initial, 2 → Repeatable, 3 → Defined

  • Identify gaps: missing processes, standards, tools, or skills

  • Generate heatmaps / radar charts to visualize gaps

  • Document improvement plan per domain

Sample Table (Level 2 → 3)

| Domain | Current | Target | Gap | Improvement Plan | Tool / Evidence |
| --- | --- | --- | --- | --- | --- |
| Business | 2 | 3 | Capability maps not KPI-linked | Update maps, link KPIs, governance review | LeanIX, Confluence |
| Application | 2 | 3 | Interfaces not standardized | Standardize interfaces, complete catalog | ServiceNow CMDB |
| Data | 2 | 3 | Data lineage not tracked | Implement lineage tracking, enforce quality | Collibra / Power BI |
| Technology | 2 | 3 | IaC templates missing | Implement IaC templates & dashboards | Azure DevOps |
| Security | 2 | 3 | Partial audit logs, manual policies | Automate audit, enforce standards | ServiceNow GRC / Splunk |
| Governance | 2 | 3 | ARB exists but limited enforcement across LOBs | Schedule monthly ARB, log decisions | Confluence |
| Tools | 2 | 3 | Repository adoption low | Training & enforcement | LeanIX / Mega HOPEX |
| People | 2 | 3 | Training / Guild not active | Run guild sessions & training | Confluence, MS Teams |
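
The current-vs-target scores in the table above translate directly into a radar chart. A minimal matplotlib sketch is shown below, using the sample Level 2 / Level 3 values as placeholders; real assessments would plot the per-domain scores from the questionnaire rollup.

```python
# Minimal sketch: visualize current vs. target maturity per domain as a radar
# chart. The scores below mirror the sample table above; replace with real data.
import numpy as np
import matplotlib.pyplot as plt

domains = ["Business", "Application", "Data", "Technology",
           "Security", "Governance", "Tools", "People"]
current = [2, 2, 2, 2, 2, 2, 2, 2]
target  = [3, 3, 3, 3, 3, 3, 3, 3]

# Compute one angle per domain and close the polygon by repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for label, scores in [("Current (Level 2)", current), ("Target (Level 3)", target)]:
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains)
ax.set_ylim(0, 5)
ax.legend(loc="lower right")
plt.title("EA Maturity: Current vs Target")
plt.savefig("ea_maturity_radar.png", dpi=150)
```

The same per-domain scores feed the heatmap view built in Power BI or Tableau.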

Output:

  • Heatmap & radar chart

  • Gap analysis table with improvement plan

Step 6: CXO Presentation, Sponsorship & Funding

Objective: Secure executive buy-in and funding for each milestone.

CXO Presentation Slide Details

  1. Slide 1: Executive Summary

    • EA maturity level summary

    • Key objectives for Level 2 → 3

    • Expected business benefits / ROI

  2. Slide 2: Current Maturity (Level 2)

    • Heatmaps / radar charts for all domains

    • Evidence: screenshots, CMDB logs, dashboards

  3. Slide 3: Gap Analysis & Findings

    • Domain-specific gaps

    • Missing processes, standards, tools, or skills

    • Evidence captured from tools, surveys, CMDB

  4. Slide 4: Risks & Benefits

    • Risks if gaps persist (compliance, inefficiency)

    • Benefits: standardization, KPI alignment, governance maturity

  5. Slide 5: Proposed Roadmap / Milestones

    • Short-term (0–6 months): quick wins, critical gaps

    • Mid-term (6–12 months): core domain improvements

    • Long-term (12–18 months): cross-domain integration, advanced automation

    • Include tools and owners per milestone

  6. Slide 6: Evidence Repository & Dashboard Snapshots

    • Screenshots, dashboards, CMDB references

    • Demonstrates data-driven approach

  7. Slide 7: Funding & Sponsorship Request

    • Investment needed per milestone

    • Expected ROI / business impact

    • Approval requested
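
If the deck is regenerated every assessment cycle, the slide skeleton outlined above can be produced programmatically and then filled in from the evidence repository. This is a hedged sketch using python-pptx with placeholder titles and bullets; it assumes the default PowerPoint template layouts rather than your corporate template.

```python
# Minimal sketch: generate the CXO deck skeleton with python-pptx so the slide
# structure stays consistent across assessment cycles. Titles and bullets are
# placeholders to be replaced with real findings and evidence.
from pptx import Presentation

slides = {
    "Executive Summary": ["Current EA maturity", "Level 2 -> 3 objectives", "Expected ROI"],
    "Current Maturity (Level 2)": ["Domain heatmap / radar chart", "Evidence snapshots"],
    "Gap Analysis & Findings": ["Domain-specific gaps", "Missing processes, standards, tools"],
    "Risks & Benefits": ["Risks if gaps persist", "Benefits of standardization & governance"],
    "Proposed Roadmap": ["0-6 months: quick wins", "6-12 months: core domains", "12-18 months: integration"],
    "Evidence Repository": ["Dashboard snapshots", "CMDB references"],
    "Funding & Sponsorship Request": ["Investment per milestone", "Expected ROI", "Approval requested"],
}

prs = Presentation()
layout = prs.slide_layouts[1]          # "Title and Content" in the default template
for title, bullets in slides.items():
    slide = prs.slides.add_slide(layout)
    slide.shapes.title.text = title
    body = slide.placeholders[1].text_frame
    body.text = bullets[0]
    for bullet in bullets[1:]:
        body.add_paragraph().text = bullet

prs.save("ea_maturity_cxo_deck.pptx")
```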

Activities & Tools:

  • Conduct CXO workshop / discussion

  • Capture feedback and adjust roadmap

  • Tools: PowerPoint, Tableau / Power BI, Confluence / SharePoint, MS Teams

Output:

  • CXO-approved roadmap & funding

  • Mandate for Level 2 → 3 execution

Repeat the CXO presentation for Level 3 → 4 and Level 4 → 5

  • Overlap initiatives to accelerate the overall timeline (~2–3 years in total)

Step 7: Define EA Roadmap (Short-, Mid-, Long-Term)

Activities & Tools:

  • Translate gaps into domain-specific initiatives

  • Assign owners, KPIs, tools

  • Integrate roadmap into ARB/governance process

Sample Accelerated Roadmap Table

| Horizon | Domains / Level | Milestone / Initiative | Duration |
| --- | --- | --- | --- |
| Short-term | Business, Application | Level 2 → 3: standardize catalog, capability map | 6–12 mo |
| Mid-term | Data, Security, Tools | Level 3 → 4: automation, compliance dashboards | 6–12 mo |
| Long-term | Technology, Governance | Level 4 → 5: predictive KPIs, enterprise-wide adoption | 6–12 mo |
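
To integrate the roadmap into the ARB/governance process, each initiative can be raised as a tracked ticket. The sketch below creates Jira issues via the standard Jira REST API; the project key, issue type, and field mapping are assumptions to adapt to how your ARB backlog is actually configured.

```python
# Minimal sketch: push roadmap initiatives into Jira as ARB-tracked tickets via
# the Jira REST API. Project key, issue type, and field mapping are assumptions.
import requests

JIRA_URL = "https://yourcompany.atlassian.net"     # assumption
AUTH = ("ea.office@yourcompany.com", "api-token")  # Jira Cloud: email + API token

initiatives = [
    {"summary": "Level 2->3: standardize application catalog", "owner_hint": "Application domain"},
    {"summary": "Level 3->4: implement data lineage tracking",  "owner_hint": "Data domain"},
]

for item in initiatives:
    payload = {
        "fields": {
            "project": {"key": "ARB"},          # assumed project key
            "issuetype": {"name": "Task"},
            "summary": item["summary"],
            "description": f"EA roadmap initiative ({item['owner_hint']}). "
                           "Track progress and evidence in the ARB review cycle.",
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    print("Created", resp.json()["key"])
```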

Step 8: Monitoring & Continuous Improvement

Activities & Tools:

  • Track maturity quarterly via dashboards

  • Measure KPIs: adoption, compliance, skill uplift

  • Adjust roadmap & improvement plan based on progress

  • Tools: Power BI / Tableau, Confluence / SharePoint

Output:

  • Evidence-based EA Maturity Dashboard

  • Continuous Improvement Plan

🧠 KPI Examples

| Area | KPI |
| --- | --- |
| EA Governance | % projects reviewed by ARB; avg deviation resolution time |
| Business Alignment | % IT investments mapped to business capabilities |
| Reuse | # of reused reference architectures or components |
| DevOps Integration | % of pipelines with policy-as-code enforcement |
| Tooling | # of artifacts version-controlled in EA repository |
| Data Governance | % of critical data with ownership defined |
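
Most of these KPIs reduce to simple ratios over exported project or portfolio records. A small illustrative computation for two of them is shown below; the field names are assumptions about the export format from your PPM/CMDB tooling.

```python
# Minimal sketch: compute two sample KPIs from exported project records.
# Field names ("arb_reviewed", "mapped_capability") are assumptions.
import pandas as pd

projects = pd.DataFrame([
    {"project": "P1", "arb_reviewed": True,  "mapped_capability": "Payments"},
    {"project": "P2", "arb_reviewed": False, "mapped_capability": None},
    {"project": "P3", "arb_reviewed": True,  "mapped_capability": "Lending"},
])

# % of projects with at least one ARB review.
arb_coverage = projects["arb_reviewed"].mean() * 100
# % of projects mapped to a business capability.
capability_alignment = projects["mapped_capability"].notna().mean() * 100

print(f"% projects reviewed by ARB: {arb_coverage:.0f}%")
print(f"% investments mapped to business capabilities: {capability_alignment:.0f}%")
```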

🧩 Tools & Frameworks Used

| Purpose | Tool / Framework |
| --- | --- |
| EA Maturity Framework | TOGAF EA Capability Model / Gartner EA Maturity Model |
| EA Repository | LeanIX / Sparx EA / Bizzdesign |
| Surveys & Scoring | Microsoft Forms / Excel dashboards |
| Workshops | Miro / MURAL for visual collaboration |
| Governance Tracking | Confluence + Jira for ARB tickets |
| Visualization | Power BI or Tableau for KPI dashboards |

Interview Narrative

“We perform accelerated, evidence-based EA maturity assessments. Using stakeholder interviews, domain questionnaires, and artifact verification with tools like LeanIX, ServiceNow, and Power BI, we generate heatmaps, gap analyses backed by real evidence, and improvement plans. Level 2 → 3 is achieved in 6–12 months with a CXO-approved roadmap, funding, and sponsorship. Level 3 → 4 and 4 → 5 initiatives run in parallel, completing full EA maturity in ~2–3 years. Each step is measurable, evidence-driven, domain-specific, and CXO-ready, exactly as real banking EA programs operate.”

We know the bank has reached Level 5 EA maturity because architecture is fully integrated into delivery, not just advisory. EA practices are embedded into CI/CD pipelines using Policy-as-Code, OPA agents, Terraform for compliant infra provisioning, and automated exception or expedited approvals. Continuous assessment dashboards track compliance, gaps, and improvement initiatives in real time. This ensures proactive governance, measurable risk reduction, and alignment with business KPIs, which are the hallmarks of Level 5 maturity.
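
As an illustration of that Policy-as-Code gate, a CI/CD step can ask an OPA server whether a proposed deployment complies with architecture policy before it proceeds. The sketch below calls OPA's data API; the endpoint, policy package path, and input fields are assumptions, and the actual rules would live in the bank's Rego policy bundle.

```python
# Minimal sketch of a Policy-as-Code gate in a CI pipeline: query an OPA server
# for an allow/deny decision on a proposed deployment. The OPA URL, policy path
# ("ea/architecture/allow"), and input fields are assumptions.
import sys
import requests

OPA_URL = "http://opa.internal:8181"   # assumption: in-cluster OPA service

deployment_input = {
    "input": {
        "service": "payments-api",
        "uses_approved_reference_architecture": True,
        "has_threat_model": True,
        "exception_ticket": None,
    }
}

resp = requests.post(f"{OPA_URL}/v1/data/ea/architecture/allow",
                     json=deployment_input, timeout=10)
resp.raise_for_status()
allowed = resp.json().get("result", False)

if not allowed:
    print("Architecture policy check failed - raise an exception ticket via the ARB.")
    sys.exit(1)
print("Architecture policy check passed.")
```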


 
 
 
