Every automotive organization claims to take cybersecurity seriously. But there is an enormous gap between an organization that has a documented cybersecurity policy and one that systematically identifies, mitigates, and monitors cyber risks across its entire vehicle portfolio and supply chain. The difference is not just about tools or headcount — it is about organizational maturity: the depth to which cybersecurity practices are embedded in processes, culture, and decision-making at every level of the enterprise.

With UNECE R155 mandating a certified Cybersecurity Management System (CSMS) for type approval in over 60 countries, and ISO/SAE 21434 establishing the engineering standard for automotive cybersecurity, the question is no longer whether an organization needs cybersecurity capability but how to systematically build and measure that capability over time. A maturity model provides the framework for answering this question.

This article presents a practical, automotive-specific cybersecurity maturity model with five levels across six key domains. It is designed to help OEMs, Tier-1 suppliers, and cybersecurity leaders assess their current posture, identify the most impactful gaps, plan a realistic improvement roadmap, and demonstrate measurable progress to auditors, regulators, and customers.

Cybersecurity maturity pyramid: five levels from ad-hoc to optimizing, with the UNECE R155 minimum at Level 2

Why a Maturity Model for Automotive Cybersecurity?

Maturity models have a long pedigree in software engineering (CMMI), IT security (NIST CSF tiers), and systems engineering (ASPICE). Their value lies in providing a structured, repeatable framework for assessing capability that goes beyond binary compliance checks. A UNECE R155 audit can tell you whether your CSMS meets the minimum requirements for type approval, but it cannot tell you how robust, efficient, or sustainable your cybersecurity program is. A maturity model fills this gap.

Specific benefits of a maturity-based approach include:

  • Benchmarking: Compare your organization’s cybersecurity capability against a defined scale, enabling objective comparison across business units, product lines, or peer organizations.
  • Prioritization: Identify the domains where maturity is lowest relative to the risk exposure, focusing improvement investment where it will have the greatest impact.
  • Roadmap planning: Define concrete, incremental targets for maturity improvement rather than attempting to achieve an undefined “best practice” state in a single leap.
  • Audit preparation: Maturity assessments generate evidence artifacts (assessment reports, gap analyses, improvement plans) that directly support UNECE R155 CSMS audits and ISO/PAS 5112 cybersecurity audit requirements.
  • Executive communication: Maturity levels provide a simple, intuitive language for communicating cybersecurity posture to non-technical stakeholders — board members, investors, and customers understand “Level 3 out of 5” in a way they do not understand vulnerability statistics.

Maturity Levels Defined

Our automotive cybersecurity maturity model defines five levels, inspired by CMMI and adapted for the specific context of vehicle cybersecurity. Each level builds on the capabilities of the previous level:

Level 1: Initial / Ad-Hoc

Cybersecurity activities exist but are performed on an ad-hoc basis without consistent processes. Success depends on individual heroics rather than organizational capability. There is no formal cybersecurity management system. Threat analysis may be performed for specific projects when a knowledgeable engineer is available, but there is no standard methodology or tooling. Vulnerability management is reactive — vulnerabilities are addressed when they cause visible problems or when customers complain. There is no systematic security testing program. The organization may have a cybersecurity policy document, but it is not operationalized into daily engineering practice. At Level 1, the organization is at significant risk of failing a UNECE R155 CSMS audit because the minimum process requirements are not consistently met.

Level 2: Repeatable

Basic cybersecurity processes are defined and followed for major projects. The organization has established a CSMS that meets the minimum requirements for UNECE R155 certification. TARA is performed using a documented methodology for each vehicle project. Security requirements are derived from TARA results and tracked through development. Security testing is planned and executed, though coverage may be inconsistent. Vulnerability monitoring exists but may not cover the full software stack. Incident response procedures are documented. At Level 2, the organization can pass a CSMS audit and obtain type approval, but processes are often project-specific, manually intensive, and dependent on a small number of experienced practitioners. Consistency across projects and vehicle lines is limited.

Level 3: Defined

Cybersecurity processes are standardized across the organization and consistently applied to all vehicle programs. A central cybersecurity organization (or function) defines standard processes, methods, tools, and templates that all projects follow. TARA methodology is uniform, with a shared threat catalog and risk assessment criteria. Security requirements are integrated into the standard development process rather than being a parallel activity. Security testing includes SAST, DAST, fuzz testing, and penetration testing as standard deliverables. SBOM generation and vulnerability monitoring are automated. Supply chain cybersecurity requirements are standardized in supplier contracts. At Level 3, the organization has moved from project-dependent execution to organizational capability. New team members can be effective quickly because processes are documented and tooled. Results are predictable across different projects and teams.
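The automated SBOM vulnerability monitoring that characterizes Level 3 can be pictured as a simple matching loop: each SBOM component, identified by its package URL, is checked against an advisory feed. The sketch below uses illustrative data shapes; the Component record and the ADVISORIES feed are assumptions for the example, not a real CycloneDX parser or NVD integration.

```python
# Minimal sketch: match SBOM components against a vulnerability advisory feed.
# Data shapes are illustrative, not a real CycloneDX/NVD integration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    purl: str       # package URL, e.g. "pkg:generic/openssl@3.0.8"
    version: str

# Hypothetical advisory feed, keyed by purl without the version suffix.
ADVISORIES = {
    "pkg:generic/openssl": [{"cve": "CVE-2023-0464", "affected": {"3.0.8"}}],
}

def base_purl(purl: str) -> str:
    """Strip the version suffix from a package URL."""
    return purl.split("@", 1)[0]

def affected_components(sbom: list[Component]) -> list[tuple[Component, str]]:
    """Return (component, CVE) pairs where the SBOM version is listed as affected."""
    hits = []
    for comp in sbom:
        for adv in ADVISORIES.get(base_purl(comp.purl), []):
            if comp.version in adv["affected"]:
                hits.append((comp, adv["cve"]))
    return hits

sbom = [Component("pkg:generic/openssl@3.0.8", "3.0.8"),
        Component("pkg:generic/zlib@1.3", "1.3")]
print(affected_components(sbom))
```

In production, the feed would be a CVE/NVD or vendor advisory service, and version matching would use version ranges rather than exact strings.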

Level 4: Managed

Cybersecurity processes are quantitatively measured, and performance data drives decision-making. The organization collects and analyzes metrics across all cybersecurity activities: TARA completion rates, security requirement coverage, test coverage, vulnerability detection rates, mean time to remediate, incident response times, and supplier compliance rates. Baselines and control limits are established for key metrics. When a metric deviates from its baseline, root cause analysis is triggered. Process performance is predictable within defined tolerances. At Level 4, cybersecurity is managed as an engineering discipline with the same quantitative rigor applied to quality and reliability. Executive leadership has visibility into cybersecurity posture through dashboards and regular reviews. Investment decisions are informed by data, not anecdotes.

Level 5: Optimizing

The organization continuously improves its cybersecurity processes based on quantitative feedback, emerging threat intelligence, and lessons learned. Innovation is systematic: new tools, techniques, and approaches are piloted, measured, and adopted based on evidence of effectiveness. The organization contributes to industry standards and shares threat intelligence with peers through ISACs (Information Sharing and Analysis Centers) such as Auto-ISAC. Automation is pervasive — AI-assisted threat modeling, automated compliance evidence generation, and machine learning-based anomaly detection are integrated into standard workflows. The organization anticipates emerging threats (quantum computing, AI-generated attacks, new vehicle architectures) and proactively evolves its cybersecurity capability. At Level 5, the organization is an industry leader that sets the benchmark for automotive cybersecurity excellence.

Maturity Domains

Maturity is assessed across six domains that collectively cover the full scope of automotive cybersecurity. An organization’s overall maturity is the profile of its maturity levels across all domains — it is common for organizations to be at different levels in different domains.

Domain 1: Governance and Organization

This domain covers the organizational structures, policies, roles, and accountability frameworks that enable cybersecurity. It addresses questions such as: Is there a designated cybersecurity officer or team? Does the cybersecurity function have executive sponsorship and adequate budget? Are cybersecurity roles and responsibilities clearly defined across engineering, product management, and operations? Is there a cybersecurity policy that is reviewed and updated regularly? Does the organization maintain a risk appetite statement that guides cybersecurity investment decisions?

Governance is often the domain with the widest variance across automotive organizations. Some have mature governance structures inherited from IT security programs, while others have minimal governance despite significant engineering cybersecurity capability. Without strong governance, even excellent technical practices tend to erode over time as competing priorities claim resources.

Domain 2: Risk Management

Risk management covers the processes for identifying, assessing, treating, and monitoring cybersecurity risks. In automotive, the primary risk assessment method is TARA (Threat Analysis and Risk Assessment) as defined in ISO/SAE 21434. This domain evaluates: Is TARA performed systematically for all items and components? Are threat catalogs maintained and updated with current threat intelligence? Is risk assessment consistent across different assessors and projects? Are risk treatment decisions documented and traceable? Is residual risk formally accepted by accountable stakeholders? Are risk assessments updated when the vehicle architecture changes?

Domain 3: Engineering Security

Engineering security covers the technical practices embedded in the vehicle development lifecycle: secure design, secure implementation, and security verification and validation. This domain evaluates: Are security requirements derived from TARA and tracked through development? Are secure coding guidelines defined and enforced? Is security architecture reviewed at design milestones? Are security testing activities (SAST, DAST, fuzz testing, penetration testing) defined and consistently executed? Is verification evidence linked to security requirements? Are cryptographic implementations validated? Is secure boot and secure update implemented and tested?

Domain 4: Supply Chain Security

Supply chain security addresses the management of cybersecurity risk across the multi-tier automotive supply chain. This domain evaluates: Are cybersecurity requirements included in supplier contracts and purchase agreements? Are supplier cybersecurity capabilities assessed before selection? Do suppliers deliver cybersecurity evidence (TARA, test reports, SBOM) as part of component delivery? Are supplier vulnerabilities tracked and managed? Is there a process for handling cybersecurity incidents that originate in supplied components? Are open-source components tracked and monitored through SBOM?

Domain 5: Operations

Operations covers post-production cybersecurity activities: monitoring, incident detection and response, vulnerability management, and update management. This domain evaluates: Is there continuous monitoring of the deployed vehicle fleet for cybersecurity events? Is there a defined incident response process with clear escalation procedures? Are vulnerabilities tracked from disclosure through remediation? Is there a capability for emergency OTA security updates? Are cybersecurity events reported to regulators as required? Is there a vehicle SOC or equivalent monitoring capability?

Domain 6: Continuous Improvement

Continuous improvement covers the feedback loops that drive ongoing enhancement of cybersecurity capability. This domain evaluates: Are lessons learned from incidents and near-misses captured and acted upon? Are post-mortem reviews conducted after security events? Are cybersecurity metrics collected, analyzed, and used to drive improvement? Is the threat landscape systematically monitored for emerging risks? Are cybersecurity processes audited internally on a regular schedule? Is there a formal improvement program with targets and tracking?

Comprehensive Maturity Matrix

The following matrix provides concrete indicators for each maturity level across all six domains. Use this as an assessment tool by identifying which level best describes your organization’s current practice in each domain:

Governance & Organization
  • Leadership: Level 1, no designated cybersecurity role; Level 2, part-time cybersecurity coordinator; Level 3, dedicated cybersecurity team with a defined mandate; Level 4, CISO/Head of Cybersecurity with board reporting; Level 5, cybersecurity integrated into executive strategy.
  • Policy: Level 1, no formal policy; Level 2, basic policy document exists; Level 3, comprehensive policy, regularly reviewed; Level 4, policy effectiveness measured via KPIs; Level 5, policy continuously adapted to emerging threats.
  • Budget: Level 1, no dedicated budget; Level 2, project-level budget allocation; Level 3, central cybersecurity budget; Level 4, risk-based budget allocation with ROI tracking; Level 5, dynamic budget tied to threat landscape changes.

Risk Management
  • TARA: Level 1, no systematic TARA; Level 2, TARA on major projects using a basic method; Level 3, standardized TARA methodology with shared catalogs; Level 4, TARA quality metrics tracked and inter-rater consistency measured; Level 5, AI-assisted TARA with continuous threat feed integration.
  • Risk treatment: Level 1, informal risk acceptance; Level 2, documented risk treatment per project; Level 3, standardized risk criteria and treatment process; Level 4, portfolio-level risk monitoring with dashboards; Level 5, predictive risk modeling with proactive mitigation.

Engineering Security
  • Security requirements: Level 1, no formal security requirements; Level 2, security requirements for key components; Level 3, requirements derived from TARA and tracked in ALM; Level 4, requirement coverage measured and gaps flagged automatically; Level 5, requirements auto-generated from threat models.
  • Security testing: Level 1, ad-hoc testing by motivated individuals; Level 2, planned penetration test at pre-SOP; Level 3, SAST, DAST, fuzz testing, and penetration testing as standard activities; Level 4, test coverage metrics drive investment decisions; Level 5, continuous automated testing in the CI/CD pipeline.

Supply Chain
  • Supplier requirements: Level 1, no cybersecurity requirements for suppliers; Level 2, basic cybersecurity clause in contracts; Level 3, detailed cybersecurity requirements with SBOM mandated; Level 4, supplier cybersecurity capability scored and tracked; Level 5, collaborative threat intelligence sharing with suppliers.
  • Component security: Level 1, no supplier security evidence collected; Level 2, supplier self-declaration accepted; Level 3, TARA, test reports, and SBOM required at delivery; Level 4, supplier evidence audited and discrepancies tracked; Level 5, automated supply chain risk monitoring.

Operations
  • Monitoring: Level 1, no fleet cybersecurity monitoring; Level 2, basic log collection from connected vehicles; Level 3, vehicle SOC with defined detection rules; Level 4, ML-based anomaly detection with fleet correlation; Level 5, predictive threat detection with automated response.
  • Incident response: Level 1, no defined process; Level 2, documented IR plan, tested annually; Level 3, trained IR team with playbooks for common scenarios; Level 4, IR metrics tracked and response times within SLAs; Level 5, automated containment with post-incident ML feedback.

Continuous Improvement
  • Lessons learned: Level 1, no systematic learning; Level 2, post-incident reviews when major events occur; Level 3, formal lessons-learned process after all incidents; Level 4, lessons learned tracked to process changes with metrics; Level 5, predictive insights from cross-industry intelligence.
  • Internal audit: Level 1, no cybersecurity audits; Level 2, audit for CSMS certification only; Level 3, annual internal cybersecurity audit program; Level 4, continuous audit with automated evidence collection; Level 5, audit findings drive real-time process adaptation.

Conducting a Self-Assessment

A maturity self-assessment should be conducted annually, or more frequently when significant organizational changes occur. The following process has been validated across multiple automotive organizations:

Step 1: Assemble the Assessment Team

The assessment team should include representatives from each stakeholder group: cybersecurity engineering, product management, quality/compliance, supply chain management, and operations. Including diverse perspectives prevents the assessment from reflecting only the cybersecurity team’s view. An external facilitator can improve objectivity, especially for the first assessment. The team size should be 6–10 people for a focused, productive workshop.

Step 2: Gather Evidence

Before the assessment workshop, collect evidence artifacts for each domain: organizational charts and role descriptions (governance), TARA reports and risk registers (risk management), security test reports and requirement traceability matrices (engineering), supplier contracts and cybersecurity questionnaire results (supply chain), monitoring dashboards and incident reports (operations), and audit reports and improvement action logs (continuous improvement). Evidence turns the assessment from an opinion exercise into a fact-based evaluation.

Step 3: Domain-by-Domain Assessment

For each domain, walk through the maturity indicators from Level 1 to Level 5. For each indicator, determine which level best describes the organization’s current, demonstrated practice — not aspirational plans or one-off achievements, but what is consistently and repeatably done. Record the rationale and supporting evidence for each rating. Where team members disagree, discuss the specific evidence and reach consensus. The final rating for each domain is the level that is fully and consistently achieved; partial achievement of a higher level does not qualify.
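The "fully and consistently achieved" rule can be stated compactly: a domain's rating is the highest level that every one of its indicators reaches, which is simply the minimum of the indicator ratings. A small sketch (the indicator names and ratings are illustrative):

```python
# Rating rule from Step 3: partial achievement of a higher level does not
# count, so a domain's level is the minimum across its indicator ratings.
def domain_level(indicator_ratings: dict[str, int]) -> int:
    """Return the fully-achieved maturity level (1-5) for one domain."""
    return min(indicator_ratings.values())

# Illustrative governance ratings: two indicators at Level 3+ cannot
# lift the domain past the weakest indicator.
governance = {"leadership": 3, "policy": 4, "budget": 2}
print(domain_level(governance))  # budget caps the domain at Level 2
```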

Step 4: Identify Priority Gaps

Compare the assessed maturity profile against the target maturity profile. Not every domain needs to be at Level 5 — the target should reflect the organization’s risk exposure, regulatory requirements, and strategic priorities. For an OEM seeking UNECE R155 type approval, Level 2 is the minimum across all domains. For an OEM positioning itself as a cybersecurity leader, Level 4 across governance, risk management, and engineering is a reasonable target. The gaps between current and target maturity, weighted by risk impact, define the improvement priorities.
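The risk-weighted gap calculation behind this prioritization can be sketched in a few lines. The domain targets and the 1-5 risk-exposure weights below are illustrative assumptions, not prescribed values:

```python
# Step 4 sketch: rank domains by risk-weighted gap, i.e. (target level
# minus current level) scaled by an assumed 1-5 risk-exposure weight.
current = {"governance": 2, "risk_mgmt": 3, "engineering": 3,
           "supply_chain": 1, "operations": 1, "improvement": 2}
target  = {"governance": 3, "risk_mgmt": 4, "engineering": 4,
           "supply_chain": 3, "operations": 3, "improvement": 3}
risk_weight = {"governance": 3, "risk_mgmt": 4, "engineering": 4,
               "supply_chain": 5, "operations": 5, "improvement": 2}

def prioritized_gaps(current, target, weight):
    """Return (domain, score) pairs sorted by descending risk-weighted gap."""
    gaps = {d: max(0, target[d] - current[d]) * weight[d] for d in current}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

for domain, score in prioritized_gaps(current, target, risk_weight):
    print(domain, score)
```

With these example numbers, supply chain and operations surface first, matching the common gap pattern described later in this article.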

Step 5: Build the Improvement Roadmap

For each priority gap, define specific improvement actions, owners, timelines, resource requirements, and success criteria. Group actions into quarterly milestones aligned with the organization’s planning cadence. Each milestone should advance at least one domain by one maturity level. Avoid trying to advance all domains simultaneously — focus on the two or three domains with the highest risk-weighted gaps first.

Common Maturity Gaps in Automotive Organizations

Based on assessments conducted across dozens of automotive organizations, certain maturity gaps are consistently prevalent:

Governance Gaps

The most common governance gap is the absence of a dedicated cybersecurity budget that is protected from being absorbed by other engineering priorities. Many organizations have competent cybersecurity engineers but fund their work through project-level budgets, which means cybersecurity investment is negotiated project by project rather than being a strategic organizational commitment. The second most common gap is the lack of executive-level cybersecurity reporting — cybersecurity status is reported to engineering management but does not reach the board or C-suite, limiting the organization’s ability to make informed risk-based investment decisions.

Supply Chain Gaps

Supply chain cybersecurity maturity is typically 1–2 levels below the organization’s internal maturity. OEMs that have invested heavily in their own cybersecurity capability often discover that their Tier-1 suppliers are still at Level 1 or 2. The most common supply chain gaps are the absence of SBOM requirements in supplier contracts, the lack of a process for validating supplier-delivered cybersecurity evidence, and the inability to track and manage vulnerabilities in supplied components after delivery. These gaps represent significant risk because suppliers are responsible for a substantial portion of the vehicle software stack.

Operations Gaps

Post-production monitoring is the domain where the automotive industry lags furthest behind IT security. Many organizations that have mature engineering security practices (Level 3 or 4) are still at Level 1 in operations — they have no continuous monitoring of the deployed fleet, no vehicle SOC, and limited incident response capability. This gap is increasingly problematic as UNECE R155 explicitly requires post-production cybersecurity monitoring and incident response, and as the shift to software-defined vehicles means that the attack surface continues to evolve after production.

Measurement Gaps

The transition from Level 3 (Defined) to Level 4 (Managed) is the most difficult leap in the maturity model because it requires the organization to establish quantitative measurement of cybersecurity process performance. Many organizations have well-defined processes but collect no metrics on how well those processes work. They cannot answer basic questions like: What is our average time from vulnerability disclosure to deployed patch? What percentage of TARA-identified threats have verified mitigations? What is our security test coverage across the vehicle portfolio? Without these metrics, improvement is driven by intuition rather than evidence, and it is impossible to demonstrate to auditors that the CSMS is effective, not just documented.
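The first of those questions, average time from vulnerability disclosure to deployed patch, shows how simple the underlying computation is once the data exists. A sketch with illustrative vulnerability records:

```python
# Level 4 metric sketch: mean time to remediate (MTTR), measured in days
# from disclosure to deployed patch. Records below are illustrative.
from datetime import date

vulns = [
    {"id": "V-1", "disclosed": date(2024, 1, 10), "patched": date(2024, 2, 9)},
    {"id": "V-2", "disclosed": date(2024, 3, 1),  "patched": date(2024, 3, 31)},
    {"id": "V-3", "disclosed": date(2024, 4, 5),  "patched": None},  # still open
]

def mean_time_to_remediate(records) -> float:
    """Mean days from disclosure to deployed patch, over closed vulnerabilities."""
    closed = [(r["patched"] - r["disclosed"]).days for r in records if r["patched"]]
    return sum(closed) / len(closed) if closed else float("nan")

print(mean_time_to_remediate(vulns))  # 30.0
```

The hard part at Level 4 is not the arithmetic but instrumenting the processes so that disclosure and deployment dates are captured reliably across every program.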

Aligning Maturity Assessment with ISO/SAE 21434 and UNECE R155

The maturity model is designed to complement, not replace, ISO/SAE 21434 and UNECE R155 compliance. Here is how the domains map to regulatory requirements:

UNECE R155 CSMS Alignment

UNECE R155, paragraph 7.2, defines the requirements for the CSMS that must be certified by the approval authority; Annex 5 catalogues the threats and corresponding mitigations that the risk assessment must address. The maturity model’s governance domain maps directly to CSMS organizational requirements (7.2.2.2), risk management maps to the risk assessment process (7.2.2.3), engineering security maps to the development phase requirements (7.2.2.4), supply chain maps to supplier management requirements (7.2.2.5), and operations maps to post-production requirements (7.2.2.6). A Level 2 maturity across all domains is broadly equivalent to meeting the minimum CSMS requirements. Higher maturity levels represent capability that exceeds the minimum but that auditors increasingly expect to see as the industry matures.

ISO/SAE 21434 Work Product Alignment

ISO/SAE 21434 defines specific work products (documents, records, analyses) that serve as evidence of cybersecurity engineering practice. The maturity model’s engineering security domain directly maps to ISO/SAE 21434 Clauses 9–14 (concept, development, validation, production). Each maturity level implies a progression in work product quality: Level 2 organizations produce the required work products but with variable quality and completeness; Level 3 organizations produce work products using standardized templates and methods; Level 4 organizations measure work product quality through metrics such as TARA completeness scores and requirement traceability ratios.

ISO/PAS 5112 Audit Alignment

ISO/PAS 5112 provides guidelines for auditing cybersecurity engineering, essentially defining how to audit against ISO/SAE 21434 requirements. A maturity assessment generates many of the same evidence artifacts that an ISO/PAS 5112 audit would examine: process descriptions, role definitions, work product samples, metric reports, and improvement records. Organizations that conduct regular maturity assessments are significantly better prepared for external audits because the assessment process itself exercises the evidence collection and gap analysis capabilities that auditors will evaluate.

A maturity model is not a substitute for regulatory compliance, but it is the most effective tool for going beyond minimum compliance to build genuine cybersecurity capability. The organizations that treat UNECE R155 as a ceiling rather than a floor are the ones most likely to experience cybersecurity incidents that damage their brand, their customers, and their bottom line.

Roadmap Planning: From Current to Target Maturity

Moving from one maturity level to the next typically requires 12–18 months of focused effort per domain. The following guidelines help organizations plan realistic improvement roadmaps:

Level 1 to Level 2: Establishing the Foundation

The priority at this stage is establishing the basic processes required for UNECE R155 CSMS certification. Key actions include appointing a cybersecurity coordinator or team, developing a CSMS document set (policy, TARA methodology, incident response plan), performing TARA on active vehicle programs, establishing a vulnerability monitoring process, and implementing basic security testing. Timeline: 6–12 months. This is often driven by a regulatory deadline (type approval submission) and represents the minimum viable cybersecurity program.

Level 2 to Level 3: Standardization and Scaling

The priority is moving from project-dependent execution to organizational standardization. Key actions include establishing a central cybersecurity function with defined authority, standardizing TARA methodology with shared threat catalogs and tooling, integrating security requirements into the ALM (Application Lifecycle Management) system, automating SBOM generation and vulnerability monitoring, defining standard security testing requirements for all projects, and establishing supply chain cybersecurity requirements. Timeline: 12–18 months. This stage often requires investment in tooling and training to enable consistent execution at scale.

Level 3 to Level 4: Measurement and Management

The priority is establishing quantitative management of cybersecurity processes. Key actions include defining cybersecurity KPIs and metrics for each domain, implementing dashboards and automated reporting, establishing baselines and control limits for key metrics, integrating cybersecurity metrics into executive reporting, and implementing data-driven improvement cycles. Timeline: 12–18 months. This stage requires cultural change as much as technical capability — teams must embrace measurement as a tool for improvement rather than as a surveillance mechanism.

Level 4 to Level 5: Innovation and Leadership

The priority is systematic innovation and industry leadership. Key actions include investing in AI-assisted security analysis tools, establishing threat intelligence sharing partnerships, contributing to industry standards development, implementing predictive risk modeling, and developing advanced automation for compliance evidence generation. Timeline: ongoing. Level 5 is not a destination but a practice of continuous innovation that adapts as the threat landscape and technology evolve.

How ThreatZ Accelerates Maturity Improvement

ThreatZ is designed to accelerate movement up the maturity curve across multiple domains:

  • Risk Management (TARA): ThreatZ provides standardized, AI-assisted TARA methodology with shared threat catalogs, consistent risk scoring, and automated requirement generation — directly enabling the transition from Level 2 to Level 3 in the risk management domain.
  • Engineering Security: Full traceability from threats through requirements to verification evidence, with automated coverage analysis, supports Level 3 and Level 4 engineering security maturity.
  • Supply Chain: ThreatZ SBOM management with continuous vulnerability monitoring and supplier evidence tracking advances supply chain maturity from Level 2 to Level 3 and beyond.
  • Measurement: Built-in metrics dashboards covering TARA completion, requirement coverage, vulnerability remediation rates, and SBOM health provide the quantitative foundation required for Level 4 maturity across all domains.
  • Audit Readiness: Automated compliance evidence generation and traceability reporting directly support ISO/PAS 5112 audit preparation and UNECE R155 CSMS recertification.

Key Takeaways

  • Cybersecurity maturity goes beyond regulatory compliance to measure how deeply security is embedded in organizational processes, culture, and decision-making.
  • Five maturity levels (Ad-Hoc, Repeatable, Defined, Managed, Optimizing) provide a structured progression path from basic capability to industry leadership.
  • Maturity is assessed across six domains: governance, risk management, engineering security, supply chain, operations, and continuous improvement.
  • The most common gaps in automotive organizations are in supply chain cybersecurity (1–2 levels below internal maturity), post-production operations, and quantitative measurement.
  • Level 2 maturity across all domains is the minimum required for UNECE R155 CSMS certification; Level 3–4 represents a robust, sustainable cybersecurity program.
  • The transition from Level 3 to Level 4 (introducing quantitative measurement) is the most difficult leap and requires cultural change alongside technical capability.
  • Maturity assessments generate evidence artifacts that directly support UNECE R155 audits and ISO/PAS 5112 audit requirements.
  • Realistic improvement planning targets one maturity level advancement per domain every 12–18 months, focusing on the 2–3 highest-risk gaps first.

Accelerate Your Cybersecurity Maturity

ThreatZ provides the standardized TARA methodology, SBOM management, traceability, and metrics dashboards that enable rapid maturity advancement across governance, engineering, and supply chain domains.

Explore ThreatZ