DORA · 5 min read · 10 March 2026

DORA and the EU AI Act: The Compliance Overlap Your Legal Team May Have Missed

Two compliance regimes, both in force, both applicable to AI systems in financial services. The documentation obligations are structurally identical. Most firms are running two separate programmes when one would cover both.

Since January 2025, financial institutions operating in the EU have been subject to the Digital Operational Resilience Act. From August 2026, those same institutions deploying high-risk AI systems will also be subject to the EU AI Act's high-risk obligations. The two regulations were drafted by different legislative bodies with different policy objectives — one focused on operational resilience and third-party risk, the other on AI safety and fundamental rights. But for compliance teams trying to satisfy both, the practical documentation requirements overlap to a degree that the regulations' architects may not have fully intended.

The overlap is structural, not incidental. Both regulations require documentation of ICT systems, change management records, risk management systems, and logging arrangements. Both require firms to maintain this documentation in a form that can be produced on regulator request. Both impose obligations that attach to AI systems that are also ICT tools — which describes most AI systems in production in financial services.

Firms running separate DORA and EU AI Act compliance programmes are producing duplicated evidence at duplicated cost. The same information, assembled from the same underlying operational records, satisfies both. The question is whether compliance teams have recognised this or are still treating the two regimes as entirely separate workstreams.

DORA Obligations Relevant to AI

Key DORA Articles

Article 11 — ICT Risk Management Documentation

DORA Article 11 requires financial entities to maintain an ICT risk management framework and to document all ICT systems and their interdependencies. Article 11(1)(e) specifically requires documentation of "ICT systems, ICT assets and ICT dependencies." An AI model and its supporting infrastructure — the endpoints it calls, the data stores it reads, the monitoring systems it feeds — is an ICT system with ICT dependencies. The DORA obligation attaches to it directly.

Article 28 — Third-Party Provider Requirements

AI systems procured from third-party vendors — including foundation models accessed via API, AI-enabled SaaS products, and cloud AI services — fall under DORA's third-party risk provisions. Article 28 requires firms to assess and document the operational risk posed by third-party ICT providers. A firm using OpenAI's API to power a credit assessment tool must document that dependency under DORA, just as it must meet the deployer obligations for third-party AI systems under EU AI Act Article 26.

Incident Classification

DORA establishes a mandatory incident classification and reporting framework for major ICT-related incidents. An AI model failure that results in material customer harm — incorrect credit decisions, missed fraud, service unavailability — may simultaneously be a DORA-reportable ICT incident and an EU AI Act Article 73 incident that must be reported to the market surveillance authority. Both reporting obligations can be satisfied from the same underlying incident record.

EU AI Act Obligations Relevant to ICT Systems

Key EU AI Act Articles

Annex IV — Technical Documentation

The Annex IV technical file for high-risk AI systems requires a general description of the AI system including its hardware and software environment, its data dependencies, and its integration architecture. This is documentation of the AI system as an ICT asset — precisely what DORA Article 11(1)(e) also requires. The information needed to satisfy both requirements is the same information, drawn from the same operational records.

Article 9 — Risk Management System

EU AI Act Article 9 requires providers of high-risk AI systems to establish a risk management system that operates throughout the AI system's lifecycle. The system must identify and analyse known and foreseeable risks, adopt risk management measures, and be documented. DORA Article 6 requires an equivalent ICT risk management system. Both require the same documented, ongoing risk assessment process applied to the same systems.

Article 12 — Logging Requirements

High-risk AI systems must automatically log events to the extent this is technically feasible. The logs must be sufficient to trace the system's outputs back to the input data and to reconstruct the circumstances of any incident. DORA Article 11(6) requires firms to maintain logs of ICT-related incidents and to retain them for a defined period. Both logging obligations can be satisfied from a single, well-structured audit log.
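As a minimal sketch of what a single dual-purpose log entry might look like, the snippet below writes one structured record that carries both the traceability fields the EU AI Act asks for (input and output references, model version) and the incident-linking field a DORA incident log needs. All field names here are illustrative assumptions, not taken from either regulation's text.

```python
import json
from datetime import datetime, timezone

# Hypothetical schema: one audit log entry intended to serve both
# EU AI Act Article 12 traceability and DORA incident-log retention.
def audit_log_entry(model_id, model_version, input_ref, output_ref,
                    event_type="inference", incident_id=None):
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_id": model_id,            # which AI system produced the output
        "model_version": model_version,  # needed to reconstruct circumstances
        "input_ref": input_ref,          # hash/pointer back to the input data
        "output_ref": output_ref,        # pointer to the recorded output
        "event_type": event_type,        # "inference", "incident", "degradation", ...
        "incident_id": incident_id,      # links the entry into the incident log
    }

entry = audit_log_entry("credit-scorer", "2.4.1",
                        "sha256:ab12...", "decision-9913")
print(json.dumps(entry))  # one JSON line, appended to the shared audit log
```

A normal inference leaves `incident_id` empty; when an incident is opened, subsequent entries carry its identifier, so the same append-only log doubles as the incident record.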

The Structural Overlap: Article by Article

The overlap between the two regimes becomes clearest when you examine the specific articles side by side. Consider three direct correspondences.

DORA Article 11(1)(e) requires documentation of “ICT systems and their dependencies.” EU AI Act Annex IV Section 1 requires a general description of the AI system including its hardware and software environment. These are descriptions of the same thing — the AI system as a component in the firm’s ICT estate. A firm that maintains an AI model registry with infrastructure documentation satisfies both requirements from a single source.

DORA Article 11(6) requires documentation of ICT changes and a change management process. EU AI Act Annex IV Section 6 requires a log of significant changes to the AI system, with justification for each change. Both requirements are satisfied by the same change log — a record of every material modification to the AI system, with dates, authorisations, and reasons, maintained continuously and available on regulator request.

DORA’s incident classification framework requires firms to identify, log, and where necessary report major ICT incidents. EU AI Act Article 73 requires providers to report serious incidents to market surveillance authorities. For AI systems, the incident log that satisfies DORA contains the information needed to assess whether an EU AI Act Article 73 report is also required. Running two separate incident processes for the same event is redundant.
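The single-record idea above can be sketched as a function that evaluates one incident against both reporting routes at once. The classification criteria below are deliberate placeholders, not the actual DORA materiality thresholds or the EU AI Act Article 73 definition of a serious incident.

```python
# Illustrative only: one incident record, two reporting assessments.
# Thresholds and field names are placeholders, not regulatory text.
def reporting_obligations(incident):
    obligations = []
    # Placeholder stand-in for DORA major-incident classification criteria
    if incident["clients_affected"] >= 1000 or incident["service_downtime_h"] >= 2:
        obligations.append("DORA major ICT incident report")
    # Placeholder stand-in for an Article 73 serious-incident assessment
    if incident["involves_ai_system"] and incident["serious_harm"]:
        obligations.append("EU AI Act Article 73 serious incident report")
    return obligations

incident = {
    "clients_affected": 4200,
    "service_downtime_h": 0.5,
    "involves_ai_system": True,
    "serious_harm": True,
}
print(reporting_obligations(incident))
# → ['DORA major ICT incident report', 'EU AI Act Article 73 serious incident report']
```

The point is structural: both assessments read the same record, so one incident process can feed both reporting pipelines.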

What Firms Are Doing Wrong

The most common error is the assumption that because DORA and the EU AI Act are separate regulations, enacted through different legislative processes, they require separate compliance programmes. Legal teams frequently engage two sets of external advisers — one with a DORA specialism, one with an EU AI Act specialism — and manage two workstreams with two evidence sets, two documentation frameworks, and two review cycles.

The result is significant duplication of effort. Documentation that describes the same AI system is produced twice, in two different formats, stored in two different locations, and reviewed by two different teams. When the system changes, both sets of documentation must be updated. When a regulator requests documentation, firms must assemble it from two sources. The overhead is substantial and the risk of inconsistency between the two evidence sets is material — inconsistency that a forensic regulator will notice.

There is also a cost dimension. Firms running separate programmes are paying for the same compliance output twice. For a firm with fifteen high-risk AI systems in scope for both regimes, the duplicated advisory fees, internal staff time, and documentation overhead represent a significant and unnecessary expenditure.

One Audit Trail, Both Frameworks

Audital’s approach to multi-framework compliance is based on a single, continuously maintained audit trail that is annotated with framework mappings. Every event captured in the audit chain — deployment, change, incident, validation, monitoring check — is tagged with the regulatory requirements it satisfies. A deployment approval event satisfies EU AI Act Annex IV Section 6 change log requirements, DORA Article 11(6) change management requirements, and FCA SS1/23 model governance requirements simultaneously.

When a firm needs to produce DORA evidence for an audit, Audital exports the relevant events with DORA article references. When the same firm needs to produce an Annex IV technical file for EU AI Act compliance, Audital generates the document from the same underlying events with Annex IV section references. The underlying audit data is created once. It satisfies both frameworks from that single source.
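The framework-mapping idea described above reduces, at its core, to tagging each event with the requirements it evidences and treating a per-framework export as a filter. The sketch below assumes nothing about Audital's actual implementation; the tag strings and event shapes are illustrative.

```python
# Each audit event carries tags naming the requirements it evidences.
# Tag strings and event shapes are hypothetical, for illustration only.
events = [
    {"event": "deployment_approved",
     "tags": ["eu_ai_act:annex_iv_s6", "dora:art_11_6", "fca:ss1_23"]},
    {"event": "model_validated",
     "tags": ["eu_ai_act:art_9", "dora:art_6"]},
    {"event": "incident_logged",
     "tags": ["dora:incident_reporting", "eu_ai_act:art_73"]},
]

def export_for(framework, events):
    """Return the subset of events relevant to one framework's evidence request."""
    prefix = framework + ":"
    return [e for e in events
            if any(tag.startswith(prefix) for tag in e["tags"])]

dora_evidence = export_for("dora", events)
print([e["event"] for e in dora_evidence])
# → ['deployment_approved', 'model_validated', 'incident_logged']
```

Because the tags are attached when the event is captured, a DORA export and an Annex IV technical file are two views over one dataset rather than two separately maintained evidence sets.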

This is not a minor administrative convenience. For a firm with a significant AI deployment footprint, the difference between a single integrated compliance programme and two parallel ones represents months of duplicated effort per year and meaningful risk of regulatory inconsistency. The firms that recognise this early — and invest in a unified evidence infrastructure rather than a succession of framework-specific projects — will be materially better positioned as the regulatory landscape continues to develop.


Calculate the cost of running two programmes

The ROI calculator shows the compliance cost of separate DORA and EU AI Act programmes versus a unified Audital evidence chain — staff time, advisory fees, and the cost of documentation inconsistency risk.

Open the ROI Calculator →



Audital Compliance Team

audital.ai