  • September 19, 2024
  • 9 min read

Did DORA’s last update create an encryption loophole?

Mathew Pregasen

Content Contributor

Through a series of legislative acts spanning 2022 to 2024, the European Union brought DORA (the Digital Operational Resilience Act) into existence. DORA is a long and detailed body of law that collectively defines the requirements any financial institution must follow. While these requirements are varied, the overarching goal is to prevent breaches of a financial institution’s information and communication technology (ICT).

In a nutshell, DORA is legislation that intends to safeguard financial institutions from being hacked.

Because of its breadth and layered complexity, DORA can be difficult to digest. It is impossible to summarize fully in a single article, as the rules span testing, information sharing, third-party risk management, monitoring, and incident response.

However, as an encryption company, we have paid special attention to DORA’s stance on cryptography—specifically what it (constantly) references as the “confidentiality” and “authenticity” of data. Today, we’re going to decode DORA’s tenets on encryption. We will also discuss a potential loophole that DORA might have created.

Let’s begin by breaking down some of DORA’s history and fundamentals.

DORA happened in stages

The online literature surrounding DORA is particularly tricky because DORA was implemented in stages: it was initially signed into law on December 14th, 2022, and then joined by two supplements that established the specifics. The 2022 law was very broad; the supplements actually define what DORA does and doesn’t dictate.

The first supplement was signed on February 24th, 2024; it specified fines for noncompliance and what third-party entities are covered by the legislation. This supplement has been published and is currently enforceable.

The second supplement was signed on March 13th, 2024; it further specified what entities are covered by DORA, what exact technical standards are expected, and what thresholds determine when something is a major ICT breach. This supplement has been “adopted” but isn’t technically enforceable until it finishes a routine three-month scrutiny that precedes publication.

While DORA is presently enforceable, institutions are only expected to be fully DORA-compliant by January 17th, 2025. This includes making any necessary changes to encryption and data custody practices—guidelines spelled out by the second supplement’s technical standards section in particular.

Unfortunately, many online resources detailing encryption compliance recommendations are out of date due to changes made to the standard between 2022 and 2024.

Who does DORA apply to?

DORA isn’t just for banks. DORA regulates credit institutions, payment institutions, investment firms, trading venues, crypto companies, insurance intermediaries, crowdfunding companies, credit rating agencies, and a dozen other entities defined by Article 2. The most notable list entry is the last addition: third-party service providers.

In essence, a third-party service provider is any company that provides ICT services to a financial institution (such as Google Cloud). A number of criteria dictate which companies qualify as ICT service providers. These include any company that services a significant percentage of financial institutions (10% or more) or any company that handles sensitive data or a critical service. However, the legislation places the burden on financial institutions to determine which third-party contractors pose a legitimate ICT risk and require an ESA (European Supervisory Authority) audit.

Meanwhile, much of DORA does not apply to small and non-interconnected investment firms and certain payment institutions, as specifically defined by Article 16, Section 1.

These qualification rules can, admittedly, be quite confusing, and DORA casts a wide net over who is liable. But, in short, DORA governs all financial institutions and any nontrivial ICT subcontractors, with exemptions only for boutique investment firms and niche institutions.

DORA’s stance on encryption

The original text of DORA did not provide explicit instructions for protecting data. Instead, it established a goal to protect the “availability, authenticity, integrity, and confidentiality of data” in all three settings: (i) at rest, (ii) in transit, and (iii) in use. However, in the latest supplement, DORA does establish some very explicit instructions for data protection.

Simply put, DORA mandates encryption for data at rest and data in transit. It also recommends encryption for data in use, but with some (unsurprising) caveats.

Data in transit

For data in transit, DORA explicitly requires:

  • That data be available and authentic, maintain integrity, and be confidential during network transmission (Article 14)
  • That data leakages be prevented and detected during transfer (Article 14)
  • That confidentiality requirements be documented and regularly reviewed (Article 14)

In practice, this amounts to using secure protocols like HTTPS for all network transfers, including internal transfers. Generally speaking, this is already a common practice in any serious organization. And while DORA doesn’t mandate the use of an encryption relay service, a trusted relay would further protect data’s integrity and safeguard against leakage.
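As a rough illustration, here is a minimal sketch of what that looks like for an internal service call, assuming a Python client built on `requests` and `urllib3`; the internal hostname is hypothetical, and the adapter simply refuses anything below TLS 1.2 while keeping certificate verification on.

```python
# Minimal sketch: enforce TLS on every outbound call, including internal ones.
# The internal hostname below is hypothetical; certificate verification is the
# default in requests, but pinning a minimum TLS version requires an adapter.
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.ssl_ import create_urllib3_context


class TLS12PlusAdapter(HTTPAdapter):
    """Reject connections that negotiate anything below TLS 1.2."""

    def init_poolmanager(self, *args, **kwargs):
        ctx = create_urllib3_context()
        ctx.minimum_version = ssl.TLSVersion.TLSv1_2
        kwargs["ssl_context"] = ctx
        return super().init_poolmanager(*args, **kwargs)


session = requests.Session()
session.mount("https://", TLS12PlusAdapter())

# verify=True (the default) checks the server certificate against trusted CAs.
response = session.get(
    "https://payments.internal.example/api/v1/transfers",
    verify=True,
    timeout=10,
)
response.raise_for_status()
```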

Data at rest

DORA does not provide explicit instructions on what encryption techniques should be employed for data at rest, though it constantly reiterates that data should always be encrypted at rest (Section 9 and Article 6) and use leading practices (Article 6, Paragraph 3). However, DORA does establish explicit rules for how encrypted data at rest should be further protected. Specifically, DORA requires:

  • That access to data at rest be logged for any potential misuse (Article 12)
  • That cryptographic keys be protected against unauthorized access and modification, and that keys be replaceable upon loss and registered to a log (Article 7)
  • That cryptographic practices be reevaluated if they are no longer resilient to cyber threats (Article 6, Paragraph 4)

Because encrypting data at rest is typically baked into most database applications—and those practices are actively maintained and updated—it isn’t surprising that DORA doesn’t complicate the requirements here.
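To make those requirements concrete, here is a minimal sketch of application-level encryption at rest with key rotation and access logging, assuming the Python `cryptography` package; in a real deployment the keys would live in a KMS or HSM rather than in process memory, and the record and service names are made up.

```python
# Minimal sketch: application-level encryption at rest with key rotation and
# access logging. Assumes the `cryptography` package; a real deployment would
# store keys in a KMS/HSM, not in process memory.
import logging

from cryptography.fernet import Fernet, MultiFernet

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("data-at-rest")

# Register keys in an inventory (Article 7): newest key first so new writes
# use it, while older keys can still decrypt existing records.
current_key = Fernet.generate_key()
previous_key = Fernet.generate_key()
keyring = MultiFernet([Fernet(current_key), Fernet(previous_key)])

record = b"IBAN DE89 3704 0044 0532 0130 00"  # hypothetical sensitive record
ciphertext = keyring.encrypt(record)

# Log every access to encrypted data so potential misuse is traceable (Article 12).
log.info("decrypt requested by service=%s record=%s", "reporting-job", "cust-42")
plaintext = keyring.decrypt(ciphertext)

# On key loss or compromise, rotate: re-encrypt existing data under the newest key.
ciphertext = keyring.rotate(ciphertext)
```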

Data in use

The most interesting tenet of DORA is its push for encryption of data in use. Encrypting data in use is a recent commercial field; a decade ago, handling data in plaintext was common practice and considered safe. But today, with emerging threats that target corporate servers, it isn’t surprising that DORA recommends encryption in use.

In the original legislation, DORA requires companies to protect the “availability, authenticity, integrity, and confidentiality” of data in use (Article 9). It doesn’t define, however, what that means. Yet in the latest supplement, DORA reaffirms the importance of encrypting data in use but recognizes that encryption in use is often too complex or not feasible (Summary Paragraph 9). Instead, it allows for other mitigating measures until advancements are made.

However, this allowance raises a question: Did DORA inadvertently create a loophole to protecting data in use?

Did DORA’s vagueness create a loophole?

Some might argue that DORA’s vagueness about the impracticality of encryption creates a loophole. The thinking goes that any organization could, hypothetically, claim difficulty around implementation and skirt any measures. The thinking continues that a CISO could then take the path of least resistance, making data in use vulnerable.

Realistically, this isn’t the case, and there is no “loophole”. Instead, DORA was accounting for the fact that encryption in use (a.k.a. homomorphic encryption) is computationally expensive and often infeasible. In fact, no major institution uses homomorphic encryption in production due to these practical limitations.
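To show what “encryption in use” actually involves, here is a small, purely illustrative sketch using the open-source `phe` (python-paillier) package, which is additively homomorphic only; the account balances are invented, and the computational cost of such schemes at scale is exactly the limitation DORA acknowledges.

```python
# Illustrative only: additive homomorphic encryption with python-paillier (`phe`).
# A server can sum encrypted balances without ever seeing the plaintext values.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Client side: encrypt account balances before sending them to the server.
balances = [1200.50, 389.99, 4075.00]
encrypted = [public_key.encrypt(b) for b in balances]

# Server side: compute on ciphertexts only (addition and scalar multiplication).
encrypted_total = sum(encrypted[1:], encrypted[0])

# Client side: only the private-key holder can read the result.
total = private_key.decrypt(encrypted_total)
print(round(total, 2))  # 5665.49
```

Even this partially homomorphic scheme pays a steep performance penalty compared to plaintext arithmetic, which is part of why DORA tolerates alternative safeguards for now.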

While DORA future-proofs the mandate for homomorphic encryption (assuming it will eventually be practical), it allows for other strategies in the meantime. And it doesn’t leave these strategies entirely open-ended. In Section 4, Article 6, DORA recommends using a separate and protected environment to process plaintext data when encryption in use isn’t possible.

Today, these protected environments are known as trusted execution environments (or TEEs), where the environment (operating system and application) is verified by an external, trusted verifier. While TEEs were difficult to use in production five years ago, they are far easier to adopt today with the help of products like secure enclaves.
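Real TEE products (for example, AWS Nitro Enclaves or Intel SGX) return cryptographically signed attestation documents; the sketch below is a deliberately simplified, hypothetical illustration of the core idea: comparing the enclave’s reported measurement against an approved build before releasing any secrets.

```python
# Highly simplified sketch of the attestation check at the heart of a TEE workflow.
# Real enclaves return a cryptographically signed attestation document; this only
# shows the measurement comparison, with hypothetical values throughout.
import hashlib
import hmac

# Expected measurement: hash of the enclave image the verifier has approved.
APPROVED_IMAGE = b"enclave-image-v1.4.2"
EXPECTED_MEASUREMENT = hashlib.sha384(APPROVED_IMAGE).hexdigest()


def verify_attestation(reported_measurement: str) -> bool:
    """Accept the enclave only if its reported measurement matches the approved build."""
    # compare_digest avoids timing side channels during the comparison.
    return hmac.compare_digest(reported_measurement, EXPECTED_MEASUREMENT)


# The enclave reports its measurement (in practice, inside a signed document).
reported = hashlib.sha384(b"enclave-image-v1.4.2").hexdigest()

if verify_attestation(reported):
    print("Measurement matches; safe to release the decryption key to the enclave.")
else:
    print("Unexpected measurement; refuse to send plaintext or keys.")
```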

One could argue that stringent security measures that lock down access to company systems (e.g. FIDO hardware keys) remove the need for TEEs. However, with zero-day exploits and CI pipeline hacks, there is little evidence that these measures provide the necessary protection. Any reasonable interpretation of DORA would argue that a confidential computing strategy is a must, and if homomorphic encryption is off the table, then leveraging an in-house or third-party TEE is the next valid step.

DORA’s vagueness is an advantage

DORA uses vague language throughout because security is constantly evolving. Five years ago, asking a company to leverage confidential computing would’ve involved poaching the entire faculty of a university’s cybersecurity department. Today, it is far easier thanks to emerging third-party DORA-compliant vendors that provide security as a service.

If DORA were more prescriptive, emerging, state-of-the-art practices could end up technically noncompliant. Instead, DORA establishes a standard—that an organization should pursue the best security, defined by leading practices—to achieve “availability, authenticity, integrity, and confidentiality” of data.

If you’ve seen heavy criticism of DORA, keep in mind that the legislation was written in batches. Earlier articles criticized DORA’s lack of specificity, decrying that it never explicitly mandated encryption. However, the EU addressed that gap through the recent supplements while preserving the flexibility to evaluate future security techniques like homomorphic encryption.

Do I have time to hit DORA compliance by 2025?

DORA’s January 17th, 2025 enforcement deadline is rapidly approaching. However, many organizations can hit security readiness within three months by leveraging the right third-party ICT vendors. DORA acknowledged this strategy by establishing that vendors are permissible as long as they are also DORA-compliant.

DORA compliance is a necessary step toward a more secure financial future, especially because the past has been riddled with major breaches of financial institutions. To avoid another major hack, like the European Investment Bank breach of 2023, financial organizations should take DORA seriously and spare themselves hefty fines (and security breaches).
