As Europe's digital regulatory landscape continues to evolve, businesses face an increasingly complex web of overlapping requirements. While these regulations aim to protect consumers and ensure responsible tech development, their intersection creates unique challenges for companies operating in the EU market. Given the Competitiveness Compass' objective of simplifying the regulatory landscape by reducing burden and complexity, it is crucial for the EU to map and address existing overlaps. Here's a practical look at some of the most significant regulatory overlaps and their real-world implications for businesses.
Cybersecurity's Triple Reporting Challenge
Perhaps the most immediate pain point comes from overlapping incident reporting requirements across three major regulations: NIS 2, DORA, and the Cyber Resilience Act (CRA). Meeting three parallel sets of obligations strains resources precisely when they are needed most, during incident response. Instead of focusing on strengthening cybersecurity outcomes, companies are often engaged in legal exercises, which undermines the broader goal of enhancing the EU's cyber resilience.
A cloud provider experiencing a security breach may need to file an initial incident report under NIS 2, submit a separate report under DORA if it serves financial sector clients, and comply with CRA reporting requirements if the incident involves its products. These separate reports, each with different criteria and timelines, must be filed with different regulatory bodies across multiple jurisdictions.
AI Act's Balancing Act with GDPR
The intersection of AI regulation and data protection creates several practical challenges for companies developing and deploying AI. Training accurate, robust, and representative AI models and systems requires access to and use of European data reflecting cultural, linguistic, and geographical diversity. Both the AI Act and the GDPR contain requirements impacting the use of data for training, developing, and deploying AI systems and models.
For example, developers of an AI-powered hiring tool classified as high-risk under the AI Act must:
- Ensure their training data is representative and unbiased (AI Act requirement)
- Potentially process sensitive personal data to detect and mitigate bias
- Simultaneously comply with GDPR's data minimization principle, as well as restrictions on the processing of sensitive data
This creates a catch-22: companies need data access to ensure fairness and compliance with the AI Act, but face restrictions under the GDPR. With different authorities enforcing these regulations at national and European levels, companies face uncertainty about AI development and deployment, potentially disincentivizing investments in AI innovation and delaying the availability of new solutions on the EU market.
Unnecessary Complexity in the Liability Framework for AI
The proposed AI Liability Directive (AILD) would add unnecessary legal complexity and uncertainty to an already intricate and novel legal framework impacting AI in Europe. While the recently updated Product Liability Directive (PLD), which extends the EU's liability regime for defective products to software and AI, has yet to be transposed into national law and tested in practice, the AILD would introduce additional liability rules specific to AI.
The AILD would mean higher insurance costs and increased legal complexity, and it would ultimately discourage bringing new AI solutions to market, all without filling a clearly identified legal gap. As the EU plans to pursue regulatory simplification following the Competitiveness Compass, the AILD proposal should be withdrawn.
Navigating Complexities in Data Governance Requirements
The Data Act's intersection with other data-focused regulations creates significant complexity for companies. The Data Act itself is complex, some of its key provisions are unclear, and it will impact many aspects of the data economy, from data sharing and use to the provision of cloud services. Its far-reaching scope affects product design, contractual arrangements, data usage rights, and the business models of many companies, particularly IoT manufacturers and cloud service providers. Some examples of potential inconsistencies:
- The Data Act introduces new requirements for non-personal data transfers that differ from GDPR's established framework for personal data. This is particularly challenging for companies handling mixed data sets.
- The interaction between the Data Act, the Digital Markets Act (DMA), and the GDPR creates complexity around data portability. For example, a European manufacturing company collecting IoT data must comply with GDPR obligations, such as data minimization or portability obligations. At the same time, the Data Act introduces obligations to make data available as well as certain restrictions, for example on portability towards DMA gatekeepers. Companies will need further guidance to navigate this complex regulatory landscape. In addition, these frameworks are to be enforced by different regulators, in many cases at national level, with different competences and expertise. This can lead to inconsistent interpretation across the Single Market, translating into legal uncertainty for companies and inhibiting data innovation.
The EU's announced efforts to simplify regulatory frameworks, such as the ESG omnibus simplification package, demonstrate a welcome responsiveness to industry concerns. Similar efforts should be extended to the broader digital regulatory landscape, where the rapid introduction of multiple legislative initiatives has inevitably resulted in complexities and overlaps. As the EU continues to implement and refine its digital regulations, there is a critical need for:
- Standardized reporting mechanisms and templates across cybersecurity regulations.
- Balanced and risk-based application of the GDPR in support of AI innovation.
- Increased guidance and legal clarity on the application of the Data Act and its interactions with other regulatory frameworks.
- Withdrawal of the AI Liability Directive.
91proÊÓÆµ supports the EU's regulatory goals but advocates for greater coordination between different frameworks to reduce unnecessary complexity while maintaining high standards of protection and innovation.