Data Protection, Privacy and Technology
Last year was another busy one in the data protection, privacy and technology sector. Over the course of the last 12 months, we have seen a number of important data protection developments at EU and national level, including:
- the European Commission’s adoption of the highly anticipated Standard Contractual Clauses (“SCCs”) for international data transfers;
- Guidelines by the Data Protection Commission (“DPC”) on data processing in the workplace in the context of preventing the spread of Covid-19; and
- a record GDPR fine imposed for a company’s failure to provide the necessary transparency information in a privacy notice.
The European Data Protection Board (“EDPB”) has also published a number of helpful Guidelines, which provide some welcome clarity on a range of issues, including what constitutes a “transfer” of data under the GDPR; recommended measures to supplement transfer tools; the concepts of controller and processor; the scope of the right of access under Article 15 of the GDPR; and data breach notification.
Key Themes in Data Protection and Technology
A number of important pieces of legislation are also coming down the track at EU and national level, which demonstrate that the GDPR does not resolve all data issues.
On the EU front, as part of its Digital Single Market strategy, the European Commission has proposed the Digital Services Act, Digital Markets Act, Artificial Intelligence Act, Data Act and Data Governance Act. The proposals aim to facilitate the further use and sharing of data between more public and private parties inside the data economy, to support the use of specific technologies such as Big Data and AI, and to regulate online platforms and gatekeepers.
The ePrivacy Regulation and the NIS2 Directive are also amongst the legislative developments that we will be monitoring closely. This digital framework will be coupled with the GDPR and will grow alongside it, affecting privacy and data protection in unprecedented ways.
The Irish government has also recently published the long anticipated Online Safety and Media Regulation Bill 2022, after three years of engagement with stakeholders, including members of the public, companies, NGOs, and other government organisations. The bill has been described as marking “a watershed moment as we move from self-regulation to an era of accountability in online safety”.
In addition, the government has announced the imminent publication of the Consumer Rights Bill, which has been hailed as representing “the biggest overhaul of consumer rights law in 40 years”.
New SCCs for International Transfers were adopted by the European Commission in June 2021. Companies must replace the old SCCs with the new SCCs in all legacy contracts by 27 December 2022.
In addition, prior to executing the new SCCs, companies will have to carry out and document a transfer impact assessment, and consider whether supplementary measures need to be adopted in order to ensure the transferred data is afforded an adequate level of data protection. This will be a burdensome exercise for many companies, particularly those transferring massive amounts of data globally.
A new data transfer tool, in the form of a further set of SCCs, is expected in 2022.
The European Commission intends to develop these SCCs to facilitate transfers of data to importers that are already subject to the GDPR by virtue of Article 3(2) of the GDPR. The EDPB has stated that this further set of SCCs is needed because less protection is required when transferring data to an importer that is already subject to the GDPR, and in order not to duplicate its direct GDPR obligations.
The DPC imposed a record €225 million fine on a technology company last year for failure to discharge its transparency obligations under the GDPR, with regard to the content of its privacy notice. The decision, which is subject to appeal before the Irish Courts and an annulment action before the European Court of Justice, has implications for all organisations.
It sets out the DPC’s high expectations in respect of the information that must be provided in privacy notices, and how it should be presented. The standard set out in the decision arguably goes beyond that of most privacy notices. We will likely see further regulatory scrutiny and debate about the required content of organisations’ privacy notices in the year ahead.
The largest category of complaints from data subjects to the DPC continues to concern data subject access requests (“DSARs”). In its Annual Report for 2021, the DPC warned that it intends to increase enforcement in this area and target non-responses and inadequate responses from controllers in respect of DSARs in the year ahead.
The EDPB recently published draft guidelines on DSARs, which discuss the scope of the right of access under Article 15 of the GDPR; how to provide access; general issues controllers should consider when assessing a DSAR; and restrictions on the right of access. Interestingly, in the EDPB’s view, no proportionality test applies when weighing the right of access against the efforts the controller has to take to comply with a DSAR. The draft guidelines also state that “the fact that it would take the controller a vast amount of time and effort to provide the information or the copy to the data subject will not on its own render a request ‘excessive’”, and will not permit the controller to refuse to act on the request pursuant to Article 12(5) of the GDPR.
The Consumer Rights Bill 2022 will transpose EU Directives 2019/770 and 2019/771, on consumer contracts for the supply of digital content and digital services, and for the sale of goods, respectively.
It will also update and consolidate the statutory provisions on consumer rights and remedies in relation to contracts for the supply of non-digital services, unfair contract terms, and information and cancellation rights. The Bill is due to be published shortly by the Irish government.
The Online Safety and Media Regulation Bill 2022 will establish a new regulator, a multi-person Media Commission, to which an Online Safety Commissioner will be appointed. The Media Commission will replace the Broadcasting Authority of Ireland. It will be responsible for overseeing updated regulations for broadcasting and video on-demand services, and the new regulatory framework for online safety created by the bill.
The bill will also transpose the revised Audiovisual Media Services Directive into Irish law. The bill was published on 12 January 2022, and will now make its way through all stages in the Oireachtas.
The Digital Services Act (“DSA”) focuses on creating a safer digital space in which the fundamental rights of all users of digital services are protected. Among the core concerns tackled by the DSA are the trade and exchange of illegal goods, online services and content, and algorithmic systems amplifying the spread of disinformation. The European Parliament adopted its position on the Digital Services Act on 20 January 2022, allowing negotiations with EU countries to start.
The Digital Markets Act (“DMA”) aims to establish a level playing field both in the European Single market and globally. It will create harmonised rules defining and prohibiting certain unfair practices by “gatekeeper” platforms (providers of core platform services). The European Commission will have new powers to carry out market investigations, and update the obligations for gatekeepers when necessary.
The European Parliament debated its position on the Digital Markets Act on 14 December 2021 and adopted it the following day. Negotiations with the EU governments started in January 2022.
The Artificial Intelligence ("AI") Act aims to address the development and adoption of safe AI across the EU while respecting the fundamental rights of EU citizens. Like the GDPR, the AI Act takes a risk-based approach.
It categorises all AI into:
- unacceptable risk – activities which are prohibited (e.g. social scoring);
- high-risk activities - which are only permitted subject to compliance with mandatory requirements and a conformity assessment (e.g. AI systems used for recruitment purposes or evaluating creditworthiness);
- limited risk (e.g. chatbots) – where users must be informed that they are interacting with a machine; and
- minimal risk (e.g. spam filters) – where free use of AI is allowed.
The proposed AI Act is still at the early stages of the European legislative process.
The Data Act covers both personal and non-personal data. It will govern who can use and access what data for which purposes across all economic sectors in the EU. The Act aims to unlock the value of data generated, for example, by connected objects in Europe, one of the key areas for innovation in the coming decade. It will clarify who can create value from such data and under what conditions.
The Data Governance Act ("DGA") also applies to both personal and non-personal data. It establishes a framework to facilitate general and sector-specific data-sharing (including data of public bodies, private companies and citizens). The DGA is designed to break down barriers to data sharing.
There are four pillars to the DGA:
- the re-use of sensitive public sector data;
- establishing a framework for new data intermediaries;
- corporate and individual data altruism; and
- fostering coordination and interoperability through the European Data Innovation Board.
The revised Network and Information Security Directive ("NIS2") will strengthen the security requirements, address the security of supply chains, streamline reporting obligations, and introduce stricter enforcement requirements, including harmonised sanctions across the EU to address the growing threats posed by digitalisation and the surge in cyber-attacks.
The proposed expansion of the NIS2 scope will effectively oblige more entities and sectors to comply with cybersecurity requirements.
"The standard set out in the decision goes significantly beyond that of most privacy notices"
Davinia Brennan, Technology and Innovation Partner, Matheson, commenting on the record fine imposed on a technology company by the DPC in 2021.
Online Safety and Media Regulation Bill Published – Major Changes to Irish Media Regulation Ahead
The Online Safety and Media Regulation Bill (the “Bill”) was published on Wednesday 12 January 2022. The Bill will transpose the revised Audiovisual Media Services Directive (“AVMS”) into Irish law and in doing so will establish a new Irish regulator of traditional and online-only media, the Media Commission, with new criminal and civil enforcement powers.
The full consequences of the new Media Commission regime will not be clear until the Media Commission issues the Media Codes and Rules described below, and we hope that these will be subject to industry consultation. However, there will be more near-term consequences, including an obligation for all providers of audiovisual on-demand services in the State to register with the Media Commission within 3 months of its establishment.
Now that it has been published, the Bill will begin its passage through the two Houses of the Oireachtas, which could take up to a couple of months. Once the Bill is voted through the Oireachtas, it will become law and the Media Commission, including an Online Safety Commissioner, will be established.
Transposition of the Revised Audio Visual Media Services Directive and Establishment of a Media Commission
By transposing the AVMS Directive, the Bill will replace the Broadcasting Authority of Ireland with a multi-person Media Commission with responsibility for overseeing updated regulations for broadcasting and video on-demand services. This will update the way in which television broadcasting services and video on-demand services are regulated, to ensure greater regulatory alignment between traditional linear TV and video on-demand services, such as RTÉ Player and Apple TV.
The Commission will establish and maintain a register of media services for all audiovisual on-demand media services in the jurisdiction. All providers subject to registration at the date of coming into operation of Part 3 of the Act must notify the Commission not later than the end of the transitional period of 3 months from that date. Following this, the Bill will empower the Media Commission to direct an unregistered on-demand audiovisual media service to make an application for entry onto the register, and will make it a criminal offence to fail to comply with a direction of the Media Commission to make such an application.
The Introduction of Media Codes and Rules
The Media Commission will also create binding Media Codes and Rules reflecting the standards that audiovisual services must adhere to in relation to programme content, and may investigate the compliance of audiovisual media services with the Media Codes and Media Rules, on its own initiative or on the basis of a complaint. The Media Commission will have the power to seek the imposition of a number of sanctions on a non-compliant on-demand audiovisual media service, in the event that it has failed to comply with a warning notice. It will also have the power to seek the prosecution of senior management of designated online services for failure to comply with a notice to end non-compliance.
The Media Commission will also have the power to issue rules imposing a levy on registered providers’ Irish revenue, which would be used to fund its regulatory activities and new grant schemes for Irish media production. However, this power will not be exercised until there is a full review and consultation on its merits and it is not intended that a levy would be imposed on providers with a minimal Irish presence.
As provided for in the Broadcasting Act 2009, television broadcasting services are regulated on a contractual basis, and non-compliance by such services with Media Codes and Media Rules may be pursued by the Media Commission as a breach of contract. Non-compliance with these obligations will be dealt with through a stepped process of compliance and warning notices. Ultimately, non-compliance with a warning notice is a criminal offence.
Amongst the Media Codes and Rules to be enforced by the Media Commission is the new 30% quota for European Works in the catalogues of video on-demand services, alongside a content production levy and content production scheme to support the creation of European Works, including independent Irish productions.
This will translate into an obligation for video on-demand services, such as Apple TV or Disney Plus, to meet a 30% quota and ensure the prominence of European content in their catalogues. The obligation will be regulated across Europe depending on the country where the company is based, and Ireland will be responsible for regulating a significant number of these companies because it is home to the headquarters of a number of technology companies such as Apple and Google. It will be a criminal offence for an on-demand audiovisual media service to fail to comply with this obligation and the relevant rules in its application.
Powers and Sanctions Available to the Media Commission
The sanctions the Media Commission may seek are:
- The imposition of a financial sanction of up to €20 million or 10% of turnover;
- Compelling a designated online service to take certain specified actions;
- Removing an on-demand audiovisual media service from the register of such services; or
- Blocking access to a designated online service in Ireland.
Due to constitutional limitations on the powers of regulatory bodies in Ireland, the imposition of any of these sanctions requires the confirmation of a Court. As television broadcasting services are regulated on a contractual basis, a different set of sanctions is available to the Media Commission. This includes revocation of a contract or an administrative financial sanction, the latter of which is subject to court confirmation.
Additional powers awarded to the Media Commission will include the power:
- to require the provision of information from regulated services;
- to issue content limitation notices to designated online services in respect of individual pieces of harmful online content; and
- to impose industry levies.
Establishment of a Regulatory Framework for Online Safety
The Bill will establish a regulatory framework for online safety, by defining “harmful online content” with reference to defined categories of content, including a category containing a schedule of criminal offences and three further categories relating to cyberbullying, the promotion of suicide and self-harm, and the promotion of eating disorders.
As part of the framework, binding online safety codes will be created to tackle the availability of harmful online content through content moderation, complaints handling, recommendation systems, and advertising. These will be complemented by non-binding online safety guidance materials and advisory notes, in order to foster a safety-first culture of compliance. Systemic issues will also be brought to the attention of the Media Commission through a “super-complaints” scheme operating in tandem with nominated bodies, including expert NGOs in areas such as child protection.
The published Bill addresses the majority of the 33 recommendations of the Joint Oireachtas Committee on Tourism, Culture, Arts, Sports and Media in its pre-legislative scrutiny report on the General Scheme of the Bill. This includes recommendations in relation to better defining harmful online content, reporting requirements for online services, a greater role for the Media Commission in education, and the independence and resourcing of the Commission.
The Minister intends to address a number of further recommendations through potential amendments at Committee Stage. This includes the recommendation to provide for an individual complaints mechanism for harmful online content. These recommendations raise a number of complex practical and legal issues, including in relation to scalability, due process and timeliness. In order to ensure that these matters are fully considered, the Minister intends to establish an expert group to consider these issues and the best practices of other regulators. The group will then report to the Minister within 90 days with recommendations for how to address these issues.