Irish Government publishes General Scheme of AI Bill

The Department of Enterprise, Tourism and Employment has published the General Scheme of the Regulation of Artificial Intelligence Bill 2026 (“the Scheme”), which seeks to fully implement Regulation (EU) 2024/1689, known as the EU Artificial Intelligence Act (“the AI Act”). This legislation is necessary to provide for national supervision and enforcement of the obligations set out in the AI Act. This article sets out the key highlights of the Scheme.

Distributed model for AI oversight

The Scheme gives official recognition to the Irish Government Decisions of 4 March 2025 and 22 July 2025, approving a distributed model for AI oversight. This means that existing sectoral authorities, such as the Data Protection Commission (“DPC”) and the Central Bank of Ireland (“CBI”), will oversee AI within their own domains, with a central authority providing coordination. For example, the DPC will continue to protect fundamental rights related to personal data; the CBI will supervise AI in regulated financial services; and Coimisiún na Meán will oversee AI in audiovisual media services.

A number of these authorities were designated by the Irish Government as market surveillance authorities and notifying authorities (collectively “Competent Authorities”) via Statutory Instrument No. 366/2025 (previously discussed here). It is hoped that the Government’s decision to use the existing national framework of well-established authorities for enforcement of the AI Act will allow those authorities to leverage their existing sectoral expertise. This distributed regulatory model highlights how AI touches essentially every regulated sector of society.

Establishment of national AI Office

The Scheme proposes the establishment of a new statutory body called Oifig Intleacht Shaorga na hÉireann, or the AI Office. This statutory body will act as the central coordinating authority, ensuring the consistent and effective implementation and enforcement of the AI Act across all regulated sectors. The AI Office will be overseen by a Chief Executive Officer and a seven-member board appointed by the Minister for Enterprise, Tourism and Employment.

The AI Office is designated as both a market surveillance authority and the single point of contact under Article 70(2) of the AI Act. In this respect, the AI Office will act as the central contact point for engagement with the European Commission, the EU AI Office, EU bodies and agencies and other Member States, and will deliver a programme of support for the Competent Authorities.

The Scheme requires the AI Office to be operational by 1 August 2026, in time to meet key obligations under the AI Act, the majority of which come into force on that date, subject to any delayed timelines introduced pursuant to the Digital Omnibus Simplification Package, once agreed (further discussed here).

The key functions of the AI Office include:

  • facilitating, in co-operation with the Competent Authorities, the consistent enforcement of the AI Act;
  • enabling cooperation between the Competent Authorities;
  • promoting AI innovation and adoption and fostering AI literacy; and
  • facilitating access to technical expertise for the Competent Authorities.

The AI Office will also establish a national AI regulatory sandbox, providing businesses, especially SMEs and start-ups, with a controlled environment to test innovative AI systems under regulatory supervision before full market deployment.

Supervision and enforcement

The designated market surveillance authorities in Ireland will have extensive enforcement powers, including the powers set out in the Market Surveillance Regulation (EU) 2019/1020 and those specified in the AI Act. These market surveillance authorities include the authorities designated by S.I. No. 366 of 2025, together with the additional five authorities later agreed by the Government, which are being formally designated by amendment to that S.I.

These enforcement powers include, for example:

  • requiring organisations to provide documents demonstrating conformity with the AI Act;
  • requiring organisations to provide relevant information on the supply chain;
  • carrying out unannounced on-site inspections;
  • commencing investigations on their own initiative in order to identify non-compliance and bring it to an end;
  • taking appropriate measures where an organisation fails to take appropriate corrective action or where the non-compliance or the risk persists, including prohibiting or restricting the making available of a product on the market or ordering a product to be withdrawn or recalled; and
  • imposing penalties in accordance with Article 41 of the Market Surveillance Regulation and Article 99 of the AI Act.

Interestingly, the Scheme further provides market surveillance authorities with the power to access the source code of a high-risk AI system where (i) access is necessary to assess the system’s conformity with the requirements in Chapter III, Section 2 of the AI Act, and (ii) testing or auditing procedures based on data and documentation supplied by the provider have been exhausted or proved insufficient. This is a last-resort investigative tool designed to ensure compliance with those strict requirements. Accordingly, if AI system developers wish to avoid disclosing such confidential source code, they will need to ensure they have sufficient technical documentation in place to demonstrate the conformity of their high-risk AI systems with the requirements set out in the AI Act.

In addition, the Scheme enables a market surveillance authority to request a formal evaluation of an AI system if it suspects the system has been incorrectly self-classified by a provider as non-high risk. It will therefore be critical for organisations to clearly document and justify why they have classified their AI system as non-high risk, taking into account the criteria in Article 6(3) of the AI Act and the European Commission guidelines on classification. Article 6(5) of the AI Act requires these guidelines to be published no later than February 2026, and they are currently being drafted.

The Scheme also provides market surveillance authorities with the power to appoint as many persons as they think fit to act as authorised officers. These officers will have robust investigatory powers for the purpose of ensuring compliance with the AI Act.

Administrative sanctions

The administrative fines that may be imposed include:

  • for non-compliance with prohibited AI practices under Article 5, fines of up to €35 million, or 7% of the business’ total worldwide annual turnover for the preceding year, whichever is higher;
  • for non-compliance with any provisions related to operators or notified bodies (other than those laid down in Article 5 of the AI Act), fines of up to €15 million, or 3% of the business’ total worldwide annual turnover for the preceding year, whichever is higher;
  • for providing incorrect, incomplete or misleading information to notified bodies or national competent authorities in reply to a request, fines of up to €7.5 million, or 1% of the business’ total worldwide annual turnover for the preceding year, whichever is higher; and
  • in cases of public-sector bodies, fines of up to €1 million.
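For undertakings, each ceiling above is the higher of the fixed amount and the stated percentage of total worldwide annual turnover. The calculation can be sketched as follows (the function name and turnover figures are illustrative, not from the Scheme):

```python
def fine_ceiling(fixed_cap_eur: float, turnover_pct: float, annual_turnover_eur: float) -> float:
    """Return the maximum possible fine: the higher of the fixed cap
    and the given percentage of total worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_pct * annual_turnover_eur)

# Article 5 infringement by a business with €1bn worldwide turnover:
# 7% of €1bn is €70m, which exceeds the €35m fixed cap.
print(fine_ceiling(35_000_000, 0.07, 1_000_000_000))  # 70000000.0

# For a smaller business with €100m turnover, the €35m cap is higher.
print(fine_ceiling(35_000_000, 0.07, 100_000_000))  # 35000000.0
```

In other words, the turnover-based figure only becomes the operative ceiling for larger undertakings whose percentage figure exceeds the fixed cap.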

The Scheme provides for a number of procedural safeguards in respect of the imposition of any administrative sanctions. Enforcement proceedings start with a notice of suspected non-compliance, followed by a period for written submissions and publication of a notice of non-compliance on the market surveillance authority’s website. Where, following an investigation by an authorised officer, a matter is referred to formal adjudication, it is heard by an independent adjudicator nominated by the AI Office and appointed by the Minister for Enterprise, Tourism and Employment. The AI Office is required to put in place a pre-established, structured arrangement allowing Competent Authorities to procure adjudication services for referred investigations from a group of pre-qualified adjudicators (the “Adjudication Panel”).

Furthermore, the Scheme provides for judicial oversight of any administrative sanctions imposed by an adjudicator, which will only take effect upon confirmation by the High Court. An adjudicator’s decision to impose an administrative fine may also be appealed by the person concerned to the High Court within 28 days of notification of the decision.

Comment

While the structure of AI regulation in Ireland is becoming clearer with the publication of this Scheme, some issues have yet to be addressed. The Scheme acknowledges this, providing:

“The AI Act and the Digital Omnibus Simplification proposal 2025 introduces a comprehensive framework for the regulation of AI systems across the EU. However, as a new and rapidly developing area of law, the full scope of regulatory needs and operational requirements may not be immediately apparent.”

As such, the Scheme provides for the Minister to adapt the remit of the AI Office over time, ensuring that Ireland’s implementation of the AI Act remains responsive and is, as far as possible, future-proofed. This mirrors the delegated powers granted to the European Commission under Article 97 of the AI Act, which allow for the adoption of delegated acts to address emerging regulatory needs.

Next steps

The Scheme will now undergo pre-legislative scrutiny, followed by drafting of the Bill and its passage through Dáil Éireann and Seanad Éireann. In the event that any amendments to the Scheme are required once the Digital Omnibus Simplification proposal is agreed at EU level, the Irish Government has confirmed it will bring forward further proposals for consideration by the Oireachtas.

Contact us

For more information, please contact any member of our Technology and Innovation Group or your usual Matheson contact.

With thanks to Ben Doyle for his contribution to this article.

© 2025 Matheson LLP | All Rights Reserved