
Tech Governance Series
In this series of articles, Gráinne Boyle (Partner, International Business, Corporate), Catherine Carrigy (Senior Associate, International Business, Corporate), and Elaine Hayden (Solicitor, International Business, Corporate), together with Marie McGinley (Partner and Head of Technology & Innovation) and Stephanie McCarthy (Associate, Technology & Innovation), provide an overview of the key governance considerations arising from recent and upcoming EU tech-focused regulations. The focus is on what these regulations specifically mean for board oversight and responsibility, rather than outlining the broader technical compliance obligations introduced on companies generally.
Introduction
The adoption and deployment of AI systems is rapidly transforming business operations and creating significant opportunities to realise efficiencies and enhance performance across organisations. Simultaneously, national and EU legislators have introduced regulatory frameworks to govern these technologies. Strong corporate governance practices are needed now to mitigate risks, to ensure companies are ready to meet the requirements of new and emerging legislation, and to create frameworks that can adapt as these technologies evolve. In this dynamic regulatory landscape, companies cannot afford to take a “wait‑and‑see” approach.
Boards of directors are responsible for setting the strategic direction for AI use, overseeing compliance readiness, and monitoring the key risks and challenges that may arise as the regulatory landscape continues to develop.
This article offers an overview of some of the critical areas that boards should consider in their oversight of AI, and which should inform meaningful, robust discussions in boardrooms.
Existing legislative framework
As a general comment, there is currently limited legislation, either in force or expected in the near term, that prescribes detailed corporate governance obligations specifically aimed at ensuring AI compliance in relation to many companies’ existing AI practices. While the two pieces of legislation outlined below will be relevant to AI governance, the core principles of good corporate governance and existing Irish law will remain particularly significant:
The EU AI Act
The EU AI Act (the “Act”) is the EU’s first comprehensive framework regulating AI and is directly applicable across the EU. While the Act entered into force on 2 August 2024, its provisions take effect on a phased basis, with all provisions scheduled to apply by August 2027. The Act introduces a risk‑based framework governing the development and use of AI systems within the EU. It also establishes significant regulatory obligations, elevating AI governance to a matter of strategic and board‑level responsibility.
The General Scheme of the Regulation of Artificial Intelligence Bill 2026
The Irish government recently published the General Scheme of the Regulation of Artificial Intelligence Bill 2026 (the “AI Bill”), which seeks to implement the Act. The AI Bill proposes to: (i) establish a new independent authority, Oifig Intleachta Shaorga na hÉireann, or the AI Office of Ireland; and (ii) set out the penalties for infringement of the Act.
Tip: In addition to AI‑specific legislation, boards should also be mindful of discharging their duties as directors under Irish law generally.
Key board considerations
Directors are not expected to become technical AI experts in order to discharge their duties in relation to AI governance. Rather, they can strategically delegate authority to the appropriate specialist risk, compliance and technical teams to implement:
- appropriate controls to maintain ongoing compliance and oversight; and
- risk management frameworks to identify, manage and monitor material risks.
Working closely with specialist teams to develop governance frameworks to ensure that the AI systems deployed align with the company’s key values and strategic objectives is essential for risk management and compliance in the current regulatory landscape.
While not exhaustive, the following are some of the key areas we recommend that boards consider and engage with their stakeholders on to ensure effective governance and compliance.
Key areas
- What are the business outcomes (short, medium and long term) that the company wants to achieve with AI usage?
- What is the company’s risk appetite when it comes to AI? Is there a committee to consider use case prioritisation before implementing AI systems to ensure that they are deployed in a manner which offers the greatest impact and value to the company and aligns with the company’s strategic goals and cultural values?
- How will the implementation of AI systems impact workflows in the company? Does it have a proactive strategy to address these changes in workflows and mitigate disruptions?
- If developing AI capabilities in house, does the company have appropriate IP protection strategies to protect these developments and maintain its competitive advantage?
- Have clear delegations of authority been granted by the board to ensure effective management of AI implementation and compliance? If so, are processes in place to ensure the actions taken by delegates are reported back to the board?
- Does the company have a separate committee or working group established (comprised of key stakeholders) with responsibility for AI to ensure effective compliance (as part of the delegation structure)? Is a framework / charter required to clearly set out its parameters and powers?
- Should an AI implementation and compliance update be provided on an ongoing basis as part of the board meeting forum?
- As part of the company’s AI system onboarding process, do individual(s) with the relevant expertise complete an assessment of the AI system? This might include considering the model incorporated into the AI system, the training approach undertaken in respect of the underlying AI model, or whether mitigations have been implemented to prevent any inappropriate bias.
- Have controls been put in place to rigorously test AI systems before deployment within the business?
- Has the company prepared an AI system inventory and analysed the risk classification of the AI systems utilised (and the related responsibilities under the Act)?
- Are internal controls and ongoing review mechanisms in place to ensure continued compliance with the Act and appropriate standards for AI governance? Should the company adopt an AI policy?
- Is the approach being taken on AI compliance consistent and aligned with other relevant compliance functions (e.g. privacy, information security, ESG)? Are structures in place to ensure efficient continued engagement between these functions?
- Are processes in place to ensure horizon scanning of potential changes to AI legislation, standards, and market expectations on compliance? Is the company engaged with industry forums?
- Is the expenditure being incurred by the company on AI usage justified by the existing / potential benefits?
- Are sufficient internal and external resources being allocated to ensure the company’s compliance?
- Has the company developed a third party AI system vendor onboarding questionnaire?
- Do the company’s third party contracts require amendment to reflect the company’s use of AI systems?
- Should a review of the company’s engagement with third parties be undertaken in the context of those third parties’ use of AI systems?
- Has appropriate, sufficient training been provided to employees on how to use the AI systems deployed in the company effectively and responsibly?
- Do the company’s existing internal policies and procedures require amendment to address acceptable usage of AI systems by employees?
Conclusion
AI governance now forms part of the core supervisory and risk management function of boards. By adopting robust governance frameworks, promoting continuous learning, enhancing transparency, and aligning AI management with broader business goals, boards can ensure meaningful compliance.
For more information please reach out to Gráinne Boyle (Partner, International Business, Corporate), Marie McGinley (Partner and Head of Technology & Innovation) or your usual Matheson contact.
