In this blog post, we break down Article 4 of the EU AI Act, which mandates that organizations ensure a sufficient level of AI literacy among employees and other stakeholders involved in AI systems. We define AI literacy, provide a step-by-step roadmap for implementing AI trainings, and showcase how leading organizations run effective AI education programs.
On 1 August 2024, the European Artificial Intelligence Act (AI Act) entered into force, aiming to ensure that AI developed and used in the EU market is trustworthy, lawful, ethical, and robust. From 2 February 2025, the first two key provisions of the EU AI Act apply, namely Article 4 (AI literacy) and Article 5 (prohibited AI practices). These are especially relevant for providers and deployers under the AI Act. In this blog article, we explore what Article 4 encompasses and how your organization can comply with its requirements.
Article 4 of the EU AI Act mandates that providers and deployers of AI systems must take measures to ensure a sufficient level of AI literacy among their staff and other individuals involved in the operation and use of AI systems. The level of literacy required should take into account employees' technical knowledge, experience, education, and training, as well as the context in which AI systems are deployed.
“Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.”
- Article 4 of the EU AI Act
AI literacy, as defined in the AI Act, is a combination of knowledge, understanding, skills, and experience. Organizations must ensure that their employees can make informed decisions about the deployment and use of AI systems and are aware of the opportunities and risks of AI, as well as the possible harm it can cause.
In general, the EU AI Act lays down two main actors in the AI value chain who are affected by the provisions: providers, who develop AI systems or place them on the market, and deployers, who use AI systems under their authority in a professional context.
Article 4 specifies who is obliged to be AI literate: anyone dealing directly with AI systems, including the staff of providers and deployers, employees working on their behalf, and contractors or service providers. As the use of generative AI in organizations increases, whether used directly in a professional context or integrated into systems and services, organizations are increasingly likely to be affected by the provisions of Article 4.
Additionally, it is recommended to foster a general understanding of AI within the company regardless of its AI maturity, as employees might be unaware of the AI Act's provisions and of the scope and risks of AI systems they may already use for work, even without the company's approval. Providing an AI literacy program covering the fundamentals of AI helps your employees stay innovative while preparing for and preventing potential risks.
AI literacy is a fundamental pillar of effective AI governance, which in turn plays a crucial role in maintaining a competitive edge through responsible innovation and enabling the adoption of emerging technologies. However, the importance of fostering AI literacy extends beyond these strategic advantages. There are several additional reasons why investing in AI education and awareness from the outset is essential:
Translated into practice, this means that companies should…
While Article 4 compliance is not strictly enforced, organizations can face serious consequences, both monetary and reputational, if they are not compliant. In the event of a compliance issue, authorities may evaluate whether employees have received adequate training, and failure to demonstrate adequate AI literacy initiatives could lead to regulatory scrutiny and reputational harm. Moreover, companies could face fines if an employee, through gross negligence, causes an AI-related incident or lacks the necessary compliance knowledge, resulting in a regulatory violation. Liability risks also arise if damages occur due to insufficient AI training. Regarding the sufficiency of training efforts, the European AI Office has made it clear that merely providing written instructions on AI usage is inadequate: Article 4 requires organizations to offer structured guidance and training so that employees can operate AI systems responsibly and in compliance with regulations.
A potential roadmap for the successful implementation of AI literacy could look like the following…
Determine whether you are a provider or deployer of an AI system, for instance by taking our free EU AI Act Compliance Checker.
In most companies, the following workforce groups are the most likely to be affected:
Depending on their role and the context of the AI system(s) in use, the required level of AI literacy can vary.
To ensure holistic AI governance within your organization, you need to assign responsibilities for implementing and monitoring the trainings. Moreover, documenting these steps can be helpful for potential audits and compliance; a minimal sketch of what such documentation could look like follows below.
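To make the documentation step more tangible, here is a minimal, purely illustrative sketch of an internal training register. The roles, literacy levels, and field names are assumptions for the example, not categories prescribed by the AI Act, and in practice such records would typically live in your LMS or GRC tool rather than in code.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Illustrative only: roles and literacy levels are assumptions,
# not categories defined by the EU AI Act.
REQUIRED_LEVEL = {
    "general_staff": "fundamentals",
    "ai_developer": "advanced",
    "compliance_officer": "governance",
}

@dataclass
class TrainingRecord:
    employee_id: str
    role: str
    course: str
    level: str
    completed_on: date

@dataclass
class TrainingRegister:
    records: List[TrainingRecord] = field(default_factory=list)

    def add(self, record: TrainingRecord) -> None:
        self.records.append(record)

    def missing_training(self, employees: dict) -> List[str]:
        """Return employee IDs whose required literacy level has no
        completed training on record (i.e. an audit evidence gap)."""
        completed = {(r.employee_id, r.level) for r in self.records}
        return [
            emp_id
            for emp_id, role in employees.items()
            if (emp_id, REQUIRED_LEVEL[role]) not in completed
        ]

# Example usage
register = TrainingRegister()
register.add(TrainingRecord("emp-001", "general_staff",
                            "AI Fundamentals", "fundamentals",
                            date(2025, 1, 20)))
print(register.missing_training(
    {"emp-001": "general_staff", "emp-002": "ai_developer"}
))  # -> ['emp-002']
```

The point of keeping records in such a structured form is simply that who needs which training, and who has completed it, can be exported as evidence when auditors or authorities ask how Article 4 is implemented.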
Do you want to get a roadmap for successfully implementing AI trainings in your organization? Download our step-by-step checklist here!
The European AI Office has released a living repository of best practices to support the implementation of Article 4 of the EU AI Act. It contains data from surveyed companies that signed the AI Pact and are sharing their implementation best practices. Here is a summary of the AI literacy best practices from European organizations:
In summary, the current best practices balance the tailoring of courses with their scalability, enabling the general workforce to use AI responsibly on a day-to-day basis while equipping more advanced roles with more advanced skills and knowledge. Moreover, whatever Learning Management System an organization uses, if any, the AI literacy training is integrated into clear structures, including a platform that hosts the training and a review mechanism. In general, trainings and programs are developed in close collaboration with the intended roles and tailored to the context of both the AI system in use and the organization as a whole. A training program for employees of an insurance company will therefore differ in content and focus from one for a healthcare provider or a design studio that uses AI-powered design features.

A one-size-fits-all solution is only appropriate for very general, fundamental uses of AI, for instance the professional use of conversational or generative AI for daily tasks. As soon as more specific AI systems are in place, when AI is developed in-house or is part of a product or service, tailored training is the most appropriate practice. This tailoring can and should be done in close collaboration with the employees working on or with AI to build a well-working AI literacy program.
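As a purely hypothetical illustration of this balance between a shared baseline and role-specific modules, a curriculum could be sketched as a simple mapping like the one below. The module and role names are invented for the example and would in practice come from your own role and context analysis.

```python
# Hypothetical example: a baseline AI literacy curriculum shared by
# everyone, extended with role-specific modules. Module and role
# names are invented for illustration only.
BASELINE_MODULES = [
    "AI fundamentals and terminology",
    "Responsible use of generative AI at work",
    "EU AI Act overview and prohibited practices",
]

ROLE_SPECIFIC_MODULES = {
    "ai_developer": [
        "Risk management for high-risk AI systems",
        "Data governance and model documentation",
    ],
    "procurement": [
        "Assessing AI vendors and contractual safeguards",
    ],
    "customer_service": [
        "Supervising AI-assisted decisions and escalation paths",
    ],
}

def curriculum_for(role: str) -> list[str]:
    """Combine the shared baseline with any role-specific modules."""
    return BASELINE_MODULES + ROLE_SPECIFIC_MODULES.get(role, [])

# Example usage: the procurement curriculum adds one role-specific
# module on top of the three baseline modules.
print(curriculum_for("procurement"))
```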
Article 4 of the EU AI Act mandates that providers and deployers of AI systems must take measures to ensure a sufficient level of AI literacy among their staff and other individuals involved in the operation and use of AI systems. With the rapid adoption of enterprise AI and generative AI models, it is likely that many organizations fall under the scope of Article 4 and, hence, must take steps to comply now. There are already many best practices that show what successful implementation can look like and how to balance generic, fundamental programs with role-specific trainings.
When searching for the right external provider of AI trainings, you can browse this AI training marketplace and find a program that fits your needs and budget.
If you want to kickstart your AI governance, leveraging a platform or GRC tool can facilitate your governance processes, be it for procured or internally developed AI use cases. trail puts holistic AI governance into practice right from the start, not only by integrating the management of all your AI trainings directly in trail, but also by collecting all your AI systems in one central place. Our platform also helps you provide more technical evidence to prepare for audits and regulatory compliance. Check out how trail’s features can transform your AI governance journey here!