
Updated Guide on Compliance Assessments Under the EU AI Regulation by FPF and OneTrust

FPF and OneTrust have released a new edition of their Conformity Assessment Guide under the EU AI Act, together with an accompanying infographic. The revised guide aligns with the final text of the EU Artificial Intelligence Act (EU AIA), Regulation (EU) 2024/1689, adopted by the European Union in 2024.


The Future of Privacy Forum (FPF) and OneTrust have published an updated Conformity Assessment Guide under the EU Artificial Intelligence Act (EU AIA), providing a comprehensive roadmap for organizations seeking to build internal processes that align with the Act.

The updated guide offers a step-by-step approach to understanding and complying with the Conformity Assessment requirements under the EU AI Act. Here's a summary of the key phases:

1. **Understanding the AI Act Scope and Timeline:** The EU AI Act entered into force on 1 August 2024 and applies in phases. From 2 February 2025, the ban on unacceptable-risk AI practices and the AI literacy obligation for companies apply; rules for general-purpose AI models, including transparency requirements, follow from 2 August 2025. Most obligations for high-risk AI systems apply from 2 August 2026.

2. **Identifying Whether Your AI System Is High-Risk:** Conformity assessment requirements primarily apply to AI systems classified as "high-risk" under Article 6 of the Act, covering safety components of products regulated under Annex I and the standalone use cases listed in Annex III. Organizations should classify their AI systems against these risk categories to understand which obligations apply.

3. **Preparing Documentation and Risk Management:** Develop and maintain a technical documentation file demonstrating compliance with the AI Act’s requirements. Implement a risk management system detailing how risks associated with the AI system are identified, mitigated, and monitored. Ensure transparency, robustness, and data governance measures are in place.

4. **Engaging with Notified Conformity Assessment Bodies:** Member States designate notified bodies to carry out third-party conformity assessments. Where the Act requires third-party assessment, organizations must coordinate with a notified body to verify that their high-risk AI systems meet the regulatory requirements; many other high-risk systems may instead be assessed under the provider's internal control procedure.

5. **Implementing Governance and Internal Controls:** Establish internal policies and training programs to ensure organizational awareness and compliance culture. Monitor AI system performance continuously and document any incidents or breaches.

6. **Certification and Market Surveillance:** After a successful conformity assessment, draw up an EU declaration of conformity and affix the CE marking as proof of compliance before placing high-risk AI systems on the EU market. Be prepared for ongoing market surveillance by national authorities enforcing the AI Act’s provisions.

7. **Staying Updated with the AI Act Governance Framework:** Follow guidance from the European AI Office and participate in emerging frameworks like the AI Code of Practice. Keep abreast of updates regarding general-purpose AI models’ specific rules effective August 2025.
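As a purely illustrative aid (not part of the FPF-OneTrust guide), the risk-classification step in phase 2 can be thought of as a first-pass triage that routes each AI use case into one of the Act's risk tiers. The keyword buckets below are simplified assumptions for illustration only, not the Act's legal tests:

```python
from enum import Enum

class RiskTier(Enum):
    PROHIBITED = "unacceptable risk (banned practice)"
    HIGH = "high risk (conformity assessment required)"
    LIMITED = "limited risk (transparency duties)"
    MINIMAL = "minimal risk"

# Simplified, hypothetical keyword buckets -- NOT the Act's legal criteria.
PROHIBITED_USES = {"social_scoring", "subliminal_manipulation"}
HIGH_RISK_USES = {"biometric_identification", "recruitment", "credit_scoring"}
TRANSPARENCY_USES = {"chatbot", "deepfake_generation"}

def triage(use_case: str) -> RiskTier:
    """Rough first-pass triage of an AI use case into a risk tier.

    Real classification requires legal analysis against Article 6
    and Annexes I and III; this only sketches the decision flow.
    """
    if use_case in PROHIBITED_USES:
        return RiskTier.PROHIBITED
    if use_case in HIGH_RISK_USES:
        return RiskTier.HIGH
    if use_case in TRANSPARENCY_USES:
        return RiskTier.LIMITED
    return RiskTier.MINIMAL
```

A system triaged as high-risk would then proceed to the documentation, risk management, and assessment phases described above, while a minimal-risk system would face few obligations beyond general good practice.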

This roadmap emphasizes a phased approach with early focus on risk classification and staff education, followed by rigorous documentation, third-party assessment, and eventual certification aligned with the Act’s rollout schedule. Coordination with national authorities and notified bodies is essential for formal conformity assessment and market approval.

The updated guide and infographic, available for viewing [here](URL for the guide) and [here](URL for the infographic), aim to support organizations in navigating their obligations under the AIA and building internal processes that align with the Act. It's important to note that the guide does not constitute legal advice for any specific compliance situation, and organizations should consult the original FPF-OneTrust publication directly for precise procedures and checklist details.


  1. The updated Conformity Assessment Guide by the Future of Privacy Forum (FPF) and OneTrust underscores that trust and compliance go hand in hand for organizations aligning with the EU AIA.
  2. The roadmap treats privacy and security as primary considerations in developing AI systems, not afterthoughts.
  3. Classifying AI systems against the Act's risk categories early clarifies which obligations apply and where best practices on trust, privacy, and security are most needed.
  4. Meeting the EU AIA's requirements helps organizations maintain customer trust and avoid legal complications.
  5. Structured compliance processes of the kind the guide describes can make navigating the EU AIA manageable while keeping trust, security, and privacy concerns front and center.
