Everything you need to know about the ISO 42001 Standard

What is ISO 42001:2023?

On December 18, 2023, the International Organization for Standardization (ISO) adopted ISO 42001:2023, which sets a voluntary standard for organizations to implement an artificial intelligence (AI) management system.

ISO 42001 describes the primary objective of an AI management system as the establishment of policies, procedures, and objectives for organizations regarding their AI systems. The standard offers a set of overarching principles and objectives for an organization’s stakeholders to adhere to when implementing their AI management system. The annexes in ISO 42001 map these principles and objectives to more detailed controls and provide implementation guidance for a more comprehensive and effective approach.

Who does ISO 42001 concern?

The ISO 42001 standard is designed for organizations engaged in the development, deployment, and/or utilization of AI. In a manner akin to the National Institute of Standards and Technology (NIST) AI Risk Management Framework (RMF), ISO 42001 is crafted to be adaptable and scalable, accommodating the specific requirements and scale of each organization. Furthermore, its flexibility spans across various industries, demonstrating its agnosticism to the specific products or services offered by the organization.

Moreover, organizations seeking to showcase trustworthiness in their AI applications may opt to proactively embrace ISO 42001. With increasing consumer awareness of how AI is employed and the associated risks, organizations demonstrating adherence to ISO 42001 may foster significant trust among their customers and the public.

What is a “voluntary” standard?


A “voluntary” standard is one that organizations are under no legal obligation to adopt; they comply by choice. Organizations that are considering how to manage their AI systems can think of the ISO/IEC 42001 standard as another tool in the toolbox to assist them in implementing the appropriate policies and procedures. However, they should also consider how voluntary frameworks and standards can easily be translated into enforceable regulatory requirements.

Policymakers, especially in the European Union (EU), may gravitate towards ISO/IEC 42001 as an enforceable standard for AI governance. For instance, organizations deploying high-risk systems must demonstrate compliance with the EU AI Act through conformity assessments once the law becomes effective. EU policymakers have indicated that compliance with ISO 42001 may be incorporated into the conformity assessment requirements.

Additionally, US President Joe Biden’s Executive Order on AI directs NIST to continue building upon the AI RMF and its work on other AI-related standards, and NIST might align some of its future recommendations with ISO 42001. Given that US regulators are likely to rely on NIST’s AI standards when establishing AI regulatory regimes, elements of ISO 42001 could in this way find their way into US regulation. But this is, of course, only an assumption at this stage.


How does ISO 42001 compare to other standards?

How does ISO 42001 compare to NIST AI RMF?

In terms of content, ISO 42001:2023 can be thought of as a complement to the NIST AI RMF. While NIST focuses primarily on managing the risks of AI systems, both address the high-level policies and procedures that organizations should consider in order to manage the quality of AI systems more generally.

Significantly, although both the NIST AI RMF and ISO 42001 are technically voluntary standards, ISO 42001 can function as an auditable standard. Furthermore, ISO 42001 offers a more comprehensive level of detail for the implementation of controls compared to the NIST AI RMF. For example, while the NIST AI RMF requires organizations to establish policies and procedures for the ongoing monitoring of AI systems throughout their lifecycle, it lacks specific details regarding those policies and procedures. In contrast, ISO 42001 provides more explicit guidance on the considerations organizations should incorporate when implementing continuous AI monitoring policies and procedures.

How does the ISO 42001 standard compare to ISO 27001:2022 and to ISO 23894:2023 (AI risk management)?

Standard ISO 27001:2022

While ISO 27001 is not an AI-related standard, it is the internationally dominant information security management standard, and cybersecurity is closely related to AI security in several respects. An organization can attain ISO 27001 certification, showcasing the implementation of adequate measures to safeguard its information system assets. In a similar vein, ISO 42001 offers organizations a certifiable standard for the management of their AI systems.

Standard ISO 23894:2023

This is all about scope:

  • The ISO 23894 standard adapts the generic risk management guidance of ISO 31000:2018 to AI.
  • ISO 42001 focuses on internal operating procedures to manage AI systems.

While ISO 42001 does encompass risk management, it does so within the broader context of organizational policies and procedures governing AI development, deployment, and internal AI use cases. For example, ISO 42001 mandates regular risk assessments by organizations and includes implementation guidance on the comprehensive documentation of AI risks. Nonetheless, for more detailed guidance on designing effective risk assessments, the ISO/IEC 23894 standard offers a more descriptive framework.


Should I get certified for ISO 42001:2023?

While the standard is still voluntary, it is very likely that the EU AI Act will put it in the spotlight. So, whether for best practice or for compliance, certification is likely to become a must-have.

As with other ISO standards, certification bodies will be established to audit organizations that seek to adopt ISO 42001.

Please note that you cannot download ISO 42001 for free, as it is a copyrighted ISO standard. The official source is available for purchase from ISO (expect to pay something like 200 USD, I assume…):
