On 31 January 2025, the Department for Science, Innovation and Technology (DSIT) published a new artificial intelligence (AI) Cyber Security Code of Practice, along with an accompanying implementation guide.

Purpose of the Code

The Government explains that a voluntary Code of Practice focused specifically on the cyber security of AI is needed because AI differs from other software in distinct ways, including security risks arising from “data poisoning, model obfuscation, indirect prompt injection and operational differences associated with data management”. It also notes that software needs to be secure by design and that stakeholders in the AI supply chain require clarity on the baseline security requirements they should implement to protect AI systems.

What is covered by the Code?

The Code applies to “AI systems”, including systems that incorporate deep neural networks, such as generative AI. It sets out cyber security requirements across the AI lifecycle, which it separates into five phases: secure design, secure development, secure deployment, secure maintenance and secure end of life. At the start of each principle, the Code signposts relevant standards and publications to highlight the links between those documents and the Code.

Future global standard

DSIT has developed the voluntary Code with the intention that it will form the basis of a new global standard for secure AI, setting baseline security requirements, through the European Telecommunications Standards Institute (ETSI). The UK Government plans to submit both the Code and the implementation guide to ETSI so that the future standard is accompanied by a guide, and it notes that it will update the content of the Code and the guide to mirror the eventual ETSI global standard and guide.

Implementation guide

The implementation guide is intended to support organisations in complying with the requirements of the voluntary Code (and the future global standard) by providing a “one-stop shop” that brings together guidance and the key steps to follow.