Westminster Policy News & Legislative Analysis

Data Protection Act 2018 AI Code Regulations Come Into Force on 12 May

According to the statutory instrument published on legislation.gov.uk, the Data Protection Act 2018 (Code of Practice on Artificial Intelligence and Automated Decision-Making) Regulations 2026 were made on 16 April 2026, laid before Parliament on 21 April 2026, and will come into force on 12 May 2026. The Regulations extend to England and Wales, Scotland and Northern Ireland. The instrument records that the Secretary of State acted under sections 124A and 124B of the Data Protection Act 2018 and consulted the Information Commissioner, along with other persons considered appropriate, under section 182(2). Its immediate effect is procedural rather than substantive: it requires the Commissioner to begin producing a formal code of practice on AI and automated decision-making.

The new duty sits inside the existing data protection regime rather than outside it. Regulation 2 requires the Commissioner to prepare guidance on good practice in the processing of personal data under the UK GDPR and the Data Protection Act 2018 in relation to developing and using artificial intelligence, and in relation to automated decision-making. That drafting matters for policy and compliance teams. The Regulations do not create a separate AI statute, nor do they set out a fresh set of direct operational duties on day one. Instead, they instruct the regulator to produce an authoritative code under the current legal architecture, which means the practical detail will follow through the Commissioner’s drafting process.

The scope of the future code is deliberately broad, but it is not unlimited. The Explanatory Note states that relevant data protection legislation means the UK GDPR and the Data Protection Act 2018, excluding Part 4 of the 2018 Act, which covers intelligence services processing. For most organisations, that means the forthcoming code is aimed at mainstream public and private sector uses of AI where personal data is involved. Departments, local authorities, NHS bodies, suppliers and commercial controllers using AI systems will therefore need to read the code alongside existing UK GDPR duties, rather than treating it as a separate compliance stream.

One feature is expressly written into the Regulations rather than being left to the Commissioner’s discretion. Regulation 2 requires the code to include guidance on good practice in the processing of children’s personal data. In policy terms, that moves children’s data from a possible subtopic to a mandatory part of the final document. The practical message is clear. Any organisation developing or deploying AI tools that touch children’s information should expect the Commissioner to examine governance, justification, safeguards and oversight with particular care. The Regulations do not yet define those measures in detail, but they make clear that children’s data must be addressed in the code itself.

The instrument also ties the future guidance to specific statutory provisions on automated decision-making. Regulation 2 defines the term by reference to Article 22C(1) of the UK GDPR and section 50C(1) of the Data Protection Act 2018, both inserted by section 80 of the Data (Use and Access) Act 2025. That linkage gives the code a clear legal anchor. Organisations using algorithmic tools for assessment, routing, ranking or decision support should expect the Commissioner’s guidance to be structured around the post-2025 legislation, not around a general or informal understanding of what AI might mean in practice.

A narrower but important procedural change appears in regulation 3. It modifies section 124B of the 2018 Act so that any panel established to consider or amend the code must not consider, or report on, any aspect of the code relating to national security. The Explanatory Note confirms that this is a specific adjustment to the normal panel process. The same note also says that no full impact assessment has been produced for the instrument, because no, or no significant, effect on the private, voluntary or public sector is expected from the Regulations themselves. Instead, the Commissioner must produce an impact assessment when preparing the code, and that is the point at which the operational effect is likely to become clearer. Until then, the policy position is that the Regulations, signed by Ian Murray at the Department for Science, Innovation and Technology, start the formal guidance process but leave the detailed compliance consequences to the Commissioner's next stage of work.