Regulating artificial intelligence

The UK Law Commission’s approach

Industry specialisms | 08.08.2025 | 7 min read

Key takeaways

AI challenges traditional legal frameworks

Autonomous systems raise new questions of liability.

Legal scrutiny rises around AI data use

Unreliable training data can lead to regulatory scrutiny.

Future-proof contracts for AI integration

Review clauses to address AI use and compliance.


As businesses increasingly adopt AI-enabled systems, they face a growing challenge: how to ensure contracts, compliance, and risk frameworks keep pace with the technology. Whether you’re procuring AI tools or taking them to market, the Law Commission’s recent paper offers a timely lens through which to assess your legal exposure — and your readiness for what’s next.

On 31 July 2025, the Law Commission published its discussion paper on Artificial Intelligence and the Law, marking a significant moment in the legal sector’s engagement with AI. While the paper does not propose reforms, it sets out a framework for identifying where AI may challenge existing legal doctrines - and where future reform may be necessary.

Can the law keep up with machines? 

The paper identifies three principal areas where AI intersects with legal uncertainty:

  • Autonomy and adaptiveness: AI systems are increasingly capable of making decisions without human oversight. This raises questions about liability, particularly in commercial contexts where automated systems execute trades, interpret contracts, or manage logistics. The Commission notes that current legal frameworks may struggle to allocate responsibility when harm arises from autonomous decision-making.

  • Training data and bias: The use of opaque or biased training data presents risks in sectors such as finance, recruitment, and insurance. Commercial clients deploying AI tools may face exposure under discrimination law, data protection regimes, and regulatory scrutiny. They will need to seek advice on contractual safeguards and audit mechanisms to mitigate these risks.

  • Interaction and reliance: As businesses integrate AI into decision-making processes, the law must consider how reliance on AI affects duties of care, misrepresentation, and contractual expectations. For example, if a party relies on an AI-generated forecast when entering a contract, does that create a new basis for liability?

Implications in practice 

  • Contracts: We should anticipate increased demand for contractual clauses addressing AI usage, transparency obligations, and liability allocation. Businesses will need protection against AI-induced errors and assurances regarding data integrity.

  • Risk and compliance: AI tools used in due diligence, financial modelling, or supply chain management introduce new risks. Businesses must assess not only the technology but the legal structures surrounding its deployment.

  • Regulatory exposure: As AI intersects with GDPR, consumer protection, and competition law, businesses will need to stay up to date on compliance, reputational risk, and enforcement trends.

Future challenges

The Commission flags several areas for future inquiry, including aviation autonomy and product liability. It also raises the challenging question of whether AI systems should eventually be granted legal personality, a concept that could radically alter liability frameworks. While the Commission concludes that AI is not yet sophisticated enough to warrant such status, it acknowledges that the debate will resurface.

Why this matters for businesses

The Law Commission’s paper is not a call for immediate reform, but a strategic invitation to engage. For businesses and commercial lawyers, it signals the need to future-proof contracts, reassess risk allocation, and stay alert to regulatory developments.

If you’re integrating AI into your operations — or preparing to take AI products to market — now is the time to review your legal frameworks. From drafting robust contractual protections to navigating emerging regulatory risks, we’re here to help you stay ahead of the curve.
