CJC consultation on AI in litigation: what the proposed rules mean for your court documents

Article | 21.04.2026 | 7 mins read

Key takeaways

Transparency needed when AI shapes court documents

Mandatory declarations and clear protocols may be required where AI influences key court documents.

Witness statements must remain free from any AI input

AI should not generate, edit or rephrase any part of a witness statement.

Legal representatives retain full accountability for AI assisted documents

Legal representatives remain accountable for all documents drafted with AI assistance.

The Civil Justice Council (CJC) has released its 'Interim Report and Consultation on the Use of AI for Preparing Court Documents', inviting views on whether the Civil Procedure Rules should be updated to address the growing use of AI in civil litigation. The consultation remains open until 14 April 2026.

The report reflects a wider trend across the market: while AI enabled tools can streamline case preparation, reduce time spent on document review and support cost efficiency, there remains clear concern about accuracy, reliability and evidential integrity. The CJC therefore adopts a targeted, risk based approach, differentiating between low risk administrative uses of AI and higher risk scenarios where AI could meaningfully affect the substance of documents relied upon by the court.

Witness statements are the most significant area of focus. The CJC proposes a mandatory declaration confirming that AI has not been used to generate, edit, rephrase or embellish the content of trial witness evidence. This aligns with the strict requirements of PD 57AC, which mandates that evidence must reflect the witness’s own words and recollection. From a law firm perspective, this means firms must ensure they have clear internal protocols: even seemingly benign tools such as summarisation or wording suggestion features could breach the rule if they alter the language or structure of the witness’s account.

For expert reports, the CJC is not seeking to prevent AI use but instead to promote transparency. Experts would be required to explain how AI contributed to their analysis - other than for administrative tasks - and to identify the systems used. Given the increasing sophistication of expert work, this approach aims to preserve accountability while recognising that experts are already using AI to manage large datasets and improve turnaround times.

For statements of case and skeleton arguments, the CJC does not propose new rules provided the responsible lawyer is clearly identified. In practice, this means law firms may continue to use AI tools as drafting aids, but professional judgment remains paramount; AI generated errors, hallucinated citations or misstatements will still be treated as the legal representative’s responsibility.

Disclosure remains unchanged. Although technology assisted review already relies heavily on machine learning, the CJC does not propose additional disclosure obligations relating to AI use. However, practitioners may wish to consider potential complexities such as privilege attaching to prompts, queries, or internal workflows if AI tools are used during document review.

Administrative uses of AI - such as transcription, formatting or spell checking - do not require any declaration. These functions are now commonplace across the legal sector.

Finally, the CJC notes that AI use by litigants in person falls outside this consultation but may require attention as consumer grade AI platforms become more accessible. This represents a potential future challenge for courts and practitioners alike, particularly where litigants in person rely on AI generated legal arguments or evidence.

The key takeaway is that AI can be hugely helpful and welcome in the litigation process, and indeed may drive cost savings, but it needs to be used carefully with the appropriate safeguards. If you have questions about what the CJC’s proposals mean for your business or ongoing disputes, our team at Hill Dickinson is here to talk it through.

For further information, please contact Anna Timms and Paul Walsh.
