This guide helps you consider the advantages and pitfalls of using AI in your family case. It aims to:
Safer, low‑risk uses
Do not use AI to:
Bottom line: Your story and evidence matter most.
Don’t risk your case by trusting a machine.
Never paste into public AI:
If AI mentions a law, rule, practice direction or case, you must verify it yourself:
How to verify (simple steps):
Why this matters:
Using false or misleading legal material can seriously harm your case and may lead to a costs order against you.
If you share confidential information (e.g. family court documents) with a public AI, you are likely to be breaking the law and/or to be in contempt of court.
Most people only have access to public/consumer AI tools (ChatGPT, Gemini, Claude, Grok, etc.). These tools may keep or use your inputs to improve their models; some offer opt-outs, but you should not rely on them to keep your case private.
Enterprise/business AI (for example, Microsoft Copilot for Microsoft 365) is designed for organisations and typically offers:
Important: Unless you have a business‑grade AI with a clear confidentiality agreement, assume your data is not private in AI tools.
For family cases, treat confidentiality as essential.
Safer (generic, no private data)
Unsafe
Nothing on this website constitutes legal advice, and the inclusion of any other website or publication does not imply endorsement of its contents. Any costs or fees mentioned were correct at the time of writing but should always be checked at source. Messages sent via this website do not constitute formal or official communication with any member of the judiciary or court staff.
Copyright © 2025 Cumbria DFJ Website - All Rights Reserved.