
Civil Litigation & Personal Disputes, General Litigation, Radio/Media

A.I. in Litigation in British Columbia: Risks, Rules, and Real-World Court Consequences

December 19, 2025 by Clay Williams, David Horvath

Artificial intelligence is everywhere right now, and it is starting to show up in a place where precision matters most: the courtroom. In this episode of FH&P Lawyers’ Law Talk Podcast, host Clay Williams sits down with senior litigator David Horvath to explore AI in litigation and what it means for cases in British Columbia. From AI “hallucinations” and fake case law to overly long, AI-generated pleadings that can get a claim struck, David breaks down the risks, the practical uses, and why legal advice still matters in this new frontier.

Clay: Welcome to another edition of FH&P Lawyers Law Talk. I’m Clay Williams, a partner here at FH&P Lawyers. You can find us on Spotify, Apple Podcasts, YouTube, and now, this is new for us, on Rogers TV as well. So today we have David Horvath. Welcome, David!

You can’t turn on the radio or look at a podcast without somebody talking about AI, so I guess it’s our turn today. But we’ve got a bit of a focus on AI and litigation. So what are you going to talk about, and what brought that along?


Why Courts Are Seeing More AI-Generated Claims

David: AI right now is the wild, wild west. Everybody has AI; everyone's using it. In my practice, clients usually come in with AI in one of two situations. Either they arrive with a ChatGPT printout saying, "This is what I think my case is about. Tell me if I'm right or wrong," or they've already filed something with the court that they drafted using ChatGPT or another AI service, and there may be issues with that from the court's perspective. So they come in for advice about that.

Clay: That just happened to me. We had a matter involving a shareholders' agreement, and the client ran it through ChatGPT, which gave comments on it. That concerns me a bit, because that document goes out onto the internet. But it happens, and sometimes ChatGPT has good ideas.


When ChatGPT Helps, and Why It Still Needs Review

David: Actually, it does. Sometimes it has good ideas, but it has to be reviewed by a lawyer. You’ll get comments from ChatGPT saying, “I’m not a lawyer. Go take this to a lawyer,” which is helpful. I’ve had clients put my work through ChatGPT after things have been filed. Usually, that’s okay. It helps them understand a lot of the legal jargon that we use at the end of the day.

Clay: I completely agree with you that it’s easy to put your legal problem into ChatGPT and get something out, but you still need the background to understand what is being told to you, and sometimes it’s not true.


AI Hallucinations: Fake Case Law, Sanctions, and Dismissed Cases

David: Sometimes it hallucinates. There are cases where lawyers have used AI and provided that argument to the court, resulting in sanctions, being spoken to by the court, and sometimes cases being dismissed. What happens with AI, especially open-source AI, is that it can provide fake case law or fake legislation. If you don’t check it and rely on it, the court will check it and discover that you used fake cases or fake law. That will not go well for your lawsuit.

Clay: That’s happened to lawyers, too, not just self-represented litigants. One thing we’re seeing more of in court is people bringing their own claims and relying on ChatGPT for legal arguments and case law. That can get you in trouble.


Notice of Civil Claim Basics: Causes of Action and Why Specificity Matters

David: If you do that, first of all, call FH&P Lawyers, and we’ll review what you have and suggest changes. If you proceed without doing that, the court may dismiss your case. If you sue someone in British Columbia, whether it’s Provincial Court or Supreme Court, you have to file a Notice of Civil Claim or Notice of Claim. That document identifies what your case is about. You must be very specific and articulate what’s called a cause of action. A cause of action is a legal term that indicates you have a right to sue and what that right is.

Examples include negligence or breach of contract. Each has a specific legal test. Everything in that document has to tie back to that test. When you use ChatGPT, that often doesn’t happen. There may be superfluous material that doesn’t matter.

Clay: Yeah, ChatGPT can produce something, but if you don’t understand when it’s alleging fraud or the repercussions of that, you can get into trouble.

David: You can. AI usually spits out something favourable to you. It gives you jargon and explanations and tries to convince you that your case is right. When you come to FH&P Lawyers, my job is to tell you the good, the bad, and the ugly. I'm not a cheerleader. We tell you whether your case is good, bad, or somewhere in between.

AI usually just caters to you. That’s why it’s important to have materials reviewed by a lawyer. If you don’t, you might end up in a situation where materials get struck from the record and the case is dismissed because the Notice of Civil Claim wasn’t clear.

Clay: That’s why you wanted to discuss this. You’ve had experience with clients dealing with self-represented litigants who brought ChatGPT-based claims, and you were successful in having them struck.

David: That’s correct. In one case, it wasn’t clear whether AI was used, but self-represented litigants filed a Notice of Civil Claim alleging negligence based on bylaw infractions. It took about five years and multiple court appearances to strike the claim. We were finally successful in striking the negligence claim.


Lower-Cost Legal Support: Limited Retainers and Unbundled Services

Clay: I also wanted to talk about some of the reasons people go to the Supreme Court on their own. Many people do so because of the cost. Lawyers aren’t accessible to everyone. That’s why ChatGPT is being used more. But we should talk about ‘ad hoc’ or unbundled legal services. The traditional model was full representation. That doesn’t have to be the case anymore.

David: We do more limited retainers now. Clients bring in AI-generated material, and I review it, saving time and cost. With unbundled services, it's sort of a pay-as-you-go system; we may draft a Notice of Civil Claim but not appear in court. We’re very clear about boundaries. The client goes to court, and we advise in the background. That saves money and still provides legal guidance, rather than relying solely on AI.


Limited Retainers and AI: Where They Help, and Where They Don’t

Clay: Yeah, knowing how to articulate a claim using legal causes of action is critical. People don’t have the training lawyers do. Limited retainers are being used more, and people are doing a good job with them. One benefit of AI is that it helps people understand concepts, but the risk is hallucination. Without understanding the law, you won’t know what’s correct.

David: Yeah, in limited retainers, we may advise on a fraud or negligence claim, but the client may not be able to articulate it to a judge, especially in complex matters. The more complex the case, the less useful AI is. However, clients can use AI to help understand what we tell them. I’ve had clients do that and come back saying they finally understand. There are privilege concerns, but AI helps people understand legal concepts and what courts expect.

Clay: Interesting! You seem to be an AI advocate.


AI Can Help, But Courts Still Expect Clear Pleadings

David: I am, because it helps clients and the community engage with legal disputes in a cost-effective way. AI helps, but it’s not the be-all, end-all. Lawyers are still required because legal issues must be articulated in a certain way. Otherwise, cases can be dismissed.

I mentioned a case where self-represented litigants filed a 55-page Notice of Civil Claim. That’s extremely long. The court called it prolix. The court gives opportunities to fix mistakes, but repeated failures can result in dismissal, which is what happened.

When I see self-represented litigants on the other side, I often send a letter outlining problems with their case. Courts want to see that an opportunity was given. If you don’t fix it after that, dismissal becomes more likely.

Clay: Right, so what are some of the reasons a court may strike a claim, anything we haven’t already mentioned?


Reasons a Judge May Dismiss a Claim Without Hearing the Evidence

David: The court may strike a claim if it’s abusive, vexatious, frivolous, or has no basis in law. The court looks at the Notice of Civil Claim, not evidence. It’s about articulating your case, not proving it. If you fail to do that, the court may dismiss it, although they usually give chances. Eventually, if those chances are wasted, the case will be dismissed.

Clay: Yeah, and a lot of people don't realize the time pressure when they go to court. Judges are under immense time pressure. A 55-page Notice of Civil Claim doesn't help. You need to clearly state what the case is, the causes of action, and the relief sought, concisely.

David: Putting everything in doesn’t help. It’s natural to want to tell your story, but the law doesn’t work that way. You must articulate claims succinctly and precisely. In the case I referenced, the judge said it shouldn’t take hours to understand a Notice of Civil Claim. Even with years of training, I couldn’t understand it. That’s a problem for the court. People often include too much information. Some facts matter; others don’t. Clear, efficient word usage is critical.

Clay: You said you’re a fan of AI. Do you recommend clients use it to explore concepts?


A Safer Approach: Use AI to Learn Concepts, Not to Draft Your Case or Replace Representation

David: I haven't explicitly recommended it, but it's a good idea if done carefully. Don't upload privileged documents. Instead, look up legal concepts like negligence or breach of contract using generic examples or fake names.

Open-source AI stores information, and uploading privileged material can waive solicitor-client privilege.

Clay: Next, are lawyers going to be replaced by AI? I don’t see it. AI is a great tool, like Google was. It gives people the power to understand concepts. But we’re not being replaced anytime soon. Unless Skynet becomes sentient.

David: Exactly, AI improves communication between lawyers, clients, courts, and opposing parties. It can reduce court time, which helps access to justice. I believe in AI, but lawyers are still needed.

Clay: So take advantage of it, but come talk to FH&P Lawyers so we can help you do it properly. Thanks for tuning in. Thank you, David, for a great discussion. David is one of our senior litigators here at FH&P Lawyers. If you have issues you can’t resolve and need to litigate, David can help.

Tune in next time on Spotify, Apple Podcasts, and YouTube. Please subscribe, follow, and leave comments. We’ll try to answer them. Until next time!

AI can be a useful tool, but in litigation, mistakes are expensive. When pleadings are unclear, too long, or built on unreliable sources, the risk is real: delays, added costs, or a claim being struck. FH&P Lawyers helps clients use technology wisely while keeping their case grounded in the rules and realities of British Columbia courts.

If you are dealing with a dispute, considering court, or unsure whether AI-generated materials are accurate, contact FH&P Lawyers to review your situation and discuss practical next steps.


Disclaimer: This material is provided for informational purposes only and should not be construed as legal advice on any subject matter. Consult with a qualified lawyer for advice on specific legal issues.