Moradi v. British Columbia (Human Rights Tribunal)
Supreme Court of British Columbia, 18 July 2025
- Party: Pro Se Litigant
- AI Tool: ChatGPT
Hallucinated Content: Fabricated
- Case Law: The index of authorities contained close to thirty new authorities, 22 of which the Court found had been generated by ChatGPT and were not real cases. The petitioner admitted they were produced by ChatGPT and 'turned out to be incorrect.' The Court relied on this finding in assessing costs.
- Case Law: An example of a fabricated authority listed in the index; it did not correspond to a real reported case and was generated by ChatGPT.
- Case Law: A second example of a fabricated authority generated by ChatGPT that the Court found did not exist.
Outcome: Misconduct taken into account in allocating costs.
Source: Damien Charlotin's AI Hallucination Cases Database.