Rules of Civil Procedure Relating to Evidence and Artificial Intelligence

September 17, 2025

Introduction

The Ontario Bar Association (“OBA”) provides feedback on the Artificial Intelligence Subcommittee of the Civil Rules Committee’s (“Subcommittee”) potential proposals to amend the Rules of Civil Procedure (“Rules”) as they relate to evidence and artificial intelligence. The goal of this submission is to support the Subcommittee’s provision of guidance to the judiciary and the bar on the procedure for receiving evidence generated by artificial intelligence.

Ontario Bar Association

Established in 1907, the OBA is the largest and most diverse volunteer lawyer association in Ontario, with close to 16,000 members practising in every area of law in every region of the province. Each year, through the work of our 40 practice sections, the OBA provides advice to assist legislators and other key decision-makers in the interests of both the profession and the public, and we deliver over 325 in-person and online professional development programs to an audience of over 20,000 lawyers, judges, students, and professors.

This submission was prepared and reviewed by members of the OBA’s Artificial Intelligence Taskforce. Members of this taskforce include barristers and solicitors in public and private practice in large, medium, and small firms across Ontario. These members have extensive experience dealing with technology law, including cyber security and intellectual property as well as commercial matters.

Comments & Recommendations

Proposal 1: Defining Artificial Intelligence

The Subcommittee proposes including a definition of “artificial intelligence” in the Rules. The OBA is generally in favour of using the Sedona Conference definition but believes the Conference’s original characterization of AI as technologies that “emulate” human intelligence should be restored. AI does not “replicate” human intelligence; it merely mimics the human thought process by predicting words and sequences of words. This is emulation, not replication. Accordingly, we recommend the minor amendment demonstrated below.

Proposal 2: Identification of Evidence Generated by a Computer System Using AI

The second proposal aims to provide requirements for parties who seek to put forward evidence generated by a computer system using artificial intelligence. Notably, this proposal would supplement the Ontario Evidence Act and the Canada Evidence Act as this is not addressed in either. We are generally in favour of this proposal. However, the subrule should make clear (assuming this is the intention) that this subrule is intended to deal with factual evidence, not expert evidence.

Assuming this proposal is intended for the lay witness, there is concern that it is overly broad. In our opinion, as AI becomes more embedded into a myriad of decision-making applications, regular people should be entitled to testify as to their reliance on AI (and AI output as evidence), in their defence to an action or on the issue of damages, even if they do not know the answers to the questions in the subject proposal. Accordingly, it is recommended that proposal #2 be amended, as outlined below.

Moreover, subrule (c) seems to necessitate the tendering of expert evidence, likely in computer science and/or artificial intelligence. Respectfully, it is unlikely that a lay witness will be able to explain why software results are “valid and reliable.” This will, of course, increase the cost and complexity of leading such evidence. We query whether a system could be adopted by which the court may whitelist artificial intelligence programs (much in the way it has implicitly done so with commercial case law databases and the results of document management platforms employing discriminative AI) under a given set of conditions, e.g. where the court is satisfied that the correct inputs and prompts were employed in a given case.

This would spare litigants the necessity of tendering expert evidence for their AI-generated factual evidence in every single case in which such evidence was sought to be admitted.

Another option to simplify the process might be the use of certificates similar to those counsel provide in various circumstances under the Rules. Such certificates would be completed by computer science/artificial intelligence experts, but could be re-used by AI program vendors to address at least the last element of the test (i.e. certifying only that the program produces valid and reliable results when properly used, or when used under a set of listed criteria). This would allow the court to presumptively accept AI-generated factual evidence without the need for a live witness, except in circumstances where the other party or parties wish to challenge the expert. Additional subrules would be required for this process as well.

Lastly, it is important to note that the test above does not deal with the issue of bias introduced into a given AI platform.

Proposal 3: Potentially Fabricated or Altered Computer-generated or Other Electronic Evidence

This proposal aims to prevent misleading AI-generated evidence, such as deep fakes, from benefiting from the presumption of authenticity, and would similarly supplement the evidence acts. We agree with this proposal. However, it does not address the problem of inaccuracies introduced by the AI itself, e.g. hallucinations. As noted above, a procedure should be set out to deal with issues arising from accidental misuse of AI (e.g. in inputs or prompts), or from errors originating within the AI system itself.

Proposal 4: Admissibility of Expert AI Evidence

This proposal seeks to provide a process for the admissibility of outputs of a computer system using artificial intelligence in an expert report. We agree with this proposal in principle, so long as it does not contemplate the admission of an AI-generated report without the testimony of an expert.

Among the differences between human experts and generative AI programs is the fact that the human expert understands the difference between truth and lies or fabrications, understands the solemnity of the court process and the expert’s duty in that process, and can be held accountable for failing to answer impartially and truthfully. We strongly disagree with any change to the Rules that would remove the accountability that having a human expert witness guarantees.

***

The OBA would be pleased to discuss this further and answer any questions that you may have.