Unmasking Deception: Canada Weighs in on the Dark Side of Deepfakes

  • April 30, 2024
  • Jennifer R. Davidson, partner, Technology, Privacy and Cybersecurity Law, and Victoria Di Felice, articling student, of Deeth Williams Wall LLP

In today’s digital age, seeing is no longer believing. Deepfake technologies are shaking up the legal landscape as Canada grapples with how to catch up to these AI imposters. Deepfakes are hyper-realistic media manipulations created using artificial intelligence (AI), in which images, audio, or video are either digitally altered or fully generated by AI to convincingly replace one person’s likeness with another.[i] These sophisticated synthetic multimedia manipulations are blurring the lines between reality and fiction.

DEEPFAKES IN THE WILD

Not all use of deepfake technology is nefarious or deceptive. Deepfake technology can be, and is, used for entertainment purposes, such as augmenting video game characters or developing satirical content.[ii] However, the use of deepfakes sparks major concerns when the images, voices, and even movements of real people are manipulated to create content that makes it look like the people portrayed are saying or doing things they have never said or done.

A 2023 report analyzing the state of deepfakes found that non-consensual pornographic clips constitute 98% of all deepfake videos found online.[iii] In late January 2024, sexually explicit deepfakes of singer Taylor Swift went viral on X (formerly known as Twitter). To prevent further distribution, X made Swift’s name unsearchable for 48 hours. Despite this effort, the images were still viewed millions of times before being taken down, with one of the images viewed more than 45 million times.[iv]

Unfortunately, Swift is just one of many women victimized by pornographic deepfake videos; 99% of the individuals targeted in such content are women.[v] Most do not have Taylor Swift’s resources to force the removal of these images from the internet. The creation and dissemination of non-consensual deepfake pornography not only infringes on a person’s right to control their own image and identity but can also cause irreversible harm to their reputation and mental health.[vi]

The use of deepfake videos to engage in fraudulent activity is hitting new peaks. In February of this year, an employee of a multinational finance company was tricked into handing over $25 million to criminals who used deepfake technology to stage a video conference call populated with convincing recreations of the employee’s colleagues, including the chief financial officer. In the meeting, the employee was instructed to make the transfer and, dutifully following what appeared to be directions from superiors, inadvertently handed the $25 million over to the criminals.[vii]

It’s not just corporations being targeted. Last year, a family living in Newfoundland and Labrador received a call from a voice they recognized as their son’s, claiming to be in trouble and in urgent need of their help. The couple handed over nearly $10,000 in cash to bail him out of trouble. Meanwhile, their son was at his own home and knew nothing of the so-called emergency.[viii] These incidents demonstrate the growing sophistication of deepfake-enabled scams, which can result in significant financial consequences for their victims.

In the last two years, deepfakes of politicians have become increasingly common. In March of this year, a deepfake advertisement depicting Justin Trudeau recommending a cryptocurrency exchange was posted on YouTube.[ix] In September 2023, days before Slovakia’s parliamentary elections, a fake audio recording of one of the candidates discussing how to rig the election surfaced online.[x]

In an era where videos go viral globally in mere seconds on platforms like Instagram, TikTok, and X, reaching audiences who assume that what they see is genuine, deepfakes can easily spread misinformation and shape public opinion.[xi] The potential use of deepfakes in upcoming elections raises serious concerns that require swift government action.