Deepfake Nudes and the Law: What Parents, Caregivers, and Adults Should Know About a New Ontario Court Ruling


Caveat - We are not lawyers, and this is not legal advice. However, Darren was involved in the criminal justice system for over 30 years through his role in law enforcement, and we follow case law from across Canada specific to the intersection of the misuse of technology and the law.


This morning, a Toronto Star headline caught our attention: “Distributing a fake nude of your spouse is ‘morally reprehensible’ — but not a crime under Canadian law, Ontario judge warns.” (1) The story refers to R. v. Kapoor, 2025 ONCJ 542, a case that exposes a serious gap in how Canadian law addresses deepfake technology and intimate-image abuse. (2)


In this case, a husband sent two types of images through Snapchat to an unknown person. His daughter found the images on his phone and brought them to the attention of her mother:


1/ A photo of his wife wearing a bra in their bathroom, taken without her knowledge or consent; and,


2/ Digitally altered “deepfake” nudes of his wife showing exposed breasts that were not hers, also sent without her knowledge or consent.


The husband was charged under Section 162.1 of the Criminal Code, which prohibits the non-consensual distribution of an intimate image. However, the judge ruled that neither image met the legal definition of an “intimate image.”


The court reasoned that:


  • The photo of the wife in her bra didn’t qualify because she wasn’t fully or partially nude as defined by law.


  • The deepfake image didn’t qualify because it wasn’t her real body. The judge stated, “If this type of photo were meant to be captured by this section, Parliament would have specifically done so.”


In other words, while the act was morally wrong, it was not criminal under current Canadian law.


Why the Ruling Matters


This decision highlights the legal challenge of keeping pace with technology. Canada’s intimate-image laws were written before the rise of AI-generated imagery. In Section 162.1, the phrase “made by any means” qualifies a visual recording of a person; it covers recordings, not synthetic likenesses. Since deepfakes aren’t recordings but digital fabrications, they fall into a legal grey zone.


We agree that Canadian courts can’t legislate the future, but they can interpret laws in light of their original intent. This principle, established in R. v. Salituro (1991), allows judges to adapt the common law incrementally when new situations arise. (3) As Justice Iacobucci wrote:


“Judges can and should adapt the common law to reflect the changing social, moral and economic fabric of the country. However, they should only do so when the change is incremental and necessary.”


Applied here, that means a judge could interpret the phrase “made by any means” broadly enough to include AI-created or manipulated likenesses if doing so aligns with the spirit of Section 162.1: protecting autonomy, privacy, and consent.


The harm caused by deepfake nudes is essentially the same as distributing real non-consensual images. The difference is technical, not moral.



Why This Legal Gap Needs to Close


The Crown in R. v. Kapoor did not appear to raise the Salituro principle as set out by the Supreme Court of Canada. Had it done so, the outcome might have been different. Until Parliament updates the law, Crown counsel could rely on this reasoning to argue that existing protections extend to deepfake victims.


Still, the case shows why legislative reform is urgently needed specific to the creation of deepfake nude pictures, videos, and audio. Parliament must modernize Section 162.1 to reflect today’s reality, where AI-generated nudes are becoming increasingly common.



When the Victim Is a Minor: A Different Story


It’s important for parents and caregivers to know that Canadian courts already take a stronger stance when minors are involved. Cases like R. v. Larouche (2023) (4) and R. v. Legault (2024) (5) confirm that deepfake images depicting anyone under 18 fall under child pornography laws (Section 163.1), even when the image is completely synthetic. There is even case law finding that a semi-nude picture of a youth, where genitalia are covered, can still legally qualify as an intimate image depending on the context of the picture or video.


In those cases, judges ruled that the intent and harm of the act are what matter, not whether the image was “real.” This is a textbook application of the Supreme Court’s Salituro principle: using existing law to fill the gaps Parliament hasn’t yet closed.


So if deepfake laws can already protect minors, why shouldn’t the same principle protect adults too?



The Takeaway for Parents, Caregivers, and Adults


This provincial-level court case, which is not legally binding outside the province of Ontario, is a sobering reminder that technology can outpace legislation, leaving real victims in legal limbo. It also clearly shows that while deepfake abuse targeting minors is criminal, adult victims remain vulnerable unless lawmakers act to update the law, and/or Crown counsel make a specific argument, grounded in the Supreme Court’s Salituro principle, as to why Section 162.1 should still apply until the law is updated.


Until then, we believe the courts still have tools to respond, if they interpret the law based on its intent rather than its literal wording. As the R. v. Salituro precedent shows, the Supreme Court of Canada has stated that the justice system can evolve, even without new statutes.


Canadian courts may not see the future, but they can uphold the spirit of the law to protect dignity and consent in the digital age until existing laws are updated.


Just some thoughts from a team asymmetrically fighting the fight on this significant legal challenge. For those Canadians reading this article, contact your Member of Parliament and advocate strongly to have them update Section 162.1 of the Criminal Code of Canada.


POSTSCRIPT:


For those who live in British Columbia, there is the “Intimate Images Protection Act”, a civil process through which a victim can seek both a takedown order and financial restitution of up to $75,000 for such images being created or shared without consent. Deepfakes are clearly covered by the legislation. (6)


The White Hatter



References:






