Chamber
Commons
Stage
1st Reading
Introduced
Jun 19, 2025
Progress
This bill creates new protections for minors online, strengthens child exploitation reporting rules, and adds Criminal Code offences for AI-generated intimate images and online harassment.
Key Changes
- Creates the Protection of Minors in the Digital Age Act, requiring online platforms to implement default safety settings, parental controls, and duty-of-care obligations for minors
- Allows minors and their parents to sue platform operators directly in court for serious harm caused by failure to meet duty-of-care requirements
- Expands mandatory child sexual abuse material reporting requirements to cover more types of internet services and directs all reports to a single designated law enforcement body
- Extends the period for which data must be preserved after a report to one year, and extends the limitation period for prosecuting violations to five years
- Creates a new Criminal Code offence for publishing AI-generated or digitally altered fake intimate images without consent, with penalties up to 14 years imprisonment
- Adds online harassment via social media or digital networks as a distinct form of criminal harassment, with anonymous or false-identity communication treated as an aggravating factor at sentencing
Gotchas
- The bill defines 'child' as under 16 and 'minor' as under 18, creating two tiers of protection with stricter default parental controls applying to the younger group
- Operators are prohibited from requiring users to provide a digital identity credential to access their platform, which could create tension with age verification requirements elsewhere in the bill
- The private right of action allows lawsuits only for 'serious harm,' defined as significant physical or psychological harm or substantial economic loss, which may limit the range of claims that can succeed
- Platforms that already meet equivalent standards under their own codes of practice may receive a ministerial notice reducing how much of the Act applies to them, creating a potential compliance exemption pathway
- The recognizance provisions for online harassment apply only when conduct was threatening or obscene and involved a pattern of repetitive or persistent aggressive behaviour, limiting their use to more serious cases
- Parts 2 and 3 come into force immediately upon royal assent but are contingent on a separate 2024 Act (on child sexual abuse material) already being in force, creating a dependency on another piece of legislation
Who's Affected
- Minors (under 18) who use online platforms, social media, and apps
- Parents and guardians of minors
- Owners and operators of online platforms, social media services, and apps accessible in Canada
- Internet service providers subject to mandatory child exploitation reporting
- Victims of non-consensual AI-generated intimate images (deepfakes)
- Victims of online criminal harassment
- Law enforcement bodies designated to receive child exploitation reports
Summary
Bill C-216 is a three-part private member's bill introduced by MP Michelle Rempel Garner.

Part 1 creates a new law called the Protection of Minors in the Digital Age Act, which requires app and website operators to act in the best interests of minors, provide safety settings and parental controls by default, restrict harmful advertising to minors, publish annual transparency reports, and face large fines if they fail to comply. Minors and their parents can also sue platforms directly if serious harm results from a failure to meet these duties.

Part 2 updates the existing law on mandatory reporting of child sexual abuse material online. It expands which types of internet services must report such content, simplifies the reporting process by directing all reports to a single designated law enforcement body, requires that transmission data be included when content is clearly child sexual abuse material, and extends both the data preservation period and the time limit for prosecuting violations.

Part 3 amends the Criminal Code to create a new offence for sharing AI-generated or digitally altered fake intimate images of a person without their consent, with penalties of up to 14 years in prison in the most serious cases. It also adds online harassment through social media or digital networks as a specific form of criminal harassment, makes anonymous or false-identity harassment an aggravating factor at sentencing, and allows courts to order someone suspected of online harassment to enter into a recognizance (a formal promise to keep the peace) and to issue production orders to identify anonymous harassers.
Automatically generated from bill text using Claude