One of Canada’s foremost experts on child protection online said she is “very optimistic” that a panel advising cabinet ministers about combating online harm can map out a way to protect minors from sexual exploitation on the internet.
Lianna McDonald, executive director of the Canadian Centre for Child Protection, was among a dozen people appointed last week to the expert panel asked to help the government craft a new online safety bill.
Heritage Minister Pablo Rodriguez and Justice Minister David Lametti are preparing to reintroduce a bill tackling online harms, including racist and antisemitic abuse. The first version, introduced in the waning days of the last Parliament, died without ever being debated in the House of Commons when the election was called.
The inclusion of McDonald on the new advisory panel signals that tackling online child abuse will be a key element of the forthcoming bill.
She said there is an urgent need for a way to force tech companies to swiftly remove indecent images of children as reports of victimization keep rising.
The Cybertip.ca tip line the centre runs has seen a 37 per cent increase in reports of online victimization of children over the past year. The average age of victims reporting online victimization and the non-consensual distribution of intimate images is 14, with many female victims as young as 12.
The line has also handled a 79 per cent increase in reports of “sextortion” — extorting money or sexual favours from minors with threats to reveal evidence of their sexual behaviour.
McDonald has been advising British officials on the U.K.’s new online harms bill, which she said is a “game changer.”
Canadian officials have been studying the U.K. bill, which imposes a “duty of care” on tech firms, forcing them to swiftly remove child abuse images from platforms or face substantial fines.
McDonald said a voluntary approach of requesting that firms remove indecent and exploitative images is not working, and millions of explicit photos and videos are still circulating.
“This has been allowed to fester for decades,” she said.
The Canadian Centre for Child Protection runs its own program, Project Arachnid, which scans the internet for indecent images of children. It has detected millions of exploitative images, triggering removal notices to tech firms, and has led to the removal of six million images and videos since 2017.
Sean Litton, executive director of the Tech Coalition, which was set up to fight online child exploitation, said the Arachnid program plays a valuable role but the vast majority of exploitative content is detected by the industry itself and taken down.
“The tech industry has a very special responsibility to ensure that its platforms are safe for children,” he said. “The members of the coalition take this very seriously and proactively identify and take it down and report it.”
The coalition’s members include Facebook, Google, Twitter, Yahoo, Discord and TikTok.
YouTube says it commits significant time and resources to removing violative content as quickly as possible, and removed more than 1.8 million videos for violating its policies between April and June last year. This includes videos with sexual themes or obscenity meant to target young minors and families.
But McDonald said providers sometimes resist requests to remove exploitative images, including images of known sexual exploitation victims dating back decades.
She said the centre has had to argue with companies over whether some graphic and suggestive images of young children fall within their guidelines on removing harmful material.
“Videos of children being beaten, they don’t fit under a clear internal code. We were having a back and forth with companies about bringing the videos down,” she said.
McDonald said that in addition to indecent images, tech platforms could be slow to remove online chats about child sexual exploitation.
She said that although many pedophiles communicate on the dark web, much of the material is on the “clear” web, and can be spotted and taken down swiftly if tech firms agree.
The House of Commons ethics committee published a report last year saying Canadians who have their image posted to Pornhub or other online streaming platforms without their consent should have the right to have it taken down immediately.
The report also recommended making online platforms liable for failing to ensure that material is deleted quickly, as well as measures to ensure those depicted in pornographic content are at least 18 years old and consented to it being published.
The committee conducted its study into sites such as Pornhub, owned by Montreal-based MindGeek, after a New York Times article alleged the site had hosted videos of child sexual assaults and exploitation.
Pornhub said it has “stringent” policies to “combat and eradicate unwanted content” which far surpass those of other major platforms. It said it is at the forefront of combating and eradicating illegal content, and bans content from unverified uploaders.
A spokesman said anyone who has material posted without their consent should have the right to have it taken down immediately, which is Pornhub’s policy.
The Liberals promised to reintroduce the online harms bill within 100 days of the fall election, a deadline that came and went almost two months ago.
A consultation launched last summer suggested its passage could be fraught with difficulty, with some arguing the bill could stifle freedom of speech online and infringe privacy rights.
Some members of the new expert panel were among the critics of the first bill.
The panel will hold workshops and conduct additional consultations, including with online platforms, over the next two months, and Rodriguez said the bill will be tabled as soon as possible after that.
—Marie Woolf, The Canadian Press