Disinformation, including “deepfake” videos and bots spreading deception, should come within the scope of a future online harms bill, says a panel of experts appointed by Heritage Minister Pablo Rodriguez to help him shape a future law.
Members of the expert panel, including Bernie Farber of the Canadian Anti-Hate Network and Lianna McDonald of the Canadian Centre for Child Protection, have advised that the act impose a duty on tech giants to tackle the spread of fake news and videos.
Some suggested Canada should mirror the European Union’s Digital Services Act, which allows for stronger action to tackle disinformation in times of crisis — for example during elections, international conflicts and public-health emergencies.
They said the EU measure was a response to attempts by Russia to spread false claims justifying its invasion of Ukraine.
Public Safety Minister Marco Mendicino said in an interview that technology was now so sophisticated that some fake images and content were “virtually indistinguishable” from genuine content, making it very difficult for people to tell the difference.
He said a “whole-government approach” spanning several departments was needed to tackle the spread of disinformation in Canada.
“We are at a crucial juncture in our public discourse. We are seeing an increasing amount of misinformation and disinformation informed by extremist ideology,” he said.
An academic analysis of more than six million tweets and retweets, and their origins, found that Canada is being targeted by Russia to influence public opinion here.
The study by the University of Calgary’s School of Public Policy this month found that huge numbers of tweets and retweets about the war in Ukraine can be traced back to Russia and China, with even more tweets expressing pro-Russian sentiment traced to the United States.
Ministers have announced their intention to bring in an online harms bill which would tackle online abuse — including racist slurs, antisemitism and offensive statements aimed at members of the LGBTQ community.
It follows a previous online hate bill, published just before last year’s federal election, which did not become law.
The expert panel, which also includes law and policy professors from across the country, said not only should a bill tackle online abuse, including child abuse, it should consider fake and misleading information online. This could include co-ordinated disinformation campaigns “leveraged to create, spread, and amplify disinformation” including the use of bots, bot networks, inauthentic accounts, and “deepfakes.”
“Deepfakes” are counterfeit videos or photos created with deep learning technology, which can make them look highly realistic.
Some experts on the panel said the bill should also address false advertising, misleading political communications and content that contributes to “unrealistic body image.”
The panel said platforms would have a “duty to act” to address “harmful content online, which includes disinformation, by conducting risk assessments of content that can cause significant physical or psychological harm to individuals.”
Some experts on the panel warned that measures to address disinformation must be carefully worded so they cannot be abused by governments to justify censorship of journalism or criticism.
Their warning was echoed by Emmett Macfarlane, a constitutional expert at the University of Waterloo.
“There are always valid concerns about the potential for overreach and unintended consequences flowing from these sorts of laws. Our existing criminal hate speech and obscenity laws have resulted in material being unjustly restricted or blocked at the border, for example,” he said.
The 12-person panel of experts, which has just finished its work, said disinformation and fake posts could pose higher risks to children.
They have recommended that the bill impose strict requirements on social media companies and other platforms to remove content featuring or promoting child abuse and exploitation.
A few members criticized platforms for failing to take such content down immediately, saying, “the current performance of online services in removing child sexual abuse material is unacceptably poor.”
The panel was critical of platforms in general for saying what percentage of harmful content they take down, but not how long it took to remove it.
Rodriguez thanked the panel for completing their discussions last week, saying “their advice is essential in crafting a legislative and regulatory framework to address this complex issue and help create a safe space online that protects all Canadians.”
“Freedom of expression is at the core of everything we do, and Canadians should be able to express themselves freely and openly without fear of harm online, and our government is committed to taking the time to get this right,” he said.
The minister also thanked the Citizens Assembly, a group of 45 Canadians looking at the impact of digital technology on democracy, for its advice. At a conference last week, the assembly also stressed the importance of addressing the spread of disinformation online, saying it can manipulate public opinion.
Marie Woolf, The Canadian Press