Canadian voters should be aware that social media platforms are rife with misinformation, fuelled by cheaply purchased bots and repurposed Facebook groups and compounded by the blocking of Canadian news.
Anyone can purchase thousands of bots and deploy them to spread misinformation and disinformation on social media, and no one would know, because the platforms no longer offer transparency or provide data to researchers, media ecosystem experts say.
“The need for greater transparency is so critical in this space,” said Aengus Bridgman, director of the Media Ecosystem Observatory and assistant professor (research) at the Max Bell School of Public Policy at McGill University.
“The lack of visibility and transparency that platforms have into particularly manipulation campaigns is unacceptable in my view,” he said during an episode of the Law Bytes podcast.
The podcast is hosted by Michael Geist, a law professor at the University of Ottawa, where he holds the Canada Research Chair in Internet and E-Commerce Law and is a faculty member of the Centre for Law, Technology and Society.
Mistrust of information on social media platforms is rapidly increasing, Bridgman and other media ecosystem researchers said.
“There’s tonnes of generative AI-generated content on social media,” Bridgman said. “If you don’t know if something was created by a human or not, how do you trust anything that’s said online?”
“It might lead in the very near future to increased distrust of everything online, to the point where people start to disconnect from that [information] being any viable signal of populational attitudes,” he said.
“There’s enough evidence out there now that we know that online communities shape the way we see the world, and one of the things that can happen is the steady erosion of trust,” Bridgman said.
In the U.S., there has been a systematic and continued rejection of scientific evidence and the truth, he said. “We’re seeing it in Canada to a lesser extent as well.”
Trust in traditional media also is down significantly – about 10 points in five years, Bridgman said. “That is a massive shift.”
At the same time, social media use is up enormously, particularly among young people, who use the platforms as their primary source of news and political information.
The American Sunlight Project studied the impact of social media misinformation and disinformation on the 2024 U.S. presidential election.
“We put out a statement saying, ‘Disinformation didn’t win the election, but the normalization of lies did,’” Nina Jankowicz, co-founder and CEO of the organization, said during the Machines Like Us podcast hosted by Taylor Owen, founding director of the Centre for Media, Technology and Democracy at McGill University.
“I think people just don’t care if their politicians are telling the truth or not or if they’re reading truthful information. They want information that makes them feel good,” Jankowicz said.
This is the kind of information that the U.S. is seeing so far in the second Trump administration and it is polarizing Americans, she said.
“I wouldn’t say that people don’t trust anything. They just don’t trust the ‘other’ [information provided by those with different views], which actually is potentially worse [in] enflaming of the tensions.”
Bridgman said the Media Ecosystem Observatory’s research on the 2021 Canadian federal election revealed “rampant misinformation” on social media platforms.
“Anyone who spends time online, you’re just a click and a hop away from accessing misinformation on whatever social media platform you’re on,” he said. “Go look at whatever heavy partisans are saying on any site – it’s rife with misinformation.”
However, the Media Ecosystem Observatory’s research found no evidence that social media influence had a material impact on the outcome of the election, or that it materially affected about a dozen ridings, as was claimed after the election.
“Part of the joy of free speech on platforms is that you get a lot of false and misleading content. This has only increased since the 2021 [federal] election,” Bridgman said.
It’s cheap and easy to produce misinformation on social media platforms
In her recent report on foreign interference in Canada, Justice Marie-Josée Hogue wrote that “information manipulation poses the single biggest risk to our democracy.”
Senior Canadian intelligence officials have predicted that India, China, Pakistan and Russia will all attempt to influence the outcome of this federal election.
Elections Canada has reached out to various social media platforms, including X and TikTok, to address concerns around misinformation.
But the huge challenge during this election is actually determining how much misinformation and disinformation is appearing on social media platforms and who is providing it, the media ecosystem researchers said.
Unlike during the 2021 election, Facebook no longer has an election integrity initiative, a well-resourced trust and safety team, third-party fact-checking, or researcher access to platform data through tools such as CrowdTangle, Bridgman said.
Also, Canadians are unable to access Canadian news on Facebook because Meta blocked it after the federal government passed the Online News Act in June 2023. Yet only about 25 percent of Canadians know that Canadian news is blocked on Facebook and Instagram, Bridgman said.
Meta’s ban has reshaped the media landscape in Canada, with Canadian news outlets losing 85 percent of their engagement on Facebook or Instagram as of August 2024 – an estimated reduction of 11 million views per day, according to research by the Media Ecosystem Observatory.
“In general, people still think they’re getting informed,” Bridgman said. “What they’re reading is other people sharing information.”
Google is continuing to work with the federal government on its concerns about the Online News Act and hasn’t blocked Canadian news on its platform.
The shift to no visibility or transparency started with Elon Musk’s takeover of X, which ended free data access for researchers, reduced its trust and safety teams, and scaled back content moderation, Bridgman said.
“They have taken a pretty hard line on data transparency and the value of moderation versus freedom of speech, and that has sort of cascaded out to the other platforms,” he said.
Now with the current federal election there’s real concern about American influence, he added. “I think that interference possibility from the States is real this time around and something that we’ve really got to watch.”
Jankowicz said the giant U.S. social media companies have for years fought against regulation, not only in the U.S. but in Canada and the European Union.
Now with the second Trump administration “these multi-billionaires have chosen to acquiesce and they have rolled back their content moderation,” she said.
“So what I think [Mark] Zuckerberg and [Elon] Musk and others see in the Trump administration is not just something to fear, but they see an asset,” she added. “Now they have potentially the most powerful lobbyist they could possibly hope for in the U.S. government.”
Trump and U.S. Vice-President J.D. Vance are now bullying Europe and other regulators not to regulate the U.S. social media companies, Jankowicz said. “That’s the landscape. It’s really scary.”
Bridgman agreed, saying: “The fundamental security issue is that we would have no idea if Musk had an army of bots amplifying certain messages on X.”
There are crypto marketplaces where anyone can buy 10,000 bots for about $2,000 and deploy them on social media platforms, he noted. “That’s chump change for somebody involved in politics.”
Ten thousand bots can have a material impact even if only a small subset of them is active, responding to a social media post and amplifying it early in its life, Bridgman said. “It can have a huge impact and it can be hard to detect and hard to understand.”
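To illustrate why that early amplification matters, here is a minimal sketch in Python of the dynamic Bridgman describes. It assumes a simplified “rich-get-richer” feed in which a post’s hourly impressions grow with the engagement it has already accumulated; the model, its parameters and its outputs are illustrative assumptions, not any platform’s actual ranking algorithm or the Observatory’s methodology.

```python
import random

def simulate_post(hours=48, base_impressions=100, organic_rate=0.05,
                  boost_factor=0.01, bot_likes=0, bot_window=2, seed=0):
    """Toy "rich-get-richer" feed model: each hour the post is shown to a
    number of users that grows with the engagement it has already earned.
    Every parameter here is an illustrative assumption, not a real
    platform's ranking algorithm."""
    rng = random.Random(seed)
    engagement = 0
    total_impressions = 0
    for hour in range(hours):
        # Bot likes arrive only in the first few hours after posting.
        if hour < bot_window and bot_likes:
            engagement += bot_likes // bot_window
        # Hourly impressions scale with the engagement accumulated so far.
        impressions = int(base_impressions * (1 + boost_factor * engagement))
        total_impressions += impressions
        # A small fraction of genuine viewers engage with the post.
        engagement += sum(rng.random() < organic_rate for _ in range(impressions))
    return total_impressions, engagement

if __name__ == "__main__":
    print("no bots       :", simulate_post())
    print("50 early bots :", simulate_post(bot_likes=50))
```

In this toy run, 50 bot likes delivered in the first two hours compound through the feedback loop, leaving the boosted post with substantially more total impressions and organic engagement than the identical post left to grow on its own – the hard-to-detect compounding effect Bridgman describes.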
There are also crypto marketplaces where anyone can buy a defunct “aged” Facebook group with 50,000 or more members, make themselves the administrator, and repurpose the group to promote whatever they want, he said.
One current private Facebook group, advocating for Canada to become a 51st state, was once a Hamilton-based Facebook group for buying and selling items. It was bought and is now administered by an American.
“This general challenge of [buying] and rebranding accounts and groups and channels is widespread,” Bridgman said. “There’s just a whole ecosystem where you can purchase this stuff.”
Jankowicz said her American Sunlight Project has been monitoring a content-aggregation operation run by “some guy in Crimea” whose sole purpose is to regurgitate Russian propaganda, translating it into many different languages and spreading it across a network of web domains.
“It’s pumping out at least 3.6 million pieces of content per year,” she said.
The sites where the Russian propaganda appears aren’t made for human readers, she noted. They’re made to be scraped by AI developers training their models, so the propaganda gets incorporated into those models and amplified by them.
More transparency, governance needed for social media platforms
Earlier this month, Canada’s intelligence agencies flagged a disinformation campaign on WeChat aimed at Mark Carney, with posts reaching up to three million views.
Originating from an account tied to the Chinese Communist Party, the posts cast Carney as a “rock-star economist” and lauded his toughness on Trump – praise that officials say masked a calculated attempt to sway Chinese-Canadian voters and stir political blowback.
Meanwhile, AI-generated videos of Carney endorsing cryptocurrency scams made the rounds on Facebook. There was also a doctored image that went viral, which depicted Carney on a beach alongside Tom Hanks and Ghislaine Maxwell, a convicted sex trafficker and Jeffrey Epstein’s former girlfriend, peddling the conspiracy theory that Carney is part of a shadowy global cabal.
Despite the rampant misinformation on social media platforms, Canadian politicians use them to communicate with their constituents who are getting their political news on such platforms, Bridgman said.
X is where Chrystia Freeland announced her resignation from Justin Trudeau’s government and where Trudeau said his final farewell to Canadians. Trudeau’s X handle had 6.6 million followers.
Facebook’s ad registry shows that is where federal political parties are spending their money to reach voters during this campaign, Bridgman said. “Politicians don’t have much choice in the platform they engage on. [They’ve] kind of got to do it.”
Among Liberal and NDP partisans in Canada, research shows usage on X is down about 15 percent, Bridgman said. Conservative partisans are using it at about the same rate.
Engagement with federal Conservative politicians on X has increased by 52 percent since Musk’s takeover, according to research by the Media Ecosystem Observatory. Posts from federal Conservative politicians received 61 percent more engagement than those of Liberal and NDP politicians combined in 2024.
“We know on that platform there’s been a dramatic shift to the right, that right-wing accounts are getting much more engagement and likes,” Bridgman said.
However, researchers no longer have access to data on X and getting such access would cost upwards of US$150,000 per month, he said.
In responding to misinformation on social media platforms, “policing speech doesn’t really work,” Bridgman said. “Any content moderation just drives people to other platforms where they form more insular communities and the rhetoric potentially gets more extreme.”
Social media companies have very successfully avoided having to provide rich data to researchers and there needs to be greater transparency in that space, he said. “The most successful strategy is additional transparency and clarity.”
At the Canadian government level, there’s no process to make public detailed information on interference by foreign or domestic entities so there can be a responsible conversation about it, Bridgman said. “Canadians need to know what goes on in the information space and they should be made aware of it.”
“In the disinformation space in particular, you can’t be super-secretive because being super-secretive means that false and misleading information is allowed to flourish and thrive. It loves a vacuum,” he said.
“Having that additional level of transparency when it doesn’t threaten operational security seems like a no-brainer.”
Bridgman urged governments not to abandon policy efforts to better govern the social media space.
He is part of the Coalition for Information Ecosystem Resilience, a network of research groups across Canada that is working to inform Canadians about bot- and GenAI-driven misinformation and disinformation campaigns. The coalition runs a digital threat tipline where Canadians can report suspected disinformation campaigns.
Canada also has the Critical Election Incident Public Protocol, a panel of the most senior federal public servants, informed by the Security and Intelligence Threats to Elections (SITE) Task Force, whose job is to sound the alarm if they think there has been information manipulation at scale in Canada.
On April 21, the Privy Council Office reported that the SITE Task Force had observed a “transnational repression” operation on social media platforms where Chinese-speaking users in Canada are very active, including Facebook, WeChat, TikTok, RedNote, and Douyin, a sister app of TikTok for the Chinese market.
The operation features a mock “wanted poster” about Joe Tay, the Conservative Party candidate for Don Valley North, as well as disparaging headlines and comments about him.
Tay is known for his opposition to the People's Republic of China's laws and practices in the Hong Kong Special Administrative Region, and is one of six individuals targeted with monetary bounties by Hong Kong Police in December 2024.
The SITE Task Force has raised concerns with the social media platforms and has briefed the security-cleared representative of the Conservative Party about the operation.
In contrast with the monitoring in Canada of foreign interference, Jankowicz said in the U.S. the Trump administration has just defunded the only entity in the State Department that was working on foreign interference because the issue has become so polarized.
Her advice for Canadians is “consume content deliberately and know that you might be under the spell of somebody for whom the truth is not the heart of the matter.”
Bridgman said Canadians need to make smart, intentional choices about how they’re getting their information. They should ask themselves whether the person providing the information is Canadian and actually cares about their community and Canadians, he said.
“When you’re consuming content, especially during an election and you care about your community, you care about your country, just be super-intentional about it.”
R$