Evidence of meeting #116 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A video is available from Parliament.

Also speaking

Ahmed Al-Rawi  Director, The Disinformation Project, Simon Fraser University, As an Individual
Richard Frank  Professor, School of Criminology, Simon Fraser University, As an Individual
Peter Loewen  Director, Munk School of Global Affairs & Public Policy, As an Individual
Clerk of the Committee  Ms. Nancy Vohl

11:05 a.m.

Conservative

The Chair Conservative John Brassard

I'm going to call the meeting to order.

Good morning, everyone. Welcome to meeting number 116 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Pursuant to Standing Order 108(3)(h), the committee is resuming its study of the impact of disinformation and misinformation on the work of parliamentarians.

I'm going to remind everybody about the audio issues. Make sure that when you're not using your earpiece, it's placed on the sticker that's on the desk. Please try to avoid hitting the microphones and try to avoid any feedback, because it can harm our interpreters.

I'm going to call on our witnesses today.

I would like to welcome first, as an individual, Mr. Ahmed Al-Rawi, who is the director of the Disinformation Project at Simon Fraser University.

We also have Richard Frank, who is a professor in the School of Criminology at Simon Fraser University.

As well, we have Mr. Peter Loewen, who is the director of the Munk School of Global Affairs and Public Policy.

I want to welcome all three of you to our committee today on this important study.

We are going to start with Mr. Al-Rawi.

You have up to five minutes to address the committee, sir. Please go ahead.

11:05 a.m.

Dr. Ahmed Al-Rawi Director, The Disinformation Project, Simon Fraser University, As an Individual

Thank you very much.

Dear honourable MPs and committee members, thank you for the invitation to address the committee and talk about the impact of disinformation on the work of parliamentarians. I will rely here on my previous academic research on the topic.

I think there are different internal and external challenges.

First, with respect to internal challenges, there is no doubt that Canadian politicians need to be continuously and factually informed about many national and international events and issues. Believing and spreading disinformation could create a serious obstacle to understanding these events, and the result could ultimately harm democracy.

It's important here to develop adequate verification skills and methods that largely rely on scientific consensus and collective intelligence about various issues. This is a fluid issue, because such consensus could change with time, depending on the emerging empirical evidence. Parliamentarians have to feel more comfortable navigating uncertainty.

Of course, there need to be thorough efforts to find factual pieces of information by examining different and alternative credible sources, assessing a variety of angles and reading beyond the news reports themselves. This verification needs to be done even if the information comes from Canada's allies, such as the Five Eyes.

The 2003 U.S. invasion of Iraq is just one example of how disinformation about Iraq's alleged link to al Qaeda or possession of weapons of mass destruction led to disastrous outcomes, not only for the country's infrastructure but also for millions of its people.

More importantly, disinformation today has become a highly politicized and weaponized issue. Media literacy is not the magic key to counter it. This is because some very media-literate political actors have themselves a vested interest in spreading disinformation to serve their own political agenda.

In addition, there are external challenges when it comes to disinformation targeting parliamentarians. In my research about foreign actors targeting Canada on social media, for example, I found ample evidence of many foreign states' disinformation campaigns that were especially directed at Canadian politicians.

For example, when it came to targeting parliamentarians, Saudi actors were slightly more active and negative, followed by Russian, Iranian and Chinese actors. As for the overall amount of disinformation targeting Canadians in general, Russian actors were the most interested in spreading disinformation, followed by Iran, China and Saudi Arabia.

Ideologically, Russian-affiliated actors continuously attacked Mr. Justin Trudeau and his Liberal Party, focusing specifically on MPs from Muslim backgrounds. These actors mostly aligned themselves with the far right in Canada in terms of attacking minorities, especially Muslims and, to a lesser degree, LGBT communities.

As for Iranian actors, they focused their attacks on the Conservative Party, as well as Canadian MPs of Iranian origin who are critical of the regime.

The Saudi and Chinese trolls also attacked Mr. Trudeau, mainly due to the presence and activities of some critical human rights activists in Canada.

Other actors that are involved in spreading disinformation and that often target Canadians in general include extremist groups and wealthy elites, some of whom employ front groups and organizations to cause confusion about how we perceive reality.

For example, the oil and gas industry and the vaping industry in Canada are active in doing so.

The polarized public can also be part of these information activities. In my research, I often saw them target the intersectional identities of racialized Canadian politicians, especially those from minority backgrounds.

To mitigate the problem of disinformation, I suggest creating a non-partisan fact-checking initiative at the House of Commons, consisting of a variety of experts. The initiative needs to focus exclusively on fact-checking the evidence behind various claims, rather than on assessing opinions.

Thank you very much.

11:10 a.m.

Conservative

The Chair Conservative John Brassard

Thank you for that, Mr. Al-Rawi.

I understand that you shared with the clerk some examples of what you deem misinformation and disinformation. I'm just letting you know that the information has been sent to translation. I expect that we could have it by the end of today's meeting. I'll certainly share that information.

Go ahead, sir. I see your hand is up.

11:10 a.m.

Director, The Disinformation Project, Simon Fraser University, As an Individual

Dr. Ahmed Al-Rawi

These are just a few examples for the committee to look at.

11:10 a.m.

Conservative

The Chair Conservative John Brassard

Yes. I appreciate that. They're in translation.

Mr. Frank, we will go to you next. You have up to five minutes to address the committee.

Go ahead, sir.

11:10 a.m.

Dr. Richard Frank Professor, School of Criminology, Simon Fraser University, As an Individual

Thank you very much for the chance to be involved in this. We've been doing lots of research on this. I find it to be a very, very serious threat.

Over the years, information has been used as a weapon to target whoever the opponent is, but we used to call it propaganda. I perceive disinformation as digital propaganda that reaches us through SMS, text, blogs, Twitter, Facebook and so on. Very much unlike propaganda, though, which was met with counter-propaganda, right now I think we essentially don't have any defences, any equivalent counter-disinformation, to defend us.

Up until very recently, disinformation was seen as “just posts on social media”, and as quite harmless. Any general user reading it would not see the coordinated effort behind disinformation or the specific intent behind it. This makes it really difficult and tricky to identify. At least with propaganda, we saw the leaflets being dropped from the sky or the messages being broadcast through megaphones. We could recognize it as propaganda, or we would have an idea of the source and the intent, whereas with online information, the source and the intent are quite often hidden and obfuscated.

It does have real-world consequences. The Trump election was shown to have Russian influence. Brexit also allegedly had foreign influence. These are humongously big, drastic changes.

I'll pick on Russia for a bit. Russia did this through troll farms, creating thousands of social media accounts that looked to be ordinary users. These accounts supported radical political groups for specific political reasons. They fabricated articles, invented stories and posted nonsense. Quite often they even posted the truth, but with a twist, aiming at vulnerable groups who then got riled up. These fake accounts can have a great many followers, so they look established and real.

This was done in an organized fashion in a state-run campaign. The Internet Research Agency, as an example, had hundreds of employees. They had 12-hour shifts, from 9 a.m. to 9 p.m. and from 9 p.m. to 9 a.m. These shifts overlapped with U.S. holidays and working hours, so they looked real. With a budget of about $600,000 Canadian a month, which might seem like a lot, they were able to achieve real impact abroad. Compared with a military intervention, $600,000 a month is negligible.

Disinformation or any such content is designed to spread. A study in 2016 showed that this content does spread six times faster than real news. There's an old proverb that says a lie goes halfway around the world before the truth gets its boots on. That's very much true here as well.

Another study in 2019 showed that over 70 countries had such disinformation campaigns. Facebook was the number one platform for this. Canada is not immune to any of this. We've had election interference. One MP in B.C. lost an election specifically because of disinformation. We are under attack. Disinformation is promoting the superiority of foreign countries and undermining confidence in our democracies, etc.

It's now been six years since we started working on disinformation detection, specifically looking at training computer models to detect this type of content. We've done about four or five projects specifically on this, funded by the Government of Canada. The end goal is to detect these disinformation campaigns with artificial intelligence. Our models show that we can detect this content with about 90% accuracy, so we know that this is doable.

Back in January of 2022, we were asked by our project funders—the Canadian Armed Forces, at that point—to study Russian online activity to see what their stance was with respect to Ukraine.

We submitted our findings on February 13, 11 days before the war started, essentially saying that Russia is painting itself as the victim and that it's taking steps to defend itself, and that NATO, the European Union, the U.S. and other western nations are aggressors against Russia. Eleven days later they attacked Ukraine.

This plan to attack—not this specific plan, but the intent to attack—was seen online beforehand.

11:15 a.m.

Conservative

The Chair Conservative John Brassard

Mr. Frank, I'm sorry, sir—

11:15 a.m.

Professor, School of Criminology, Simon Fraser University, As an Individual

Dr. Richard Frank

All of this content is hidden in a lot of innocuous information—soccer scores, TV shows—so it is hard to detect.

The solutions have to be community-specific. The exact same message can be safe in one community but a trigger in another.

11:15 a.m.

Conservative

The Chair Conservative John Brassard

Sir, I'm going to have to cut you off there. You're over five minutes.

I'm sure members will have lots of questions to ask.

I really hate this part of my job. I really was enjoying what you were saying, but we have to stay on time.

Mr. Loewen, you have up to five minutes to address the committee. Go ahead, sir.

11:15 a.m.

Peter Loewen Director, Munk School of Global Affairs & Public Policy, As an Individual

Thank you very much to the committee for this invitation to appear, and thank you, each and every one of you, for the irreplaceable work you do as members of Parliament.

It is a real pleasure, and it's an honour as well, to be with you today to talk about the role of misinformation and disinformation in your work as members of Parliament.

I know you will hear from a large number of witnesses, so I hope that I can make a few helpful observations in addition to what's been said and what will be said.

I come at this question, I'll just say, with two relevant sets of knowledge.

I'm first and foremost a professor. I have, for several Canadian elections, been conducting large-scale surveys to enable academic studies of how our democracy functions. Along with partners at McGill University, my lab at the U of T has been a leading collector of data on the media ecosystem or the information ecosystem in Canada, which is that combination of what's being said and what's being believed, and what exists in the media and in the minds of Canadians. We've also recently conducted a global study on attitudes towards artificial intelligence, which is relevant for the management of misinformation and for platform governance.

Second, in addition to my academic work, I have worked as an expert witness for the Government of Canada in its unsuccessful attempts to defend changes to the Canada Elections Act that would prohibit the spreading of falsehoods about candidates' biographies. You may recall that this case was heard in 2020.

I'd like to draw from these two sets of experiences to make five brief points about the relationship between misinformation and disinformation and your work as members of Parliament.

The first point is that misinformation and disinformation have always been a part of our elections. For as long as we've been having elections, individuals and groups have been spreading falsehoods about candidates, about parties, about what they believe and about what they'll do in office.

Second, we know very little about the actual effects of misinformation and disinformation, so it becomes very hard to make concrete, empirical claims about it, but even if misinformation and disinformation have little potential effect, they still matter normatively to the quality of our elections.

Third—and this is to that point—we need to separate the effects of disinformation on voters from its effect on the integrity of our elections. Elections are largely about giving voters reasons for their decisions. If voters are voting based on misinformation, it is damaging, even if it doesn't change the way they would have voted absent that disinformation.

For example, if a voter comes to the view that they are going to vote for the government for reasons that aren't true, that decision by the voter is arguably of less democratic quality than if they're voting for the government for reasons that are true, and likewise for a vote for any other party.

Similarly, if MPs believe that they've won on the backs of misinformation or if they believe that other MPs have won on the backs of misinformation and disinformation, especially that which may have come from foreign governments, then that can seriously erode not only trust in our democracy but also trust between MPs. I presume, though I've never been inside a caucus, that it can erode the functioning of caucuses.

Fourth, Canada is perhaps uniquely poorly positioned to address the online spreading of misinformation and disinformation. It's quite clear that our legal regime makes it very difficult to prohibit the spreading of falsehoods during elections, absent an explicit demonstration of intent and knowledge that the information is false. Also, we don't have a sufficiently high amount of public trust to address platform regulation. Canadians, when you compare them with other citizens globally, don't view technology companies as partners in addressing these problems, and they are at the same time skeptical of government's capacity to regulate them, as well.

Fifth—and I say this with some sensitivity—members of Parliament and candidates' offices can be sources of misinformation and disinformation. It's important, then, to make sure we have norms, practices and standards that make this unacceptable. Election candidates have incentives to spread misinformation and disinformation about their opponents and about the electoral process. We have to look inside to ask what we can do to stop that as well.

If there are two takeaways from all of this, it's that we first need to understand the extent and the effects of misinformation and disinformation much more carefully, and that it is on Canadians, and especially our political actors, to take seriously the maintenance of the integrity of our elections.

Thank you very much.

11:20 a.m.

Conservative

The Chair Conservative John Brassard

I want to thank all three of you for your opening statements.

We're going to go to our six-minute rounds, starting with Mr. Barrett.

Go ahead, sir.

11:20 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

My questions are for you, Mr. Loewen.

In what ways does the Communist dictatorship in Beijing use misinformation to influence the Chinese diaspora community here in Canada?

11:20 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

Peter Loewen

This is not an area in which I have precise expertise, so I'm not going to take long to say it.

The most effective way that this can happen, as I've seen it, is by inserting into the ecosystem ideas about what political candidates would do or what parties would do. Then it allows individuals who are interested in politics and like to talk about it to spread those ideas. Think about it as an infection and a virus that spreads. That's an effective mental model for understanding how the CCP wants to influence voters' views during elections.

11:20 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Based on that description, do you believe that the dictatorship in Beijing has been successful in its misinformation campaigns in Canada?

11:20 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

Peter Loewen

This is a very difficult question to answer, in my view, Mr. Barrett. I appreciate your asking it.

I'll just say it quickly on two levels.

Suppose that the Communist Party of China has spread misinformation about the positions of parties or voters in ways that are untrue and damaging, or perhaps the claims about positions are true, but they've spread and amplified those ideas. That may have had the effect of changing voters' views, including the views of Chinese-Canadian voters. Empirically, it's very hard to say.

Even if it didn't, Mr. Barrett, the potentially equal effect is that we've spent all of this time wondering if the integrity of our elections has been disrupted. That is something that non-democratic regimes want us to do. They want us to wonder whether the integrity of our elections has been corrupted. The only thing to do then is take that possibility very seriously.

11:20 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Along with misinformation, we heard in testimony at this committee from the Chinese diaspora community that Beijing's campaign of influence and interference goes beyond disinformation. It extends to threats targeting the well-being of members of that community, especially of family members who might be in mainland China.

Is that tactic something that's typical of foreign state actors—pairing their online campaigns with real-world threats and targeting?

11:25 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

Peter Loewen

I don't have the expertise to comment on that.

11:25 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Are you familiar with Justice Hogue's report, the interim report that was published?

11:25 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

11:25 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

I have a couple of quick questions with respect to that.

On April 11, the Prime Minister said that “It wasn't simply that overall the election was free and fair”, but that in “every single constituency election...the election integrity held, and it was free and fair.”

Justice Hogue's conclusions indicated otherwise. She concluded that there was well-grounded suspicion about the PRC interference in, for example, Don Valley North, and “It could...have impacted who was elected to Parliament. This is significant.”

Justice Hogue further concluded that in Steveston—Richmond East, “there are strong indicators of PRC involvement and there is a reasonable possibility that these narratives could have impacted the result in this riding.”

We have an independent justice who has issued an interim report and pronounced on this issue, and we have the head of government, the Prime Minister, saying something different. What are we to take from that when we're looking at the upside for one individual, in this case the Prime Minister, to take an interpretation that we would say is far too generous? It could be perceived as being misinformation.

11:25 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

Peter Loewen

Mr. Barrett, I read the Prime Minister's statement as definitive. I read the statement of Justice Hogue as being one with uncertainty, one in which she's saying that we don't know, but it's possible.

As a person who's spent the better part of 15 years as a practising academic studying elections very closely and trying to figure out why some ridings are won and some ridings are lost, I'll tell you that Justice Hogue has the correct position. We cannot be sure that each and every riding in Canada in the 2021 election was not influenced by China. For the Prime Minister to say that he's absolutely certain that Chinese influence had no effect is not a sustainable position.

11:25 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

What does that say, then, to Canadians who are members of certain diaspora communities, but also to Canadians writ large, when the head of government is asserting something that there is no certainty about, as you inferred from Justice Hogue's interim report?

11:25 a.m.

Conservative

The Chair Conservative John Brassard

You have a 30-second response, Mr. Loewen.

11:25 a.m.

Director, Munk School of Global Affairs & Public Policy, As an Individual

Peter Loewen

I think that's largely a political question. I don't mean to dodge it, but what that says about the Prime Minister and his judgment is for Canadians to decide.

11:25 a.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

I'd say, with the remaining 15 seconds, that it's a political decision and not one that's in the best interest of Canadians.

Thanks.