An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani Liberal

Status

Second reading (House), as of Feb. 26, 2024

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:45 p.m.

Conservative

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading.

I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because pornography allows for the idea that this could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would change this in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious impacts of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown of exploitative intimate content shared without consent or of child sexual abuse material. In fact, a bill that was solely focused on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It is completely missing the point.

The 24-hour takedown has effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative upstream approach. While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place.

As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted.

I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online.

My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexually exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.”

For years, pornography companies have been just turning a blind eye, and it is time to end that. Bill C-270 is a fulfillment of a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support from all of my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:40 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, which is an excellent bill from my colleague from Peace River—Westlock, who has been working so hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of substantively already existing legal provisions that prohibit non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but there are not the appropriate legal mechanisms to prevent the distribution of this material by, for instance, large pornography websites.

It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent the presence on its platform of non-consensual and child-depicting pornographic images. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock has played a central role, as have members from other parties, in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require that there be verification of age and consent associated with images that are distributed. It is a common-sense legal change that would require and effect greater compliance with existing criminal prohibitions on the creation of these images. It is based on the evidence heard at committee and based on the reality that major pornography websites, many of which are headquartered in Canada, are continuing to allow this material to exist. To clarify, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and stronger legal tools to protect people who are victims of the non-consensual sharing of their images.

Further, in response to the recognition of the potential harms to children associated with exposure to pornography or with having images taken of them and published online, there has been discussion in Parliament, and a number of different bills have been put forward to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210.

Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210.

Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent.

This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse.

In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is kind of a bizarre combination of talking about issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about.

The freedom of speech issues raised by the bill have been well discussed. The government has been denounced by a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model.

This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:30 p.m.

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes.

We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own.

The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's or many other people's nightmare. Basically, that is what Bill C-270 seeks to prevent, what it seeks to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. This bill seeks to criminalize the fact that, under the guise of legality, some of the images that are being viewed were taken or are being used illegally.

I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it.

By way of context for this bill and two others being studied, Bill S‑210 and Bill C‑63, it was a newspaper article that sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children never gave their consent to be filmed or for their video to be shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. The images and testimonies received by the committee members were so shocking that several bills that I mentioned earlier were introduced to try to tackle the issue in whole or in part.

I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said.

I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public.

When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts.

I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue.

In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can.

What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible.

Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for it and does not require it, they are at fault.

We must also think about some of the terms, such as “privacy” and “education”, but also the definition of “distributor”, because Bill C-270 focuses primarily on distributors for commercial purposes. However, there are other distributors who are not in this for commercial purposes. That is not nearly as pretty. I believe we need to think about that aspect. Perhaps legal consumers of pornography would like to see their rights protected.

I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:20 p.m.

Etobicoke—Lakeshore Ontario

Liberal

James Maloney Liberal Parliamentary Secretary to the Minister of Justice and Attorney General of Canada

Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading.

I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution.

As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”.

This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation.

I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations. I have some concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to the committee stage, where we can hear from experts on whether this bill would be useful in combatting child pornography.

The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic.

Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified.

Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect.

I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability.

I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent.

Social media services would be subjected to three duties: to act responsibly, to protect children and to make content inaccessible that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent.

These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports.

Bill C-63 would also create a new digital safety commission to administer this regulatory framework and to improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties.

As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography.

All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly.

I look forward to further studying such an important bill at committee.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 5:50 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, as the member for Shefford and the Bloc Québécois critic for the status of women, I want to say that we support Bill C-270 in principle. We would like to examine this bill in committee. The Bloc Québécois fully supports the bill's stated objective, which is to combat child pornography and the distribution and commercialization of non-consensual pornography.

Since the first warning about the tragedy of women and girls whose sexual exploitation is the source of profits for major online porn companies, the Bloc Québécois has been involved at every stage and at all times in the public process to expose the extent of this public problem, which goes to our core values, including the right to dignity, safety and equality.

On this subject of online sexual exploitation, as on all facets and forms of the sexual exploitation of women, we want to stand as allies not only of the victims, but also of all the women who are taking action to combat violence and exploitation. I will begin by giving a little background on the topic, then I will explain the bill and, in closing, I will expand on some of the other problems that exist in Canada.

First, let us not forget that the public was alerted to the presence of non-consensual child pornography by an article that was published in the New York Times on December 4, 2020. The article reported the poignant story of 14-year-old Serena K. Fleites. Explicit videos of her were posted on the website Pornhub without her consent.

This Parliament has already heard the devastating, distressing and appalling testimony of young Serena, which helped us understand the sensitive nature and gravity of the issue, but also the perverse mechanisms that porn streaming platforms use to get rich by exploiting the flaws of a technological system that, far from successfully controlling the content that is broadcast, is built and designed to promote and yet conceal the criminal practices of sexual exploitation.

Reports regarding the presence of child sexual abuse material and other non-consensual content on the adult platform Pornhub led the Standing Committee on Access to Information, Privacy and Ethics to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. My colleague from Laurentides—Labelle has followed this issue closely.

The committee noted that these platforms' content moderation practices had failed to protect privacy and reputation and had failed to prevent child sexual abuse material from being uploaded, despite statements by representatives of MindGeek and Pornhub who testified before the committee.

That same committee looked at regulating adult sites and online pornography, without challenging the legality. The committee heard testimony from survivors, critics of MindGeek's practices, child protection organizations, members of law enforcement, the federal government, academics, experts and support organizations, and it received many briefs.

The Standing Committee on Access to Information, Privacy and Ethics made 14 recommendations regarding the problems it had studied. The committee's 2021 report was clear and it recommended that the government introduce a bill to create a new regulator to ensure that online platforms remove harmful content, including depictions of child sexual exploitation and non-consensual images.

We know that sexually explicit content is being uploaded to Pornhub without the consent of the individuals involved, including minors, and that these individuals have tried and failed to get Pornhub to remove that content. We know that these survivors have been traumatized and harassed and that most of them have thought about suicide. That is the type of testimony that we heard at the Standing Committee on the Status of Women with regard to cases of sexual exploitation.

We know that even if content is finally removed, users just re-upload it shortly afterward. We know that the corporate structure of MindGeek, which was renamed Aylo last August, is the quintessential model for avoiding accountability, transparency and liability. We know that investigations are under way and that there has been a surge in online child sexual exploitation reports.

We must now legislate to respond to these crimes and deal with these problems. We also need to keep in mind the magnitude of the criminal allegations and the misconduct of which these companies are accused. Just recently, a new class action lawsuit was filed in the United States against MindGeek and many of the sites it owns, including Pornhub, over allegations of sex trafficking involving tens of thousands of children.

Let us not forget that these companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of. The international community is well aware of this, and it reflects poorly on us. For these reasons, we have an additional obligation to take action, to find solutions that will put an end to sexual exploitation, and to implement those solutions through legislation.

With that in mind, we must use the following questions to guide our thinking. Are legislative proposals on this subject putting forward the right solutions? Will they be effective at controlling online sexual exploitation and, specifically, preventing the distribution of non-consensual content and pornographic content involving minors?

Second, let us talk a little more about Bill C‑270. This bill forces producers of pornographic material to obtain the consent of individuals and to ensure that they are of age. In addition, distributors will have to obtain written confirmation from producers that the individuals' consent has been obtained and that they are of age before the material is distributed. These new Criminal Code provisions will require large platforms and producers to have a process for verifying individuals' age and consent, without which they will be subject to fines or imprisonment.

The House will be considering two bills simultaneously. The first is Bill C-270, from the member for Peace River—Westlock, with whom I co-chair the All-Party Parliamentary Group to End Modern Slavery and Human Trafficking. The second is Bill C-63, introduced by the Minister of Justice, which also enacts new online harms legislation and aims to combat the sexual victimization of children and to make intimate content communicated without consent inaccessible.

We will need to achieve our goals, which are to combat all forms of online sexual exploitation and violence, stop the distribution and marketing of all pornographic material involving minors, prevent and prohibit the distribution of explicit non-consensual content, force adult content companies and platforms to control the distribution of such content, and make them accountable and criminally responsible for the presence of such content on their online platforms.

There is a debate about the law's ability to make platforms accountable for hosted content. It also raises questions about the relevance of self-regulation in the pornography industry.

Third, let us talk about what we can do here. Due to the high volume of complaints it receives, the RCMP often reacts to matters relating to child sexual abuse material, or CSAM, rather than acting proactively to prevent them. Canada's criminal legislation prohibits child pornography, but also other behaviours aimed at facilitating the commission of a sexual offence against a minor. It prohibits voyeurism and the non-consensual distribution of intimate images. Other offences of general application such as criminal harassment and human trafficking may also apply depending on the circumstances.

In closing, I will provide a few figures to illustrate the scope of this problem. Between 2014 and 2022, there were 15,630 incidents of police-reported online sexual offences against children and 45,816 incidents of online child pornography. The overall rate of police-reported online child sexual exploitation incidents has also risen since 2014. The rate of online child pornography increased 290% between 2014 and 2022. Girls were overrepresented as victims for all offence types over that nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17, who accounted for 71% of victims.

Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all child and youth victims, 97% to be exact, from 2015 to 2022 were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Overall, nine in 10 accused persons, or 90%, were youth aged 12 to 17. For one-third of youth victims, or 33%, a casual acquaintance had shared the victim's intimate images with others.

Here is a quote from the Montreal Council of Women: “On behalf of the members of the Montreal Council of Women, I wish to confirm our profound concern for those whose lives have been turned upside down by the involuntary and/or non-consensual sharing of their images on websites and other platforms such as the Montreal-based Pornhub. The ‘stopping Internet sexual exploitation act’ will make much-needed amendments to the Criminal Code to protect children and those who have not given consent for their images and other content to be shared and commercialized.”

We must act. It is a question of safety for our women and girls. Young women and girls are depending on it.

April 30th, 2024 / 12:20 p.m.

Associate Professor of Journalism, Media School, UQAM, As an Individual

Patrick White

Canada is already working hard with what it did with Bill C-18 and Bill C-11 for Canadian content, and with Bill C-63 it's going to fight misinformation and harmful content as well. Are we doing enough? Probably not, but AI is an opportunity as well as a threat.

As far as deepfakes are concerned, I would strongly urge the government to legislate on that matter within the next 12 to 18 months, especially on deepfake videos and deepfake audio, as well, which you mentioned.

We have a lot to work on in the next 12 months on that issue, taking into context the upcoming federal election in Canada.

April 30th, 2024 / 12:05 p.m.

Patrick White Associate Professor of Journalism, Media School, UQAM, As an Individual

Good afternoon, everyone.

I'd like to thank the committee members for the invitation.

I've been a journalist since 1990 and a professor of journalism at Université du Québec à Montréal for five years.

I believe that 2024 represents a crossroads for disinformation and misinformation. Content automation has proliferated with the launch of the ChatGPT 3.5 AI chatbot in 2022. Not only that, but a Massachusetts Institute of Technology study published in 2018 shows that false news has been circulating six times faster on Twitter than fact-checked news. That's cause for concern.

Things have gotten worse on X, formerly called Twitter, over the past 18 months since it was taken over by businessman Elon Musk. Several changes have contributed, including the possibility of acquiring a blue checkmark, meaning verified status, simply by paying a few dollars a month, and the reinstatement of accounts like the one held by former U.S. President Trump, who is himself a major vector of disinformation.

These social network algorithms clearly promote content that generates the most traffic, meaning comments, “likes” and sharing, which amplifies the spread of extreme ideas that we've been seeing in recent years.

One current concern is Meta's blocking of news on Facebook and Instagram in Canada since the summer of 2023, which further fuels the growth of disinformation and misinformation by suppressing news from Canadian media, except for sports and cultural news.

A recently published study that was quoted by Reuters says:

comments and shares of what it categorised as “unreliable” sources climbed to 6.9% in Canada in the 90 days after the ban, compared to 2.2% in the 90 days before.

On the political side of things, I believe efforts should be made to get the news back on Facebook and Instagram by the end of 2024, before Canada's federal elections. The repercussions of this disinformation are political. For example, on Instagram, you now have to click on a tab to see political publications. They've been purposely blocked or restricted by Meta for several months now. The experience is unpleasant for Canadians on Facebook, because more and more content of interest to them from major Canadian media outlets is being replaced by junk news. This reduces the scope of what people are seeing, is harmful to democracy, and also leads to less traffic on news sites. According to a recently published study from McGill University, to which our colleague who testified earlier contributed, news is being replaced by memes on Facebook. It reports the disappearance of five million to eight million views per day of informational content in Canada.

The Canadian government will also have to take rapid action on the issue of artificial intelligence by prohibiting the dissemination of AI-generated content, like deepfake images and audio. Bill C-63 is a partial response to harmful content, but it doesn't go far enough. More transparency is needed with respect to AI-generated content.

Oversight is also urgently needed for intellectual property. The Montreal newspaper Le Devoir ran an article about that this morning. What are the boundaries? I encourage you to quickly develop legislation to address this issue, rather than wait 30 years, as was the case for Bill C-11.

Canadian parliamentarians also need to declare war on content farms that produce false news on request about our country and other countries. Foreign governments like China's and Russia's often use that strategy. We mustn't forget that 140 million people were exposed to false news in the United States during the 2020 election. That's clearly very troubling in view of the coming U.S. election this fall. I am also amazed that Canada has been allowing the Chinese Communist Party to continue spreading propaganda press releases on the Canadian Cision newswire for years.

To conclude, I'll be happy to answer your questions. Canada needs to be on a war footing against disinformation, whether generated by artificial intelligence or manually. Stricter rules are required for generative artificial intelligence and for the protection of intellectual property owned by Canadian media and artists, who should be benefiting from these technological advances over the coming years.

Thank you.

April 29th, 2024 / 11:20 a.m.

Senior Assistant Deputy Minister, Strategy and Innovation Policy Sector, Department of Industry

Mark Schaan

At this time, no federal legislation defines the age of minority or majority. The only age defined is the voting age, which is set at 18. However, that has nothing to do with the concept of majority.

Bill C‑63 on harmful content online is currently proposing that the age of majority be set at 18 in the digital world.

That said, right now only the provinces and territories, based on vital statistics, determine the age of majority and minority in Canada.

Public Safety (Adjournment Proceedings)

April 18th, 2024 / 6:30 p.m.

Whitby Ontario

Liberal

Ryan Turnbull Liberal Parliamentary Secretary to the Minister of Innovation

Madam Speaker, the member certainly could consider supporting the government's online harms bill, which I think is a major piece of legislation that certainly will help to protect minors and children when they are interacting online.

I appreciate this opportunity to speak about the ongoing threat of extortion in Canada. The Government of Canada is deeply concerned about Canadians who are victimized by acts of extortion and related violence. The Government of Canada is aware of growing concerns related to extortion across the country and, indeed, the government has heard directly from the mayors of Surrey, British Columbia; Edmonton, Alberta; and Brampton, Ontario, about how this is impacting their communities.

The recent increase in the number and severity of extortion attempts, particularly those targeting members of Canada's South Asian community, is alarming. The Government of Canada and the RCMP encourage anyone experiencing or witnessing extortion to report it to their local police of jurisdiction and discourage anyone from complying with demands for money.

Rest assured, the Government of Canada is committed to protecting the safety of Canadians and Canadian interests against these threats. We are taking concrete action to protect all affected communities across Canada.

As Canada's national police force, the Royal Canadian Mounted Police is mandated to prevent, detect and investigate serious organized crime, in order to protect Canadians and Canadian interests. In doing so, the RCMP works closely with domestic and international law enforcement partners to share information and target shared threats. The RCMP and its law enforcement partners across the country have observed an increase in the number of extortion crimes taking place and are working collaboratively to investigate these incidents.

While the RCMP cannot comment on specific investigations, I can confirm that significant coordination is under way across the country to address similar types of extortion attempts directed at the South Asian communities in British Columbia, Alberta and Ontario. While many investigations remain ongoing, a number of arrests have been made, and information sharing across agencies, I would say, is imperative, as coordinated efforts are under way to identify cases that may be related to one another.

To this end, the RCMP is actively sharing information with local law enforcement to support their ongoing efforts.

Rest assured, law enforcement agencies across the country are utilizing the required tools and resources to combat these serious incidents in order to keep Canadians safe.

Financial Statement of Minister of Finance (The Budget, Government Orders)

April 18th, 2024 / 3:30 p.m.

Parkdale—High Park Ontario

Liberal

Arif Virani Liberal Minister of Justice and Attorney General of Canada

Mr. Speaker, I rise today to address budget 2024. I propose to deliver my remarks in two contexts: first, to address how this budget resonates with the residents whom I am privileged to represent in Parkdale—High Park in Toronto; second, to look more largely at some of the very important components that relate to the administration of justice in this country and are touched on in this budget document.

I am proud to have represented, for almost nine years now, the constituents in Parkdale—High Park. What those constituents have talked to me repeatedly about is the need to address housing. In budget 2024, we find some very key provisions that relate to housing. I cannot list them all, but some deal with the pressing issue of building more housing, increasing housing supply. That is fundamental in terms of what we are trying to do as a government, and it is empowered and advanced by this important budget document. What I am speaking of here is, for example, $15 billion in additional contributions to Canada's apartment construction loan program, which will help to build more than 30,000 additional new homes.

What I also take a lot of pride in is the fact that we are addressing the acute needs of renters. I say that in two respects. This budget document outlines, for example, how renters can be empowered to get to the point of home ownership by virtue of having a proper rental payment history. This can contribute to building up one's creditworthiness with credit rating agencies; when the time comes to actually apply for a mortgage, one will have built up that creditworthiness by demonstrating that one has made regular rent payments over a period of years. This is truly empowering for the renters in my community and communities right around the country. I have already heard that feedback from the renters whom I represent.

Lastly, I would simply point out what we are doing with respect to the tenants' bill of rights. This is a really important document that talks about ensuring that tenants have rights they can vindicate, including in front of tribunals and, potentially, courts of law. We are coupling that with a $15-million investment that would empower and unlock advocates who assist those renters. That is fundamental. In that respect, it actually relates to the two hats that I wear in this chamber, in both my roles as a representative of individual renters and as Minister of Justice.

Another component that my constituents have been speaking to me about regularly since 2015 is our commitment to advancing meaningful reconciliation with indigenous peoples. Again, this document has a number of components that relate to indigenous peoples in budget 2024. There are two that I would highlight for the purpose of these remarks. First, there is the idea about what we are doing to settle litigation against indigenous peoples and ensure that we are proceeding on a better and more conciliatory path forward. We talk about a $23-billion settlement with respect to indigenous groups who are litigating discriminatory underfunding of children and child family services and the fact that this historic settlement was ratified by the federal court. That is critical.

Second, in this document we also talk about funding a project that is near and dear to my heart. Why do I say that? It is because, in 2017, I had the privilege of serving as the parliamentary secretary to the Minister of Heritage. At that time, I helped to co-develop, along with Métis, first nations and Inuit leaders, the legislation that has now become the Indigenous Languages Act. That is coupled with an indigenous languages commission. In this very budget document, we talk about $225 million to ensure the continued success of that commission and the important work it is doing to promote, enhance and revitalize indigenous languages in this country.

Those are fundamental investments. I think it is really important to highlight them in the context of this discussion.

I would also highlight that my riding, I am proud to say, is full of a lot of people who care about women. They care about feminism; they care about social and economic policies that empower women. I would highlight just two. First of all, we talk about pharmacare in this budget. The first volley of pharmaceutical products that will be covered includes contraceptive devices that would assist, as I understand it, as many as nine million Canadians through access to contraception. This would allow women, particularly young women and older women, to ensure that they have control over their reproductive function. That is fundamental to me as a representative, and it is fundamental to our government and what our government prioritizes in this country. I would also say that, with $10-a-day child care, there are affordable and robust means of ensuring that people's children are looked after in this country; that empowers women to do such things as participate in the workforce.

What I am speaking about here is that we are hitting levels of women's participation in the workforce that have never been seen before, with women's labour force participation of 85.4%. That is an incredible social policy that is translating into a terrific economic policy.

We can also talk about the $6.1-billion Canada disability benefit. I am proud to say that the constituents of Parkdale—High Park care meaningfully about inclusive policies, policies that alleviate poverty and are addressed to those who are vulnerable and those who are in need. People have been asking me about the disability benefit, including when we will see it and when it will come to the fore. We are seeing it right now with this document. The very document that we will be voting on in this chamber includes a $6.1-billion funding model to empower Canadians who are disabled and to ensure that we are addressing their needs.

This budget also represents a bit of a catch-up, meaning that we are catching up to the rest of the G7. Until this budget was delivered, we remained the only G7 country in the world not to have a national school food program. It goes without saying that not a single one of the 338 members privileged to serve in this House would think it is good for a child to arrive at school hungry, in any of their communities or in this country as a whole. I do not think this is a partisan statement whatsoever. We would acutely address child hunger. Through a national school food program, we would ensure that children do not arrive at school hungry, which would impede their productivity and certainly limit their education. Through a $1-billion investment, we would cure school poverty and school hunger.

We are also introducing legislation to reduce cellphone and banking fees, which is fundamental.

With respect to the hat I wear as Minister of Justice, which I have done for about eight months, I firmly believe that one of my pivotal roles is ensuring access to justice. I would say that this document really rings true to the commitment that I have personally and that our government and the Prime Minister have to this. Here, I am speaking about the notion of our commitment to legal aid. Legal aid has multiple components, but it is fundamental to ensuring that people can have their rights vindicated with the assistance of counsel. This helps address things such as court backlogs and court delays; it is also fundamental for the individual litigants before the courts. There is a criminal legal aid package in this budget that includes $440 million over five years.

There is also immigration and refugee legal aid. Unfortunately, since the provinces have wholesale resiled from their involvement in this portfolio, since 2019, we have been stepping in with annual funding. We are making that funding no longer simply annual; we are projecting it over a five-year term, which gives certainty and predictability to the people who rely on immigration and refugee legal aid, to the tune of $273 million. That is fundamental.

Members heard in question period about efforts we are making to address workplace sexual harassment. I will pivot again here to the fact that this dovetails with both my ministerial role and my role of devoted constituency representative as the MP for Parkdale—High Park. I hear a great deal from my constituents about speaking to women's needs in terms of addressing harassment and sexual harassment. With this budget, we would provide $30 million over three years to address workplace sexual harassment. That is also fundamental.

Likewise, what we are doing on hatred is fundamental. Three full pages of the budget document are dedicated to addressing hatred. Some points dovetail with legislation that I have tabled in this House, including Bill C-63, regarding what we would do to curb online hatred and its propensity to spread. However, there are also concrete investments here that talk about Canada's action plan on combatting hate and empowering such bodies as the Canadian Race Relations Foundation, with the important work it is doing in terms of promoting better understanding and the knowledge base of hate crimes units. Also, fundamentally, there is money dedicated in this very budget to ensuring that both law enforcement agencies and Crown prosecutors are better trained and provided better information about how to identify hate and potentially prosecute it. With where we are as a country right now, this is a pressing need; I am very proud to see budget 2024 addressing it directly.

For the reasons I outlined earlier, in terms of how this budget addresses the particular needs of my constituents, and for the very substantial justice investments it makes in ensuring access to justice and tackling pernicious issues such as sexual harassment and hatred, I believe this is a budget that all 338 of us should get behind and support.

Alleged Premature Disclosure of Bill C-63—Speaker's Ruling
Privilege
Oral Questions

April 11th, 2024 / 3:10 p.m.

Liberal

The Speaker Liberal Greg Fergus

I am now ready to rule on the question of privilege raised on February 26, 2024, by the House leader of the official opposition, concerning the alleged premature disclosure of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

The opposition House leader claimed that the bill's contents had been leaked to the media, as evidenced in two separate reports from CBC and CTV News. Pointing to the anonymous quotes in the news reports, he concluded his remarks by positing that the information was leaked intentionally, in the knowledge that doing so was wrong, and that the leak breached the rights of members of Parliament and of the House.

For his part, the parliamentary secretary to the government House leader countered that the envisioned legislation's objectives were widely known and already in the public domain long before the bill was placed on notice and introduced, given the government's prior commitments and extensive public consultations. Furthermore, the parliamentary secretary emphatically rejected the allegations that the government had shared the bill before it was introduced.

The House leader of the official opposition is correct in asserting that there are abundant precedents that once a bill is placed on notice, its contents are not to be disclosed prior to introduction, thus ensuring that members have the first opportunity to take note of the bill. The premature disclosure of bills has usually been seen as a contempt of the House.

I will invite MPs to please take their conversations outside of the House, including the member for Scarborough—Guildwood.

In a ruling on October 4, 2010, which can be found at page 4711 of the Debates, Speaker Milliken stated, and I quote:

It is indisputable that it is a well-established practice and accepted convention that this House has the right of first access to the text of bills it will consider.

On the substantive matter raised in this question of privilege, as members know, the policy direction leading to a government bill is not typically developed in the strict isolation of a government department. Prior to the putting on notice and introduction of most modern legislation, extensive consultations and public debate frequently occur for months or even years. Past precedents from the Chair address this reality, and Bill C-63 seems to be another example of that pattern.

On June 8, 2017, Speaker Regan emphasized the need for balance between members' right to have the first opportunity to see the bill and the need for prior public consultation. He said, at page 12320 of the Debates:

The right of the House to first access to legislation is one of our oldest conventions. It does and must, however, coexist with the need of governments to consult widely, with the public and stakeholders alike, on issues and policies in the preparation of legislation.

In the same ruling, Speaker Regan indicated that the denial of a premature disclosure of the bill by the government, and the absence of evidence that members were impeded in the performance of their parliamentary duties, had led him to find that the matter was not a prima facie case of privilege.

Having reviewed the contents of the bill against what was reported in the media, and considering the assurance given by the parliamentary secretary that the government did not share the text of the bill between its placement on notice and its introduction, it cannot be determined that the information that appeared in the news media necessarily came from a premature disclosure of the bill by so-called senior government sources.

The title of the bill, combined with the various sources of information mentioned above, such as background information provided during the consultation process, could easily have informed observers of the bill's specific objectives. There is a plausible argument to be made that the scope, objectives and targets of the bill were known prior to its being placed on notice and introduced.

Not being able to say with certainty that the information in the media reports came from the bill itself, I cannot determine that any member was impeded in the carrying out of their parliamentary duties, or that the dignity of the House was transgressed. As such, the Chair cannot find that there is a prima facie question of privilege.

That being said, the Chair shares the members' concerns when detailed information on proposed legislation, whether accurate or not, appears in media stories prior to its introduction.

It casts doubt on the role and predominance of Parliament in the legislative process and may lead to—

Order. I am going to remind all members that one of the fundamental rules of being a member and being a Speaker in this House is that members are not to question or to insult the Speaker, unless they are doing it through a motion which would call into question the Speaker's role. I would like to remind all members about this fundamental rule. I know that I have had some conversations with members in the past about this.

I will continue.

It casts doubt on the role and predominance of Parliament in the legislative process and may lead to understandable frustration.

I thank all members for their attention.

National Defence
Committees of the House
Routine Proceedings

April 10th, 2024 / 5:15 p.m.

Liberal

Kevin Lamoureux Liberal Winnipeg North, MB

Madam Speaker, the member is so sensitive to us calling out what the Conservative Party is doing. I just finished saying that the most important reality of our Canadian Forces is the families, and he is standing up on a point of order. Does he not realize that the families of the Canadian Forces members are, in fact, what this report is all about?

As someone who was in the Canadian Forces and who was posted in Edmonton, I understand the issue of housing. I understand the pros and cons, the dips and so forth that take place, the waiting lists for PMQs and for barracks, and the whole process by which housing has evolved in the Canadian Forces, and I understand how important the issue is. I did not learn this only today, and it did not take this report coming to the floor for debate for me to know it. This is not new. There have always been waiting lists to get into PMQs, since the days when I was in the forces. I had to wait, and I actually lived in a PMQ. There have always been waiting lists.

Why did the Conservative Party wait until today to introduce this motion? If, in fact, Conservatives were genuine and really cared about the families and the Canadian Forces, they could have introduced some form of a motion on an opposition day. They should have done that if they genuinely cared about families and those in the forces representing our country and doing a phenomenal job, whether in Canada or abroad.

The Government of Canada has the backs of those members in the Canadian Forces and their families a lot more than Stephen Harper ever did. When I was first elected to the House of Commons in 2010, Stephen Harper literally closed down veterans offices, not two or three, but nine all over the country.

Members can imagine the veterans who already served in the forces in many different capacities and were going into private homes and facilities, some even in the non-profit area, when Stephen Harper shut down those access offices. In Manitoba, it was in Brandon. I was glad that when we took over the reins of power, we actually reopened those offices to continue to support our veterans.

There are two issues here that really need to be talked about. First and foremost is the motivating factor of the Conservative Party today and why the Conservatives are moving this motion. When the NDP House leader attempted to get this motion passed, the Conservatives said no. The motion was moved not out of concern for members of the forces but rather to prevent legislation from being debated.

Just yesterday, I was in the House and had the opportunity to speak to a private member's bill, Bill C-270, which dealt with the issues of child porn and non-consensual porn. I stood in my place and provided commentary on how serious and important that issue is, not only to the government but also to every member inside this chamber. Throughout the debate, we found out that the Conservative Party was actually going to be voting against Bill C-63, which is the online harms act.

That was important to mention because the Conservatives were criticizing the government for not calling the legislation. They were heckling from their seats and were asking why we did not call the legislation if it was so important.

The Conservatives realize that when they bring in motions, as they have done today, they are preventing the government from bringing in legislation and from having debates on legislation. Then, they cry to anyone who will listen. They will tell lies and will do all sorts of things on social media. They spread misinformation to Canadians to try to give the impression that the House and Canada are broken.

There is no entity in the country that causes more dysfunction in the House of Commons, or even outside of the Ottawa bubble, than the Conservative Party of Canada under the leadership of the far right MAGA leader today. That is the core of the problem. They have a leader who genuinely believes and who wants to demonstrate that this chamber is dysfunctional. The only thing that is dysfunctional in this chamber is the Conservative Party. It does not understand what Canadians want to see.

If we look at some of the commitments we are making to the Canadian Armed Forces, we are talking about billions of dollars in the coming years. We have a target, and a lot depends on economic factors, but we are looking at 1.7% by 2030.

Let us contrast that with the Conservative government of Stephen Harper, who was the prime minister when the current Conservative leader was a parliamentary secretary and was a part of that government in a couple of roles. We saw a substantial decrease in funding. I made reference to the veterans offices being shut down. What about the lack of general funding for the Canadian Forces? We hit an all-time low under the Conservative Party and Stephen Harper: 1% of GDP. It would have been awfully embarrassing to go abroad and start talking to people in the United States or to any of our allied countries in NATO. They were laughing at the Harper regime.

The Liberal government had to straighten out the problems of the Conservatives' inability to get a jet fighter. For years, they tried and failed. The Liberal government is now delivering on getting the jet fighters. The Liberal government continues to look at ways we can enhance our Canadian Forces, not only for today but also into the future. We will have new search and rescue aircraft that will be operating out of places like the city of Winnipeg.

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 6:15 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I have a lot to say about the bill. I will just start with a brief personal anecdote. I want to be very clear when I say this: I do not do this as victim porn or to look for sympathy. It is an example of this: if somebody like me, in a position of privilege, has a hard time accessing the justice system, what about others?

When I was a minister of the Crown, over 10 years ago, I received very explicit sexualized online threats, very graphic descriptions of how somebody was going to rape me, with what instruments, and how they were going to kill me. I was alone in a hotel room. My schedule had been published the day before, and I was terrified. The response at that time from law enforcement, and the process I had to go through as a minister of the Crown, to attempt to get justice in a situation that did not involve intimate images, sticks with me to this day. If I had to go through that at that time, what hope is there for somebody who does not have my position of privilege?

What the bill would do is recognize that the forms of discrimination and harassment that, as my colleague from Esquimalt—Saanich—Sooke says, disproportionately impact women, sexual minorities and other persons, have outpaced Parliament's ability to change the law. Here we are today.

Briefly, I want to respond to some of the points of debate. First of all, my colleague from the Liberals suggested that we expedite Bill C-63. That bill has been so widely panned by such a variety of disparate stakeholders that the government has not even scheduled it for debate in the House yet.

Second, and this is particularly for my colleagues who are looking to support this, to send the bill through to second reading, Bill C-63 would not provide criminal provisions either for any of the activities that are in the bill or for some of the other instances that have been brought up in the House for debate tonight, particularly the non-consensual distribution of deepnudes and deepfake pornography.

I raised the issue in the House over seven months ago. The intimate image distribution laws that are currently in the Criminal Code were only put in place in 2014, about a decade after social media came into play, and after Rehtaeh Parsons and Amanda Todd tragically died due to an absence in the law. Seven months have passed, and the government could have dealt with updating the Criminal Code with a very narrow provision that the Canadian Bar Association and multiple victims' rights groups have asked for, yet it has chosen not to.

There are so many articles that have been written about what is wrong with what is in Bill C-63 that we now need to start paying attention to what is wrong with it because of what is not in there. There is no update to Canada's Criminal Code provisions on the distribution of intimate images produced by artificial intelligence that are known as deepnudes.

I want to be very clear about this. There are websites right now where anyone in this place can download an app to their phone, upload any image of any person, including any person in here, and imagine what that looks like during an election campaign, erase people's clothes, and make it look like legitimate pornography. Imagine, then, that being distributed on social media without consent. The Canadian Bar Association, as well as law professors, and I could read case after case, say that our Criminal Code does not cover this.

At the beginning of February, there was a Canadian Press article that said that the government would update the law in Bill C-63, but it did not. Instead, what it chose to do was put in place a three-headed bureaucracy, an entirely extrajudicial process that amounts to a victim of these crimes being told to go to a bureaucratic complaints department instead of being able to get restitution under the law. Do we know what that says to a perpetrator? It says, “Go ahead; do it. There is no justice for you.” It boggles my mind that the government has spent all of this time while countless women and vulnerable Canadians are being harassed right now.

I also want to highlight something my colleague from Esquimalt—Saanich—Sooke said, which is that there is a lack of resources for law enforcement across the country. While everybody had a nice couple of years talking about defunding the police, how many thousands of women across this country, tens of thousands or maybe even millions, experienced online harassment and were told, when they finally got the courage to go to the police, that it was in their head?

One of those women was killed in Calgary recently. Another of those women is Mercedes Stephenson, who has told her story of trying to get justice for online harassment. If women like Mercedes Stephenson and I have a hard time getting justice, how is a teenager in a Winnipeg high school supposed to get any sort of justice, without clarity in the Criminal Code, if deepnudes are spread about her?

I will tell members how it goes, because it happened in a high school in Winnipeg after I raised this in the House of Commons. I said it was going to happen and it happened. Kids were posting artificial intelligence-generated deepnudes and deepfakes. They were harassing peers, harassing young women. Do members know what happened? No charges were laid. Why were no charges laid? According to the article, it was because of ambiguity in the Criminal Code around artificial intelligence-created deepnudes. Imagine that. Seven months have passed. It is not in Bill C-63.

At least the bill before us is looking at both sides of the coin on the Criminal Code provisions that we need to start looking at. I want the government to update the Criminal Code immediately to say that if it is illegal to distribute intimate images of a person taken with a camera, it should be exactly the same if the image has been generated by deepnude artificial intelligence. This should have been done a long time ago.

Before Bill C-63 came out, Peter Menzies, the former head of the CRTC, talked about the need to have non-partisan consensus and narrowly scoped bills so they could pass the House, but what the government has chosen to do with Bill C-63 is put in place a broad regulatory system with even more nebulousness on Criminal Code provisions. A lot of people have raised concerns about what the regulatory system would do and whether or not it would actually be able to address these things, and the government has not even allowed the House to debate that yet.

What we have in front of us, from my perspective, is a clear call to action to update the Criminal Code where we can, in narrow provisions, so law enforcement has the tools it needs to ensure that victims of these types of crimes can receive justice. What is happening is that technology is rapidly outpacing our ability to keep up with the law, and women are dying.

I am very pleased to hear the multipartisan nature of debate on these types of issues, and that there is at least a willingness to bring these types of initiatives forward to committee to have the discussions, but it does concern me that the government has eschewed any sort of update of the Criminal Code in favour of regulators. Essentially, what I am worried about is that it is telling victims to go to the complaints department, an extrajudicial process, as opposed to giving law enforcement the tools it needs.

I am sure there will be much more debate on this, but at the end of the day, seven months have passed since I asked the government to update the Criminal Code to ensure that deepnudes and deepfakes are in the Criminal Code under the non-consensual intimate image distribution laws. Certainly what we are talking about here is ensuring that law enforcement has every tool it needs to ensure that women and, as some of my colleagues have raised here, other sexual minorities are not victimized online through these types of technologies.

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 6:10 p.m.

NDP

Randall Garrison NDP Esquimalt—Saanich—Sooke, BC

Madam Speaker, New Democrats support, as all parties do, tackling the important issues that the bill before us seeks to address. We also know that there has been an explosion of sexual exploitation of individuals online without their consent and an explosion of child pornography. What we have to do is find those measures that would be effective in bringing an end to these heinous practices.

Like the member for Peace River—Westlock, I would like to support and salute the survivors who have told their tales, at much personal sacrifice and much personal anguish, publicly acknowledging what has happened to them and the impact it has had on their lives. We would not be making progress on these issues without that work by those survivors, so I think we all want to salute them for their bravery in taking up this problem.

However, the challenge with these issues is to find what will actually work to end sexual exploitation. We know that a lack of resources for enforcement is almost always at least as important as any gaps in legislation. What we need to see is dedicated funding for specific and skilled police units to tackle these questions, because bringing these cases to prosecution can become highly complex and convoluted, and we know that is one of the problems with the existing legislation. It is difficult to prosecute these offences under the Criminal Code as it now stands.

We look forward, as New Democrats, to hearing from expert witnesses in committee on what measures will actually be the most effective in bringing an end to these practices, and whether and how the measures proposed in Bill C-270 would contribute to bringing an end to online sexual exploitation. The bill, in some senses, is very simple. It would require checking ID and keeping records of consent. Some would argue that the existing law already implicitly requires that, so is this a step that would make it easier to prosecute? I do not know the answer to that, but I am looking forward to hearing expert testimony on it.

While this legislation is not specific to women, it is important to acknowledge the disproportionate representation of women as victims of both child pornography and sexual exploitation online without consent. However, I would also note that we have had a recent rash of cases of sexploitation or sextortion of young men who thought they had been speaking online to partners their own age. They later found out that they were being threatened with having the images they had shared posted online unless they paid money or provided sexual favours. Yes, it is primarily women, but we have seen this other phenomenon occurring where men pose as young women to get young boys to share those images.

Obviously, we need more education for young people on the dangers of sharing intimate images, although I am under no illusion that we can change the way young people relate to each other online and through their phones. Education would be important, but some measures to deal with these things when they happen are also important.

If we look at the Criminal Code, subsection 162.1(1) already makes it illegal to distribute an intimate image without consent. Of course, child pornography, under a succeeding subsection, is also already illegal. This was first brought forward and added to the Criminal Code 11 years ago. I was a member of Parliament at that time, and the member for Peace River—Westlock joined us shortly after. It came in an omnibus bill brought forward by the Conservatives. In that bill, there were a number of things, to be honest, that New Democrats objected to, but when the bill, which was Bill C-13 at the time, was brought forward, our spokesperson Françoise Boivin offered to the government to split the bill, take out the section on online exploitation without consent and pass it through all stages in a single day. The Conservatives refused, at that point, to do that, and it took another year and a half to get that passed into law.

New Democrats have been supportive of taking these actions and have recognized their urgency for more than a decade. We are on board with getting the bill before us to committee and making sure that we find what is most effective in tackling these problems.

What are the problems? I see that there are principally two.

One, as I have mentioned before, is the difficulty of prosecution and the difficulty of making those who profit from this pay a price. All the prosecutors I have talked to have said that it is difficult to make these cases. It is difficult to investigate, and it is difficult to get convictions. Are there things we can do that would help make prosecution easier, and are the things suggested in the bill going to do that? I look forward to finding that out in committee.

The second problem is the problem of takedown, and we all know that once the images are uploaded, they are there forever. They are hard to get rid of. As members of the government's side have pointed out, there are measures in government Bill C-63 that would help with warrants of seizure, forfeiture, restitution and peace bonds in trying to get more effective action to take down the images once they have been posted. I am not an optimist about the ability to do that, but we seem to lack the tools we need now to make a stab at taking the images off-line. It is also important to remember that whatever we do here has to make our law more effective at getting those who are profiting from the images. That is really what the bill is aimed at, and I salute the member for Peace River—Westlock for that singular focus because I think that is really key.

We also have to be aware of unintended consequences. When subsection 162.1(1) became law, in court we ran into a problem fairly early on of minors who share private images between each other, because technically, under the law as it is written, that is illegal; it is child pornography, and it certainly was not the intention to capture 15-year-olds who share intimate images with each other.

Whenever we make these kinds of changes, we have to make sure they do not have unintended consequences. Whether we like the practices that young people engage in online or not is not the question. We just have to make sure we do not capture innocent people when we are trying to capture those who profit from exploitation. The second part, in terms of unintended consequences, is that I think we have to keep in mind that there are those who are engaged in lawful forms of sex work online, and we have to make sure they are not captured by the broad strokes of the bill.

Again, I am looking forward to hearing the testimony about what will work to tackle these problems. We know the images are already illegal, but we know we lack effective tools in the legal system both to prosecute and to get the images taken down. New Democrats are broadly supportive of the principles in the bill. We are looking forward to the expert testimony I am certain we will hear at committee about what will actually work in tackling the problem. I look forward to the early passage of the bill through to committee.

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 5:50 p.m.

Winnipeg North Manitoba

Liberal

Kevin Lamoureux Liberal Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, to be very clear, with regard to the issue of non-consensual pornography and child pornography, I like to believe that every member in the House would be deeply offended by any activity that would ultimately lead to, encourage or promote, in any fashion whatsoever, those two issues. It angers a great number of us, to the degree that it causes all forms of emotions. We all want to do what we can to play an important role in making our online world experience a safer place.

I must say that I was a little surprised when the member for Peace River—Westlock responded to the issue of Bill C-63. I did have some concerns.

When one thinks of non-consensual pornography and child pornography, they are already illegal today in Canada. We know that. I appreciate what is being suggested in the private member's legislation, but he was asked a question in regard to Bill C-63, the government legislation dealing with the online harms act. It is something that is very specific and will actually have a very tangible impact. I do not know 100%, because this is the first time that I heard that members of the Conservative Party might be voting against that legislation. That would go against everything, I would suggest, in principle, that the member opposite talked about in his speech.

The greatest threat today is once that information gets uploaded. How can we possibly contain it? That is, in part, what we should be attempting to deal with as quickly as possible. There was a great deal of consultation and work with stakeholders in all forms to try to deal with that. That is why we have the online harms act before us today.

The question I wanted to ask the member is this: Given the very nature of his comments, would he not agree that the House should look at a way in which we could expedite the passage of Bill C-63?

By doing that, we are going to be directly helping some of the individuals the member addressed in his opening comments. The essence of what Bill C-63 does is that it provides an obligation, a legal obligation, for online platforms to take off of their platforms child pornography and non-consensual pornography. For example, the victims of these horrific actions can make contact and see justice because these platforms would have 24 hours to take it off. It brings some justice to the victims.

I do not understand that position, given his sincerity and how genuine the member was when he presented his bill. I have a basic understanding of what the member is trying to accomplish in the legislation, and I think there are some questions that need clarification.

As I indicated, child pornography is already illegal today. We need to make that statement very clear. Non-consensual pornography is as well. Both are illegal. There are consequences for perpetrators today if they are found out. What is missing is a way to get those platforms to remove those images once perpetrators start uploading the material and the platforms start carrying it. That is what the government legislation would provide.

Hopefully before we end the two hours of debate the member can, in his concluding remarks, because he will be afforded that opportunity, provide some thoughts in regard to making sure people understand that this is illegal today and the importance of getting at those platforms. If we do not get at those platforms, the problem is not going to go away.

There was a question posed, I believe by a New Democratic member, asking about countries around the world. People would be surprised at the motivations behind getting child pornography onto the net and livestreamed. I have seen some eye-opening presentations that show that, in some countries in the world, the person putting the child on the Internet is a parent or a guardian. They do it as a way to source revenue. They do it for income for the family. How sad is that?

How angering is it to see the criminal element in North America that exploits these individuals, and children in particular? This is not to mention, of course, the issue of non-consensual pornography, but think of the trauma created as a direct result of a child going through things a child should never, ever have to experience. This will have a lifetime effect on that child. We know that. We see generational issues as a direct result of it.

That is the reason I like to think that every member of the House of Commons would look at the issue at hand and the principles of what we are talking about and want to take some initiative to minimize it. Members need to talk to the stakeholders. I have had the opportunity in different ways over the last number of years to do so. It is one of the reasons I was very glad to see the government legislation come forward.

I was hoping to get clarification from the member on Bill C-270. He may be thrown off a little because of Bill C-63, which I believe will be of greater benefit than Bill C-270. After listening to the member speak though, I found out that the Conservative Party is apparently looking at voting against Bill C-63.

We come up with things collectively as a House to recognize important issues and put forward legislation that would have a positive impact, and I would suggest that Bill C-63 is one of those things. I would hope the member who introduced this private member's bill will not only be an advocate for his bill but be the strongest voice and advocate within his own caucus for the online harms act, Bill C-63, so we can get the support for that bill. It would literally save lives and take ungodly things off the Internet. It would save the lives of children.