Stopping Internet Sexual Exploitation Act

An Act to amend the Criminal Code (pornographic material)

Sponsor

Arnold Viersen (Conservative)

Introduced as a private member’s bill. (These don’t often become law.)

Status

In committee (House), as of May 8, 2024

Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

May 8, 2024 Passed 2nd reading of Bill C-270, An Act to amend the Criminal Code (pornographic material)

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:30 p.m.

Arnold Viersen (Conservative), Peace River—Westlock, AB

moved that Bill C-270, An Act to amend the Criminal Code (pornographic material), be read the second time and referred to a committee.

Madam Speaker, imagine being the parent of a teenage daughter who has been missing for months and somebody discovers 50 explicit videos of that daughter being sexually abused on Pornhub, the most popular porn site in the world. Imagine how one would feel if intimate images of one's sibling were uploaded and Pornhub refused one's request to remove that content. Now, imagine if those videos of one's exploited loved ones were being monetized and published for profit by Pornhub and were made available to Pornhub's over 130 million daily visitors.

How would someone feel if Pornhub’s only response was an auto-reply email? Understandably, one would be outraged. One would be furious, yet this happens over and over. Survivors, including a 12-year-old from Ontario, have had to seek justice through their own lawsuits because in Canada, the onus is on survivors and on law enforcement to prove, after the material has been uploaded, that the individuals depicted in those videos are either under age or have not consented to their distribution. This is a serious problem that Bill C-270, the stopping internet sexual exploitation act, seeks to fix.

It is important to note that for years, survivors, child protection agencies and the police have spoken out about this exploitation. They have not been silent. Survivors have shared how pornographic companies like Pornhub have been profiting from content depicting minors, sex trafficking victims, sexual assault, intimate images and gender-based violence for years. As early as 2019, companies like PayPal cut ties with MindGeek due to the availability of exploitive and abusive content.

In March 2020, a few parliamentarians and I wrote a public letter to the Prime Minister to alert him about the exploitation that was happening on MindGeek. We followed up in November 2020 with a letter to the then Minister of Justice, urging him to ensure that our laws were adequate to prevent women and girls from being exploited by Pornhub.

It was The New York Times exposé on December 4, 2020, in a piece written by Nicholas Kristof, that finally got the public's and the government’s attention. It was entitled “The Children of Pornhub: Why does Canada allow this company to profit off videos of exploitation and assault?” That article finally kicked off a firestorm of international attention on Pornhub, which is one of many pornographic websites owned by MindGeek, a Canadian company based in Montreal. About a year ago, it was bought and rebranded as Aylo by a company called Ethical Capital Partners, based in Ottawa.

A few days after that article, the House of Commons ethics committee initiated an investigation into Pornhub. I joined the ethics committee for its study on Pornhub and listened to the harrowing stories of young women who had videos of sexual assaults or intimate content shared without their consent. Many of these women were minors when the videos were created and uploaded to pornography sites like Pornhub. I want to take a moment to share some of their testimony.

Serena Fleites, whose story was covered by The New York Times exposé, had videos of her at age 13 uploaded by her ex-boyfriend. After that, her whole life came crumbling down. She experienced depression and drug use. She was harassed by people at her school who found her video and sent it to family members. She was blackmailed. She had to pretend to be her mother to have the videos taken down from Pornhub. This was all while she was 13 years old. In the end, she stopped going to school. She told us:

I thought that once I stopped being in the public so much, once I stopped going to school, people would stop re-uploading it. But that didn't happen, because it had already been basically downloaded by [all the] people...[in] the world. It would always be uploaded, over and over and over again. No matter how many times I got it taken down, it would be right back up again.

It basically became a full-time job for her to just chase down those images and to get them removed from Pornhub.

Some witnesses appeared anonymously to protect their identities. One witness stated, “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they [were] profiting from.” She went on to say, “Every time they took it down, they also allowed more and more videos of me to be reuploaded.” That witness also said, “Videos of me being on Pornhub has affected my life so much to the point that I don't leave my house anymore. I stopped being able to work because I [am]...scared to be out in public around other people.”

Another survivor who spoke to us at committee is Victoria Galy. As a result of discovering non-consensual images and videos of herself on Pornhub, she completely lost her sense of self-worth, and at times, she was suicidal. She told us at committee, “There were over eight million views just on Pornhub alone. To think of the amount of money that Pornhub has made off my trauma, date rape and sexual exploitation makes me sick to my stomach.” She added, “I have been forced to stand up alone and fight Pornhub”.

It is a serious failure of our justice system when survivors have to launch their own lawsuits to get justice for the harms caused by companies like MindGeek. This Canadian company has not faced a single charge or consequence in Canada for publishing its videos of exploitation and for profiting from them. This is truly shameful.

Last year, a survivor named Uldouz Wallace reached out to me. Uldouz is a survivor of the 2014 iCloud hack. She is also an award-winning actress, executive producer, activist and director of Foundation RA. Uldouz had photos and videos taken in the 2014 iCloud hack and uploaded onto porn sites like Pornhub, and she fought for years to get them taken down. As a result of this, she told us, “I lost followers, I lost everything that you could think of. It was just such a hard time for me. I ended up spending over a million dollars over a three-year span just to get the content taken down on me with no success.... They're making so much money off of the non-consensual uploading of images and videos. The re-uploading is also a billion dollar industry.” She added, “There's still no federal laws. There's barely any laws at all to hold anyone online accountable. There's currently foreign revenge laws but for people like me there's nothing.”

Rachel, a survivor from Alberta, said that it was devastating and that it is going to haunt her for the rest of her life. She said that she will always be someone's porn.

I want to point out the incredible courage of Victoria, Serena, Uldouz, Rachel and many other survivors who have spoken out. In the midst of one of the most difficult moments of their lives, they are fighting back against a billion-dollar industry that seeks to profit from their pain and exploitation. I thank Victoria, Serena, Uldouz, and Rachel for refusing to back down. I thank them for their courage. I thank them for their relentless pursuit of justice. I would encourage members to listen to their full testimonies, and they can do so at www.siseact.ca.

Throughout the ethics committee hearings and from the interactions I have had with survivors since, it is clear that this is a common problem. Pornographic companies are publishing and monetizing content without verifying the age and the consent of the people depicted in them. This is particularly a problem for Canada as many of those websites are hosted here.

Bill C-270, the stopping Internet sexual exploitation act, would stop this. I am going to quote right from the summary of my bill. It states that the SISE act would:

...prohibit a person [including companies] from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted.

The SISE act would also allow individuals to revoke their consent. This is an important provision, as it recognizes that consent must be ongoing. Finally, the SISE act would provide for aggravating factors when the material created or published actually depicts minors or non-consensual activity.

I am also pleased to share that I consulted on the bill with a variety of child protection agencies, law enforcement groups and the Canadian Centre for Child Protection to ensure that there are no gaps and that police have the tools to ensure they can seek justice.

The heart of the bill is consent. No one should be publishing sexually explicit material without the express consent of everyone depicted in that material. Children cannot consent to exploitation. Victims of sex trafficking and sexual assault cannot consent. Those filmed without their knowledge cannot consent, yet pornography companies freely publish this content and profit from it because there is no onus on them to verify the age or the consent of those depicted.

That is why the second recommendation of the 2021 ethics committee report is:

That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

We have heard from survivors who testified that their images of abuse would not be online if companies like Pornhub had bothered to check for age and consent. Bill C-270 would fulfill this important recommendation from the ethics committee report and, importantly, I should add that this report was unanimously supported by all parties at the ethics committee.

The recommendation also suggests consulting with the Privacy Commissioner. I am happy to share with my colleagues that on February 29, 2024, the Privacy Commissioner released his investigation into Pornhub's operator Aylo, formerly MindGeek. The report was initially scheduled to be released on May 23, but it was delayed for over nine months when MindGeek, or Aylo, and its owners, Ethical Capital Partners, took the Privacy Commissioner to court to block the release of that report.

The Privacy Commissioner’s investigation into Aylo, MindGeek, was in response to a woman whose ex-boyfriend had uploaded intimate images of her to MindGeek's website without her consent. The young woman had to use a professional service to get it taken down and to remove her images from approximately 80 websites, where they had been re-posted more than 700 times.

The report shared how the publishing of the woman’s intimate images led to a permanent loss of control of the images, which had a devastating effect on her. It caused her to withdraw from her social life and to live in a state of fear and anxiety. The Commissioner stated:

This untenable situation could have been avoided in many cases had MindGeek obtained direct consent from each individual depicted in content prior to or at the time of upload.

Pornhub’s own Monthly Non-Consensual Content reports suggest that non-consensual content is still regularly uploaded and viewed by thousands of users before it is removed.

We find that by continuing to rely solely on the uploader to verify consent, MindGeek fails to ensure that it has obtained valid and meaningful consent from all individuals depicted in content uploaded to its websites.

Ultimately, the Privacy Commissioner recommended that Pornhub and its owners adopt measures that would verify age and consent before any content is uploaded. I would urge all members to read the Privacy Commissioner's report on Pornhub.

While Pornhub and its owners are the biggest pornography company in the world, this bill would ensure that age and consent verification applies to all pornography companies. Whether it is a video of child exploitation, sex trafficking, an AI deepfake, sexual assault or an intimate encounter filmed by a partner, once a video or image has been uploaded, it is virtually impossible to eliminate. Each video can be viewed and downloaded millions of times within a 24-hour period, starting an endless nightmare for victims who must fight to get those videos removed, only for them to be uploaded again within minutes or hours.

Canada must do more to prevent this exploitive content from ever reaching the Internet in the first place. I hope I have the support of my colleagues in ending this nightmare for so many and in preventing it for so many more. To the survivors, some of whom are watching today, we thank them. Their voices are being heard.

I want to thank the organizations that have supported me along the way in getting this bill to this point: National Centre on Sexual Exploitation, National Council of Women of Canada, Ottawa Coalition to End Human Trafficking, London Abused Women's Centre, Defend Dignity, Vancouver Collective Against Sexual Exploitation, The Salvation Army, Survivor Safety Matters, Foundation RA, Montreal Council of Women, CEASE UK, Parents Aware, Joy Smith Foundation, Hope Resource Centre Association, Evangelical Fellowship of Canada, Colchester Sexual Assault Centre, Sexual Assault and Violence Intervention Services of Halton, and Ally Global Foundation.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Chris Bittle (Liberal, St. Catharines, Ontario), Parliamentary Secretary to the Minister of Housing

Madam Speaker, the topic that the member is dealing with is particularly important. One of the arguments that he is making is with respect to taking down this heinous material online. I agree with him. However, the bill does not make any provisions for it.

Bill C-63, which is government legislation, does make provisions for taking down these types of heinous materials. The member's leader has said that he would vote against it. I wonder if the hon. member will be supporting Bill C-63 or if he is going to stick with what is here that would not accomplish the objectives that he is seeking, which I hope we would all be in favour of.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Arnold Viersen (Conservative), Peace River—Westlock, AB

Madam Speaker, Bill C-63 has no criminal offences around the uploading of this kind of content. In this bill, it would be a criminal offence to upload. We want to make sure this content never hits the Internet. A 24-hour takedown period is not good enough. We want to ensure that companies are doing their due diligence to ensure that their content is of people who are of age and that people consent to it.

An important piece of this bill is also that, if somebody has made a written request saying they revoke their consent, immediately that content must come down.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Andréanne Larouche (Bloc), Shefford, QC

Madam Speaker, my colleague and I were at a meeting this morning, and one of the things we talked about was the online exploitation of children in the Philippines. Digging into the issue, we can see how far behind Canada is. This issue has received international attention, and other models are out there.

I would like my colleague to comment on other such models currently in use around the world that may have inspired his bill.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:45 p.m.

Arnold Viersen (Conservative), Peace River—Westlock, AB

Madam Speaker, there are a number of initiatives around the world that seek to tackle this online content and child safety online. I would point to the work of Baroness Beeban Kidron in the U.K. The U.K. Parliament, in general, has been working to try to tackle some of these things. I know that France, Germany and Spain have all passed legislation trying to tackle the safety of kids online.

I think about six American states have passed legislation around keeping kids safe online, and I know that the American Congress has before it right now a bipartisan bill called the kids online safety act, which is proceeding through their legislature. This is something that is being tackled around the world. This morning, the Filipino embassy pleaded with Canada to help prevent sexual predators in Canada from accessing livestreaming content from the Philippines of CSAM, child sexual abuse material.

This bill would only be a start to preventing some of the heinous crimes that are being committed on the Internet.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:50 p.m.

Randall Garrison (NDP), Esquimalt—Saanich—Sooke, BC

Madam Speaker, I thank the member for bringing forward this private member's bill, which directs our attention to some really important problems.

Is the member familiar with the report from the Department of Justice on cyber-bullying and non-consensual distribution of images from just a year ago, which takes quite a different approach from his bill and says we need to rewrite the existing offence so it is easier to prosecute and include measures, which are now in Bill C-63, to allow forfeiture, seizure, restitution and peace bonds in connection with these kinds of things?

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:50 p.m.

Arnold Viersen (Conservative), Peace River—Westlock, AB

Madam Speaker, I am happy to support that initiative. I would say we can do both of these things. This bill is to try to prevent, in the first place, any of this content from being uploaded, rather than trying to deal with the mess after the fact.

What the member is suggesting is more about dealing with something after it has been uploaded. That is an important aspect. Bringing the people who upload this content to justice is an important piece, but this would be put in place to prevent the uploading of this content in the first place.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 5:50 p.m.

Kevin Lamoureux (Liberal, Winnipeg North, Manitoba), Parliamentary Secretary to the Leader of the Government in the House of Commons

Madam Speaker, to be very clear, with regard to the issue of non-consensual pornography and child pornography, I like to believe that every member in the House would be deeply offended by any activity that would ultimately lead to, encourage or promote, in any fashion whatsoever, those two issues. It angers a great number of us, to the degree that it causes all forms of emotions. We all want to do what we can to play an important role in making our online world a safer place.

I must say that I was a little surprised when the member for Peace River—Westlock responded to the issue of Bill C-63. I did have some concerns.

When one thinks of non-consensual pornography and child pornography, they are already illegal today in Canada. We know that. I appreciate what is being suggested in the private member's legislation, but he was asked a question in regard to Bill C-63, the government legislation dealing with the online harms act. It is something that is very specific and will actually have a very tangible impact. I do not know 100%, because this is the first time that I heard that members of the Conservative Party might be voting against that legislation. That would go against everything, I would suggest, in principle, that the member opposite talked about in his speech.

The greatest threat today is once that information gets uploaded. How can we possibly contain it? That is, in part, what we should be attempting to deal with as quickly as possible. There was a great deal of consultation and work with stakeholders in all forms to try to deal with that. That is why we have the online harms act before us today.

I wanted to ask the member a question. The question I was going to ask the member is this: Given the very nature of his comments, would he not agree that the House should look at a way in which we could expedite the passage of Bill C-63?

By doing that, we are going to be directly helping some of the individuals the member addressed in his opening comments. The essence of what Bill C-63 does is that it provides an obligation, a legal obligation, for online platforms to take off of their platforms child pornography and non-consensual pornography. For example, the victims of these horrific actions can make contact and see justice because these platforms would have 24 hours to take it off. It brings some justice to the victims.

I do not understand that position, given the sincerity and genuineness with which the member presented his bill. I have a basic understanding of what the member is trying to accomplish in the legislation, and I think there are some questions that require clarification.

As I indicated, in terms of the idea of child pornography not being illegal, it is illegal today. We need to make that statement very clear. Non-consensual pornography is as well. Both are illegal. There is a consequence to perpetrators today if they are found out. What is missing is how we get those platforms to get rid of those images once those perpetrators start uploading the information and platforms start using the material. That is what the government legislation would provide.

Hopefully before we end the two hours of debate the member can, in his concluding remarks, because he will be afforded that opportunity, provide some thoughts in regard to making sure people understand that this is illegal today and the importance of getting at those platforms. If we do not get at those platforms, the problem is not going to go away.

There was a question posed by I believe a New Democratic member asking about countries around the world. People would be surprised at the motivation used to get child pornography on the net and livestreamed. I have seen some eye-opening presentations that show that in some countries in the world the person who is putting the child on the Internet is a parent or a guardian. They do it as a way to source revenue. They do it for income for the family. How sad is that?

How angering is it to see the criminal element in North America that exploits these individuals, and children in particular. This is not to mention of course the importance of non-consensual pornography, but think of the trauma created as a direct result of a child going through things a child should never, ever have to experience. This will have a lifetime effect on that child. We know that. We see generational issues as a direct result of it.

That is the reason I like to think that every member of the House of Commons would look at the issue at hand and the principles of what we are talking about and want to take some initiative to minimize it. Members need to talk to the stakeholders. I have had the opportunity in different ways over the last number of years to do so. It is one of the reasons I was very glad to see the government legislation come forward.

I was hoping to get clarification from the member on Bill C-270. He may be thrown off a little because of Bill C-63, which I believe will be of greater benefit than Bill C-270. After listening to the member speak though, I found out that the Conservative Party is apparently looking at voting against Bill C-63.

We come up with things collectively as a House to recognize important issues and put forward legislation that would have a positive impact, and I would suggest that Bill C-63 is one of those things. I would hope the member who introduced this private member's bill will not only be an advocate for his bill but be the strongest voice and advocate within his own caucus for the online harms act, Bill C-63, so we can get the support for that bill. It would literally save lives and take ungodly things off the Internet. It would save the lives of children.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 6 p.m.

Marie-Hélène Gaudreau (Bloc), Laurentides—Labelle, QC

Madam Speaker, on November 21, 2022, a team from the TV show Envoyé spécial travelled from Paris to my office on the Hill. The France 2 team was there for a major investigation into Pornhub and the tragic experience of women and girls being sexually exploited by massive online pornography companies for profit.

The French public television team wanted to see me because, during the 43rd Parliament, I was a member of the Standing Committee on Access to Information, Privacy and Ethics, where we studied this unbelievable industry. I would actually describe this industry as disgusting, and people can understand why. It exploits and abuses women to create and distribute pornographic content with neither their knowledge nor their consent. I was absolutely shocked by what I heard.

The committee heard from Serena Fleites, whose story was reported in a New York Times article. According to the article, the 14-year-old found herself in sexually explicit videos uploaded to Pornhub. It is abominable.

I was also shocked when the administrators of MindGeek, the parent company behind Pornhub, whose office was in Montreal at the time, came before the committee. I was stunned by the administrators' pathological lack of consideration for victims. As far as they were concerned, it was not really their fault if those videos ended up on their platform, and it would not harm their lives. It was appalling.

The work we did in committee on Pornhub's practices enabled every member from every party present to understand the dubious mechanisms by which platforms distributing pornographic material get rich by exploiting the flaws in a technological system that is far from being able to control the content being distributed. In fact, it is built and designed to encourage criminal sexual exploitation practices by covering them up.

The committee I was on heard about the failure of moderation. We were told that the content was moderated, that people's privacy and reputations were protected. We heard about the failure to prevent the presence of child sexual exploitation material, despite the claims of the MindGeek representatives who testified in committee.

The committee made a number of recommendations, including the following: We must now, as a matter of urgency, pass legislation to respond to these crimes and deal with these troubling issues. I would remind the House that this study took place during the 43rd Parliament, which ran from 2019 to 2021. Now it is 2024. We had to wait for a bill from a Conservative member before we could finally talk about it in Parliament. What is the government waiting for? We are talking about human dignity. Young girls are having their reputations tarnished. Young people are committing suicide because they have been manipulated and cheated, because people have abused their trust. This has to stop.

I am going to give some figures on what happened from 2014 to 2022. This is important because it shows the seriousness of the situation. Police reported 15,630 incidents of online sexual offences against children and 48,816 incidents of online child pornography.

The rate of police-reported online child sexual exploitation has risen since 2014, reaching 160 cases per 100,000 Canadian children and youth in 2022. Between 2014 and 2022, making and distributing child pornography accounted for three-quarters of child pornography cases, while possessing or accessing child pornography accounted for the remainder. The rate of online child pornography increased 290% in that short period of time.

Girls were overrepresented as victims for all offence types over the nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17.

Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all—97%—child and youth victims between 2015 and 2022 were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Nine in ten accused persons were minors. For one-third of youth victims, a casual acquaintance shared the victim's intimate images with others. The problem is that once an image has been uploaded, it can be uploaded again, even months after it was first viewed.

I am calling on my colleagues to refer this bill to committee so that it may be improved and become an example to the world. We must no longer allow sexual exploitation.

Bill C‑270 “amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted.” To me, that is essential.

The voluntary agreement, in writing, of the person whose image is depicted in the pornographic material will be required before the content can be uploaded to the platforms.

Makers and distributors who do not comply with the requirements of the legislation will be subject to fines of up to half a million dollars and a prison sentence of up to two years. Alternatively, the offence may be punishable on summary conviction and liable to a fine of not more than $100,000. What is more, convicted distributors or makers will be subject to an order.

I could go on at length, but I have only 30 seconds left. I just want to close by saying that this is a serious subject that raises a lot of questions, that this bill must be referred to committee and that the Bloc Québécois will vote in favour of it.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

April 9th, 2024 / 6:10 p.m.

Randall Garrison (NDP), Esquimalt—Saanich—Sooke, BC

Madam Speaker, New Democrats support, as all parties do, tackling the important issues that the bill before us seeks to tackle. We also know that there has been an explosion of sexual exploitation of individuals online without their consent and an explosion of child pornography. What we have to do is find those measures that would be effective in bringing an end to these heinous practices.

Like the member for Peace River—Westlock, I would like to support and salute the survivors who have told their tales, at much personal sacrifice and much personal anguish, publicly acknowledging what has happened to them and the impact it has had on their lives. We would not be making progress on these issues without that work by those survivors, so I think we all want to salute them for their bravery in taking up this problem.

However, the challenge with these issues is to find what will actually work to end sexual exploitation. We know that a lack of resources for enforcement is almost always at least as important as any gaps in legislation. What we need to see is dedicated funding for specialized and skilled police units to tackle these questions, because these cases can become highly complex and convoluted to bring to prosecution, and we know that is one of the problems with the existing legislation. It is difficult to prosecute for these offences under the Criminal Code as it now stands.

We look forward, as New Democrats, to hearing from expert witnesses in committee on what measures will actually be the most effective in bringing an end to these practices, and whether and how the measures proposed in Bill C-270 would contribute to bringing an end to online sexual exploitation. The bill, in some senses, is very simple. It would require checking ID and keeping records of consent. Some would argue that the existing law already implicitly requires that, so is this a step that would make it easier to prosecute? I do not know the answer to that, but I am looking forward to hearing expert testimony on it.

While this legislation is not specific to women, it is important to acknowledge the disproportionate representation of women as victims of both child pornography and of online sexual exploitation without consent. However, I would also note that we have had a recent rash of cases of sexploitation, or sextortion, of young men who thought they were speaking to partners their own age online, only to find out later that they were being threatened with the posting of the images they had shared and asked for money or sexual favours to prevent it. Yes, it is primarily women, but we have seen this other phenomenon occurring where men pose as young women to get young boys to share those images.

Obviously, we need more education for young people on the dangers of sharing intimate images, although I am under no illusion that we can change the way young people relate to each other online and through their phones. Education would be important, but some measures to deal with these things when they happen are also important.

If we look at the Criminal Code, subsection 162.1(1) already makes it illegal to distribute an intimate image without consent. Of course, child pornography, under a succeeding section, is also already illegal. This was first brought forward and added to the Criminal Code 11 years ago. I was a member of Parliament at that time, and the member for Peace River—Westlock joined us shortly after. It came in an omnibus bill brought forward by the Conservatives. In that bill there were a number of things, to be honest, that New Democrats objected to, but when the bill, which was Bill C-13 at the time, was brought forward, our spokesperson Françoise Boivin offered to the government to split the bill, take out the section on online exploitation without consent and pass it through all stages in a single day. The Conservatives refused, at that point, to do that, and it took another year and a half to get that passed into law.

New Democrats have been supportive of taking these actions and have recognized the urgency of these issues for more than a decade. We are on board with getting the bill before us to committee and making sure that we find what is most effective in tackling these problems.

What are the problems? I see that there are principally two.

One, as I have mentioned before, is the difficulty of prosecution and the difficulty of making those who profit from this pay a price. All the prosecutors I have talked to have said that it is difficult to make these cases. It is difficult to investigate, and it is difficult to get convictions. Are there things we can do that would help make prosecution easier, and are the things suggested in the bill going to do that? I look forward to finding that out in committee.

The second problem is the problem of takedown, and we all know that once the images are uploaded, they are there forever. They are hard to get rid of. As members of the government's side have pointed out, there are measures in government Bill C-63 that would help with warrants of seizure, forfeiture, restitution and peace bonds in trying to get more effective action to take down the images once they have been posted. I am not an optimist about the ability to do that, but we seem to lack the tools we need now to take a stab at getting the images offline. It is also important to remember that whatever we do here has to make our law more effective at getting those who are profiting from the images. That is really what the bill is aimed at, and I salute the member for Peace River—Westlock for that singular focus because I think that is really key.

We also have to be aware of unintended consequences. When subsection 162.1(1) became law, we ran into a problem in court fairly early on with minors who share private images with each other, because technically, under the law as it is written, that is illegal; it is child pornography. It certainly was not the intention to capture 15-year-olds who share intimate images with each other.

Whenever we make these kinds of changes, we have to make sure they do not have unintended consequences. Whether we like the practices that young people engage in online or not is not the question. We just have to make sure we do not capture innocent people when we are trying to capture those who profit from exploitation. The second part, in terms of unintended consequences, is I think we have to keep in mind there are those who are engaged in lawful forms of sex work online, and we have to make sure they are not captured under the broad strokes of the bill.

Again, I am looking forward to hearing the testimony about what will work to tackle these problems. We know the images are already illegal, but we know we lack effective tools in the legal system both to prosecute and to get the images taken down. New Democrats are broadly supportive of the principles in the bill. We are looking forward to the expert testimony I am certain we will hear at committee about what will actually work in tackling the problem. I look forward to the early passage of the bill through to committee.

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 6:15 p.m.

Conservative

Michelle Rempel Conservative Calgary Nose Hill, AB

Madam Speaker, I have a lot to say about the bill. I will just start with a brief personal anecdote. I want to be very clear when I say this: I do not offer it as victim porn or to seek sympathy. It is an example of how, if somebody like me, in a position of privilege, has a hard time accessing the justice system, what about others?

When I was a minister of the Crown, over 10 years ago, I received very explicit sexualized online threats, very graphic descriptions of how somebody was going to rape me, with what instruments, and how they were going to kill me. I was alone in a hotel room. My schedule had been published the day before, and I was terrified. The response at that time from law enforcement, and the process I had to go through as a minister of the Crown, to attempt to get justice in a situation that did not involve intimate images, sticks with me to this day. If I had to go through that at that time, what hope is there for somebody who does not have my position of privilege?

What the bill would do is recognize that the forms of discrimination and harassment that, as my colleague from Esquimalt—Saanich—Sooke says, disproportionately impact women, sexual minorities and other persons, have outpaced Parliament's ability to change the law. Here we are today.

Briefly, I want to respond to some of the points of debate. First of all, my colleague from the Liberals suggested that we expedite Bill C-63. That bill has been so widely panned by such a variety of disparate stakeholders that the government has not even scheduled it for debate in the House yet.

Second, and this is particularly for my colleagues who are looking to support this bill by sending it through second reading: Bill C-63 would not provide criminal provisions for any of the activities covered in this bill, nor for some of the other instances that have been brought up in the House for debate tonight, particularly the non-consensual distribution of deepnudes and deepfake pornography.

I raised the issue in the House over seven months ago. The intimate image distribution laws currently in the Criminal Code were only put in place in 2014, about a decade after social media came into play, and after Rehtaeh Parsons and Amanda Todd tragically died because of a gap in the law. Seven months have passed, and the government could have updated the Criminal Code with a very narrow provision that the Canadian Bar Association and multiple victims' rights groups have asked for, yet it has chosen not to.

So many articles have been written about what is wrong with what is in Bill C-63 that we now need to start paying attention to what is wrong with it because of what is not in it. There is no update to Canada's Criminal Code provisions on the distribution of intimate images produced by artificial intelligence, known as deepnudes.

I want to be very clear about this. There are websites right now where anyone in this place can download an app to their phone, upload an image of any person, including any person in here, erase that person's clothes and make it look like legitimate pornography. Imagine what that looks like during an election campaign. Imagine, then, that being distributed on social media without consent. The Canadian Bar Association, as well as law professors, say that our Criminal Code does not cover this, and I could read case after case to that effect.

At the beginning of February, there was a Canadian Press article that said that the government would update the law in Bill C-63, but it did not. Instead, what it chose to do was put in place a three-headed bureaucracy, an entirely extrajudicial process that amounts to a victim of these crimes being told to go to a bureaucratic complaints department instead of being able to get restitution under the law. Do we know what that says to a perpetrator? It says, “Go ahead; do it. There is no justice for you.” It boggles my mind that the government has spent all of this time while countless women and vulnerable Canadians are being harassed right now.

I also want to highlight something my colleague from Esquimalt—Saanich—Sooke said, which is that there is a lack of resources for law enforcement across the country. While everybody had a nice couple of years talking about defunding the police, how many thousands of women across this country, tens of thousands or maybe even millions, experienced online harassment and were told, when they finally got the courage to go to the police, that it was in their head?

One of those women was killed in Calgary recently. Another of those women is Mercedes Stephenson, who talked about her story about trying to get justice for online harassment. If women like Mercedes Stephenson and I have a hard time getting justice, how is a teenager in Winnipeg in a high school supposed to get any sort of justice without clarity in the Criminal Code if there are deepnudes spread about her?

I will tell members how it goes, because it happened in a high school in Winnipeg after I raised this in the House of Commons. I said it was going to happen and it happened. Kids were posting artificial intelligence-generated deepnudes and deepfakes. They were harassing peers, harassing young women. Do members know what happened? No charges were laid. Why were no charges laid? According to the article, it was because of ambiguity in the Criminal Code around artificial intelligence-created deepnudes. Imagine that. Seven months have passed. It is not in Bill C-63.

At least the bill before us looks at both sides of the coin on the Criminal Code provisions we need to start considering. I want to ensure that the government immediately updates the Criminal Code to say that, if it is illegal to distribute intimate images of a person taken with a camera, it should be exactly the same when the images have been generated by deepnude artificial intelligence. This should have been done a long time ago.

Before Bill C-63 came out, Peter Menzies, the former head of the CRTC, talked about the need for non-partisan consensus and narrowly scoped bills so they could pass the House. What the government has chosen to do with Bill C-63, however, is put in place a broad regulatory system with even more nebulousness on Criminal Code provisions. A lot of people have raised concerns about what the regulatory system would do and whether it would actually be able to address these things, and the government has not even allowed the House to debate it yet.

What we have in front of us, from my perspective, is a clear call to action to update the Criminal Code where we can, in narrow provisions, so law enforcement has the tools it needs to ensure that victims of these types of crimes can receive justice. What is happening is that technology is rapidly outpacing our ability to keep up with the law, and women are dying.

I am very pleased to hear the multipartisan nature of debate on these types of issues, and that there is at least a willingness to bring forward these types of initiatives to committee to have the discussions, but it does concern me that the government has eschewed any sort of update of the Criminal Code in favour of regulators. Essentially, what I am worried about is that it is telling victims to go to the complaints department, an extrajudicial process, as opposed to giving law enforcement the tools it needs.

I am sure there will be much more debate on this, but at the end of the day, seven months have passed since I asked the government to update the Criminal Code to ensure that deepnudes and deepfakes are in the Criminal Code under the non-consensual intimate image distribution laws. Certainly what we are talking about here is ensuring that law enforcement has every tool it needs to ensure that women and, as some of my colleagues have raised here, other sexual minorities are not victimized online through these types of technologies.

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 6:30 p.m.

Liberal

Ben Carr Liberal Winnipeg South Centre, MB

Madam Speaker, I am pleased to join the second reading debate with respect to Bill C-270, an act to amend the Criminal Code on pornographic material, which was introduced on April 28, 2022, by the member for Peace River—Westlock.

I want to take an opportunity off the top to thank an organization that has played a critical role in advocacy in terms of dealing with so many of the challenges that the members opposite have raised. This organization, the Canadian Centre for Child Protection, is located in the heart of my riding in Winnipeg South Centre. I want to thank Lianna McDonald, Signy Arnason, Noni Classen and the entire team at the Canadian Centre for Child Protection for the work that they have done and continue to do in helping to protect children across this country.

I know we all agree that non-consensual distribution of intimate images, child sexual abuse, sexual assault and human trafficking, as well as any images of such conduct, are among the most heinous crimes. I know that we are all deeply concerned that depictions of these crimes have been uploaded and shared on online platforms.

Ensuring that our policies and legislation effectively address this serious issue is a priority for our government. Victims must be protected, and digital platforms have a critical role to play in protecting them.

I know we all agree that non-consensual distribution of intimate images, child sexual abuse, sexual assault and human trafficking, as well as any images of such conduct, are among the most heinous crimes. I know that we are deeply concerned—

Stopping Internet Sexual Exploitation Act
Private Members' Business

April 9th, 2024 / 6:30 p.m.

NDP

The Assistant Deputy Speaker NDP Carol Hughes

The hon. member will have eight minutes the next time this matter is before the House.

The time provided for the consideration of Private Members' Business has now expired, and the order is dropped to the bottom of the order of precedence on the Order Paper.

The House resumed from April 9 consideration of the motion that Bill C-270, An Act to amend the Criminal Code (pornographic material), be read the second time and referred to a committee.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 5:50 p.m.

Bloc

Andréanne Larouche Bloc Shefford, QC

Mr. Speaker, as the member for Shefford and the Bloc Québécois critic for the status of women, I want to say that we support Bill C-270 in principle. We would like to examine this bill in committee. The Bloc Québécois fully supports the bill's stated objective, which is to combat child pornography and the distribution and commercialization of non-consensual pornography.

Since the first warning about the tragedy of women and girls whose sexual exploitation is the source of profits for major online porn companies, the Bloc Québécois has been involved at every stage and at all times in the public process to expose the extent of this public problem, which goes to our core values, including the right to dignity, safety and equality.

On this subject of online sexual exploitation, as on all facets and forms of the sexual exploitation of women, we want to stand as allies not only of the victims, but also of all the women who are taking action to combat violence and exploitation. I will begin by giving a little background on the topic, then I will explain the bill and, in closing, I will expand on some of the other problems that exist in Canada.

First, let us not forget that the public was alerted to the presence of non-consensual child pornography by an article published in the New York Times on December 4, 2020. The article reported the poignant story of 14-year-old Serena K. Fleites. Explicit videos of her were posted on the website Pornhub without her consent.

This Parliament has already heard the devastating, distressing and appalling testimony of young Serena, which helped us understand the sensitive nature and gravity of the issue, but also the perverse mechanisms that porn streaming platforms use to get rich by exploiting the flaws of a technological system that, far from successfully controlling the content that is broadcast, is built and designed to promote and yet conceal the criminal practices of sexual exploitation.

Reports regarding the presence of child sexual abuse material and other non-consensual content on the adult platform Pornhub led the Standing Committee on Access to Information, Privacy and Ethics to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. My colleague from Laurentides—Labelle has followed this issue closely.

The committee noted that these platforms' content moderation practices had failed to protect privacy and reputation and had failed to prevent child sexual abuse material from being uploaded, despite statements by representatives of MindGeek and Pornhub who testified before the committee.

That same committee looked at regulating adult sites and online pornography, without challenging their legality. The committee heard testimony from survivors, critics of MindGeek's practices, child protection organizations, members of law enforcement, the federal government, academics, experts and support organizations, and it received many briefs.

The Standing Committee on Access to Information, Privacy and Ethics made 14 recommendations regarding the problems it had studied. The committee's 2021 report was clear and it recommended that the government introduce a bill to create a new regulator to ensure that online platforms remove harmful content, including depictions of child sexual exploitation and non-consensual images.

We know that sexually explicit content is being uploaded to Pornhub without the consent of the individuals involved, including minors, and that these individuals have tried and failed to get Pornhub to remove that content. We know that these survivors have been traumatized and harassed and that most of them have thought about suicide. That is the type of testimony that we heard at the Standing Committee on the Status of Women with regard to cases of sexual exploitation.

We know that even if content is finally removed, users just re-upload it shortly afterward. We know that the corporate structure of MindGeek, which was renamed Aylo last August, is the quintessential model for avoiding accountability, transparency and liability. We know that investigations are under way and that there has been a surge in online child sexual exploitation reports.

We must now legislate to respond to these crimes and deal with these problems. We also need to keep in mind the magnitude of the criminal allegations and the misconduct of which these companies are accused. Just recently, a new class action lawsuit was filed in the United States against MindGeek and many of the sites it owns, including Pornhub, over allegations of sex trafficking involving tens of thousands of children.

Let us not forget that these companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of. The international community is well aware of this, and it reflects poorly on us. For these reasons, we have an additional obligation to take action, to find solutions that will put an end to sexual exploitation, and to implement those solutions through legislation.

With that in mind, we must use the following questions to guide our thinking. Are legislative proposals on this subject putting forward the right solutions? Will they be effective at controlling online sexual exploitation and, specifically, preventing the distribution of non-consensual content and pornographic content involving minors?

Second, let us talk a little more about Bill C‑270. This bill forces producers of pornographic material to obtain the consent of individuals and to ensure that they are of age. In addition, distributors will have to obtain written confirmation from producers that the individuals' consent has been obtained and that they are of age before the material is distributed. These new Criminal Code provisions will require large platforms and producers to have a process for verifying individuals' age and consent, without which they will be subject to fines or imprisonment.

The House will be considering two bills simultaneously. The first is Bill C-270, from the member for Peace River—Westlock, with whom I co-chair the All-Party Parliamentary Group to End Modern Slavery and Human Trafficking. The second is Bill C-63, introduced by the Minister of Justice, which also enacts new online harms legislation and aims to combat the sexual victimization of children and to make intimate content communicated without consent inaccessible.

We will need to achieve our goals, which are to combat all forms of online sexual exploitation and violence, stop the distribution and marketing of all pornographic material involving minors, prevent and prohibit the distribution of explicit non-consensual content, force adult content companies and platforms to control the distribution of such content, and make them accountable and criminally responsible for the presence of such content on their online platforms.

There is a debate about the law's ability to make platforms accountable for hosted content. It also raises questions about the relevance of self-regulation in the pornography industry.

Third, let us talk about what we can do here. Due to the high volume of complaints it receives, the RCMP often reacts to matters relating to child sexual abuse material, or CSAM, rather than acting proactively to prevent them. Canada's criminal legislation prohibits child pornography, but also other behaviours aimed at facilitating the commission of a sexual offence against a minor. It prohibits voyeurism and the non-consensual distribution of intimate images. Other offences of general application such as criminal harassment and human trafficking may also apply depending on the circumstances.

In closing, I will provide a few figures to illustrate the scope of this problem. Between 2014 and 2022, there were 15,630 incidents of police-reported online sexual offences against children and 45,816 incidents of online child pornography. The overall rate of police-reported online child sexual exploitation incidents has also risen since 2014. The rate of online child pornography increased 290% between 2014 and 2022. Girls were overrepresented as victims for all offence types over that nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17, who accounted for 71% of victims.

Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all child and youth victims, 97% to be exact, between 2015 and 2022 were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Overall, nine in 10 accused persons, or 90%, were youth aged 12 to 17. For one-third of youth victims, or 33%, a casual acquaintance had shared the victim's intimate images with others.

Here is a quote from the Montreal Council of Women: “On behalf of the members of the Montreal Council of Women, I wish to confirm our profound concern for those whose lives have been turned upside down by the involuntary and/or non-consensual sharing of their images on websites and other platforms such as the Montreal-based Pornhub. The ‘stopping Internet sexual exploitation act’ will make much-needed amendments to the Criminal Code to protect children and those who have not given consent for their images and other content to be shared and commercialized.”

We must act. It is a question of safety for our women and girls. Young women and girls are depending on it.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6 p.m.

NDP

Rachel Blaney NDP North Island—Powell River, BC

Mr. Speaker, I am here today to speak to Bill C-270, which is an act to amend the Criminal Code around pornographic material.

We all, in the House, agree that we do not want to see children treated in a way that is sexualized. Children deserve to be children as long as possible. We know that, far too often, without consent, young people are exposed to predators who take advantage of their vulnerability.

I think this is important, and I look forward to seeing the bill go to committee, where we can do some of that work. However, we also have to acknowledge some of the factors that have brought us to this place, where images of young people appear online without their consent. We want to make sure that people participating in this realm are 18 or older, and we need to find ways to address this.

We know that the resources are not there, as well, for enforcement to go after some of these very serious predators. We need to see the resources there, and they need to be supported so that we can move forward and protect young people.

I look forward to hearing from those witnesses. We know that, as we move forward with this type of legislation, we have to look at ways that express consent can be given safely. This is something we should be talking about a lot here, not only the outcome of this behaviour. We see young people being exploited; we see predators using technology to groom young people and mislead them into believing the person they are talking to is someone else, and then young people share content about themselves that they should not be sharing.

When we think about this behaviour, we have to understand that these are predators. They are not easy to spot, and whenever one is revealed, we are often shocked to learn which members of our community are involved. I hope this discussion also looks at how we address that.

When we think of preventative measures, a significant part of prevention is looking at how sex education is delivered in our education systems and in the places where young people can come together and learn factual information. There is a lot of evidence that supports this.

I was looking at the report by Action Canada for Sexual Health and Rights, which talks about the state of sex ed in Canada. I love the hashtag it uses, #SexEdSavesLives, and I believe that is absolutely true. With everything young people are now exposed to, it is getting harder and harder for the people who love them to find ways to keep them safe. That is the world we live in, with the technology young people have access to right now.

Part of keeping young people safe is allowing them to have the appropriate education for their needs. The report says some things. It says:

In sum, the sex-ed most young people in Canada receive is:

1. Not meeting international standards and best practices nor is it meeting our own 2019 Canadian Guidelines for Sexuality Education;

2. Outdated;

3. Not comprehensive;

4. Not monitored or evaluated to ensure high-quality delivery; and

5. Offered by educators who receive low to no support from provinces and educational systems and whose comfort levels are often low.

This leads to a lack of safety for young people if they do not understand the information around them. If a young person, or really a person at any age, has questions about sexuality and is fearful and does not know whom to ask, they often go where there are secrets, because they are keeping a secret about their own lack of understanding.

We have to think about that. We have to think about how young people are prepared or not prepared for these things.

As they are exploring, if they do not have a safe adult to go to and learn more from, if they do not have a place of education that teaches them factual information about their bodies and what is happening, then they are left vulnerable. It is really important that we do not leave young people vulnerable.

I had the great pleasure of raising two beautiful sons, and we spent a lot of time talking about things so that they would have an understanding. What I found is that my openness led to their friends coming to ask me questions, and sometimes they were very interesting questions. However, it allowed for that safe adult who was going to talk to them openly about it, who was not going to create a secret or hiding place but be open and up front, and it seemed to help.

I will read again from the report, and the amazing people in the House should not worry. I will make sure to send the link so that they know where the content is coming from. It reads:

...the federal government, as signatory to international human rights treaties, is failing to hold provinces and territories accountable to delivering comprehensive sexuality education in line with human rights obligations. This runs contrary to positions taken by Canada at the UN that support the full implementation of comprehensive sexuality education around the world.

The threat is everywhere. I know it is scary, but a defence mechanism is making sure that people are properly educated, especially young people. I think that, regardless of our opinions on a lot of things in the House, everybody here understands, hopefully, that children are a precious gift and that we want to protect them as long as we possibly can. However, ignoring reality is not protecting them. Not talking about things that are happening for them and their friends is not helping them. It is keeping them less safe. Let us make sure that we educate people who will educate our children, that we are engaged in that process and that we make sure it is one of the beautiful lines of defence that we have created in our children, knowing that they can talk about these things openly.

The other concern that has come up around the bill, which I hope we address meaningfully in the study that we are doing, is the safety of sex workers, and this is something that I am very passionate about. We know that there are a lot of people of the age of consent who are doing this work. It has happened forever; I cannot tell members of a time when it was not happening. We know that sex work continues to be something that is just part of us as a people across the planet. One of the things that worries me is that we have to look at how we are building the defences so that we can protect our children. Part of building defences is making sure that sex work is safe, that people have the ability to talk about what is happening to them and that they are not put into positions where they are made increasingly vulnerable.

I was reading a report that Pivot Legal Society in B.C. sent as a submission to the special rapporteur on violence against women and girls for its report to the UN Human Rights Council on prostitution and violence against women and girls, and it was done in January of this year. One of the things that it talked about was this:

Qualitative research and data from Canada consistently shows that criminalization and policing of clients, under demand legislative models, shape sex workers’ health and safety, and that police-based enforcement heighten the risk of violence, by reducing sex workers’ ability to employ client screening mechanisms and negotiate safer terms of sexual transactions, including condom use for prevention of HIV/STI....

When we think about this, when we look at the legislation that we are making in this place, across this country and in every province and territory, part of what we have to be addressing is how we keep people safe.

When we have sexuality and that part of our human nature secret, repressed and pushed down, it comes out in ways that are dysfunctional, sick and violent, and that worries me. We need to make it safer for people to do what they do, because it takes it out of the shadows and makes it something that we can actually deal with that is out there. The more we repress it, push it aside and pretend it is not happening, the less safe children are and the less safe people are, and it is not okay.

I think of times in my community when I was approached about particular segments of the community that refused to use condoms when they were having sex with sex workers, and desperate people were getting into desperate situations. However, the spreading of STIs and HIV was only increasing, and the health outcomes were terrible.

When we look at this issue, we should make sure that we are keeping children precious, make sure that we are keeping sex workers safe and make sure that education is at the core of it.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6:10 p.m.

Conservative

Frank Caputo Conservative Kamloops—Thompson—Cariboo, BC

Mr. Speaker, it is always a pleasure to rise on behalf of the people from Kamloops—Thompson—Cariboo. It is especially a pleasure to rise when we are speaking to a bill that is on a subject I am very passionate about.

I have spoken before in this House about things I said when I was on the doorstep, in my time, dealing with Internet exploitation of children. That was something I devoted a number of years to in my professional career. It is something I am very proud of, and it is something that taught me a lot about life, about healing, about trauma and, sadly, about how prolific this type of exploitation is.

I believe it was my colleague from the Bloc who spoke about increases in numbers. If memory serves, there was a spike in the numbers around the sexual exploitation of children, and one thing that does not get mentioned when opposing parties speak about the Harper government and its tough-on-crime agenda is that a number of providers, be it media platforms or Internet service providers, were getting a free ride. They knew or ought to have known that their platforms were being used to facilitate the potential or actual sexual exploitation of children, which typically begins with the offence of Internet luring.

What happened, I believe in 2012, is that the Harper government passed legislation that placed a positive obligation on service providers to report suspected abuse of children. No longer could a platform simply look the other way. No longer could a platform simply say that it did not know what was going on. A lot of platforms probably knew it was going on or chose not to know that it was going on, because it was easier and cheaper to do business as usual. From 2012, if we look at the graphs, we can actually see this spiking. That spike really has not receded to this day.

I was speaking at the B.C. ICE conference with a number of brave officers, pediatricians and workers who put their lives into addressing sexual offences against children. There were probably about 100 people in the room, generally from British Columbia. It was one of the most profound honours I have had as a member of Parliament. I had previously attended this conference simply as an attendee, somebody who was trying to learn more. This year I was invited to be one of the keynote speakers. What a profound honour to go from attendee to keynote speaker.

We still see this spike. Technology and the law are really not working hand in hand, and we have a long way to go: not only when it comes to technology, but also when it comes to sentencing.

I will pause here to note that in 2011, in a case called Woodward, a former Supreme Court of Canada justice, Justice Moldaver, when he was on the Ontario Court of Appeal, actually said that when it came to Internet luring, we should be looking at sentences of three to five years. This is a judge who later went on to the Supreme Court of Canada. I still remember the language he used. He talked about “this insidious crime”, the one that targets children in such a hidden way. Here we are dealing with it.

When I was on the doorsteps of Kamloops—Thompson—Cariboo, when I was running for office, one of the things I committed to was changing the name of “child pornography” to “child sexual abuse and exploitation material”. I am very proud that my colleague from North Okanagan—Shuswap and I will be giving evidence as witnesses at third reading in the Senate on Thursday on Bill C-291. I researched the bill. I authored the bill, and I put forward the bill. My colleague sponsored the bill. It was unanimously passed at second reading and third reading, and now it is at third reading in the Senate and is about to be considered at committee. Again, it is a profound honour to be able to do this.

It is my hope that when we talk about things that are in Bill C-270, for instance, that we would eliminate the term “child pornography”. Pornography implies consent. Pornography implies adults who are voluntarily doing things. Children can never consent, so it is time we eliminate the term from our legal lexicon. Bill C-270 tells us why we need to be aware of this, so it is my hope that we will receive royal assent very quickly on Bill C-291.

I am just going to go through a few of the aspects of Bill C-270 and provide some input as to why I do support it, particularly as it relates to child sexual abuse and exploitation material that is being put on the Internet. Obviously I support the punishment at subsection 2 and the designation of the offence.

The reality is that I cannot count how many times the police would come to ask questions of someone dealing with this type of matter in a prosecutorial context. It is an area of law that someone needs to sink their teeth into in order to understand it. Unless someone spends a lot of time with it, I find, it has a really steep learning curve. It took me a long time. Even when I was elected here, I still felt like a bit of an amateur with respect to the nature of the law on these types of things.

One of the struggles that the police would communicate to me when attempting to prove Internet luring or possession of child sexual abuse and exploitation material was establishing the age of the person involved. The bill puts forward, again, a positive obligation. For those, like my mother, who are at home watching this on CPAC and who may wonder what I mean by a positive obligation, it is a requirement for somebody to take action.

One thing I really like about the bill is that it is not stating that somebody would need to refrain from doing something, which would be a negative obligation. There would be a positive obligation to ascertain the age. A failure to do that, to take that step, is the nature of the offence that I am speaking of right now, the failure to ascertain that a person is actually 18 years of age.

In my view, child sexual abuse and exploitation material is a blight on our society. If anybody thinks that it is just something that happens over there or happens elsewhere, in my experience it is something happening far more than we want to admit, yet what have we seen when it comes to sentences? I referenced Justice Moldaver earlier on Internet luring.

We have seen the Supreme Court of Canada come out with a case called R. v. Friesen that said mid-single-digit penitentiary terms should not be odd; they should be the norm. I cannot recall whether the maximum sentence for possession of child sexual abuse and exploitation material is 10 or 14 years, but for Internet luring it is 14 years, and for production, I believe, it is 14 years.

The court said that a maximum sentence should not be all that uncommon. To this day, I still look at B.C. Court of Appeal decisions, just because I find them interesting. I cannot remember ever seeing anything close to the maximum sentence. In fact, what I am seeing more of is what used to be considered outlier cases, where community-based sentences are now being handed down.

In 2011, a respected jurist said that we should be looking at three to five years for Internet luring. Then the Supreme Court of Canada, in R. v. Friesen, said that sentences for sexual offences against children should range from the upper single digits to double digits, and that the maximum should not be uncommon. What are we seeing? We are just not seeing it come to fruition.

I know I have not touched on this as much as I could. I could obviously speak a lot more. I wholeheartedly endorse the bill. It is time that we address sexual offences in this country and that we do it with full vigour. I, my colleagues and, I believe, my colleagues across the aisle, should be focused on this. It is something that cannot wait another day.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6:20 p.m.

Etobicoke—Lakeshore Ontario

Liberal

James Maloney Liberal Parliamentary Secretary to the Minister of Justice and Attorney General of Canada

Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading.

I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution.

As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”.

This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation.

I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations. I have some concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to committee, where we can hear from experts on whether this bill would be useful in combatting child pornography.

The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic.

Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified.

Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect.

I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability.

I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent.

Social media services would be subject to three duties: to act responsibly, to protect children and to make inaccessible content that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent.

These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports.

Bill C-63 would also create a new digital safety commission to administer this regulatory framework and to improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties.

As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography.

All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly.

I look forward to further studying such an important bill at committee.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6:30 p.m.

Bloc

Julie Vignola Bloc Beauport—Limoilou, QC

Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes.

We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own.

The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's or many other people's nightmare. Basically, that is what Bill C-270 seeks to prevent, what it seeks to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. This bill seeks to criminalize the fact that, under the guise of legality, some of the images that are being viewed were taken or are being used illegally.

I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it.

For context, this bill and two others that are being studied, Bill S‑210 and Bill C‑63, came about after a newspaper article sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children never gave their consent to be filmed or to have their videos shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. The images and testimony received by the committee members were so shocking that the several bills I mentioned earlier were introduced to try to tackle the issue in whole or in part.

I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said.

I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public.

When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts.

I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue.

In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can.

What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible.

Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for it and does not require it, they are at fault.

We must also think about some of the terms, such as “privacy”, “education”, but also the definition of “distributor” because Bill C-270 focuses primarily on distributors for commercial purposes. However, there are other distributors who are not in this for commercial purposes. That is not nearly as pretty. I believe we need to think about that aspect. Perhaps legal consumers of pornography would like to see their rights protected.

I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6:40 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, which is an excellent bill from my colleague from Peace River—Westlock, who has been working so hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of legal provisions that already exist in substance, which prohibit the non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but there are not the appropriate legal mechanisms to prevent its distribution by, for instance, large pornography websites.

It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent the presence on its platform of non-consensual and child-depicting pornographic images. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock has played a central role, as have members from other parties, in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require that there be verification of age and consent associated with images that are distributed. It is a common-sense legal change that would require and effect greater compliance with existing criminal prohibitions on the creation of these images. It is based on the evidence heard at committee and on the reality that major pornography websites, many of which are headquartered in Canada, are continuing to allow this material to exist. To be clear, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and to protect people who are victims of the non-consensual sharing of their images.

Further, in response to the recognition of the potential harms on children associated with exposure to pornography or associated with having images taken of them and published online, there has been discussion in Parliament and a number of different bills put forward designed to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210.

Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210.

Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent.

This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse.

In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is kind of a bizarre combination of talking about issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about.

The freedom of speech issues raised by the bill have been well discussed. The government has been denounced from a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model.

This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.

Stopping Internet Sexual Exploitation Act
Private Members' Business

May 7th, 2024 / 6:45 p.m.

The Deputy Speaker Conservative Chris d'Entremont

Continuing debate.

I recognize the hon. member for Peace River—Westlock for his right of reply.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:45 p.m.

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading.

I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because pornography allows for the idea that this could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would change this in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious issues of the impact of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown for intimate content shared without consent or for child sexual abuse material. In fact, a bill that was solely focused on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It is completely missing the point.

The 24-hour takedown takes effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative, upstream approach. While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place.

As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted.

I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online.

My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexually exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.”

For years, pornography companies have simply turned a blind eye, and it is time to end that. Bill C-270 is a fulfillment of a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support of all my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:50 p.m.

The Deputy Speaker Conservative Chris d'Entremont

The question is on the motion.

If a member participating in person wishes that the motion be carried or carried on division, or if a member of a recognized party participating in person wishes to request a recorded division, I would invite them to rise and indicate it to the Chair.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:50 p.m.

Arnold Viersen Conservative Peace River—Westlock, AB

Mr. Speaker, I request a recorded division.

Stopping Internet Sexual Exploitation Act (Private Members' Business)

May 7th, 2024 / 6:50 p.m.

The Deputy Speaker Conservative Chris d'Entremont

Pursuant to Standing Order 93, the division stands deferred until Wednesday, May 8, at the expiry of the time provided for Oral Questions.