An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani Liberal

Status

Second reading (House), as of Feb. 26, 2024


Summary

This is from the published bill. The Library of Parliament often publishes better independent summaries.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

May 27th, 2024 / 6:55 p.m.

Conservative

Garnett Genuis Conservative Sherwood Park—Fort Saskatchewan, AB

Thank you, Chair.

Thank you to our witnesses.

We're obviously familiar with the Liberal government's position on this bill. With respect to the officials, of course you're in a position to support that position. Your role as an official is not to come here and state your disagreement with government policy, even if you might privately disagree with government policy.

I will just say that I think that many of the arguments you put forward were clearly refuted by the senator already. I also want to say that I think Bill C-63 is a real disaster. It raises actual censorship issues. It has nothing on age verification. It's far, far broader than Bill S-210 at every level. It's enforced by vaguely empowered bureaucratic agencies and it includes dealing with speech.

Most Canadians who have seen what your government did.... To be fair, I understand your role as a non-partisan public servant, tasked with providing fearless advice and faithful implementation. However, what the Liberal government has put forward in Bill C-63 is not being well received across the board.

On the issues with section 171, I'm looking at the Criminal Code and trying to understand the argument here.

We have one definition of sexually explicit material in the Criminal Code. Implicitly, it's being suggested that maybe we could have multiple different definitions of sexually explicit material operating at the same time. However, it seems eminently logical that you would have one definition that relies on the existing jurisprudence.

As Mr. Bittle has suggested, if this definition covers Game of Thrones, then it's already a problem, because it already violates the Criminal Code if, in the commission of another offence, you were to show a child that material. Therefore, you could already run afoul of the Criminal Code if you put on Game of Thrones in your home for your 16-year-old. That's not happening. No one's getting arrested and going to jail because they let their 16-year-old watch Game of Thrones. If that's not happening already offline, then maybe that suggests that this extensive reinterpretation of what the existing law already says is a little bit exaggerated.

In this context, we also know that Pornhub has been represented by a well-connected Liberal lobbyist who has met with Liberals in the lead-up to the vote.

I want to ask the Privacy Commissioner about what he said in terms of potential amendments.

How would this apply on social media? I'm going to just pose the question. I have young children. I obviously don't want them accessing the major, well-known pornography websites. I also don't want them seeing pornographic material on any other website that they might go to for a legitimate purpose. Therefore, if my children are on social media—they're not—or if they were on another website, if they were watching a YouTube video on that, whatever it was, I would want to ensure that 6-, 7-, 8-, 9-, 10-, 11- and 12-year-olds were not accessing pornography, regardless of the platform and regardless of the percentage of that company's overall business model.

I don't really understand philosophically why it would make sense or protect anyone's privacy to have an exemption for sites where it's just a small part of what they do, because if the point is to protect children, then the point is to protect children wherever they are.

I'd be curious for your response to that.

May 27th, 2024 / 6:45 p.m.

Associate Assistant Deputy Minister, Cultural Affairs, Department of Canadian Heritage

Owen Ripley

Thank you for those questions.

Given that the purpose of Bill C‑63 is to promote online safety and reduce harm, the duty to protect children, set out in section 64 of the proposed act, is quite flexible. According to the proposed section, “an operator has a duty, in respect of a regulated service that it operates, to protect children by complying with section 65.” Section 66 of the proposed act gives the commission the power to establish a series of duties or measures that must be incorporated into the service.

According to the government, the proposed act provides the flexibility needed to better protect children on social media. During the consultations, it is certainly legitimate to wonder whether the appropriate response is to require some services to adopt age verification. Once again, there will be a specialized regulator with the necessary expertise. In addition, there are mechanisms to consult civil society and experts to ensure that these decisions are well-thought-out.

May 27th, 2024 / 6:45 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

You also have concerns about the protection of privacy and personal information.

Comparisons are often made with Bill C‑63, but in my opinion, the two are quite different. Bill C‑63 aims to protect children from harmful online content, which is commendable. Bill S‑210 seeks to limit access to pornography.

The regulator you want to create through Bill C‑63 seems as though it could be very effective in playing that kind of role. The digital safety commission could play the same role as commissions in other countries. The same goes for the age verification processes.

Can you tell us what concerns you have regarding privacy, as well as any other concerns?

May 27th, 2024 / 6:40 p.m.

Liberal

Chris Bittle Liberal St. Catharines, ON

Thank you so much.

I don't have much time, but perhaps I could turn to Mr. Ripley for him to expand on Bill C-63, the online harms act, with respect to what the government is intending to do to protect individuals from harms that are on the Internet.

May 27th, 2024 / 6:25 p.m.

Owen Ripley Associate Assistant Deputy Minister, Cultural Affairs, Department of Canadian Heritage

Mr. Chair, thank you for inviting me to discuss Bill S‑210. As the associate assistant deputy minister for cultural affairs at the Department of Canadian Heritage, I will be responsible for the Online Harms Act that is being proposed as part of Bill C‑63.

While Bill C‑63 was being drafted, the department heard directly from experts, survivors from civil society and members of the public on what should be done to combat the proliferation of harmful content online.

A common theme emerged from these consultations: the vulnerability of children online and the need to take proactive measures to protect them. With this in mind, the future online harms act proposes a duty to protect children, which will require platforms to incorporate age-appropriate design features for children. Bill C‑63 also proposes a specialized regulatory authority that will have the skills and expertise to develop regulations, guidance and codes of practice, in consultation with experts and civil society.

Bill S-210 seeks to achieve a similarly admirable goal of protecting children online. However, the bill is highly problematic for a number of reasons, including a scope that is much too broad in terms of regulated services, as well as regulated content; possible risk to Canadians' privacy, especially considering the current state of age-verification frameworks internationally; structural incoherence that seems to mix criminal elements with regulatory elements; a troubling dependence on website blocking as the primary enforcement mechanism; and a lack of clarity around implementation and an unrealistic implementation timeline.

I'll briefly unpack a few of these concerns in greater detail.

As drafted, Bill S-210 would capture a broad range of websites and services that make sexually explicit material available on the Internet for commercial purposes, including search engines, social media platforms, streaming and video-on-demand applications, and Internet service providers. Moreover, the bill's definition of sexually explicit material is not limited to pornography but instead extends to a broader range of mainstream entertainment content with nudity or sex scenes, including content that would be found on services like Netflix, Disney+, or CBC Gem. Mandating age-verification requirements for this scope of services and content would have far-reaching implications for how Canadians access and use the Internet.

While efforts are under way globally in other jurisdictions to develop and prescribe age-verification technologies, there is still a lack of consensus that they are sufficiently accurate and sufficiently privacy-respecting. For example, France and Australia remain concerned that the technology is not yet sufficiently mature, and the testing of various approaches is ongoing. Over the next couple of years, the U.K. will ultimately require age assurance for certain types of services under its Online Safety Act. Ofcom is currently consulting on the principles that should guide the rollout of these technologies. However, the requirement is not yet in force, and services do not yet have to deploy age assurance at scale. In jurisdictions that have already moved ahead, such as certain U.S. states or Germany, there continue to be questions about privacy, effectiveness and overall compliance.

In short, these international examples show that mandates regarding age verification or age assurance are still a work in progress. There is also no other jurisdiction proposing a framework comparable in scope to Bill S-210. Website blocking remains a highly contentious enforcement instrument that poses a range of challenges and could impact Canadians' freedom of speech and Canada's commitment to an open and free Internet and to net neutrality.

I want to state once again that the government remains committed to better protecting children online. However, the government feels that the answer is not to prescribe a specific technology that puts privacy at risk and violates our commitment to an open Internet. It is critical that any measures developed to achieve this goal create a framework for protecting children online that is both flexible and well-informed.

Thank you for your attention. I look forward to any questions you may have.

May 27th, 2024 / 5:30 p.m.

Bloc

Kristina Michaud Bloc Avignon—La Mitis—Matane—Matapédia, QC

I only have a few seconds left, but I want to hear your thoughts on the fact that, according to the government, we don't need Bill S-210, since there's Bill C-63. To my knowledge, they're not the same at all. Bill C‑63 is extremely important, to be sure, but it's not identical to Bill S‑210. Do you share that opinion?

May 27th, 2024 / 5:15 p.m.

Senator, Quebec, ISG

Julie Miville-Dechêne

I have to say, first of all, that Bill C-63 doesn't talk about age verification. There's nothing in this bill about age verification—the words are not even used—and there's very little on pornography. Bill C-63 talks about the very vague concept of “age appropriate design”. It says there should be age-appropriate design; I'm sorry, but age-appropriate design is not age verification. It could be at some point if a committee so decides, but it's not in the bill. That's the first thing I wanted to say.

Regarding privacy, we have laws in Canada. Why would—

May 27th, 2024 / 5:15 p.m.

Liberal

Taleeb Noormohamed Liberal Vancouver Granville, BC

Thank you so much.

It's nice to finally get to a place where we can have a good conversation about this bill.

Senator, I think we all agree that what you are trying to accomplish is very important, and I think there are many means by which to do that. I would submit that Bill C-63 takes into consideration many of the issues you seek to resolve.

One of the concerns I did want to hear from you on is the whole issue of online privacy. Could you briefly explain what impact this bill might have on online privacy for Canadians? Would there be any concerns with respect to the privacy of online users?

May 27th, 2024 / 12:50 p.m.

Bloc

Rhéal Fortin Bloc Rivière-du-Nord, QC

Thank you, Madam Chair.

I want to thank the four witnesses for joining us today.

What's happening on campuses right now is very concerning. I think you are experiencing somewhat similar situations across Canada, including in Quebec.

You have been talking about this since the beginning of your remarks, but I would like to hear you talk more about the challenge that arises when it comes to respecting freedom of expression while avoiding hate speech or outbursts of that nature.

In my view, a university has always been a hotbed for exchanges, even heated exchanges, among students and professors on various subjects, including the thorniest ones. I'm always a little troubled when we talk about limiting freedom of expression, especially at a university.

That said, we believe that hate speech is unacceptable. However, it is difficult to define what is hate and what is not. As we said earlier, Bill C‑63 proposes provisions in this regard.

Another thing I find problematic is what is called the religious exception in the Criminal Code, which allows hate speech or antisemitic speech based on a religious text.

All these things are problematic. I will try to summarize by asking the witnesses my questions in the order in which their names appear on the notice of meeting.

Mr. Carr, at Concordia University, how do you plan to combat the problem of hate speech while respecting freedom of expression? Do encampments actually play an important role in terms of hate speech and freedom of expression?

May 27th, 2024 / 12:10 p.m.

D/Chief Robert Johnson

I've recently reviewed Bill C-63, the online harms act, and I do support it.

May 27th, 2024 / 12:10 p.m.

Liberal

The Chair Liberal Lena Metlege Diab

Thank you very much to our witnesses.

As the chair, I have one quick question for the police.

You talked in your recommendation about hate crime. The online harms act that's been introduced, Bill C-63, attempts to enshrine the definition of hatred in the Criminal Code. I'd like to know if you support that or if you have any recommendations on it.

Before you answer that, I will say to all our witnesses: please submit in writing anything that you feel you did not get a chance to get out here this morning.

We have 30 seconds for the police, specifically on the hate crime definition.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 11 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I think the track record of the previous Harper government, in which the Leader of the Opposition played a part in its cabinet, is demonstrably curious with respect to that barbaric cultural practices hotline suggestion, with respect to interdictions on the citizenship ceremonies and what people could wear, and with respect to approaches towards settlement of Syrian refugees and who would be selected for settlement in Canada and who would not. The track record is not an enviable one.

On this side of the House, we stand completely opposed to such policies and have implemented policies that are vastly different. That includes challenging Islamophobia. That includes funding for the security infrastructure program to protect places of worship. That includes Bill C-63, which would tackle Islamophobia head-on and help keep all Canadians safe.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 10:55 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I think that is actually appalling, given where we are with the alarming rise in anti-Semitism post October 7. We need to be doing everything we can to shore up the Jewish community and its need for safety and security at this time.

Apropos of that, I find it very troubling that the opposition articulated by the Leader of the Opposition to a bill that I am shepherding through this chamber, Bill C-63, was so vociferous that he did not even wait to read the document. He came out against it before it was even tabled. This is the very same document that groups like CIJA have gone on record about, saying that if we tackle online hatred, we will help them stop anti-Semitism online from turning into real-world consequences in the physical world.

Bill C-63 is critical for the safety of the Jewish community, as it is critical for many vulnerable groups, including Muslims and Arabs, the LGBTQ community, the Black community and the indigenous community. That is what we need to stand for as Canadians. That is what the opposition leader is standing against.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 10:20 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I would be very open to looking at what is transpiring in California. Centring victims at the heart of our criminal justice strategy is important, and we have been attempting to do that with respect to victims of hatred, through the online hate bill; victims of child sex predation, through Bill C-63; victims of intimate partner violence, through our changes to the bail regime, not once but twice, through Bill C-48 and Bill C-75; and fundamentally, victims of gun violence in this country, through bills like Bill C-21, which would put a freeze on handgun sales and ensure tougher penalties with respect to things like gun trafficking. These are important provisions, but I am definitely willing to entertain suggestions about what California is doing and look at whether the model could be brought over.

Department of Justice—Main Estimates, 2024-25 / Business of Supply / Government Orders

May 23rd, 2024 / 10:20 p.m.

Liberal

Arif Virani Liberal Parkdale—High Park, ON

Mr. Speaker, I have a few responses. First of all, Bill C-63 contemplates a responsibility to file a digital safety plan with the new commissioner to indicate how one is going to moderate risk for one's users, and lastly, to be vetted against that moderation and to be subject to penalties or orders by the digital safety commissioner.

It also contemplates the idea that the digital safety commissioner could green-light researchers at universities around the country to get access to some of the inner workings of the platforms. This has been hailed by people like Frances Haugen, the famous Facebook whistle-blower, as internationally leading legislation on promoting some of the transparency the member opposite is seeking, which I seek as well.