Striking the balance between online safety and freedom of expression: understanding the Online Harms Bill
The Canadian government's Online Harms Bill (Bill C-63) proposes a bold approach to countering harmful content on the web. Questions arise about the balance between protection and freedom in the digital age. Read the full article below to learn about the issues at stake in this legislation and its potential impact on online expression.
With the release of the long-awaited (and debated) Online Harms Bill (Bill C-63), the Canadian government is taking a bold leap towards addressing the myriad of challenges posed by harmful content and activities on the web. The intention behind the bill is undoubtedly noble—to create a safer online environment for everyone, especially children and youth—yet it has caused some to question what the balance will be between online safety and freedom of expression in the digital age.
Undoubtedly, there is an urgent need to tackle online harms—such as hate speech, cyberbullying, and misinformation. By holding online platforms accountable for addressing these harms, Bill C-63 seeks to protect vulnerable users while promoting a more respectful and inclusive online culture.
The federal government proposes to police seven categories of harmful content online, including content used to bully a child, content that incites violence or terrorism, and hate speech, to name a few. Private messages would not be targeted. The legislation would also establish a Digital Safety Commission of Canada and a Digital Safety Ombudsperson of Canada. The commission would “oversee and enforce new regulatory frameworks,” whereas the ombudsperson would serve as a resource for users and governments alike.
It doesn’t stop there.
Bill C-63 also proposes amendments to the Criminal Code and the Canadian Human Rights Act to create new hate crime offences, some carrying penalties of up to life imprisonment depending on the act.
The introduction of the Online Harms Bill has prompted many to raise concerns about its potential impact on freedom of expression and innovation online. Because the term “hate speech” has been defined by the Supreme Court of Canada on a case-by-case basis, some have argued that the bill’s broad definition of harmful content and its expansive regulatory powers could stifle legitimate speech and creativity, leading to over-censorship.
Margaret Atwood, the Canadian poet, novelist, and activist, has been a fierce critic of the bill, writing on X (formerly Twitter) that “the possibilities for revenge false accusations + thoughtcrime stuff are sooo inviting.”
Elon Musk, the CEO of Tesla and owner of X, offered his two cents (within the platform's 280-character limit) by posting “insane” in reference to the bill. With hate speech top of mind, X itself has come under increasing scrutiny as a platform where hate speech and harmful rhetoric run rampant.
Arif Virani, Minister of Justice, is spearheading the new bill and tirelessly defending it against critics, emphasizing that protecting freedom of speech and expression is essential to him as Minister.
“There’s a lack of understanding about Bill C-63,” claimed Arif Virani at a Toronto event.
Politically, Conservatives agree with the overall sentiment that more should be done to address online bullying, content inducing a child to harm themselves, and online hate speech. Conservative Leader Pierre Poilievre’s stance, however, is that these acts should be investigated by local law enforcement and tried in court, not “pushed off to new bureaucracy that provides no justice to victims.”
Given the recent pharmacare deal between the NDP and the Liberals, it’s no surprise that the NDP has shown overall favour toward the bill. Their only ask? Collaboration across the political floor to tighten aspects of the bill, including the algorithmic transparency provisions it outlines.
There are also practical challenges in enforcing the bill’s provisions, especially given the global nature of the internet and the sheer volume of content uploaded every day.
Questions remain as to whether smaller online platforms will be able to comply with the bill’s requirements or will struggle to keep up. Will this lead to a concentration of power in the hands of a few dominant players with the resources to invest in compliance measures? Only time will tell.
Impact on clients
Bill C-63 introduces new compliance requirements and potential liabilities. Under the proposed duty of care framework, online platforms would be required to take proactive measures to prevent harm to users. This includes:
- robust content moderation and safety measures; and
- reporting mechanisms for users to flag harmful content.
Failure to comply could result in significant penalties and fines.
Businesses that rely on online platforms to reach and engage with the public may face additional challenges when navigating evolving regulatory frameworks.
In light of these considerations, businesses and their members should stay informed of Bill C-63 and its potential impacts. This may include:
- conducting and updating risk assessments;
- updating internal and external policies and procedures;
- ongoing monitoring of any changes to the legislation negotiated between the government and opposition parties; and
- engaging with policymakers and industry stakeholders to shape the implementation of the bill in a way that balances safety, innovation, and free expression in the digital space.
NATIONAL Public Relations’ expert Public Affairs teams are here to support your organization's strategic objectives. We look forward to hearing from you.
——— Emily Rowan was a Director, Public Affairs at NATIONAL Public Relations