
The Online Harms Act doesn’t go far enough to protect democracy in Canada


The Online Harms Act aims to protect Canadians from harmful content. (Pexels photo)

The Liberal government’s recent proposal for regulating social media platforms, the Online Harms Act (Bill C-63), comes as the final act in a promised trilogy of bills aimed at bringing some order to the digital world.

After contentious attempts to address the fallout from the Online News Act and the threat that online streaming platforms pose to Canadian content, this final bill attempts to identify and regulate harmful content. The Online Harms Act follows Europe, the United Kingdom and Australia in setting up a new regulator to address the spread of what is considered harmful content.

The idea that such efforts are necessary is not controversial — content that sexually exploits children, for instance, has already been a target for law enforcement, and hate speech has been illegal for decades in most industrialized democracies.


Platform responsibility

Online harms laws are based on the idea of “intermediary liability”: making the platforms legally responsible when users use them to distribute content that breaks laws.

Under the Online Harms Act, platforms will be required to promptly remove two forms of content — that which “sexually victimizes a child or revictimizes a survivor” and “intimate images posted without consent” — or face large fines.

But it also includes less strict measures to deal with other forms of harmful content, including the promotion of terrorism or genocide, incitement to violence and hate speech. Platforms will be required to develop and make public plans to "mitigate the risk that users will be exposed to harmful content on the services," and to submit digital safety plans to the Digital Safety Commission of Canada.

Crime and punishment

There are also new criminal offences and penalties for users who upload these forms of content. These provisions have been the subject of much of the debate over the bill.

Many civil libertarians argue that they go too far, while advocates for marginalized groups believe that they are long overdue.

But much of the debate over these specific details misses a deeper failing of the bill, which derives from the way the idea of “online harm” is understood.

‘Lawful but awful’

For much of the last decade, digital media scholars have been drawing attention to other ways in which platform communication can be harmful. The definition of harmful content in Bill C-63 focuses on harms that users experience when they encounter particular forms of content posted by others.

But platforms aren’t merely empty spaces for users to send messages to other users — they play an active role in shaping the communication that takes place, determining how messages are combined and sorted, and how their distribution is prioritized and limited.

For this reason, algorithms that amplify or suppress particular kinds of messages should also be seen as a source of harm.

This is often understood as the reason why fake news or hyper-partisan political commentary is so problematic on platforms. Even perfectly legal communication — what is called “lawful but awful” content — can contribute to a pattern of serious harm.

One person denying the scientific consensus on vaccines, promoting entirely baseless conspiracy theories about political figures or discouraging people from voting, might not be “harmful” in the sense that Bill C-63 defines the concept.

But when social media algorithms ensure that many users don’t see counter-evidence from outside their “filter bubble,” the dangers are real. This is also true of any number of other kinds of platformed deception, such as AI-generated deep fake videos of political candidates.

Democracy at risk

Democracy relies on open and rational deliberation. The conditions for that kind of communication can be degraded by the way that algorithms operate. That algorithms are operated by private, for-profit corporations that seek to maximize “engagement” makes the problem even worse; this creates an incentive for content that provokes outrage and further polarizes political opinion.

Exactly how algorithms should be regulated is not a simple question. Some of the provisions in Bill C-63 might be a step in the right direction: requirements for risk mitigation plans, an ombudsperson who can help the public submit complaints about platforms to a regulator and obligations to provide information about content. And importantly, all of this can be done without unnecessarily violating users’ freedom of expression.

But stopping platforms from deepening online polarization and promoting anti-democratic populism would require a more specific legal obligation to deprioritize content that is clearly false, such as misinformation about public health or elections.

While the Online Harms Act might protect individuals from being exposed to specific kinds of content, protecting the democratic nature of our society will require a more robust set of regulations than what has been proposed.

Derek Hrynyshyn, Contract Faculty, Communication & Media Studies, York University, Canada

This article is republished from The Conversation under a Creative Commons license. Read the original article.

