TikTok suicide video: it’s time platforms collaborated to limit disturbing content

TikTok users have warned others to swipe away quickly if they see a video pop up showing a man with long hair and a beard. (File Photo: @konkarampelas/Unsplash)

A disturbing video purporting to show a man committing suicide is reportedly doing the rounds on the popular short video app TikTok, reigniting debate about what social media platforms are doing to limit circulation of troubling material.

According to media reports, the video first showed up on Facebook in late August but has been re-uploaded and shared across Instagram and TikTok — reportedly sometimes cut with seemingly harmless content such as cat videos.

TikTok users have warned others to swipe away quickly if they see a video pop up showing a man with long hair and a beard.

A statement by TikTok quoted by News.com.au said:

Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.

We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.

Schools and child safety advocates have warned parents to be alert for the possibility their child may see — or may have already seen — the video if they are a TikTok or Instagram user.

The sad reality is that users will continue to post disturbing content, and it is impossible for platforms to moderate everything before it is posted. And once a video is live, it doesn’t take long for it to migrate to other platforms.

Pointing the finger at individual platforms such as TikTok won’t solve the problem. What’s needed is a coordinated approach where the big social media giants work together.




Read more:
Don’t just blame YouTube’s algorithms for ‘radicalisation’. Humans also play a part


Evading moderation

Post-moderation means even the worst content can be published before anyone reviews it. It is then either flagged by the platforms’ machine learning systems or reported by users and passed to human moderators. But it can stay live for five minutes, an hour or longer.

Once a video is up, it can be downloaded by bad actors, modified to reduce the chance of detection by content moderation machine learning systems, and shared across multiple platforms such as Reddit, Instagram and Facebook.

These bad actors can cut the video slightly differently, splice it into harmless material, apply filters or distort the audio to make it difficult for content moderation programs to automatically identify disturbing videos. Machine learning on visual content is advancing, but it’s not perfect.
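To see why those small edits work, here is a minimal illustrative sketch in Python. It is hypothetical, not any platform’s actual system: an exact cryptographic hash of a video frame changes completely when even a few pixels are altered, while a toy perceptual “average hash” barely moves, which is why platforms lean on fuzzier matching that is itself imperfect.

```python
# Illustrative sketch only (not any platform's real system): why exact hashes
# fail against tiny edits, and how a toy perceptual "average hash" degrades
# more gracefully.
import hashlib

def exact_hash(frame):
    """Cryptographic hash of the raw pixel bytes: any edit changes it entirely."""
    return hashlib.sha256(bytes(frame)).hexdigest()

def average_hash(frame):
    """Toy perceptual hash: one bit per pixel, set if the pixel is above the frame mean."""
    mean = sum(frame) / len(frame)
    return [1 if p > mean else 0 for p in frame]

def hamming(a, b):
    """Number of differing bits between two perceptual hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 grayscale "frame" flattened to 16 pixel values (0-255).
original = [12, 40, 200, 180, 15, 42, 198, 177, 90, 95, 120, 130, 10, 20, 220, 210]
# A re-upload with a subtle edit (e.g. a filter nudging a couple of pixels).
edited = original.copy()
edited[0] += 3
edited[7] -= 2

print(exact_hash(original) == exact_hash(edited))             # False: exact matching fails
print(hamming(average_hash(original), average_hash(edited)))  # 0: still looks like the same frame
```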

This is broadly what happened with video of the Christchurch massacre, where content taken from the gunman’s Facebook livestream of his attack was downloaded and then shared across various platforms.

By the time Facebook took down the original video, people already had copies of it and were re-uploading it to Facebook, Reddit, YouTube and more. It very quickly became a cross-platform problem. These bad actors can also add hashtags (some very innocent-sounding) to target a particular community.

One of the key draws of TikTok as a social media platform is its “spreadability”: how easily it facilitates creating and sharing new videos based on the one a user was just watching.

With just a few taps, users can create a “duet” video showing themselves reacting to the disturbing content. Bad actors, too, can easily re-upload videos that have been removed. Now that this purported suicide video is out in the wild, it will be difficult for TikTok to control its spread.

What about copyright takedowns?

Some have noted social media platforms appear very adept at quickly removing copyrighted material from their services (and thereby avoiding huge fines), but can seem slower to act when it comes to disturbing content.

However, copyrighted videos are, in many ways, easier for machine learning moderation systems to detect. Existing systems used to limit the spread of copyrighted material have been built specifically for copyright enforcement.

For example, TikTok uses a system for detecting copyrighted material (specifically music licensed by major record labels) that automatically identifies a song’s fingerprint.

Even so, TikTok has faced a range of issues relating to copyright enforcement. Detecting hate speech or graphic videos on the platform is much more difficult.
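As a rough illustration of why matching known songs is comparatively tractable, here is a hypothetical Python sketch. It is not TikTok’s real pipeline (those details are not public), and the function names, parameters and toy catalogue are invented for illustration: each slice of audio is reduced to its loudest frequency, the peaks are strung together as a fingerprint, and that fingerprint is looked up against a catalogue of licensed tracks.

```python
# Illustrative sketch only: a toy version of audio fingerprinting in the spirit
# of copyright-matching systems. Real systems are far more robust; everything
# here (names, parameters, catalogue) is an assumption for illustration.
import numpy as np

def fingerprint(samples, rate=8000, window=1024):
    """Reduce audio to the dominant frequency bin of each fixed-size window."""
    peaks = []
    for start in range(0, len(samples) - window, window):
        spectrum = np.abs(np.fft.rfft(samples[start:start + window]))
        peaks.append(int(np.argmax(spectrum)))
    return tuple(peaks)

def make_tone(freqs, rate=8000, seconds=2.0):
    """Synthesise a test signal from a list of frequencies, one per segment."""
    t = np.arange(int(rate * seconds)) / rate
    segments = np.array_split(t, len(freqs))
    return np.concatenate([np.sin(2 * np.pi * f * seg) for f, seg in zip(freqs, segments)])

# A pretend catalogue of licensed tracks, keyed by fingerprint.
catalogue = {fingerprint(make_tone([440, 660, 880])): "Licensed track A",
             fingerprint(make_tone([220, 330, 550])): "Licensed track B"}

# An "upload" that reuses track A's audio with a little added noise.
upload = make_tone([440, 660, 880]) + np.random.normal(0, 0.05, int(8000 * 2.0))
print(catalogue.get(fingerprint(upload), "No match"))  # very likely "Licensed track A"
```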

Room for improvement

Certainly, there’s room for improvement. It’s a platform-wide, society-wide problem: we can’t just say TikTok is doing a bad job; it’s something all the platforms need to tackle together.

Asking market competitors to come up with a coordinated approach is not easy, however; platforms don’t normally share resources or work together globally to handle content moderation. But maybe they should.

TikTok employs massive teams of human moderators in addition to its algorithmically driven automated content moderation. These human content moderators work across many regions and languages to monitor content that may violate the terms of use.

Recent events show TikTok is aware of growing demand for improved content moderation practices. In March 2020, responding to national security concerns, TikTok’s parent company ByteDance committed to stop using moderation teams based in China to moderate international content. It also established a “transparency centre” in March 2020 to allow outside observers and experts to scrutinise the platform’s moderation practices.

These platforms have enormous power, and with that comes responsibility. We know content moderation is hard and nobody is saying it needs to be fixed overnight. More and more users know how to game the system, and there’s no single solution that will make the problem go away. It’s an evolving problem and the solution will need to constantly evolve too.

Improving digital citizenship skills

There’s a role for citizens, too. Every time these disturbing videos do the rounds, many more people go online to find the video – they talk about it with their friends and contribute to its circulation.

Complicating matters is the fact reporting videos on TikTok is not as straightforward as it is on other platforms, such as Facebook or Instagram. A recent study I (Bondy Kaye) was involved in compared features on TikTok with its Chinese counterpart, Douyin. We found the report function was located in the “share” menu accessed from the main viewing screen on both platforms — not a place many would think to look.

So if you’re a TikTok user and you encounter this video, don’t share it around – even in an effort to condemn it. You can report the video by clicking the share icon and selecting the appropriate reporting option.




Read more:
Becoming more like WhatsApp won’t solve Facebook’s woes – here’s why


Anyone seeking support and information about suicide can contact Lifeline on 13 11 14 or Beyond Blue on 1300 224 636.

Ariadna Matamoros-Fernández, Lecturer in Digital Media at the School of Communication, Queensland University of Technology and D. Bondy Valdovinos Kaye, PhD Candidate / Editorial Assistant, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.
