"Coma Challenge" Reminds TIKTOK It's Time To Revise Your Push Algorithm


TikTok, the short-video platform that has taken the world by storm, is once again at the center of public debate. At the heart of the matter remains a persistent thorn in TikTok's side: content moderation for teenage users.

Over the span of roughly a year, seven children lost their lives attempting the Blackout Challenge seen on TikTok. The parents of two of these children have filed a lawsuit against the platform. The suit does not target the creators of the Blackout Challenge videos, or the fact that TikTok allowed those videos to be posted; instead, it points the finger at the heart of the platform: its recommendation mechanism.

Source: TikTok official website

The "Coma Challenge" is a series of videos posted by TikTok users in which the filmmaker passes out from asphyxiation in various ways to see how long the user can last before passing out. These extremely dangerous videos have received a lot of attention on the platform and have even caused a number of teens and children to participate in copycats, leading to the accidental deaths of some children as a result of the coma challenge.

Parental neglect may have contributed to the tragedies, but TikTok also bears responsibility: the platform pushed dangerous, easily imitated content to an audience it was plainly unsuitable for.

The lawsuit alleges that a child died accidentally after attempting the "Coma Challenge" as far back as January 2021, and claims that TikTok should have acted to stop the activity when that first accident occurred. TikTok told People magazine that this "disturbing challenge" predates its platform and never became a "TikTok trend." The company expressed its "deepest sympathy" to the families affected and vowed to "remain vigilant about safety issues" and to "remove relevant content immediately if we find it."

Coma Challenge content can no longer be found on TikTok, so it appears the company did fulfill its promise. The key point, however, is that the removal did not happen at the time of the first accident, but only after six further accidents followed in quick succession.

The "Coma Challenge" (Source: web)

The issue has further damaged TikTok's reputation among parents and reignited the existing controversy over low-quality content on the platform. The parents state in the lawsuit that their children never actively searched for "Coma Challenge" content; the videos were pushed to them by TikTok's recommendation algorithm through the app's For You page.

TikTok has made some efforts of its own. In 2020 it launched "Family Safety Mode," which lets parent accounts link to teen accounts so parents can reasonably supervise what their teens are viewing. But these incidents show that TikTok still has to shoulder more of the responsibility, alongside parents, for keeping teens safe. Video content that can cause harm, such as the "Coma Challenge," is clearly inappropriate for young, impressionable users. When such videos are posted, short-video platforms should attach an invisible label that keeps the content from ever appearing in teenage users' feeds, along the lines sketched below.
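One way such labeling could work, in very rough outline, is a check at upload time that quietly tags risky videos. This is a minimal sketch, not TikTok's actual pipeline: the label names, the keyword list, and the classifier stand-in are all hypothetical.

```python
# Minimal sketch of upload-time labeling (hypothetical, not TikTok's real system).
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    description: str
    labels: set = field(default_factory=set)

# Stand-in for a real content classifier; a production system would use
# far more than keyword matching (vision models, audio, human review).
DANGEROUS_KEYWORDS = {"blackout challenge", "coma challenge", "pass out"}

def looks_dangerous(video: Video) -> bool:
    text = video.description.lower()
    return any(keyword in text for keyword in DANGEROUS_KEYWORDS)

def apply_invisible_label(video: Video) -> Video:
    """Attach labels at upload time; the uploader never sees them."""
    if looks_dangerous(video):
        video.labels.add("restricted_for_minors")
        video.labels.add("dangerous_act")
    return video
```

The point of making the label invisible is that it changes nothing for the uploader; it only constrains where the recommendation system is allowed to deliver the video.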

In fact, this obligation extends beyond teenage users. Even for adults, short-video platforms should overlay a reminder on videos with dangerous content warning viewers not to imitate it, as in the sketch that follows. Such a reminder is not merely a disclaimer that protects the platform; it genuinely makes viewers aware that what the video shows may be dangerous and should not be casually imitated.
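At serve time, the same labels could drive two different decisions: drop the video entirely for minors, and attach an explicit warning for adults. Again, this is only an illustrative sketch that assumes the hypothetical labels from the previous example.

```python
# Minimal sketch of a serve-time check, assuming the hypothetical labels above.
from typing import Optional

ADULT_AGE = 18

def decide_delivery(video_labels: set, viewer_age: int) -> Optional[str]:
    """Return None to drop the video from the feed, or a warning string
    (possibly empty) to display alongside it."""
    if "restricted_for_minors" in video_labels and viewer_age < ADULT_AGE:
        return None  # never surface the video in a minor's For You feed
    if "dangerous_act" in video_labels:
        # Adults can still see the video, but with an explicit reminder overlaid.
        return "Warning: the actions shown here are dangerous. Do not imitate."
    return ""  # ordinary content, no warning needed
```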
