TikTok, the ‘Blackout Challenge’ and Section 230 in US law

As TikTok continues to face criticism over the Blackout Challenge, a look at what has happened so far and what the future holds for dangerous social media trends such as this.


The “Blackout Challenge” is one of many controversial viral “challenges” that have taken social media by storm. It encourages people to choke themselves until they pass out from lack of oxygen. While the Challenge started making the rounds on TikTok in 2021, the practice appears to date back to at least 2008, when the US Centers for Disease Control and Prevention (CDC) issued a warning about individuals strangling themselves “in order to obtain a brief euphoric state or high,” linking the practice to more than 80 deaths at the time.

In November this year, a Bloomberg Businessweek report linked at least 15 deaths in children aged 12 and under to the Challenge in the past 18 months, and another five deaths in children aged 13 and 14.

As deaths and accidents have continued, a number of lawsuits have emerged against TikTok, the social media platform on which the Challenge went viral.

Lawsuits against TikTok

Lawsuits, mostly by parents of children who died due to the “Blackout Challenge”, have alleged, among other things, that the app’s algorithm promotes harmful content, that the platform allows underage users, and that it fails to warn users or their guardians of TikTok’s addictive nature.


The families of 8-year-old Lalani of Temple, Texas, and 9-year-old Arriani of Milwaukee, Wisconsin, filed a lawsuit on June 30 in a California court in partnership with the Social Media Victims Law Center (SMVLC), which works to hold social media companies legally accountable for the harm they inflict on vulnerable users. In May, Tawainna Anderson, mother of 10-year-old Nylah Anderson, who died after attempting the Challenge, filed a lawsuit in a Pennsylvania court. Her lawsuit specifically targets TikTok’s “For You” page, which serves curated content based on a user’s interests and viewing history.

The lawsuit read, “the algorithm determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result.” However, an October ruling on the Anderson case has cast doubt on the future of the other lawsuits, all of which make similar claims.

The judgement shielding TikTok

On October 25, US District Judge Paul Diamond in Philadelphia said a federal law shielded the video-sharing platform from liability in the death of Nylah Anderson, even if the company’s app recommended the video to her.


In an eight-page ruling, the judge concluded that “the defendants did not create the Challenge; rather, they made it readily available on their site. The defendant’s algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work — exactly the activity Section 230 shields from liability.”

The court said that if TikTok was to be held liable at all for the deaths associated with the “Blackout Challenge”, it was the US Congress that would have to pass suitable laws and regulations. The judge noted that “the wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”

Section 230 of the Communications Decency Act (CDA)

As the US witnessed an internet boom in the early 1990s, regulators needed new tools to address the challenges of the emerging online landscape. Congress enacted the Communications Decency Act (CDA) as Title V of the Telecommunications Act of 1996, in an attempt to prevent minors from gaining access to sexually explicit material on the Internet.


While the CDA as a whole was deemed “anti-free speech” by many activists, and the US Supreme Court struck down many of its vaguer provisions, Section 230 has withstood the test of time as one of the most valuable tools for protecting freedom of expression and innovation on the Internet.

Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The Electronic Frontier Foundation (EFF) explains that “online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do.”

What this means is that a company like TikTok, a platform that hosts a wide variety of user-posted content, is not liable for what its users post or publish. In this case, while the “Blackout Challenge” was amplified on TikTok, under Section 230 the company is shielded from liability for content it did not directly create.

According to the EFF, “though there are important exceptions for certain criminal and intellectual property-based claims (such as publishing of child pornography or promotion of terrorism), CDA 230 creates a broad protection that has allowed innovation and free speech online to flourish.”


Challenges for parents, lawmakers, and TikTok

While TikTok has so far managed to evade civil or criminal liability over the dangerous nature of some of its content, there is growing concern within and outside the company as more people, especially minors, put themselves in danger attempting challenges they find on the platform. TikTok Inc. has a host of content moderation policies, but given the sheer volume of content posted, it has struggled to prevent harmful trends and posts from spreading.

Further, it faces a major challenge with underage users. While the minimum age to sign up for TikTok is 13, leaked internal data indicated that as many as a third of its users were underage, Bloomberg reported.


Like many other products on the internet, TikTok has struggled to keep out underage users, who can simply lie about their age to sign up. While facial age-estimation software has been floated as a possible solution, it comes with its own dangers, not least the storage of sensitive biometric data with a company that has faced allegations of spying for China.

However, critics and families that have faced tragedy due to dangerous TikTok challenges and games have argued that the company does not do enough and actively prioritises its own commercial interests over the welfare of children.


Michael Rich, a paediatrician and director of the Digital Wellness Lab at Boston Children’s Hospital, told Bloomberg that “Platforms don’t seriously enforce age restrictions because it’s not in their best interests to do so.”

During the pandemic, children stuck at home spent more time on various social media platforms, becoming a demographic of frequent users. TikTok also amplifies hundreds of relatively benign trends involving dancing and singing, and the participation of underage children adds to the product that it sells. As Dr Rich puts it, “these companies don’t see their users as customers to be served, they see them as a product they are selling.”

Lawmakers have also stepped in, reacting to increasing pressure from parents over children’s safety online. In September, California Governor Gavin Newsom signed an online privacy law, modelled on the UK’s age-appropriate design code, that forces tech companies to prioritise children’s best interests over commercial ones. The California code, set to take effect in 2024, requires companies to estimate the age of a child user “with a reasonable level of certainty.”

First published on: 05-12-2022 at 16:06 IST