TikTok not liable for death of girl who tried 'Blackout Challenge,' judge rules


Federal communications law protects video-sharing site TikTok Inc. from liability for the death of a 10-year-old girl who attempted a challenge she found on TikTok that encourages viewers to choke themselves with household items until they pass out.

A Pennsylvania federal judge has ruled that Section 230 of the federal Communications Decency Act requires him to dismiss the claims brought by Nylah Anderson’s mother against TikTok because the law grants broad immunity to sites that host third-party content.

Hiding in a bedroom closet, Nylah attempted the so-called Blackout Challenge. Her mother found her unconscious, hanging from the strap of a handbag, and tried unsuccessfully to revive her with CPR. According to court records, three deep ligature marks on Nylah’s neck showed that she suffered as she struggled to free herself. After several days in intensive care, Nylah died.

Section 230 provides that no provider or user of an interactive computer service “shall be treated as the publisher or speaker of any information provided by another information content provider.” It further bars any cause of action, and the imposition of any liability, under any state or local law that is inconsistent with this immunity.

Nylah’s mother filed product liability, negligence, wrongful death and survival claims against TikTok. Anderson urged the court to hold TikTok liable for its conduct as the designer, manufacturer and seller of a defective product, not for its conduct as a publisher. She also presented evidence that TikTok knew its algorithm was promoting the challenge to children and alleged that four other children had died attempting it.

But the court determined that these various “creative” claims could not overcome the reality that the challenge posted on TikTok’s site was created by others and that TikTok could not be held liable as a publisher. Judge Paul S. Diamond wrote that Section 230 bars Anderson’s product liability and negligence claims, upon which her wrongful death and survival claims depend.

“What matters is not the name of the cause of action – defamation versus negligence versus intentional infliction of emotional distress – what matters is whether the cause of action inherently requires the court to treat the defendant as the ‘publisher or speaker’ of content provided by another.”

By excluding interactive service providers from being treated as publishers of third-party content, Congress immunized “decisions relating to the monitoring, filtering, and removal of content from [their] network[s] – actions essentially related to the role of a publisher,” according to the court.

Other challenges

TikTok, which is owned by Chinese company ByteDance Ltd., faces other challenges and criticism.

A group of states is investigating whether the social media platform is being inappropriately marketed to children. Police also fear that another video circulating on the site is teaching people how to break into and steal cars.

Another lawsuit blames the site for the death of a 14-year-old African-American girl. That complaint alleges that TikTok’s algorithm directs more violent videos at minority viewers than at white users.

Texas is investigating TikTok for potential human trafficking and child privacy violations.

In recent years, there have been other legal challenges to the broad immunity enjoyed by social media platforms, including suits that sought to hold websites responsible for a terrorist attack and a mass shooting, but the courts upheld the immunity.

There has also been talk in Washington among politicians on both sides of the aisle about revising the immunity law.

Earlier this month, the United States Supreme Court agreed to hear a case about whether social media companies can be sued over targeted content recommendations. The complaint alleges that Google bears part of the responsibility for an ISIS terrorist attack in 2015.

Judge Diamond seemed to acknowledge that some have questioned Section 230. In his conclusion, he wrote:

“Nylah Anderson’s death was caused by her attempt to complete the ‘Blackout Challenge.’ Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way of bringing the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, defendants published that work – exactly the activity that Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”
