Lawsuit Filed by Mom Who Lost Child to Blackout Challenge Revived by Court

A lawsuit that a Pennsylvania mother filed against TikTok has been revived after a U.S. appeals court ruled earlier this week that it could move forward.

The mother’s 10-year-old daughter died while attempting the viral “blackout challenge,” which she saw posted on TikTok. The challenge dared viewers to choke themselves until they lost consciousness.

The attorneys representing Tawainna Anderson argued that the challenge, which went viral in 2021, appeared on the “For You” feed of her daughter Nylah’s TikTok account. They said the video appeared there even though other children had already died after attempting it.

Under federal law, online publishers generally receive liability protection for content that others post on their platforms. However, the appeals court ruled this week that TikTok could be held liable for the content because it promoted it, using an algorithm that ultimately steered it toward minors on the platform.

In the ruling, Judge Patty Shwartz of the 3rd U.S. Circuit Court of Appeals in Philadelphia wrote:

“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech.”

After Nylah attempted the challenge, her mother found her unconscious in a closet of their home in Chester, just outside Philadelphia. She tried to resuscitate her and then called 911. Nylah died five days later.

At a news conference that was held in 2022, after the lawsuit was filed, Tawainna Anderson said:

“I cannot stop replaying that day in my head. It is time that these dangerous challenges come to an end so that other families don’t experience the heartbreak we live every day.”

The lawsuit was initially dismissed by a district judge who ruled that TikTok enjoyed protections under Section 230 of the 1996 Communications Decency Act. That provision is what’s often cited to protect internet companies from facing liability for the content that’s posted on their platforms.

That ruling was partially reversed on Tuesday by a three-judge panel of the appeals court, which sent the case back to the lower court for trial.

In a partial concurrence to the majority opinion, Judge Paul Matey wrote:

“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her. But, TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page.’”

One of the family’s lawyers, Jeffrey Goodman, added that it is “inevitable” that Section 230 will face more scrutiny from courts as technology continues to reach into nearly every facet of daily life. He said the family knows the lawsuit won’t bring Nylah back, but hopes a ruling will protect other families from experiencing the same loss.

He explained:

“Today’s opinion is the clearest statement to date that Section 230 does not provide this catchall protection that the social media companies have been claiming it does.”