The greatest desire of Lalani Erika Walton was to become “TikTok famous.” Instead, the bright-eyed 8-year-old wound up dead. Hers is one of two such tragedies that prompted a pair of wrongful-death lawsuits filed recently in Los Angeles County Superior Court against TikTok. The TikTok app fed both Lalani Walton, 8, and Arriani Arroyo, 9, videos promoting a viral trend called the blackout challenge, in which participants attempt to choke themselves into unconsciousness, the cases allege; both girls died after attempting the stunt.
These events are an indication that TikTok is a defective product, said the Social Media Victims Law Center, the law firm behind the suits and a self-described “legal resource for parents of children harmed by social media.” TikTok pushed videos of the dangerous trend to Lalani and Arriani, is engineered to be addictive and didn’t offer the girls or their parents adequate safety features, the Law Center said, all in the name of maximizing ad revenue. TikTok did not immediately respond to a request for comment.
The deaths of Lalani and Arriani bear striking similarities. Lalani, who was from Texas, was an avid TikToker, posting videos of herself dancing and singing on the social network in hopes of going viral, according to the Law Center’s complaint. A review of her TikTok feed in July 2021 showed videos of the self-strangulation blackout challenge, the suit said. Midway through that month, Lalani told her family that bruises that had appeared on her neck were the result of a fall. Soon after, she spent some of a 20-hour car ride with her stepmother watching what her stepmother would later learn had been blackout challenge videos. After arriving home, Lalani’s stepmother told her the two could go swimming later, then took a brief nap. Upon waking, the suit said, the stepmother went to Lalani’s bedroom and found the girl “hanging from her bed with a rope around her neck.”
The police, who took Lalani’s phone and tablet, later told her stepmother that the girl had been watching blackout challenge videos “on repeat,” the suit said. Lalani was “under the belief that if she posted a video of herself doing the Blackout Challenge, then she would become famous,” it said, yet the young girl “did not appreciate or understand the dangerous nature of what TikTok was encouraging her to do.”
Arriani also loved posting song and dance videos on TikTok, the suit said. She “gradually became obsessive” about the app. On Feb. 26, 2021, Arriani’s father was working in the basement when her younger brother Edwardo came downstairs and said that Arriani wasn’t moving. The two siblings had been playing together in Arriani’s bedroom, the suit said, but when their father rushed upstairs to check on her, he found his daughter “hanging from the family dog’s leash.”
TikTok’s product and its algorithm “directed exceedingly and unacceptably dangerous challenges and videos” to Arriani’s feed, the suit alleged, encouraging her “to engage and participate in the TikTok Blackout Challenge.” Other children, ages 10 to 14, have reportedly died under similar circumstances while attempting the blackout challenge. “TikTok unquestionably knew that the deadly Blackout Challenge was spreading through their app and that their algorithm was specifically feeding the Blackout Challenge to children,” the Social Media Victims Law Center’s complaint said, adding that the company “knew or should have known that failing to take immediate and significant action to extinguish the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children.”
TikTok denies that the blackout challenge is a TikTok trend, pointing to pre-TikTok instances of children dying from “the choking game,” and the company told the Washington Post that it has blocked #BlackoutChallenge from its search engine. But other so-called challenges that have circulated on the platform have also proved risky: injuries have been reported from stunts known as the fire challenge, Tide Pod challenge, cinnamon challenge, milk crate challenge, Benadryl challenge, skull breaker challenge and dry scoop challenge, among others.
Social media platforms have long been accused of hosting socially harmful content, including hate speech, slander and misinformation, but a federal law called Section 230 makes it difficult to sue the platforms over that content. Section 230 gives apps and websites wide latitude to host user-generated content and moderate it as they see fit, without having to worry about being sued over it. Or, as the complaint put it: the plaintiffs “are not alleging that TikTok is liable for what third parties said or did, but for what TikTok did or did not do.” In large part, the suits do this by criticizing TikTok’s algorithm as addictive, with a slot-machine-like interface that feeds users an endless, tailor-made stream of videos in hopes of keeping them online for longer and longer periods.