In November 2020, the Schotts faced a harrowing four-hour drive home. They had just learned that their 18-year-old daughter, Annalee, had died by suicide on the family farm in Colorado.
That morning they had video-called Annalee from Texas, where they were visiting family, to tell her they would be home around 11 p.m., and everything seemed normal. Now they wanted to be with her immediately, but four hours of road still lay ahead.
“It was the longest, most terrifying journey we’ve ever experienced, like a horrible dream,” Lori Schott told The Epoch Times.
“Sitting in the car for hours, trying to make sense of it all, I remember telling my son: ‘This isn’t real, she’s in the barn with the animals, this can’t be real, she couldn’t have done that.’”
In the months that followed, Lori Schott agonized over why her daughter had ended her life and what she, as a mother, might have done differently.
A year and a half later, she finally summoned the courage to browse through Annalee Schott’s diary. On a page titled “TikTok,” her daughter wrote, “Technically, if I kill myself, the problem goes away.” Lori Schott believed the page contained quotes from TikTok.
Months later, Lori Schott hired a company to unlock her daughter’s phone.
There, the mother found answers: “I could open her TikTok page, and I would feel the same thing.” Even two years after Annalee Schott’s death, her TikTok feed was still filled with content about depression and allusions to suicide.
TikTok has specific policies regarding content that may lead to self-harm or suicide.
However, in 2022, a college friend of Annalee’s, while visiting Mrs. Schott, told her that she and Annalee had once watched a suicide streamed live on TikTok. Both decided to delete the app afterward, only to reinstall it about a week later.
“TikTok hijacked her brain, making her believe she was something she truly wasn’t,” Lori Schott said. “To me, TikTok discovered who she was and her vulnerabilities and exploited them.”
To this day, TikTok still pushes content to Annalee’s account promoting anxiety and depression, with messages like “you have no future.”
“This is wrong,” her mother said.
Annalee Schott also had accounts on other social media platforms, including Facebook, Instagram, and Snapchat, but when it came to harmful content, “TikTok is definitely the worst,” her mother said.
Over the years, concerns have mounted about the popular video app, which originated in China. Beyond its impact on young people’s mental health, lawmakers, security officials, and experts warn that the app’s data could be accessed by Beijing and that the communist regime could use the platform for influence operations and spreading disinformation.
In mid-March, the House of Representatives overwhelmingly passed a bill requiring ByteDance to divest TikTok within 270 days or face a ban in the United States. The measure was folded into the $95 billion foreign aid package the House passed on April 20, which President Biden signed into law on April 24.
Beijing-based tech company ByteDance launched the short-video platform Douyin in 2016 and its international version, TikTok, a year later.
TikTok’s user base has grown exponentially: more than 170 million Americans, nearly half of the country’s population, now use the app, and roughly half of those users are under 30. TikTok, headquartered in Los Angeles and Singapore, is a wholly owned subsidiary of ByteDance.
Journalist and tech expert Geoffrey Cain said the mental health of America’s Gen Z is suffering, in significant part because of addiction to TikTok.
His team created accounts posing as 13-year-old children to compare the content served, and the parental-control settings offered, across social media platforms. They found that these users were more likely to encounter harmful content on TikTok.
The same testing revealed that harmful content visible to 13-year-old American TikTok users was not shown to users of the same age on Douyin, the Chinese equivalent. Instead, Chinese users attempting to view such videos were met with a “not appropriate” warning issued by the public security department.
“It’s clear that the Chinese government knows this content is extremely harmful,” Cain told The Epoch Times. “Given that they effectively control ByteDance, why do they allow TikTok to show this content in the U.S. while banning the same content in China? They know it’s bad for the kids, but they don’t mind showing it on the platform.”
Lori Schott had believed TikTok was just cute animals and dance videos, unaware of what Annalee actually saw on the app.
“It’s like sending her into a terrifying store without any warning, thinking it’s a candy shop,” she said. “People walk in, but it’s just darkness and despair. I think these kids feel dejected, and then they start connecting with each other, or those algorithms keep pushing.”
Tristan Harris, a tech ethicist and co-founder of the Center for Humane Technology, a nonprofit dedicated to exposing harmful technologies, referred to this phenomenon as “amplifaganda,” a portmanteau of “amplify” and “propaganda.”
Harris said the ability works like a magician’s: it directs people’s attention to whatever the platform wants them to focus on, selectively amplifying content and shaping attitudes.
…
The rest of the article contains detailed information on the dynamics between TikTok, U.S. legislators, potential national security threats, and the Chinese government’s intentions, as well as ongoing debates, lobbying efforts, and legal proceedings surrounding TikTok’s presence and operations in the United States.
