TikTok Feeds Teens a Diet of Darkness
By Julie Jargon
May 13, 2023 9:00 am ET
Calls to ban TikTok in the U.S. are growing louder. Government leaders are trying to keep the popular China-owned social video platform away from schools, public workers, even entire states, on the grounds that users’ data could wind up in the wrong hands.
Data privacy, though, might be less worrisome than the power of TikTok’s algorithm. Especially if you’re a parent.
A recent study found that when researchers created accounts belonging to fictitious 13-year-olds, they were quickly inundated with videos about eating disorders, body image, self-harm and suicide.
If that sounds familiar, a Wall Street Journal investigation in 2021 found that TikTok steers viewers to dangerous content. TikTok has since strengthened parental controls and promised a more even-keeled algorithm, but the new study suggests the app experience for young teens has changed little.
What teens see on social media can negatively affect them psychologically. Plenty of research backs this up. The simplest evidence may be found in my earlier column about teens who developed physical tics after watching repeated TikTok videos of people exhibiting Tourette Syndrome-like behavior.
A TikTok spokeswoman said the company has a team of more than 40,000 people moderating content. In the last three months of 2022, TikTok said it removed about 85 million posts deemed in violation of its community guidelines, of which 2.8% were suicide, self-harm and eating-disorder content. It also reviews content flagged by users for possible removal. "We are open to feedback and scrutiny, and we seek to engage constructively with partners," the spokeswoman added.
Screenshots of TikTok videos, nodding at self-harm, compiled by the Center for Countering Digital Hate.
Two-thirds of U.S. teens use TikTok, and 16% of all U.S. teens say they’re on it near constantly, according to Pew Research Center. Kids’ frequent social-media use—along with the potential for algorithms to lure teens down dangerous rabbit holes—is a factor in the American Psychological Association’s new recommendations for adolescent social-media use.
The group this week said parents should monitor their younger kids’ social-media scrolling and keep watch for troublesome use. The APA also urges parents and tech companies to be extra vigilant about content that encourages kids to do themselves harm.
‘Every 39 seconds’
The Center for Countering Digital Hate, a nonprofit that works to stop the spread of online hate and disinformation, tested what teens see on TikTok. Last August, researchers set up eight TikTok accounts to look like they belonged to 13-year-olds in the U.S., the U.K., Canada and Australia. For 30 minutes, researchers behind the accounts paused briefly on any videos the platform’s For You page showed them about body image and mental health, and tapped the heart to like them.
TikTok almost immediately recommended videos about suicide and eating disorders, the researchers said. Videos about body image and mental health popped up on the accounts’ For You pages every 39 seconds, they added.
Screenshots of TikTok videos, nodding at body-image concerns and suicidal ideation, compiled by the Center for Countering Digital Hate.
After the researchers published their findings, many of the videos they flagged disappeared from TikTok. Many of the accounts that posted the material, however, remain, and they include other videos that promote restrictive diets and discuss self-harm and suicide.
TikTok does take down content that clearly violates its guidelines by, for instance, referring directly to suicide. Videos where people describe their own suicidal feelings, however, might not be considered a violation—and wouldn’t fall under moderator scrutiny. They could even be helpful to some people. Yet child psychologists say these too can have a harmful effect.
TikTok executives have said the platform can be a place for sharing feelings about tough experiences, and cite experts who support the idea that actively coping with difficult emotions can be helpful for viewers and posters alike. They said TikTok aims to remove videos that promote or glorify self-harm while allowing educational or recovery content.
The company said it continually adjusts its algorithm to avoid repeatedly recommending a narrow range of content to viewers.
‘Sad and lonely’
The Center for Countering Digital Hate shared its full research with me, including links to 595 videos that TikTok recommended to the fake teen accounts. It also provided reels containing all of the videos, some of which are no longer on the site. I also looked at other content on the accounts with flagged videos.
After a few hours, I had to stop. If the rapid string of sad videos made me feel bad, how would a 14-year-old feel after watching this kind of content day after day?
One account is dedicated to "sad and lonely" music. Another features a teenage girl crying in every video, with statements about suicide. One is full of videos filmed in a hospital room. Each of the hospital videos contains text expressing suicidal thoughts, including, "For my final trick I shall turn into a disappointment."
Screenshots of TikTok videos, nodding at body-image concerns and disordered eating, compiled by the Center for Countering Digital Hate.
Users have developed creative ways to skirt TikTok’s content filters. For instance, since TikTok won’t allow content referencing suicide, people use a sound-alike such as "sewerslide," or just write "attempt" and leave the rest to the viewer’s imagination. Creators of videos about disordered eating have also evaded TikTok’s filters.
Policing all the content on a service with more than one billion monthly users is no easy task. Yet there is a difference between stamping out harmful content and promoting it.
"If tech companies can’t eliminate this from their platforms, don’t create algorithms that will point kids to that information," said Arthur C. Evans Jr., chief executive of the American Psychological Association.
Related video: How TikTok's Algorithm Figures Out Your Deepest Desires. A Wall Street Journal investigation found that TikTok only needs one important piece of information to figure out what you want: the amount of time you linger over a piece of content. Photo illustration: Laura Kammermann/The Wall Street Journal
What parents can do
Watch what your kids are watching. Ariana Hoet, a pediatric psychologist at Nationwide Children’s Hospital, recommends asking your teens to show you their For You page. If you spot harmful content, it’s a sign they are likely engaging with that kind of material, and it can give you an opening to start a conversation about it.
Set up Family Pairing. Parents can set up their own TikTok account and use the app’s Family Pairing to restrict age-inappropriate content and limit the time their teens spend on the app.
Filter the feed. People can filter out videos containing words or hashtags they don’t want to see. If content is still slipping through, teens can tap "not interested."
Refresh the feed. Some teens have told me their feeds became so problematic they closed their accounts and started over. Teens can now refresh their feed without creating a new account. Once again, they must be careful what content they like or linger on, because new rabbit holes are forming all the time.
—For more Family & Tech columns, advice and answers to your most pressing family-related technology questions, sign up for my weekly newsletter.
Write to Julie Jargon at Julie.Jargon@wsj.com
Copyright ©2023 Dow Jones & Company, Inc. All Rights Reserved.