TikTok’s search algorithm has been auto-suggesting potentially harmful eating disorder content when users type the first few letters of banned keywords

  • TikTok’s search function has been surfacing potentially harmful eating disorder content.
  • Users only needed to type in the first few letters of banned terms before TikTok recommended keywords and hashtags that evaded its moderation.
  • One search recommended the topic “how to successfully starve without anyone noticing.”

TikTok has been recommending potentially harmful content in its search results page to users who type in the first few letters of apparently banned eating disorder-related terms, Insider has found.

TikTok has previously taken action on keywords such as “anorexia” and “bulimia,” instead directing users towards support resources.

However, Insider’s investigation, carried out between June 30 and July 6, found that users only needed to type in truncated versions of those terms before they were automatically recommended hashtags and search terms that circumvented the keywords ban, such as deliberately misspelled words. This in turn led users to distressing videos that appeared to contravene TikTok’s community guidelines.

The results of one search performed by Insider, which didn’t include a misspelling but was clearly eating disorder-related, recommended topics including “how to successfully starve without anyone noticing” and “how to starve with water.”


TikTok suggests users click on topics “others searched for.” In this search, performed by Insider, those suggestions included “how to starve tips” and “how to successfully starve.”

Screenshots/TikTok


Insider’s findings come after a separate investigation, published last month, found that TikTok users had skirted moderation by using misspelled hashtags and keywords in their videos. The number of videos carrying certain eating-disorder hashtags was up 253% from last year, an analysis found — though not all the content was necessarily harmful, with many videos also promoting eating disorder recovery.

At the time, TikTok said it was reviewing the content Insider had brought to its attention and was “taking appropriate action,” including removing videos and accounts that violated its community guidelines.

Insider’s latest investigation found that TikTok suggested 29 potentially harmful eating disorder terms in searches conducted between June 30 and July 5. The majority of those keywords appeared to have been created by users as deliberate attempts to get around TikTok’s moderation of more common eating disorder-related terms. Insider also collected a sample of 44 videos that appeared in search results prompted by TikTok’s own suggestions.

All evidence was sent to TikTok. By July 6, typing in the first few letters of the word “anorexia” fetched no results in TikTok’s search engine, and users were instead directed to support resources. Some 35 of the 44 videos Insider presented to TikTok have since been removed. View counts on the now-deleted videos ranged from 20 to 2.7 million.

“The safety and wellbeing of our community is a top priority,” a TikTok spokesperson said. “Our community guidelines make clear that we do not allow content depicting, promoting, normalizing, or glorifying eating disorders.”

“When a user searches for terms related to eating disorders, we don’t return results,” the spokesperson continued, adding that the app instead directs users to support helplines and guidance. TikTok has also introduced permanent public service announcements like #whatIeatinaday “to increase awareness and provide support for our community,” the TikTok spokesperson said.

TikTok said it’s constantly reviewing and extending its list of hundreds of search keywords upon which it has placed interventions. These interventions include directing users to its in-app resources or to eating-disorder support offered by its charity partners. TikTok has also introduced permanent public service announcements (PSAs) on certain hashtags.

“We’re always open to feedback from our community and partners,” the TikTok spokesperson said, adding that users also can “long-press” to report any problematic search suggestions to its moderators.

Tom Quinn, the director of external affairs at Beat, an eating-disorder charity in the UK, said it was “very concerning” that TikTok’s search algorithm had continued to suggest further harmful search terms to its users.

“We would urge them to address this without delay to avoid causing any further harm or distress to users affected by or vulnerable to eating disorders,” Quinn told Insider.

Earlier this month, TikTok said that it had removed more than 61 million videos for various violations of its community guidelines in the first three months of 2021.

If you or someone you know is struggling with an eating disorder in the US, you can call NEDA’s Helpline (1-800-931-2237) or email [email protected] on weekdays for support, resources, and information about treatment options. In crisis situations, NEDA offers 24/7 support — just text “NEDA” to 741-741.

If you are in the UK and you or someone you know is struggling with an eating disorder, you can call Beat’s helpline (0-808-801-0677) or email [email protected] for adult services, or contact the young people’s helpline (0-808-801-0711) or email [email protected].
