Popular Video Platform Reportedly Steers Children's Accounts to Explicit Material in Just a Few Taps

According to a recent investigation, TikTok steers minors' accounts toward explicit material after only a few taps.

Research Methodology

A campaign organization created fake accounts with the birthdate of a 13-year-old and activated the "restricted mode" setting, which is intended to limit exposure to adult-oriented content.

Researchers found that TikTok suggested inappropriate and adult-themed search terms to seven test accounts that were set up on new devices with no search history.

Troubling Search Prompts

Search terms suggested under the "recommended for you" feature included "extremely revealing clothing" and "inappropriate female imagery", and later escalated to phrases such as "explicit adult videos".

For three of the accounts, the inappropriate search terms were recommended immediately.

Quick Path to Pornography

After a "small number of clicks", the researchers found adult videos including women flashing to explicit intercourse.

The organization said the content appeared designed to evade moderation filters, typically by embedding the explicit footage within an otherwise innocuous picture or video.

For one profile, the process took just two taps after signing in: one on the search bar, then one on a recommended term.

Regulatory Context

The research entity, whose mandate includes examining big tech's impact on human rights, said it carried out several rounds of testing.

The first round took place before child-safeguarding measures under the British online safety legislation came into force on 25 July, with additional tests carried out after the measures took effect.

Serious Findings

Investigators noted that multiple clips appeared to show a minor, and said these had been reported to the child protection organization that oversees harmful material involving minors.

The research organization asserted that the video platform was in breach of the UK safety legislation, which requires tech companies to prevent children from viewing harmful material such as pornography.

Regulatory Response

An official representative for Britain's media watchdog, which is responsible for monitoring the legislation, said: "We appreciate the work behind this research and will review its conclusions."

Ofcom's codes for complying with the legislation state that digital platforms that present a significant risk of displaying harmful material must "adjust their systems to remove inappropriate videos from young users' timelines".

The app's policies ban adult videos.

Company Reaction

The video platform said that after being contacted by Global Witness, it had taken down the problematic material and made changes to its search recommendations.

"Immediately after notification" of these assertions, we took immediate action to examine the issue, delete material that violated our policies, and launch improvements to our search suggestion feature," commented a company representative.

Ashley Barron

Tech enthusiast and startup advisor with a passion for emerging technologies and digital transformation.