TikTok error allows users to upload porn and violent videos

Have we reached tok bottom yet?

Online bottom feeders have come up with a devious way to bypass TikTok's graphic-content ban – by posting violent and sexual videos as their profile pictures. The horrific clips, tagged with the hashtag "don't search that," garnered 50 million views before they were banned.

"I've watched gore and hardcore porn and am very concerned because so many kids are using TikTok," said a teenager identified as Tom, who first brought the disturbing trend to the BBC's attention.


In fact, TikTok sleazeballs reportedly used the hack to upload everything from graphic sex videos to a gruesome clip from 2015 in which Islamic State militants burn Jordanian pilot Muadh al-Kasasbeh in a cage.

Unfortunately, the TikTok loophole allows users to set videos as their profile picture, where they are far more difficult to monitor than clips posted to a feed. Often, the offending accounts have no actual TikTok content other than the clips in their avatar slot.

Their profile names also tend to be a ransom-note-style jumble of letters and words, making them harder to track down.


Despite the anonymity of these accounts, they regularly garner tens of thousands of followers eager to see their shocking content. The hideous videos were even recommended on TikTok's "For You" page, which is driven by the app's engagement-based algorithm. Last summer, that click-hungry formula notoriously allowed a series of Nazi-themed videos to rack up over 6.5 million views before they were removed.

And while several TikTokkers have reported these offending accounts, "TikTok takes forever to act," lamented Tom.

Thankfully, the video-sharing app has since deleted many of the offensive accounts and disabled the hashtags promoting the trend, according to the BBC.


"Protecting our community from potential harm is our most important task," the company said in a statement to The Sun. "Our Community Guidelines apply to all content on our platform, and we work vigilantly to identify and remove content that violates our guidelines, including reporting it to the National Center for Missing & Exploited Children and other relevant authorities when appropriate."

"We have permanently banned accounts that tried to circumvent our rules via their profile photo, and disabled hashtags such as #dontsearchup," the statement said. "Our safety team is continuing its analysis, and we will continue to take all necessary steps to keep our community safe."

To read more from the New York Post, click here.
