TikTok said it had removed videos and deleted accounts on the platform associated with the Islamic State group AFP / JOEL SAGET

Is TikTok becoming the new home for pedophiles and ISIS propagandists? With the social media app now hitting a milestone download count and boasting millions of registered users, questions are being raised about security and the safety of its users, particularly younger ones.

TikTok is growing more popular by the day. On the app, users can watch a variety of short video clips, from dubbed music videos to meme clips and more. In fact, according to a recent report by analytics firm Sensor Tower, TikTok has been downloaded more than 1.5 billion times, combining figures from the Google Play Store and the Apple App Store.

That makes TikTok the most downloaded non-gaming app of the year, surpassing social media and messaging giants such as WhatsApp, Facebook and Instagram. It is a major feat, considering the app only launched in 2016.

Since many of the platform's most popular users are young, their safety from older people who may see them as prey is becoming a growing concern.

“Without the right security settings, children broadcasting live video of themselves in their bedrooms over the internet could be targeted by abusers,” Javed Khan, chief executive of Barnardo’s, told The Sun in a report.

Additionally, ISIS propagandists have streamed jihadist videos on the app. Though the videos were eventually removed, thousands, if not millions, of young users had already seen them.

The platform has also been found to serve as a “grooming” ground for pedophiles, enabled both by a lack of security features and by users’ own carelessness.

“TikTok is very popular with children yet its owners have previously shown a careless approach to protecting its young users,” Andy Burrows, NSPCC Head of Child Safety Online Policy, told The Sun in the same report.

“Abusers can exploit TikTok’s chat and live-streaming features to groom and harm children, and sexualized content of young people was allowed because moderators were told to assume users were adults if their age was unclear.”

Following the UK’s scrutiny, the US government is also conducting its own national security investigation into TikTok.