Instagram's IGTV Feature Has A Child Exploitation Content Problem
Just three months after its launch, Instagram’s IGTV long-form video service has a problem with inappropriate, and even illegal, content. An extensive report from Business Insider found that IGTV’s algorithms recommended videos featuring child exploitation and genital mutilation to multiple Instagram accounts, and that Instagram took several days to remove the videos.
IGTV allows Instagram creators to upload videos of up to an hour in length, unlike the app’s regular video feature. It can be accessed from the app’s main screen by tapping an icon in the upper right corner. Once there, users see several different tabs, including one for popular videos and one called “For You,” which surfaces videos the app thinks those users will like.
Business Insider used the personal Instagram accounts of multiple journalists, as well as a fake account registered as a 13-year-old, the youngest age the site allows. While monitoring the popular videos tab and the For You tab over three weeks, the journalists noted several videos that clearly should not have been allowed on the service under its guidelines.
One video, entitled “Hot Girl Follow Me,” featured what was believed to be an underage girl preparing to take her clothes off. It was recommended by the app’s algorithms in the For You tabs for both the journalists’ accounts and the dummy account, which had no history for the algorithms to draw from. It also showed up in the popular videos tab.
A similar video was uploaded by the same user. After the videos were reported through the app’s built-in moderation features, it took Instagram five days to remove them, per Business Insider. Both videos accumulated more than a million views before their removal.
Instagram did not, however, delete the account that uploaded the offending videos. The publication also described a highly graphic video involving genital mutilation that was recommended to the fake 13-year-old’s account.
Instagram is betting on IGTV to bring even more monetized video content to its users. The service set itself apart when it launched in June by presenting videos in a vertical aspect ratio, designed specifically for mobile viewing.
However, the feature is still in its relative infancy, and some users have found ways to game the system and upload content that violates its policies. Worse still, Instagram’s algorithms are recommending sexual or graphic content to users registered as young teens, according to Business Insider’s findings.
“We take measures to proactively monitor potential violations of our Community Guidelines and just like on the rest of Instagram, we encourage our community to report content that concerns them,” an Instagram representative told Business Insider. “We have a trained team of reviewers who work 24/7 to remove anything which violates our terms.”
© Copyright IBTimes 2024. All rights reserved.