
February 6, 2018

Apple briefly pulled Telegram over child pornography distribution

by John_A

When Apple temporarily pulled Telegram from the App Store over “inappropriate content,” it left many wondering just what that content was. We now know: 9to5Mac has learned that the company removed the app after discovering that people had been distributing child pornography through it. Apple contacted both Telegram’s team and the authorities (including the National Center for Missing and Exploited Children) to address the specific violation and to ensure that there were “more controls” in place to prevent a repeat.

As a rule, internet services use a range of safeguards to prevent the spread of child porn, such as shared hash lists that prevent a file flagged on one site from being posted elsewhere. It’s not certain what solutions Telegram implemented, but the relatively short turnaround (its software was back within hours) suggests the fix didn’t require a fundamental change.
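The shared hash-list approach mentioned above can be sketched in a few lines. This is a conceptual illustration only (the function names and sample data are invented for the example); real deployments such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, not the plain cryptographic hash used here:

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Cryptographic hash as a simple stand-in; production systems use
    # perceptual hashing so altered copies of a known file still match.
    return hashlib.sha256(data).hexdigest()

# A shared blocklist of hashes of known illegal files, distributed to
# participating services (sample contents are hypothetical).
known_bad_hashes = {file_hash(b"known-bad-file-contents")}

def should_block(upload: bytes) -> bool:
    # Reject any upload whose hash appears on the shared list.
    return file_hash(upload) in known_bad_hashes

print(should_block(b"known-bad-file-contents"))  # True
print(should_block(b"an unrelated harmless file"))  # False
```

Because only hashes are shared, services can cooperate on blocking known material without exchanging the files themselves.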

The nature of the discovery might provide a clue as to how the material was distributed. Telegram’s bread and butter is end-to-end encrypted messaging, which should rule out any non-participant (including Apple itself) directly intercepting messages. The 9to5 team suggests the material may instead have been made public through a third-party plugin. Your privacy should remain intact as a result — Apple may simply have been fortunate enough to spot the vile content and take action.

Via: The Verge

Source: 9to5Mac
