
Telegram iOS App Was Pulled Due to Child Pornography

Feb 6, 2018

We finally have the scoop on why the Telegram iOS app was pulled last week: it had to do with child pornography. We previously knew that inappropriate content led to both Telegram apps being pulled, and that within 24 hours the apps were back online. Now we have more details about what led to the quick removal of Telegram.

Child Pornography Findings Led to Telegram Removal on iOS

Even though Telegram has since returned to Apple’s App Store, we wanted to fill you in on what caused the app to be removed in the first place. Telegram was taken off the App Store for over a day after someone alerted Apple that child pornography had been found within the app.

Apple verified that the content was indeed child pornography. Once Apple had seen the content itself, the decision was made to remove Telegram from the App Store. Apple then notified Telegram about what was found, and also notified the National Center for Missing and Exploited Children.

The good news is that the users who were posting this inappropriate content were banned, and law enforcement was notified of the content. However, we do not know whether the people responsible were found. The developer and Apple worked together to remove the child pornography from the Telegram iOS app, and Apple verified that the developer was taking action and banning those responsible. That quick action is why Telegram was put back on Apple’s App Store within a day. Apple did say that more controls had to be put in place to prevent this situation from happening again, and those controls had to be in place before the app was allowed back onto the App Store.

Quick Action Removed Child Pornography from Telegram App

Another piece of information we know about the encrypted messaging app is that it only took a few hours to get the child pornography removed. New safety measures were put into place to prevent this from happening again, and both Telegram apps quickly reappeared on Apple’s App Store. Apple also released some information about this issue.

Apple said that anytime inappropriate content is found within an app, very swift action will be taken. This includes things such as removing the offending app from the App Store, as well as notifying the developer of the app about the problem. In this case, several agencies and law enforcement were notified due to it being child pornography.

The developers of Telegram believe the new safety measures put in place will prevent illegal material like child pornography from getting onto the app. Because Telegram is end-to-end encrypted, the company does not believe the content originated from ordinary users. It could have gotten into the app by other means, although neither Telegram nor Apple is saying how the child pornography got into the app.

Child Pornography Not Only Issue Telegram Has Dealt With

It does make you wonder how such illegal content managed to sneak into the Telegram app undetected. Some people believe it came in through a Telegram plugin, although the original source of that would be unknown as well. Telegram has not had problems with child pornography before, but the app has been known to serve as a platform for extremists.

At one point there were even ISIS chat channels on the app, which Telegram removed. An Iranian channel was also banned because it was encouraging people to join various violent protests. So the child pornography angle is a first for Telegram, and hopefully it does not happen again. With end-to-end encrypted apps such as Telegram, people posting illegal or extremist content think they will not get caught. Luckily, this was one instance where the illegal activity was caught and the proper authorities were notified.