
Apple Scans Photos Uploaded to iCloud to Check Possible Cases of Child Abuse

[Image: Photos on iPhone and iPad]

Apple’s chief privacy officer, Jane Horvath, revealed at CES 2020 that the Cupertino-based tech giant scans images backed up to iCloud from devices like the iPhone and iPad to check for child sexual abuse pictures.

Without going too deep into the image scanning methods, Horvath said: “We are utilizing some technologies to help screen for child sexual abuse material.” Most other tech companies, like Google, Facebook, and Twitter, use a system called PhotoDNA, which compares an image against a set of pictures already identified as containing child sexual abuse material. So Apple may be using the same PhotoDNA system to recognize such pictures among users’ images.
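
PhotoDNA’s exact algorithm is proprietary, but the matching step it performs is conceptually simple: derive a signature from each uploaded image and test it against a database of signatures of known material. The Python sketch below illustrates that flow only; it substitutes SHA-256 (which matches byte-identical files alone) for PhotoDNA’s perceptual hash, which is designed to survive resizing and re-encoding, and the signature set shown is invented for illustration.

```python
import hashlib

# Hypothetical stand-in for a database of signatures of known illegal images.
# The real PhotoDNA signature is a perceptual hash robust to resizing and
# re-encoding; SHA-256 here only matches byte-identical files, but the
# structure of the check is the same.
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_signature(image_bytes: bytes) -> str:
    """Return a hex digest standing in for a PhotoDNA-style signature."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Screen an uploaded image against the known-signature set."""
    return image_signature(image_bytes) in KNOWN_SIGNATURES

# Example: screen a (fake) upload.
upload = b"example image payload"
print(is_flagged(upload))  # False unless its signature is in the set
```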

According to Horvath, Apple is not removing encryption to scan the images, reassuring the brand’s customers that the privacy of their data on Apple’s servers remains intact. She added: “End to end encryption is critically important to the services we come to rely on…. health data, payment data. Phones are relatively small they get lost and stolen. We need to make sure that if you misplace that device you’re not (exposing that data).” Although Apple didn’t reveal how long it has been scanning users’ images, it certainly isn’t doing so in secrecy, as a disclaimer on Apple’s website states:

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

Our Take:

With each passing day, government agencies put more and more pressure on tech companies to let them scan users’ data for potential threats. While brands like Apple have so far refused to hand over users’ data, it is hard to say how long tech companies can keep doing so. If you are really concerned about keeping your data private, the only way to ensure that is to store it on offline media such as a USB flash drive or a Blu-ray disc.

Do you approve of Apple’s decision to scan users’ images for child sexual abuse material? Let us know in the comments below.

via iPhone Hacks: http://www.iphonehacks.com/2020/01/apple-scans-photos-uploaded-to-icloud-to-check-possible-cases-of-child-sexual-abuse.html
January 8, 2020 at 08:35PM
