
Apple introduces new child protection features: Security researchers concerned

by Milan | August 5, 2021 | News
(Image: a businessman in a suit opening the box of a brand-new black Apple iPhone 12. Photo by weyo / Bigstockphoto)

Apple announced late Thursday evening that, with the launch of iOS 15 and its companion updates, it will begin scanning iCloud Photos in the US for known Child Sexual Abuse Material (CSAM) and will report findings to the National Center for Missing and Exploited Children (NCMEC).

News of the CSAM initiative leaked out even before Apple had presented its plans in detail. As a result, security researchers have begun to raise concerns about how Apple's new image-scanning protocol could be used in the future, as the Financial Times now reports. Apple uses a system called "NeuralHash" to compare photos on a user's iPhone against known CSAM images before they are uploaded to iCloud.
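
To make the matching step concrete, here is a minimal, hypothetical sketch of client-side hash matching. It is not Apple's NeuralHash: NeuralHash is a neural-network-based perceptual hash, and the real protocol blinds the database and uses private set intersection so the device never learns whether a photo matched. A plain cryptographic hash is used here only as a stand-in, and all names and values are invented for illustration.

    import hashlib

    # Hypothetical database of known-image hashes shipped to the device.
    KNOWN_HASHES = {
        hashlib.sha256(b"known-image-bytes-1").hexdigest(),
        hashlib.sha256(b"known-image-bytes-2").hexdigest(),
    }

    def hash_photo(photo_bytes: bytes) -> str:
        # Stand-in hash. A cryptographic hash only matches byte-identical
        # files; a perceptual hash like NeuralHash also tolerates resizing
        # and re-encoding of the same image.
        return hashlib.sha256(photo_bytes).hexdigest()

    def matched_before_upload(photo_bytes: bytes) -> bool:
        return hash_photo(photo_bytes) in KNOWN_HASHES

    print(matched_before_upload(b"known-image-bytes-1"))   # True
    print(matched_before_upload(b"vacation-photo-bytes"))  # False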

Matthew Green on Apple's plan: "A really bad idea"

If there is a match, the photo is uploaded together with a cryptographic safety voucher, and once a certain threshold of matches is reached, a review is triggered to verify that the person actually has CSAM on their devices. For now, Apple uses this technology only to match images against known child abuse material. But security researchers fear it could be adapted in the future to look for other, more troubling kinds of images, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green called CSAM scanning a "really bad idea", because it could later be expanded to scan end-to-end encrypted photos rather than just content uploaded to iCloud. Side note: for children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in Messages, which are end-to-end encrypted.
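
The threshold mechanic can be sketched roughly as follows. This is a simplification under stated assumptions: in the real design the vouchers are protected with threshold secret sharing, so below the threshold they are cryptographically unreadable rather than merely uncounted, and the threshold value below is made up.

    # Hypothetical sketch of per-account voucher accumulation and the
    # review threshold. Names, structure, and values are illustrative only.
    REVIEW_THRESHOLD = 30  # assumed value; Apple did not publish the number at launch

    class Account:
        def __init__(self) -> None:
            self.vouchers: list[str] = []  # one voucher per matched upload

        def upload_photo(self, photo_id: str, matched: bool) -> None:
            if matched:
                # In the real protocol the voucher is encrypted; its contents
                # only become readable once the threshold is crossed.
                self.vouchers.append(photo_id)

        def review_possible(self) -> bool:
            return len(self.vouchers) >= REVIEW_THRESHOLD

    acct = Account()
    for i in range(REVIEW_THRESHOLD):
        acct.upload_photo(f"photo-{i}", matched=True)
    print(acct.review_possible())  # True once the threshold is crossed

The point of the threshold design is that a single accidental match reveals nothing about an account; only an accumulation of matches makes review possible.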

Scanning technology is said to be highly accurate

Green also raised concerns about the hashes Apple plans to use, since there could be "collisions": a harmless file could share a hash with known CSAM and produce a false positive. Apple, for its part, says its scanning technology has an "extremely high level of accuracy" to ensure accounts are not falsely flagged, and reports are reviewed manually before a person's iCloud account is disabled and a report is sent to the NCMEC. Green believes Apple's implementation will push other tech companies to adopt similar techniques.
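
The collision worry is easiest to see with similarity matching: perceptual hashes are typically compared by Hamming distance rather than exact equality, so an unrelated file can land inside the match radius. A toy example with invented 8-bit hashes (real perceptual hashes are much longer, which makes accidental collisions rarer but not impossible):

    # Toy illustration of a perceptual-hash collision; the hash values and
    # the match radius are invented for the example.
    def hamming_distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    known_hash = 0b10110010  # hash of a known flagged image
    harmless   = 0b10110011  # unrelated file whose hash happens to be close

    MATCH_RADIUS = 2
    if hamming_distance(known_hash, harmless) <= MATCH_RADIUS:
        print("false positive: a harmless file is flagged as a match")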

"That will break the dam. Governments will demand it from everyone."

Security researcher Alec Muffett, who formerly worked at Facebook, said Apple's decision to implement this type of image scanning is a "huge and regressive step for individual privacy."

Apple has been using screening technology for some time

As many noted on Twitter, several tech companies already scan images for CSAM. Google, Twitter, Microsoft, Facebook, and others use image-hashing methods to find and report known child abuse images. It is also worth noting that Apple was scanning some content for child abuse images even before the new CSAM initiative was announced. In 2020, Apple's Chief Privacy Officer Jane Horvath confirmed that Apple uses screening technology to look for illegal images and disables accounts if evidence of CSAM is discovered. Apple had also already updated its privacy policy in 2019 to state that it scans uploaded content for "potentially illegal content, including child sexual exploitation material." So yesterday's announcement is not entirely new.

Tags: Apple services, iOS, iOS 15, iPadOS, iPadOS 15, macOS, macOS 12 Monterey, watchOS, watchOS 8