Apple introduces new child protection features: Security researchers concerned

by Milan
August 5, 2021

Apple announced late Thursday evening that, with the launch of iOS 15 and its companion updates, it will begin scanning iCloud Photos in the US for known Child Sexual Abuse Material (CSAM) and report findings to the National Center for Missing and Exploited Children (NCMEC).

Even before Apple presented its plans in detail, news of the CSAM initiative had leaked, and security researchers began to raise concerns about how Apple's new image scanning protocol could be used in the future. The Financial Times reports that Apple uses a system called "NeuralHash" to compare photos on a user's iPhone against known CSAM images before they are uploaded to iCloud.
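Apple has not published NeuralHash's internals, but the matching idea can be sketched with an open-source stand-in. The snippet below uses the `imagehash` library's pHash in place of NeuralHash; the blocklist entry, the distance threshold, and the function name are illustrative assumptions, not Apple's implementation.

```python
# Illustrative sketch only: imagehash's pHash stands in for Apple's
# proprietary NeuralHash; hashes, threshold, and names are hypothetical.
import imagehash
from PIL import Image

# Hypothetical blocklist of perceptual hashes of known images.
# In Apple's system the database comes from NCMEC in blinded form.
KNOWN_HASHES = {imagehash.hex_to_hash("d879f8f8f0f0e0c0")}

MAX_DISTANCE = 4  # Hamming distance: small edits to a photo still match

def matches_known_image(path: str) -> bool:
    """True if the photo's perceptual hash is close to a blocklisted hash."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

The point of a perceptual hash, unlike a cryptographic one, is that resizing or recompressing a photo barely changes its hash value, which is what lets a system of this kind recognize known material even in modified form.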

Matthew Green on Apple's plan: “A really bad idea”

If there is a match, the photo is uploaded together with a cryptographic safety voucher, and once a certain threshold of matches is reached, a review is triggered to verify that the person actually has CSAM on their devices. For now, Apple uses the technology only to scan for and match known child abuse images. But security researchers fear it could be adapted in the future to look for other kinds of images, such as anti-government symbols at protests. In a series of tweets, Johns Hopkins cryptography researcher Matthew Green called the CSAM scanning a "really bad idea," because it could later be expanded to scan end-to-end encrypted photos rather than just content uploaded to iCloud. Side note: for children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in end-to-end encrypted messages.
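To make the threshold mechanic concrete, here is a deliberately simplified model. In Apple's actual design the safety vouchers use threshold secret sharing, so the server can read nothing until the threshold is crossed; the plain counter, the threshold value, and the review hook below are all assumptions for illustration.

```python
# Much-simplified model of the threshold logic. Apple's real scheme uses
# threshold secret sharing; this plain counter only mirrors the control flow.

REVIEW_THRESHOLD = 30  # illustrative; Apple disclosed no number at announcement

def matches_known_image(path: str) -> bool:
    """Stub for the perceptual-hash match from the previous sketch."""
    return path.endswith(".flagged")  # placeholder predicate for the demo

class Account:
    def __init__(self, account_id: str) -> None:
        self.account_id = account_id
        self.vouchers: list[str] = []  # stands in for cryptographic safety vouchers

    def upload_photo(self, path: str) -> None:
        if matches_known_image(path):
            self.vouchers.append(path)
        # Review fires only past the threshold, and per Apple a human
        # checks the material before any account action or NCMEC report.
        if len(self.vouchers) >= REVIEW_THRESHOLD:
            self.trigger_human_review()

    def trigger_human_review(self) -> None:
        print(f"Account {self.account_id} queued for manual review")
```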

Scanning technology is said to be highly accurate

Green also raised concerns about the hashes Apple plans to use, citing the potential for "collisions," where someone sends a harmless file that shares a hash with CSAM, potentially triggering a false positive. Apple, for its part, says its scanning technology has an "extremely high level of accuracy" to ensure accounts are not falsely flagged, and reports are reviewed manually before a person's iCloud account is deactivated and a report is sent to the NCMEC. Green nevertheless believes Apple's implementation will encourage other tech companies to adopt similar techniques.

“That will break the dam. Governments will demand it from everyone.”
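The collision concern is easy to demonstrate with any perceptual hash, since such hashes are designed to ignore small visual differences. The sketch below (again using `imagehash` as a generic stand-in, not NeuralHash) builds two byte-wise different images: their cryptographic hashes differ, while their perceptual hashes coincide, which is the property an adversary would try to exploit to engineer a false match.

```python
# Why collisions matter: perceptual hashes deliberately ignore small
# visual differences, so distinct files can share a hash value.
import hashlib
import imagehash
from PIL import Image

a = Image.linear_gradient("L")           # 256x256 grayscale ramp
b = a.point(lambda p: min(p + 2, 255))   # imperceptibly brighter copy

# Any byte change flips a cryptographic hash completely...
print(hashlib.sha256(a.tobytes()).hexdigest()[:16])
print(hashlib.sha256(b.tobytes()).hexdigest()[:16])

# ...but the perceptual hashes land on or next to the same value.
print(imagehash.average_hash(a) - imagehash.average_hash(b))  # 0 or near 0
```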

Security researcher Alec Muffett, who formerly worked at Facebook, said Apple's decision to implement this type of image scanning is a "huge and regressive step for individual privacy."

Apple has been using screening technology for some time

As many noted on Twitter, several tech companies already scan images for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to find and report known child abuse images. It is also worth noting that Apple was already scanning some content for child abuse images before announcing the new CSAM initiative. In 2020, Apple's Chief Privacy Officer Jane Horvath confirmed that Apple uses screening technology to search for illegal images and deactivates accounts if evidence of CSAM is found. Apple had already updated its privacy policy in 2019 to state that it scans uploaded content for "potentially illegal content, including child sexual exploitation material." In that sense, yesterday's announcement is not entirely new. (Photo by weyo / Bigstockphoto)

Tags: Apple services, iOS, iOS 15, iPadOS, iPadOS 15, macOS, macOS 12 Monterey, watchOS, watchOS 8