Apple will reportedly announce new photo identification features that use hashing algorithms to match photos in users' photo libraries against databases of known child abuse material, such as child pornography.
Apple has removed individual applications from the App Store in the past over concerns about child pornography, but Cupertino is now reportedly close to introducing automatic detection system-wide. Using photo hashing, iPhones could identify "Child Sexual Abuse Material" (CSAM) directly on the device, according to Matthew Green, a cryptographer and professor at the Johns Hopkins Information Security Institute. Green says the system will initially be client-side, meaning all detection takes place on the user's iPhone. However, he thinks this could be the start of a process that leads to monitoring the traffic sent and received by the phone.
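To make the hash-matching idea concrete, here is a minimal sketch in Swift. It is not Apple's actual implementation: the hash set and function name are hypothetical, and it uses an exact SHA-256 digest purely for illustration, whereas a deployed scanning system would use perceptual hashes from a vetted database so that resized or re-encoded copies of a known image still match.

```swift
import Foundation
import CryptoKit

// Hypothetical set of known-image digests (hex strings). In a real system
// this database would be supplied by child-safety organizations and would
// contain perceptual hashes rather than exact cryptographic digests.
let knownHashes: Set<String> = [
    "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"
]

// Computes a SHA-256 digest of a photo's raw bytes and checks it against
// the known-hash set. An exact digest only matches byte-identical files,
// which is why production systems rely on perceptual hashing instead.
func matchesKnownHash(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: check an in-memory photo on the device, before any upload.
let photo = Data("example image bytes".utf8)
print(matchesKnownHash(photo)) // true only if the digest appears in knownHashes
```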
iCloud Photos: Apple has already confirmed hashing techniques
Green warns: "Ultimately, it could be a key component for adding surveillance to encrypted messaging systems. The ability to add scanning systems like this to E2E [end-to-end encrypted] messaging systems has been a big request from law enforcement agencies around the world. This kind of tool can be a boon when it comes to finding child pornography on people's phones. But imagine what it could do in the hands of an authoritarian government?"
As a reminder, Apple has already confirmed that it uses hashing techniques when photos are uploaded to iCloud. The new system, however, would run on the client side, i.e. on the user's device, making it independent of iCloud. When Apple will officially announce it all remains to be seen, of course, and the details will be interesting. (Photo by Unsplash / TheRegisti)