Apple has announced (https://www.apple.com/child-safety/) its intention to begin scanning photographs uploaded to its iCloud online backup and file-sharing service, beginning with iOS 15 and macOS 12. Before being uploaded to Apple's iCloud service, photos will be checked on-device for any images that match those in a database of child sexual abuse material (CSAM) hashes maintained by the National Center for Missing & Exploited Children (NCMEC).
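To illustrate the general idea of matching uploads against a database of known-image hashes, here is a minimal sketch. Note that this is not Apple's actual system: Apple's announced design uses a perceptual "NeuralHash" combined with cryptographic techniques such as private set intersection, whereas this example uses plain exact SHA-256 matching, and the database contents here are invented placeholders.

```python
import hashlib

# Hypothetical set of known-image hashes (illustrative values only).
# A real deployment would hold perceptual hashes supplied by a third
# party, not cryptographic hashes of arbitrary byte strings.
known_hashes = {hashlib.sha256(b"known-flagged-image").hexdigest()}

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_database(b"known-flagged-image"))  # True
print(matches_database(b"vacation-photo"))       # False
```

One consequence of exact cryptographic hashing, visible in this sketch, is that changing a single byte of the image changes the hash entirely; that is precisely why systems like Apple's use perceptual hashes, which are designed to survive resizing and re-encoding.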
Apple's plan to conduct on-device scanning raises serious privacy concerns. As NSA whistleblower Edward Snowden observed on Twitter (https://twitter.com/snowden/status/1423469854347169798), "No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*"
The Electronic Frontier Foundation addresses the privacy concerns associated with Apple's decision in an article titled "Apple's Plan to 'Think Different' About Encryption Opens a Backdoor to Your Private Life" (https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life).
Apple's decision is of great concern to me because I use Apple devices both personally and for running AntiPolygraph.org. This Friday, 13 August, from 2-4 PM Eastern (11 AM-1 PM Pacific), I'll be hosting a special meetup to discuss the privacy implications of Apple's decision to begin on-device scanning, as well as strategies for risk mitigation. All are welcome:
https://jitsi.cyberian-nomad.site/BadApple
I'm glad this startled you as well, George. I saw a real red flag pop up. Also, iPhones are assembled in China. Ask anyone in Hong Kong if they can be trusted. I think congressional hearings should result from this. It's very scary.