I'm glad this startled you as well, George. I saw a real red flag pop up. Also, iPhones are assembled in China; ask anyone in Hong Kong whether they can be trusted. I think this should result in congressional hearings. It's very scary.
Posted by: George W. Maschke | Posted on: Aug 11th, 2021 at 9:56am
Apple has announced its intention to begin scanning photographs uploaded to its iCloud online backup and file-sharing service beginning with iOS 15 and macOS 12. Before being uploaded to Apple's iCloud service, photos will be checked on-device against a database of hashes of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC).
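To make the mechanism concrete, here is a minimal sketch of on-device hash matching. Apple's actual system uses NeuralHash, a perceptual hash, combined with cryptographic techniques such as threshold secret sharing; the SHA-256 stand-in and the function names below are illustrative assumptions, not Apple's implementation.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash function; returns a hex digest.
    (Apple's NeuralHash is designed to match visually similar images,
    which a cryptographic hash like SHA-256 does not do.)"""
    return hashlib.sha256(data).hexdigest()

# Database of hashes of known prohibited images (illustrative values only).
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def should_flag(photo: bytes) -> bool:
    """On-device check before upload: flag the photo if its hash
    matches an entry in the database."""
    return image_hash(photo) in known_hashes

print(should_flag(b"known-image-1"))   # matches the database
print(should_flag(b"vacation-photo"))  # no match
```

The privacy concern follows directly from this structure: the device owner cannot inspect the hash database, so whoever controls its contents controls what the device searches for.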
Apple's plan to conduct on-device scanning raises serious privacy concerns. As NSA whistleblower Edward Snowden observed on Twitter, "No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*"
Apple's decision is of great concern to me, because I use Apple devices both personally and for running AntiPolygraph.org. This Friday, 13 August from 2-4 PM Eastern (11 AM-1 PM Pacific) I'll be hosting a special meetup to discuss the privacy implications of Apple's decision to begin device scanning as well as strategies for risk mitigation. All are welcome: