Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool, called "neuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will only flag images that are already in the center's database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn't "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement. "Researchers have been able to do this pretty easily," he said of the ability to trick such systems.
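To make the "fingerprint" idea and Green's collision concern concrete, here is a deliberately simple sketch. This is not Apple's NeuralHash (which is derived from a neural network); it is a toy "average hash," with all names and thresholds invented for illustration, showing both how a re-encoded copy of a known image still matches and how a visually different image can be crafted to share a fingerprint.

```python
# Toy sketch of fingerprint matching -- NOT Apple's NeuralHash, just a
# simple "average hash" used to illustrate the general idea.

def average_hash(pixels):
    """Fingerprint a tiny grayscale image (rows of 0-255 values):
    one bit per pixel, set when the pixel exceeds mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count of differing fingerprint bits."""
    return sum(x != y for x, y in zip(a, b))

def flagged(img, db_hashes, threshold=1):
    """True if img's fingerprint is within `threshold` bits of any
    known fingerprint -- the scanner compares fingerprints, never
    the pixels themselves."""
    h = average_hash(img)
    return any(hamming(h, known) <= threshold for known in db_hashes)

known   = [[10, 200], [220, 30]]   # stands in for a database image
resaved = [[14, 205], [224, 33]]   # same image, slightly re-encoded
benign  = [[200, 10], [30, 220]]   # unrelated image

db = [average_hash(known)]
print(flagged(resaved, db))  # True: small perturbations survive hashing
print(flagged(benign, db))   # False: unrelated pattern doesn't match

# Collision: a visually different image with the same bright/dark
# layout yields an identical fingerprint -- the weakness Green
# describes, where a crafted innocuous image triggers a match.
crafted = [[0, 255], [255, 0]]
print(average_hash(crafted) == average_hash(known))  # True
```

Real perceptual hashes are far harder to collide by accident than this toy, but the same structural weakness (many images map to one fingerprint) is what adversarial-collision research exploits.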
There was a story a while back about the humans who review Facebook content that is reported for abuse, child porn, violence, and so forth. They burn out very quickly and usually seek counseling after a period of looking at the worst their fellow humans can dish up.
Considering Apple’s reliance on China to manufacture the vast majority of what it sells, I consider it inevitable that Apple will spy and censor more and more for the Chinese Communist Party. The only hope is that maybe, just maybe, the spying and censorship can be limited to devices and servers located in China.
Originally posted by pianojuggler: There was a story a while back about the humans who review Facebook content that is reported for abuse, child porn, violence, and so forth. They burn out very quickly and usually seek counseling after a period of looking at the worst their fellow humans can dish up.
The Wall Street Journal does an “interview” with Apple software chief Craig Federighi to “clarify” the new features for detecting child pornography on devices (embedded video): https://www.wsj.com/video/seri...87-ACE1-E99CECEFA82C
I put “interview” and “clarify” in quotes because I don’t think the WSJ asks tough enough questions to really sketch out how what Apple is doing can harm consumer privacy. I would much rather have Electronic Frontier Foundation lawyers and engineers hash it out with Apple’s system architects and engineers.