

Apple plans to scan iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

The tool, which Apple calls "neuralMatch," will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary. But researchers say the tool could be put to other purposes, such as government surveillance of dissidents or protesters.

Matthew Green of Johns Hopkins, a top cryptography researcher, warned that the system could be used to frame innocent people by sending them harmless but malicious images designed to register as matches for child sexual abuse material, fooling Apple's algorithm and alerting law enforcement. "Researchers have been able to do this pretty easily," Green said. "This is a thing that you can do."

Tech companies including Microsoft, Google, Facebook and others have for years been sharing "hash lists" of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images. The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company.
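The "hash list" approach described above is conceptually simple: compute a fingerprint of each image and check it against fingerprints of already-identified abuse material. The sketch below is a minimal illustration of that idea, not Apple's actual neuralMatch implementation (whose details are not public). It uses an ordinary SHA-256 digest and a placeholder hash list; production systems instead use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Placeholder set of hex digests standing in for a shared hash list of
# known abusive images. Real lists use perceptual hashes, not SHA-256.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes (exact-match only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_image(path: Path) -> bool:
    """Check whether a file's hash appears in the known-image list."""
    return file_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Scan a folder of photos; a real system would queue any match for
    # human review rather than acting on it automatically.
    for image in Path("photos").glob("*.jpg"):
        if is_known_image(image):
            print(f"match: {image} (would be escalated for human review)")
```

The trade-off here is the one Green points to: an exact cryptographic hash misses any re-encoded copy of a known image, while a perceptual hash that tolerates small changes can, in principle, be tricked by a harmless image deliberately crafted to collide with an entry on the list.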
