EFF Joins Global Coalition Asking Apple CEO Tim Cook to Stop Phone-Scanning
Apple’s Plan to Scan Photos in Messages Turns Young People Into Privacy Pawns
25,000 EFF Supporters Have Told Apple Not To Scan Their Phones
Delays Aren’t Good Enough—Apple Must Abandon Its Surveillance Plans
The latest news from Apple, that the company will open up a backdoor in its efforts to combat child sexual abuse material (CSAM), has us rightly concerned about the privacy impacts of such a decision.
But even in the U.S., no company is going to satisfy everyone when it comes to defining, via an algorithm, what photos are sexually explicit. Are breast cancer awareness images sexually explicit? Facebook has said so in the past. Are shirtless photos of trans men who’ve had top surgery sexually explicit? Instagram isn’t sure. Is a photo documenting sexual or physical violence or abuse sexually explicit? In cases like these, the answers aren’t clear, and Apple wading into the debate, and tattling on children who may share or receive such images, will likely only produce more frustration and more confusion.