Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.
I understand why people want privacy, and it’s legitimate. As an honest citizen I’d want that too.
But as a policeman in a country without a dictator, I’m also really frustrated knowing that a pedophile or any other criminal can escape justice because the encryption is good enough.
It’s always difficult to find the right balance, especially because some governments are corrupt and try to eliminate political threats.
What about people who keep printed porn in paper magazines? I guess it's really frustrating when an apartment's lock and key are too good? We should really outlaw locks and keys that the government can't open at any moment, with no notice. Search warrants should be unnecessary.
Setting aside the whataboutism about much larger problems (the church, for instance), specifically here:
I feel also really frustrated to know that a pedophile or any criminal is able to escape justice because the encryption is good enough
How do you "know" something that's provably false? Pedophiles who evade justice are not doing careless things that require end-to-end encryption to hide, like backing up their porn to iCloud. The problem doesn't start or end with encryption. The policy is entirely about privacy and has nothing to do with protecting children.
Ah, but I'm a citizen who doesn't trust the police or the government even without a dictator. Cops lie all the time here, on camera. Thankfully everyone sees why this is a gross invasion of privacy.
Would you be in favor of mandatory explosive implants in the brain that can only be activated by police? Consider all the crimes that could be stopped dead in their tracks.
I never supported it: it was on-device, and given this is the US, hashes to spot "extremism" could be added, since Apple doesn't know what the hashes represent.
They are not cryptographic hashes. They are "perceptual" hashes or "fuzzy" hashes; they're basically a low-resolution copy of the original image. It's trivial for an attacker to maliciously send innocent-seeming images that are a hash collision. This is, by the way, a feature, not a bug: perceptual hashes are designed to match near-copies, not to perform an exact match.
There are plenty of free white papers on how perceptual hashes work, and Facebook's implementation is even open source.
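To get a feel for the mechanics, here's a toy "average hash" in Python. This is a minimal sketch of the general idea, not Apple's NeuralHash or any production system; the synthetic "images" are just small grayscale grids standing in for downscaled photos.

```python
# Toy average hash: reduce an image to a 64-bit fingerprint that survives
# small changes (re-encoding, brightness shifts), then compare fingerprints
# by Hamming distance. Real systems (NeuralHash, PDQ) are far more robust,
# but the core idea is the same.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255).
    Returns a 64-bit fingerprint: bit is 1 where the pixel exceeds the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A synthetic gradient "image", plus a slightly brightened copy
# (as re-encoding might produce) and an inverted, very different one.
img1 = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img2 = [[min(255, v + 3) for v in row] for row in img1]   # near-duplicate
img3 = [[255 - v for v in row] for row in img1]           # perceptually different

h1, h2, h3 = average_hash(img1), average_hash(img2), average_hash(img3)
print(hamming(h1, h2))  # small distance: treated as "the same image"
print(hamming(h1, h3))  # large distance: treated as different
```

Because the fingerprint only encodes a coarse brightness pattern, any image reproducing that pattern collides, which is exactly why an attacker can craft innocent-looking collision images against a known hash.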
Apple said they tested 100 million perfectly legal images and three collided with a CSAM perceptual hash. When you consider how many photos Apple was proposing to scan (hundreds of trillions), that rate means millions of false positives would have occurred even if nobody maliciously abused the system.
And because of all that, Apple was planning to do human reviews of every flagged photo. They would, therefore, have seen every match (and every false positive). It couldn't have been hidden from Apple.
Some of this outrage will come from people who want to protect children, children who had monsters do a terrible thing to them and whose situation was made worse when the material was uploaded to the cloud, where it is easier to share. However, these people aren't seeing the bigger implications. I don't think many of the people who are against CSAM scanning are against protecting children or preventing the very thing it's designed to prevent, myself included. What people are against is the scanning of material on your phone, which is what Apple proposed. People don't want pictures scanned on their phones, even if it happens only as those photos are uploaded to the cloud. Several companies were already doing the scanning after the content reached the cloud, and many of the people against the on-device scanning were in favor of that. Apple, which is not in favor of scanning your cloud data, declined to do even that, which I think is admirable.
The fact of the matter is that scanning data for any purpose is at odds with the protection of your privacy. I, for one, am in favor of privacy protection. And although at times it may seem like people are against things like protecting children, the fact is we're actually in favor of protection for everyone.
Scanning everyone's photos is a clear invasion of user privacy.
Not scanning everyone's photos means people retain privacy, and bad actors may then have content that we, as a society, agree they should not have.
These two things are at odds, so any solution is a compromise (or at least, a choice of one thing over another), and either will always be controversial. It's not just photo scanning that falls into this, but also things like VPN usage -- really, virtually anything that lets users retain privacy could also be used for nefarious purposes.
Personally, I don't want to live in a world where everyone's photos are scanned, because I am vehemently opposed to that level of surveillance and believe it would lead to profit motives (e.g. better ad targeting). I do hope there is another way to curb CSAM content, but ultimately I don't see mass surveillance as viable.