This case reminds me a bit of the big stink over the Daraprim price hikes and Martin Shkreli being dumb enough to be the public face of it. That worked out pretty badly for him, but also badly for everyone else, because now we had a guy to blame for it instead of asking the harder questions, like "why aren't drug prices regulated?" Utilities must go to the public utility commission to raise rates, so why is the same not done for drugs? The federal government stockpiles oil in case of shortages, so why is it not responsible for ensuring that drug production continues?
The maddening thing about the Apple CSAM scanning controversy is that I still have no idea which legislation Apple is complying with, and it's difficult to find any reporting on that aspect. US? UK? EU? Clearly there is/are some government organization(s) with teeth long enough that Apple is not willing to go to court again, as was the case with the San Bernardino shooter. Point is, whether or not Apple has been dishonest about this whole thing, they are still just a distraction.
There's a grand bargain where tech companies are granted immunity from user-generated content in exchange for moderating that content. Apple's move is to try to maintain that bargain while moving to E2E encryption.
> Apple's move is to try and maintain that bargain while moving to E2E encryption.
It could be. It makes sense as a precursor to it. However, unless I've missed something, at no point in the entire debacle did Apple say "Yes, we're going to be moving iCloud to E2EE, and this allows us to do it. If you opt into this, the gains are that you can enable full encryption for everything else, and we literally can't see it on the servers."
Apple, to the best of my knowledge, never said that, nor even implied it. It was, "We're adding this to what we can do now, take it."
> Apple, to the best of my knowledge, never said that, nor even implied it. It was, "We're adding this to what we can do now, take it."
Yup. There was a ton of speculation about Apple's motives - whether legislative or related to iCloud E2E encryption - but AFAIK Apple never confirmed anything outside of "we're doing this to protect children". I think if there was some other motive, Apple should have communicated it better.
It's interesting how Apple's normal ability to set the tone and framing of new features was undermined in this case by a leak. I wonder if Apple would have been more successful at marketing this if they were able to frame the debate themselves.
The "slightly paranoid and cynical" hypothesis I have is that the reason the whole set of docs looked like something pushed out at 2AM, and that the "Clarification" documents looked like they were written by people operating on a week of no sleep, is because they were. Apple wasn't planning to really bother documenting the "feature," and someone found something sufficiently objectionable to leak it. Maybe someone wanted to turn it on early or something, or the hash database had some "non-CP data" in it, or... I don't know, and probably will never know.
But they decided that releasing what they had was better than letting the leaks run wild - just, it turns out, the actual system was really that bad. The "clarifications" were a wild run of "Well, OK, how can we answer this objection? And that one? Oh, crap, this one too!" - which is about how the later documents read. "We just came up with this idea that might address your objection!"
I'd love to hear the inside story of it at some point, but accept I likely won't.
However, it's been enough to drive me off Apple products. I can only hope enough other people have done so as well to hurt their profits, though it's doubtful. :(
But why? The way Microsoft, Google, and Facebook do it is to run CSAM scans server-side on everything you upload. Apple could have done that. They could have done it without announcing anything. They could also have put it on-device as a background service that scanned every photo you took or saved, or every website you visited.
Instead they put a non-trivial amount of effort into designing a much more complex system that doesn't require them to be able to see everything you upload, one that has input from multiple countries to guard against a single organization adding meme images to it, one that has an adjustable threshold as some protection against false positives, one that doesn't scan everything on your device even though it could, etc. The actual system was not "really that bad," as you say; considering what it was trying to do, it was a reasonably good attempt at doing that. (Not flawless.)
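To make the multi-jurisdiction database and the match threshold concrete, here's a toy sketch. This is not Apple's actual protocol (the real system used a perceptual NeuralHash plus private set intersection and threshold secret sharing, so the server learns nothing below the threshold); an exact SHA-256 hash and all the names and data below are made up for illustration:

```python
import hashlib

THRESHOLD = 30  # Apple's documents cited roughly 30 matches before human review

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash (NeuralHash); unlike the real thing,
    # this only matches byte-identical images.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash lists from two jurisdictions; only hashes present in
# BOTH count, so no single organization can slip in meme images alone.
db_a = {image_hash(b"bad-%d" % i) for i in range(200)}
db_b = {image_hash(b"bad-%d" % i) for i in range(100, 300)}
known_db = db_a & db_b  # intersection: bad-100 .. bad-199

def should_report(uploads) -> bool:
    # An account is only flagged once matches cross the threshold,
    # so a handful of false positives triggers nothing.
    matches = sum(1 for u in uploads if image_hash(u) in known_db)
    return matches >= THRESHOLD

assert not should_report([b"bad-%d" % i for i in range(100, 120)])  # 20 < 30
assert should_report([b"bad-%d" % i for i in range(100, 200)])      # 100 >= 30
```

The point of the sketch is just the shape of the design: per-image matching against an intersected list, with action gated on an aggregate threshold rather than on any single hit.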
Apple already send a notification of every program you run on macOS back to Apple HQ, and sync your browser bookmarks and passwords through their servers. Google know everything you search for and which results you click, and can track you across sites using Google Ads. Microsoft try to get you to send all your browsing traffic to Microsoft, and report on executables picked up by Windows Defender. Compared to all of these, Apple's scanner design was tackling a more difficult topic in a more privacy-protecting way.

I wouldn't be much surprised if Apple push their scanner to macOS as well as iOS, still only for images uploaded to iCloud and nothing more. I would be quite surprised if Microsoft pushed a similar thing to Windows 11 and restricted it to scanning only OneDrive uploads; I doubt the money saved by running the scans on-device instead of in-datacenter would alone be worth the engineering effort of moving such a system from datacenter to device, which is what Microsoft would have to do, compared to Apple apparently starting from scratch. That is, if it appeared in Windows 11, I'd expect the reason to be that it's a whole-device scanner. I wouldn't be dreadfully surprised if such a system appeared in the Chrome source code, and thence in Chromebooks, Chromium, and Edge.

The next 10 years will probably include various pushes from governments and companies toward more invasive tech; getting in early with a design that is somewhat privacy-protecting and setting that as the expectation is not a terrible idea - especially if you assume they don't want to do nothing, or are not allowed to keep doing nothing.
"We support E2E encryption and can't unlock your device"... except we put software on your device that scans for anything our algorithm deems illegal and then we route that data to the appropriate authorities.
There is no grand bargain - the bargain is that they can moderate and still be immune from liability for user-generated content. Without it, the default options are: don't moderate and be immune, or moderate and be liable.
Given that the CSAM system as documented catches way less cases than server-side scanning, in all likelihood Apple can't implement E2E - no government would let Apple use such an inefficient system by itself. The bargain probably requires invasive client scanning and server side scanning, or at least way more invasive client scanning to catch all the cases the current system can't.
If this were true, the right way to go about it would be to spin out iCloud into a separate company. Being one legal entity is what creates the larger situation where Apple is hosting content that Apple is also able to moderate. Split them up, and Apple Software would be free to fully represent their users' interests with proper encryption, while iCloud would have zero ability to access the plaintext.
That seems like the kind of "hah! Gotcha!" loophole the authorities wouldn't fall for. For one, if they'd be allowed to do full encryption in your split-up scenario, and they wanted to, they could do it now - store the data on Amazon S3, say. If they aren't allowed to now, they wouldn't be allowed to then either.
For two, encrypting on-device means you can't log in to iCloud on the web to see your photos and share them with people. That's not fully representing all users' interests.
For three, some of "their users" are the criminals whom the system is trying to catch; implying all users are the same and want the same thing and Apple wants the same thing as all users is overly simple.
1) That isn't a loophole, but rather the usual state of affairs. Right now I can use rclone or similar piece of Free software with any cloud storage service, and keep the contents of my files private. The cloud service has no access to the plaintext content, and thus can't be construed as having a duty to scan the plaintext. (If they were given notice that a specific blob was CSAM along with an encryption key that proved such, that's a different story)
2) If photos are accessible through iCloud web, then talking about encryption is irrelevant. That's just a standard cloud service which has the ability to scan on their servers. The Apple CSAM issue was remarkable precisely because Apple aimed to do the scanning on users devices before encryption that would make files unavailable through iCloud web.
3) Criminals still have rights. What we're talking about here is the right to computational representation.
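Point 1 is worth spelling out, since it's conceptually what tools like rclone's crypt backend do: encrypt on the client, so the storage provider only ever holds ciphertext. Here's a toy sketch of that flow - the XOR-of-SHA-256 keystream below is for illustration only and is NOT real cryptography (rclone and friends use audited authenticated ciphers), and all names are invented:

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256-in-counter-mode stream. Illustration only, NOT secure:
    # real tools use authenticated encryption (e.g. NaCl secretbox).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_for_upload(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh per file, stored alongside the ciphertext
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt_after_download(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

key = os.urandom(32)  # never leaves the client
blob = encrypt_for_upload(key, b"my private photo")
# 'blob' is all the cloud service ever stores: no key, no plaintext,
# so there is nothing for a server-side scanner to inspect.
assert decrypt_after_download(key, blob) == b"my private photo"
```

Since the provider holds only the blob and never the key, it plainly cannot have a duty to scan content it cannot read - which is the usual state of affairs the parent describes.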
The problems with drugs are entirely government-created: first through the granting of monopolies on drug formulations via patents, and then through FDA licensing that prevents other manufacturers from producing already-approved drugs if theirs aren't exactly the same.
As with COVID, we are seeing that the FDA fails to serve the public and is overly cautious. If we want an FDA, make it voluntary, separate testing for safety from testing for effectiveness, and at the end of the day let each individual do whatever they want with their own body.
We know why drug prices aren't regulated. Pharmaceutical companies contribute millions every year to Congress critters' PACs and election coffers. Only Democrats have tried lukewarm measures to reverse some of the damage. Essentially, pharmaceutical companies think they are unassailable.
>That worked out pretty bad for him but also bad for everyone else, because now we had a guy to blame for it, instead of asking the harder questions like "why aren't drug prices regulated?"
I would say it went very well for the pharma industry.
The average American commits three felonies a day. I don't think he was targeted by the feds for unrelated reasons, even if the actual charges were unrelated.