> So we lost yet another feature which would have increased privacy.
With all due respect, there is one small detail: the search criteria are provided by a third-party organisation (funded by the DOD), with no option of public oversight due to the nature of the "sensitive" data set. This data set would be defined by similar "organisations" on a per-country basis.
In my humble opinion, this "privacy-related solution" is a Stasi wet dream on steroids.
CSAM scanning didn’t allow that; it only included hashes provided by multiple governments.
The backlash was obvious from the outside, but it’s clear someone spent a lot of time building something they felt was reasonable. Even if Apple has presumably just implemented the same kind of scanning server-side in iCloud, at least people became aware that governments have access to such systems.
I'm not sure I can agree that it's clear someone spent a lot of time building something they felt was reasonable.
The entire technical underpinning of this solution relies essentially on neural image embeddings, which are very well known to be susceptible to all sorts of clever attacks. Notice how, within a day or two of the announcement of this whole system and its details, people were already finding lots of 'hash' collisions.
Among the people and places that can train and implement/serve such embeddings, these issues are pretty widely known, which makes it very non-obvious how this happened, IMO. Someone who understood all of these issues seems to have directly ignored them.
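To see why collisions are so easy to manufacture, here is a toy sketch (not Apple's NeuralHash, just a simple average-hash stand-in I'm using for illustration): given a target hash, you can construct a completely different image that produces the same bits. Real attacks against neural embeddings work similarly, just via gradient optimization instead of direct construction.

```python
import numpy as np

def average_hash(img):
    """Toy perceptual hash: one bit per pixel, set iff pixel > image mean."""
    return (img > img.mean()).flatten()

rng = np.random.default_rng(0)
original = rng.random((8, 8))        # stand-in for a "real" grayscale image
target = average_hash(original)      # the hash the attacker wants to match

# Forge a visually unrelated image with the same hash: a pure
# black-and-white pattern whose bright (1.0) pixels land exactly on the
# target's 1-bits. The forgery's mean lies strictly between 0 and 1, so
# every 1.0 pixel hashes to 1 and every 0.0 pixel hashes to 0.
forgery = np.where(target.reshape(8, 8), 1.0, 0.0)

assert np.array_equal(average_hash(forgery), target)  # hashes collide
assert not np.allclose(original, forgery)             # images differ
```

The point of the sketch is that any hash designed to survive benign image edits necessarily has a large preimage space, which is exactly what the people finding NeuralHash collisions exploited.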
The government image hashes are supposed to be secret information; any crypto system is vulnerable if you assume the secrets are suddenly public. I am sure plenty of cryptocurrency people would object to calling a system insecure because anyone can post a transaction with your private key.
More importantly, hash collisions result in manual review by humans at Apple, hardly a major issue. This is also a safety measure protecting users from political images being tagged in these databases.
Some governments liked it so much that they wanted it to be extended: https://9to5mac.com/2021/09/09/csam-scan-encrypted-messages/