> it's Apple who reviews the content not the government.
An unaccountable "Apple employee", who in the US and other countries may well be a law-enforcement officer (LEO) themselves, will see a "visual derivative", i.e. a low-resolution greyscale copy of your content (roughly 50x50px).
There is no mechanism preventing said "employee" from hitting report 100% of the time, and no recourse if they falsely accuse you. The system is ripe for abuse.
>Presumably at this stage is where malicious hashes would be detected and removed from the database.
Collision attacks against NeuralHash have already been demonstrated. I could produce a large number of false positives by modifying legal adult porn so that it collides with hashes in the database. Anyone could then spread those images on adult sites. Apple "employees" who "review" the "visual derivatives" would then, even when acting honestly, refer you for prosecution.
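To make the collision mechanism concrete, here is a toy sketch in Python/NumPy. The "hash" below is just the sign bits of a fixed random linear projection of the image, not NeuralHash itself; the point is only that once the hash function is differentiable (or well approximated), nudging a source image until it lands on a target hash is straightforward. The public NeuralHash collision demos did essentially this against the real network.

```python
# Toy perceptual-hash collision sketch. NOT NeuralHash: the "hash" is the
# sign pattern of a fixed random linear projection, chosen so the attack is
# trivially differentiable and the whole demo fits in a few lines.
import numpy as np

rng = np.random.default_rng(0)
SIZE, BITS = 64, 96
W = rng.normal(size=(BITS, SIZE * SIZE))   # stand-in for the hash network

def toy_hash(img):
    """96-bit hash: sign pattern of the projection of the flattened image."""
    return (W @ img.ravel() > 0).astype(np.uint8)

target = rng.random((SIZE, SIZE))   # image whose hash is on the blocklist
source = rng.random((SIZE, SIZE))   # unrelated image we will perturb
goal = toy_hash(target)

img = source.copy()
for _ in range(2000):
    z = W @ img.ravel()
    want = np.where(goal == 1, 1.0, -1.0)   # desired sign for each bit
    violated = (z * want) < 0.5             # bits not yet safely matching
    if not violated.any():
        break
    # Nudge the image along the projection rows whose sign still disagrees.
    step = (want[violated, None] * W[violated]).sum(axis=0)
    img = np.clip(img + 1e-3 * step.reshape(SIZE, SIZE), 0.0, 1.0)
```

After the loop, `toy_hash(img)` equals `goal` while the per-pixel perturbation stays small, i.e. the image still looks like `source` to a human but matches the blocklisted hash. A real attack swaps the linear projection for the actual network and optimizes the same kind of objective with autodiff.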
Yes, but no one wants the finger pointed at themselves. Even if your innocence is eventually proven, someone will go through your files and you will have to deal with the law.
Any recourse needs to exist before this ever reaches the law.