
You should at least be somewhat charitable to the other side of this issue, which is that CSAM is the result of real-life sexual abuse and the market for it causes more. Unless you want a cop stationed every 10 feet, you can't stop the abuse; you can only take away the incentive to do it in the first place, which is the theory behind most laws. The reason most people aren't constantly worried about being murdered isn't because we have super advanced defense systems but because the fear of punishment keeps people from doing it.

I'm aware that the majority of HN wants the solution to be "do nothing" plus some quote about liberty and security, but you would probably change your tune if you or your kids were trafficked, starved, raped, and then had the pictures and videos distributed to fetishists online. CSAM detection is the government's best option to kill the market for it and make it radioactive.

At least listen to the experience of human trafficking survivors before you say it's all a big conspiracy. Those scary signs on buses and trains, in women's bathrooms, and in doctors' offices aren't a joke.



> The reason most people aren't constantly worried about being murdered isn't because we have super advanced defense systems but because the fear of punishment keeps people from doing it.

No, the real reason most people aren't constantly worried about being murdered is that most of the people we encounter aren't murderers.


> most of the people we encounter aren't murderers

Gods, I wish this fallacy had a name; I guess it's a corollary of the Spiders Georg principle. The number of perpetrators is not proportional to the number of victims.

It's not true that 40% of men will sexually assault someone in their lifetime, but it's nonetheless true that 40% of women will be sexually assaulted in their lifetime.


Fine, turn my statement around if you prefer: the reason people aren't afraid of being murdered is that being a victim of murder is rare.

But the statistic you're citing here conflicts with your earlier statement (which is the one I take issue with):

> isn't because we have super advanced defense systems but because the fear of punishment keeps people from doing it.

The "fear of punishment" deterrent is clearly doing absolutely nothing for the 40% of women who have been / will be sexually assaulted.

Fear of punishment does not work when "correct" execution of justice is so rare and arbitrary. That's my point. Police are corrupt. Prosecutors are corrupt. Therefore, there is no justice. Therefore, "fear of justice" doesn't work as a deterrent.

Even if the police were better, if all it takes is a few murdering, raping psychopaths to do all of the damage, how could "fear of punishment" possibly work?

There is, however, fear that the bureaucracy will decide to come after you in a Kafkaesque horror show...


Jesus, in what country? Even taking some very liberal definition of rape, 25% is insanely high. Do you have some source on that?

EDIT: In the US we have a 2000 study quoting 5% and a 2011 study saying "nearly 20%", but their data include attempted rape, which is a pretty important distinction. This is coming from Wikipedia, though.


Changed it to sexual assault because you're right: the study I was going off of included attempted rape, which made the number higher, but it also only counted penetration, which is also kinda dumb. Switching to sexual assault eliminates some ambiguity.


Nobody is saying “do nothing” about all the depraved stuff some people do with respect to human trafficking, etc.

What people are asking is to not be treated as if they are human traffickers by Apple. If Apple deploys CSAM scanning, they are in effect saying, “sorry world, we suspect all of you are trying to traffic humans, so we need to scan your images”.

That’s not how law enforcement should work in a free and open society. You should be suspected of illegal activity first; then law enforcement can move in and do their job.

Once CSAM scanning is deployed, it’s much easier to expand its scope to include other forms of content, and Apple will have a much harder time saying “No”.

I was really hoping Apple would use the backlash against CSAM scanning to exert pressure on whoever is asking them to do this: “Sorry <large gov entity>, we can’t do that without destroying the decade of goodwill we built up with our customers. You’ll have to legislate this for us to do it.” That would have the advantage of at least a public debate.


Nice job making everyone who opposes totalitarian surveillance systems look like they don't care about abuse. This is exactly why children are the perfect political weapon: it makes people accept anything, and the few of us who resist are lumped in with the abusers.


So do detective work, track down the pedos, kick down their doors, shoot/imprison them, and free the children.

Don’t take away the privacy and liberty of hundreds of millions of innocent citizens.

What happened to all the people who visited Epstein’s pedophile island? Any consequences? Nothing so far! But the rest of the population better watch out, Big Brother is going through their photo libraries. It’s all a massive government overreach.


The whole reason these systems are proposed is that "detective work" is ineffective and the harm is ongoing. You gotta at least meet people where they are. Don't you think the level of policing required to actually find people who possess CSAM would also be a massive overreach? Or are you hoping that "just do more" will come to nothing in practice?

Prosecuting members of the ruling class in this country is a whole separate issue, and one that I'm sure we are in total agreement on, sans the "well, if rich people are above the law, why can't everyone be too" take.


> Don't you think the level of policing required to actually find people who possess CSAM would also be a massive overreach?

Obviously. Criminalizing possession of data is dumb. It's essentially a declaration that some numbers are illegal. It's the exact same problem faced by copyright. Any attempt to actually enforce these laws at scale will be a massive overreach to the point of tyranny. They are incompatible with the computing freedom we enjoy today.


I don't buy this argument, because you can make any law sound silly by reducing it to something absurd. Saying that the benefit isn't worth the trade in freedom is totally valid, but the quip about illegal numbers isn't super persuasive.

"Criminalizing possession of nuclear weapons is dumb, it's essentially a declaration that some molecules are illegal."

"Criminalizing hate speech is dumb, it's essentially a declaration that certain combinations of sound frequencies are illegal."

"Criminalization of murder is dumb, it's essentially a declaration that I certain locations where I store my bullets are illegal."


It sounds absurd because it is absurd. Think about the absolute control that would be necessary in order to stop our computers from working with arbitrary data. Obviously this logic cannot be applied to the physical world.


> "detective work" is ineffective

Source?

Detective work actually is effective. The people investigating already know where huge amounts of trafficking and abuse happen: in the CPS system. The problem is they have made it such a quagmire of rules, regulations, and gotchas that only the truly angelic and those looking to abuse children to make a quick buck are willing to get involved. I've got news for you: one of those numbers is greater.


Somewhat disturbing content following!

> The whole reason these systems are proposed is because "detective work" is ineffective and the harm is ongoing.

In Germany, there recently was a news story about two journalists infiltrating and effectively shutting down an insanely large darknet forum for child abuse material.

That story exploded because, evidently, law enforcement could have done a lot better: all the journalists did was crawl for web-hoster links and report the content there, and the hosters immediately followed through; terabytes of abuse material were deleted. The admins of the forum even spoke with the journalists and told them literally no one had cared before, and the whole userbase was caught by surprise.

Not-so-fun facts: IIRC that site had millions of accounts registered; apparently at least 1% of men are attracted to children; it was mostly non-commercial, user-made abuse material, strongly motivated by that community's demand.

The journalists concluded: it's not enough to prosecute individuals and gather evidence. It is absolutely imperative to disrupt those communities too, as those sick people don't just consume abuse material, they very much actively promote the abuse itself in their interactions. It's not a single-hit, lasting solution; it requires constant effort. However, terabytes of abuse material are not moved in an instant, and they are surely not delivered via Tor to literally millions of people. Apparently it takes about two people who know Python to take down a platform like that. (The argument about destroying evidence is moot, as you can easily do both, within law enforcement's capabilities.)
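The article and podcast don't spell out the journalists' tooling, so purely as an illustration of why "about two people who know Python" is plausible, here is a minimal, hypothetical sketch of the crawl-and-report idea: fetch known forum pages, collect the outbound file-hoster links, and group them by hosting domain so each host's abuse desk can receive one consolidated report. Everything here (function names, the stdlib-only approach) is an assumption for illustration; a real effort against sites behind Tor and logins would need far more.

    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.parse import urlparse
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collects absolute http(s) hrefs found on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value and value.startswith("http"):
                        self.links.append(value)

    def links_by_host(page_urls):
        """Fetch each page and group its outbound links by hosting domain."""
        grouped = defaultdict(set)
        for url in page_urls:
            parser = LinkCollector()
            with urlopen(url) as resp:
                parser.feed(resp.read().decode("utf-8", errors="replace"))
            for link in parser.links:
                grouped[urlparse(link).netloc].add(link)
        return grouped  # e.g. one consolidated abuse report per hosting domain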

German podcast, interview with one of the journalists; articles in shownotes: https://lagedernation.org/podcast/ldn269-bka-laesst-bilder-v...

TL;DR: Before everyone's privacy and the democratic distribution of power get compromised... maybe let's make sure law enforcement is actually doing its job first.


You are making it sound all philosophical when the HN crowd is already philosophically behind finding an appropriate solution to the CP problem. The "problem" is not the problem here; the currently proposed solution is. See it critically and not emotionally.


See, this is always how online discussions go.

* "CSAM is -- nothing more -- than a way for governments to establish a totalitarian rule!"

* "Oh come on, that's really uncharitable to the people implementing CSAM detection, at least present the argument for the other side in the way they view the issue."

[loud screeching of goalposts...]

* "Of course we acknowledge the problem of sexual abuse imagery, it's just that we don't like the CSAM detection as a solution... [aaand have no ideas for an effective privacy preserving alternative, and want nothing done in the interim while we search for the unicorn]."

There comes a point where, if you don't want to implement the best idea we've got to address the problem and don't have any other ideas, it gets harder to believe you really care about the problem.


> There comes a point where, if you don't want to implement the best idea we've got to address the problem

The assumption here is "something must be done". The fact is that liberty and safety are intrinsically at odds. If we're going to make progress, the question we have to face is how many rescued children are worth how many units of liberty. It's distasteful to present the issue so bluntly, but it's the implicit calculation we do in other cases, e.g. preventing child deaths in car accidents. We all implicitly agree that some amount of death and disfigurement of children is worth the economic and social benefits of cars. Similarly, how much liberty should we give up here?


> "The fact is that liberty and safety are intrinsically at odds."

If you own an Apple iPhone, you have photos on it, you enter into a contract with Apple Computer to use their cloud service, and you upload photos to the cloud service for them to host on their servers, do you have the "liberty" to forbid them from stopping you abusing the service in egregious ways? This isn't a matter of freedom and liberty while you can still refuse to use iCloud and refuse to use iPhones. By comparison, in your example you have much less ability to opt out of travelling by road, using products driven to you on roads, and working with people who travelled by road to get to you.

> "it's the implicit calculation we do in other cases, e.g. preventing child deaths in car accidents. We all implicitly agree that some amount of death and disfigurement of children is worth the economic and social benefits of cars"

We don't all agree in the same way; the amount of agreement is continually changing. Where roads are dangerous, authorities lower speed limits and add traffic calming measures and citizens agitate for cycle lanes and pedestrian zones. Where vehicle exhaust makes for poor air quality, people demand cars must have catalytic converters and be subject to emissions control regulations. When pedestrians die in crashes, car manufacturers add external crumple zones. Around schools, people are employed to stop traffic and help children cross roads safely. We pay educators to teach children about road safety and pay for campaigns aimed at educating drivers about pedestrian awareness, drunk driving, distracted driving, the risks of speeding, using fairly graphic imagery of children being hit and killed to drive home the point. We mandate that older cars have regular checks to make sure they're still road worthy and still have functioning brakes and tyre tread and working lights for safety reasons.

I'm often arguing against cars and the harm they do to humans, but it's not the "either you ban cars or you hate children" boolean you're suggesting it is, where everyone who benefits from cars is happy to sacrifice children for those benefits, thinks the current amount of sacrifice is just fine, and wouldn't change it if they could.


You can turn off iCloud. CSAM scanning cannot be turned off or even inspected. Only evil is done in the dark.


Of course, but at least call a spade a spade: the units of liberty you're willing to trade to solve the problem are basically the metric of how much you care about it.

It's a fine position to take that the harm from that much invasion of privacy and government involvement in our private lives isn't worth it, but it means precisely that you care about the problem less than someone who is willing to make that trade.


I don't think that follows. One can maximally prioritize children while also believing that all children are better served by a society that protects liberty over immediate safety. How you weigh these issues turns on how you weigh the N-th order effects. It's probably not too controversial to say that eliminating all cars would harm children more and thus children as a whole benefit more from cars than their elimination. But it would be disingenuous to characterize this calculation as caring more about economic benefit than children.


You're conflating viewing CSAM with the physical act of exploitation, through some dubious reference to "incentives" and "the market". I don't think people who abuse children are doing so because it's economically lucrative, but rather because they themselves are dangerously sick in the head. But maybe you know more about this scene than I do. (see how that works?)

Once abuse has occurred and has been "documented", completely preventing the spread of that CSAM requires totalitarian systems of policing communications, which are what is being argued against here. Invoking the original abuse as if the topic at hand would have had some bearing on it is an emotional appeal, not a logical argument.


And the defense of totalitarianism begins. Sure, strangers 1,000 miles away are definitely going to look out for you better than you would yourself.

> Those scary signs on buses... aren't a joke

They actually are. If you paid attention to Epstein, the people doing this have access to massive resources, and you don't actually see it happening in public for the most part. It's only very small, independent traffickers, and no one reports them because they themselves are poor and don't have the time to deal with it on their hour-long bus trip to a minimum-wage job. In fact, if you read about it, most of the people doing it are related to the victims, and if you report them there is little to be done about it because of that.


> The reason most people aren't constantly worried about being murdered isn't because we have super advanced defense systems but because the fear of punishment keeps people from doing it.

In this case, CSAM detection is the "advanced" defense system, not the fear of punishment.


Kinda? Detecting CSAM doesn't stop the victim from getting abused in the first place, which is the goal. The point is to increase the risk of getting caught and punished, and to make the photos worthless.



