> Close to 2/3 Americans also believe in magic so I'm not sure what these studies are supposed to tell us.
I think you're missing the point, as are many other comments on this post saying effectively, "These people don't even understand how AI works, so they can't make good predictions!"
It's true that most people can't make accurate predictions about AI, but this study is interesting because it captures people's current opinions, not future facts.
Right now, people are already distrustful of AI. This means that if you want people to adopt it, you need to persuade them otherwise. So far, most people's interactions with AI are limited to cheesy fake internet videos, deceptive memes, and the risk of shrinking labor demand.
In their short tenure in the public sphere, large language models have contributed nothing positive, except for (a) senior coders, who can offload part of their job to Claude, and (b) managers, who can cut their workforce.
Yet shrinking labor demand is a primary goal of AI. The problem is that, given how the dominant economic system is structured, reducing that demand increasingly leads to societal collapse.
Why would people hold AI in high esteem?