Before you ask: ChatGPT did not write this article. I will admit that I did ask the AI chatbot to write an 800-word article in FiveThirtyEight’s style about current public opinion on AI chatbots like ChatGPT, but our collaboration disintegrated amid editorial differences. (If you’re curious what ChatGPT had to say, though, you can read its take at the bottom of the page.)
You should only abandon FiveThirtyEight for chatbot journalism if you don’t mind some made-up data. I was unable to find the 2021 Pew survey that ChatGPT was referencing, and when I reached out, Pew’s media team was similarly stumped about where the question came from. (ChatGPT’s tendency to make up facts is a “core challenge” for the technology, according to its chief technology officer.)1 In fact, a 2021 Pew survey on the topic that I did unearth suggested the opposite.

When asked about the increased use of artificial intelligence in daily life, only 18 percent of respondents said they were more excited than concerned, while 37 percent said they were more concerned than excited and 45 percent said they were equally concerned and excited. This is consistent with what recent polls actually show: Americans generally don’t trust AI, particularly in high-stakes situations like medical care, and they fear that AI-powered search engines could be inaccurate and biased.

AI-powered online search, AI-generated recipes and AI-powered roadside assistance topped the list of applications respondents were interested in, although 43 percent were “very” or “somewhat” interested in AI tools for police or criminal justice, a use for AI with considerably higher stakes than search accuracy. Monmouth, meanwhile, asked about six potential applications of artificial intelligence and found that respondents were broadly OK only with letting AI take on risky jobs like coal mining.
