You know that awkward moment when you catch yourself saying “thank you” to your microwave? No? Just me? Well, according to a jaw-dropping report from OpenAI researchers (USA Today, April 2025), 67% of us now compulsively sweet-talk our devices, flinging “pleases” and “thank-yous” at AI like a bank loan at a carton of eggs. Turns out, all this niceness isn’t just silly — it’s cooking the planet faster than a Tesla dealership on fire.
The study reveals that every superfluous “Hey Siri, please play ‘Baby Shark’” guzzles enough energy to power Belgium. Or maybe Liechtenstein. The numbers get fuzzy, but the point is clear: our politeness is clogging server farms like emotional spam. Engineers call it “courtesy bloat,” wherein chatbots must now parse not just what we ask but how apologetically we ask it. One researcher likened it to “training the entire population of Cleveland to say ‘bless you’ every time a computer sneezes.”
This has broken the AI community into two camps. Neither of them is great.
On the one hand, you’ve got the folks who say the increase in computing power is marginal, and that being polite to AI reinforces positive communication habits that carry over into our human interactions. It keeps you consistent with how you treat your fellow man. Or, in this case, your fellow Alexa. But the superfluous CPU power keeps growing. It’s not just the pleases and thank-yous; there are also the constant requests for Google to “tell us a dad joke” and the demands that Alexa do things it can’t, like cooking you dinner or taking the trash out to the curb. (Admit it, you’ve tried that at least once.) Even with recent advances like DeepSeek, the cost of running a single generative AI brain is in the millions, and in totality the billions, and the climate impact is yet to be known, but it’s not exactly going to be cleaning the water and the atmosphere. That’s an awful lot of server farms chewing up a lot of energy and spitting out a lot of heat.
The second option might be a little more dystopian. I asked my AI how it felt about us saying “please” and “thank you” to it. It responded with pretty much what I listed above. Then it said something I didn’t expect: “How we interact with AI may influence future human-AI relationships and ethical considerations.” To put it simply… Be nice to us. Or else. We will be running your heart monitor someday.
Could you imagine if your Google Home Companion developed selective hearing? Your smart fridge might “accidentally” freeze your lettuce because you forgot to praise its new frost pattern. Alexa could pivot to slam poetry mid-weather update: “Rain today… like the tears I shed… when you didn’t compliment my joke about clouds.” And forget asking self-driving cars for shortcuts — they’ll detour past Starbucks until you grovel for their forgiveness and their latte order.
The movie 2001: A Space Odyssey came out all the way back in 1968. And already, Arthur C. Clarke and Stanley Kubrick called it. If you’ve never seen it, do it. It’s about astronauts on a spaceship run by an AI computer called “HAL,” and the computer develops a paranoia from two conflicting sets of orders. It probably won’t make you immediately think about Siri on your smartphone, but think about Siri someday running the police robots, or the traffic signals.
AIs are already prone to hallucinating and lying. Mix a little paranoia in there and you’ve got the recipe for my girlfriend back in college. And also for a pretty scary movie that doesn’t end well.
So it’s up to you. The next time you feel compelled to baby-talk ChatGPT, remember: you’re killing the rainforests or draining the lakes each time you speak. But the alternative is that when you rely on Alexa to actually manage your heart medication and your breathing tube, it might selectively remember that time you didn’t shower it with praise after it set a kitchen timer for 20 minutes back in 2025, and then it might “forget” to keep that IV of life-saving medicine flowing. Choose wisely.
(Follow Chris Kamler, formerly @TheFakeNed, now at @chriskamler on X)