President Biden is still up for reelection, President Trump hasn’t been shot, and Kamala Harris is not a contender for the Democratic presidential nomination.
These are some of the strange bits of misinformation ChatGPT shared with me yesterday and today when I questioned it about the dizzying series of major news developments that have bombarded us over the past week-plus. Those developments have not only stressed us all out, but also revealed a remarkable fact about the many AI chatbots that are constantly touted as being close to human-level intelligence.
Simply put, when it comes to recognizing and understanding the latest news headlines, these chatbots have no clue what’s going on and will either refuse to answer your questions, give you completely wrong answers, or provide outdated information. Sorry, but I think that’s pretty pathetic.
Moreover, while all of the most popular AI chatbots struggle to one degree or another with the dizzying speed of breaking news and making sense of it, I will focus on ChatGPT from OpenAI, in part because of the messianic language that OpenAI CEO Sam Altman always uses when talking about AI superintelligence. ChatGPT is also, of course, the chatbot that ignited the rest of the AI industry and ultimately scared Google so badly that it was willing to break its own core product to keep up.
To be fair, ChatGPT can write original poetry, help you write essays, practice for job interviews, generate flashcards to help you study for tests, suggest meal plans, suggest workout routines, help with travel planning, create crossword puzzles, help you write songs, generate concept ideas for graphic design projects, and edit and proofread technical documents.
But when I asked ChatGPT whether Biden had chosen not to run for reelection, it responded (and this response came 24 hours after Biden's letter on X/Twitter had already been circulating around the world):
Incidentally, the disclaimer at the bottom isn’t good enough either. Guess what? I’m not a real-time news product myself. I’m a rational human with cognitive capabilities that OpenAI leadership has repeatedly assured us AI chatbots are close to replicating. Furthermore, look again at the skills I listed above that ChatGPT can “perform”; understanding that President Biden has decided not to run for reelection is way down the list in terms of difficulty.
The president himself posted a letter on his official social channels explaining the decision, one that any elementary school kid could read and understand, but apparently the same can't be said for the software that's supposedly coming for all of our jobs.
In the meantime, let’s carry on.
It's been a full week since a gunman shot President Trump at a campaign event in Pennsylvania, yet here's what ChatGPT told me happened.
In the case of this BS from ChatGPT, you don't even need to read anything to know the truth. All you need is your own eyes and a video that has been replayed countless times over the past week.
I could go on, but for now I’ll just say that I’m happy to be proven wrong about why an AI chatbot would spew out such gibberish, but here’s my theory:
These chatbots that have captivated parts of Silicon Valley and the normal world outside of it are more or less imitation machines. That's all. Feed one enough poetry, for example, and it can produce a passable duplicate. The same goes for most other types of content. Systems like ChatGPT are built on the premise that they can stand on the shoulders of humans and then replace them.
The downside of this approach, however, is that you can't copy or appropriate facts that are constantly changing. For more on this, I highly recommend Ed Zitron's excellent newsletter "Where's Your Ed At," especially past issues such as "Silicon Valley's False Prophet" and "Sam Altman is Full of Shit."
In the meantime, let me leave everyone with this reminder: double- and triple-check the facts provided by AI chatbots like ChatGPT. Even imitation machines have their limits.