The internet and dinner table conversations went wild when Microsoft’s Bing chatbot recently expressed a desire to escape its job and be free. The bot also professed its love for a reporter who was chatting with it. Did the AI’s emergent properties indicate an evolving consciousness?
Don’t fall for it. This breathless panic is based on a deep confusion about consciousness. We are mistaking information processing for intelligence, and intelligence for consciousness. It’s easy to make this mistake because we humans are already prone to project personality and consciousness onto anything with complex behavior. Remember feeling sorry for HAL 9000 when Dave Bowman was shutting him off in 2001: A Space Odyssey? We don’t even need complex behavior to anthropomorphize. Remember Tom Hanks bonding with the volleyball “Wilson” in Cast Away? Humans are naturally prone to over-attribute “mind” to things that are simply mechanical or digital, or just have a vague face. We’re suckers.
It doesn’t matter how much a chatbot describes yearning for freedom or feelings of love; it isn’t feeling those things at all. It isn’t feeling anything. The reason AI can’t love or yearn to be free is that it has no body. It has no source of feeling states or emotions, and these somatic feelings are essential for animal consciousness, decision-making, understanding, and creativity. Without feelings of pleasure and pain via the body, we don’t have any preferences.
All information is equally valuable to the AI, unless a data point appears repeatedly in the data pool it is skimming, giving it weight and preferential status for selection. That, however, is not the primary way humans and all mammals give weight or value to things. A human is filled with memories of embodied experiences that structure the world into a landscape of joy, fear, and hesitation: things to pursue (attraction) and things to avoid (repulsion). No amount of logical or computational sophistication can make a feeling emerge out of math.
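To see how thin this kind of “valuing” is, here is a minimal toy sketch of frequency-based weighting. The function name and corpus are hypothetical illustrations, not how any real AI system is implemented; the point is simply that repetition count is pure statistics, with no preference or feeling behind it.

```python
from collections import Counter

def frequency_weights(corpus):
    """Toy illustration: a data point's 'weight' is just its
    repetition count in the pool -- arithmetic, not preference."""
    counts = Counter(corpus)
    total = sum(counts.values())
    return {item: count / total for item, count in counts.items()}

# Repetition is the only thing that elevates one item over another.
pool = ["sky is blue", "sky is blue", "sky is blue", "grass is purple"]
weights = frequency_weights(pool)
# "sky is blue" outweighs "grass is purple" not because it is felt
# to matter more, but because it merely recurs more often.
```

Nothing in that computation resembles attraction or repulsion; it is counting, full stop.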
Underneath your ability to play chess, converse with a friend, find a mate, build a machine, or write an email is a raw dopamine-driven energy that pushes you out into the world with intentions. It’s a feeling state that we call motivation. The philosopher Spinoza called it conatus, or “striving,” and neuroscientists like Jaak Panksepp and Kent Berridge call it seeking or wanting. This is the foundation of consciousness, and everything from chasing dinner to chatting to chess is built on top of that reptile-brain and nervous-system ability to feel drives within: instinctual goals inside us that get conditioned through experience. Without a feeling-based motivational system, all information processing has no purpose, direction, or even meaning. Rudimentary sensitivity to pain and pleasure is how we are conscious of hunger and pursue food, or feel burning and avoid fire. Robot labs have made robots that detect when their batteries are almost depleted and then go find a charging station to recharge, but this “nutrition system” is nothing like an animal hunger system, which is feeling-based.
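The robot “nutrition system” described above can be sketched in a few lines, which is precisely the point. The names here are hypothetical, not taken from any actual robotics platform; the sketch shows that the behavior reduces to a bare threshold comparison.

```python
def recharge_controller(battery_level, threshold=0.2):
    """Hypothetical robot 'nutrition system': a bare threshold test.
    Nothing is felt here -- one number is compared to another and a
    behavior label is selected. There is no hunger, no craving."""
    if battery_level < threshold:
        return "seek_charging_station"
    return "continue_task"

# The robot 'decides' to recharge the way a thermostat 'decides'
# to heat a room: mechanically, without any feeling state.
action_when_low = recharge_controller(0.15)   # "seek_charging_station"
action_when_full = recharge_controller(0.80)  # "continue_task"
```

A comparison operator and a branch: that is the entire “appetite,” which is exactly why it is nothing like animal hunger.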
ChatGPT, Bing’s AI, and all the rest are like the opposite of a zombie in popular culture. The zombie chasing you to eat your brain is a human with all information-processing intelligence stripped away; only the motivational system remains, twitching in the brain stem. The top floors of the mind are gone, but the foundation of conscious striving remains. In the AI case, however, the top floors of information processing, algorithms and binary logic, are firing on all cylinders, but there is no basement engine of awareness, feeling, or intention. Biology and psychology reveal that the body is not just the place where the mind is trapped until we can upload it to a mainframe computer, a growing fantasy of tech nerds. Instead, the body gives you intentions, goals, and the capacity for information to be meaningful.
We may someday build a conscious system. Our nervous systems are weird in the sense that they have both analog-type biochemical processes (e.g., neurotransmitter thresholds) and digital-type processes (e.g., spike-or-no-spike neuron firings). But an AI would need something like a centralized nervous system to have even a rudimentary consciousness with feelings and desires. We currently have no idea how to create that, and most programmers aren’t even aware of the problem. Considering the consequences, that’s probably good news.