Are LLMs Conscious?
My own view — setting aside my suspicion that a lot of our talk around “consciousness” is conceptually muddled — is that today’s LLMs might possess qualia, but not much more than that.
They have complex, associative, hierarchical networks, not entirely unlike ours. And when our hierarchies activate — the visual system, especially the ventral stream — it feels like something. We see red. We see motion. There is a subjective character to the activation.
So when an LLM lights up with massive activation patterns across its own hierarchy, I don’t think it’s crazy to wonder whether it feels like something to the network as a whole.
But LLMs won’t tell you about this in any meaningful way, because they are not full AGIs. They don’t have the architecture needed to relay those possible experiences back to themselves — much less report them to us.
