I’ve been reading philosopher John Searle lately. He’s probably most famous for his “Chinese Room” thought experiment, which he intends as a demonstration that consciousness is not software running on the hardware of the brain.
I’m not a big fan of the thought experiment… I think there are a half-dozen effective objections to it. However, I’ve been so impressed with his writing in the book I’m currently reading (“The Mystery of Consciousness”) that I’m trying to take a closer look. I don’t think the Chinese Room can be totally salvaged as he intends it, but there is one conclusion he draws from the experiment that I think might be defensible. Paraphrased in my own words, it is:
Embodiment is a necessary condition for consciousness.
Embodiment is the difference between an algorithm in the abstract and that same algorithm in action (for instance, running on a computer). So the algorithm for finding the lowest common denominator is not, by itself, embodied, but each time a student works it out with paper and pencil, that’s an embodied instance.
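To make the distinction concrete (this is my own sketch, not Searle’s): the Python function below, read as text on the page, is the algorithm in the abstract, no more embodied than a flowchart. Any particular run of it on a particular machine is an embodied instance.

```python
from math import gcd

def lowest_common_denominator(denominators):
    """Return the lowest common denominator: the least common
    multiple of all the given denominators."""
    lcd = 1
    for d in denominators:
        # lcm(a, b) = a * b / gcd(a, b), folded across the list
        lcd = lcd * d // gcd(lcd, d)
    return lcd

print(lowest_common_denominator([4, 6, 8]))  # prints 24
```

The text of this function and a student’s pencil-and-paper computation describe the very same procedure; only the second (or a machine executing the first) is an instance of it happening in the world.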
It seems to me that one of Searle’s big objections to consciousness as software is that software is almost by definition un-embodied (or “disembodied”). Software could be fully defined by a static, logical flowchart. Perhaps the workings of a particular brain could be as well, but neither of these charts would ever be conscious.
The “hardware” of the brain is a necessary component for consciousness as we know it, Searle says. In his theory, consciousness “arises from”, or “is an emergent property of”, the physical processes in the brain. Perhaps, along these lines, Searle might allow that consciousness could emerge from a working, particular computer/software combination. But he would insist that the substrate matters: without it, the consciousness is not there.
This is counter to my usual (functionalist) leanings. Or maybe we could say that the embodiment/substrate is necessary, but that any old substrate will do. Maybe my brain simulated on a computer will be just as conscious as me, but the logical flowchart of its workings will not be... maybe.
This all seems to get at the idea that consciousness needs to happen over time. I can’t freeze it. A photo of my face might smile, but it doesn’t feel happy, nor would an exact recording of my neuronal state at one instant have any thoughts. But the moment the representation is allowed to move or develop, for me the question of its being conscious opens right back up again.