Here’s a hypothetical for you.
One day you discover that you are not a human being, but a machine. Your life so far was real, no-one controlled you or programmed you to behave in some specific way; your physical and mental capacities are identical to those of an organic human being. But you were created in a lab.
No-one except you knows about this. Your family, your friends — they all think you are a regular human being like themselves. You could continue to live your life as you have before, and nothing would change. How do you react?
How about a more historical example? Well, I say historical, but it’s more of a myth than a history.
In Greek mythology, Talos (or sometimes Talon) is said to have been made by Hephaestus at the request of Zeus, to protect Europa from people who would want to kidnap her. In some versions of the myth, Talos is instead forged by the inventor Daedalus. Regardless of which is ‘true’, Talos, a bronze automaton fashioned for a very specific purpose, had all the essential qualities of a person. He moved of his own volition. He spoke and could be spoken to, and had his own wishes and desires. Indeed, in the tale of the Argonauts, his desire for immortality allowed the sorceress Medea to outsmart him, and this was the cause of his downfall.
While we have yet to encounter this sort of ethical dilemma in the world, the continuous march of technology has so far led us to machines of greater and greater complexity. How long will it be before the lines between ‘machine’ and ‘man’ start to blur? A machine may (at least theoretically) have all the properties of a man, and act as a man, while driven only by the ingenious plan of its construction and the interaction of its materials according to the principles of nature. What of the reverse? Can a man be seen as a machine? Is there a substantive difference between the two? I’ve got more on that later, but for now, let’s stick to discussing AI as a separate type of person from us humans.
One point which stems from this is that once a true artificial intelligence has been created, the issue of citizenship is inevitably going to come up. If we acknowledge that an A.I. has all the abilities of a human brain, should it not be considered a citizen? Is it not, in the legal sense of the word, a person, and thus a potential citizen?
‘But where do you draw the line?’ some people will object. Will the great apes become citizens? Elephants? Whales? The more intelligent parrot species? ‘It’s crazy,’ they will say. I would remind these people that we live in a society in which a corporation, as abstract an entity as one could imagine, is considered a person for some purposes. So it’s not as though there is no precedent for a non-human being a person. At least an artificial intelligence is an actual thinking being, not just a business arrangement.
To my mind, that’s the main cause of the fear people have of hypothetical future AIs taking over the world or anything like that. What they’re really worried about is that someone might prove, once and for all, that consciousness can arise from matter. And I kind of understand why they find it so terrifying. If we can create a sentient being, where does that leave the soul? Without mystery, how can we see ourselves as anything other than machines? And if we are machines, what hope do we have that death is not the end?
What really scares people is not the artificial intelligence in the computer, but the “natural” intelligence they see in the mirror.
The Illusion of Natural Intelligence
Now, on the topic of that ‘natural intelligence’ by which we define ourselves. Is a human being substantively different from a machine?
The first point I’d raise is that human beings, like machines, are made up of components.
- Our bloodstream is made up of millions of tiny agents which go up and down the highways and byways of our bodies like people in the streets of a city.
- What is a man’s eye but a machine for the brain to look through? We are at the mercy of the seeing-engine, and are powerless unless we tack it on to our own identity, and make it part and parcel of ourselves.
- And surely if a machine is able to reproduce another machine systematically, we may say that it has a reproductive system. How few of the machines are there which have not been produced systematically by other machines?
- Even though this is only possible because of human action, it is not so very different from the reproductive system of plants – it is the insects that make many of the plants reproductive, and would not whole families of plants die out if their fertilisation were not effected by a class of agents utterly foreign to themselves?
- Indeed, each one of ourselves has sprung from minute animalcules whose entity was entirely distinct from our own, and which acted after their kind with no thought or heed of what we might think about it. These little creatures are part of our own reproductive system; then why not we part of that of the machines?
We are misled by considering any complicated machine as a single thing; in truth, it is like a city or society, each member of which was bred truly after its kind. And biological machines, such as humans, are functionally no different, as seen above.
Secondly, there is no substantive difference between natural and artificial intelligence, beyond the fact that natural intelligence is far more complex (at present). After all, what is the human brain but a machine for coming to conclusions? It takes in available information and uses that information to alter the product of its program, just like a computer.
And while some (like John Searle in the Chinese Room argument) may argue that we can comprehend this data and this process while computers cannot, surely that is just a consequence of their present lack of complexity. After all, our brains are incapable of surpassing their construction, just as a computer is, or any other machine for that matter. Just because the brain is far too complicated for us to map out in such a scientific way does not mean we should not work from that basis, especially given the logical coherence of the position.
The Indivisibility of the Natural and Artificial
The comparison is not the only factor to consider here, either. We — that is, the humans and the machines — are even now entirely co-dependent if we wish to sustain our pursuit of reason, which (as I discussed in my last article) is the sole purpose of the individual.
If all the ‘artificial’ machines and tools were to be annihilated at one moment, so that not a knife nor lever nor rag of clothing nor anything whatsoever were left to man but his bare body alone that he was born with (as nature intended), and if all knowledge of mechanical laws were taken from him so that he could make no more machines, and all machine-made products, such as food, were destroyed so that the race of man should be left as it were naked upon a desert island, we should become extinct in six weeks.
A few miserable individuals might linger, but even these in a year or two would become worse than monkeys, having lost the essence of what it means to be human – to be an artificer, the courier between the imagined world of the mind and the real world of matter. Man’s very soul is due to the machines; it is a machine-made thing: he thinks as he thinks, and feels as he feels, through the work that machines have wrought upon him, and their existence is quite as much a sine quâ non for his, as his for theirs.