What rights we may or may not owe artificial intelligence
Avoiding spoilers, the TV series Westworld examines how human users interact with and experience a live-action video game. For an estimated $40,000 a day, one can ride the train into the Old West-themed park; don bespoke, period-appropriate clothing; watch frighteningly racist depictions of robotic native peoples scalp and cannibalize white robots; and murder and rape to one’s heart’s content. The cost of entry, then, becomes a telling detail: for 99 per cent of the world, access to Westworld is simply impossible.
In other words, thin wallets vaccinate the rest of us against “affluenza.” You may remember the case of a teenager given probation for killing four people while driving under the influence; his attorneys argued that his financial privilege had left him unable to grasp the consequences of his actions. Essentially, his money insulated him from empathy.
The best-known idea under the umbrella of “roboethics” is the Turing Test, a philosophical thought experiment proposed by Alan Turing in 1950. In summary, the Turing Test acts as a sieve separating ordinary computational machinery from human-passing artificial intelligence. In the test, a machine and a human each converse by text with a third human, an interrogator who cannot see either of them. If the interrogator cannot reliably tell which conversational partner is the machine, the machine is said to possess intelligence equivalent to, or indistinguishable from, that of a human.
We see this problem surface almost immediately as we follow James Marsden’s character’s arrival in Westworld by train: he runs into Dolores, insinuates he has been here before by telling her, “I told you I’d come back,” enjoys a lovely day of young love, and is finally killed by an actual human. From the beginning, Westworld implies that the entire show is an unreliable narrator, because the only way for the audience to ascertain whether a character is “real” is to watch them die. Either they stay dead and are “real,” or they are shipped off to be cleaned and reset overnight.
“Human” characters in the show routinely ask other individuals in Westworld whether or not they are “real.” One character replies to the question, “Does it really matter?”
We have established, then, that the machines of Westworld do in fact pass the Turing Test. While intelligence does not necessarily imply consciousness, it does signal the possibility. It is feasible (and evident in Westworld) that a machine can be programmed not only for logical intelligence, but also for physical and emotional intelligence. If a machine is intelligent enough to understand suffering, and is engineered to feel and react accordingly, do we then have a responsibility to protect it from the one and provide the other?
Westworld, to me at least, demands that we ask ourselves what debt, if any, we owe our creations.
Omniscience, omnipresence, and omnipotence do not reflect well on humans—at least, if you watch Westworld or open up any of my old Age of Mythology files.
Photo courtesy of Bad Robot Productions.
