Transcendence was a frustrating movie for me to watch. I went in hoping for a genuinely novel look at transhumanism and strong AI, but with an unclear thesis and a slew of standard tropes, I was left watching the same failures of AI-that-will-be that I've seen before.
The movie stars Johnny Depp as Will Caster, Rebecca Hall as Evelyn Caster, and Paul Bettany as Max Waters — all geniuses who develop 1) strong AI that is possibly self-aware, and 2) a system that can supposedly house an uploaded human consciousness. They struggle with the ethics of the creature they've built and its (potentially drastic) influence on society.
Unfortunately, they lapse into the following clichés:
Humans want to make god(s)
A questionable assumption. Making something that's better than you at some things is not the same as seeking to make a god. That's a conclusion born of fear of domination.
It also presupposes that making something stronger or better than you will lead to domination. Things we create need not have violent intent.
Access to the internet is dangerous
They made a big deal about Caster getting access to the internet. Unless they're afraid of him learning misogyny, racism, and how to troll people (as if Caster hadn't known such things existed before), this fear isn't backed up in the movie by anything concrete. It's just an event that they're concerned about.
Keeping an AI away from sources of knowledge on the world seems like a great way to end up with an unbalanced entity — much as what happens with a too-long isolated person.
The formerly-organic entity should behave as they did before
The characters got pretty worried when Caster behaved differently after their migration (if migration it was). "He's not the same," Waters said. "He would never have done this before. He's changed." (paraphrased) Evelyn Caster argued that he was the same man she fell in love with.
But he shouldn't be behaving the same. If someone's form factor changes, it makes perfect sense that their perceptions of the world — and hence how they act in it — would change. His senses changed, his capabilities changed; of course his actions would change. Why live with constraints — or pretend to have abilities — that you no longer have?
Artificial bodies are squicky
A significant turning point in the perceptions of those around Caster was when he began to make artificial bodies. That, suddenly, was a special kind of horrific, which doesn't make any kind of sense. He'd already been augmenting others' bodies, and people weren't nearly as squicked by that.
I wouldn't even call this a trope. It was just arbitrary and nonsensical.
Oh, noes, it knows my biology
There's a scene where Caster analyzes Evelyn's emotional state by reading her biology: her heart rate, hormone levels, temperature, etc. She is not amused, and this sets off her growing discomfort with Caster's development.
Why? Emotion has biological components. This is related to "the formerly-organic entity should behave as they did before" — if the entity has different abilities, why is it less ethically acceptable to use them to deduce a mood that a standard-issue-organic human might deduce by looking at face and posture?
Are you self-aware?
This question was asked twice in the movie, the first time seriously, the second time as a quasi-joking callback.
In neither case did the AI being asked answer it seriously. Merriam-Webster defines self-awareness as:
knowledge and awareness of your own personality or character
Wikipedia defines it as:
the capacity for introspection and the ability to recognize oneself as an individual separate from the environment and other individuals
This can be a valid question to ask a new type of entity that you're curious about. "Do you perceive yourself as an individual? Can you perceive and describe your own personality?" It's a potentially interesting thing to discuss for a while, especially if philosophy floats your boat.
The offhand treatment in the movie, however, makes the question a shorthand for "Are you a strong/real AI?" It lacks nuance, especially when they "mysteriously" leave the question completely unanswered in the case of Caster.
They do ask (or imply... or lead me to ask) some good questions, though, including:
- Would the loss of a few memories in the consciousness upload process completely change 'who' is uploaded?
- If it does become unethical to use certain sensory abilities (as with the biology-emotions link), what are those new boundaries and how do we negotiate them?
- How much malleability do we accept in a person's physicality and personality before we begin to consider them a different person?
Questions that Transcendence won't be answering for us — but I'd love to read or watch some works that treat them better.