And if so, why aren’t they dreaming of electric cows? And what’s with the electricity, anyways?
Okay, so I’m making a lame play on the title of Philip K. Dick’s seminal novel Do Androids Dream of Electric Sheep?, on which the film Blade Runner was (very loosely) based. One of the purest subgenres of science fiction, to me, is the exploration of what it means to be human through the lens of advancing technology, particularly the concept of artificial intelligence. Once we create an entity that can think, what does that mean for our own existential state? What would intellect be like freed from the necessities and constraints of the human body? Would a computer develop emotions? Would a machine born out of simple “on” or “off” instructions be able to figure out our world of “maybe”? Assuming there is such a thing as a soul, would it be considered to have one? Where does programming end and independent thought begin?
That last is a particularly interesting question. Okay, so is the soul question, but in light of today’s concerns over nature versus nurture and brainwashing techniques, it’s tempting to wonder if we aren’t just spending our own wet, meaty lives adding to, altering, following, and/or resisting a set of instructions and parameters. This is the metaphor that lives at the heart of AI science fiction, and it’s a powerful one, most visibly and recently exemplified on film by Ex Machina. And now we have an entire television series that looks forward to exploring the concept in even greater, and potentially more disturbing, depth.
It also involves the trappings of the Old West, so you know, sign me up at that intersection.
If you’ve seen the original Westworld, you may not understand what the big deal is about a remake of a Michael Crichton-scripted tale of technology gone murderously out of control: basically, Jurassic Park with androids. Or since Westworld came first, I guess it’d be fairer to say Jurassic Park is Westworld with dinosaurs. They’re even both set against the backdrop of a theme park. The comparison is not subtle, and neither is either film’s moral or philosophical messaging.
Westworld, the new HBO series, keeps the basic trappings of the premise — a futuristic theme park where extremely realistic androids recreate and populate historical settings for the pleasure of rich tourists (up to and including acts of sex and murder) — but so far looks to be intending to take a much deeper look into the underpinnings of identity, memory, and thought along the lines I mention above.
Interestingly enough, the original movie showed other parts of the theme park, such as one based on Ancient Rome. So far the new series dispenses with this in favor of focusing on the particular setting of its title, and it occurred to me in my musings over this article that there are few better choices you could make than to match the wild, unexplored frontier of AI consciousness with the frontier exemplified by the American Old West. I’ve talked before about how a huge part of the Western genre lies in the shifting lines between civilization and savagery, and now I think: isn’t that something that could also apply to our brains, in that struggle between our higher and lower natures? Throw robots on the verge of the Singularity into that mix, and pardners, that’s a real heady alchemical concoction you’ve got brewing. I’m ready and willing to drink some more down.