Review: Devs vs Westworld – How Would We React to True Determinism?

If free will really existed, no one would have this haircut/beard combo.

Two of this year’s biggest, shiniest, mind-bendiest sci-fi series, Alex Garland’s Devs on Hulu and Jonathan Nolan and Lisa Joy’s Westworld season 3 on HBO, cover nearly identical themes while sharing several plot devices.

In one universe you read this post; in another you watch the video. The result is the same.

Both tell stories of emotionally scarred billionaires with god complexes, who both run seemingly unstoppable tech companies, which both create giant evil supercomputers (though one is a pulsing sphere, the other a glowing cube), and who both use that limitless data-processing power to build machines capable of predicting the future, in order to “fix” what they see as wrong with the world.

And yes, in both we follow defiant young women (though one is technically a robot) who refuse to buy in to the future these algorithms predict and who, with the help of frequently confused male sidekicks, sacrifice themselves to destroy both the machines and their creators.

Where they diverge is in their respective takes on how predicting the future is achieved, and on what doing so might mean for humanity.

Quick critical aside: They also diverge in quality and clarity. 

Though Westworld seems a lot more fun on the surface, what with the futuristic vehicles, gunfights, explosions, and super-robots doing cool martial arts, the show relies so much on surprises and reversals that it’s hard to know what’s ever really going on.

What are these characters really trying to achieve? Are they succeeding or failing? What am I rooting for, exactly? All of which makes Westworld hard to care about as a story, even if as a show it’s very enjoyable to look at.

Devs, on the other hand, takes a moodier, more atmospheric tone I certainly wouldn’t call “fun”. It’s weird and gorgeous and unsettling; very stoic, and largely philosophical.

But despite its galaxy-brain core concept, it tells a clear story, where each character’s actions make basic sense based on their desires at any given time, and untangling its surprises clearly advances our understanding of the larger ideas the show wants to explore.

If you’re only going to watch one, for both aesthetic pleasure and discuss-ability, Devs is the clear winner.

OK, back to the discussion-worthy stuff.

Like the best sci-fi, both shows extrapolate out from real-world ideas. But as I said before, they depict different paths to how we arrive at these dystopian technologies.

In one, our prison is our own creation, in the other, it’s something we discover.

Westworld suggests that if we had enough people’s full behavioral data, we could basically know the course of the rest of their lives. From there, we could optimize society as a whole.

This isn’t too far past some shady experimentation Facebook has done, where they’ve shown “happy” or “angry” posts to different sets of people to measure the results. A little tweak here, a little tweak there, and eventually you get to control.

This is a man-made version of determinism, enabled by AI.

Devs, on the other hand, goes all the way down to the molecular level. This, too, is based on real physics. Essentially, if the entire universe is molecules reacting to one another, that’s no different for our bodies, or even our brains. It’s just one big wind-up toy playing out its course.

This is backed up by neuroscience showing that, *technically*, our bodies begin an action milliseconds before our brain “commands” them to. In fact, the feeling that we’ve made a decision may just be something we evolved to make sense of the world.

So according to Devs, we didn’t build a thing that took away free will. Because of the deterministic nature of the universe, we never had it to begin with. We finally just built a machine powerful enough to prove it — and show us what comes next.

It makes sense, then, that these two versions of determinism lead each show to a different outcome once people discover what these machines can do.

In Westworld, the populace riots against the tech giants imposing control. In Devs, the few characters who fully reckon with living out a pre-determined future gain a Zen-like calm, but also seem hollowed-out and lifeless.

But in both, our heroes are compelled to destroy this technology, even if it means their own end. Because they both see that life with this kind of power in the world may not be livable — whether we stop it from being true, or just decide to live in blissful ignorance of our pre-determined reality.

How would you as an individual, or we as a society, react to a truly, provably deterministic world?

How could we go on living normally once we know free will is an illusion?

If either of these technologies really existed, what, if anything, could be done to harness that power responsibly?

Review: The Golem and the Jinni – How often do you use “That’s just who I am,” as an excuse for your choices?


Disclaimer: no wishes are granted or carpets magically flown.

In Helene Wecker’s The Golem and the Jinni, the two title characters represent different approaches to life. The former is created to serve and obey, the latter is born to roam free and unencumbered. Once separated from their masters, those opposing natures fuel an unlikely friendship and drive much of the book’s character and plot development, as two “people” figure out who they are in the world.

Frequently, after one or the other makes a mistake, causes some trouble, hurts someone, or simply isn’t sure what to do with themselves, they give the excuse, “But that’s my nature, that’s just what I am, and I can’t change that.” However, the bulk of the story involves them doing exactly that. We see them learn to accept responsibility for their actions, to control their natures. In short, they learn how to change and grow beyond what they “just are.”

All of us like to think we have free will (which… maybe not?), and that we’re in control of our choices. But who hasn’t decided not to do something “because it’s just not me,” or made an excuse for their behavior because “that’s just who I am”?

What choices you make, or things you do, have you attributed to your unavoidable, essential nature?

How often do you use that as a justification for your behavior?

What does that reasoning say about you, or about anyone’s ability to control their own life?

If free will technically doesn’t exist, is anything our fault?


Read this post, have a conversation, or don’t. It’s really not up to you anyway.


Want to get really philosophical? How about having the argument to end (or begin) all arguments: Do we really even have free will?

For context, there is a growing body of real neuroscience that says… we kind of don’t. Or at least it would seem that way, given that our bodies seem to act before our brains register the “decision” to do so. And that’s only one piece of the puzzle. This Atlantic article goes into more of the science:

The contemporary scientific image of human behavior is one of neurons firing, causing other neurons to fire, causing our thoughts and deeds, in an unbroken chain that stretches back to our birth and beyond. In principle, we are therefore completely predictable. If we could understand any individual’s brain architecture and chemistry well enough, we could, in theory, predict that individual’s response to any given stimulus with 100 percent accuracy.

Yes, indeed. When asked to take a math test, with cheating made easy, the group primed to see free will as illusory proved more likely to take an illicit peek at the answers. When given an opportunity to steal—to take more money than they were due from an envelope of $1 coins—those whose belief in free will had been undermined pilfered more. On a range of measures, Vohs told me, she and Schooler found that “people who are induced to believe less in free will are more likely to behave immorally.”

…but the article also makes plain that, to a certain degree, the same scientists who are disproving free will are in a way saying, “please do not act as if this truth we’re discovering is actually true.” They know that if we throw the premise of free will out the window, life fundamentally changes, and not necessarily for the better.


If your life is a series of reactions to the world that aren’t really up to you, can you be blamed for doing wrong?


How would thinking of the world this way totally rearrange how we think about people who commit crimes, or who are just jerks? Or about good people who are kind and generous?