Frankenstein, Tech Ethics and The First Science Fiction Novel



In 1818, a young woman called Mary Shelley published the first work of science fiction: Frankenstein. I would argue it is also the earliest examination of technology ethics.

Perhaps every scientist and engineer needs to read it again.

A Reanimated Discussion

Frankenstein is a deceptively simple story of the dangers of creating something without understanding the consequences. The eponymous main character, Victor Frankenstein, struggles to discover the secret of restoring life to dead flesh. He finally does so, only to abandon the man he has brought to life.

The novel describes the reanimated corpse's pursuit of the doctor across Europe - the creature being a surprisingly sympathetic character, who is never named. The monster's hunt is driven partly by revenge for his abandonment, but primarily by a desire to force Frankenstein to answer for his ill-considered act.

Shelley reflects on the ethics and responsibilities of creation, and her work is an excellent example of the power of a novel to explore a complex theme. As a science fiction writer with an interest in tech ethics, I found Frankenstein fascinating and drew on it heavily for my new book Denizen 43.

Unburying the Truth

At the beginning of the tale, our protagonist is young, naive, and obsessed with scientific discovery. He fails to consider the implications of success in his quest and, when faced with the result, he runs away. From then on, the twin themes of Shelley's narrative are the dangers of the blind pursuit of a goal, and the failure to take moral and practical responsibility for one's actions.

In the second part of the book, the young doctor attempts to ignore his mistake and keep it secret. He neither asks for help nor warns his acquaintances of the danger they are in.

As the tale progresses and the bodies pile up, our hero is given an opportunity to atone. He is offered the chance to give up his current life and become a companion to the monster, ending its murderous activities. He rejects the deal and again flees.

It is only once everyone he knows and loves is dead that Frankenstein fully faces up to his part in the tragedy.

So what is Shelley saying? 

I believe Mary Shelley was asking scientists two questions: what technology should you bring into the world and what are your responsibilities for it afterwards?

She was demanding that we think through the implications of what we're creating in advance - we would now call that consequence scanning. If we miss an implication, and we will, she suggests we tell people rather than hush it up, and take responsibility rather than running away. All of this is remarkably good advice. In fact, it reminds me of much modern thinking on safety culture.

Better Safe than Sorry?

In 1818, I suspect Shelley not only wrote the first science fiction novel and the first tech ethics manual; she may also have written the first call for a safety culture, one finally adopted over 150 years later by industries like oil, automotive and aviation (but not yet by tech).

Admittedly, if it weren't for the concept of scientists inventing crazily dangerous things without thinking them through, we'd have no SciFi thrillers and I'd be out of a job. So, it's not all bad. However, I'd prefer my Gothic horror to remain fictional.

If we don't want it to stray into reality, perhaps we in the tech industry should take another look at Mary Shelley's advice on science and creation: think implications through in advance, take personal responsibility for what we make, and speak up when something goes wrong. If we don't, we too may come to regret it.
