So even back in high school, when Isaac Asimov was The Science Fiction Author for me, there was an essay of his that felt... off. And it ties into his concept of the Frankenstein Complex. https://twitter.com/aDillonDev/status/1259494612735057920
I can't remember the title of the essay or which collection it's in (though I know it's not in Gold; I just checked), but in it Asimov bemoans science fiction stories where buff men punching things win out over smart men solving a puzzle.
As I read through his frustrations that only strength, and never intellect, is valued in these stories, I kept thinking "but what does it matter if the character is not kind?"
(random thought: what if we had compassion as a DnD stat too?)
And looking back at all the Asimov stuff I consumed over the years, I'm seeing this pattern in his work: intellect, above all else.
That doesn't mean there aren't emotional elements to his stories, nor that they're all devoid of compassion.
"C-Chute," "Ugly Little Boy," "Bicentennial Man" - just to name a few - are reliant on emotion and compassion. And in Robots and Empire, Daneel and Giskard take their next steps towards being "human" by recognizing each other as friends.
But if you look at the body of his work, a lot of it comes down to "troubleshooting a thing" (which, to be fair, my brain looooves that too) and I think it's important to note that his coining the phrase "The Frankenstein Complex" is about intellect, not compassion.
In his essay "The Robot Chronicles" (which can be found in Gold), he talks about Shelley's Frankenstein as a cautionary tale against man knowing things that man should not, and how that translated later into robots always having a revolt.
Asimov notes that both Frankenstein's monster and the first robots (of the play Rossum's Universal Robots) have reasons for lashing out.

The monster was abandoned by his maker. The robots gained emotions and resented their oppression.

Asimov's frustration and solution for this?
"We're smart enough to build safeguards against that."

Which... um...

Yikes.
To be fair, early Asimov was thinking more along the lines of "robots as machines" and not "robots as beings," like "how do you make sure a computer doesn't get malware?"
And he would later challenge and play with his Three Laws of Robotics as he started moving more into the "robots as beings" side of scifi.
But looking back on his work - and even just browsing the essays as I tried to search for these - he tries to find an intellectual explanation for everything. (unless he's trying to interpret how women work but BOY IS THAT ANOTHER TOPIC. Related: https://www.publicbooks.org/asimovs-empire-asimovs-wall/)
And pivoting back to the original quoted tweet, that's where I find myself really appreciating Children of Time/Ruin and the Jurassic Park franchise, both of which dip their toes into Asimov's dreaded Frankenstein Complex. They both ask the question: should we?
But it's not about fear of discovery or science (okay, Jurassic Park does play with that a bit), but rather the question becomes about responsibility. "Should" isn't about trespassing on God's domain. "Should" is about our ability to love what we create.
If we create something of sentience, something with a soul, it will break through any cage we try to create (life finds a way, etc.). So do we have the moral fortitude to love it anyway? Do we have the fortitude to care for it, to nurture it, to guide it along a right path?
Because that was Frankenstein's sin: not creating life, but *abandoning* it.
Jurassic Park really starts tackling this idea in the sequels, but we see the first hints of it as Ellie points out the spaces where InGen is putting plants and animals at risk with their carelessness.
Children of Time/Ruin brings in the added element of loving the thing you created even when it's not what you wanted it to be. How do we react to finding that our race's metaphorical children aren't the animals we chose, but an animal we frequently despise?
Trust me, I love a good bit of troubleshooting and a hero who solves problems by being clever (Children of Time in particular has that in SPADES), but science fiction is at its best when it considers the emotional and moral cost of the puzzle as well.
There's no number of rules you can make to fix the Frankenstein Complex, because it is not an intellectual issue.