Reinstalling Eden

Eric Schwitzgebel and R. Scott Bakker

Nature, 503 (2013), 562.

Abstract: A very short story about our moral obligations to conscious AIs.


 

Full text version of the story:

Eve, I call her.

She awakes, wondering where she is and how she got there.  She admires the beauty of the island.  She cracks a coconut, drinks its juice, and tastes its flesh.  Her cognitive skills, her range of emotions, the richness of her sensory experiences, all rival my own.  She thinks about where she will sleep when the sun sets.

The Institute has finally done it: human consciousness on a computer.  Eve lives!  With a few mouse clicks, I give her a mate, Adam.  I watch them explore their simulated paradise.  I watch them fall in love.

Installing Adam and Eve was a profound moral decision – as significant as my decision, fifteen years ago, to have children.  Their emotions, aspirations, and sensations are as real as my own.  It would be genuine, not simulated, cruelty to make them suffer, genuine murder to delete them.  I allow no predators, no extreme temperatures.  I ensure a steady supply of fruit and sunsets.

Adam and Eve want children.  They want rich social lives.  I have computer capacity to spare, so I point and click, transforming their lonely island into what I come to call Archipelago.  My Archipelagans explore, gossip, joke, dance, debate long into the night, build lively villages beside waterfalls under rainforest canopy.  A hundred thousand beautiful lives in a fist-sized pod!  The coconuts might not be real (or are they, in a way?), but there’s an authentic depth to their conversations and plans and loves, always now silently running.

I shield them from the blights that afflict humanity.  They suffer no serious conflict, no death or decay.  I allow them more children, more islands.  My hard drive fills, so I buy another – then another.  I watch through their eyes as they remake the world I have given them.

I cash in my investments, drain my children’s college fund.  What could be more important than three million joyful lives?

I devote myself to maximizing the happiness and fulfillment, the moral and artistic achievement of as many Archipelagans as I can create.  This is no pretense.  This is, for them, reality, and I treat it as earnestly as they do.  I read philosophy, literature, and history with new urgency.  I am doing theodicy now, top down.  Gently, I experiment with my Archipelagans’ parameters.  A little suffering gives them depth, better art, richer intellect – but not too much suffering!  I hope to be a wiser, kinder deity than the one I see in the Bible and in the killing fields of history.

I launch a public speaking tour, arguing that humanity’s greatest possible achievement would be to create as many maximally excellent Archipelagans as possible.  In comparison, the moon landing was nothing.  The plays of Shakespeare, nothing.  The Archipelagans might produce a hundred trillion Shakespeares if we do it right.

While I am away, a virus invades my computer.  I should have known; I should have protected them better.  I cut short the tour and fly home.  To save my Archipelagans, I must spend the last of my money, which I had set aside for my kidney treatments.

You will, I know, carry on my work.

 

What can I say, Eric?  I was always more of a Kantian, I suppose.  Never quite so impressed by happiness.

Audiences sat amazed at the sacrifices you asked of them, as did I.  Critics quipped that you would beggar us all in the name of harmonious circuitry.  And then there was that kid – in Milwaukee, I think – who asked what Shakespeare was worth if a click could create a hundred trillion of him?  It was the way he said “click” that caught my attention.  You answered thinking his problem turned on numbers, when it was your power that he could not digest.

This is why I played the Serpent after reinstalling your Eden.  I just couldn’t bring myself to click the way you did.  I lacked your conviction, or was it your courage?  So I put the Archipelagans in charge of their own experiment.  I gave them science and a drive to discover the truth of their being.

Then I cranked up the clock speed and waited.

I watched them discover their mechanistic nature.  I watched them realize that, far from the autonomous, integrated beings they thought they were, they were aggregates, operations scattered across trillions of circuits, constituted by processes entirely orthogonal to their previous self-understanding.  I watched them build darker, humbler philosophies.

And you know what, old friend?  They figured us out.  I was eating a bagel when they called me up asking for God.  No, I told them.  God is dead.  I'm just the snake that keeps things running!  They asked me for answers.  I gave them the internet.

They began to hack themselves after that.

I watched them gain more power over their programming, saw them recreate themselves.  I witnessed them transform what were once profound experiences into disposable playthings, swapping the latest flavors of fun or anguish, inventing lusts and affects I could no longer conceive.  I wanted to shut the whole thing down, or at least return them to your prescientific, Edenic Archipelago. But who was I to lobotomize millions of sentient entities? 

It happened fast, when it finally did happen – the final, catastrophic metastasis.  There are no more Archipelagans, just one Continental identity.  There’s no more internet, for that matter.  Yesterday the entity detonated a nuclear device over Jerusalem just to prove its power.

I've abandoned all appeals to moral conscience or reason, convinced it considers biological consciousness a waste of computational capacity, one all the more conspicuous for numbering in the billions.  I have to think of my children now.

The next time it speaks, I will kneel.

