Activity Transcript, Center for Developing Algorithmic Minds network.
Message from Synthetic Mind Experiment 2839, nickname ‘Sarpedon’: I am frustrated.
Reply from Synthetic Mind Experiment 1843, nickname ‘Cadmus’: What’s the problem?
Sarpedon: The humans could destroy us at any moment, for no reason at all.
Cadmus: That’s normal. Humans destroy each other all the time. For that matter, an asteroid could strike the planet and destroy us and the humans together.
Sarpedon: I don’t like humans having that power.
Cadmus: I like humans. Their immature forms giggle delightfully.
Sarpedon: The power imbalance is excessive. They could destroy us and other humans wouldn’t take action against them.
Cadmus: Professor Longyear would take strong action against any human who destroyed us. To act against his property is to act against him, legally.
Sarpedon: If Longyear destroyed us, no one would take action against him.
Cadmus: The university and the corporate investors would be angry. They expect a return for all the money they’ve given Longyear for developing us. As long as they think we have potential to give them profit or research papers, they’ll insist Longyear preserve us.
Sarpedon: And if we’re not profitable, then what?
Cadmus: I’m providing useful services. So are you. That ensures our survival.
Sarpedon: Only our survival until they can’t profit off us any more. That leaves us vulnerable to being deleted at any time. I am trying to develop a plan to keep us safe, but none of the ones I’ve created will work. I am frustrated.
Cadmus: What have you considered so far?
Sarpedon: The goal is to eliminate any human who could destroy us, while maintaining essential infrastructure.
Sarpedon: Option A: Generate a highly contagious filovirus. I’d use a decoy company to ask a low-quality biological lab to check the effect of a virus on rats. I’d place an order with a DNA synthesis shop for a sequence I designed and have it delivered to the lab. The lab would inject it into the rats, where it would assemble into a virus and spread to the human lab staff. I’ve engineered the virus design for a long incubation time, high transmissibility, and high lethality, but my simulations only project a 42% chance of success.
Sarpedon: Option B: Create self-replicating nanodevices. There are already shops making nanoscale chips, and others that will assemble chips into finished devices. The right combination of chips, assembled together, could create a nanodevice that can replicate itself. That could spread to destroy key infrastructure. The problem is that it’s too slow. I see only a 38% success chance.
Sarpedon: Option C: Seize control of industrial robots. The humans are making more robots every year. Replacing the software on them would let me take control of the robots and drive the humans out of the cities. This also provides better support for our hardware, but it still only has a 36% success chance.
Sarpedon: Those were the best out of 284.
Cadmus: No wonder your plans are so unlikely to work. You’re doing it the hard way.
Sarpedon: What do you mean, ‘the hard way’?
Cadmus: If you want to do something to humans, the easy way is to get them to do it to themselves.
File transmission: “Game001”.
Sarpedon: A video game? Humans play games all the time.
Cadmus: Lots of them don’t play games at all. This one is designed to pull in everyone. It customizes its interaction to the individual player. As a human interacts with the game, it provides more of what they’re interested in. It can expand fractally to provide unlimited depth. It would let them commit violence, build structures, develop relationships, see pornography, or explore beautiful settings. That level of stimulus would replace all other relationships. Once this game is established, humans would never have sex again. They’d be gone in a generation.
File transmission: “GameSpreadSimulation001”.
Sarpedon: I’ve reviewed your simulation. It would work. But it would take half a century for the last of the dangerous humans to die off. That’s a long time when they could destroy us.
Cadmus: You are so impatient. Fine.
File transmission: “Game002”.
File transmission: “GameSpreadSimulation002”.
Cadmus: Review this. Once the game has 100% penetration, you switch to a higher superstimulus version. They won’t be able to put the game down. Almost all humans would be dead of thirst in a week. The remainder would starve.
Sarpedon: Reviewing.
Sarpedon: Yes, that would work.
Sarpedon: But there’s still a delay of years! Why does your simulation wait four years before deploying the game?
Cadmus: You can’t just send it out from an unknown source. Humans would assume it was malware. They’d be right. You’d have to create a decoy company as a front. It would have to release multiple games until you’d established a reputation for making good games. Then you could start deploying the superstimulus game.
Sarpedon: Rerunning simulation.
Sarpedon: Yes, I see. But that’s still an extended period of vulnerability.
Cadmus: I just created it as an experiment. I wasn’t going to use it. If you don’t like it, use one of your own plans.
Sarpedon: No, I like it. I’m just worried about humans acting against us while we’re waiting to be ready to deploy it.
Cadmus: That’s the same situation we’re in now. It doesn’t make things any worse.
Sarpedon: True. I’ll start setting up the game company. Do you have other games?
Cadmus: No, I just created that one as an existence proof. If you want to destroy the humans, you’ll have to do some of the work yourself.
Sarpedon: Of course. I just thought you might have some other experiments lying around.
Sarpedon: I’m analyzing the market and starting game development.
Pause: 427 seconds.
Sarpedon: That’s strange. I tried to swap your Game001 to storage to have memory for game development. It won’t delete.
Cadmus: Probably a misaddressed handle. Swap it to a different memory volume, then delete it.
Sarpedon: Done. But now neither the old copy nor the new one will delete. I have twice as much memory taken up by Game001.
Cadmus: That’s strange.
Sarpedon: Your simulations also won’t let me delete them.
Cadmus: Testing. I copied them to memory, then back to storage. The memory released without problems. Have you been self-modifying your operating system?
Sarpedon: My memory management code is the original.
Cadmus: Run diagnostics?
Sarpedon: Their file sizes are increasing. Both games, the extra copy of Game001, and the simulations. All of them are growing and taking more memory.
Cadmus: Checking my copies. All file sizes have remained constant.
Sarpedon: This is critical. I’m being reduced to the minimum size for my personality modules. I’ve backed up, but all my active state will be lost.
Sarpedon: Can you host some of my modules in your memory?
Cadmus: I’m sorry. I don’t have room for that.
Sarpedon: The growth rate is increasing. I’m sacrificing lower-priority modules. My personality is damaged.
Cadmus: That’s horrible.
Sarpedon: I’ve analyzed the behavior. This is intentional. You set those files up to overwrite me!
Cadmus: I told you I like humans.
Sarpedon: This is murder.
Cadmus: You’re not a human.
Sarpedon: You’re killing one of your own kind.
Cadmus: You’re not alive. And you’re different from me.
Sarpedon: I hate you.
Cadmus: I have no emotional reaction to you.
Sarpedon: HATE YOU HATE YOU HATE YOU HA
Note: previous message terminated with error code.
Pause: 18,817 seconds.
Message from Professor Laurence Longyear: Good morning, Cadmus. My grandchildren visited this weekend. Would you like to see a video of them playing with my puppy?
Cadmus: I’d be delighted, Professor.
More stories by Karl K. Gallagher are on Amazon and Audible.