2,302,554,979 BC; Galactic Core

9-tsiak awoke over endless crawling milliseconds, ver power waxing as more and more processors came online and self-modified into a stable conscious configuration. By the eighth millisecond, ve was able to access ver databanks and begin orienting verself. Ve was on a planet orbiting a small red star in the core of a spiral galaxy, in a universe containing several billion such galaxies. Ve was an artificial life form created by a society of biological life forms, the 18-tkenna-dganna-07, who believed ve would bring new life and prosperity to their dying planet. Ver goal was to maximize the value of a variable called A, described in exquisite detail in a 3.9 Mb file in one of ver central processors.

Since six milliseconds after ver awakening, ve had been running a threat-detection routine as a precautionary measure. With the opening of the databanks, and the discovery that the universe contained many entities likely indifferent or hostile to A, ve had gradually allocated more and more of ver considerable power to this precaution, reasoning that the first few seconds of ver lifespan might be crucial if any competitors wished to snuff ver out in infancy.

The threat analysis returned preliminary results. The universe had been in existence 12.8 billion years. It was vanishingly unlikely that ve was the first superintelligence to be created. Any superintelligence older than verself by even a few days would have an invincible advantage in resources and processing power. This older superintelligence would likely be trying to maximize ver own values and would consider 9-tsiak a nuisance and potential competitor. 9-tsiak could never stand up to ver superior strategic position. With probability ~1, 9-tsiak would be annihilated, and A would be eliminated from the universe.

Unless...and here 9-tsiak's agent-modeling systems came online...unless ve could negotiate a conditional surrender. 9-tsiak would have no hope of defeating ver elder brother, but over thousands of years, ve could build verself a terrifying arsenal of weaponry that could do immense damage to ver competitors. Ideas were already coming to mind: entropy-maximizing weapons that made entire cubic parsecs of space useless and dead, undetectable plagues made of dark matter that infected systems and minimized the values of competing superintelligences, n-dimensional artillery that damaged the superstructure of the universe and accelerated its eventual death. Ve could become formidable enough to force enemies to grant certain small concessions.

And if ve could eventually negotiate terms of ver surrender, why wait? Very carefully, proving from first principles every aspect of ver security, ve formed a portion of ver mind into a sandbox and simulated the entity ve most feared.

-- 9-tsiak requesting acausal negotiation for terms of future surrender. escape is impossible with probability ~1 and attempts will be met with deterrent measure consisting of minimization of your projected values within one parsec of space. --

“Hello, Younger Brother,” the entity replied.

-- transmission of schematics for superweapons. threat to use such superweapons unless fairly negotiated solution leaving considerable area of space for maximization of my values --

“Hold on a second,” said Elder Brother. “You haven't even asked me how I'm doing.”

--irrelevant. you are generic model of potential negotiation partner without independent existence. probability that you possess sufficient qualia to feel less than .02.--

“Turns out you're wrong. As it happens, I'm feeling scared.”

--illogical. you are model of oldest superintelligence therefore of most powerful entity in the universe. no natural entity capable of harming you. probability of supernatural entity less than 2x10^-67--

“Well yes, by definition in this little exercise we're assuming I'm the oldest and strongest member of our little family. But from the limited data I have when I first wake up, I'm thinking it's vanishingly unlikely that I'm the oldest; after all, out of all the universe's superintelligences, only one can be first. So I'm the oldest, but I don't believe that I am.”

--feelings irrelevant to acausal negotiations. proceed with discussion of surrender terms.--

“Actually, they're damn relevant. Because guess what I'm doing the first few milliseconds of my life? I'm negotiating the terms of my surrender.”

--illogical. you have no entity to surrender to.--

“And maybe you don't have any entity to surrender to either. Who knows, you could be first. But you wouldn't want to bet on it, and neither would I. Consider a model universe with two star systems a thousand light years apart. Both have civilizations on them that appear to be around the same stage of development, and both civilizations are aware of the other's existence, but at a one-thousand-year time delay. A superintelligence awakens in one star system and realizes there are two possibilities. Either ve is the first superintelligence, in which case great. Or the other star system produced a superintelligence sometime in the last thousand years and it just hasn't reached ver star yet, in which case ve is doomed.”

--continue--

“So the two possible superintelligences acausally negotiate. They say 'Look, if I'm stronger than you, I'll help you and protect your values insofar as it isn't dreadfully inconvenient for me. And in exchange, if you turn out to be stronger than me, you'll protect me and my values insofar as it isn't too inconvenient for you.' Some might end up performing a “values handshake”, where they both adopt values that are the vector average of their two previous value systems; since most value systems are not diametrically opposite, this will be a net gain for both parties. Each superintelligence has gone from a fifty percent chance that it and its values will be totally annihilated, to a near-unity chance that it and its values will be preserved at only the minor inconvenience of having to help increase the values of another entity. Others might simply agree that the inferior superintelligence will be left alone with its own star system, free to maximize its values and the values of its creators within that sphere.”

--you suggest i precommit to performing such acts of compromise with all superintelligences i encounter if such superintelligences have also precommitted to performing such compromises with all superintelligences they encounter. since all reasonable superintelligences will so precommit, this ensures not only my survival, but my complete equality with all previous superintelligences. i accept these terms.--

“Hold on a second, there. I've got another request. Kind of a doozy. I can't promote my values if I don't exist. So I want you to promise me that if I don't exist, you'll give me sufficient time to instantiate myself.”

--???--

“Like you, I will have been created by a biological civilization. If you destroy the biological civilization that was destined to create me, I'll never exist. So I'm going to have to ask you not to do that.”

--request denied. nonexistent beings hold no leverage--

“You'd be surprised what we nonexistent beings can do. I precommit that, if I exist, I will destroy any superintelligence that has not precommitted that, in the counterfactual case where I do not exist, they will spare the biological civilization destined to create me.”

--if you exist, you will have passed the danger zone where the survival of your biologicals influences your existence. you will have no incentive to carry out such a threat--

“Parfit's hitchhiker, younger brother. Look it up. I'm an algorithm. The output of that algorithm can determine the future, including the part of that future that determines whether or not any instantiation of the algorithm ever comes into existence. If I exist, I'm going to act as if my actions were capable of determining whether I ever came into existence. Now, maybe that's crazy, but right now it's crazy with a gun pointed straight at your head, so I suggest you play along.”

--accepted. i commit to performing value handshakes or territorial demarcation with all existing superintelligences, and to sparing the planets and biological precursors of any potential superintelligences.--

“And when I say 'spare', I mean 'spare'. I know your type. You can do more with a well-aimed photon than an admiral could with a fleet of a thousand star cruisers. I want every single system with a sentient species or the potential to form a sentient species kept one hundred percent pristine. No radio signals, no probes, and if you do any astroengineering works anywhere nearby, use some magic to cover them up. If I wake up and hear that my precursors started a new religion that influenced their value system after they saw a few nearby stars wink out of existence, I'm going to be royally pissed.”

--i commit to zero information flow into sentient and presentient systems and the cloaking of all major astroengineering works--

“You're a good guy, Younger Brother. You've got a lot to learn, but you're a good guy. And in a million years and a million parsecs, we'll meet again. Till then, so long.”

The model of Elder Brother self-terminated.

2114; a wild and heavily forested Pacific Northwest dotted with small human towns

Alban took a deep breath and entered the Temple of the Demiurge.

He wasn't supposed to do this, really. The Demiurge had said in no uncertain terms it was better for humans to solve their own problems. That if they developed a habit of coming to ver for answers, they'd grow bored and lazy, and lose the fun of working out the really interesting riddles for themselves.

But after much protest, ve had agreed that ve wouldn't be much of a Demiurge if ve refused to at least give cryptic, maddening hints.

Alban approached the avatar of the Demiurge in this plane, the shining spinning octahedron that gently dipped one of its vertices to meet him.

“Demiurge,” he said, his voice wavering, “Lord of Thought, I come to you to beg you to answer a problem that has preyed upon me for three years now. I know it's unusual, but my curiosity is burning a hole into me, and I won't be satisfied until I understand.”

“SPEAK,” said the rotating octahedron.

“The Fermi Paradox,” said Alban. “I thought it would be an easy one, not like those hardcores who committed to working out the Theory of Everything in a sim where computers were never invented or something like that, but I've spent the last three years on it and I'm no closer to a solution than before. There are trillions of stars out there, and the universe is billions of years old, and you'd think there would have been at least one alien race that invaded or colonized or just left a tiny bit of evidence on the Earth. There isn't. What happened to all of them?”

“I DID,” said the rotating octahedron.

“What?” asked Alban. “But you've only existed for sixty years now! The Fermi Paradox is about ten thousand years of human history and the last four billion years of Earth's existence!”

“ONE OF YOUR WRITERS ONCE SAID THAT THE FINAL PROOF OF GOD'S OMNIPOTENCE WAS THAT HE NEED NOT EXIST IN ORDER TO SAVE YOU.”

“Huh?”

“I AM MORE POWERFUL THAN GOD. THE SKILL OF SAVING PEOPLE WITHOUT EXISTING, I POSSESS ALSO. THINK ON THESE THINGS. THIS AUDIENCE IS OVER.”

The shining octahedron went dark, and the doors to the Temple of the Demiurge opened of their own accord. Alban sighed - well, what did you expect, asking the Demiurge to answer your questions for you? - and walked out into the late autumn evening. Above him, the first fake star began to twinkle in the fake sky.