War and/or Peace (2/8)

The Lord Pilot jumped up, then, his face flushed. “Put up shields. Now. We don’t gain anything by leaving them down. This is madness!”

“No,” said the Ship’s Confessor in professional tones, “not madness.”

The Pilot slammed his fists on the table. “We’re all going to die!”

“They’re not as technologically advanced as us,” Akon said. “Suppose the Babyeaters do decide that we need to be exterminated. Suppose they open fire. Suppose they kill us. Suppose they follow the starline we opened and find the Huygens system. Then what?”

The Master nodded. “Even with surprise on their side… no. They can’t actually wipe out the human species. Not unless they’re a lot smarter than they seem to be, and it looks to me like, on average, they’re actually a bit dumber than us.” The Master glanced at the Xenopsychologist, who waved her hand in a maybe-gesture.

“But if we leave the ship’s shields down,” Akon said, “we preserve whatever chance we have of a peaceful resolution to this.”

“Peace,” said the Lady Sensory, in a peculiar flat tone.

Akon looked at her.

“You want peace with the Babyeaters?”

“Of course—” said Akon, then stopped short.

The Lady Sensory looked around the table. “And the Babyeater children? What about them? Do they want to be eaten?”

“They don’t,” said the Xenopsychologist. “Of course they don’t. They run from their parents when the terrible winnowing comes. The Babyeater children aren’t emotionally mature—I mean they don’t have their adult emotional state yet. Evolution would take care of anyone who wanted to get eaten. And they’re still learning, still making mistakes, so they don’t yet have the instinct to exterminate violators of the group code. It’s a simpler time for them. They play, they explore, they try out new ideas. They’re...” and the Xenopsychologist stopped. “Damn,” she said, and turned her head away from the table, covering her face with her hands. “Excuse me.” Her voice was unsteady. “They’re a lot like human children, really.”

“And if they were human children,” said the Lady Sensory into the silence, “do you think that, just because the Babyeater species wanted to eat human children, that would make it right for them to do it?”

“No,” said the Lord Pilot.

“Then what difference does it make?” said the Lady Sensory.

“No difference at all,” said the Lord Pilot.

Akon looked back and forth between the two of them, and saw what was coming, and somehow couldn’t speak.

“We have to save them,” said the Lady Sensory. “We have to stop this. No matter what it takes. We can’t let this go on.”

Couldn’t say that one word—

The Lord Pilot nodded. “Destroy their ship. Preserve our advantage of surprise. Go back, tell the world, create an overwhelming human army… and pour into the Babyeater starline network. And rescue the children.”

“No,” Akon said.

No?

“I know,” said the Lord Pilot. “A lot of Babyeaters will die at first, but they’re killing ten times more children than their whole adult population, every year—”

“And then what?” said the Master of Fandom. “What happens when the children grow up?”

The Lord Pilot fell silent.

The Master of Fandom completed the question. “Are you going to wipe out their whole race, because their existence is too horrible to be allowed to go on? I read their stories, and I didn’t understand them, but—” The Master of Fandom swallowed. “They’re not… evil. Don’t you understand? They’re not. Are you going to punish me, because I don’t want to punish them?”

“We could...” said the Lord Pilot. “Um. We could modify their genes so that they only gave birth to a single child at a time.”

“No,” said the Xenopsychologist. “They would grow up loathing themselves for being unable to eat babies. Horrors in their own eyes. It would be kinder just to kill them.”

“Stop,” said Akon. His voice wasn’t strong, wasn’t loud, but everyone in the room looked at him. “Stop. We are not going to fire on their ship.”

“Why not?” said the Lord Pilot. “They—”

“They haven’t raised shields,” said Akon.

“Because they know it won’t make a difference!” shouted the Pilot.

“They didn’t fire on us!” shouted Akon. Then he stopped, lowered his voice. “They didn’t fire on us. Even after they knew that we didn’t eat babies. I am not going to fire on them. I refuse to do it.”

“You think they’re innocent?” demanded the Lady Sensory. “What if it was human children that were being eaten?”

Akon stared out a viewscreen, showing in subdued fires a computer-generated graphic of the nova debris. He just felt exhausted, now. “I never understood the Prisoner’s Dilemma until this day. Do you cooperate when you really do want the highest payoff? When it doesn’t even seem fair for both of you to cooperate? When it seems right to defect even if the other player doesn’t? That’s the payoff matrix of the true Prisoner’s Dilemma. But all the rest of the logic—everything about what happens if you both think that way, and both defect—is the same. Do we want to live in a universe of cooperation or defection?”

“But—” said the Lord Pilot.

“They know,” Akon said, “that they can’t wipe us out. And they can guess what we could do to them. Their choice isn’t to fire on us and try to invade afterward! Their choice is to fire on us and run from this star system, hoping that no other ships follow. It’s their whole species at stake, against just this one ship. And they still haven’t fired.”

“They won’t fire on us,” said the Xenopsychologist, “until they decide that we’ve defected from the norm. It would go against their sense of… honor, I could call it, but it’s much stronger than the human version—”

“No,” Akon said. “Not that much stronger.” He looked around, in the silence. “The Babyeater society has been at peace for centuries. So too with human society. Do you want to fire the opening shot that brings war back into the universe? Send us back to the darkness-before-dawn that we only know from reading history books, because the holos are too horrible to watch? Are you really going to press the button, knowing that?”

The Lord Pilot took a deep breath. “I will. You will not remain commander of the Impossible, my lord, if the greater conference votes no confidence against you. And they will, my lord, for the sake of the children.”

“What,” said the Master, “are you going to do with the children?”

“We, um, have to do something,” said the Ship’s Engineer, speaking up for the first time. “I’ve been, um, looking into what Babyeater science knows about their brain mechanisms. It’s really quite fascinating, they mix electrical and mechanical interactions, not the same way our own brain pumps ions, but—”

“Get to the point,” said Akon. “Immediately.”

“The children don’t die right away,” said the Engineer. “The brain is this nugget of hard crystal, that’s really resistant to, um, the digestive mechanisms, much more so than the rest of the body. So the child’s brain is in, um, probably quite a lot of pain, since the whole body has been amputated, and in a state of sensory deprivation, and then the processing slowly gets degraded, and I think the whole process gets completed about a month after—”

The Lady Sensory threw up. A few seconds later, so did the Xenopsychologist and the Master.

“If human society permits this to go on,” said the Lord Pilot, his voice very soft, “I will resign from human society, and I will have friends, and we will visit the Babyeater starline network with an army. You’ll have to kill me to stop me.”

“And me,” said the Lady Sensory through tears.

Akon rose from his chair, and leaned forward; a dominating move that he had learned in classrooms, very long ago when he was first studying to be an Administrator. But most in humanity’s promotion-conscious society would not risk direct defiance of an Administrator. In a hundred years he’d never had his authority really tested, until now… “I will not permit you to fire on the alien ship. Humanity will not be first to defect in the Prisoner’s Dilemma.”

The Lord Pilot stood up, and Akon realized, with a sudden jolt, that the Pilot was four inches taller; the thought had never occurred to him before. The Pilot didn’t lean forward, not knowing the trick, or not caring. The Pilot’s eyes were narrow, surrounding facial muscles tensed and tight.

“Get out of my way,” said the Lord Pilot.

Akon opened his mouth, but no words came out.

“It is time,” said the Lord Pilot, “to see this calamity to its end.” Spoken in Archaic English: the words uttered by Thomas Clarkson in 1785, at the beginning of the end of slavery. “I have set my will against this disaster; I will break it, or it will break me.” Ira Howard in 2014. “I will not share my universe with this shadow,” and that was the Lord Pilot, in an anger hotter than the nova’s ashes. “Help me if you will, or step aside if you lack decisiveness; but do not make yourself my obstacle, or I will burn you down, and any that stand with you—”

“HOLD.”

Every head in the room jerked toward the source of the voice. Akon had been an Administrator for a hundred years, and a Lord Administrator for twenty. He had studied all the classic texts, and watched holos of famous crisis situations; nearly all the accumulated knowledge of the Administrative Field was at his beck and call; and he’d never dreamed that a word could be spoken with such absolute force.

The Ship’s Confessor lowered his voice. “My Lord Pilot. I will not permit you to declare your crusade, when you have not said what you are crusading for. It is not enough to say that you do not like the way things are. You must say how you will change them, and to what. You must think all the way to your end. Will you wipe out the Babyeater race entirely? Keep their remnants under human rule forever, in despair under our law? You have not even faced your hard choices, only congratulated yourself on demanding that something be done. I judge that a violation of sanity, my lord.”

The Lord Pilot stood rigid. “What—” his voice broke. “What do you suggest we do?”

“Sit down,” said the Ship’s Confessor, “keep thinking. My Lord Pilot, my Lady Sensory, you are premature. It is too early for humanity to divide over this issue, when we have known about it for less than twenty-four hours. Some rules do not change, whether it is money at stake, or the fate of an intelligent species. We should only, at this stage, be discussing the issue in all its aspects, as thoroughly as possible; we should not even be placing solutions on the table, as yet, to polarize us into camps. You know that, my lords, my ladies, and it does not change.”

The featureless blur concealed within the Confessor’s Hood turned to face the Master, and spoke; and those present thought they heard a grim smile, in that voice. “Oh,” said the Confessor, “that would be interfering in politics. I am charged with guarding sanity, not morality. If you want to stay together, do not split. If you want peace, do not start wars. If you want to avoid genocide, do not wipe out an alien species. But if these are not your highest values, then you may well end up sacrificing them. What you are willing to trade off, may end up traded away—be you warned! But if that is acceptable to you, then so be it. The Order of Silent Confessors exists in the hope that, so long as humanity is sane, it can make choices in accordance with its true desires. Thus there is our Order dedicated only to that, and sworn not to interfere in politics. So you will spend more time discussing this scenario, my lords, my ladies, and only then generate solutions. And then… you will decide.”

“Excuse me,” said the Lady Sensory. The Lord Pilot made to speak, and Sensory raised her voice. “Excuse me, my lords. The alien ship has just sent us a new transmission. Two megabytes of text.”

“Translate and publish,” ordered Akon.

They all glanced down and aside, waiting for the file to come up.

It began:

THE UTTERMOST ABYSS OF JUSTIFICATION
A HYMN OF LOGIC
PURE LIKE STONES AND SACRIFICE
FOR STRUGGLES OF THE YOUNG SLIDING DOWN YOUR THROAT—

Akon looked away, wincing. He hadn’t tried to read much of the alien corpus, and hadn’t gotten the knack of reading the “translations” by that damned program.

Then the Xenopsychologist made a muffled noise that could have been a bark of incredulity, or just a sad laugh. “Stars beyond,” said the Xenopsychologist, “they’re trying to persuade us to eat our own children.”

“Using,” said the Lord Programmer, “what they assert to be arguments from universal principles, rather than appeals to mere instincts that might differ from star to star.”

“Such as what, exactly?” said the Ship’s Confessor.

Akon gave the Confessor an odd look, then quickly glanced away, lest the Confessor catch him at it. No, the Confessor couldn’t be carefully maintaining an open mind about that. It was just curiosity over what particular failures of reasoning the aliens might exhibit.

“Let me search,” said the Lord Programmer. He was silent for a time. “Ah, here’s an example. They point out that by producing many offspring, and winnowing among them, they apply greater selection pressures to their children than we do. So if we started producing hundreds of babies per couple and then eating almost all of them—I do emphasize that this is their suggestion, not mine—evolution would proceed faster for us, and we would survive longer in the universe. Evolution and survival are universals, so the argument should convince anyone.” He gave a sad chuckle. “Anyone here feel convinced?”

“Out of curiosity,” said the Lord Pilot, “have they ever tried to produce even more babies—say, thousands instead of hundreds—so they could speed up their evolution even more?”

“It ought to be easily within their current capabilities of bioengineering,” said the Xenopsychologist, “and yet they haven’t done it. Still, I don’t think we should make the suggestion.”

“Agreed,” said Akon.

“But humanity uses gamete selection,” said the Lady Sensory. “We aren’t evolving any slower. If anything, choosing among millions of sperm and hundreds of eggs gives us much stronger selection pressures.”

The Xenopsychologist furrowed her brow. “I’m not sure we sent them that information in so many words… or they may have just not gotten that far into what we sent them...”

“Um, it wouldn’t be trivial for them to understand,” said the Ship’s Engineer. “They don’t have separate DNA and proteins, just crystal patterns tiling themselves. The two parents intertwine and stay that way for, um, days, nucleating portions of supercooled liquid from their own bodies to construct the babies. The whole, um, baby, is constructed together by both parents. They don’t have separate gametes they could select on.”

“But,” said the Lady Sensory, “couldn’t we maybe convince them, to work out some equivalent of gamete selection and try that instead—”

“My lady,” said the Xenopsychologist. Her voice, now, was somewhat exasperated. “They aren’t really doing this for the sake of evolution. They were eating babies millions of years before they knew what evolution was.”

“Huh, this is interesting,” said the Lord Programmer. “There’s another section here where they construct their arguments using appeals to historical human authorities.”

Akon raised his eyebrows. “And who, exactly, do they quote in support?”

“Hold on,” said the Lord Programmer. “This has been run through the translator twice, English to Babyeater to English, so I need to write a program to retrieve the original text...” He was silent a few moments. “I see. The argument starts by pointing out how eating your children is proof of sacrifice and loyalty to the tribe, then they quote human authorities on the virtue of sacrifice and loyalty. And ancient environmentalist arguments about population control, plus… oh, dear. I don’t think they’ve realized that Adolf Hitler is a bad guy.”

“They wouldn’t,” said the Xenopsychologist. “Humans put Hitler in charge of a country, so we must have considered him a preeminent legalist of his age. And it wouldn’t occur to the Babyeaters that Adolf Hitler might be regarded by humans as a bad guy just because he turned segments of his society into lampshades—they have a custom against that nowadays, but they don’t really see it as evil. If Hitler thought that gays had defected against the norm, and tried to exterminate them, that looks to a Babyeater like an honest mistake—” The Xenopsychologist looked around the table. “All right, I’ll stop there. But the Babyeaters don’t look back on their history and see obvious villains in positions of power—certainly not after the dawn of science. Any politician who got to the point of being labeled ‘bad’ would be killed and eaten. The Babyeaters don’t seem to have had humanity’s coordination problems. Or they’re just more rational voters. Take your pick.”

Akon was resting his head in his hands. “You know,” Akon said, “I thought about composing a message like this to the Babyeaters. It was a stupid thought, but I kept turning it over in my mind. Trying to think about how I might persuade them that eating babies was… not a good thing.”

The Xenopsychologist grimaced. “The aliens seem to be even more given to rationalization than we are—which is maybe why their society isn’t so rigid as to actually fall apart—but I don’t think you could twist them far enough around to believe that eating babies was not a babyeating thing.”

“And by the same token,” Akon said, “I don’t think they’re particularly likely to persuade us that eating babies is good.” He sighed. “Should we just mark the message as spam?”

“One of us should read it, at least,” said the Ship’s Confessor. “They composed their argument honestly and in all good will. Humanity also has epistemic standards of honor to uphold.”

“Yes,” said the Master. “I don’t quite understand the Babyeater standards of literature, my lord, but I can tell that this text conforms to their style of… not exactly poetry, but… they tried to make it aesthetic as well as persuasive.” The Master’s eyes flickered, back and forth. “I think they even made some parts constant in the total number of light pulses per argumentative unit, like human prosody, hoping that our translator would turn it into a human poem. And… as near as I can judge such things, this took a lot of effort. I wouldn’t be surprised to find that everyone on that ship was staying up all night working on it.”

“Babyeaters don’t sleep,” said the Engineer sotto voce.

“Anyway,” said the Master. “If we don’t fire on the alien ship—I mean, if this work is ever carried back to the Babyeater civilization—I suspect the aliens will consider this one of their great historical works of literature, like Hamlet or Fate/stay night—”

The Lady Sensory cleared her throat. She was pale, and trembling.

With a sudden black premonition of doom like a training session in Unrestrained Pessimism, Akon guessed what she would say.

The Lady Sensory said, in an unsteady voice, “My lords, a third ship has jumped into this system. Not Babyeater, not human.”

So I was like “Here’s my dystopian story of a world where a poorly programmed AI separates men and women onto different planets” and people were like “That’s not a dystopia, I would totally live there” and I was like “You’re just being contrarian” and they were like “No we’re not” so then I was like “Okay here’s my story about aliens who eat children” and they were like “We’re cool with that” and I was like “...”

Options: I’m sure you’re perfectly aware of many resources that could answer your question. Don’t be passive-aggressive and pretend you simply want a question to be answered, when it is clear that you are just trying to be rude.

For what it’s worth, I find plenty to disagree with Eleazar about, on points of both style and substance, but on death I think he has it exactly right. Death is a really bad thing, and while humans have diverse psychological adaptations for dealing with death, it seems the burden of proof is on people who do NOT want to make the really bad thing go away in the most expedient way possible.

I don’t think the Pilot is really taking the time to think through all the logical consequences of what he’s saying.

Indeed, even if he wants to make war, the logical next step would still be to keep talking to the aliens and learning as much as possible about them. Then maybe trying to capture or infiltrate their ship. Or asking for escort to their system and returning with strategic knowledge about that. Preparing a surprise attack. Things like that.

Methinks Elie is making it too easy on his human characters. I actually don’t feel much emotional angst over the babyeaters because they are so alien. After all, plenty of species on Earth practice cannibalism, yet we don’t go on a crusade to exterminate them.

No, what he really needs is an alien race that consists of cuddly mammals, or perhaps an offshoot of humanity that evolved this practice of spawning and culling.

Most people wouldn’t feel horror over crystalline entities eating their young, but they would go apeshit over human beings doing the same.

I remain puzzled over why they’re trying to decide on the correct course of action themselves.

Wouldn’t the only reasonable decision in this case be to return to the rest of humanity, and let the actual government decide whether or not to go to war with an entirely new species? Sure, they’d lose the advantage of surprise, and it may be a really long way back home, but it still bothered me that not a single crew member even raised the possibility. If I was on the crew, pointing out that we have no right to make such a decision on our own would be the first thing I’d do.

Kevin: I don’t think Eliezer meant to seriously suggest FSN is as good as Hamlet, but rather to continue his theme of ‘strange future’ (and maybe as part of a background viewpoint that ‘one period’s high culture is a former period’s low pop culture’ - which is true of Shakespeare BTW).

That said, I’ve always felt based on the animes that Tsukihime was Type-Moon’s best work, and not FSN.

Simon: Also, it seems a little unlikely that a third ship would arrive given that the arrival of even one alien ship was considered so surprising in the first installment.

There are lots of starlines leading out from each system. They’re somewhat expensive to open initially, then stay open. The nova acted as a rendezvous signal, causing all starlines leading to that star to fluctuate. Humans and aliens had never before explored the same world, but in this case, three different alien species had explored a world with a starline to the nova system. Without the nova, they never would have found one another.

Chris Yeh: Most people wouldn’t feel horror over crystalline entities eating their young, but they would go apeshit over human beings doing the same.

Suppose I put your identical mind (including all memories, unchanged) into a crystalline body. Would you stop empathizing with yourself? How much do I have to change a human child’s body (leaving the brain the same) before you would stop caring if they got eaten? How about a child severely disfigured by burns—do you stop empathizing with them once they no longer have a human-looking face and skin?

Kaj Sotala: Wouldn’t the only reasonable decision in this case be to return to the rest of humanity, and let the actual government decide whether or not to go to war with an entirely new species? Sure, they’d lose the advantage of surprise

They’re not going to duck out on the responsibility if that means already making the decision, e.g., losing the advantage of surprise. They have to decide now whether to fire on the Babyeater ship.

Armak: No cannibalism takes place, but the same amount of death and suffering is present as in Eliezer’s scenario. Should we be less or more revolted at this?

Indeed, even if he wants to make war, the logical next step would still be to keep talking to the aliens and learning as much as possible about them.

The Babyeaters at least seem to have dumped their local Net, which removes some of that incentive, and the course of action you suggest is not without risk.

tim: How could they not be able to distinguish between the concept of good and the concept of baby eating if they understand that survival is good

What good is life without eating babies? How can you not understand that tribal loyalty is good?

Larry D’Anna: Did Ira Howard actually say that? In which story?

He didn’t.

Furcas: It looks like their terminal value, instead of being “eating babies”, is actually something like “eating babies in the way that our ancestors have always eaten babies”. In other words, they put more value on upholding the tradition of baby eating than on baby eating as such.

Clearly you don’t value sex with your lover, since you’re not having sex with him/her every minute of every day; you put more value on upholding the tradition of sex, rather than sex as such.

Brilliant, Eliezer. I love the concept of the Order of Silent Confessors. It makes the distinction between terminal values and the conduct that one should adopt to uphold those values crystal clear. That said, the thought of an organization of people who are willing to, um, purge themselves of all their terminal values except one (to help humans fulfill their fundamental desires, whatever they may be) is a bit hard to believe.

“Out of curiosity,” said the Lord Pilot, “have they ever tried to produce even more babies—say, thousands instead of hundreds—so they could speed up their evolution even more?”

“It ought to be easily within their current capabilities of bioengineering,” said the Xenopsychologist, “and yet they haven’t done it.”

Isn’t this evidence that baby eating is not, in fact, one of the Babyeaters’ terminal values? If it really was they would do everything to increase the amount of babies they eat. It looks like their terminal value, instead of being “eating babies”, is actually something like “eating babies in the way that our ancestors have always eaten babies”. In other words, they put more value on upholding the tradition of baby eating than on baby eating as such.

This is extremely belated, but I know several people who would be willing to eliminate the vast majority of their values in this fashion, at least if they believed that they were truly helping humanity.

To those wondering why the crew doesn’t report back: isn’t it even more implausible that two alien civilizations exchange petabytes of information about themselves, translate each other’s languages, and start a philosophical discussion within hours of contacting each other, as opposed to, oh I don’t know, almost anything else? This is a stage for some archetypes to discuss some ethical point, and you can assume that they will think of everything that Eliezer wants to cover (or else appropriate actors will appear at the right time). That’s all that matters.

Implausibilities abound, but whether or not they matter depends on one’s assumptions about the goals of the story. Given that I’m not yet sure what Eliezer’s goals are, I come up with my own. (Actually, I would do this anyway). I can’t help but wonder what it would be like to be a Babyeater, and I ask all sorts of questions that may be completely irrelevant to Eliezer’s purpose. For example: How could they not be able to distinguish between the concept of good and the concept of baby eating if they understand that survival is good and that they survived before agriculture and before they started eating babies? When are babies taught that baby eating is synonymous with good? Do babies realize what will happen to them? We know that they do not want to be eaten, so do any of them try to make reasoned arguments about why eating them is unnecessary? Have any of them tried to organize the babies in revolt? I understand that they are young and so presumably not very capable, but still. How are the survivors selected? Does the evolutionary process the Babyeaters like so much optimize for Babyeaters who run fast, or are promising babies selected early and separated from the rest of the group? If the latter, what are the criteria? How do the dynamics of Babyeater society change with their ability to detect cheaters? What are their reasons for thinking that they are good at it? Etc. etc. Are any of these questions relevant? It depends.

The point is not that Babyeaters are implausible, it’s that Babyeaters are fascinating enough to think about in some detail, that doing so will eventually raise questions of plausibility, and that such questions are likely to be unconstrained by their relevance to the author’s point if 1) it’s not yet obvious what the author’s point is, and/or 2) the author’s point is, or becomes, relatively less interesting than questions about Babyeater society and evolution. After all, p > 0 where p is the probability that Eliezer’s real goal is to make OvercomingBias the number one search result for “baby eating.” (The previous sentence is a joke.)

Sentience DOES make a difference. You don’t frown on your cat for hunting mice, but on your dog for doing it with children.

That’s at least partly due to speciesism. How many people have gone on crusades to stop leopards from eating chimpanzees? For that matter, how many people devote their lives to stopping other humans from eating chimpanzees?

As for cannibalism, it seems to me that its role in Eliezer’s story is to trigger a purely illogical revulsion in the humans who anthropomorphise the aliens.

Imagine two completely different alien species living in one (technological) society, where each eats and “winnows” the other’s children. This is the natural, evolved behavior of both species, just as big cats eat apes and (human) apes eat antelopes.

No cannibalism takes place, but the same amount of death and suffering is present as in Eliezer’s scenario. Should we be less or more revolted at this? Which scenario has the greater moral weight? Should we say the two-species configuration is morally superior because they’ve developed a peaceful, stable society with two intelligent species coexisting instead of warring and hunting each other?

After the last discussion of cooperation as meta-commitment, I found another intuition for cooperation in the prisoner’s dilemma: trade. If you are more efficient at doing things good for the other side than they are, and the other side is more efficient at doing things good for you than you are, then both of you should do the things the other side likes, even if you don’t. For example, if it turns out that humanity is sufficiently better positioned for making paperclips than the paperclip maximizer, and the paperclip maximizer is better positioned to create a Friendly future than we are, then cooperation on our side can consist in us killing off ourselves and starting to make paperclips, and the paperclipper should stop making paperclips and become Friendly. Essentially, you are physically switching places with an enemy, and as a result you are both better off. Your mind starts to host the enemy’s mind with the enemy’s morality, and the enemy’s mind starts to host your mind with your morality. This switching can be temporary as well as permanent, and sides in this trade can be located anywhere in time or space. The only prerequisite is that you both know what the other side wants, and how things are expected to turn out given your actions, even if you are unable to communicate, ever.

James Andrix:
babyeaters evolved to not view their babies as something deserving of sympathy, and eating them is the most primal way they show social trustworthiness.

Oh, they did. That was pretty clearly stated in the first installment—they love their babies, and have great sympathy for them. That’s what makes the inevitable winnowing so tragic, and that’s why overcoming it—despite everything—is so heroic. If the winnowing wasn’t any big deal, then obviously it wouldn’t be in the center of their ethical system.

(Though I would imagine that they might care somewhat less for their children than humans—considering how eagerly they have developed to punish cheaters, one’d think there to be a considerable selection pressure operating in favor of those who didn’t have problems with slaughtering their offspring. Of course, being entirely sociopathic towards the kids would reduce the chances that even one of them would live on to reproduce.)

I’m sorry for being a slowpoke, but this text contains the phrase
“But humanity uses gamete selection,” said the Lady Sensory. “We aren’t evolving any slower. If anything, choosing among millions of sperm and hundreds of eggs gives us much stronger selection pressures”
Maybe I don’t understand something, but in my view this phrase is biologically incorrect. The phenotype of a spermatozoon is usually determined by the father’s diploid DNA (if we don’t examine such things as meiotic drive genes, etc.), so any competition between one’s spermatozoa is a competition between the same genes, which can’t create any selection pressure. Also, even if such pressure existed, it could only lead to the evolution of spermatozoon structure and could not help adaptation to the external environment.
I also apologize for my English.

I am somewhat more disturbed by the suffering of the eaten babies than by the baby-eating itself. I don’t like the baby-eating but I could tolerate it by chalking it up to Bizarre Alien Biology or whatever, but it should be possible to euthanize the babies before they are eaten, or whatever. Basically, I hate pain more than I hate death.

Consider the typical human reaction to the treatment of food animals in factory farms...

Aye, not necessarily. But perhaps the gesture of good will might be large enough to get the babyeaters to, say, take a medicine which melts the brains of their children right after they’re eaten. They might be against such a medicine, but since they didn’t evolve knowing that their babies were being slow-tortured for a month, they might not have desires against the medicine stronger than the desires in favor of having ten times as many kids. (And because the humans have tech. superiority, they could actually enforce the deal if that’s necessary.)

It’s a tricky ethical question knowing whether the humans are better off with that deal. And it’s a tricky question of baby-crunch-crunch whether the baby-eaters are more-baby-eaten with that deal. But maybe there are better deals than the one I was able to think of in ten minutes.

Eliezer wrote:
“Clearly you don’t value sex with your lover, since you’re not having sex with him/her every minute of every day; you put more value on upholding the tradition of sex, rather than sex as such.”

I value sex with my girlfriend, but I also value lots of other things with comparable or greater intensity; these other desires are therefore in competition for my time and attention with my desire to have sex; as a result, I spend some time having sex, and some time reading posts on Overcoming Bias. I don’t spend all my time having sex because I don’t value sex that much.

However, if I thought that having sex was the best and most moral thing I could possibly do with my time, I would do everything in my power to have as much sex as possible. If I spent any time working, it would be with the goal of earning enough money to pay for food, lodging, etc, in order to be able to have sex later.

And if someone offered to alter my genes to, say, do away with my refractory period, so that I could screw my girlfriend nonstop, I’d agree in a nanosecond.

I guess the ship’s “council” making the decisions helps the argument that Eliezer is making, and can be waived simply because you could have a lengthier story where they traveled back to Earth and then the Earth Government had exactly the same debate. But that’s nitpicky, how would it help the story or the argument behind it? IMHO it’s good the way it is.

Some of the council’s members having these extreme reactions of empathy seems a bit alien to us even, but that is our own bias. We ignore suffering in so-called 3rd world countries every day. “Aliens eating each other? Please! Live and let die.” Who’s right on this one, us or them?

The council members live in a world where humans (the only sentients they know) don’t fight other humans, and that’s it. Suffering is considered the most immoral thing by their standards. I think it’s entirely in-character and reasonable that they feel compelled to stop it.

Gamete selection has quite a few problems. It only operates on half the genome at a time—and selection is performed before many of the genes can be expressed. Of course, gamete selection is cheap.

What spiders do—i.e. produce lots of offspring, and have many die as infants—has a huge number of evolutionary benefits. The lost babies do not cost very much, and the value of the selection that acts on them is great.

Human beings can’t easily get there—since they currently rely on gestation inside a human female body for nine months, but—make no mistake—if we could produce lots of young, and kill most of them at a young age, then that would be a vastly superior system in terms of the quantity and quality of the resulting selection.

Human females do abort quite a few foetuses after a month or so—ones that fail internal and maternal integrity tests—but the whole system is obviously appallingly inefficient.

I don’t know what political setup the humans have, but it probably doesn’t extend to Akon and his crew choosing war for the whole human species. Wouldn’t the wise thing to do be to report back, especially considering they have some very important news?

I’m already imagining that the third ship’s sensor capabilities are advanced enough to see aliens, but they really don’t like getting blown up. Upon seeing these two other aliens reach first contact and not blow each other up, it would be more reasonable to risk showing up.

Akon claims this is a “true” prisoner’s dilemma situation, and then tries to add more values to one side of the scale. If he adds enough values to make cooperation higher value than defecting, then he was wrong to say it was a true prisoner’s dilemma. But the story has made it clear that the aliens appear to be not smart enough to accurately anticipate human behaviour (or vice versa for that matter), so this is not a situation where it is rational to cooperate in a true prisoner’s dilemma. If it really is a true prisoner’s dilemma, they should just defect.
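For reference, since the argument turns on it: the standard Prisoner’s Dilemma payoff structure, with illustrative numbers only (the row player’s payoff listed first), looks like this:

                Cooperate   Defect
    Cooperate    (3, 3)     (0, 5)
    Defect       (5, 0)     (1, 1)

Defecting is the dominant move for each player considered separately, yet mutual cooperation (3, 3) beats mutual defection (1, 1). A “true” Prisoner’s Dilemma, in Akon’s sense, is one where these payoffs already incorporate everything each side cares about, so the dilemma cannot be dissolved by adding further values to one side of the scale.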

Of course, there may be a more humane approach than extermination or requiring them to live under human law: forcible modification to remove the desire to eat babies, and reduce the amount of reproduction. It might be a little tricky to do this without completely messing up the aliens’ psychology.

Also, it seems a little unlikely that a third ship would arrive given that the arrival of even one alien ship was considered so surprising in the first installment.

I wonder about the psychological mechanisms and intuitions at work in the Babyeaters. After all, human babies don’t look like Babyeater babies, they’re less intelligent, etc. Their intellectual extension of strong intuitions to exotic cases might well be much more flexible than their applications to situations from the EEA, e.g. satisfying them by drinking cocktails containing millions of blastocysts. Similarly, human intuitions start to go haywire in exotic sci-fi thought experiments and strange modern situations.

Hmm… does some instance of utility get multiplied by the number of people who find it utilitous? Like, if there are twice as many humans, does that mean that one Babyeater baby eaten subtracts twice as much from group utility?

Aye, not necessarily. But perhaps the gesture of good will might be large enough to get the babyeaters to, say, take a medicine which melts the brains of their children right after they’re eaten. They might be against such a medicine, but since they didn’t evolve knowing that their babies were being slow-tortured for a month, they might not have desires against the medicine stronger than the desires in favor of having ten times as many kids. (And because the humans have tech. superiority, they could actually enforce the deal if that’s necessary.)

It’s a tricky ethical question knowing whether the humans are better off with that deal. And it’s a tricky question of baby-crunch-crunch whether the baby-eaters are more-baby-eaten with that deal. But maybe there are better deals than the one I was able to think of in ten minutes.

@Marcello:
I assumed you agree that increasing the babyeating problem tenfold isn’t something you’d expect to be reciprocated, not without knowing something they presently don’t, and so the issue actually should be dismissed on that ground for the time being. It seems that you didn’t start from this premise. Where you expect to profit—sure, it’s normal trade at that point.

The trick with cooperating in the prisoner’s dilemma is primarily in the decision-theoretic setting, where you’ve only got one decision that’s estimated over everything. The thesis is that cooperation is not what you get as an instrumental strategy from the structure of a game, it’s what you start from as a terminal choice (and can lose in the structure of the game). It doesn’t translate well to bounded rationality, sometimes you have to do what looks like defecting because you don’t know the consequences.

For example, the cooperation result should extend to a setting where one player observes the decision of the other player. Should I cooperate, knowing that the other player will observe my decision before making his? It looks like I shouldn’t, unless I have a way of knowing that he cooperates, just expecting him to do that in order to be in the position to receive my cooperation doesn’t work (unless he really makes a commitment/changes his utility, and presents evidence). But if I have the predictive power of Omega, sure, cooperation as the right decision in that setting is what I’d expect.

tim: How could they not be able to distinguish between the concept of good and the concept of baby eating if they understand that survival is good

What good is life without eating babies? How can you not understand that tribal loyalty is good?

I’m afraid I don’t understand. By “not a babyeating thing”, do you mean that the Babyeaters (unlike us) use the same term/concept for the ‘separable essence of goodness’ as they do for one particular terminal value? Or do you just mean that convincing them not to eat babies would be analogous, and analogously hard, to convincing humans to eat babies?

Vladimir says:
“Every decision to give a gift on your side corresponds to a decision to abstain from accepting your gift on the other side. Thus, decisions to give must be made on a case-by-case basis; cooperation in true prisoner’s dilemma doesn’t mean unconditional charity.”

Agreed. Obviously (for example) the human ship shouldn’t self-destruct. But I wasn’t talking about all gifts, I was talking about the specific class of gifts called “helpful advice.” And I did specify: “provided that, on the whole, situations in which helpful advice is given freely are better.”

I was comparing the two strategies “Don’t give away any helpful advice of the level the other party is likely to be able to reciprocate” and “give away all helpful advice of the level the other party is likely to be able to reciprocate” and pointing out that maybe they form another prisoner’s dilemma. Of course, there may be more fine-grained strategies that work even better, strategies that actually take into account the relative amount of good and bad each piece of advice brings to the two parties. But remember that you must also consider how your strategy is going to be chronophoned over to the baby eaters. If we make the first gift, what exchange rate of baby-eater utilons for human utilons do we tolerate? (If the gifts are made of information, it may be impossible for trades to be authenticated without the possibility of the other party taking the gift and using it (though of course it might be that the equilibrium has an honor system....)) It looks like it gets really complicated. Worth thinking about? Yes, but right now I’m busy.

@Marcello:
Every decision to give a gift on your side corresponds to a decision to abstain from accepting your gift on the other side. Thus, decisions to give must be made on a case-by-case basis; cooperation in true prisoner’s dilemma doesn’t mean unconditional charity.

James Andrix: I don’t claim that the aliens would prefer modification over death, only that it is more consonant with my conception of human values to modify them than exterminate them, notwithstanding that the aliens may prefer the latter.

“Out of curiosity,” said the Lord Pilot, “have they ever tried to produce even more babies—say, thousands instead of hundreds—so they could speed up their evolution even more?”

“It ought to be easily within their current capabilities of bioengineering,” said the Xenopsychologist, “and yet they haven’t done it. Still, I don’t think we should make the suggestion.”

“Agreed,” said Akon.

That’s not the least bit obvious. Do we really want the Babyeaters to hold back corresponding suggestions that might make our society better from our perspective and worse from theirs?

If, in this situation, we ought to bite the prisoner’s-dilemma bullet to the degree of not invading the Babyeater planet because peaceful situations are, on average, better than war-torn situations, doesn’t the same argument mean that we shouldn’t hold back helpful advice, provided that, on the whole, situations in which helpful advice is given freely are better?

Now maybe it’s the case that if we swapped that particular kind of helpful advice with the baby eaters, the degree to which the Babyeater planet got worse by our standards is more than the degree to which our planet would get better by our standards, and vice versa. But in that case it would be better for both sides to draw up a treaty....

I asked about this in yesterday’s comments thread, but I guess everyone’s moved here since then :-)

My intuition is that selection pressure on young aliens (to do anything it takes not to get eaten) would be stronger than most selection pressure adults experience (most adults produce hundreds of offspring ⇔ only one offspring out of several hundred survives; and in a technological society most if not all adults live to reproduce).

We should see children evolving to escape being eaten. If running faster doesn’t work, then by hurting other children to make them run slower. Or by children eating one another themselves. Or by a social organization that lets a few bullies/rulers/… send other children to be eaten in their stead. Or by evolving to be poisonous or at least tasting really bad and having orange-black striping to warn your parents :-)

Also, the period of time from birth to the beginning of the (post-winnowing) growth spurt would be compressed to the utter minimum required by their physiology. (The faster you grow up, the smaller the window of danger of being eaten.) On that basis, the pre-winnowing children may not have much time to philosophize about being eaten.

Eventually what you get is a creature that’s born sentient, manages to learn about the day/night cycle (plus whatever inborn “instinct” provides), and then the winnowing takes place at the age of 2 days before the growth spurt can begin. A very stylized kind of thing.

Also, if some people care so much about this crusade they’re willing to go against the rest of human society and risk a huge war, then logically they ought to have mounted a huge operation long ago to sweep the galaxy looking for morally unsuitable aliens. Killing or forcefully transforming any alien species that 1) they judge to be sufficiently intelligent and 2) whose behavior doesn’t conform to human morals.

Or they might realize there’s no real upper bound on the amount of suffering that might potentially be taking place somewhere out of sight. Especially if you give more weight to the suffering or death of more intelligent individuals. In which case they might want to make an alliance with the Baby Eaters to search the galaxy for cultures so alien that they would be abominations to both species. And only exterminate the Baby Eaters once the galaxy has been swept clean.

Put like that, it seems to me to be a really bad idea. But isn’t that what follows from the Pilot’s argument? If stopping the Baby Eating is so important, they’re willing to risk the extermination of humanity for it. (And there’s no way they could be sure of the Baby Eaters’ potential in a species-wide war just from reading one badly translated and possibly censored alien library for a day. So they’re proposing going to war where they can’t be sure of victory.)

It was said that the aliens would not accept gamete reproduction because they ate children before they knew what evolution was. They did it because it increased the number of surviving children from each pen/couple. If they are still culling to increase the success rate, then a new method that increased their success rate would be preferable. In the story, gametes are better for evolution than culling, so gametes would produce more surviving offspring and would thus be preferable to culling.
I said “in the story” gametes are better because I disagree with the author on which is better for evolution. Producing good gametes will allow one to get an offspring instead of rival mates, but the best gamete doesn’t necessarily make the best offspring. Gametes are selected by which one is the best swimmer and which has the best ‘sense of smell’. It could be coded to produce a child with extremely low intelligence or extremely poor muscle tone, or unable to get glycogen out of their cells (Pompe). If, out of the millions of sperm emitted with each ejaculation, the best tended to win, there would be much lower rates of genetic disease.
They said that the alien would expect that if Hitler were bad he wouldn’t have been made a leader or would have been killed. Most of the world disliked genocide, but the Nazis, being loyal, liked whatever Germany was doing, so they liked genocide. This group was wrong and is the reason Hitler was the leader. I’m not saying the author messed up due to this; it’s just something I noticed. A potential flaw was it was said if Hitler had been bad humans would have killed him. We did try to kill him and then tried the people who protected him for war crimes, although the Nazis being a group, it might have been seen as a ‘reasonable error’, esp. since most Nazis weren’t killed.

See, that’s why you don’t say “because” about evolution. They eat children because-as-cognitive-motivation they have a drive to eat children. They have a drive to eat children because-as-evolutionary-history it increased the number of surviving children.

Eliezer, why do you hate death so much? I understand why you’d hate it as much as the social norm wants you to say you do, but not so much more. People don’t hate death, and don’t even say they hate death nearly as much as you do. I can’t think of a simpler hypothesis than “Eliezer is a mutant”.

Now, of course, throwing in the long, painful agony of children changes something.

“Every human culture had expended vast amounts of intellectual effort on the problem of coming to terms with death. Most religions had constructed elaborate lies about it, making it out to be something other than it was – though a few were dishonest about life, instead. But even most secular philosophies were warped by the need to pretend that death was for the best. ¶ It was the naturalistic fallacy at its most extreme — and its most transparent, but that didn’t stop anyone. Since any child could tell you that death was meaningless, contingent, unjust, and abhorrent beyond words, it was a hallmark of sophistication to believe otherwise.” —Margit in ‘Border Guards’ by Greg Egan