Gamma Andromeda, where philosophical stoicism went too far. Its inhabitants, tired of the roller coaster ride of daily existence, decided to learn equanimity in the face of gain or misfortune, neither dreading disaster nor taking joy in success.
But that turned out to be really hard, so instead they just hacked it. Whenever something good happens, the Gammandromedans give themselves an electric shock proportional in strength to its goodness. Whenever something bad happens, the Gammandromedans take an opiate-like drug that directly stimulates the pleasure centers of their brain, in a dose proportional in strength to its badness.
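The Gammandromedan hack amounts to pairing every event with a stimulus of equal magnitude and opposite sign. As a toy sketch of that mechanism (every name and number here is invented for illustration, not taken from the story):

```python
# Toy model of the Gammandromedan hack: pair each event with an
# intervention of equal magnitude and opposite sign, so the net
# hedonic impact of any day sums to exactly zero.
def gammandromedan_day(events):
    """events: list of valences (positive = good news, negative = bad).
    Returns (net_valence, interventions), where each intervention is
    the shock or opiate dose that cancels the corresponding event."""
    interventions = [-v for v in events]  # shock for good, opiate for bad
    net = sum(e + i for e, i in zip(events, interventions))
    return net, interventions

net, doses = gammandromedan_day([5.0, -3.0, 1.5])
print(net)  # → 0.0
```

Which also makes the incentive problem vivid: the net valence of any day is zero regardless of what you do, so nothing is worth doing except out of virtue.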
As a result, every day on Gamma Andromeda is equally good compared to every other day, and its inhabitants need not be jostled about by fear or hope for the future.
This does sort of screw up their incentives to make good things happen, but luckily they’re all virtue ethicists.
Zyzzx Prime, inhabited by an alien race descended from a barnacle-like creature. Barnacles are famous for their two-stage life cycle: in the first, they are mobile and curious creatures, cleverly picking out the best spot to make their home. In the second, they root themselves to the spot and, having no further use for their brain, eat it.
This particular alien race has evolved far beyond that point and does not literally eat its brain. However, once an alien reaches sufficiently high social status, it releases a series of hormones that tell its brain, essentially, that it is now in a safe place and doesn’t have to waste so much energy on thought and creativity to get ahead. As a result, its mental acuity drops two or three standard deviations.
The Zyzzxians’ society is marked by a series of experiments with government – monarchy, democracy, dictatorship – only to discover that, whether chosen by succession, election, or ruthless conquest, its once-brilliant leaders lose their genius immediately upon accession and do a terrible job. Their government is thus a sequence of perpetual pointless revolutions.
At one point, a scientific effort was launched to discover the hormones responsible and whether it was possible to block them. Unfortunately, any scientist who showed promise soon lost their genius, and those promoted to be heads of research institutes became stumbling blocks who mismanaged funds and held back their less prestigious co-workers. Suggestions that the institutes eliminate tenure were vetoed by top officials, who said that “such a drastic step seems unnecessary”.
K’th’ranga V, which has been a global theocracy for thousands of years, ever since its dominant race invented agricultural civilization. This worked out pretty well for a while, until it reached an age of industrialization, globalization, and scientific discovery. Scientists began to uncover truths that contradicted the Sacred Scriptures, and the hectic pace of modern life made the shepherds-and-desert-traders setting of the holy stories look vaguely silly. Worse, the cold logic of capitalism and utilitarianism began to invade the Scriptures’ innocent Stone Age morality.
The priest-kings tried to turn back the tide of progress, but soon realized this was a losing game. Worse, in order to determine what to suppress, they themselves had to learn the dangerous information, and their mental purity was even more valuable than that of the populace at large.
So the priest-kings moved en masse to a big island, where they began living an old-timey Bronze Age lifestyle. And the world they ruled sent emissaries to the island, who interfaced with the priest-kings, and sought their guidance, and the priest-kings ruled a world they didn’t understand as best they could.
But it soon became clear that the system could not sustain itself indefinitely. For one thing, the priest-kings worried that discussion with the emissaries – who inevitably wanted to talk about strange things like budgets and interest rates and nuclear armaments – was contaminating their memetic purity. For another thing, they honestly couldn’t understand what the emissaries were talking about half the time.
Luckily, there was a whole chain of islands making an archipelago. So the priest-kings set up ten transitional societies – themselves in the Bronze Age, another in the Iron Age, another in the Classical Age, and so on to the mainland, who by this point were starting to experiment with nanotech. Mainland society brought its questions to the first island, who translated them into their own slightly-less-advanced understanding, who brought them to the second island, and so on to the priest-kings, by which point a discussion about global warming might sound like whether we should propitiate the Coal Spirit. The priest-kings would send their decisions to the second-to-last island, and so on back to the mainland.
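The island relay reads like a lossy channel. A minimal sketch of the "telephone" dynamic, assuming we model a message as concepts tagged with the minimum tech level needed to grasp them (all concepts and levels below are invented for illustration):

```python
def relay(message, levels):
    """Pass a message down a chain of societies. `message` maps each
    concept to the minimum tech level needed to understand it; each
    society drops every concept above its own level before passing
    the remainder along."""
    for level in levels:
        message = {c: req for c, req in message.items() if req <= level}
    return message

mainland_report = {
    "rising CO2 concentrations": 9,
    "carbon tax proposal": 7,
    "burning coal causes warming": 4,
    "the Coal Spirit is angry": 1,
}
# Ten societies from near-mainland (level 10) down to the
# Bronze Age priest-kings (level 1).
island_chain = list(range(10, 0, -1))
print(relay(mainland_report, island_chain))
# → {'the Coal Spirit is angry': 1}
```

This makes the worry concrete: anything that cannot be restated one level down is simply gone by the time it reaches the priest-kings, and their decision can only be as fine-grained as what survives the chain.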
Eventually the Kth’ built an AI which achieved superintelligence and set out to conquer the universe. But it was a well-programmed superintelligence coded with Kth’ values. Whenever it wanted a high-level decision made, it would talk to a slightly less powerful superintelligence, who would talk to a slightly less powerful superintelligence, who would talk to the mainlanders, who would talk to the first island…
Chan X-3, notable for a native species that evolved as fitness-maximizers, not adaptation-executors. Their explicit goal is to maximize the number of copies of their genes. But whatever genetic program they are executing doesn’t care whether the genes are within a living being capable of expressing them or not. The planet is covered with giant vats full of frozen DNA. There was originally some worry that the species would go extinct, since having children would consume resources that could instead be spent hiring geneticists to make millions of copies of your DNA and store them in freezers. Luckily, it was realized that children not only provide a useful way to continue the work of copying and storing (half of) your DNA long into the future, but will also work to guard your already-stored DNA against being destroyed. The species has thus continued undiminished, somehow, and their fondest hope is to colonize space and reach the frozen Kuiper Belt objects, where their DNA will naturally stay undegraded for all time.
New Capricorn, which contains a previously undiscovered human colony that has achieved a research breakthrough beyond their wildest hopes. A multi-century effort paid off in a fully general cure for death. However, the drug fails to stop aging. Although the Capricorni no longer need fear the grave, after age 100 or so even the hardiest of them get Alzheimer’s or other similar conditions. A hundred years after the breakthrough, more than half of the population is elderly and demented. Two hundred years after, more than 80% are. Capricorni nursing homes quickly became overcrowded and unpleasant, to the dismay of citizens expecting to spend eternity there.
So another research program was started, and the result was fully immersive, fully life-supporting virtual reality capsules. Stacked in huge warehouses by the millions, the elderly sit in their virtual worlds, vague sunny fields and old gabled houses where it is always the Good Old Days and their grandchildren are always visiting.
I don’t think I ever explicitly realized this before now, but it seems that tragedy multiplied by the population of a world automatically becomes horror.
How do you choose names?
Who needs more Earthfic though?
Yeah, when I read the Zyzzx section, I stopped and went back to the Gamma Andromeda section suspecting it to be a veiled reference to Earth as well. Couldn’t quite figure out how it was, though, and the later ones are even further off, so I don’t think this was the intended meaning for the whole post.
Earth is Zyzzx Zero.
New Capricorn sounds like an almost plausible prequel for The Matrix.
Chan X-3 bonus idea for Less Wrongers: their genetic material is encoded on paper clips.
Perhaps even more disturbing than New Capricorn is Macrobia, where scientists discovered the secret to eternal youth and millions unwittingly transformed themselves into toddlers. Only a handful of technophobes remain to care for the kids, and it is not unusual to babysit for thousands.
My hopeful side insists that the millions of toddlers are capable of developing the skills necessary to survive without constant supervision. My cynical side says I was unusually lucid by age 2, 24 years later still can’t manage, and really have no business being optimistic about an uncoordinated swarm of toddlers.
… *Hides 13-year-old notes on a village in which most everyone elects to stop aging by their fifth year, probably due primarily to peer-pressure since their education system consists of a small pile of books and what wisdom they could learn from the only two physical 9-year-olds who were the firstborn* … *And nothing ever went wrong for many years until they got invaded by a sentient plague*
(This is like the first time this ever had a reason to come up in conversation, go figure.)
Being four was cool. No work. Napping at will. A seemingly endless supply of Warner Bros. cartoons. But I’m not sure I could have fixed the video player myself.
Chan X-3 sounds like Niven’s Pak Protectors. New Capricorn has the “Near Death Star” from Futurama.
K’th’ranga V was flipping terrifying until they got a Friendly AI. Granted that they programmed it properly (ie: *do* allow for moral progress and *don’t* enshrine the decisions of current leaders for the rest of eternity), it would have figured out how to get the priests to give the right answers. (Yes, that’s right: we just found a scenario in which people are better off being manipulated by a superintelligent AI than not. This is a rare occasion.)
Zyzzx Prime is rather dull social satire.
Gamma Andromeda is more or less a full and proper refutation of Stoicism and virtue ethics, and I’m glad to have it in my arsenal whenever someone brings up those awful bits of bunk.
A planet of people failing at Stoicism and taking up drugs and electric shocks is a refutation of Stoicism?
New Capricorn seems like a narrow escape for http://en.wikipedia.org/wiki/Struldbrug.
Got there before me! I was going “Hey, the Capricornians sound just like Swift’s Struldbrugs”, which seems to mean less that we need new SF and more that we should all be re-reading 18th century SF.
I would laugh about the Chan X-3, were it not for the fact that a couple of days ago I read a burbling article by some woman about having her ova harvested and frozen in Las Vegas (now, apparently, a big centre for this whole ‘freezing ova, sperm and embryos’ business) in which she wambled on about how great it felt to have stored ova as well as embryos: maybe she’ll have a kid one day. Or even two! Or maybe no kids at all, she’ll adopt! Or no kids ever!
Which naturally made me ask “Then why the devil have your ova frozen if you’re not even sure you want children, you twit?”
I see I was under the misapprehension that she was Human, obviously she was one of the Chan X and their space programme has succeeded to the point where they’re now using the genetic repositories of other planets to store their own material.
“Then why the devil have your ova frozen if you’re not even sure you want children, you twit?”
This doesn’t seem entirely insane to me. If you don’t know what you want, and waiting until you’re sure will close off options by default, then it’s worth a certain amount of money and time to keep those options open.
People don’t stay fertile forever. Freezing sperm or ova seems less like throwing money down a hole and more like buying stock options.
This is starting to remind me of Robin Hanson’s provocative proposal which he posted on his blog yesterday.
To me, having ova or embryos frozen does indicate “I plan to have a child sometime, later on when I have a career established or achieve whatever goal it is that is preventing me having a child now. My fertility will probably be lowered or even finished, so it is prudent for me to take this step.”
On the other hand, having ova frozen because “Well, I dunno, it seemed like kinda a cool thing to do, I might want kids – or maybe a dog! Or I could adopt, or not. Or buy a car instead!” – I don’t get that.
Why sink money, time and effort into having your ova harvested (and this is not a trivial procedure) then frozen, then commit to keep them frozen for an undefined time which will cost further time, effort and money, and perhaps never use them in the end?
Being doubtful about your chances of maximum fertility when you are ready to reproduce is one thing; being all “Oooh look, shiny!” is another. I very probably am being unfair to the woman, based on what was meant to be a light-reading piece, but it is rather like going out to a restaurant with someone who keeps everyone else waiting as she tries to decide will she have the salad as a starter or a main course, or maybe chicken, chicken is lean meat and good for you, right? Although now she’s heard red meat is better, but the pasta sounds good too, though perhaps she should be cutting down on gluten – oh, how can she tell what she wants to eat? Or if she wants to eat at all, maybe it’d be better if we all went to a movie instead?
I’m not very tolerant of ambiguity when it comes to things like this: do you or don’t you want children? If you’re going to freeze your ova, you need to make your mind up one way or the other.
Why sink money, time and effort into having your ova harvested (and this is not a trivial procedure) then frozen, then commit to keep them frozen for an undefined time which will cost further time, effort and money, and perhaps never use them in the end?
If these technical and financial difficulties were solved, would you still have the same reaction?
As it is, the only way I see her affecting other people by keeping her options open, is if she would otherwise spend her money on the other people. She is not making others listen to her decision process. She is like a diner who, instead of making others wait, simply says quickly “Bring me the appetizer plate and a cold ham sandwich to go, so I can decide later.”
But if it’s a proper Kth’ AI, wouldn’t its definition of the right answer be “the answer transmitted back to me after I have sent the description of the problem up the chain with maximum clarity”?
But should you propitiate the Coal Spirit? That was never answered!
I’m rolling my eyes a bit at the “hectic pace of modern life” stuff, because really? So if your father died one hundred years ago, obviously it’s not as hurtful to you or as serious as we modern people would feel it now, because you only rode in horse-drawn carriages and we have jet planes! And in the same way, our descendants will feel a frenzy of grief our duller senses cannot apprehend, because they will fly around in rockets going millions of miles an hour?
We may do things faster. I remain to be convinced that things are hugely more complicated. “Do we drop a nuclear bomb and obliterate that city?” is not, in its essence, that much different from “Do we raze that city to the ground, plough the rubble, and sow it with salt?” in how we do or should treat our enemies, either from the standpoint of ethics or practical advantage.
I don’t think anything like that is implied. I don’t even get where this ‘increase in feeling’ is coming from, nothing about it in OP.
I think that stability is treated as vastly more important and the cracks are more visible. In the Bronze Age an army could run in, sack a city, and leave with slaves and loot if it became too poorly defended. Today, even a poorly defended city is still going to mean a huge fight where artillery smashes up the landscape and invading forces either get stuck on defenses coming in (if they don’t have air support) or smash everything but can’t loot it (if they do). And if you actually enter the city, then prepare for a giant guerrilla-warfare mess.
At least in war, technology made individual heroism pretty ridiculous. In 1000 years we have gone from individual warriors who didn’t seem to take their lives so seriously, who led hordes of foot soldiers, to huge armies of miserable conscripts trained by drill, to combined arms and strict professionalism.
A Bronze Age hero is Achilles or Odysseus. A modern hero is someone who charged the enemy after their machine gun jammed and saved his squad.
Hectic pace of modern life? That mostly consists of communication. Consider Jesus. If that happened in the 1800s, the outside world would hear of it and be commenting on it. Pontius Pilate might ask for outside advice or authorization and the Disciples would also have outside resources. If it happened today, people in distant countries would be discussing events *before* the Disciples had decided what to do next.
Stop comparing modern reality to ancient fiction and acting like something has changed.
The reality of Greek warfare involved large numbers of disciplined and individually interchangeable troops, just like today.
Greek heroes who actually existed also consisted of ordinary soldiers who performed extraordinary acts of selfless sacrifice for their comrades or their cause, just like today, while Greek mythology consisted of implausibly tough and competent bad-asses running around like a one man army, JUST LIKE TODAY.
To think what heroic Greek soldiers would feel about being remembered as being like Achilles. It’d be like remembering Vietnam War veterans as being like Major Alan Schaefer.
I think you’re conflating the much more modern classical era with the times the myths are written about? Also I wouldn’t quite call them fiction, either. They definitely aren’t like the modern romances told for city dwellers to whom conflict is alien.
Clearly there have never been hypercompetent badasses as a normal element of war. But I think that you could probably draw a line somewhere that marks the time when such a person changes from a romantic idealization of reality, to a rather ridiculous figure.
You can also note examples that survive (Fighter pilots seemed to be this very strongly in the popular imagination, and to a lesser degree in reality, for the entire time that fighter aircraft have existed).
You can also look at what modern fiction does. Consider “real robot” Japanese fiction which tends to use a technological excuse to make fighting units tiny and slow confrontations down to an armored-hand-to-hand speed.
Those are all good points, especially about fighter pilots.
But I still think the major practical transition here occurs at about the “invention of cities” time-frame rather than around the Industrial Revolution. To the extent that the Greeks romanticized the individual hero in an era of professional soldiers, they did so to the same extent, for the same reasons, and despite the same conflicts between reality and fiction as we do. (Which would still include much of the industrial world for much of its history, and whose fiction still had many of the same tropes.)
I think there are actually two slightly different types of warrior hero in the popular imagination.
The first type, the epic hero, is Bronze or early Iron Age in origin. This type represents a warrior aristocrat, but the emphasis here is on warrior: this kind of hero grows out of traditions from tribal chiefdoms, pastoralist or small-scale agricultural, but either way dirt poor, too poor to support much difference between a grunt and a prince. An epic hero is likely to be rich by their society’s standards, but that just means they own a sword and maybe a mail coat. Strength and courage are emphasized, and by modern standards they’re likely to come off a little bit crazy. Beowulf is one of these. So is Achilles.
The second type, the romantic hero, is also a warrior aristocrat, but here the aristocrat is emphasized. They come from more stratified and technologically adept societies, roughly the 12th century onward in Europe: places that’ve developed their military arts to a point where a mounted and armored warrior can kill an arbitrary number of shit-kicking peasants armed with pitchforks. The emphasis here is on honor, grace, and style. Think Roland or King Arthur.
Simply because we can kill vastly greater numbers of people, at a distance, and increase collateral damage to the extent that we may as well be sacking cities and taking slaves does not change the underlying question: is this necessary? is it ethical, moral or reasonable? should we completely blow Hiroshima/Carthage to rubble to show to the watching powers that we mean business and you mess with us, this could be you next? Or is that going too far?
If murder is wrong, then it is not a case of it’s only wrong if it’s a Bronze Age prohibition in the days when I could hit Joe over the head with a rock because he annoyed the heck out of me, but hey, now that I’m using a deathray to vaporise his brains instead, this is completely different and not really murder because the hectic pace of modern life, right?
Indeed. Some of my great-great grandparents were farmers, not an easy business to be in, and they were doing it without reliable five-day weather forecasts or any way of keeping milk cool, or antibiotics or vaccinations for their stock.
K’th’ranga V would seem to be vulnerable to a “Broken Telephone” effect, where all the descriptions of the problems and solutions sent up and down the chain would be prone to such extreme garbling that the end results would be pretty random and not very well suited for whatever the original problem was.
New Capricorn, as Anatoly already pointed out, would seem to have been anticipated by Jonathan Swift centuries ago.
Perhaps the same could be said for all ethical systems.
I’m not entirely sure if such low-effort comments are frowned upon here, but I believe that a customary response to this is “Your words are as empty as your soul. Mankind ill needs a savior such as you.”
More on topic, I think at least the first two will keep bugging me now, just as I have a few pictures that I had in my mind for years but won’t ever draw them due to insufficient skill. Likewise, I think I may have written less than 1000 words of fiction in my life, so the result would most likely be dreadful, but scraps of Zyzzxians’ story will float around in the back of my head in 2020.
In case there are others who didn’t get the joke, and weren’t sure if BarryOgg was really taking STA to task, and if so why? why would he do that?: http://www.youtube.com/watch?v=7z65s0fqDw4
(I didn’t get the joke, and actually worried about BarryOgg’s comment for several minutes: he describes his own content-free criticism as “low-effort,” so he isn’t just assuming all readers understand his own ethical system as obviously true [or, if you’re one of the evil according to BO’s worldview, obviously false]. But he goes ahead and says it anyway! Is this some kind of trolling triple-axel?
All I’m saying: there’s a consequentialist/utilitarian case to be made for providing links when you make references to things in blog comments, so people who don’t get it don’t spend a bunch of time trying to parse your comments. [On the other hand, people who don’t get things should just google phrases in quotes and stop wasting people’s time on meta-jokey comments that waste everyone’s time.])
PS: Thanks for cluing me in to the Castlevania bit, BarryOgg – I enjoyed it, even though it made me feel foolish for not googling it sooner.
And, to be clear: the whole thing? Where I didn’t get the reference and wrote far too much instead of just posting a link as a followup comment? Totally my fault.
Astrology, Tarot cards, and delphic oracles suggest that cryptic, need-to-be-decoded guidance isn’t necessarily a bad thing.
This is amazing. Zyzzx Prime was my favorite because all of their problems revolve around a seemingly impassable issue – progress reduces your ability to make future progress. It’s fun trying to figure out a way around this. Maybe have one person be the brains behind the research operation and perform all of their actions through someone else. This assistant can lie to the researcher so he never knows when a breakthrough is imminent and never loses his intelligence. But what if the assistant also loses his intelligence when progress is being made? And even if this isn’t a problem (an unintelligent assistant can still do the job), whoever came up with this strategy for making progress would lose his intelligence and motivation before being able to implement it. It’s maddening.
Or just pit two brilliant scientists against each other in a reality TV show style format, manipulating each of them along the way into thinking the other is on the brink of the next breakthrough?
The person(s) running the reality show loses his intelligence as soon as the scientists start to make progress and the project collapses.
I don’t think anyone has ever considered “person running a reality show” to be high-status.
My more serious reply: I don’t think Nick envisioned an ACTUAL reality show. I think it would be enough to confine some scientists to quarters and cut out most media communications, constantly keeping them in the dark about how their progress measures up to the others. The people running it could keep it secret, so they don’t gain status.
Celebrity culture. As an emerging class of celebrities grows to command increasingly higher social status, status of non-celebrity fields declines and advances become feasible. Zyzzyx Kardashian is hailed as the founder of a new age of progress and prosperity.
That’s really clever. You have to rework the system so that progress and status are decoupled.
Alternately, Zyzzyx Kardashian is accepted as the founder of a new age of progress and prosperity, but nobody really cares about that sort of thing anymore.
Modern corporations have solved this problem: competent workers are too valuable to replace, so promotions are reserved entirely for the incompetent. (It’s known as the Inverse Peter Principle.) Competent workers never achieve high social status, so they never get stupid. 😉
Also, competent workers whose abilities are underrewarded, but who would be Peter-principled or Zzyzx’d into incompetence if they moved up their corporate ladder, move into independent consulting instead (typically a less secure position).
Yup! Corporations are one solution.
An inhumane egalitarian organization where no individual is permitted to have social status is another. Or maybe it’s essentially the same as corporations.
A third option is to install literal Damocles’ swords above every individual, hanging by threads that get more threadbare the more productive the individual is. (If we make the swords metaphorical, like a risk of random firing proportional to the employees’ output, then we’re back with corporations again.)
> inhumane egalitarian organization where no individual is permitted to have social status
What does this even MEAN in practice? “Alright everyone, if we catch you thinking about whether you are higher or lower status than someone else, you’re fired.” You can’t just tell people they’re not permitted to have social status.
Sounds like the real world…. 😛
A sufficiently motivated person on the brink of senility hires an army of redundant ghostwriters, who are unlikely to achieve high status, and who will hire replacements in case one of them breaks through.
A ghostwriter who is near a breakthrough realizes, “Woah, I’m going to solve this! I’m awesome/I’ll be famous!” and then they go dumb. Their replacement becomes familiar with the previous writer’s work and suffers the same problem…
Going to be famous is not as stable a position as actually being famous, as half the waiters in Los Angeles can tell you.
Fair enough. I was thinking very close to success, but even then I suppose if a ghostwriter goes dumb, that’s a good indication they’re close, so one could release what they’ve done.
Enforced anonymity should solve most of their problems, but also mess up incentives.
K’th’ranga V is essentially the human brain, with different processing layers built on top of a scared, horny lizard.
Zyzzx Prime is pretty similar to Capitol in Orson Scott Card’s The Worthing Saga.
In it, there is a drug called somec that extends your life through hibernation. Access to the drug is regulated by some measure of social value. So all of society’s luminaries, celebrities, business magnates, etc. get access to very high levels of somec (as measured by awake time vs asleep time) and get to skip across the generations. Circularly, high levels of somec are seen as one of the most important things an individual can pursue. One consequence of this is that scientific progress totally stagnates.
Bah I screwed up my formatting and then was on a plane and couldn’t fix it within the hour.
Also conveyed succinctly by Mr. Munroe
xkcd: Cryogenics http://xkcd.com/989/
The joke about the sea creature that eats its brain, and how this is like tenure, is a curious meme. It circulates almost exclusively within neuroscience and its associated fields but its ultimate origins are unclear to me. (The animal is usually a sea squirt, a primitive chordate, and “eats” is fanciful, as we don’t typically describe a butterfly as having eaten a caterpillar.)
Here’s Daniel Wolpert using it in a TED talk. Note that the butt of the joke is the American tenure system (what are the salient ways that European academia differs?) http://www.ted.com/talks/daniel_wolpert_the_real_reason_for_brains
Daniel Dennett uses it in Consciousness Explained, but uncertainly attributes it to Rodolfo Llinás.
Would you get upset if someone actually wrote something inspired by these ideas? It would be a fun challenge to write a story set in each world.
No, please, go ahead.
Actually, I may put some effort into something like your K’th’ranga V. I’d leave out the superintelligence stuff and just use the “chain of islands” bit, which to my view is sufficient to make the cool-deep points about conservatism versus social advancement. Adding AI would muddle the theme.
It starts with a young woman on a ship, traveling from the next-to-last island to the final “island of the priests.” But she is not from the next-to-last island. She is from high-tech-land, a spy, chosen for her mastery of history, geography, and dialect. No one else was found who had a chance to fool, successively, the leaders of each island, to convince them that she was local and thus eligible to be part of the delegation.
She has something hidden on board, a chest, of which she is quite secretive, quite concerned, even as she does nothing to hint it is hers.
When she arrives at the island of the priests, posing as a meek servant within the delegation — things will happen.
(Now I just need to think of what things. What does the protagonist want?)
Maybe there are several factions on the mainland, and she hopes to sway outcomes to favor hers.
Archimedes’ Chronophone fic?
I can’t figure out why high-tech land still listens to bronze-age land at all. A very simple and bloodless revolution seems to be in order.
Rawls’ Veil of Ignorance method says that the best way to determine morality is to know as little as possible. Therefore, we should keep our moral leaders maximally ignorant.
@Luke — that is a good question. My best stab at an answer is something like this: the higher tech people were plagued by factionalism and turned to the old guard to provide something that looks like stability.
But of course that stability is a lie. No such situation could be stable for long. But it is something like a tradition. There have been ups and downs, attempted coups, both among the priests themselves and among the tech people. Also there is behind-the-scenes espionage. (In my story there will be blocks against radio communications to the island, which would be a plot element.) Similarly, I am imagining a ban on weather control near the island, since that could be used to manipulate the priests: “Hurricane! The gods are angry!” (There is a high-tech island maybe 500 miles away that insists on controlling its own weather, despite the fact that weather moves. There is much objection to this, but so far they’ve gotten away with it.)
Anyway, the point is this: at the start of the story the priests are at a high point of their power, cuz of the particular history of mainland factions. But there is pending instability.
Which is the story.
(The fun part will be translating some of the tech-world’s problems to the various languages in between, to show explicitly how the “telephone” game plays out.)
I really want to do this, except it’s been forever since I wrote anything of substance and I have problems getting started.
Someone should tell me to get started. And then like bug me about it. Again and again.
Provide your email (to mine, say) and I can do that. My middle initial is A and my gmail address is my name with that in the middle, separated by ‘.’
I can also beta-read. As one, I tend towards harsh.
It’s kind of interesting to see how this can work in real life. IDK, but it seems Christianity isn’t too incompatible with modernity, provided you stay away from the natural history.
Of course, one issue with that is that you can reach the Industrial Age unhindered and then get smashed by post-industrial duty rosters killing the Sabbath and FAI killing God.
I’ve been trying to compose a humor story in the general genre of Scott’s fiction posts, where the protagonist (or perhaps two, a physicist and a moral philosopher) wakes up one day in the literal Least Convenient Possible World. I haven’t been making much progress, however, so as long as we’re sharing ideas for possible fiction premises, I’ll leave that out there for Scott and the rest of you.
I thought about this idea. It’s interesting and fun to think about, but I don’t think it would work as a story.
The attraction of such a story, to me, would be in learning more about the concept of Least Convenient Possible World by seeing it from another angle, and also in admiring the author’s skill at crafting a series of events that both conforms to a Least Convenient Possible World and makes a good story. If a story with this premise doesn’t have these attributes, then it may be a good story in other ways, but it will not live up to its premise.
However, when I thought about the concept of a literal LCPW, it seemed like the author can make the world anything they want. The LCPW concept only applies when you want to remove practical considerations from a philosophical discussion. Yet there is no standard way to choose a philosophical discussion, and thus the LCPW in the story could remove the practicalities from any set of philosophical discussions the author chooses. For example, if the author decides that the protagonist struggles with Pascal’s Wager, then Omega might tell them that either God does not exist or Catholicism is absolutely right, but if the author decides that the protagonist doesn’t care about Pascal’s Wager, then such a thing would not happen.
The author would be given too much freedom to craft the story world into anything they want. The task of fitting a story into the LCPW mold is too easy for me to be impressed with it. And similarly, because a LCPW could be almost anything, I wouldn’t be able to learn much about them by just seeing one example.
And then the lost astronauts stuck on Zyzzx Prime discover: it was Earth all along! (Seriously, this would explain a lot.)
“Whenever it wanted a high-level decision made, it would talk to a slightly less powerful superintelligence”
Chinese whispers + ontological crisis + complexity of values = epic fail
Gamma Andromeda – the kit that does this, the “stoicism box”, does it keep logs? If the box keeps track of the shocks and opiates needed to maintain equanimity, then you’ve got an interesting log to analyse. I foresee a lot of people being shocked – or rather, getting a big opiate boost – when they realise that their shock-to-opiate ratio has been moving progressively towards the opiate side. Or maybe I’m being too cynical about their ability to cultivate virtue.
I’m reminded of the people unable to feel bodily pain (CIP), although it’s not a close analog of what we have here, as they can still feel anguish just fine. People with that condition tend to accumulate lots of minor (and not-so-minor) injuries, which can really get some of them down.
I really hope that Zyzzx Prime isn’t Earth because the solution might be a constitutional jerkocracy. The idea is to tap into primal notions of high status – preferably pre-linguistic ones. Your jerkocrats need to have privileges such as being able to jump queues, eat first, take food from other people’s plates, interrupt people while talking, make lots of noise etc. – things which involve a lot of pushing people around but which don’t really involve controlling them in fine detail. More sophisticated things include taking credit for other people’s ideas. The key idea is that stupidity makes a jerkocrat a more effective jerkocrat – firstly it makes it easier to wind up everyone else, and secondly, it makes it easier to keep them away from situations where they could actually cause real trouble… one hopes. There’s always the danger that a jerkocrat gets their hands on something actually important, and then oh dear.
I need a shower.
Why … would you want this?
The idea was to make sure that the people who took the actual decisions didn’t _feel_ high-status, which might then mean the loss-of-intelligence doesn’t get them. Of course the precise trigger for loss-of-intelligence on Zyzzx might not be that. Also, I suppose I’m identifying with the ordinary people who don’t want catastrophically incompetent government or a string of revolutions, and not identifying with the unlucky people who’d be the class of people who actually ran the things the jerkocrats were nominally in charge of.
Except the whole idea makes me shudder so much that maybe I’d take my chances with the inept governments and the periodic revolutions…
Oh, wait, I think I get it. Jerkocrats “rule” by pushing around ordinary folks, but it’s actually a secret democracy. Kind of like England.
Riffing on this:
Omicron Zeta IIIA
Quirks of the Omicronian neurophysiology permitted the development of full consciousness emulation and transfer technologies slightly earlier than for other species, allowing the Omicronians to solve their growing overpopulation problem by uploading half their population into a simulated environment. Unfortunately, the dominant Omicronian value systems were greatly at odds with living in a fully simulated reality, leading to the Great Compromise: Omicronians with bodies will spend every other day uploaded into the Simnet, trading places with a bodiless Omicronian.
A few orbits ago, a group of communist-egalitarian Omicronian terrorists uploaded a particularly nasty virus into the Simnet, which causes the body-allocation process to be completely random. Omicronians with bodies will always find themselves in their own bodies, but Omicronians without bodies will find themselves in a completely random person’s body, and incapable of conclusively proving their own mental identity. Political expedience and social values have together led to the policy that social reputation, personal property, real estate, and binding contracts in the physical world are all properties of bodies, not minds; thus, a bodiless Omicronian may experience vastly different social standing and opportunities from one day to the next, and an embodied Omicronian may find themselves beholden to agreements and decisions made by minds with vastly different goals and preferences.
Paging Dr. Rawls.
Cool concept. What’s the story?
This premise made me think of a much more boring story – a romance novel in which two people keep falling in love and they keep trying to find each other, but every time they come close, the randomization occurs.
That doesn’t sound boring at all.
The Zzyzx one reminds me of that *Monty Python* sketch – ‘The Funniest Joke in the World’. In the sketch the British army discover a joke which is so funny that people die of laughter whenever they hear it, which they use as a weapon to win World War II. Part of what makes the sketch so funny is that the viewer never gets to hear the full English version of the joke, because exposure to the joke is so fatal.
On Zzyzx you have the problem that anyone sufficiently intelligent to discover a cure for the brain-atrophying hormones inevitably becomes high status and can’t continue their research, and anyone reading their lab notes will almost immediately have the same epiphanies and shut down their brains. It would make a fantastic story to have the military intelligence divisions of two sides in a Zzyzxian war engage in memetic warfare to try to raise the status of the opposing country’s generals and politicians by exposing them to predeveloped status-raising ideas created in pieces by shadowy researchers who are carefully recruited from the ultra-low-status demographics of society. You could tell the story of a researcher who was accidentally exposed to enough weaponised status-raising material that she becomes addicted to the feeling of power, and uses all of the concentrated status-memes she has access to to develop her own power base and overthrow the existing system. The story could end with one of her advisors recommending setting up a secret military installation to raise the status of her political enemies and prevent them from ever thinking up ways of overthrowing her…
Here’s the link for anyone interested in the Pythons – I’m not even going to try to format the HTML to embed it! http://www.youtube.com/watch?v=ienp4J3pW7U
Hey Scott, what did you think of Neal Stephenson’s Anathem?
K’th’ranga V reminds me of a Discordian fable about the mode of failure for bureaucracies.
That link seems to be broken.
Here’s a link that differs from the original in case, in both directions. (also it has .html, but that isn’t enough to save the other link)
But I wouldn’t say that it “works.” This is better:
PS – people who run web servers should use mod_speling’s case correction.
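For anyone who hasn’t met it: mod_speling (yes, the module name really is spelled that way) is a stock Apache module that compares a misspelled URL against the files that actually exist and redirects to the closest match. A minimal sketch of enabling it might look like this (the module path and directory are illustrative and vary by distribution):

```apache
# Load mod_speling (module path varies by distribution)
LoadModule speling_module modules/mod_speling.so

<Directory "/var/www/html">
    # Correct URLs that differ from an existing file by
    # capitalization or a single-character misspelling
    CheckSpelling On
    # Optionally restrict correction to case mismatches only
    CheckCaseOnly On
</Directory>
```

With `CheckSpelling On`, a request for `/Index.HTML` would be redirected to `/index.html` if that file exists, which is exactly the case-correction being suggested here.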
Yeah, I copied the URL across from memory; I’m not surprised it doesn’t work.
I would be more interested in a Zyzzx Prime where the people aren’t surprised by their natural life cycle. They’ve had thousands of years to settle into a normality. What do they expect out of life, and how do they organize their lives to match that? If we are to maintain focus on a time of their political and scientific change – how might these changes disrupt the average person’s life plan?
I hope you will give your perspective on the Ebola crisis some time soon…
Whenever I saw the word “K’th” I couldn’t help but read it as referring to an element in a series (the 1st island, the 2nd island… the K’th island). I hope that was intentional, and I’m not just seeing patterns that aren’t there…