[Epistemic status: not very serious]
[Content note: May make you feel overly scrutinized]
Sometimes I hear people talking about how nobody notices them or cares about anything they do. And I want to say…well…
Okay. The Survey of Earned Doctorates tells us that the United States awards about a hundred classics PhDs per year. I get the impression classics is more popular in Europe, so let’s say a world total of five hundred. If the average classicist has a fifty year career, that’s 25,000 classicists at any given time. Some classicists work on Rome, so let’s say there are 10,000 classicists who focus solely on ancient Greece.
Estimates of the population of classical Greece center around a million people, but classical Greece lasted for several generations, so let’s say there were ten million classical Greeks total. That gives us a classicist-to-Greek ratio of 1:1000.
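The chain of guesses above can be written out explicitly. A quick sketch, using only the post's own rough figures (none of these are real survey numbers beyond the first):

```python
# Back-of-the-envelope version of the estimate above.
# Every input is a rough guess from the text, not hard data.
world_phds_per_year = 500      # ~100 US PhDs/year, scaled up for the world
career_years = 50
classicists = world_phds_per_year * career_years    # 25,000 at any time
greece_only = 10_000           # after setting aside the Rome specialists

greeks_at_once = 1_000_000     # population of classical Greece
generations = 10               # "several generations" of turnover
total_greeks = greeks_at_once * generations         # 10,000,000

print(f"1 classicist per {total_greeks // greece_only} ancient Greeks")
# prints: 1 classicist per 1000 ancient Greeks
```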
It would seem that this ratio must be decreasing: world population increases, average world education level increases, but the number of classical Greeks is fixed for all time. Can we extrapolate to a future where there is a one-to-one ratio of classicists to classical Greeks, so that each scholar can study exactly one Greek?
Problem the first – human population is starting to stabilize, and will probably reach a maximum at less than double its current level. But this is a function of our location here on Earth. Once we start colonizing space effectively, we can expect populations to balloon again. The Journal of the British Interplanetary Society estimates the carrying capacity of the solar system at forty trillion people; Nick Bostrom estimates the carrying capacity of the Virgo Supercluster at 10^23 human-like-digitized entities.
Problem the second – does the proportion of classics majors remain constant as population increases? One might expect that people living in domed cities on asteroids would have trouble being moved by the Iliad. Then again, one might expect that people living in glass-and-steel skyscrapers on a new continent ten thousand miles away from the classical world would have trouble being moved by the Iliad, and that didn’t pan out. A better objection might be that as population increases, the amount of history also increases – the year 2500 may have more historians than we do, but it also has five hundred years more history. But this decreases our estimates only slightly – population grows exponentially, but the amount of history grows linearly. For example, the year 2000 has three times the population of the year 1900, but – if we start history from 4000 BC – only about two percent more history. Even if we admit the common sense idea that the 20th century contains “more” historical interest than, say, the 5th century, it still certainly does not contain three times as much historical interest as all previous centuries combined.
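The 1900-versus-2000 comparison in that paragraph checks out with round numbers (the population figures here are my own rough assumptions, not from the post):

```python
# Exponential population vs. linear history, per the paragraph above.
pop_1900, pop_2000 = 1.6e9, 6.1e9        # rough world populations (assumed)
history_1900 = 1900 + 4000               # years elapsed since 4000 BC
history_2000 = 2000 + 4000

pop_ratio = pop_2000 / pop_1900                    # ~3.8x more people
extra_history = history_2000 / history_1900 - 1    # ~1.7% more history
```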
So it seems that if human progress continues, the number of classicists will equal, then exceed the number of inhabitants of classical Greece. Exactly when this happens depends on many things, most obviously the effects of any technological singularity that might occur. But if we want to be very naive about it and project Current Rate No Singularity indefinitely, we can just extend our current rate of population doubling every fifty years and suggest that in about 2500, with a human population of five trillion spread out throughout the solar system and maybe some nearby stars, we will reach classicist:Greek parity.
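The parity date can be sanity-checked with the naive projection the paragraph describes (the starting year and both counts are the post's guesses):

```python
# Current Rate No Singularity: double the number of classicists every
# 50 years until it matches the 10 million total classical Greeks.
classicists, greeks = 10_000, 10_000_000
year = 2015                     # assumed starting point
while classicists < greeks:
    classicists *= 2
    year += 50
print(year)   # 2515 -- roughly the "about 2500" in the text
```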
What will this look like? Barring any revolutionary advance in historical methodology, there won’t really be enough artifacts and texts to support ten million classicists, so they will be reduced to overanalyzing excruciating details of the material that exists. On the other hand, maybe there will be revolutionary advances. The most revolutionary one I could think of would be the chronoscope from The Light of Other Days, a device often talked about in sci-fi stories that can see into the past. Armed with chronoscopes, classicists could avoid concentrating on a few surviving artifacts and study ancient Greece directly. And since the scholarly community would quickly exhaust what could be learned about important figures like Pericles and Leonidas, many historians would start looking into individual middle-class or lower-class Greeks, investigating their life stories and how they tied in to the broader historical picture. A new grad student might do her dissertation on the life of Nikias the random olive farmer who lived twenty miles outside Athens. Since there would be historian:subject parity, it might be that most or all ancient Greeks could be investigated in that level of detail.
What happens after 2500? If the assumptions mentioned above continue to hold, we pass parity and end up with more classicists than Greeks. By 3000 there are a thousand classicists for each ancient. Now you wish you could do your dissertation on the life of Nikias The Random Olive Farmer. But that low-hanging fruit (low hanging olive?) has been taken. Now there is an entire field (olive orchard?) of Nikias The Random Olive Farmer Studies, with its own little internal academic politics and yearly conferences on Alpha Centauri. In large symposia held at high-class hotels, various professors discuss details of Nikias The Random Olive Farmer’s psychology, personal relationships, opinions, and how he fits in to the major trends in Greek society that were going on at the time. Feminist scholars object that the field of Nikias The Random Olive Farmer’s Wife Studies is less well-funded than Nikias The Random Olive Farmer Studies, and dozens of angry papers are published in the relevant journals about it. Several leading figures object that too little effort is being made to communicate the findings of Nikias The Random Olive Farmer Studies to the general public, and there are half-hearted attempts to make little comic books about Nikias’ life or something.
By 3150 this has gotten so out of hand that it is wasting useful resources that should be allocated to fending off the K’th’rangan invasion. The Galactic Emperor declares a cap on the number of classics scholars at some reasonable number like a hundred million. There are protests in every major university, and leading public figures accuse the Galactic Emperor of being anti-intellectual, but eventually the new law takes hold and the grumbling dies down.
The field of Early 21st Century Studies, on the other hand, is still going strong. There are almost a thousand times as many moderns as Greeks, so we have a more reasonable ratio of about fifteen historians per modern, give or take, with the most interesting moderns having more and the ones who died young having fewer. Even better, the 21st Century Studies researchers don’t have to waste valuable chronoscopes that could be used for spying on the K’th’rangans. They can just hunt through the Internet Archive for more confusing, poorly organized data about the people of the early 21st century than they could ever want.
Gradually the data will start to make more and more sense. Imagine how excited the relevant portion of the scholarly community will be when it is discovered through diligent sleuthing that Thor41338 on the Gamer’s Guild forum is the same person as Hunter Glenderson from Charleston, South Carolina, and two seemingly different pieces of the early 21st century milieu slide neatly into place.
A few more population doublings, and the field of Hunter Glenderson From Charleston Studies is as big as the field of Nikias The Random Olive Farmer Studies ever was. The Galactic Emperor is starting to take notice, but the K’th’rangans are in retreat and for now there are resources to spare. There are no more great discoveries about new pseudonyms to be made, but there are still occasional paradigm shifts in analysis of the great events of Glenderson’s life. Someone tries a Freudian analysis of his life; another a Marxist analysis; a third writes about how his relationship with his ex-girlfriend from college ties in to the Daoist conception of impermanence. All these people have grad students trawling old Twitter accounts for them, undergraduates anxious to hear about their professor’s latest research, and hateblogs by brash amateurs claiming that the establishment totally misunderstands Hunter Glenderson.
Late at night, one grad student is preparing a paper on one of Glenderson’s teenaged Twitter rants, and comes across his tweet: “Nobody notices me. Nobody cares about anything I do.” She makes special note of it, since she thinks the irony factor might make it worth a publication in one of the several Hunter-Glenderson-themed journals.
When the historians trace the intellectual and cultural influences back to and through this blog, I think Scott will have well over fifteen specialists fawning over him.
Judging from the amount of time many of us spend on SSC, he already does.
Or all the insight from his blog will have filtered down by then, and they’ll find him boring, surprised that we could spend so much time on it.
If his idea of ‘read history of philosophy backwards’ proliferates far enough, will people have to consider this maxim on the meta-level?
(Think about how it must have been before people thought about how it must have been before significant paradigm shifts had occurred.)
IN THE FUTURE, EVERYONE WILL BE FAMOUS TO FIFTEEN PEOPLE
Oh good, I was afraid Heinlein had written a story about this or something. At least I’m only terribly unoriginal about titles rather than terribly unoriginal about content.
That makes sense. Title-space is much smaller than content-space.
In the future, Scott Alexander historians will wonder whether you’d read that before posting this. There’ll be an entire debate about whether that comment qualifies as an admission of stealing the title, or whether it was saying it was a coincidence; the latter side will debate amongst themselves whether or not it was really a coincidence or if you read the article but then forgot about it.
This is a major reason why I am skeptical of claims that future civilizations, human or otherwise, would not consider it worthwhile to scan cryopreserved brains, provided much of the information is recoverable and civilization isn’t fragmented at very small scales.
Not necessarily to give the scanned minds a great life or big cut of local resources thereafter, but at least for the information value. Giving the subject a great quality of life under research ethics standards wouldn’t be prohibitively expensive relative to Hunter Glenderson Studies either. However, a big chunk of a Jupiter Brain, or near-maximal lifespan, could be far too expensive. And some of the ways to get the most historical value out of a mind could be in tension with its preferences, like making trillions of short-lived copies to run through experiments in varied primitive circumstances to improve predictive models of Hunter Glenderson and generalize to other questions and counterfactuals.
Your last sentence makes me wish Scott had carried this through to the obvious conclusion that we are all ancestor simulations being used to inverse-solve for the details of significant historical events.
They look at us wondering how the arrow of time works and laugh, since they know it’s running backwards.
A little piece of history I happened upon. Whatever became of buddiiboii12?
Papers and Academic Books from the Future:
A Romantic Reconstructionist Critique of the Slate Star Codex
Ozymandias: A Partial Reconstruction of the Lost Years
From Marx to Moloch: Rediscoveries of Societal Unity and Destiny Control Within the After War Transhumanist Community
Heil Dir Im Siliziumscheibe: Shedding Light on the Unlikely Origins of the Graphene Bismarck
Analysis of the Effulgence Axis Members as Jokers
An AI Planning Locus Analysis and Simulated Prototype of the Archangels
Military Capabilities of the Weltraumburg Empire: An aromantic approach
Basically, the world wars and the resulting loss of confidence.
“I’d rather be 1/1000th-of-a-person’s favorite thing, than 1 person’s 1000th favorite thing.”
Does history grow linearly? Time may be steady, but we’re certainly producing far more information now than humans in the past did. Even if you limited yourself to fiction, I bet the amount published in just the past year surpasses that from the first thousand years BC.
I agree, I think history grows exponentially. With a small constant, to be sure, but the pace of events has increased steadily as the ability to communicate has increased, and I think that has clear effects on how much history is ‘being created’ each year.
This. Aside from the fact that we have way more people, things are changing much more rapidly. Maybe that will calm down in 10 years and we’ll have another 10 interchangeable generations, but that seems unlikely to me.
When was the last time there were 10 interchangeable generations?
In the Western world, maybe the Middle Ages? There would be way more recent runs than that in isolated tribal societies, though.
Yeah, I concur. Even before I read this post, I was thinking about the size of all information about future history, and how it will be bigger given a bigger human population, and bigger given how much more data is collected per person, and per event, for this century than it would be for any prior century.
Upon reflection, I think I failed to account for the data given us by the documentary Star Trek. At some point in the near future, culture seems to stagnate, so that we no longer produce any noteworthy literature or great art for the next few centuries. Note, for example, how the crew is still acting out holonovels based on Sherlock Holmes (the original, not even an updated Space Holmes). Tom Paris is a fan of old (by our standards) motor vehicles; you never see anybody restoring an early-era shuttle. Apparently history and classics really do stop producing new material to analyze after a while.
> Note, for example, how the crew is still acting out holonovels based on Sherlock Holmes (the original, not even an updated Space Holmes). Tom Paris is a fan of old (by our standards) motor vehicles
Excuse me, I believe you are forgetting the classic novel that will be written sometime in the 2030s by an author from a planet orbiting Betelgeuse, as noted by Captain Kirk in the episode “City on the Edge of Forever”.
The Holmes holonovels and vintage car enthusiasts you instance are obviously the 24th century equivalent of today’s early music enthusiasts.
On the contrary, this illustrates the point! Human culture stagnates to such a point that one has to go all the way to Betelgeuse to find a noteworthy novel.
It’s not the presence of pre-21st century enthusiasts that is noteworthy, it’s the total lack of any enthusiasm for what comes after. Where is the retro Ten Forward where all the waitresses wear TOS-style miniskirts? (Wearing TOS-period uniforms because you have time traveled to TOS does not count.) Why does nobody enjoy a good space-western about the early days of exploration? For some reason, they’re all stuck on the old American West, 20th Century detective stories (not just Holmes), and washed up Vegas singers from the 1960s.
It’s clear that our current rash of remakes is the beginning of the end for our culture; we can no longer produce anything new and worthwhile. Star Trek proves that the Neoreactionaries are right!
But the 24th century uniforms are too near in time. Where are all the theme bars of today featuring people dressed up in 90s neon styles? (I except the 80s school disco nightclub events which are simply excuses for 40 year olds to get trashed like they’re 20 again).
We romanticise the clothing of the 19th century (for instance) because it’s distant enough from our own era, but to those of the period and immediately after, men’s wardrobe was dull and boring and stodgy everyday wear.
To the 25th century, our times are as exotic and nostalgic as 16th/17th century styles are to us. The 24th century, on the other hand, is their grandfather’s time and (despite Macklemore) how many of you are dressing up in your grandfather’s clothing?
I think the Star Trek writers missed a trick here. Something like this would probably have been a hit with the fanbase, although you couldn’t introduce it too early in the series without confusing people.
This made me realize something else: Any time we meet an alien society roughly as developed as our own, the amount of history available for study will double. Forget Nikias; our resources will be spent delving into the finer points of the K’th’rangan Sm’glox Restoration, and trying to unify those ideas with the teachings of Surak.
Forget the K’th’rangan; I myself will war to the death in support of TOS “The Savage Curtain” Surak (even if you want to hair-split and quibble that was only the Image of Surak) as against Star Trek: Enterprise revisionist Surak and the whole retconning of the Vulcans as xenophobic racist assholes.
(Is swearing, or at least vulgarity, such as “assholes” acceptable on here? If not, please remove it and replace it with derogatory-epithet-indicative-of-contempt of your choice).
I thought the same, since the amount of history for any given period should be proportional to how many people were alive, because humans create history (among other sources.)
That points toward an interesting idea. Suppose we could divide history into “the history of people” and “the history of stuff other than people.” The first is the set of all historic events on the human scale (elections, revolts, books, etc.) and the second is those that have almost nothing to do with people (volcanic eruptions, supernova explosions, algae blooms).
The first set probably increases exponentially with time, since it’s proportional to the population. The second probably increases linearly. So we have a ratio (people history):(non-people history) that decreases.
On the other other hand, some history really hasn’t scaled with human population. US Presidential elections are still every four years, and there is still just one president at a time. The number of Senators and Governors, though it has increased a little, has not nearly kept pace with population.
I’m not sure what the accounting would be with wars; I imagine a lot depends on how you decide to count and divide them. Do we divide ISIS in Iraq from ISIS in Syria? Is our current Iraqi bombing a new war or a continuation of the previous? Do we count size of wars or number of wars?
Surely the amount of history generated in a period is proportional to the amount of man-years lived in that period? Then the 1000 years of history between 2000 and 3000 are very different to the 500 years of history between 1000 BC and 1 AD, simply because of the vastly higher population recently. (And that’s saying nothing about the quality/quantity of sources to study.) There will probably be classicists in the future studying increasingly finer-grained levels of ancient society, but there’ll be many more near-future-ists studying the Great Conquering and the enslavement of humanity by giant space ants; a single classicist studies N effective ancients, and a near-future-ist will probably study the same N effective near-future-folk.
Rough argument for that: the ease of publication in a field depends inversely on how heavily it’s already been studied—i.e., if you can find a topic no-one else has ever published on, anything you say on it will be publishable. Scholars will naturally distribute themselves uniformly with regard to the amount of studiable material. For historians, the amount of studiable material is proportional to how many man-years were lived in a period; hence, historians will ‘disproportionately’ study more populous times.
By the way—this right here is my second comment to SSC. (Hi, future historians!) How do I get a distinctive avatar-thingy? Is it a hash of the email field or something?
register and upload an image at gravatar.com . It propagates to any wordpress-based site.
Yes, just fill in the email field. It’s the md5 hash, if you want to know before you post.
Why do you have the same avatar as the OP?
It’s the default. If you don’t put in an email address, you get that. See my avatar.
I think that interest is generated by being in a sweet spot between being closely related to covered subjects, but different enough to be novel. If 6th century China becomes a hot subject among historians, then people will be interested in late 5th century China, and 6th century Mongolia, due to how they influence 6th century China. But just picking something that no one has studied isn’t a guarantee of success, because if there’s no connection to something people care about, people won’t care about it.
For the possible downside of multiple historians studying one single past subject, particularly when using time-travel technology to do so, see the story The Ninth Symphony of Ludwig van Beethoven and Other Lost Songs.
Link to story in PDF form.
Other links, probably more robust:
> The Survey of Earned Doctorates tells us that the United States awards about a hundred classics PhDs per year.
But each year the number of tenure-track jobs in classics is in the single digits. Most of those 100 do not spend 50 years studying classics.
> Once we start colonizing space effectively, we can expect populations to balloon again.
Even in sci-fi paradise, there will be no exponential population growth: space is basically flat, and there is a speed limit, so the human population in year Y will have to fit in ~Y^3 amount of space. Assuming there is a lower bound on how much space a future person can take, there will be about as many people alive in year Y + 50 as there were alive in year Y, for sufficiently large Y.
By my calculations we can continue to double once every 50 years for the next 7000 years before the speed limit becomes a problem (till 9133 to be precise). If we became EMs we would double faster but also require less space and experience more subjective time, and so we might still experience 7000 years before exponential growth was forced to halt.
EDIT: Actually I think the limit on our population growth is that the light speed limit and the sparsity of space stops us collecting matter fast enough to make more of us. After 7000 years of doubling there would be 10^52 of us, but there isn’t even 10^42kg of matter within that radius, so we would have less than 10^-10kg (grain of sand) each, which I’m not sure is enough to support an EM.
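The commenter's grain-of-sand arithmetic roughly holds up. A sketch with their numbers (both the reachable mass and the starting population are order-of-magnitude guesses, as in the comment):

```python
# 7000 years of doubling every 50 years, vs. the matter within reach.
start_pop = 1e10                 # order-of-magnitude current population
doublings = 7000 // 50           # 140 doublings
future_pop = start_pop * 2 ** doublings      # ~1.4e52 people

reachable_mass = 1e42            # kg within the light-speed bubble (guess)
kg_each = reachable_mass / future_pop        # ~7e-11 kg: a grain of sand
```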
Relevant link: the living will never outnumber the dead.
Unless we achieve immortality, I guess.
You’re making an order of magnitude error with your estimates. The number of PhDs granted/full time positions is quite large. Almost all of the PhDs granted go into non-academic jobs.
Also, the recent trend has been towards decreasing academics in general. Extrapolating from trends, in the future there will be no classicists (or physicists, for that matter).
This is kind of cheating, but there are also thousands of people who study the classics closely without being PhDs or in an academic setting.
It took several readings for me to realize that you intended the slash as a mathematical symbol rather than a grammatical one. When you put symbols in prose, the grammatical meaning predominates.
In “The Parasite” by Arthur C. Clarke, instead of scholars, future voyeurs are studying random PUAs.
To elaborate on Jaskologist’s point: if lifespans and growth rates are fixed, the ratio of living people to dead people will be constant(*). The reason the ratio seems to be growing is that we recently transitioned from a low-growth regime to a high-growth one. But if we stay in this regime, we’ll asymptotically approach a constant ratio. So I’m afraid the math doesn’t work out.
(*) Proof: Suppose everyone has the same lifespan, the population grows by a factor of R every lifetime, and there are currently N people. Then the number of people alive one lifetime ago was N/R; two lifetimes ago, N/R^2; and so on. So the total number of dead people is N/R + N/R^2 + N/R^3 + …, which is N(1/R + 1/R^2 + 1/R^3 + …), which is N/(R-1). So the ratio of living people to dead people is always (R-1):1.
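The closed form is easy to check numerically; a small sketch of the proof's geometric series:

```python
# Living:dead ratio under the proof's assumptions: fixed lifespans,
# population multiplied by R each lifetime. The series N/R + N/R^2 + ...
# sums to N/(R-1), so the ratio of living to dead is (R-1):1.
def living_to_dead(R, lifetimes=500):
    dead = sum(R ** -k for k in range(1, lifetimes + 1))
    return 1.0 / dead            # living population normalized to 1

assert abs(living_to_dead(2) - 1.0) < 1e-9   # R=2: one living per dead
assert abs(living_to_dead(3) - 2.0) < 1e-9   # R=3: two living per dead
```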
Ratio of living:dead isn’t the important variable here.
There are two different models of studying history. In one, you have X historians per time period – ie X historians studying the 1100s, X historians studying the 1200s, X historians studying the 1300s, et cetera.
In the other, you have one historian per X people – so if the 1200s had twice the population of the 1100s, there would be twice as many historians studying them.
I think the truth is probably somewhere in between. But as long as it’s not solely the second model, there’s room for the historian:person ratio of earlier centuries to keep rising.
Now I want to make a chart of the historian:person ratio for different eras. It should be easy to get population estimates, but historian estimates would be harder. I might have to go off published papers for different keywords.
Nah, your error bars on the (log) number of historians are going to be a lot smaller than on historical population. Maybe the population in 1200 is not so bad, but the population of Greece in 400 BC, let alone China or America? At least the question of the number of classicists is not politically charged. The only objection so far is a mere order of magnitude.
I don’t care about error bars so much as having any estimate at all.
I don’t think there’s even a way to go about estimating the number of, say, Renaissance historians. Google isn’t very helpful. Best I can do is go to individual universities’ departments, count what percentage of their historians deal with the Renaissance, average it out among a couple unis, and multiply by some estimate of total historians.
Or you could measure the proportion of this list endorsed by wikipedia.
But that doesn’t deal with the specialisations within the field (are you studying the Renaissance in Florence? Or in Northern Europe?) never mind trying to parse out where the High Middle Ages ended and the Early Renaissance started, when and in what locales.
I think you’re correct that future historians (if there still are historians) will find plenty to occupy them in smaller and smaller specialisations.
I think the most plausible model is that the number of historians for a period will be proportional to the amount of “interesting stuff” that happened during that period. (I guess the historian’s term for “interesting stuff” is “history”, so I’ll just say that.) So what is the relationship between a period’s population and the amount of history generated?
My assumption has been that it’s linear, which is why I think the ratio matters. One could imagine it being sublinear: maybe only the people at the top of society’s hierarchy produce history. But one could also imagine it being superlinear: maybe there are economies of scale in producing history. So I disagree that it necessarily lies between the first and second cases.
Also, if you do believe that the relationship is sublinear, you’ll want a different justification than the one I gave. If only people at the top make history, then Nikias will never get much attention.
It’s not linear. Disasters are more interesting than other stuff, with wars being most interesting of all. You see a lot of War X buffs; not so many Presidential Election of 1872 buffs.
The number of disasters and wars also grows with population size, of course. Everything grows with population size! We will have proportionally more cities on earthquake-prone sites, more oil rigs with shoddy safety, and more archdukes and assassins.
(Actually, I do hope we’ll get better at avoiding wars and disasters over time, but I think we’re assuming society stays about the same. Obviously if people stop doing interesting historical stuff, eventually all the interesting history will be in the distant past.)
People have lived basically everywhere, basically for all time. What has increased is density, not the number of fault lines people live on.
As for wars, it’s practically the opposite. The frequency of war depends mainly on the number of states. But it’s not like all states are Dunbar States. Greater density allows states with larger extent, hence fewer states and fewer wars.
In case this wasn’t clear: we are talking about the year 3000, where there are five quadrillion humans. They are clearly not all living on Earth. Just because a trend has held for all of human history, that doesn’t mean it’s sustainable over the next 6 orders of magnitude of population growth.
So yes, there will be more fault lines in the future.
1. This reminds me of http://uncyclopedia.wikia.com/wiki/File:Infini-T.jpg .
2. Academia is already a crushingly competitive rat race in many fields, but in the dystopian future you describe, unless you’re a ridiculously brilliant badass /and/ sacrifice everything else to Moloch, you’re stuck studying something as mind-numbingly boring as the unremarkable moments of unremarkable people’s lives today.
3. So I guess one could read this post as “You think your life is meaningless and insignificant? You’re lucky you aren’t in the future, where people’s lives are so meaningless and insignificant they have nothing better to do but study yours!” Which is about as much consolation as the old “children are starving in the third world” response, though thankfully there’s reason to hope the future won’t actually be that bad (see point 1 and other people’s comments).
IDK, man, I actually do follow strangers’ blogs on tumblr because I actually get invested in, like, superwholockedinthetardis’s job as a maid and recovery from alcoholism and crush on the hot girl who works at the coffeeshop.
So it’s not hard for me to imagine the field of Nikias The Random Olive Farmer Studies being really interesting.
The idea of future or even present people taking this level of interest in me terrifies me. Everything that gives me constant, overwhelming, insurmountable shame and self-hatred is going to be studied and studied by dozens of people? Do you really think every analysis of some random schmoe’s life is going to be positive, and nobody’s going to be in the Journal Of Get A Load Of This Fucking Loser?
It sounds frightening, but in such a future, people will be heavily desensitized to such things and likely not be emotionally concerned about routine things that we consider embarrassing today. Doctors and nurses, because it’s their job to deal with people with medical conditions and they do it every working day, don’t typically see those conditions as something to be ashamed of or embarrassed about, even if the patient finds it so.
To people who have entire academic fields devoted to dissecting people’s tweets and Tumblr posts, posts with sulking, passive-aggressive drama, unusual sexual practices, and details of medical problems would be normal everyday content. Plus, they’ll likely have a much better understanding of what was going on in the poster’s mind. My guess is that they would be far less judgmental towards us than a modern-day peer would be. The content would have to be particularly bad to cast a very negative light on the person being studied. A brief online slapfight on Facebook in which someone got called a turdball wouldn’t qualify.
Chronoscope technology isn’t real yet, and whether this version of the future comes true seems to depend on chronoscopes working as well as they do in science fiction. Peering into the past isn’t magic; I imagine it will be computationally expensive.
Digital archaeology may depend on what you put online. From Cadie’s other reply:
>posts with sulking, passive-aggressive drama, unusual sexual practices, and details of medical problems
The less of this you post on the Internet, the less future historians of your life will have to go on. If you post none of this on the Internet, then there is little for you to be embarrassed about. The good news to extract from this information is that, aside from what information other people generate about you on the Internet, you control what future historians will know about you. If you keep being not famous, then most data generated about you will be from you, and this affords you great power over how the future perceives you.
If you’re sly, you could use the Internet to engineer an image of your real identity which is way more awesome, or different, than what you are off the Internet. Your friends might be confused while you’re alive as a carbon-based life-form, but future historians of your life might think you’re a total badass. Of course, this is also called ‘lying’, which lots of people are personally uncomfortable with.
Alternatively, I’ve read some transhumanist opinions that the more data about yourself is accessible on the Internet, perhaps the more information the descendants of humanity will have to reconstruct you as an upload in addition to what they learned about you from your brain. When your personality becomes digital, perhaps what can be mined from decades of Internet use can be used to fill out the future you more realistically.
Why should I care about a future copy of me? He’s not me. And future me is probably going to be as miserable and powerlessly suicidal as present me, I don’t get why they’d even try to make another one.
Internet as performance art is pretty common even without the specter of armies of future historians poring over every archived detail of your life. See Münchausen by Internet for a dramatic subclass of this, though not all the cases I’ve seen have terminated in heartwrenching illness and death. (More have than you’d probably guess, though.)
On the other hand, once you’ve seen a couple of these, the rest get a lot easier to identify.
>Do you really think every analysis of some random schmoe’s life is going to be positive, and nobody’s going to be in the Journal Of Get A Load Of This Fucking Loser?
Honestly, yes. Or at least neutral, since they’ll view us all as future-racist and whatnot.
It’s pretty hard to get inside someone’s head and hate them, at the same time.
For example: most people consider Hitler to be basically the devil, so there actually is a small field of Hitler Studies.
But most people who study Hitler don’t think he was the devil – they come up with explanations for his behavior, like “he failed to get into art school, then to get a job, then into the army, so naturally he felt like the Jews were taking all the jobs” or “he was obsessed with cleanliness and germs from childhood, and he analogized Jews to disease, so naturally he felt Germany needed some ethnic cleansing.”
(Those specific theories are probably wrong, of course.)
But they present a reasonably sympathetic internal narrative, rather than “he was a psychopath who used the Jews as a scapegoat to gain power.” And this is for Hitler, the guy who’s only being studied because he’s the most famously evil person ever.
I wonder if you are among the few people who could potentially understand how I managed to spend the last several days grieving over someone I never met who died in a tragic manner a quarter of a century ago.
Is it not monstrous that this player here,
But in a fiction, in a dream of passion,
Could force his soul so to his own conceit
That from her working all his visage wann’d,
Tears in his eyes, distraction in’s aspect,
A broken voice, and his whole function suiting
With forms to his conceit? and all for nothing!
What’s Hecuba to him, or he to Hecuba,
That he should weep for her?
On the other hand, if there are going to be fifteen academics studying each online persona, I estimate that I will be giving gainful(?) employment to seventy-five people much smarter and with much higher academic qualifications than me.
I don’t know whether to be proud or sorry, particularly if there are going to be vicious squabbles over whether or not X from site Y could conceivably be identified with A from site B, given the disparity in subject matter and style.
You could write down the relationship between your online identities in meatspace, bury that ‘answer key’ at some remote geocaching location, then refer to it in a post from one of your internet personas (obviously with some kind of strong identity-verification on both the post and the buried answer key). This would be a nice thing to do, because some graduate student working on some out-of-the way subfield of Deiseach studies will discover the time capsule and absolutely blow open the entire discipline, all but guaranteeing tenure for whoever makes the initial discovery.
As someone who knows a lot of struggling academics, I think it is a great shame more ancient philosophers and mathematicians didn’t pay it forward like that.
Oh, I have absolutely no interest in linking up my various sub-personalities with each other, much less my ‘meatspace’ real personality (if such a thing exists; sometimes I wonder which is the ‘realer’, the me which sits here doing the typing or the split-off ‘me’ which is presenting certain attributes of itself?)
No, the poor sod who gets stuck with “Deiseach” studies is going to have to struggle on with ploughing their own lonely furrow, which is going to be particularly interesting because I ran across someone else online with the same nom-de-internet who left comments on blogs not a million miles away from ones where I was commenting myself, but who definitely wasn’t me!
The rival schools of “Two separate but contemporaneous Deiseachs” versus “One Unified Field Theory of Deiseach” should be fascinating for pure hair-pulling row entertainment and why would I deprive the future of that? 🙂
>The rival schools of “Two separate but contemporaneous Deiseachs” versus “One Unified Field Theory of Deiseach” should be fascinating for pure hair-pulling row entertainment and why would I deprive the future of that?
You just did.
No, they didn’t. There’s no way to tell whether the current claim that there are two Deiseachs is true.
Nor, if I am indeed one of the two, which of the two I am.
If there are two. Or if I am one of them, and not a deceiving lying liar pretending to be one.
It’s “Did Bacon write Shakespeare?” all over again 🙂
I like Paul Graham’s theory that classicism is a vestigial social institution from way back when Europe was emerging from the Dark Ages and there was a lot of valuable insight to be rediscovered from the Greeks and Romans. And it’s hung on for so long because academia is really, really bad at pruning away useless stuff. I think Paul Graham sees this as a point against academia, but there are a lot of world-changing concepts that look pretty useless in their infancy, so I think it just goes with the territory of trying to be an incubator for fragile ideas. To the extent that academia is broken, being more strict about the value proposition it provides does not seem like the solution.
This model suggests that classicism is basically a special case, so we shouldn’t expect it to extend to devour all history in the future. Rather, there will just perpetually continue to be a few guys studying Greeks and Romans, and nobody will have the heart to tell them they’re just reiterating theses from a hundred thousand years ago.
(link to Paul Graham’s piece http://www.paulgraham.com/essay.html)
Okay, by “Dark Ages” do you mean the Dark Ages (post-collapse of the Roman Empire in the West) or are you merely repeating the shibboleth about the Middle Ages? Because Classicism, defined narrowly as the study of the Roman and Greek Classical Eras, which took off in the 15th-16th century, wasn’t a leap from ‘dark ages to modern times’ unless you’re going to consider the 8th-14th centuries as ‘dark’ (e.g. for Ireland, our Golden Age roughly encompassed the 8th-11th centuries).
“Even if we admit the common sense idea that the 20th century contains “more” historical interest than, say, the 5th century, it still certainly does not contain three times as much historical interest as all previous centuries combined.”
I don’t share that certainty. Where do you get it from?
If “history” is basically “what people did in the past”, then it makes sense to me that with more people you have proportionately more history.
The other day I was reading One Bridge Too Far, and when I think about it, it strikes me how many stories and subepisodes are to be found in that one single episode of a great war… and I wonder if the reason WWII unpacks into so many books and publications and movies etc is not just that it’s more recent, feels more relevant, and we have more and better preserved data than we have on the Trojan War, but simply because there’s really so much more of it, all of it interesting.
Fun fact: more people served in WWII than lived during the entire… I’d finish this sentence if I actually knew anything about history. 😉
According to this link, 1.9 billion people worldwide served in WWII. That’s 1.78% of all people who ever lived; about a fourth of all the Ancient Egyptians who ever lived, a civilization which lasted for ~3000 years.
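For what it’s worth, the percentage arithmetic checks out against the commonly cited ~107 billion estimate of all humans who have ever lived (a rough demographic figure, not something taken from the link itself):

```python
# Sanity-check of the 1.78% figure above. Both inputs are rough:
# 1.9 billion is the disputed "served in WWII" number, and ~107 billion
# is the usual demographic estimate of all humans ever born.
served_ww2 = 1.9e9
ever_lived = 107e9

share = served_ww2 / ever_lived
print(f"{share:.2%}")  # ~1.78%, matching the comment's figure
```

So the 1.78% follows from the 1.9 billion; the real dispute, as below, is whether “served” is the right word for that 1.9 billion at all.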
That 1.9 billion number is the total population of all belligerents (and a few non-belligerents, like Spain). Do you really want to say that they all “served” in the war? Your link that claims that they all served only counts 10% of the US population as serving.
Good point, but there are plenty of books about the homefront/civilians in World War II also!
Yes, I should have said: call it “total war” if you want, but be explicit if you want to say that everyone “served.” Indeed, the source is inconsistent on this point.
But I don’t think India was much of a home front.
Soooo… you’re saying I should be really thorough about wiping out every evidential trace of myself when I die?
Some corporation with the software, and the technical and legal means to do this well, could make lots of money off its clients. I don’t believe this is a viable business in the present, but within a few decades I can easily see it being a viable business model. I don’t know if it will be a start-up, though. Google and Facebook and the like might have a huge advantage here, and could leverage their clients’ data like this as they convert from one service to another.
Good luck with that.
I seem to miss the point. Classics or historical research has almost nothing to do with the number of people having been alive in the respective period. There were already more Aristotle scholars/interpreters than one (Aristotle himself) in the middle ages (or certainly in the early renaissance when something like scholarship started; the scholastics did not treat Aristotelian texts philologically). There will probably always be a larger number of historians specialized in Napoleon than Napoleon and his bunch of generals etc.
There are also all kinds of events that could in principle boost the material available to Classicists immensely. Say, someone saved a few hundred manuscripts from the library of Alexandria before it burned down and we find this trove in the desert in 2030. The available texts for certain authors jump from a few quotes and bits in later authors to dozens of original pages. We could probably double or triple the number of scholars specialized on papyri etc. to set them to work on this new stuff. Similar stuff has happened, especially with better techniques for deciphering overwritten manuscripts (palimpsests).
Seems physicists are lucky, because there are so many atoms and elementary particles around…
Also, Classics was actually the original umbrella term for a multi-disciplinary field, long before American Studies or Black Studies or Women’s Studies. Many classicists have almost nothing to do with history – their field of expertise is literature or philosophy or mythology or philology.
You forgot to mention the enormous culture clash that resulted when the previously-separate fields of Hunter Glenderson From Charleston Studies and Thor41338 On Gamer’s Guild Studies merged: XYZZY-359, the android that proved the two people to be the same, even got death threats, and a few academics on both sides stubbornly refused to accept the validity of the finding.
Most of the people did eventually accept the finding, but for a long time many universities continued teaching Hunter Glenderson From Charleston Studies in a different faculty than Thor41338 On Gamer’s Guild Studies, and using different methods. That practice continued until the Second K’th’rangan Invasion, when the Galactic Emperor finally told them to get a grip and accept the fact that they were both really sub-sub-sub-sub-sub-branches of computer science and thus should relocate to the Faculty of CS, as literally every other academic field had already done 200 years ago.
Being under the CS department? Now there’s a dystopia.
There will be even more voyeuristic attention from non-academics. If there’s anything like a chronoscope, our lives will be one of many entertainment offerings.
If there isn’t, it will be approximated: Someone will put the available info together and synthesize a timeline where the blanks are filled up with best guesses.
People will watch you in your homes and observe your day as depicted in the timeline. They will zoom in on all kinds of random details, they will rewind and fast-forward, and create simulated alternatives where all kinds of divergence points happen.
There will be a Nikias The Random Olive Farmer simulation; in fact, this fictional person will get one just because it was mentioned in this blog post.
OH and dear people from the future, please make a super-sex simulation of me and run it 24/7, thanks. (You are allowed to watch.)
Cold Lazarus , the 90s BBC television drama written by Denis Potter.
Cryogenically frozen head of an author is thawed out for research purposes, and the project is about to be cancelled until a media mogul takes over funding it, because he intends to broadcast the unfrozen brain’s memories as light entertainment – the ultimate ‘reality TV’, if you will.
I had pretty much this exact idea a few years ago. If I was paranoid I’d assume you are reading my mind through your evil psychiatrist powers and head for the tinfoil.
But, it’s a relatively common sci-fi concept. Here’s Winston Rowntree’s take, quite poetic and sentimental. http://www.viruscomix.com/page585.html (Warning, fuckhuge comic book with a lot of text, but I guess here people won’t complain it’s all wordswordswords like 4chan does…)
I was gonna link that too.
“Quite poetic and sentimental” might actually be the overall tagline for Winston Rowntree’s work.
I’m pretty sure the idea was discussed in the comments of one of Scott’s posts on his previous blog, though I don’t remember which post it was on. (Seconding the recommendation of that comic too.)
The question I have, is how much computational resources do those historians have?
How is The Light of Other Days? IME Clarke does rather better in the short fiction range, and I haven’t tried Baxter’s novels at all. (I was introduced to the chronoscope through “The Dead Past”, which despite the neat idea (did anyone else do it first?) really wasn’t one of Asimov’s better ones.)
It’s good (disclaimer: I was around 16 years old when I read it). The writing was so much noticeably better than Clarke’s usual when it came to novels (along with “The Wire Continuum” short story co-written with Baxter) that I actually bought one of Baxter’s books (Vacuum Diagrams) to look more into the guy.
Baxter’s one of those authors who benefits from collab work. He’s ok at the hard sci fi fabulation but he doesn’t really do “human beings”, which is why partnering him with someone like Pratchett works really well.
Yeah, I was disappointed by Vacuum Diagrams largely for that reason.
Yes. On the other hand, if you want interesting hard sci-fi without all those pesky humans cluttering things up, go read straight Baxter. ^_^
And yeah, LoOD was quite good.
Well, my other problem with Vacuum Diagrams was that I didn’t think the execution matched the vision (the history of humanity from now until the end of the universe). I mean, crews of baseline humans piloting starships in AD 170,000? c’mon.
If you’ve already got a chronoscope, don’t you already know almost everything about the subject being studied? How would they go about doing these studies, what questions would be debated? How did chronoscopes work in these stories?
>If you’ve already got a chronoscope, don’t you already know almost everything about the subject being studied?
Not really. You know what did happen, but not why it happened.
Of course, the chronoscopes are mostly a metaphor for the internet; it doesn’t really seem plausible that there would be a device that could only act as a perfect window into the past without (for example) allowing you to scan the target’s brain.
I suppose they might be based on, say, FTL travel allowing us to intercept “satellite footage” of Earth or some such nonsense. That would be sufficiently filled with gaps to allow a quite substantial “field” while still providing data on almost every person who ever lived.
It still seems to me that few ideas about the past would be debatable with a chronoscope. Normally, with a bogus explanation, people add additional details that justify anomalies and argue these details are “plausible”. But with a chronoscope, such hypotheses are easily falsified. So we’re stuck with a tiny pool of hypotheses that are all equally compatible with the evidence. How could anyone go about debating those (except by appealing to Occam’s razor)?
Are you thinking that the debates would arise as a consequence of the large volume of information that would need to be sifted through? Couldn’t Monte Carlo methods or something like that fix the problem easily?
IMO studying the past with a chronoscope would look very little like studying the past through normal means. It would necessarily be more about predicting the future based on the past, and so I think historians and classicists wouldn’t exist as a profession at all, and instead they would merge with predictive social sciences.
Perhaps historians would discover heuristics other than Occam’s razor that accurately predict which of two explanations of similar plausibility could be favored. That would be awesome and might create room for debates. What might such heuristics look like, if any existed?
Also, now that I think about it, chronoscopes are a huge invasion of privacy, unless they are legally regulated somehow to prevent people from looking up events of the recent past. If I can watch in perfect detail what anyone on the planet did yesterday or a year ago, that is probably very bad. Makes keeping secrets impossible. I now endorse chronoscope Luddism.
This was the plot of the book that coined the term (in the book, they worked by a quirk of particle physics that let one generate photo realistic audio and video.)
For those interested: it was wormholes. They found out how to make wormholes, and then discovered that if you pumped in enough energy to make a pair span, say, one light-second, but generated the endpoint next to the startpoint (rather than a light-second away), it would still have a separation of one light-second, but into the past (so, one second). Eventually they got really good at doing these, so “seconds” became “centuries”.
So they saw through the wormholes by literally having photons cross into the present. They got audio by careful analysis of nearby air molecules to detect waves.
Wait, shouldn’t the main application of that have been sending information to the past? Chronoscopes seem like pretty small fry when compared to TIME TRAVEL.
Nah, handwavium prevented energy flow from future -> past; only past -> future was allowed.
I’ve taught history at a lay level. There’s already way too much data out there for any one person to possibly absorb. Much of my lesson planning was spent deciding what bits to leave out. I tried to find a unifying theme for each lesson, and then sift through the wide range of material for what was relevant, then send half of that to the cutting room floor just to fit in time constraints and not confuse people.
More data is just going to mean more work like that for popularizers. And there will still be plenty of room for finding overarching themes (think Guns, Germs, and Steel). Furthermore, a chronoscope would open up the entire past to statistical analysis (perhaps even using the entire population rather than sampling), which can be mined endlessly and will involve plenty of grunt work.
The machine learning answer is to set aside some of the data as a test set, come up with your theories based on the remainder, and measure their predictions against the held-out data.
There are obvious practical difficulties with trying to apply this to the study of history. It could work with a centralized chronoscope authority and copyable EM scholars, though.
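The held-out-data idea sketched above is just ordinary train/test splitting; a toy version (all the “records” here are invented stand-ins for chronoscope data) might look like:

```python
import random

# Toy sketch of the held-out-data proposal: treat chronoscope records as a
# dataset, form your "theory" on one part, score it on the part you never saw.
# The records themselves are fake: (year, did_event_happen) pairs.
random.seed(0)
records = [(year, year % 7 == 0) for year in range(1000, 2000)]

random.shuffle(records)
split = int(0.8 * len(records))
train, test = records[:split], records[split:]

# "Theory" fitted on the training set: just the base rate of the event.
base_rate = sum(flag for _, flag in train) / len(train)
predict = base_rate > 0.5  # predict the majority outcome

# Score the theory against the held-out records it never saw.
hits = sum(predict == flag for _, flag in test)
accuracy = hits / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The centralized-chronoscope point matters because the scheme only works if historians genuinely cannot peek at the held-out slice of the past before committing to their theories.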
One of the great modern trends is getting more and more information out of less and less data. Once they get to the point of being able to study classical Greek gut bacteria, there should be enough to study to go around.
Also, detailed study of the past could be considered a memetic infection. All that’s needed is to convince the K’th’rangans that they need to put their resources in to studying their past and proving how glorious it was.
Why stop at gut bacteria? There’s probably something going on in the deep structure of the universe which changes and affects people.
Does anyone remember an sf story which included someone studying a small plot of land, year by year, with a time viewer? I believe he mentioned that there was a robin who was quite a character. (Story was probably golden age.)
Of course, the Greeks are studied because they were particularly clever and interesting. There may be postmodern Platos and Aristotles, but I suspect that most of us are more equivalent to the Hallstatt culture or the Bantu peoples, who were at least doing useful things like working out languages and creating iron tools.
Is, perhaps, the real lesson here that history doctorates will simply have to give up the “novel contribution” dissertation requirement, and simply be a certification of having adequately studied the material?
One could start to argue now that the requirements are producing more noise than signal in terms of the output of the dissertations.
There is a distinction between hobby and profession. To the extent people want to learn about a given topic, whether broad (turn of the millennium broadcast television) or narrow, (Buffy the VS, “Hush” episode), so be it. The next question though is whether they can capitalize that knowledge into something others would be willing to pay for.
There is no limit to the number of people who can spend their free time studying a narrow or broad category, nor is there a reason to limit their choice. There is a negative feedback loop on who would be willing to pay for that expertise though.
To the extent people still get doctorates in Classical Greece, they are either planning on leveraging that into a socially useful skill which people will pay for, or getting a certificate of accomplishment for signaling purposes, or doing it for personal enrichment (or some combo of the above). As more people study a given field, after a transition point, extra commitment to the field becomes increasingly personal rather than professional. Do I watch Buffy tonight or study Greek history? Hmmm, good question.
A few thoughts that come to mind as a result of reading this.
1. I find it somewhat pleasant (in a self-aggrandizing way) and at the same time somewhat horrifying (the reasons for which to be puzzled out later) to imagine the resources of a matrioshka brain devoted to a detailed simulation of the exact thought processes behind my choice of a tuna salad sandwich for lunch on May 23rd, 2010. And later the resources of multiple brains to the analysis of the researcher’s decision to simulate the exact thought processes behind my choice of a tuna salad sandwich for lunch on May 23rd, 2010. And then the resources of a cluster of brains on a meta-analysis of all studies using simulations of the exact thought processes behind my choice of a tuna salad sandwich for lunch on May 23rd, 2010. And then….
2. Somewhat following from the above, at any point should we be concerned that the actions we take on a daily basis are causing undue stress to future researchers or to the resources that may be required in their research? If I was less ambiguous in my review of the tuna salad sandwich on May 23rd, 2010, would it perhaps lead to a simpler understanding of my thought process, allowing the use of less resources in effecting a simulation, resources that could be put towards improving the lives of the researcher and others?
3. Following from #2, will every decision and action in my life eventually affect the suffering or happiness of billions or trillions of future entities, human or otherwise? Could my choice to be less vague in food reviews provide the future Galactic Emperor with enough available resources to prevent the K’th’rangan invasion? Is my action in leaving this comment at all the salvation or destruction of an incalculable (to me) amount of life? Am I either (or simultaneously) the greatest threat or benefit to the future of the universe?
Out of the arguments in favor of the starting assumptions here, the one I find the least convincing is the second argument for a constant proportion of classicists: the idea that the proportion of historians out of the general populace is also going to remain constant. If anything, this post makes a compelling reductio ad absurdum argument that it must instead start falling at some point. Even assuming that there will always be a constant proportion P of historians studying recent events in particular, I would predict that more heavily studied topics will attract a proportionally smaller and smaller crowd. So as long as growth continues, the proportion of historians altogether will continue to converge towards P.
This does not have to mean that the general education level will have to fall at all. For one, new fields of study entirely might continue to be invented, to keep a couple generations of researchers busy, and marginalizing older fields. (How many Lamarckists, or alchemists, or Olympian theologists have you ever met?) For two, there’s always mathematics, which offers an inexhaustible variety of problems at almost every possible level of difficulty. I actually recall seeing a proof, based on something of Gödel’s if not his altogether, that it’s impossible to run out of human-researchable mathematics.
However: a weaker version of the main thesis can probably be made regardless of all this! Even if we assumed only a constant number of classicists — even just the bare minimum required to ensure that the traditions and the knowledge of the field remain alive (i.e. not just catalogued in cold storage on Omniwiki) — there will still be a strictly growing number of classicists across history as a whole. As t → ∞, this situation seems to admit a solution space with three extremes:
1) Research will slow down towards zero, and classicists settle to being ever increasingly keepers and curators of knowledge, rather than creators of it.
2) Classicists will end up conducting the same research indefinitely over and over again; perhaps because no one can keep up with the entirety of the research history.
3) We still get Nikias The Random Olive Farmer Studies, except the papers come out once in a millennium and every scholar in the field gets to be its uncontested master for the entirety of their career.
And these are not mutually exclusive solutions. It’s possible to have both #1 and #2 in effect, and regardless end up with NTROFS, just exponentially slowly. The second NTROFS paper might come out a mere century after the 1st; the 10th might come out seven million years after the 9th.
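The “exponentially slowly” claim is easy to make concrete with invented numbers: if each gap between successive NTROFS papers grows by a constant factor, a modest first gap compounds into geological timescales fast.

```python
# Toy calculation, all numbers invented: if the gap between successive
# NTROFS papers multiplies by a constant factor r, the waits form a
# geometric series and later papers recede into deep time.
first_gap = 100  # years between paper 1 and paper 2
r = 3.5          # hypothetical growth factor per gap

t = 0
for n in range(2, 11):
    t += first_gap * r ** (n - 2)  # cumulative years after paper 1

print(f"paper 10 appears ~{t:,.0f} years after paper 1")
```

With these made-up parameters the tenth paper lands a few million years after the first, which is the right flavor of “exponentially slowly” even though the 9th-to-10th gap in the comment above would need a slightly larger r.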
Lastly, we can incidentally note that in all of these cases, and perhaps even in Scott’s constant classicist proportion solution, most of the early NTROF research will presumably have been conducted due to nonstandard motivations — as laconic humor, out of extreme boredom, due to having pulled a research topic out of a random number generator…