This is the bi-weekly visible open thread (there are also hidden open threads twice a week you can reach through the Open Thread tab on the top of the page). Post about anything you want, but please try to avoid hot-button political and social topics. You can also talk at the SSC subreddit or the SSC Discord server – and also check out the SSC Podcast. Also:
1. Although I’m very happy with the quality of discussion here most of the time, I was disappointed with some comments on the Trump post. Part of this was my fault for going for a few jokes that made it more inflammatory than it had to be – but enough of it was your faults that I banned six people and probably should have banned more. Remember, if you see an immoderate comment that needs moderating, please report it using the report button in the corner.
2. Related: I am going to be stricter on the “necessary” prong of the comment policy. If it’s a thread about the poll numbers for some right-wing policies going down, and you post “The reason Trump won was because everyone knows all liberals are…”, you are probably getting banned. I’ve been reluctant to do this before because it’s the sort of thing that could be true and I don’t want to make it impossible to say certain true things. But now I’m thinking it’s so irrelevant to the topic that it will have to fit both the “true” and “kind” prongs to stay up without getting you banned. If you really can’t figure out whether something you want to post is like this, imagine someone on the opposite side said it about you, and see whether it feels more like a reasonable critique or like they’re trying to start a fight. I like the way Vorkon talks about this here.
3. In early October, I asked people to pick up anxiety sampler kits, try to use them at least twice a week, and send me the results. I gave out thirty kits and so far I have gotten valid results back from two of them (though it hasn’t quite been 10.5 weeks, so don’t worry, you’re not late). If you have a kit, please don’t forget to try it; if you’ve tried it, please don’t forget to send me your numbers.
4. Comments of the week are this discussion of “rods from God” as a nuke alternative; see especially Bean and John Schilling’s responses. Aside from everyone’s erudition, I also appreciate their ability to turn some good phrases – “like playing hide-and-seek while the seeker’s head is wrapped in a burning towel” is going to stick with me for a while.
Does anyone here have an educated opinion on when belief in reincarnation first appeared in Greece? Apparently we can trace it back at least as far as Pherecydes of Syros and the more famous Pythagoras. But then no one seems sure whether Pythagoras got it from Orphism, and he and Pherecydes are slightly too early to say “Oh, it was probably transmitted from Hinduism via the breadth of the Persian Empire.”
I know it’s obscure. There are theories based on PIE that the belief is autochthonic and a feature of PIE cultures. Unfortunately, there’s no smoking linguistic gun. But the language of reincarnation has a preponderance of horse and chariot metaphors and those are often from common PIE words. The phrase ‘immortal soul’ (or ‘undying soul’) is a concept and specific phrase that appears to exist in all PIE cultures as well.
This is often tied to a single origin theory of all reincarnation religion. The idea is that the idea spread from PIE cultures to others but comes from that single origin. If you buy this theory, the origins would probably be during the Dorian Migration (or whatever you want to call that), which does square with some religious changes that appear to have taken place. For example, Hades becomes increasingly portrayed as the God of the afterlife and his importance greatly increases as a result. It’s not entirely clear Hades was associated with death at all prior to this. He appears to have been an earth deity with a focus on mining and secrets. The lord of what was literally, physically under the earth.
The earliest explicit mention we have of reincarnation in a primary source is from 6th-century BC Greece. However, there are some theories that Homer purposefully refused to include reincarnation in his afterlife. It fits with the Iliad’s theme that the effects of war are permanent. But that’s highly speculative and based on things that appear to be left out or maybe were being alluded to. (And on the fact that Homer goes out of his way to portray death as final and terrible.)
Is it really that speculative? Comparing the Iliad to the Odyssey, or to visual depictions of scenes from the Trojan War, which ~always show life after death (e.g. a depiction of Patroklos’ tomb will also show his eidolon, here in the top right), the difference in afterlife views is rather stark. And obviously, if they’re having a grand old time in the Elysian Fields, it would completely undermine the entire poem. Could hardly be coincidental…
I think it’s pretty definitively established that Homer was using his own version of death and the afterlife to make a point in the poem. What’s speculation is whether he removed reincarnation specifically. It would undermine the point of the poem too but absent other evidence it’s speculative.
Yeah, I know “undying soul”, like “undying fame”, is a stock phrase traced back to PIE.
I didn’t think about Homer consciously choosing to exclude beliefs about the afterlife other than “you become a shade in the underworld, which is worse than being a poor man’s slave”, but it totally makes sense, albeit being speculative. After all, the deities Dionysus, Demeter and Persephone, who definitely (Dionysus) or probably (the two ladies) were part of Bronze Age state religion, are pointedly excluded from Homeric Olympus.
Given that Celtic and Germanic peoples believed in reincarnation (see Wikipedia on reincarnation), it seems very likely that reincarnation is common to all IE.
Especially that report from Julius Caesar is interesting: “by such doctrine alone, they [druids] say, which robs death of all its terrors, can the highest form of human courage be developed.”
When you have a new technology – including religion, culture, or even fashion – that convinces warriors to fight harder and fear less, you can bet that all your neighbors will copy it (see the spread of Hungarian-type hussars across Europe for a relatively modern example). This might be how the entire IE culture got spread, not just reincarnation.
Do you know of anything looking at the creation of hussar regiments? Because I’d always thought that was a bit weird, but never explicitly connected it to fashion/memetic evolution before.
See this stack answer. The link to “Hungarian hussar” there is broken; here’s the correct link.
I essentially agree with that answer except I don’t think that sexual attractiveness to women was caused just “by Balkan-style flamboyant dress, mustaches, hairdos”. It seems to go deeper than the costume. But I haven’t read enough to talk about it.
There’s a lot more material on hussars in German and Russian than in English.
Generally, when one encounters the phrase “light cavalry” in historical sources, it’s usually reasonable to mentally substitute “pre-industrial motorcycle gang” (note that this even accounts for the observations on sexual undertones made in eigenmoon’s link). Despite indiscipline or outright lawlessness, light cavalry served valuable military purposes: scouting, of course, but also harassment/marauding of vulnerable enemy forces, especially rear areas. “Hussars” were a sort of cultural adaptation that allowed marauding light cavalry to be brought under more control while retaining and even increasing their military effectiveness.
Heh. So Cossacks were Russian motorcycle gangs influenced by Kazakh culture?
I don’t think that’s a bad comparison at all. I mean, look at them.
Soon: a near-future SF novel based on the conquests of Genghis Khan… but on motorcycles instead of horses.
Not to be all John Sidles, but there was a bit of that in The Iron Dream.
(Its conceit is that an alternate-universe Hitler immigrated to the US and became a pulp SF writer instead of going into politics, producing the eponymous novella. It plays out as an Animal Farm-like allegory of early Nazi politics, and the point, or the joke, is how well they end up mapping to classic pulp tropes. The motorcycle bandits are the SA analog.)
Soon: a near-future SF novel based on the conquests of Genghis Khan… but on motorcycles instead of horses.
Well, we already have the music video for it! 🙂
No need to write a near-future SF novel. Jihadi motorcycle gangs terrorizing the countryside are already a feature of life in Mali.
https://ktla.com/2018/12/13/42-killed-in-suspected-jihadist-attack-in-mali/
The problem with that is the question of whether the Rigveda should be interpreted to speak of reincarnation. The Chandogya Upanishad, perhaps a thousand years younger, frames it (polemically?) as a truth that the Brahmins don’t know.
No doubt it would be the simplest explanation if the late Archaic Greeks, pre-Classical Hindus, Celts and Germans all got it from their common ancestors, but the data is complicated.
But all this assumes that Rigveda, being the earliest, should be therefore also Indo-Europeanest. The problem is that it somehow isn’t.
Take the cosmic struggle between devas and asuras, corresponding to Greek gods vs titans. Rigveda doesn’t even know that asura is a bad word.
Take the 3 sins myth. I’ll describe the myth first.
An IE society has 3 orders:
(1) priests / mages / government
(2) warriors (very preoccupied with honor)
(3) peasants + women (because women kinda grow fruit)
The 3 sins myth has the hero antagonizing each of the orders in turn, usually in reverse order, and getting destroyed in the end:
Heracles:
(3) kills his children (counts as an anti-peasant act because children are fruits)
(2) kills his guest by setting a trap for him (dishonorable kills are totally against the warrior caste)
(1) kills some king
Indra:
(1) killed a brahman
(2) establishes friendship with Vritra then kills him
(3) raped a woman
Also, Jesus (Mt 4:3-11):
(3) refuses to produce bread
(2) refuses to command his angelic troops
(1) refuses to become government
Jesus subverts the trope by getting initiated rather than destroyed.
Also, Darth Vader:
(3) attacks his pregnant wife
(2) attacks his commander
(1) attacks the mage-emperor
So, back to Rigveda. For some reason it has no idea about 3 sins of Indra. Dumezil explains this (same link as above) by saying that Rigveda is poetry so whatever. But that doesn’t explain how Rigveda has no idea that asuras are bad. Somehow Rigveda is a lot less IE than one would expect.
So I still think that IE reincarnation is the simplest explanation.
Dodds has some stuff on this, he argues that it’s just a more-or-less universal shamanic thing, no need for any “transmission”. Empedocles of Acragas also fits the pattern. The connection to Orphism seems to fit well even though we know very little about it.
Uninformed, uneducated opinion off the top of my head: it probably came in with mystery cults. What the ordinary people actually believed, versus what we have surviving in writing, is generally not really known; we seem to have bits’n’scraps of cultic practices and things like local hero cults and autochthonic deities. But my main impression is that the afterlife was believed to be a pretty glum place, and that seems to have been the opinion not just of the Greeks but of the other great civilisations as well.
Getting a second chance at being reborn into human life would be a much more desirable state of affairs. Now, when did the mystery cults (like Demeter and Kore etc.) get popular or well-known? I have no idea.
One obvious question would be whether a belief in reincarnation shows up in any societies that had no contact with Indo-European ones, such as the Maori or Aztecs.
“My soul will go down to the grave (Sheol)”, as the Psalmist says.
When the mystery cults became popular or well-known is a tough question to crack. In Linear B tablets, “the two ladies” who receive sacrifices are commonly guessed to be Demeter and Persephone, whose names are coherent Greek (in the former’s case simply Dea-mater, a feminine Deus-pater) rather than mysterious loans like so many of the Greek deity names. Yet the surviving tablets recording sacrifices to official deities come mostly from Crete and Pylos, not Eleusis or anywhere else in Attica. So Mystery cults like the Eleusinian Mysteries could have tacked on innovative doctrines to longstanding deities at any time before we have proof that Mysteries existed (Herodotus’s time).
Reincarnation is the most obvious possible idea about what happens after death, and thus I would expect it to pop up all over the place. As birth and death are the two endpoints of life, it is very convenient and obvious to think that if death follows birth, maybe birth also follows death; maybe the whole thing just repeats over and over. Reincarnation is also a very convenient way to explain people being born with a skill or talent (you just practiced it last time). In fact, I don’t even know what the scientific explanation of the Mozart type of musical wunderkind is; isn’t music a social construct, and therefore shouldn’t it be impossible to have an inborn skill for playing the piano?
So are there any serious claims that reincarnation was not widespread in the indigenous cultures of Africa, the Americas, and Australia? What did they think happens after death? Any idea of the “nothing” type gets outcompeted by more hopeful ones, so the alternative is an afterlife. Why would a complicated idea of going to a spirit world outcompete the idea that it is just the same thing over and over?
I’m not sure I agree with that. Reincarnation seems like the least obvious to me, because the mechanisms to cause it don’t make any sense. Some kind of spirit world where your soul goes or “nothingness” would seem like the most obvious.
I found the book helpful. I took her thesis to be that introverts need their own advice and role models for success in life, and here are some. I see her as saying that introverts have great potential to do awesome things, but they will do these things differently, and so will have different intermediate achievements. Much of our culture, meanwhile, valorizes the intermediate achievements of extroverts, like speaking extemporaneously in a way that draws people in.
So I think it was useful for recognizing certain recurring types of failure in my life not as evidence of underlying character flaws that were holding me back, but instead as evidence that I am trying to operate contrary to my personality and may have better success if I try different things.
My personal experience is that skills “just” based on muscle memory are quite learnable, but the constraints of adult life make it hard to actually find the time to practice for the time needed.
I don’t know exactly what you mean by fine motor skills, but I’ve successfully taught myself touch typing, and I’m also getting better at martial arts (which requires doing weird things like gripping tightly with your fingers while keeping your lower arms relaxed – does that count as a fine motor skill?)
I’m more skeptical of things requiring actual dexterity, since there are significant anatomical differences between people: IIRC, the tendons controlling the ring and pinky fingers can be (partially?) fused together, and you can also miss an entire muscle used to stabilize the palm of your hand. Also just having long fingers seems to help a lot e.g. if you want to learn a musical instrument. So if you feel you were born with two left hands, I don’t think there’s too much to be done about it. I’m in the same boat btw, so would love to hear other takes.
I started playing the electric bass at the age of 27. (Granted, I had played the electric guitar a few years in my late teens, but still.)
I hadn’t even heard of the book until now, but:
Moral considerations aside, how exactly is it going to happen ? Is a charismatic introverted leader going to unite all introverts and hikikomoris everywhere into a single committed force, or what ?
In general, I completely agree that society would be totally better if I was its God-Emperor; or, barring that, if people of my exact physical/mental/whatever makeup were on top. By “better”, I mean, “better for me”, of course. But that’s not a very interesting observation; literally everyone in the world can say the same thing.
Reminds me of a Bill Hicks bit:
There’s a new party being born: The People Who Hate People Party. People who hate people, come together! “No!” We’re kind of having trouble getting off the boards, you know. Come to our meeting! “Are you gonna be there?” Yeah. “Then I ain’t fucking coming.” But you’re our strongest member! “Fuck you!” That’s what I’m talking about, you asshole. “Fuck off!” Damn, we almost had a meeting going. It’s so hard to get my people together.
Damn, we almost had a meeting going. It’s so hard to get my people together.
That is the trouble, isn’t it? 🙂 I’d sign up for a “People Who Hate People Party” in the morning, but wild horses wouldn’t drag me to a meeting!
It’d be interesting to see if there are enough people who “like people” but think they don’t, to make a decent group out of.
According to my step-father he had a friend who used to rant:
“90% of the people are assholes and come the glorious people’s revolution 90% of the people will be shot!”
I think that there’s a flaw in there somewhere ..
Not if you assume even assholes don’t like assholes.
The flaw is that the assholes are doing most of the shooting, meaning you go from 90% assholes to 100%.
Hating is too much emotion. To describe the atheist / libertarian / technophile / sf-fan / early-adopter / programmer / etc crowd I would say “people who compulsively disagree with people”.
” *sigh* There are three of us 🙁 “
Incredibly narrow anecdote with limited real application here, but as a result of hours and hours of tedious micromanagement mouse practice scenarios for the video games Starcraft 2 and Dota 2 in my early 30s, I was able to increase my accurate actions per minute from my natural 50 to around 100.
Regarding gwern’s comment :
Changing our genome would have no effect on bacterial infection/colonization. Bacterial weapons are purely chemical – they have no need to interpret their host’s genetic machinery. All they care about is your phenotype. Humans with reprogrammed-codons-but-identical-phenotype would look exactly the same to bacteria, and would be just as susceptible to bacterial infection.
This is why you can grow bacteria on things that have no genome at all, like agar plates.
Viruses are different. A virus’s weapons are informational, and it must interact with the hosts’s genetic machinery. So, reprogramming all our codons would make us immune to viruses. (With the exception of all the viral DNA that is part of our genetic code, but that’s another story.)
This is correct, although stated in a somewhat confusing way.
Viruses are lazy. They provide nucleic acids and a few proteins but rely on the host cell to provide everything else necessary to reproduce, including their tRNA pools. They would be very vulnerable to changing which codons map to which amino acids for this reason.
Bacteria, on the other hand, are complete organisms in their own right. Some bacteria do enter cells and hijack their cellular processes, but even those have their own tRNA. They just don’t care about how your cells synthesize proteins; as long as the proteins themselves are the same, it makes no difference.
Viruses show the advantages and risks of being extremely optimized.
Yes, I stated this in a reply to the comment in that thread.
Could anybody comment on the veracity of the book Lies My Teacher Told Me?
I began reading it recently, and while I am enjoying it, I am shocked by some of the claims. According to the book, American history books chronically whitewash favored historical figures and peddle flagrantly false claims. For example, it would seem not only did Columbus not discover America, most of his backstory is made up and books gloss over his more… unpleasant actions. The author cites all this with copious primary source documentation, but he also has an axe to grind, and I am shocked that history books would make baldly false, falsifiable claims as opposed to just fudging and omitting.
Anybody have a takedown or vindication of this book to share?
If the gist of this book is true, have history books gotten better since it was published?
I haven’t read the book but this is exactly what I would expect.
Suppose that you see two history books before you. One says that the kings of your nation were really nice people, all the heroes you’ve heard about are even more heroic than you imagined, and the past of your nation is even more glorious than you’ve thought. The other book argues that those Great Kings were just lucky bandits, that all those heroes had done way more shady business than heroes are supposed to do, and that those Glorious Conquests were actually just plunder and should be undone. Suppose you’re unable to determine which of the books has better support in reality. Which of the books would you recommend to your friends?
Of course there are professional historians who can and should determine which of the books has actual merit. But – and here’s the rub – who pays them?
In other news, Muslim conquerors had no idea that they were Muslim until about 690. See Holland, “In the Shadow of the Sword”.
The person most responsible for that theory about early Islam no longer believes it. I had the interesting experience of being graded down for stating this theory in a paper graded by its creator. Now, we should certainly not view early Muslims as orthodox, well-catechized Muslims, but they were Muslims in a meaningful and self-conscious sense.
Thanks for sharing that. Now I see there is a mention in a primary source (Yohanan bar Penkaye). I stand corrected.
It’s been over a decade since I read some of it (I can’t remember if I finished it).
I remember that it seemed plausible, but other than some stuff about Woodrow Wilson I really don’t remember what details I read in the book.
What age of kids are we talking about here? (I haven’t read it either, sorry.)
For children under, let’s say, 10 years, everything is of course completely whitewashed. At the younger ages you get more a canon of American myths, rather than any attempts at factual accounts of history. Even completely fictional stories like “George Washington and the Cherry Tree” are common.
By the time they reach high school, students are hopefully learning a more rigorous curriculum which gives a balanced account of events. But no doubt there are some schools which don’t do a very good job…
For whatever reason, “Columbus Discovers America” is one of the canonical legends which is taught to young children all around the country. There’s even a national holiday, Columbus Day. In high school, you might hear that Columbus was one of the many explorers of the era, who ran upon some Caribbean islands and thought he was in India. But since Columbus is given much more airtime at the younger grades, the mythical version might be what sticks with people the most.
I only remember cherry tree-style anecdotes from picture books I read when I was very young. The first actual history I learned from actual teachers in actual school, was about 75% about how drunken gold miners in the 1800s converted the local Indians into corpses (it’s cheaper than Christianity). It continued more or less in that vein until eighth grade, when nuts-and-bolts civics (what the branches of government do, how bills get passed) started getting taught. High school was a mixed bag: AP US History was pretty bloodless, but most of the books we read under the headings of American and world literature were about the experiences of marginalized populations in the US and the world, respectively. (British literature was mostly Shakespeare and poetry.)
This was in rural California, in the 1990s. Yeah, sure, California, but it was still a swing state at the time and my hometown (and its school board) was solidly Red.
For example, it would seem not only did Columbus not discover America, most of his backstory is made up and books gloss over his more… unpleasant actions.
On the one hand, yes, Columbus is part of the Founding Mythology of the United States (like every other nation has its foundation myth) and the less pleasant parts are indeed glossed over or ignored.
There’s even a national holiday, Columbus Day.
For example, I have heard this was pushed by Italian-Americans to clean up the popular image of Italian-Americans as dagoes, criminals and general nogoodniks, by associating them with the (still at the time) celebrated figure of Christopher Columbus as a successful Italian who contributed to what would become the United States. That’s all part of the Foundation Myth.
On the other hand, this is the pendulum swinging wildly via turbo powered nitrous oxide fueled rocket boost to the other side, and is just as biased, partial, and cherry-picking in its way. The whole “Columbus didn’t discover America, the Native Americans discovered it millennia before him” – yeah, sure, okay. And probably throw in Leif Erickson, St Brendan the Navigator, and anyone else you can think of.
But what Columbus did was find the islands and the mainland (even if it was more or less by fortunate accident), land there, get established there, get back to Europe and start the whole “hey, there’s a whole new land out there in the Western Ocean” which St Brendan etc. did not do. So he can get the credit/blame for that.
The book sounds like “hey, I just discovered proper history not the version we get taught in school and who knew that it wasn’t like the text books tell us when we’re kids?” Well, everyone who’s read outside the text books, for a start, not to mention those who go on to read history as a subject at a higher level than secondary school and professional historians! “Lies My Teacher Told Me” is a great snappy grab-attention-sell-copies title, not necessarily the whole other half of the tale.
San Francisco used to have lots of Italians.
A while back (I think in the ’80s) there was a news segment on television about protesters against Columbus Day. The footage showed an old Italian man dressed like Columbus on the parade float; he’d probably been doing it for years. I remember looking at the scene on the screen and seeing fear and confusion in the old man’s face as he’s being yelled at.
Sure I’ve read that the real Columbus did some cruel things, I’m sure the real Robin Hood, and King Arthur (if they existed) did as well.
Just let the old guy wear his costume and wave in peace.
Please.
Nice guys rarely make it into the history books.
I think there are at least three well established facts inconsistent with the simple versions of the Columbus story. The first is that the argument against Columbus was not that the world was flat, it was that the distance from Spain to the east end of Asia was much larger than the range of Columbus’ ships. A spherical Earth had been the accepted wisdom for well over a thousand years and the diameter of the Earth had been measured by experiment back in classical antiquity. Combining that with rough estimates of the width of Asia, it was clear that his project was undoable unless he had very good luck in finding land much closer than India—which wasn’t what he claimed he was doing. Columbus himself had fudged the numbers, using much too low a figure for the size of the Earth and much too large for the width of Asia, in order to make it look as though India was close enough to reach. In fact he only made it something like a quarter of the way.
Second, it’s well established that he treated the natives he met very badly.
Third, it’s pretty well established now that the Norse made it to the New World and even established settlements–that apparently didn’t last–long before Columbus. I’m not sure denying that is part of the conventional story, however–it’s sufficient that Columbus was the European whose discovery resulted in eventual colonization.
I read the book a long time ago, don’t remember what else in it I thought correct or incorrect.
That raises the question: why did Columbus do it?
Maybe he was really, really convinced that all the experts were just wrong on the circumference of the Earth and he was right. But how could he be so sure? Was it just an extreme case of overconfidence? Is there evidence that Columbus was extremely overconfident in other situations?
Maybe he wasn’t that confident, but secretly hoped that there was some unknown continent (or at least some islands) between Europe and Asia waiting to be discovered. That, however, seems inconsistent with Columbus insisting to his last day that he had found a route to Asia – not America.
Are there other explanations?
Lies My Teacher Told Me is a book, like A People’s History of the United States, that has an explicit agenda: to show us things that are different, and to present a less positive history of the country. Like any history book it has its own factual errors, but the major issue one should have with it is a narrative error: it appears to imply that America is unusual in having had flawed founders.
Having flawed founders is not only the norm, it is universal. Gandhi and Mandela are still in their “pure whitewashing” era in India and South Africa, but I’d take Washington over either. I think a more mature history book for advanced history students has its place, and it would incorporate some of the points of ‘Lies’, but ultimately it is a flawed book, because many history books already do what it suggests, just not as much or in as harsh language as the author desires.
I agree with this comment – US history textbooks are certainly full of myths, but the only way to fairly compare them to is not some Platonic ideal but the amount of truth-telling you find in alternative textbooks produced by other nations (especially when discussing their own genocides or ethnic origins). We all have our blind spots.
You should also compare them to the amount of truth-telling in the alternative quasi-textbooks produced in your own nation to “enlighten” you about the myths in the more official textbooks. Too often, they are just an equal and opposite dose of cherrypicked not-technically-lies promoting different myths.
I’m not sure this is nearly as cut-and-dried as he makes it look. Writing history isn’t a simple matter of “let’s consult the primary sources, then decide what to put in the book”. You have to find the sources. You have to read them, figure out if they’re trustworthy, and reconcile them to make a coherent vision. This can always take more time than you have, whether you’re me writing a blog post or a Harvard professor writing a book. It’s going to be even worse if you’re trying to put together a history textbook for high school. History is really big, and it’s frankly impossible for anyone to know all of American history well enough to give a really good picture of it from firsthand knowledge. So you have to rely on secondary sources, or a cherry-picked array of primary sources. If you’re doing your job right, you can manage this without distorting the picture too much. But if you’re writing (or worse, revising) a history textbook, you’re probably not going to have the time to do it right. Even leaving aside the political implications (and I think that we should try to find a balance between wallowing in the errors of the founders and whitewashing them (except Columbus, who was an idiot)), you’re going to grab a well-regarded book on the topic and just use that. If you’re feeling diligent, you’ll use two. And there are books that do say what the textbooks say. They may be wrong, but they exist.
Edit: It’s also worth pointing out that information takes time to filter through history books. Apparently, the most damning report on Columbus was found in Spanish archives in 2006. The revised edition of Lies My Teacher Told Me was from 2008. I’d guess that Loewen was using the report to criticize books that mostly were published before it was found. But as I said, there’s no way the authors could possibly digest it in time to make even the one book published in 2007. Maybe it’s a fraud, written by someone who was trying to smear Columbus. Maybe they’d already done that section before it became available. It’s entirely fair to critique a book as being wrong according to current knowledge. It’s another thing to attack the author for not being on the bleeding edge, particularly when he’s doing something as ponderous as a high school textbook.
Edit 2: It’s also worth pointing out that primary sources are often terrible. To use a personal example, I’ve had numerous discussions with former naval personnel who were absolutely certain of things I can document as being false. The most common is the claim that battleships move sideways when they fire, which is disproved by simple physics. But someone who has an axe to grind can drag out some testimonials and accuse me of “ignoring primary sources”. In this case, physics is on my side, but most issues are more complicated.
Edit 3: This whole thing reminds me of Scott’s discussion of the dissemination of medical ideas. This is a case of the journalist half of level four pouncing on the teaching half for not updating fast enough.
That sounds odd. If the battleship is firing to one side rather than ahead or behind, physics implies a force on the ship in the opposite direction. It might not be enough to move the ship very far, but how does physics disprove that it moves the ship at all?
Strictly speaking, you are correct. There is some sideways movement, but it’s so small as to be meaningless. You’re looking at 7.5″/second for Iowa if she’s really lightly loaded and a hovercraft. In practice, with the ship running into a wall of water 30′ high and 887′ long, it’s going to be pretty hard to measure, let alone feel. I’ve had someone tell me he saw the ship jerking sideways a considerable distance, which simply isn’t possible.
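For what it’s worth, a figure in that neighborhood falls straight out of conservation of momentum. Below is a back-of-envelope sketch; the shell mass, muzzle velocity, and displacement are approximate public figures for an Iowa-class ship (not numbers from the original comment), and ignoring water resistance is wildly generous, which is the point:

```python
# Hedged back-of-envelope: sideways velocity imparted by a full
# nine-gun broadside, by conservation of momentum, ignoring all
# water resistance (the "hovercraft" assumption).
shell_mass_kg = 1225          # 16" Mk 8 AP shell, roughly 2700 lb
muzzle_velocity_ms = 762      # roughly 2500 ft/s
num_guns = 9
ship_mass_kg = 45_000 * 1016  # ~45,000 long tons, lightly loaded

shell_momentum = num_guns * shell_mass_kg * muzzle_velocity_ms
ship_velocity_ms = shell_momentum / ship_mass_kg
ship_velocity_in_per_s = ship_velocity_ms / 0.0254

print(f"Sideways velocity: {ship_velocity_ms:.3f} m/s "
      f"({ship_velocity_in_per_s:.1f} in/s)")
```

With these inputs it comes out to roughly 7 inches per second, in the same ballpark as the figure above, before any water resistance slows it further.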
There’s no reason the ship couldn’t rock, though.
It seems more likely to me that the ship might tilt slightly (list?) rather than actually move through the water. I would expect it to be a pretty small effect either way. I also freely admit I haven’t thought very hard about this and this was just my intuitive reaction when I read your earlier comment.
Edit: Ninja’d by Hoopyfreud
I analyzed that, too. It’s another vanishingly small effect. I got 1.42 deg/sec rotation rate with an unreasonably light ship, and it should damp pretty quickly.
As any sailor would attest, you also have to account for how much of that side-force is translated into forward (or backward) motion of the vessel. In water, ships have a very low coefficient of friction along their axis and a very high one across it. Sailing is like squeezing a banana.
I feel that as I have mastered this I am now ready for the transatlantic crossing in a Laser.
1.42 degrees/second would be quite noticeable when you’re 30+ feet from the center of rotation (ie on the deck) though. Even assuming it’s only 1 degree/second, that’s still more than half a foot of motion. The naval accounts strike me as plausible, especially if they were higher up (ie on the bridge).
@tossrock
That’s a reasonable point, and it’s something that would definitely be noticeable if it were to happen in an office building. It would probably be noticeable if we were to blow away half of San Pedro or most of the Port of Long Beach. (Although you’d notice the noise and devastation a lot more.) But battleship guns are almost always fired while the ship is at sea, and that means the ship is moving. 1.42 deg/sec isn’t a very fast roll at all, and I rather expect that anyone with any time at sea would just tune it out.
That said, I crunched the numbers on this, and it looks like the total extent of the roll might reach as much as 22 deg, assuming a very light ship and no damping. These are both really bad assumptions. I’m not going to crunch the numbers now, but a more realistic configuration is going to struggle to break 10 deg with no damping. I’m not sure how fast damping goes, but a 10 deg roll isn’t that big, and in this case it’s pretty slow, too. It might or might not be noticeable, but it’s definitely not the level of violence you get from talking to some former crew. (More details on the math are in a comment on the relevant post at Naval Gazing.)
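A crude way to sanity-check numbers like these is to treat the ship as an undamped oscillator: the broadside delivers an initial roll rate, and the peak angle comes from equating the resulting rotational energy with the righting energy M·g·GM·θ²/2. The radius of gyration and metacentric height below are my own guesses, not figures from the comment above, and the answer is very sensitive to both, so this is an order-of-magnitude sketch only:

```python
import math

# All inputs are rough guesses for an Iowa-class ship; the result is
# very sensitive to the radius of gyration (k) and metacentric height (GM).
M = 45_000 * 1016   # ship mass, kg (~45,000 long tons, lightly loaded)
k = 13.0            # roll radius of gyration, m (guess, ~0.4 * beam)
GM = 2.0            # metacentric height, m (guess)
g = 9.81            # m/s^2

I = M * k**2                                # roll moment of inertia
omega0 = math.radians(1.42)                 # initial roll rate from broadside, rad/s
omega_n = math.sqrt(M * g * GM / I)         # natural roll frequency, rad/s
roll_period = 2 * math.pi / omega_n         # seconds
theta_max = math.degrees(omega0 / omega_n)  # undamped peak roll angle, deg

print(f"Roll period: {roll_period:.1f} s")
print(f"Peak roll (undamped): {theta_max:.1f} deg")
```

With these particular guesses the undamped peak is only a few degrees; a much lighter ship with a smaller metacentric height pushes it toward double-digit angles, which is why the light-ship assumption matters so much to the answer.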
Columbus, like many historical figures, is a mixed bag. Leaving aside whether discovering America was the dawn of a new era or the beginning of a catastrophic end to thousands of years of history, he was controversial even in his own lifetime. I think the current historical perspective on Columbus is that he was an ass who got lucky, but that covers two thirds of the people you heard about in history class, so he’s probably not the worst of the lot. It takes a certain type of person to enter the books; rarely are they good people, and very few are people you’d want to share a drink with.
To quote Mal Reynolds: It’s my estimation that every man ever got a statue made of him was one kind of sumbitch or another.
The book was required reading for me in AP US by my conservative teacher over a decade ago, so that gives you some idea of what history class is like right now. I would say all of it was accurate at the time of publication, though very one-sided. There is basically a division in history curricula where teachers lie to regular students but tell the truth to AP students. However, Loewen never considers why lying to the regular students might be the better option.
It is extremely hard for an average teacher to teach below average students history. These students barely pay attention, will only remember one fact per 50 minute lecture, and do not have a good understanding of historical nuances. So for the Columbus example, it does not make sense to spend an entire 50 minute lecture on Leif Erickson, when his lasting impact on history consisted of Norwegian legends and small Canadian settlements that couldn’t survive. If you try to tell the students in passing about Leif Erickson while talking about Columbus, some students will get the two of them confused. It’s simpler to just tell students that Columbus discovered America for European royalty, which is essentially true, and had large effects on the world.
The next question is whether a teacher should go into discussion of how horrible a person Columbus was, which is more of a value question. Teaching that would give students insight into the origins of the slave trade and help them realize that heroic people can still be terrible people. But that one lesson will take 50 minutes, and a teacher has to decide what to cut instead. The economic reasons for the slave trade? Half of the Bill of Rights? Federalism?
It is very hard for teachers or textbook authors to decide what every American should know. Obviously a lot of states teach civics to increase loyalty, patriotism, and obedience to authorities, and prefer textbooks that don’t teach things unnecessary to that goal. Other states want to teach multiculturalism and independence. As much as it would be nice to teach every American all of history, that’s impossible, so educators have to decide what to prioritize. Your own values will probably decide whether you think certain educators do a good job, which is why I want to homeschool and cut out the middleman.
Tl;dr authors of history cannot include everything, so everyone presents a biased view, the only question is how much and in which direction.
Sometimes I seriously wonder if it’s even worth teaching any history at all to students before high school/university, if all you can hope to teach them is an outrageously mythological version of history. You might as well use this time to show them a reasonably historically-accurate movie about whatever Past Subject you want them to be aware of. No test, just movie watching, free of any stress or afterthought.
Very much tracks; the issue is that this applies equally well to every subject, and if we’re going to have to subsidize daycare anyway, we might as well at least try to do something productive with it. Probably the negative effects of too much (poorly) structured learning too soon balance out with socialization benefits; anecdotally, I think my own early schooling ended up a net positive, just not on academic grounds.
I recently started relearning a fair bit of history, and having learned the simplified basics made it a lot easier to piece together the next level of complexity.
I doubt I will go beyond this current level of analysis, but if I did, I would use this current version as the scaffold for the next.
Teaching the remedial history often backfires with more than just apathy about history.
A lot of those remedial kids don’t stay remedial for long, and eventually do come across Lies My Teacher Told Me or the Howard Zinn book etc., not long after high school. Because they never absorbed the big-picture story of history that AP students get (“every story has 15 or 20 interpretations that always exist in tension, and as historiography evolves and as we respond to different modern-day political moments, different historians emphasize different parts”), they are permanently locked into a contrarian bias that instead says “the state wants to brainwash you into one story – the answer is always the second, secret story you don’t find out until you grow up.”
If you look at a lot of the people who like to say things like “the Cold War could have been stopped day one if the imperialists had ceased their aggression” or “MLK Jr had affairs and thus did nothing of worth his whole life” or “Africans were selling Africans into slavery, so Europeans are morally neutral in the slave trade”, a lot of them were so betrayed by pedagogy that they feel like upholding the Hard Contrarian viewpoint (once you’ve uncovered it) is the only way an adult can signal that they are truly serious and intellectually honest about history, the only way to signal to society that they are the remedial who has overcome mediocrity, etc.
It makes it extremely hard to start explaining, later in life, “yeah, the American Civil War wasn’t about slavery the way 8th grade teachers say, but if you look into it, it was about slavery for X, Y, and Z reasons.”
The way I like to say it is that in high school, you learn that the Civil War was about slavery. Then if you go to college and take history classes, you’ll learn about tariffs, sectional rivalries, and cultural differences that all played their part. And if you go to grad school and study the Civil War, you’ll learn nope, it really was about slavery.
Whether the American Civil War was about slavery depends very much on which part of the country your school was in.
It’d be great if ilikekittycat, cassander, and bullseye could explain in more detail what they believe the US Civil War was really about. I’ve noticed that high schools teach one thing, and contrarians say another, but it sounds like you three have a more considered view.
My rough experience was
Elementary School: The Civil War was fought to get rid of slavery because slavery is bad
Middle School: Well, ackshually, it was about a bunch of factors
High School (AP): But really all those factors stem from slavery as an institution and its position in society
So it’s accurate to say that it Wasn’t About Slavery in the sense that it wasn’t expressly an Abolitionist Crusade. But when you boil it all down, it’s absolutely true that Slavery Caused The Civil War.
You end up back where you started but with a more refined and nuanced understanding. It is emphatically not “no, actually, the elementary school myth was right about everything and middle school lied”. The PBS Kids version of the story is that the Civil War was fought to free the slaves. Which captures the right spirit of the Civil War’s place in the Civic Mythos but is not really accurate. The confusion over the middle school detour stems from a) lack of time in the curriculum and b) the conflation, perhaps by both students and teachers, of “not about abolishing slavery” -> “not about slavery”.
This is further confuddled by the ultimate result of the war being the total abolition of slavery, but it remains teleology to claim that was why the war started.
@Gobbobobble:
It’s funny you should say “the PBS Kids version”, because PBS made an animated series about the American Revolution, Liberty’s Kids, that was somewhat nuanced. The Tory girl never gets to give any of several steelmen of the Tory position (like “while you violently yelp for liberty, we’re freeing your slaves”, which was Samuel Johnson’s version) so she ends up being what TVTropes called a FOX News Liberal. But she’s still there, and they show dark sides of the revolution like a Tory being brutally injured by Whigs tarring and feathering him.
Unrelated, but the series attracted a crazy range of voice talent, like Walter Cronkite as Benjamin Franklin, Ben Stiller as Thomas Jefferson, Sylvester Stallone as Paul Revere, and Ahnuld as General von Steuben.
I went to public schools in the South, and we were taught that it was only a little bit about slavery; mainly, they said, it was about trade policy that was unfair to the South, and a couple of other issues I don’t remember. I don’t think I was taught otherwise at any grade level.
From what I understand, in the North they teach that it was pretty much about slavery. And I had a friend from California who barely learned about the war at all.
The tv show Firefly, for those not familiar, is basically Wild West in space. There’s a war that serves as backstory, and to someone with a southern education it’s obviously the Civil War. But to others, it’s not the Civil War at all.
If I’m wrong about what other parts of the country are taught, please let me know.
As for what the war was really about, I think that’s a culture war issue.
@bullseye: The CSA representatives demonstrably held a belief that abolitionism was an attempt to destroy their agrarian economy, and that said agrarian economy produced cash crops that the British Empire would never be willing to do without despite its own hatred for slavery.
In the event, the British market was satisfied with Egyptian cotton and the Royal Navy never swept in to save them from Lincoln, nor did any other naval Power like France.
That’s still about slavery, but in a way that involves trade, the economic foundations of culture, and betting on British hypocrisy.
@analytic_wheelbarrow
It was about slavery.
I mean, don’t get me wrong. Factors like cultural differences, sectional rivalries, and tariffs were real and important. But all of them were ultimately either created or exacerbated by the existence of slavery. The war and its course certainly weren’t inevitable, but slavery was definitely the root cause.
@bullseye
I’ve always been entertained by how well Joss Whedon was able to write Mal Reynolds given how vastly different his personal politics are. As you say, Firefly is an allegory of the post-Civil War West, where a guy who fought for the losing side bosses around a black lady. But he’s shown sympathetically.
The standard line I’ve heard is that the North (and later its supporters) claimed that the war was about slavery but were actually fighting over states’ rights, while the South (and later its supporters) claimed that the war was about states’ rights but were actually fighting over slavery.
That’s glib, and I’m no historian, but I’ve never read anything which made me think that wasn’t broadly true. Hmmmm… I wrote some stuff but it was culture war-ish so I’ll leave that.
Perhaps the reason why the South declared war can be best described as being in defense of their way of life…in which slavery was fairly prominent.
@Aapje
Agreed on the core principle. Lincoln having won the presidency with zero southern states in support was a huge factor – but that links directly back to “our way of life” which included slavery as inseparable.
I majored in American history undergrad, and my personal ranking of the causes of the Civil War is economics, states’ rights, then slavery. In a counterfactual world where the South was incredibly wealthy even if they freed the slaves, it is highly unlikely that there would have been a civil war. In a counterfactual world where the U.S. still used the Articles of Confederation, or was completely unitary, the Civil War probably would not have happened. In a counterfactual world where there was no slavery but instead low-wage, oppressed workers, and the South was objecting to minimum wage and OSHA, I could still see the Civil War happening because of the first two issues, even if it is less likely.
I grew up in the South and was taught that the war was explicitly about slavery at pretty much every education level (although I never took a US history course in college). That may not have been the experience of most southern kids going through the school system, but that’s how it went for me.
I asked my Facebook friends about this; my experience isn’t as universal as I assumed, but it’s common.
There’s an Andrew Jackson quote about the Nullification Crisis where he says that those motherfuckers are going to try that again someday and the pretext for it will be slavery.
My guess is that part of it was a conflict between egalitarian nationalism and diverse but caste-stratified imperialism — which is what we’re still arguing about today, but the sides have switched.
Some history will be necessary, if kids are reading anything historical at all, and if they know what a fairy tale or a parable is, or they or their parents grew up somewhere else, they’re gonna know some history.
Also, kids can (ought to?) be voracious readers, so that’s probably the time to get them reading about it. Building vocabulary takes a while.
The idea of filling most of history class with movies is interesting, though. You could show them movies that are or were popular, but set in some period. Alternately, you could introduce it as a branch from whatever else they’re learning about. If they’re learning about electricity, tell them about Franklin and Faraday. If they’re into D&D, that’s when you introduce them to medieval Europe.
An argument for not teaching anything at all until eighth grade will, I suspect, ultimately morph into an argument for unschooling.
@aristides
I was a below average high school student (good in freshman year, and worse each subsequent year because of absences until I tested out, and that the test would do that was a surprise to me!).
Maybe because they were front-loaded, but history and sociology were my best high school grades, though I had stolen the 10th grade textbook in 9th grade so I had an advantage.
I think that I got a B for European History and a C for Cultural Anthropology my one semester of community college, unless it was the reverse.
I’d submit that your real advantage was that you were the kind of person who stole textbooks in order to actually read them …
I spent ten years in Texas, where I can say I learned very little history-wise, except I will never forget what the Alamo was or the Battle of San Jacinto. It took moving back to Scotland to be taught about World War 1 and 2 at school, and since then pretty much all my history has come from Wikipedia.
Friends here are shocked when I don’t have even a basic understanding of things like the Bolsheviks, or ask what Apartheid was. Obviously I’ve now read books on history in the years since school and find them fascinating, if only because it was effectively kept from me for so long.
I guess you want students to have some idea of the history of the world, but ultimately, like most things, at the end of the day only those who are actually interested will learn anything.
I’m not quite sure how much merit there is in presenting the “these are still bad people” angle in a historical context/textbook.
Though people are flawed, especially when viewed with modern morality, is there reason to push their flaws along with their accomplishments in a historical presentation?
As in, should you learn that Newton was a believer in alchemy and spent much effort on occult studies when learning about his contributions to science? wiki link (Not sure why there aren’t more Lovecraftian tie-ins with him, though…)
Should probably be covered in a more in-depth biographical study, but when learning “history”?
The question of what narrative to teach, whether to take those flaws into account when people are elevated to herodom, and whether the way history is taught creates bias on current issues is important, but that’s often more of a practical, political, and philosophical argument than a factual one.
Also, the pushback tends to be pretty bad too (and undermines the whole “teach them there is uncertainty, and this will get people to research on their own and find the ‘truth’” approach).
For example, the evils of Columbus have been exaggerated, or simply lied about, quite a bit.
This video is a decent takedown on some of the more outrageous claims.
https://www.youtube.com/watch?v=ZEw8c6TmzGg
Regarding Newton and occult tie-ins, Monster Hunter International by Larry Correia (not the first book but one of the later ones) actually uses a Newton invented superweapon to kill an elder god from another dimension. So, somebody decided to use it. In case that tidbit sparks your interest, you’re welcome.
Not really – the point isn’t that Newton was a genius (and certainly not a Tesla-like mad scientist), but rather that he fits better as a crazed scholar studying the Necronomicon, one who possibly belongs to a cult and sometimes decides to point out “obvious” insights in mathematics and physics during his breaks (which his more sensible friends urge him to publish). His attempts were probably more likely to summon an elder god than to fight it.
You’d think there’d be plenty of places to put a fitting cameo or two.
If I recall correctly, it was just a Newton invented superweapon. Someone else in [current day] employed it. Newton messing with the occult to build a superweapon does fit your description. I’m unaware of any other cameos, but I found you one.
Not to be too gauche, but Newton’s interests in the occult play an important role in my own novel. They’re only in the background, though–the actual magician is a young, purely fictional friend of his. The more important theme is really the struggle between his occultism and the theological and (also very Newtonian) scientific convictions of his friend, who introduces post-Keplerian, heliocentric, gravitational theory to a fantasy world that does not have it.
I’m afraid I don’t know of any significant fiction works that include Newton. Renaissance magicians more generally? You do get those, of course, even in a famous children’s book like The Trumpeter of Krakow.
The various evils of Columbus are not particularly exceptional, historically, but that’s still a level of evil which is hard to significantly exaggerate.
One of the examples is of people blaming him for a practice (sexual exploitation of the natives) that he was complaining about.
Is that not an exaggeration?
And while there is some truth to the idea that if you can get around to believing that he shares the glory for the creation of the USA and the western development of the Americas, he should also share the blame for the bad things that were caused by them, the causal link tends to be much weaker (and more metaphorical) in the former case than in the latter.
I can’t watch the video right now, and so don’t have anything to say about that specific example; in any case I don’t know that that’s very important in the face of the whole mass-torture-slavery-and-genocide thing.
I don’t recall having any sort of watershed moment where I learned that all my heroes of history were also bad people in some way, but I do know that I knew some of them were jerks as early as I can remember. My takeaway was that even imperfect people can make the world better by my standards, which means I probably could, too. Seems like a worthwhile lesson for anyone.
Columbus absolutely discovered America; he just wasn’t the first European to get there. Discovery credit doesn’t go to the first person to stumble upon a thing; it goes to the first person to stumble upon it and tell everyone else. Like suppose I become the CFO of a corporation, and I find that there’s a large criminal operation, and the subsequent news reports go something like, “New CFO of Americorp discovers massive money laundering scheme.” Do you think it’s a fair criticism to say, “Well no she didn’t discover it, the money launderers obviously already knew about it. She wasn’t even the first non-money-launderer to find the scheme either; a couple of VPs had come across it too, they just kept it quiet.” That’s a pretty ridiculous objection, no? Discovery is the process of making an unknown thing known, and both Columbus and our hypothetical CFO are the ones who did that.
Also it’s funny to me how Americans get all shocked and appalled that Columbus might have committed an atrocity or two. When I was in elementary school, we were told matter-of-factly that Columbus enslaved the natives of Margarita and worked them all to death collecting pearls. Consequently, the first time I heard people criticize Columbus Day by pointing out his crimes, my reaction was, “And? What’s your point here?” It didn’t occur to me that we obviously shouldn’t celebrate this man because he did bad things, since I’d known he did bad things the whole time I’d been celebrating him. There’s something to be said for exposing kids to the ugly truth of history when they’re young so that they accept it as normal.
That said, the whole thing about Columbus and the roundness of the world is a complete fairytale, but so is the take that he was an idiot who got lucky. The thing of it is that Columbus was completely certain that there was a landmass about 3000 miles west of Europe. He knew this because he had studied the Atlantic currents and correctly surmised the existence and shape of the Atlantic Gyre. It’s also likely he had heard tales of Portuguese fishermen, who were by that point fishing off the Grand Banks, and had probably sighted and possibly even landed in Newfoundland. The problem came when he decided the landmass in question was Asia, and therefore either the Earth was a third of the size everyone knew it was, or else Asia extended thousands of miles further east than everyone knew it did. Neither of these things was true, and everyone correctly told him so. Nonetheless, he was himself correct about the size of the Atlantic Ocean, and being a sailor, he was more certain of this than of the size of the Earth or of Asia.
Ah, I’ve never heard this before. It answers the question raised by David’s post above, namely, why was he so confident that he could reach land where he did?
Maybe Columbus knew something and didn’t just get lucky, but we don’t know what he knew. Maybe he knew about Leif Erikson. Maybe contemporary fishermen had gone past Greenland, but there isn’t much evidence for that. More plausibly, they reported suspicious debris. Columbus visited Britain and maybe Iceland, so he had more potential sources than Iberian fishermen.
Maybe he deduced something from the Atlantic gyre, but I think you’ve mixed up two stories. Understanding the ocean currents was how Columbus was able to travel even 3000 miles and back. That was definitely an advantage he had over his predecessors, but I don’t think it has much to do with the length of the ocean.
Columbus thought he was going to the Indies. Even if he had heard enough rumors to believe that there was something out there, he still ended up just a couple hundred miles off from being as lost as it is physically possible to be.
He settled on that after the fact, and his opinion hardened after a big personal breakdown on one of his subsequent voyages. Supposedly he would alternate between objectives as part of his campaign to get funding for his first voyage. To some, he would talk about a shorter route to the Indies and China using his pseudoscientific view of the world’s size. To others, he would talk up the potential of finding new island chains and land that could be settled and exploited like the Azores and Canary Islands.
Define “everyone else”. Because I’m pretty sure those uncontacted hunter-gatherers on the Andaman Islands still haven’t gotten the message about the New World. And on the other hand, Leif Erikson told the general community of Icelandic mariners, to the extent that they were still making timber-harvesting voyages to Newfoundland four hundred years later. So is “discovery” a thing that absolutely cannot be done prior to the invention of the printing press, or do you just have to devote your life to sailing around the known world telling people what you have found?
Now, given the finite spread of information in the pre-Gutenberg era, or even to some extent the modern era, it is entirely possible for two people to legitimately and independently discover the same thing even if the first one does satisfy any reasonable publication requirement.
Both Leif Erikson and Christopher Columbus discovered America, but Columbus’ discovery was orders of magnitude more momentous due to a confluence of technological and sociological factors. The point is that “Columbus discovered America” is an unequivocally true statement unless you start adding extraneous and erroneous qualifiers like “first person” or “first European”.
We know that Columbus was an idiot who got lucky because we have a summary of the presentation he gave to the crown when requesting funding. He thought the earth was smaller than it really is because he had a mistranslated copy of Eratosthenes’ book. His contemporary scholars recognized this error and told the queen not to sponsor him. The queen decided she liked him and overruled the advisors.
Maybe he knew how big the Earth was but thought lying about the size was a better way to get funding to find the continent he somehow surmised was there.
Every piece of evidence we have says that he very firmly believed he’d reached the Far East up until the day he died. (This is why I classify him as “idiot” instead of “visionary”.)
Yes every piece of evidence, except for how Columbus wrote in his journal of the third voyage, “I have come to believe that this is a mighty continent which was hitherto unknown. I am greatly supported in this view by reason of this great river, and by this sea which is fresh.”
Columbus initially believed he had reached some islands short of the Far East. Which was more or less what he was expecting to happen. After further explorations he began to realize he had discovered a whole new continent. The river in question, by the way, was the Orinoco in what is now Venezuela. He named the new continent Paria, and while the name mostly did not stick, a peninsula north of the Orinoco river delta still bears the name.
That said, he certainly was not expecting the entire Pacific Ocean to be between him and his actual target. The object of the fourth voyage was to find a strait through the newly discovered continent in order to get there. Of course no such strait exists, so it was rather fruitless.
I just looked at the first page of the appendix of I think the 2007 edition (that is the only page of the appendix that appears on Amazon’s free preview). The books listed there are pretty old; the newest is 1990, and several are from the 1970s. Maybe the author examines more recent books that appear later in the appendix, but if he didn’t, he certainly should.
FWIW, the California History-Social Science standards adopted in 1990 don’t seem to whitewash too much; for example, the 5th grade standards include this: “Discuss the role of broken treaties and massacres and the factors that led to the Indians’ defeat, including the resistance of Indian nations to encroachments and assimilation (e.g., the story of the Trail of Tears).” But that is CA. The standards are here: https://www.cde.ca.gov/be/st/ss/documents/histsocscistnd.pdf
Re: Columbus, to respond to some of the earlier comments, I was taught about Leif Erikson, etc. when I was in elementary school in the 1970s (again, in CA). But that raises the question of why we have students study history. Leif Erikson might well have arrived in N. America well before Columbus, as may have explorers from East Asia. But, so what? How were those events historically significant? In contrast, Columbus’s arrival was one of the most significant events in world history. It changed the lives of hundreds of millions of people on every continent by creating demand for labor in the Americas and by introducing what swiftly became staple crops in Europe (e.g., potato, corn) and in Africa (e.g., cassava). So, yeah, kids should certainly know about Columbus. Should they be taught that he was a “hero”? Probably not, but then that sort of “great man” history is pretty lousy in general.
I haven’t read the book, but to take your example, my understanding is that pretty much everyone who knows what they’re talking about agrees that Columbus was a uniquely horrible human being, even judged by the standards of his time, which were obviously very different from today’s. If you have to pick figures of pure evil, your Hitlers and Stalins and so forth, Columbus is a very good candidate for joining them.
The first edition is good, albeit dated now in parts because it came out nearly 25 years ago. The “ten year version” with added content is . . . not good, mostly because Loewen went all in on fears about Peak Oil (rendering it instantly dated in only a handful of years).
Obviously a lot of nuance gets lost in a general book, but it’s still a solid read. I’d recommend reading 1491 and 1493 by Charles Mann as follow-up books with much more recent information and writing.
Isn’t Lies My Teacher Told Me the one where he talks about Africans visiting the Americas before Columbus? I stopped reading after that.
@Atlas
It seems most likely to me that there are things that introverts are better at and that extroverts are better at, but also that the latter tend to be overvalued and the former undervalued, and the former catered to too little and the latter too much.
So I would say that we could do with some push to cater more to introverts, but it’s absurd to demand that society be designed around them.
This kind of demand is the classic narcissist/low empathy cry: design society around my needs and all important needs (of me) will be catered to and only the unimportant things (to me) will not.
That is obviously wrong and actually quite immoral. Let’s stop paying for healthcare then, since it’s very expensive, better pay people to produce paintings instead. The people who suffer from easily treatable diseases & the family and friends of the sick will adapt their thinking. They’ll be no more sad for losing their small child than they are now after stubbing their toe.
I wouldn’t even go that far. How do you calculate the true value of introverts vs. extroverts? What would valuing introverts more even look like?
One might say, for example, “more introverts in positions of power”; but I could argue that effective administration and management require extroversion traits, so an average introvert would perform poorly in such a role. Or you might say, “introverts should be getting paid more (on average) than they are now,” but how would you enforce that? Would you even want to?
(These are just random examples off the top of my head, I’m not here to strawman anyone)
The idea that people with more power should necessarily get more pay than a non-manager is a very extroverted idea in the first place. It seems to me that this could be changed a bit.
Anyway, having more objective criteria to judge people helps silent workers over people who ‘sell themselves’ better.
There are also other ways one can favor introverts. For example, the Dutch national railway company introduced ‘silence carriages’ where people are not allowed to be noisy.
It directly falls out from the fact that those who have the power set the pay. That’s one reason skilled non-leader workers are so often resented by management; it upsets what they consider to be the natural order to have to pay such extravagant salaries (or not get the workers).
There are ways to tilt the balance of power back to the professional and weaken the managers. The system that especially the Anglo-Saxon world has chosen is not inevitable.
What ways? And if you say “government” I’ll point out that you’ve just introduced a third entity which has more power than the other two and consists of management types.
We have “quiet cars” here too. They are of some, but limited effectiveness. People will play their headphones loud enough to annoy; nothing can be done about this. People will talk on their cellphones; whether this can be stopped depends on whether anyone else around with a high enough CHA to win a confrontation does so. I’ve seen people succeed, but if I try it fails every time. People will bring their crying babies; nothing you can do about that if the parents aren’t willing to.
Very flat organizations are one way. An example is this fairly successful home-care organization.
This is more in accordance with the Rhineland organisational model, rather than the very hierarchical and amoral Anglo-Saxon model.
As for the noisy babies, Psalm 137:9 has the answer (just kidding).
Organizations flat on paper just mean you have an unacknowledged status hierarchy, which extraverts will tend to float to the top of.
I believe that one difference between the Dutch quiet cars/sections (which are quiet) and those in other countries, particularly IME the UK (which aren’t) is that the Dutch are more willing to tell someone breaking the rules by making noise to be quiet.
Another is that it is impossible to reserve a seat on a Dutch train. Meanwhile, on British trains the quiet carriages tend to be on longer-distance trains where it is possible to reserve a seat (although unlike in some countries, you can also travel without a seat reservation).
While I believe it is possible in some cases to pick the seat you reserve, it is also very possible to end up with a seat reservation but no choice as to where on the train it is. So you end up with groups of people who may want to be noisier than is acceptable in the quiet carriage but whose reserved seats are there, and unless they can negotiate a swap (unlikely for all sorts of reasons), if the train is crowded their only choices are to sit in their reserved seats or to stand.
Extraverts are better at getting other people to do things for them. Introverts are better at everything else. Unfortunately that one skill counts more than everything else.
Downright untrue, as much as I wish it weren’t, and rather unkind when unqualified.
[Young man holding book looking at butterfly. Text: IS THIS PUNCHING UP?]
(This may be an uncharitable interpretation, but I’m annoyed enough at perceived “boo outgroup” to make it)
“Extraverts are better at getting other people to do things for them. Introverts are better at everything else. Unfortunately that one skill counts more than everything else.”
You may be just worlding a bit here. No reason everyone has the same amount of points on their character sheets.
Posts such as this seem to posit a world that is utterly narcissistic, which is to say, wtf?
A person can be quite introverted, but nevertheless respected. Likewise, a loud, boorish extrovert can generate exponential levels of eye rolling wherever they go.
To a point. But not as much as the extravert who employs many respected introverts.
Even for the loudest, most boorish extrovert, the sky is the limit.
@The Nybbler — It seems as if you’re trying to build a rigid binary between the evil extroverts (who are all dark triad types) and the good-hearted introverts (who are all unfairly held down). The world is more complicated than that. Some people are popular because they are decent and kind. Likewise, some introverts are actually rotten narcissists who suck at being narcissists, and who thus sit around bitterly resenting everyone around them.
Most people float around in an ambiguous center. It’s actually a nice enough place to be.
I’m not saying they’re evil or “dark triad”. I’m just saying they have an extremely significant advantage. Sure, some extraverts will squander it (just as some people who are 7′ tall suck at basketball), but that doesn’t mean it’s not there in general.
Wait, what does it mean to “squander” it? Squander what?
The Nybbler’s wording is needlessly confrontational, but I agree with him in principle.
Our society has long passed the point where a single person can reliably accomplish anything significant (barring a few outliers with over 9000 IQ, perhaps). Most valuable accomplishments rely on a team of specialists working together to achieve a common goal; extroverts are essential for creating and maintaining such teams. Therefore, extroverts are highly valued. Note that an extrovert who is actually effective at leadership (i.e., one who is able to build effective teams) will be more successful than one who is not (i.e., someone who is highly charismatic but unable to find and retain productive team members).
Sure, it’s nice to dream of a society where people like me somehow happen to be on top (it would surely be convenient!), but I see it as unlikely. Extroverts are in charge because this arrangement works, not just because of some memetic trick.
@Bugmaster:
I believe you are mistaken in two respects. To begin with, there are valuable things that can be done by an individual, such as writing a good novel or coming up with an important new idea in some scientific field.
Aside from that, even if it were true that important things need teams and that a team needs an extrovert leader, how valuable extrovert leaders are depends on how many there are. If there are several times as many as are required for teams then many of them will be doing something else and the ones leading teams won’t be all that well paid, since substitutes will be available.
@DavidFriedman:
I could grant you the novel — except that, without some extroverted assistance, virtually no one would ever read it. I might be persuaded to grant “coming up with an important idea in mathematics” (with the same caveat), but not in science. Science relies on experimental verification, and modern experiments take a lot of work.
True, but the same holds for introverts.
@veronicastraszh
Let me rephrase it in a way that fits the way your ideology likes to frame things: the claim is that extroverts have huge unearned privilege.
@Bugmaster
I think that the evidence suggests that extroverts get rewarded beyond their actual contribution, because advertising, bullshitting, being immodest, etc. work.
It may be true that a top extrovert produces more value for others than a top introvert, but that doesn’t make it fair if an extrovert who produces equal value to an introvert gets rewarded more.
I think that you and David are falsely stereotyping introverts as loners. It seems to me that a job where a scientist spends most of his day doing experiments alone or ‘alone together’ (in the same space as others, but with little interaction), but interacts pretty intensely with a few others for a fraction of his time, is more of a job for an introvert than for an extrovert.
This is indeed the consensus among university administrators, grant funders and other policy makers.
In related news, science is no longer making progress.
Inferring causality (and its direction) is left as an exercise for the reader.
[redacted]
On second thought, maybe this is too close to engaging in CW or at least inviting a CW discussion to start to leave here. Maybe a discussion can be started on the next CW thread.
@Aapje:
You might be right, though as I said above, analytically determining the proper reward for someone’s actual contribution is virtually impossible. That said though, the same skills that make a person good at leading others, make him good at “advertising, bullshitting, being immodest, etc.”, so I doubt you’d ever be able to separate the two domains.
Well, I’m an introverted loner, so I admit to the bias. That said though, the scientist in your example is not working alone; he just thinks he is. In reality, he is (most likely) being supported by an organization that caters to his needs, and directs his efforts. Someone is in charge, and it isn’t him. If the scientist was in charge, he wouldn’t be doing science, he’d be doing management, and that would be a waste of resources. On that note:
@The Do-Operator:
Last time I checked, modern scientific experiments involved doing things like building giant multi-mile-long buildings to minute tolerances, launching probes into space, and operating supercomputing clusters. Can you show me a person who can do any of that all by himself, with his own two hands?
Science is still making progress, but every new discovery is harder than the last — otherwise, it wouldn’t be much of a discovery. Scientific administration is far from perfect, but that’s not the same as saying that it serves no purpose at all.
You all are treating this “status thing” as a dichotomy between introvert and extrovert, but I don’t think that is the correct axis.
Some of you, according to your descriptions, seem to have a fundamental problem navigating social spaces. However, an extrovert can be either likable or unlikable, and likewise for an introvert. After all, the distinction between extroversion and introversion seems to hinge on how much you are energized by social interaction, not on how likable you are. The same applies to being effective on a team.
Not true. You can self-publish a book by yourself with no extroversion required. If it’s very good, there is a significant chance that enough people will notice and spread the word.
I don’t know if you count economics as a science, but Ronald Coase got (and deserved) a Nobel prize for a set of ideas he apparently came up with when he was about twenty and ended up publishing in two articles, each of which set off a sizable literature.
It’s not a matter of navigation but of conflict.
It’s not about likeability. Even an unlikable cuss can get his way most of the time.
So what is the precise conflict?
I don’t know if I count as an “introvert” (I’m likely in the murky middle between the extremes). However, I hate “networking.” I suck at it. In fact, I skip most “team lunches” and usually have lunch alone (and read math books). The only “social-ish” things I regularly do at work are various math-and-science reading groups.
That said, I have one trick: I make sure to build a few stable ties to people who do like to network. I work to earn their respect. That way, when I need to reach out for help with a problem, I can ask them to point me in the right direction.
I guess I can imagine a more “conflict driven” workplace. For example, I think sales and finance are pretty dog-eat-dog by default. (Unless they aren’t. The only thing I know about those jobs is what I see in movies.) I can also suppose that if one is playing the “sociopathic ladder climber” game — well, that is something certain personalities can do, and others cannot.
So don’t do that, unless you feel trapped in such a social space. If you feel trapped — well that is situational, not essential.
(Note, I work for {big tech}, which has a famously effective work culture. I realize not every work culture is this good.)
Let me add, I don’t spend much time worrying about “getting my way.” I worry more about “getting things done.”
This, I think, reflects attitude.
@Bugmaster
I think you misunderstand my objection. I’m objecting to the idea that introverts are unable to have or are always less effective at jobs that have a substantial cooperative element.
Cooperation is not the same as intense social interaction. You can have lots of cooperation with only sparse social interaction, and a lack of cooperation with lots of social interaction.
@DavidFriedman
In my country, readership seems heavily affected by whether or not the writer appears on talk shows and such.
Like many smart writers, you seem to have arranged things so you never needed to live off royalties, and the sales of your books can thus be relatively low. And your books do seem to sell only a tiny fraction of the sales of Michelle Obama’s memoir.
Surely true.
But I was responding to:
I may be mistaken, but I believe that my books accomplished something significant. If not mine, then those of other people.
Making a living by writing is difficult, although some people manage it, and I suspect some of them are introverts. But making a living is easy in our society and quite a lot of people manage to do it while leaving enough time to write books on the side.
@Aapje:
I will grant you some degree of cooperation, but not leadership. And I was talking primarily about leadership (or management, if you prefer): the ability to find, attract, and organize other people working toward a common goal.
@DavidFriedman:
I don’t really want to get into a discussion of how you determine “significance”. However, I am compelled to point out that I had never heard of you or your books until I happened to read some of your comments on this blog (despite the fact that I find the topic interesting); on the other hand, almost everyone, including myself, has heard of Kim Kardashian (despite my total lack of interest in her).
@Atlas
Extroverts are better at things that involve dealing with (one is tempted to say manipulating) people, while introverts are better at dealing with things. Many very important jobs that underlie civilization are presumably more in the wheelhouse of introverts, like most engineering, science, transportation, etc.
My very subjective and probably biased impression is that while extroverts can provide important ‘glue’ for society, we have more extroverts than is good for us.
Sure, but that reduced self-esteem is in large part an artifact of the move to a hardcore meritocracy, where we measure people primarily by their contribution and we expect people to meet a very high standard.
Perhaps we can bring back some more basic respect for people who play by the rules. Then we might also reduce the issue of people trying to find respect in a narrative of oppression or (mental) illness or such: ‘I would be king of the world if not for being oppressed by group X or because I have ADD/dyslexia/autism/etc.’
I reject the premise that particle accelerators are necessary for scientific progress. There will always be room for better theories. All you need is a desk, paper and a wastebin. I have seen major scientific progress occur in our lifetimes using nothing more than human cognition. (look up the reference in my username).
You can come up with plenty of theories using a piece of paper. However, without testing, this is just an exercise in intellectual onanism.
Mathematical models require physical validation.
The validation doesn’t have to be done by the person who comes up with the theory, however, and the theory by itself can be an important contribution. Also, the theory can be used to explain experimental results others have already come up with.
And in some fields, a single person can do both. My first economics journal article was a theory of the size and shape of nations, complete with tests of the theory using data that I could find in a library.
Yeah, I know a lot of folks who have done serious work in crypto either alone or with a few coauthors with whom they communicated mainly over email and seldom saw face-to-face. For that matter, I’m *one* of those people.
“The greatest happiness is to scatter your enemy, to drive him before you, to see his cities reduced to ashes, to see those who love him shrouded in tears, and to gather into your bosom his wives and daughters.”
These words are attributed to Genghis Khan. Your client firmly believes they describe the best possible life for a man. He has retained you to advise him on how to live up to this ideal.
This may be difficult, because he is in most respects quite an ordinary man. He is 35 and physically unremarkable. He used to work construction, but now works as a bookkeeper for a construction firm. He is married with a wife and three young children.
What advice do you have for your client?
“Lower your expectations.”
More seriously, I’d suggest cashing in his wealth, retaining a bunch of mercenaries, and moving to South Sudan or the Congo, where he would seek to become a warlord. The result would likely be his death, but for every Genghis Khan there are thousands of Genghis Khan’ts.
Those places probably have too many competent soldiers. Better to look for remote primitive tribes, where basic proficiency with an assault rifle goes a lot farther.
Those tribes are useless, because they’re isolated. Nothing to conquer once you’re the top dog. And if you want to prey on non-primitives, that’s not different from dysfunctional Africa.
Presuming you’re insane enough to actually help him, the natural choice is Afghanistan or Somalia. Both have weak states and have had them for decades. Both have other warlords who have local power bases rather than national ones. Both are extremely poor. Presuming he’s saved 10% of his income, he should have assets worth about $200,000. That’s enough to hire and equip somewhere in the neighborhood of a company of local soldiers with slightly above-average weaponry (though nothing compared to Western soldiers or their Afghan counterparts).
He’s still highly likely to die. But if he succeeds in taking over either of them, he’s mostly surrounded by weak states. Especially in Afghanistan. He might even be able to get US support by putting pressure on Russia, Pakistan, and Iran while allowing them to extricate themselves from Afghanistan.
Before sending him to Afghanistan to gain power, have him read Kipling’s short story “The Man Who Would Be King.”
Or at least watch the excellent movie.
Where’s the button to report a comment for Comment of the Week (for “Genghis Khan’ts”)?
Cf. Mark Thatcher. But yes.
Make really puny enemies. Like an ant queen. That looks doable.
I suppose he could join the police. He wouldn’t get to burn the cities of his enemies, but eventually he’d get to bust down their doors gun in hand and haul them off in handcuffs. But maybe 35 is too old to join the force.
My friend’s boyfriend is in his 40s and going through police training right now. He used to sell insurance and decided he’d rather be a cop.
Yeah, but taking their wives and daughters into your bosom is a lot less fun.
Yeah, the internal affairs department gets really cranky about that sort of thing….
That is less true than we would like to hope.
Ref: nearly every https://maggiemcneill.wordpress.com post ever, and about 15% of the posts at Reason.com
This, but unironically. Let’s get medieval on the mosquitos.
Take up an online game such as Eve.
+1, and now I think I’ll have to re-download that.
This. Eve specifically is about being a complete ass to everyone else within the structure of the game.
Ya, lots of games might somewhat satisfy the itch and let you reduce your rivals’ cities to ash and have their people enslaved weekly, rather than it being a one-time thing… all in a socially acceptable way.
My first thought would be to stake them with some seed capital and let them participate in the commodity, bond, or derivative markets. Those markets are pretty well known for being very competitive, allow trading on insider information, and leveraged enough that failure is often accompanied by real tears. It’s a bit less obvious now that trading moved out of pits, but the same competition is still there.
For a lot less cost, Eve Online is a pretty close substitute.
Harass people on the Internet.
I like the “pick up a massively multiplayer PvP game” idea above, though I’m not sure about the “gather into your bosom” part. Then again, I think most interpretations of “wives and daughters” in this context are either heavily skewed or extremely repugnant to mainstream Western society (which I’m assuming based on the context of the post).
I have an alternative, weirder approach: learn to approach accounting problems this way, with a slant toward maliciously created ones. Scatter their mismatched spreadsheets, drive them to the auditors, reduce their ill-gotten gains to ashes, see those who were complicit prosecuted alongside, and gather into your treasure hoard the great fees you charge for scouting out just what happened. This may require him to switch jobs to optimize for this, but his existing accounting experience should give him at least a little bit of a head start.
Yes, if you want to live the life of the Great Khan in this society, you have to be content to conquer and loot in very symbolic terms. Business works great; you drive your competitors out of business and take their customers and contracts. But it needs to be a competitive activity where when you win you take something from those you beat.
Obviously videogames.
It seems like you can go in two directions here. One is to accept a simulated experience. And in that case there are plenty of options, starting with Axis & Allies and going up through increasingly intricate video games.
The other is to insist on a real lived experience, even if it’s further from what Genghis actually talked about. You could get somewhere in the neighborhood by being in charge of a police vice unit that takes down drug dealers, sometimes violently, and energetically seizes their assets. You don’t get to gather into your bosom their wives and daughters, but you do get to beat them up and take their stuff.
That might depend at least a little on how corrupt you and/or your vice unit are. Could be closer to what Genghis meant than you thought.
Hmm. Bookkeeper for a construction firm.
Sudhir Venkatesh’s inside report on the economics of a drug gang, the one that put “Freakonomics” on the map, had “J.T.” running an organization of ~100 professional criminals on the basis of having obtained a college education in accounting or the like without having lost his street cred. Safe to say he probably had at least a few opportunities to crush his enemies, drive them before him, and enjoy the lamentations of their women in various un-PC ways that are now frowned upon.
And if he still hangs out with construction workers, he might have the connections to start a criminal enterprise come the next recession and construction-industry bust. The trick is finding a criminal niche where a new upstart won’t be immediately crushed by existing franchises but which the police will tolerate at least to the extent that they do low-level drug dealing. That’s going to be highly dependent on local circumstances, but for California today, maybe a protection racket aimed at the quasi-legal marijuana industry. Could even integrate that with actual construction work – pay our exorbitant rates and we’ll build your greenhouses, protect your greenhouses, and launder some of your cash through other legitimate construction projects. Hire someone else to build your greenhouse, and it likely burns down.
Rules lawyering: Is it a requirement that he achieves all this as a Genghis Khan-like top dog? That’s difficult, but as a simple soldier/mercenary it’s probably feasible, if risky. Pick a force that’s likely to win in some conflict, is willing to take a 35-year-old recruit (perhaps one for whom our man’s Western connections or other attributes are valuable), and is sufficiently prone to destruction and atrocities such as war rape. You won’t get all the women of the enemy, of course, but you may get a few.
No. That’s probably beyond this man. The point here is for your client to live up to his ideals as best he can, and his best is going to be pretty darn far from what Genghis did.
Maybe get a job in a fertility clinic and “accidentally” cause all the clients to get your offspring instead of theirs. Not so much conquest, but you can be a huge genetic success like Genghis (albeit not on quite so grand a scale).
I think some crime drama show had an episode about this. Law and Order maybe?
This actually happened for real in The Netherlands. The speculation is that the (ex-)CEO has hundreds of children, although it’s hard to know since he made sperm cocktails from multiple donors, including himself. He also sent sperm abroad. So you need lots of DNA tests to figure out how many people he fathered, in multiple countries.
I think there was a case like this in the US that came out a few years ago, too. Probably a lot of cases–it can’t be all that rare for a fertility doctor or employee at a sperm bank to decide he’d like to reach for the gold ring in terms of genetic success.
For the Dutch guy, ideology seemed to have played a role as well. He really, really wanted women to be able to have children when they wanted them. He was eager to violate all kinds of ethical norms to do so.
To pick up a competitive sport as a hobby. Kick boxing for example, or mixed martial arts. And go for a gym that favors frequent sparring.
If he really wants to apply it to all the facets of his life… give him The Gervais Principle to read and send him to climb the corporate ladder.
Don’t mess up the order.
It seems to me that the obvious solution is to beat down your client, (stopping short of killing him only because he’s still your client) then burn down his house, take his wife into your bosom, and adopt his children.
Either you’ve given him a concrete example of how to accomplish his goals for him to emulate, or you’ve successfully dissuaded him from pursuing a potentially ruinous goal. Either way, you’ve achieved your objective.
Some people on the very libertarian side of the immigration argument believe there should be no immigration restrictions at all. Anyone who wants to move in, and doesn’t cause trouble, should be allowed to do so.
Does any modern nation run things this way?
The closest seems to be Georgia.
https://en.wikipedia.org/wiki/Visa_policy_of_Georgia
It doesn’t allow just anyone, but if you’re allowed, there’s no 90-day restriction. You’d need to make a visa run every year, but that’s it. The incomers have no claim on social and medical insurance.
I mean, the U.S. states do with regard to each other, though you can argue they only get away with that because of the U.S.’s external borders.
This sounds culture-warry.
Disagree; ‘does anybody actually do this?’ is a different sort of question than ‘is this the right thing to do?’.
EDIT: But yes, anybody answering the second rather than the first would in fact be waging the Culture War.
One of the most useful features of culture-war free threads is precisely that you can discuss culture war adjacent topics without having them descend into the actual culture war territory.
It was certainly common historically.
I lived in Mexico for a year and a half. The immigration process required that I buy a work visa that cost ~20 dollars. If you didn’t have it when you left the country after a sufficiently long stay, you had to pay a fine of ~40 dollars. No one else ever asked for it. There is a longer-term visa you’re supposed to buy if you stay longer; it costs ~150 bucks. Mexico also has some strong restrictions on non-citizens buying property. Now, I was working for businesses that paid me in cash, I didn’t have a bank account, and I didn’t pay taxes, but I knew and worked with people who had more formal arrangements. None of them had any serious complaints about the process beyond the general level of lethargy and ineptitude expected of Mexican government, and the whole thing was very straightforward.
The work permit does come with some other restrictions. Certain kinds of work are not eligible for work permits. They typically don’t like manual laborers and non-management employees, and can have issues with technical roles that could be filled by a Mexican employee. Also, the government can choose to deny permits on a pretty arbitrary basis (they don’t very often, but corrupt countries can make that a pain).
At base, it’s pretty easy and (for an American) cheap to get. YMMV depending on circumstances.
Of course the same libertarians don’t support welfare, the lack of which would cut down 95% of immigration anyway. (Number pulled right out of my ass. I just think that today, in most developed nations, you can get near-minimum-wage-level welfare by playing the system, so why flip burgers?)
I don’t think illegal immigrants can get much in the way of welfare anywhere, and my impression is that most illegal immigrants in the US are economic migrants–they’re here because the economy in El Salvador sucked and they want work. So I don’t think open borders + no welfare causes a decrease in immigration.
Svalbard? https://en.wikipedia.org/wiki/Visa_policy_of_Svalbard
Many threads ago someone asked me to report on a book about medieval guilds I was reading, and a few threads ago someone (I think it was @Nick) asked for my take on labor and trade unions, but from the responses to that I decided “I guess it’s just me interested” (or something close to that). More recently, @cassander wrote something that sparked my writing out my take on the subject, which will touch a bit on some old history and American leftist political parties:
While often a slavery-based economy, the Roman Republic and Empire had some wage-paid labor, and in time occupation-based social clubs called by many names such as collegium (college) and corporatio (corporation). It’s a bit murky, but apparently the Roman state granted a few privileges (such as not being regarded as a criminal conspiracy) and imposed some responsibilities, such as making the carpenters’ college have fire-fighting duties (on the theory that they understood buildings). Bits and pieces on the collegiums got into Roman documents, including some laws.
Despite some claims by masons, most historians don’t think that any colleges/corporations survived the fall of the western Roman empire, though they did linger in Byzantium. But as urban civilization and a money economy re-developed in western Europe and expanded to northern Europe, new artisan and merchant trade associations developed, called by various names including companies, corporations, gilds, and guilds (scholars’ guilds came to be called colleges after the Roman social clubs).
A shopkeeper would employ an apprentice learning a trade, who would often be a relative and, if not, have a status somewhere between a foster child and a servant. On the theory that teenagers will listen to authorities who aren’t their parents better than to their own parents, the “masters” would effectively swap children, and in time parents would pay to have their children apprenticed to others and town governments would pay to have orphans apprenticed (sometimes “orphans” meant children of destitute parents, not unlike our modern foster care system), the orphans usually starting younger than the non-orphans but both ending their apprenticeships around their early 20’s.
Just as they did their own children, masters could and did beat their apprentices, but there were limits: in one 16th-century incident in the City of Exeter, an apprentice who had been severely beaten ran away, and the beating was judged so severe that the master was put in stocks with the apprentice near, shirtless to show the scars of what the master had done. Elaborate rules were created, including how apprentices were to be treated by the master’s wife, and in time women also became apprentices, usually in separate trades such as sewing, but widows and daughters would sometimes join the same trades as men depending on the trade and location; there are records of daughters becoming smiths, but not masons (sort of like rural workers: women mostly didn’t plough and men mostly didn’t milk cows, but at harvest time every hand was needed to cut wheat).
In time the guilds became the city government in many late medieval towns with Mayors being selected by “the free men of the city” (guild masters).
In England it became custom for the nobility as well as artisans and merchants, sometimes even rural folks, to “go into service” when young in other people’s households, so just as a smith or fletcher in training would be apprenticed to another family, so would the nobility be “pages”, “ladies in waiting”, and “squires” (there are accounts of Italians finding the practice strange).
Elaborate rules were also created to protect the reputations of the guilds: quality-control rules, and rules about how to qualify as a master allowed to set up shop, a “masterpiece” demonstrating one’s work often being required (to complete my union plumbing apprenticeship I had to build a “rough-in” of a bathroom to “turn out” after 9,000 hours of work and five years of night classes).
In time a new status developed besides apprentice and master: the journeymen, who didn’t employ apprentices themselves and didn’t stay with one master.
In Britain in the 16th century land enclosures became more common: what had once been common land increasingly became private property, and rural serfs were forced off land their families had worked and lived on for generations. This process continued up till the 19th century with more and more land enclosures (the 18th-century Scottish “Highland clearances” were said to be particularly brutal), and often fields became pastures for sheep, as wool was more profitable. I’ve read this described as “The original sin of capitalism”.
Wages were relatively high in late Tudor England, but in the 16th century opportunities to achieve “master” status were diminished, and there are letters and screeds from that time bemoaning the “scandal” of apprentices and journeymen acting married instead of waiting to become masters first, which used to happen in their late 20’s or 30’s.
Wages dropped from Elizabethan highs in the Restoration era and early 18th century, and increasingly the guild system broke down, so that by the 19th century being “in service” wasn’t a matter of what age you were but of what class you were.
The cities became more crowded, with the infamous “slums”.
Meanwhile, in the British North American colonies of the 17th century, the younger sons of the nobility in the “Tidewater” region tried to make themselves feudal lords, so they needed labor, and demand grew for “indentured servants” (I had to sign an agreement to be an “indentured apprentice” for a few years in order to be trained as a plumber in the late 20th and early 21st century); to be indentured is a contract pledging to labor for someone, to learn a trade, for ship’s passage, or as an alternative to being hanged for committing a crime. Soon demand for labor outstripped supply, as word got back to Britain about the conditions in the “new world”, and being what became called “shanghaied” when used to crew ships became more common, because there was something in the Tidewater region.
Malaria.
Escaping to inland and starting homesteads became more common among the indentured, and the “planters” found a new source of labor….
….which is a whole other tale.
Further north away from the planters homesteads and then cities developed and then…AW just read Albion’s Seed already.
Suffice it to say cities developed. Benjamin Franklin was famously a printer’s apprentice, and there was enough of a residual of the guild system that an application to start a coach company in the city of Boston in the early 19th century was rejected because other coach company owners objected that the applicants had never been teamsters themselves, so not “in the craft”. But that wasn’t to last, and as the last vestiges of the guild system dissolved (with a few remnants such as the City of London “livery companies”, which are charity organisations and social clubs that sometimes sponsor trade schools in deference to their origins) and laws were rewritten, a new type of organization, the “trade union”, appears (really, Parliament abolished guilds and within a decade trade unions appear; I have a 19th-century book right now from inter-library loan that explicitly states that the end of the guilds created the necessity of trade unions: On The History And Development of Gilds And The Origin of Trade-Unions by Lujo Brentano of Aschaffenburg, Bavaria, MDCCCLXX (1870)).
The first trade union in the United States is supposed to have been a group of shoemakers in Boston, and after the Civil War, with increased industrialization, employees go on strike and the movement spreads despite brutal reprisals; the 1870’s and the 1910’s are particularly bloody, but labor violence continues at least into the 1970’s in “coal country”.
The first plumbers union forms in New York City under the Knights of Labor umbrella, and the “International Association” of Plumbers (’cause Canada) is formed with a number of co-op shops (especially in Chicago); Steam Fitters and Gas Fitters are brought into the fold and…
….in the “Panics” (as economic recessions were then called) of the 1880’s they go under.
Dropping the co-op shops, the remaining viable union locals form a new “international” (’cause Canada) Plumbers union – The United Association (which I’m a member of) – as part of the new American Federation of Labor led by Samuel Gompers in the early 1890’s.
Gompers advocates “plain and simple unionism”, free of entanglements with socialists, syndicalists, and anarchists, which just concentrates on wages and working conditions for its limited membership of skilled workers, in a concept called “craft unionism” (Gompers’ skilled trade was making cigars).
“Unskilled” labor isn’t invited (it didn’t work out quite like that, but that was the idea).
In my union it’s clear that the idea is to almost be like the old guilds: there are indentured apprentices and journeyworkers, you have to demonstrate a certain amount of skill to become a journeyman, and a certain quality of work is expected.
In contrast there’s “industrial unionism” in which all employees of an industry are to be organized and included.
A new umbrella union was created in the early 20th century, “The Industrial Workers of the World”, and they try to “organize the unorganizable”: itinerant loggers and field hands, immigrants, everybody, and their explicit end game isn’t better wages, it’s a great “general strike” leading to a syndicalist society.
They are jailed, killed, and exiled, and are chiefly remembered for their militancy and “the little red songbook” of “union hymns”.
After a peak during the First World War, and a 1919 strike wave union density plummeted in the 1920’s, and then came the Great Depression.
The “Panics” of the 19th century and the recession of 1921 offer some preview, but this economic downturn is deep, especially compared to the prosperity of the ’20’s, and radical movements rise.
After three years of Depression, a campground of desperate thousands in Washington D.C. forms a “Bonus Army” around a nucleus of First World War veterans and their families asking for a promised pension to be paid early, and they are dispersed by the U.S. Army (including future WW2 generals Eisenhower and MacArthur) in what is described as “Today’s soldiers fighting yesterday’s”.
Soon a new President is elected who signs “The National Recovery Act”, which aims to create a sort of government/business price-fixing “corporatist” scheme and causes a lot of Blue Eagle signs to go up with “We Do Our Part” written on them. It is soon overruled by the courts but has a lingering effect: there’s a line in the act that seems to allow easier union organizing, a breaking point has been reached, and in 1934 three cities are in “general strikes” involving street violence: Minneapolis, San Francisco, and Toledo.
The Minneapolis “Teamster Rebellion” is organized with the help of Farrell Dobbs, a member of the “Communist League”, a leftist group headed by James Cannon loyal to an exiled Leon Trotsky who has not yet met an icepick; the San Francisco “Big Strike” is led by Harry Bridges, an Australian-born former sailor turned longshoreman who was long suspected (but never proven in multiple deportation attempts) to be a CPUSA member; whether he was or not, the Stalinists later publish a book celebrating the strike.
In Toledo, Ohio, A.J. Muste and his “American Workers Party” help organize a particularly brutal strike.
All three strikes stop almost all work for a few days in each city, and the San Francisco strike closes almost all U.S. ports on the west coast.
In 1935 the Wagner Act passes, which makes organizing unions a legal right; soon sit-down strikes close down auto plants and the United Auto Workers are organized under the auspices of the “committee of industrial organization”, a faction within the A.F. of L. whose beginning is usually marked by John Lewis, the head of the United Mine Workers, punching the president of the Carpenters Union (a proponent of craft unionism) at the 1935 A.F. of L. convention.
Local governments as well as employers resist the organization drive, but President Roosevelt proves relatively friendly, as does Secretary of Labor Frances Perkins, who as a young lady saw employees at the Triangle Shirtwaist factory dive out of windows to their deaths because they were locked in during a fire.
Perkins is celebrated for, among other things, coming to a town where local law enforcement tried to break up a union meeting because it was against a local law; Perkins is supposed to have seen an American flag flying over a post office and said “Meet there”.
By 1938 the C.I.O. is expelled from the A.F.L., renames itself “The Congress of Industrial Organizations”, and the two labor federations are rivals; the west coast longshoremen break from the I.L.A. and become the “International Longshoremen’s and Warehousemen’s Union”, the very communist-led U.E. (United Electrical Workers) is created as a rival to the AFL International Brotherhood of Electrical Workers, et cetera.
By 1941 even the Ford Motor Company, which had stockpiled more munitions than the U.S. Army had at the time in case of strikes, is unionized.
Then comes the war.
In the interest of “labor peace” the federal government encourages unionization of the munitions plants and merchant ships, and except for a few like John L. Lewis’ ever-fighting mine workers a “no strike” pledge is honored, and as a 90-something old Trotskyist told me over a decade ago, “Come the war the Stalinists suddenly became super patriots”, as former rebels suddenly urge more production now that the U.S.A. is on the Soviets’ side, causing a bit of bad blood with some of their fellow workers.
In 1946, after the war’s end, the greatest wave of strikes the U.S.A.’s ever had occurs, among which is the Oakland General Strike, which starts when a street car driver stops his trolley when he sees women picketing a department store for better wages; traffic is snarled and soon all trades are off work in support of the department store clerks. But this strike doesn’t have the character of the bloody ’34 (or ’19) strikes; instead a holiday atmosphere pervades: the strike committee says “no hard liquor”, so instead beer is passed around and the whole thing becomes almost a party. When you look at photos of earlier strikes you see fights, but in looking at photos of the ’46 general strike (the last all-city general strike in the U.S.A.) you see smiling faces.
After over a decade out of power the Republicans control congress, and in 1947 they pass the Taft-Hartley Act, which makes secondary boycotts (like those that occurred in the ’46 strike) illegal, as was made clear in the Sailors’ Union of the Pacific/Moore Dry Dock decision of 1950, and the DeBartolo decision of 1983.
Not known at the time, but Taft-Hartley takes the wind out of the sails of the U.S. Labor movement.
On momentum, and an expanding post-war economy, union density grows until 1954, when the president of the A.F.L., plumber George Meany, has the diminished C.I.O. merge again with the A.F.L. (you can tell the real old-timers because they always say A.F. of L. instead of AFL-CIO). Previously, after Taft-Hartley’s new requirements, the C.I.O. had purged itself of ten “communist” unions; in the 21st century only the I.L.W.U., which rejoined the federation decades later, and the still-independent U.E. survive.
Also in the late ’40’s, “Operation Dixie”, an effort to expand the successful organizing campaigns of the industrial north into the areas that had been the Confederacy, is an abysmal failure.
Decades after the purge, Father Charles Rice, a leader of The Association of Catholic Trade Unionists (who in the 1930’s was a labor organizer and turned anti-communist activist in the ’40’s upon learning of what the Soviets were doing in their newly occupied areas), apologized to the leaders of the U.E. for having opposed their union, and they apologized to him for not realizing before 1956, when the tanks rolled into Hungary, what the Soviets really represented.
Come the 1960’s, a campaign with a poster of an elderly woman in poverty asking “How’s your old teacher?” is successful and teachers get unions; in time public employees become the largest share of union members as the numbers in the private sector dwindle back down to the levels of the 1920’s.
The last big organizing drive (if you don’t include the smaller “Justice for Janitors” campaign of the 1990’s) is the ’60’s and ’70’s farm workers organizing campaign, which is effective because it convinces enough of the public to boycott grapes. The success is short-lived: while largely Spanish-speaking and of Mexican descent themselves, the farmworkers of the U.F.W. in the 1970’s are often citizens, and soon they are swamped by further immigration.
The 1981 air traffic controllers strike marks a watershed moment, with many pointing to the mass firing of the air traffic controllers without pushback from the rest of the unions as the last chance for labor unions to stop their decline, which goes from slow and steady to very fast.
Factories close, move to Dixie and overseas.
Jobs are automated.
After the 1982 recession it’s clear that men will take lower wages than their fathers did, and other than the lingering traditional coal country “Harlan County war” little fight is left.
Then mines close.
I.L.W.U. stevedore jobs still have relatively high wages, but despite much more cargo coming from across the Pacific, with containerization and automation far fewer men are needed to unload the ships.
The old building trades craft unions still exist in much the same form as they were in the 1920’s or even the 1890’s, diminished from their 1950’s peak but still holding on.
While not exactly new, a “craft” union emerges in the California Nurses Association (later National Nurses United), a bright spot for U.S.A. Labor unions in the 1990’s and 21st century.
Those RN’s are tough!
In writing this tonight (and now this morning!) I’ve consulted two pages from two different books, and three Wikipedia pages (to get some years right, which were 1934, 1950, and 1983, and one first name right, which was Charles); the rest of the info is largely stream of consciousness and from memory, so apologies for my mistakes, oversimplifications, et cetera. I’m a plumber, not an academic! (Hopefully @DavidFriedman, who is an erudite professor, will chime in with some correct facts or at least his take.)
Wow, this took me far longer to write than I thought it would!
Thanks for reading this far, please give your take.
Plumber:
Thanks for the effort-post! I didn’t know most of that history, and found it interesting.
Seconded. It’s always cool to see something like this.
@albatross11 and @bean,
Thanks guys that means a lot to me, I was afraid that the reaction would be either silence or my being slammed for being too anti-Marxist or not Marxist enough!
Nice précis of the trade union movement and its origins, thanks Plumber!
Yeah, that was the real war in 80s Britain when Thatcher took on the miners’ unions and won. There’s to-ing and fro-ing over whether the mines needed to be closed, but the coal industry as it was did need overhauling. Thatcher took this as the chance to break union power as a whole, and the bitter aftermath was devastation in the North of England, where towns were dependent on the mines as the main/sole employers, and the feeling that there was the deliberate creation of a two-tier society where the south of England, and especially London, was seen as important while the rest of the country could go hang. Still a lot of resentment lingering even today. There’s probably no easy way to restructure an entire industry in that manner, but putting all your eggs into the stock market and financial sector basket and encouraging the public to get rich by investing, while there were no alternatives for the closed mines, did seem to be very pointed in serving one section of the people over everyone else.
Thanks @Deiseach, very interested in what was going on on the other side of the Atlantic.
Obvious differences but also obvious parallels between Reagan/USA and Thatcher/UK at the time.
I remember reading some old book by Bertrand Russell in which he contrasted how much more violent U.S. labor struggles were than Britain’s (of course he was only including the island, not the Empire), but from television in the 1980’s it seemed worse in the U.K. at the time, though in some sense the cocaine market-share “turf” battles of the ’80’s could almost be considered a “labor struggle”.
Also you taught me a new word!
from television in the 1980’s it seemed worse in the U.K. at the time
Prior to Margaret Thatcher and the Conservatives forming a government, there had been the Labour government of James Callaghan which, at its end, was hit by a series of public strikes – the famous Winter of Discontent. The British economy was weak, the Labour government broke agreements on pay rises in order to try and keep the public finances under control, but some private unions were able to negotiate separately and got higher rises, so the public unions pretty much all came out.
Public discontent with the state of affairs resulted in the Conservative election victory, and as Prime Minister Thatcher had two things going on: one, she was being advised that the coal industry was more or less dead on its feet and had to be rationalised, which would mean shutting down a lot of the pits, importing cheaper coal from abroad (yeah, it was apparently cheaper to buy it in than mine it at home, I have no idea why) and trying to diversify into other sources for providing power like oil, natural gas, and nuclear (because a miners’ strike earlier in the 70s had hit energy generation hard due to the reliance on coal, which had a bad effect on industry); and two, she wanted to break the power of the unions so that something like the Winter of Discontent couldn’t happen again.
By taking on the miners’ union, which was the strongest and most militant, and its leadership, she would kill two birds with one stone. And going into her second period of government, after winning a second election, when the economy was beginning to improve and there was the start of a boom, she had the impetus to do that. Things did get bad, and what you saw on the television about clashes with police was real, but she won and the power of the unions was much decreased.
That the northern industrial towns and cities which had been the powerhouses of the Industrial Revolution lost out when the Stocks and Investments Revolution took off down south was a side-effect, and it echoed or foreshadowed the same decline in the American Rust Belt states – good blue-collar manual labour jobs gone never to come back, nothing coming in to replace them, and the economy centring on new skills and locations.
Nothing like Blair Mountain, with thousands of armed men on both sides exchanging fire and tens to hundreds killed, ever happened in Britain (and this is not because of gun control, which didn’t exist in any form in the UK until 1920).
At the time Russell was probably writing, the most infamous case of labour-related violence in recent British history was the 1910 Tonypandy Riots in which troops were deployed on the orders of then-Home Secretary Winston Churchill, but never fired a shot. Exactly one striking miner died, of head injuries from a police truncheon. These events have been mythologised to the extent that many people now believe that the troops fired into crowds and large numbers of miners were shot.
More people died in the 1911 Llanelli riots, but these are talked about less for various reasons. The 1919 ”Battle of George Square” in Glasgow was again a riot in which nobody died, and the army was sent in to restore order. A lot of myth has grown up around this as well, including that tanks were used (they were sent to the city but never left the railway station) and that all the troops deployed were English out of fear that Scots would support the rioters, while the troops based in Glasgow were confined to their barracks (simply untrue).
The 1926 General Strike AFAIK involved little or no violence, though the military was deployed – including two battleships sent to Liverpool to deliver food supplies.
Do you happen to know which battleships? Google doesn’t turn up anything, and I don’t have the right books to hand.
@bean I thought you might ask. Barham and Ramillies.
Thanks. I’ll have to see if I can find more details when I get home.
>yeah, it was apparently cheaper to buy it in than mine it at home, I have no idea why
Maybe they’d already mined all the easy-to-reach stuff in the UK way back in William Blake’s day.
Pwl Mawr, for example, has old gallery mines (i.e. dug horizontally into the hillside) superseded by proper underground gubbins.
Mines will always eventually hit an uneconomic point and close. Remember, at the end of the day the mine is a big machine for lifting the resource from great depths; eventually the costs of doing this will outweigh the value of the minerals. (ETA: The cost of removing water increases with depth as well–Homestake Mine was being looked at for a deep laboratory, and it was costing almost $250,000 per month to dewater) Most mines don’t “play out” because they run out of ore.
I grew up and went to college in Michigan’s Upper Peninsula, which at one point was producing nearly 25% of all copper in the world. Now that area doesn’t even have an Interstate, and since 1980 has been in an economic decline similar to what the rest of the country experienced in 2008. The last mines were almost 9000′ deep. Imagine the cost of hauling thousands of tons, only 2% or so copper, almost two miles up before you can even consider extracting it from the waste rock.
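The hoisting arithmetic above can be sketched as a back-of-the-envelope calculation. This uses the ~2% grade and ~9000′ depth from the comment; the energy figure is my own illustrative estimate of the frictionless physical minimum for the lift alone (real hoisting, dewatering, and processing cost far more), not a sourced number:

```python
# Rough sketch: rock hoisted and minimum lifting energy per ton of copper,
# using the ~2% ore grade and ~9000-foot depth mentioned above.

GRADE = 0.02                 # fraction of the hoisted rock that is copper
DEPTH_M = 9000 * 0.3048      # ~9000 feet converted to meters
G = 9.81                     # gravitational acceleration, m/s^2

# Tons of ore that must come up the shaft to yield one ton of copper
ore_per_ton_copper = 1 / GRADE

# Frictionless minimum energy (m * g * h) to lift that ore, in kWh
ore_kg = ore_per_ton_copper * 1000
energy_kwh = (ore_kg * G * DEPTH_M) / 3.6e6  # joules -> kilowatt-hours

print(f"{ore_per_ton_copper:.0f} tons of ore hoisted per ton of copper")
print(f"~{energy_kwh:.0f} kWh just to lift it, before any processing")
```

Even this idealized floor — fifty tons of rock and hundreds of kilowatt-hours per ton of metal — shows why depth alone can make a mine uneconomic long before the ore runs out.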
(To keep at least somewhat to the thread OP, this area was the location of the Italian Hall disaster, where somebody shouted fire in a crowded room)
It’s also interesting to note that apparently these mines were not profitable while they were attempting to mine the large nuggets of native (pure) copper found underground. They couldn’t effectively break the pure metal up with explosives, so had to laboriously hand-excavate the 100% copper. When they found veins that were only a couple of percent copper, it was easy to break this up with explosives and skip it to the surface for further processing, and this made the enterprises profitable.
@CatCube & Deiseach
In addition to what catcube says, I can’t imagine that the mines operated by the heart of the labor movement in a country that was still officially dedicated to full employment over economic growth were the most efficient operations in the world. After all, these were the same people and policies that brought us British Leyland.
If foreign coal was cheaper than domestic, why didn’t the domestic mines shut down on their own? Was the government propping them up before Thatcher?
@AlphaGamma
Are you sure about Barham? R. A. Burt indicates that she was in the Med at the time. Ramillies was in the Atlantic Fleet in 1926, but he also doesn’t mention anything about it in her entry, which is at least a little bit weird. There’s the claim in Ramillies’s wiki article, and in a few places on the internet, but none of them seem to go back to anything I’d be willing to trust against Burt. (Which is a fairly high bar for me, to be sure.)
More than propping them up: the mines were owned and operated by the government. The Attlee Government nationalized the British coal industry in 1946, and it wasn’t privatized again until 1994.
@bean: I can’t find anything particularly reliable, certainly about Barham. There are photos of Ramillies in Liverpool at about the right time, which claim to be from during the strike (in various histories of the ship). Some have captions stating that Barham was also there.
I also can’t find the original source of this piece of information, so perhaps one battleship turned into two!
Effortposts always do; hence the name. But they are worth it, and I am still digesting this one.
One thought: If entities like your Plumber’s Union are either deliberately or coincidentally trying to duplicate the old Guild system without being directly derived from it – and if the Guild system itself traces back to the Roman Empire and survived a Dark Age, that suggests an enduring need that will be met even after the present institutions fail. So I wonder what the next incarnation will look like?
Thinking this over more, I see a distinction in unions between those that actually function like the old guilds (specifically in the matter of training and credentialing) and those that don’t. And while I’m generally anti-Union, I think the former makes a lot more sense as an institution. Based on what you’ve said, the union provided most of your training and was the organization that verified you had the skills to be a master plumber. That makes a lot of sense, even if it’s a very different model from the one we usually use today.
But it’s one that fits best with a certain model of work that’s pretty rare today. If I need a plumber once in a while, makes sense to hire union for quality if I don’t feel competent to judge an individual plumber. If I’m a building contractor who has steady work for a crew (or any other institution that has steady work for a plumber) I can train my own plumbers or do my own quality screening.
And it makes even less sense when extended into other industries. Take a teacher’s union. They don’t provide training. That’s paid for out of the prospective teacher’s pocket. They don’t evaluate prospective teachers, or provide any sort of quality guarantee. If anything, they probably reduce the average quality of teachers by preventing the bad ones from being replaced. And that ignores the obvious conflict of interest of a large lobbying group for public employees when it’s all being paid for out of someone else’s pocket.
This is somewhat informed by my brief experience as a union member. I was in the IAM for 2 months during a summer job in college. The only thing they did for me besides taking a cut of my paycheck was the time the shop steward threw a fit that kept me sitting around for an hour. It was a composites factory, and I usually did a lot of the grunt work of moving things about and other mostly-unskilled labor. But that day, we’d run out of things for me to do, and the cell lead wanted me to do some layup work (actually building the parts). The shop steward insisted that I wasn’t allowed to, and so I sat around while he argued with management. He seemed to think they were bringing us in to take their jobs. Never mind that the factory was moving to Mexico because it cost too much, or that I had a year to go until I had an aerospace engineering degree, which meant that staying around at $11/hr didn’t hold much appeal. When the word finally came down that I was a full IAM member, and that he couldn’t stop me from doing layup work, I managed to not be smug about it, but the whole thing left me rather cynical about unions as a whole.
Are there that many big enough building companies that provide steady jobs and could afford training?
My understanding of building is that most companies hire on a per-project basis instead of a permanent basis. Building is one of those industries with a lot of itinerant workers and temporary jobs. A lot of workers also work as one-man shops (plumbers, electricians, roofers), so a quality check is probably a good idea.
No clue, although I cut the parenthetical admitting that I don’t know such things. A better example might be someone who needs to employ a full-time plumber to fix their existing buildings. (Which is what Plumber actually does.) In any case, I absolutely grant that the building trade is a case where unions make much more sense than, say, education.
The craft union model seems like it’s an alternative to the credentialing function of universities. Though at the high end, you seem to get back to a kind of apprentice/journeyman/master model. Think of doctors going through residency, sometimes a fellowship, and then finally being allowed to practice medicine. Or researchers doing a PhD and maybe a postdoc.
@bean
Very apt: after ten years of working construction (which is what 9/10ths of the classes were directed towards, and 99/100ths of the apprentice work), 7/10ths of what I’ve done these last seven years had to be learned new, and since much of it is jail- and autopsy-room-specific, doing the job is the way to learn most of the job.
As it is, the city requires eight years of experience to apply; ideal would be someone with both large-scale construction experience and household repair experience, which is a rare combination (since unless they learned at a different county’s jail, no one will know that already!)
Typically the only guys who get into the city are union construction and self-employed repair guys, as non-union employers won’t give the references (I assume, as I’ve never seen them).
The only plumbing apprentices the city employs and trains itself are “utility plumbers” which is for large water mains in the street.
As far as the quality of guys: I’ve never worked with non-union trained guys in a non-union setting, but I have worked with them when they got union jobs, and compared to those of us who went through the union apprenticeship, I’d say the guys who learned the trade non-union are more extreme in their skills than us. A little over half are worse at first (about half of those typically aren’t kept), about a quarter are roughly on par, with somewhat different strengths and weaknesses than a typical union-trained guy, and the rest are better than all but a few guys who’ve only worked union.
Typically the best of the guys who learned the trade non-union have owned their own company and got tired of the paperwork or need health insurance, often they’re made foreman.
The very best plumbers I’ve encountered have been union trained but a higher percentage of the non-union trained guys have gotten close to that level.
Where the union apprenticeship seems to do better than the non-union (unless a lot of those guys were just lying about their experience) is getting guys up to a minimum skill level, but fewer are exceptionally good.
From my perspective, the odds are better of getting a good plumber if you hire union, and the odds are better of getting a really bad or really good plumber if you hire non-union (actually ex-non-union), with the caveat that the best I’ve seen have been union-trained, but I’ve simply worked alongside more union-trained plumbers, so that may just be probability.
Truthfully though, for most household jobs you just want someone who won’t make things worse.
What distinguishes the very best plumbers from merely good ones?
@Chalid
Most plumbers judge each other on speed (which is typically all an absentee owner cares about), but also aesthetics (is it plumb, level, and true?), especially of how things look behind a wall where the customer can’t see, and whether the piping will hold more pressure than the code requires. I have heard foremen yell “If you don’t have any leaks, that means you’re working too slow!”, but for the most part, as long as a minimum speed is met, quality of workmanship is esteemed more, much of which is judged by the adage “If it looks right, it is right”.
Breadth of knowledge is valued, as is a sort of practiced imagination: “If we put a tee and union here, we can still be within code, bleed off the water getting past the shutoff valve, and solder the line”.
Old skills tend to be admired more, knowing how to set a lead and oakum joint is usually more impressive than knowing the new crimp techniques, except when speed of production is critical (you only have so much time before the tenants wake up and start using the drains).
I am not certain that doing your own training and quality screening is a good fit for skilled long-term employees. The guild/union system is a pretty common model for professionals. For example, I’m an actuary. The Society of Actuaries sets the syllabus, administers the exams, and issues the credentials. The substitute doesn’t seem to be credentialing by employers, but credentialing by educational institutions.
On the flip side, our machinists learn almost entirely on the job, and I don’t believe there’s a better way to train them…
That’s a decent point, although the obvious difference between an industry association and a guild/union is that the latter attempts to be a negotiating body with employers in a way that most professional groups don’t. When I was talking about training and screening, I was thinking of things like assembly line work, where there’s a lot less skill.
Does the government require that some things be done by credentialed actuaries?
There are a few roles that are legally required to be filled by an actuary, but those are a very small portion of what actuaries do. For example, my employer has 3 people who fill roles that are required to be an actuary (Appointed Actuary, 2 Illustration Actuaries) but employs about 100 actuaries.
If you don’t mind a giant inflatable rat in front of your project.
A couple of points.
My uncle Aaron, later a U of C professor and one of the founders of economic analysis of law, told me he was at one point a member of the IWW. He explained that he joined for the fringe benefits.
My understanding of the eventual situation in the coal industry was that the union and companies cooperated in cartelizing the industry. If a mine produced too much the union would shut it down, and the mine workers and mine owners shared the profits from the higher price.
On the subject of the Scottish Clearances there is a recent book by TM Devine (I have not read it) that has been highly praised. Here is a summary:
TM Devine says in his conclusion how writers, from Alexander Mackenzie in his 1886 “History of the Highland Clearances” to John Prebble in “The Highland Clearances” have “opted for the single explanation of human wickedness” – a famed warrior race betrayed by its leaders whose greed and lust for riches led to empty glens populated by sheep. The truth, as explained in this outstanding book, is infinitely more complicated. Surprising facts: Highland populations continued to rise during the age of the Clearances, landlords strenuously opposed emigration in the late 18th-early 19th centuries, and from the end of the Seven Years War onwards emigration for many was a positive choice. This book is a powerful social and agrarian history of Scotland from the seventeenth century, and as the title suggests is not limited to the Highlands. Indeed, the central third of the book gives a detailed account of the clearances in the Lowlands and Borders which have been little examined by historians.
The first section deals with the “Long death of clanship” from James VI/I onwards. The odds were stacked against the Highlands agriculturally – with only 9% in cultivation and good only for raising black cattle too valuable for the people to eat, the poverty of the region was one reason it took the Scottish state so long to gain control over it – it wasn’t worth it. Clan-based society was undermined by acquisition, crown charters and intermarriage, while more and more clan gentry pursued expensive lifestyles in the capital which the incomes from their poverty-stricken estates couldn’t support. The Napoleonic Wars provided some relief with the demands they created for beef, men and kelp (for chemicals). By the mid-19th century two thirds of Highland estates had changed hands following the bankruptcies of their traditional owners. Edinburgh lawyers acting for the new owners were unsympathetic to the plight of their tenants.
Two generations before the Highland Clearances, the Lowlands underwent a clearance which resulted in the disappearance within a few decades of an entire social class, the cottars. But this went with a rapid expansion of towns and villages, and new economic activities which meant leaving the land was a positive choice. Between the 1871 and 1911 censuses the trickle leaving the Lowlands countryside became a flood, “caused not by destitution but by the lure of opportunity”. Protest against enclosures took place in Galloway, but most protest was around religious, not agricultural, matters.
The background to the clearances in the Highlands was a rapid and sustained population increase, a fact ignored by Prebble and his ilk. And, unlike in the Lowlands, the population stayed put – they moved to overcrowded, tiny holdings and to the coast, where they were expected to take up fishing, kelp gathering and whisky distilling. The potato came to the rescue until the potato famine of the 1840s added a new level of misery. Crofting was a new system. The recruitment of Highland regiments (with recruitment bonuses for landlords) provided some relief in the late 18th century, but this reached its limits.
Villains – the Countess of Sutherland, of course, and her agent Patrick Sellar, “whose name lived on in infamy”. Devine explores the racial overtones of the Clearances – the view of the Gael as an inferior race, and the role of CM Trevelyan who saw mass, forced emigration, to rid the Highlands of Gaels, as the only answer. For a while the Scottish press supported the landlords, but in the final chapters he explores how the tide turned in the 1880s with the Highland Land Law Reform Association, the Napier Commission, the 1886 Crofters’ Holdings (Scotland) Act, and the vilification of the landlords in the press – and in Alexander Mackenzie’s book.
Thanks @Rusty!
I know of the clearances chiefly from the rants of my extremely anti-English, congenital-rebel father (despite [because of?] my mom having an English maiden name), and beyond “Okay, he didn’t completely make it up” I haven’t researched it much.
This was really interesting, thanks!
That’s a fascinating history. I was unaware that the words collegium and corporation had ties to old Roman history!
I can confirm, at least, that Winston Churchill’s History of the English Speaking Peoples makes some reference to the economic and social troubles you refer to with land enclosures, and that it occurred during the time of the rise of sheep-farming and wool-production in England. I can’t figure out how it related to the change of status from serf to tenant-farmer, nor the full scope of those legal and cultural changes.
Two comments about indentured servitude in the history of the United States:
1. I’ve seen one reference, online, to the history of slavery in the United States. Apparently, black-skinned Africans were bought at slave markets, but were called “indentured servants”. Most such indentured servants did not have any knowledge of the law and customs of the English speaking world, and were forced to make a mark on a piece of paper to sign up for a new period of indenture every so often. Somewhere in the first 20-30 years of this practice, local judges argued that African-descended people could not be freed from indenture the way that English-descended people could…thus, putting race-based slavery into practice in the English-speaking world.
This history ought to be better known, in my opinion. The legal framework of indentured servitude, in the English speaking world, had existed for centuries at this point. Slavery (for people not convicted of crimes) had been almost entirely non-existent for as long, or longer.
If anyone ought to be considered the Primary Villains in developing the practice of slavery in the United States, the judges who ruled that African-derived people could not be released from indenture ought to be very high on that list. Separately, the legislators who wrote laws that the condition of slavery was inherited from the mother ought also to be very high on that list.
2. Distantly related to the above point: one of my ancestors came to the American Colonies as an indentured servant. He was English-speaking, white, and carried an Anglo-derived surname. He arrived in Massachusetts Bay Colony around 1630. (Which places him among the second wave of settlers to arrive in the colony, if I remember rightly.)
He finished his term of indenture, and was listed as a freeman in local records after a certain number of years. Shortly afterwards, he relocated to a new settlement near the Connecticut river. His date of marriage is also about that time. Within two decades, he passed away, and left an estate that was recorded in probate records. (At this point, I give my personal thanks to my uncle, the family historian, who has pieced together an extensive genealogy covering the family history inside the United States…)
This is an indenture that doesn’t look related to a skilled trade, as the man in question was apparently a farmer for most of his life, after finishing his term of indenture.
This reminds me that indenture was very useful for guilds and skilled trades, but it was also used in many places where there was simply a need for labor. And that indenture, though possibly under harsh conditions, was a good thing for those who finished the term of indenture, and became freemen in the Colony.
@S_J,
That’s very interesting!
I’d read that the Africans brought over were originally thought to have the status of “indentured servants” as well, but I never learned the legal changes that created chattel slavery.
Of some interest to me: in the Appalachian mountains there were supposed to be “Greek”, “Phoenician”, or “Portuguese” communities (as attested to in Albion’s Seed) whose origins were a mystery; well after that book’s publication, mystery solved.
The case seems to be that of John Casor. Interestingly enough (ok, very interesting enough), the plaintiff in the case (i.e., the person who was ruled to be John Casor’s owner) was a free black who himself had previously been an indentured servant. https://www.smithsonianmag.com/smart-news/horrible-fate-john-casor-180962352/
On indentures, the 1777 Constitution of the Republic of Vermont bans both slavery and indentured servitude, although with exceptions to allow apprenticeships. The text has largely survived into the present-day State Constitution, saying that nobody should:
be holden by law, to serve any person as a servant, slave or apprentice, after arriving to the age of twenty-one years, unless bound by the person’s own consent, after arriving to such age, or bound by law for the payment of debts, damages, fines, costs, or the like
(In the original Constitution the relevant age was 21 for men and 18 for women- I don’t know whether this changed in 1994 when the Constitution was revised to be written in gender-neutral language, or before).
Fascinating stuff, thank you @gdanning and @AlphaGamma!
The report button is still broken, at least for me, so it’s going to be difficult to report anything.
As a Biomedical Scientist™, I have to say that gwern is talking out of his ass about codon re-mapping giving “near-perfect immunity” to bacterial infections. Viral infections, sure, but bacteria don’t care about your codons because they have their own tRNA. The reason why bacteria can infect your body is that your body is (by design) a very hospitable and nutrient-rich environment, and changing your codon mapping is only going to change that if it kills you.
Yes, in the previous thread I posted a response to Gwern’s original comment saying the same thing. It’s also been noted above in this thread.
On the other hand, immunity to viruses would be pretty nice anyway.
I haven’t read the books, but the thesis you are proposing (folks should be nicer to introverts) strikes me as true, for values of ‘should’ that are mostly ‘I desperately wish they’d’.
I definitely agree that confidence/assertiveness/extroversion is likely to increase someone’s likelihood of succeeding. I don’t have anything beyond my own experience to base this off, but I figured I’d add my drop to that bucket.
IIRC, a surgeon in the US won’t enter residency (i.e. start learning surgery) until they are 27-28 years old. (High school to 18, undergrad to 22, medical school to 26, *then* residency.)
If you want to tour the world playing the violin as a soloist, you’re probably out of luck. Anything else you might reasonably want to do, go for it.
This is a bit of a festive corporate ethics question that’s been bugging me since I listened to A Christmas Carol again yesterday:
Today is December 27th, somewhere in 19th century London. Your name is Ebenezer Scrooge and you are the last surviving partner in a counting-house, Scrooge & Marley. Just two nights ago, you were haunted by three (four, really) ghosts who taught you the true meaning of Christmas. Yesterday, you took the first steps in repenting for your sins by, among other things, not punishing your clerk for coming in to work at a normal hour and by giving him a raise.
So, now what? How do you run a successful 19th century book-keeping / accountancy firm in keeping with the spirit of Christmas?
You don’t. The missing part of the story is that Scrooge ends up going bankrupt and in debtor’s prison. Since he is estranged from his family, no one is willing to pay for him, so he lives there for the rest of his life, until the rats finally kill him.
The End.
😛
Wait, why wouldn’t the modern corporate tactics work in the 19th century? That is, you work your employees like slaves 364 days a year; but once a year, you throw a lavish Christmas party and invite all the little people as well as their families. The net cost will be a negligible fraction of your profits, and the morale boost will last the whole year.
While that may be in the spirit of Christmas as commonly celebrated, it seems to not be in the spirit of A Christmas Carol.
I dunno, it seems to pretty aptly describe Mr. Fezziwig and he’s held up as a role model for Scrooge.
I’m not sure they work all that well in the 21st century. Lavish they might (sometimes) be, but I’ve never been to a company Christmas party that didn’t bore me to tears.
I’m just a bit nicer to Cratchit and call it a day. Scrooge & Marley seemed to have pretty decent profits (and Scrooge himself was such a miser that he wasn’t enjoying them anyway), so it probably wasn’t such an efficient market that any increase in costs would sink the firm.
Biggest problem is going to be finding a new partner who will both be competent and buy into this plan.
I’d argue that priority 1 should be keeping people out of debtor’s prison. Priority 2 is probably giving people money for things like Tiny Tim’s operation. I think you may be able to get there by using some sort of credit score/acceptable risk of default system to ensure that you can keep the lights on. So, step 1: don’t loan to people who are kicking the can down the road by going further into debt, and don’t enter into any loans you expect to seize collateral on. It may be wise to offer financial hardship deferment through some sort of formal process as well.
Step 2 – Scrooge seems to be in a position where he could (which is not to say that he’s morally obligated, just that he wouldn’t be threatened by it) stand a few more defaults. So let’s say that Scrooge designates a portion of his business for uncollateralized, low-interest loans with a relatively high default rate, possibly with something like an income-based repayment plan. He should NOT advertise this, and should make secrecy a condition of this sort of loan, to be penalized by an interest rate hike up to normal levels. Alternatively, give a fixed number of referrals per client, or make people somehow culpable for their referrals. This ensures that it’s mostly the truly desperate who will hear of this system.
You are aware of the afterlife, more so than any man who has ever lived. Your economic solvency pales in comparison to the possibility of propagating what amount to Good Place point totals for various actions. Your comfort pales in comparison to the fact that damnation awaits the unwary.
You need to keep Christmas, as the spirits command, but beyond that it is morally obligatory upon you to get others to keep Christmas. One is tempted to say you should finance missionaries and the like, but how would you know that the spirits would agree with their messages?
No, the most important thing to do, the only one you can be sure will work, is to publish your own story. Let everyone else hear the words of the spirits, and decide how to behave. Ideally it should be written in such a way as to pass into popular culture without debate or fuss.
Get a writer friend to do a novelization of the happenings, pass the words of the spirits on to the future swaddled in a delightful cloak of fiction.
Nice. 🙂
Counter: https://tvtropes.org/pmwiki/pmwiki.php/YMMV/TheManWhoInventedChristmas entry “Fridge Brilliance”
How do you run a successful 19th century book-keeping / accountancy firm in keeping with the spirit of Christmas?
Are they an accountancy firm, though? I must re-read to find out, but I get the strong whiff of ‘engaging in money-lending’ as well as whatever legitimate business they engage in, and that a lot of the legitimate business was built around foreclosing and squeezing every drop of advantage and profit that they could out of their clients:
Yeah, I tried to figure out exactly what kind of firm he was running, but it was very unclear. As you point out, he apparently had debtors and changed money, and I think at more than one point Dickens refers to his building and his former master’s as warehouses (warehouses of what?). I settled on accounting because Scrooge & Marley is most consistently described as a counting-house.
So I guess stepping back from whatever usury he was involved in? Presumably he should continue lending to needy people, but without charging interest it’s unclear how that’s sustainable. I suppose if he’s old enough and has deep enough pockets it doesn’t have to be sustainable, he can afford to lose money until he dies. Then again, at that point he might as well close the doors and give everything to charity.
Scrooge runs a network of warehouses. Traders and the like who need space pay him to store, load, pack, etc their stuff in his warehouses. It’s a business model that’s still around today. He also owns property that he rents out and loans out money and he owns some stocks. But that’s not Scrooge & Marley, those are just ways for him to grow his money. (What? You didn’t think he was going to spend it on luxuries did you?) He has a counting house to do the accounting for his businesses.
While Scrooge is miserly, he’s explicitly not cheating anybody. Jorkin is the one who stole from the till and from clients, and cut costs in dangerous ways. Scrooge booted him out of the company for such behavior. This is important to Dickens’s narrative. Scrooge is supposed to be something like his idea of the libertarian ideal: scrupulously honest and only engaging in voluntary transactions. He’s right to demand people who are past due on their payments pay up, for example. But for Dickens, this lacks Christian charity, so Scrooge is going to hell.
Anyway, the real answer to this is ‘be as much like Mr. Fezziwig as possible’. At least within Dickens’s novel.
…and then Scrooge realized what a bad person he was and stopped running his businesses as profit-making enterprises. A few years later, the last of his businesses went bankrupt, and Scrooge retired on his savings, with a warm glow of goodness accompanying him. All his employees had to go find other jobs; his warehouses fell into disrepair and became hideouts for local criminals; all his former customers had to raise prices or let employees go to handle the now higher cost of finding properly-run warehouses. God bless us, every one.
I do have to admit, I think Dickens is particularly ruled by his childhood traumas. One of them was when his father ran up debt and was arrested for it. The school-aged Dickens worked pasting labels on pots of shoe polish to make ends meet. He was very clear he considered this to be degrading and humiliating. When his mother forced him to continue working, he developed a strong hatred for her.
Dickens always struck me as someone who cares deeply for people like himself. His sexism, anti-semitism, etc are all failures of him to imagine others complexly. This includes people who run factories like the one he worked in. You can often trace his sympathy or antipathy for entire classes of people to specific incidents in his life where a specific person treated him well or poorly.
I have no doubt young Dickens’s life would have been better if a paternal wealthy man had spent his fortune supporting the young lad. But a large part of A Christmas Carol is a morality tale about how anyone who doesn’t help out young boys like him is going to hell. Meanwhile, the impoverished Jew who resorts to crime because of discrimination is evil and deserves no sympathy. And that rankles.
One of Trollope’s novels has a character who I think is patterned on Dickens. Not an attractive character.
As far as I can tell the spirit of Christmas involves making your employees come in at 6 in the morning the day after Thanksgiving. So Scrooge probably doesn’t need to change much.
Engage in marginal charity. Hire more employees, invest in a comfortable workspace, take on more customers, and charge them less–doing all of these things more than the profit-maximizing amount, but close enough to it that most of the marginal costs are still offset by marginal revenue.
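The reason this can be cheap is that a profit curve is flat near its maximum, so small deviations from the optimum cost very little. A toy sketch, using an invented quadratic profit function purely for illustration (none of these numbers come from the discussion above):

```python
def profit(quantity):
    """Stylized profit: linear revenue minus quadratic cost (invented numbers)."""
    revenue = 100 * quantity
    cost = quantity ** 2
    return revenue - cost

# Profit is maximized where marginal revenue (100) equals marginal cost (2q),
# i.e. at q = 50, for a profit of 2500.
q_star = 50

# Deviate 10% past the optimum -- e.g. hiring or serving more than is
# strictly profit-maximizing, in the spirit of "marginal charity".
loss = profit(q_star) - profit(q_star * 1.1)
print(loss / profit(q_star))  # 0.01 -- a 10% deviation costs only 1% of profit
```

The general point (a second-order loss from a first-order deviation) holds for any smooth profit function, which is why Scrooge can be noticeably kinder at the margin without sinking the firm.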
There’s no lack of wealthy Victorian philanthropists out there.
See George Peabody, who funded education in the US and affordable housing in the UK.
So I haven’t read this book, and based on the description I don’t really care to, but it sounds like the author fell into a common utopian trap:
If your ideal society requires 100% buy-in in order to work, it won’t ever work. Likewise, if your ideal society can’t be built towards one piece at a time but has to be imposed on the entire world all at once, you’re going to be very disappointed with the results if it is imposed somewhere.
Does she have any suggestions for how society could reform in ways that would make life easier for introverts, or does she think nothing short of a total reorganization of society would work? Because if it’s the former, there’s actually something to talk about; but if it’s the latter, she needs to get comfortable having her ideas treated with the seriousness afforded perpetual-motion-machine cranks and sovereign citizens.
I have made a reasonable effort at teaching fencing to ~1,000 adults (ranging from 18 to 75 years old, with most of them closer to 18 than 75). There seem to be three groups of people as far as blade work/fine tip control go. Some people will never be good, and it is obvious within a month that even if they train for years, that part of their fencing will always be crippled. Some people are good from the beginning, and within a month you can tell that if they stick with it they will be making beautiful pinky or toe shots within the year. Some people are in neither category and seem to get better in direct proportion to how hard they work at it. I would say the incidence is roughly 30/20/50. And this does not seem to correlate well with what other fine motor skills they may have. I.e., I teach a woman who is a professional pianist, and she is solidly in the first group; but on the other hand, she picked up the piano skills as a child, so maybe the ability to acquire fine motor skills as an adult is not well correlated with being able to do it as a child?
My favorite part about teaching fencing is how petrified about half of people start out about poking people with foils. My old teacher used to have a ‘first lesson’ that included people having to hit him as he employed the awesome defensive technique of ‘just standing there’, and about half of the group would fail.
Yep, overcoming people’s instinct to not do things that look like they would hurt one another is one of the basic hurdles. I generally call a more advanced fencer over and whale on them in a spectacular looking way to demonstrate that the gear actually works and no one is going to get (seriously) hurt.
Epee in my case, are you still active?
No, I am old and spectacularly obese. But I used to be alright at foil and saber. I never did epee, but it looked like a lot of fun!
Yup, that never stops being funny.
On the other hand, teaching people to get over this is much easier than teaching the people who start out thinking they’re Jedi to parry an attack without decapitating the referee.
Heh. I started with saber at age 6, and so never really had an issue with this – when I took over my college club I had a devil of a time figuring out what these people’s problem was.
You should see what people are like with HEMA. Yes, this spear isn’t going to hurt that bad, I promise!
Yeah… That’s a big part of why I don’t do HEMA. Too much pain and concussion risk.
Anyone else feel like you’re surrounded by idiots?
Not just average/subaverage normies. Those at least have an excuse for their lack of competence. But I’ve recently been rudely shocked by the ignorance exhibited by lawyers, accountants and physicians. Those people are supposed to be smart and knowledgeable! Meanwhile, it seems that any intern with access to google could do their job equally well, if not better.
It’s like in the joke where two just-graduated students are talking. “You know, Frank, when I consider what kind of engineers we are, I’m afraid to go to the doctor.”
I think this feeling is far stronger nowadays, and I blame wikipedia. Everyone looks dumb next to websites designed to make us feel smart.
Have you ever heard the joke about where the idiot in the room is if you can’t immediately spot him?
There are errors and then there are errors. The ability to Google something after it’s become painfully clear what the correct thing to Google is does not correlate meaningfully with the ability to predict what that thing will be from symptoms, or to argue that it’s some entirely different thing per USC Section 198.4.6 (which hasn’t been otherwise referenced since 1928), or to set up a new and exciting tax-avoidance scheme which remains within the bounds of things the IRS probably doesn’t care about (or, more charitably, to notice that the CFO is embezzling before it shows up in the papers). These latter are the kinds of things we need experts for. It should not be expected or assumed that people with lots of knowledge in one technical domain have a lot of knowledge in others, and I suspect people doing precisely this is the cause of your confusion.
I don’t think this is it. I got referred to supposed experts by their colleagues in the trade. What I found was basically people making shit up and not being current.
Generalized example:
I want to do X. I read the relevant legislation. It allows X. But I’m just some guy with a computer. I want to talk to someone actually educated on this topic because I might have missed something.
I go to person A, an expert. They say X is not allowed.
I go person B, another expert. They say X is not allowed.
I go to the government agency that actually administers X. They say there’s no injunction against X, so it’s allowed.
They are overworked and have bad bosses.
Not sure how old you are, Anonymous, but one thing you learn as you get older is how terrible most people are at their jobs.
I learned this. The next thing I learned was how miraculously well the world managed in spite of all this apparent incompetency.
What can mere mortals do when Moloch is on your side?
The world has learned how to manage this. Jury duty in a large city showed me how designing to the least competent of us can make things overall efficient, yet infuriating.
Sometimes, mostly when it comes to experimental design and statistics.
One of my early shocks in graduate school was when one of the higher-ups in my department quizzed me on how many biological replicates I would need for an experiment. I started laying out how I would go about doing a sample size calculation to make sure that the experiment had sufficient power without being too expensive, but he cut me off. The correct answer he was looking for was “three.” The convention is for three biological replicates and therefore that’s the appropriate number for any experiment.
That said, I think that most of the time people can do a good enough job despite a poor grasp of the principles behind what they’re doing.
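For contrast with the rule-of-three convention, the kind of power-based sample-size calculation described above can be sketched with a standard normal approximation for comparing two group means (the effect sizes, significance level, and power below are illustrative assumptions, not values from the anecdote):

```python
import math
from statistics import NormalDist

def samples_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample comparison.

    effect_size: standardized difference in means (Cohen's d).
    alpha: two-sided significance level; power: desired probability of
    detecting a true effect of that size.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

print(samples_per_group(1.0))  # 16 -- even a large effect needs more than three
print(samples_per_group(0.5))  # 63 -- a medium effect needs far more
```

The normal approximation slightly undercounts relative to an exact t-based calculation, but the point survives: three replicates per group gives reasonable power only for very large effects.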
As well as what others have said, and depending on specifics, I’d also look for both “perverse incentives” and “optimizing for speed”. I see (and commit) errors of both kinds in my field of software engineering.
On the one hand, I always have more potential work than I could possibly complete before the next release. So I’ll take a fast guess at whether a given problem is really urgent/important, and what the root cause is likely to be. Sometimes I’m wrong.
On the other hand, I’m rewarded more – or punished more – for certain successes/failures than for others. And the people making the rules/providing the rewards never have enough time to understand my work in detail. I get to choose between doing what I expect to be best for me, and what I expect to be best for the company/client. I can also be wrong about both of these – not because I’m an idiot, but because I have limited time and limited information. So you get CYA behaviour and coverups. You get people doing what they are told, even though they fear it’s a bad idea – because they might be wrong, and they will get in trouble for making waves. At best, my incentives don’t align very well with other people’s needs. And after too long in the job, I probably forget there’s any goal beyond promotion/raises/better assignments, etc.
At any rate, I think a lot of “stupid” is accounted for by this pair of problems. Also, it’s unfortunately a lot worse for physicians in particular than for software engineers – insurance simply won’t pay them to spend long enough per patient to get beyond “most likely” all, or even most, of the time. And they’re probably working long enough days that they have no time left for curiosity and learning, beyond mandatory license-maintenance classes.
Also, somewhat scarily, it’s probably cost effective to get the wrong answer 10% of the time, catch 90% of those when they make a new appointment, and fail to adequately treat the remaining 1% – rather than e.g. spending twice as much time per patient, on average. That would be cost effective for the insurers 🙁 The patients who fall through the cracks won’t tend to agree.
When I was a child, I used to believe there were grown ups out there. Now I see there are just other kids in aging bodies, pretending to be competent. Some of them still believe that others are not faking it; I don’t.
But also what @idontknow131647093 said. If someone is not overworked at the moment, most likely their manager is already dreaming about increasing productivity by assigning 10% of team members to a different project (or firing them).
I like to phrase it “In your twenties, you think that there are adults somewhere running things. Then you get to your thirties and realize, nope, it’s just high school all the way up.”
As someone in my twenties, I find this comment very disheartening.
Maybe I should phrase it “As a teenager you think you know everything. Then you get to your twenties and realize you don’t, but assume that there are adults somewhere running things who know what they’re doing. Then you get to your thirties, meet those adults, and realize they don’t – it’s just high school all the way up.”
No! It’s good news! There’s no secret knowledge possessed by the guys running things, they’re all just faking it. I can do that.
It is like high school all the way up, but not in that way. You get old enough and you find that every company is like 5% hypercompetent people holding it up. Ask someone how many great teachers they had in school and it comes down to like 3 names over 12 years and probably 30 teachers.
As I’ve moved up the ladder, I’ve started to doubt that there is such a thing as hypercompetence. There are things that people can be very good at – sales, technical domains, fundraising, etc. – but no one is good at all of them, and the people deciding the future of institutions are almost always either throwing darts at a board or trying to replicate past success. And that’s enough to muddle through pretty well, most of the time, but it’s a far cry from wise men sagely guiding anything.
cass, that is what hypercompetence is: knowing what you are good at and sticking to it.
It is definitely not top down, it’s distributed. Some middle manager is holding a department together, someone is mediating between employees who don’t like each other without having to get HR involved, etc.
Man, I wish I could have ever felt like there were competent adults ahead.
I mean, TBH, shouldn’t public school disabuse everyone of that notion?
I do. The way I’d put it is that there are competent and incompetent people in every profession, and while some professions have different distributions than others, it’s very rare to reach even a 50/50 split.
The only reliable indicator is knowing the person. Even just talking to someone for a few minutes is a fairly reliable data point, usually more reliable than their reputation. Of course, sometimes we can’t choose whom we interact with.
I run a team of people with graduate degrees. I struggle mightily to get them to understand anything more complicated than extremely basic Excel usage. I can teach them particular tricks, but I cannot get them to re-combine or expand on them to apply them to novel circumstances without explicit hand-holding. Getting them to google solutions to problems that I haven’t taught them to solve isn’t even a dream anymore.
I have an example I was surprised by:
I took an instructional techniques course for pilots several years ago. Everyone else in the class was age 20-25 and had a fresh commercial license and was working on an instructor rating. They were very familiar with how to teach aerodynamics and lift. They all knew the lift equation by heart, had memorized lessons for teaching lift and aerodynamics and all the other things to the new students that they would be teaching once they had passed their flight test.
So the course instructor goes up to the front of the class with a hair dryer and a ping pong ball, and balances the ball on the upward stream of air, then slowly turns the hair dryer so it is at 45 degrees from vertical, and magically the ball is still stuck in the middle of the stream of air a couple feet from the hair dryer. Like this. The instructor asks everyone to get up and explain, one by one, why the ping pong ball didn’t fall out of the side of the airstream.
The first five students each try and explain and have no idea why it happens. Then I get up and explain (not very well) correctly why the ping pong ball stays there. I expected everyone who went after me to understand it at least and explain it better, but only one person out of the following seven understood the concept. She explained it much better than me, but even the students that followed her still didn’t understand!
The students had all had Bernoulli’s principle drilled into them: gas at higher velocity has lower pressure, the top of the wing has higher velocity/lower pressure than the bottom, and that’s why lift is imparted on the wing. They could all explain this with total confidence and no hesitation: they had already learned how to teach it specifically. But they could not for the life of them apply that knowledge to something that wasn’t a wing but a ping pong ball. Not only that, at least half couldn’t understand it even after having it explained!
That one experience made me quite a bit darker on the capacity of humans for knowledge creation. The height of our civilization and technology is an absolutely miraculous achievement given how difficult it is for people to actually understand even the basic things they ‘know.’
Ok, so now I’m curious: how would you explain this without using the pre-cached concept of Bernoulli’s Principle, and without writing out the actual equations? I’m tempted to say something like, “the air inside the airstream from the hair dryer is moving, which means that there’s less air inside the stream on average, and so the ball is sucked into the stream”. However, I’m pretty sure that’s wrong on multiple levels…
It’s not Bernoulli, it’s Coanda — airflow deflected to follow a smooth surface. I believe Bernoulli does apply to the version with the funnel, however.
This is somewhat sadistic because it’s not Bernoulli – that equation’s validity is limited to streamlines, and the static pressure from the hairdryer is atmospheric anyway. Shame on your teacher.
I was drawing a shitty mspaint diagram, but there’s actually a Wiki page.
https://en.wikipedia.org/wiki/Coand%C4%83_effect
Wait a minute, but isn’t slide #1 (demonstrating molecule entrapment) on that wiki page literally the Bernoulli effect ? The subsequent steps build on it, admittedly, but still…
Short story: viscous entrainment is weird, and a free jet into a similar medium is an awful headache.
Long story: at hair dryer speeds, air exiting the fan remains at ambient static pressure; PV=NRT (or really, for a control volume, PV’=N’RT) for the blower tells you this, as you’re not blowing hard enough to see compression effects, so the air acts relatively incompressible. This is not the case for all jets, particularly when nozzles are involved, and those CAN create a low pressure system by forcing a change in V’ at supersonic speeds. But in our case, there’s no nozzle, just a fan. However, air is not bullet-like; when you blow air into a reservoir, viscous effects cause the flow to spread and agglomerate more air in a process called entrainment. This is fairly well-modeled for laminar flows, but not so much for turbulent flows. Regardless, you should recognize that the jet carries much more air by mass than it exits the blower with. This is NOT a Bernoulli effect; Bernoulli calculations assume zero viscosity and thus zero momentum transfer, and are not suitable for analyzing free jets where the fluids are similar.
Now, you may protest that Bernoulli shows that I’m wrong, and that the fact that you can draw a streamline from stagnant air in the reservoir (room) to the air that’s entrained in the jet shows that the pressure must drop. This is wrong. The jet does work on the air to accelerate it viscously, violating Bernoulli assumptions.
This all requires a bit of fluids thinking; usually people think of pressure as an exerted force such that pressure differences drive flow. It’s easy to think that a fluid outflow creates a “low pressure,” dragging more fluid behind it, and that the absolute pressure of the fluid is therefore low. This is completely false. Pressure is only necessary to overcome losses in the system; conservation of momentum is the thing that drives fluids, and in real life the pressure along a streamline equalizes instantaneously, such that (in the high school version of Bernoulli) if v and h are constant, P will NEVER change.
In real life, of course, this is complicated by the fact that air is compressible. However, if you like you can do the floating ball experiment with a water jet, in which case entrainment is minimized (at laminar speeds, relative to the jet’s momentum), but adhesion still happens, and you observe the same behavior.
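To spell out the equation being invoked (my own summary of the assumptions): along a single streamline of steady, incompressible, inviscid flow, with no work added,

```latex
P + \tfrac{1}{2}\rho v^2 + \rho g h = \text{const.}
```

With $v$ and $h$ fixed, $P$ is fixed too; there is no term through which the viscous entrainment of a free jet could enter, which is exactly why the equation is the wrong tool here.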
@Hoopyfreud:
Thanks, that does make sense. I must confess, I’m a fluid dynamics dummy 🙁 That said, I still don’t understand this part:
Are you talking about the initial pressure that injects energy into the system (i.e., the fan), or something else?
@Bugmaster
I mean that if the fan (or leaf blower) were pumping air into a pipe or tube, that air would have to be pressurized at the pump, just enough to offset the pressure it would lose in the pipe. A flow in a straight frictionless pipe could flow for miles with no pressure behind it. So while the air “at the fan” (intake or outlet) may see some local pressure change, this is (mostly) simply what’s needed to overcome internal losses (you can retain a slightly-higher-than-atmospheric pressure in the free jet for a tiny bit due to [fluid dynamics I don’t understand]); the rest of the work done by the motor increases fluid velocity. When your turbines are big enough or you’re creating supersonic flows, this can change… but that sort of behavior is beyond the bounds of my knowledge. I only dealt with systems that manipulate pressure, velocity, and temperature simultaneously very abstractly in thermo, and I’m running on incompressible fluid dynamics knowledge in this conversation. Basically, my knowledge of fluids is just enough to make fun of the boundary layer approximation and run crying to a real fluid dynamicist if I have a non-trivial problem to solve.
But hey, it’s all cool; literally nobody knows how fluids work.
From reading the wiki on Coanda it looks like it incorporates Bernoulli (low pressure along the surface exposed to the jet).
The way lift is generated from a wing is really a rabbit hole. Bernoulli ties it all in a nice neat bow even though it isn’t technically correct, and that is the way it is taught to pilots. The way it is explained with wings should be easy to apply to the ping pong ball which is where this classroom of students fell over.
The canonical examples of Bernoulli’s Principle involve closed control volumes with no pumps or compressors. It is absolutely unreasonable to disparage people for not extending such reasoning to free jets from a fan when anyone who regularly breathes could tell you that Bernoulli assumptions are violated in the process. That’s why the idea that speed trades off with pressure is so difficult to begin with, and why the conditions under which that tradeoff is made are (or should be) made clear when it’s taught.
I’ll take your word for it because you’re clearly more knowledgeable about this than me. If you had been around when I was doing my basic gas turbine exam it would likely have saved me some time.
When you say ‘closed control volume’ are you referring to things like:
-A venturi tube used to create suction for vacuum instruments
-On a turbine engine, the diffuser just after the compressor section (prior to the combustion section)
Or are those not really Bernoulli either?
I still contend though that if you’re taught over and over again that wings generate lift by Bernoulli because of different air velocities over the top and bottom, that it makes perfect sense to apply that to a ping pong ball with differing air velocities.
Venturi tubes are good; so are pipes. Diffusers seem hard – I have no real idea how mass flows work in one, but I suspect it will work. I’m just not brave enough to do the thermo + fluids math that’s required, since I’m pretty sure incompressible assumptions don’t hold and some heat goes into evaporating the fuel, and the ratios of everything are dependent on temperature differentials and ambient pressure and god knows what else. Makes me feel ill just thinking about it. But yes, it probably counts, though you probably can’t use Bernoulli for other reasons…
For a control volume, you just need to be able to draw a box around your system such that the input and output mass flows are known. For a Venturi tube, for example, there are two outlets and everything else is a wall. For Bernoulli assumptions to hold, frictional energy loss within the volume must be negligible and no work can be done on fluid within the volume (or, if you allow work, you have to modify the equation). The reason why the levitating ball example doesn’t work is that you’re shooting a jet into a reservoir – a body of fluid that’s so large that you can’t draw a useful control volume that contains it. Frictional effects determine basically all of the jet’s behavior in this region, so Bernoulli is no good there.
As for your last point – when I said “anyone who breathes regularly” I meant that literally. You breathe out by creating positive pressure in your lungs and breathe in by creating low pressure. So when you’re told that a free jet of air – something very much like an exhalation – has a pressure lower than ambient, it makes sense to be very, very confused. All the tactile sensations we experience while breathing out say, “positive pressure.” Your throat expands naturally, you feel drag forces pushing your jaw apart and your mouth skin is dragged slightly towards the front of your mouth. None of that makes sense if the jet pressure is low, and many of the objections to the Bernoulli principle when it’s taught arise from things like this. People mostly have zero intuition for change in flow (aside from the classic hand-in-window analogy) and are naturally inclined to think about generating flows in processes like breathing, where Bernoulli doesn’t hold (pop quiz: why does Bernoulli not hold for an exhalation?). So when you teach it, you teach them to look at a changing flow – the air is already moving around the plane and it’s just being deflected. You stick to hand in window and water through a pipe as much as possible. But then you throw an exhalation at them, with no prior treatment of the subject and after telling them not to worry about such things because that’s not the kind of flow you’re talking about. Is it any wonder they don’t link it back to Bernoulli? I don’t think it is.
While I have you here maybe you can answer another question for me about gas temperature:
If you stick a thermometer in a vessel with a gas in it, then expand the vessel so that the gas is less dense / at lower pressure, the temperature measured on the thermometer drops, correct? Is this because the molecules are actually less excited, or just because there are fewer molecules around to impart heat on the thermometer?
I ask this in relation to the temperature dropping as you gain altitude.
For temperature to drop, the average kinetic energy of the molecules must drop; the number of molecules doesn’t enter into it. It’s not necessarily the case that expanding a vessel results in a temperature drop, however; free expansion of an ideal gas results in no temperature change. So it depends how the gas expands.
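A back-of-envelope way to see the difference, using the first law for an ideal gas ($dU = \delta Q - P\,dV$, with $dU = n c_V\,dT$): in free (Joule) expansion into vacuum, $\delta Q = 0$ and no work is done, so $dT = 0$; in an adiabatic expansion against the surrounding pressure – which is what a rising parcel of air does – $\delta Q = 0$ but the work term is positive, so

```latex
n c_V \, dT = -P \, dV < 0,
```

and the parcel cools as it expands. For hydrostatic ascent this works out to the dry adiabatic lapse rate, $-dT/dz = g/c_p \approx 9.8\ \mathrm{K/km}$, i.e. roughly 3 °C per 1000 ft.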
So, in the context of a rising parcel of air (like a thermal), how fast and high the parcel rises depends on the ‘environmental lapse rate’, or the change in temperature as you move up in altitude. If the ELR is quite steep (each 1000 ft of altitude lowers the temperature by 3 degrees C), then the rising parcel of air, which drops in temperature naturally as it rises, will always be warmer than the surrounding air, and therefore less dense, and therefore it just keeps rising. That’s how you get towering cumulus clouds.
So my question is really: why does the rising parcel of air naturally lose temperature as it rises?
To be fair, Bernoulli’s Principle is not magic; more fundamental and incisive is that lift is generated by the deflection of air downwards, by Newton’s 3rd law[1]. How/why that air is deflected downwards is obvious in the case of a wing with a chordline/angle-of-attack, but is not obvious in the case of a ping pong ball, and I honestly don’t think that even correctly applying Bernoulli’s Principle to “explain” why the ball stays in the air employs useful insight about what is actually going on, and in fact I think it leads to more confusion than anything. Fluid dynamics is complicated; there is no tricking Newton’s 3rd law.
[1] This is a pet peeve of mine as a physicist, but even more because the incomparable “Stick and Rudder” emphasizes this point very clearly, so it’s depressing to me how much work Bernoulli’s Principle does in the pilot’s curriculum.
Take it as a compliment. The smarter/better you become in the specific areas that interest you, the dumber everyone around you will appear in those same areas, as you become more able to discern depth of knowledge from casual knowledge.
Most people never get beyond — and aren’t interested in — casual knowledge on any given subject. Once you get beyond that threshold, they can appear stupid, whereas they are really just unconcerned. Keep in mind that you also appear idiotic to those who are past that threshold in areas for which you only possess/pursue casual knowledge. Most “stupid” people are really smart/experts at something, it just might be something that you don’t value.
+1
Every now and then, there’s some report of a survey in which high school teachers are given very basic questions to answer, and fail a surprising fraction of the time. My guess is that this is the same everywhere–high school teachers are just like everyone else in that they often skated on cramming/copying a friend’s homework/doing just enough for an acceptable grade in school, they’ve often forgotten a lot of what they learned, etc.
I’m well-established in my field, and know many other well-established people. It’s routine for all of us to have areas of our field we don’t really understand all that well. I’ve explained what I thought of as very basic ideas from my part of the world to people who didn’t know them, but were super smart people who really deeply understood some other part of our field, and they’ve done the same for me.
One of the more interesting ways this became visible to me is listening to TWIV and TWIM–excellent microbiology podcasts by genuine experts. Every now and then, they will drift over into some other area of science I know better than they do, and then they often walk right off a cliff. These are guys with PhDs and professorships in a hard science and lifetimes in the scientific world, who are also science-geeky enough to want to do a podcast about their area of expertise. They must be in the top 1/10000 of Americans in terms of science knowledge. And yet, get a little outside your expertise, and it’s easy to just misunderstand or misremember something.
a) You know what they say about running into assholes all day every day, well, ….
b) w/r/t doctors in particular, your knee or whatever will always be more important to you than it is to them, unless you’re doing the whole concierge medicine thing. incentives matter.
a) Not applicable here.
b) I still would expect them to be useful for something other than signing off sick leave notes.
Me: I have a gene test that says I have A, and this bloodwork shows that I do appear to have A.
Doc: The tests are worthless. You should exercise an hour a day and you’ll be fine.
I later do more extensive testing, and it turns out I have A. For which exercise is zero help.
I’ve felt surrounded by idiots at times in the past, but no longer. After getting to know my colleagues more since getting my job three years ago, I’ve come to be very impressed with how educated and smart (two different things there) they are.
However, my window on the broader world–which is mostly defined by what I see on social media and in the news media–is much bleaker. The people immediately around me are not idiots, but a sea of them is lurking out there, a little farther away.
Doesn’t that imply you just need to get to know everyone else better to see that they’re actually regular people, like with your co-workers?
Welp, this morning 3 co-workers were arguing about which kind of fire should not have water put on it: paper, or grease.
To some extent a lot of knowledge is just specialized, and sometimes people just forget crap. On the other hand, a lot of people are just incredibly lazy, incredibly stupid, or incredibly non-resourceful. These kinds of people (the majority) only work well in organizations with lots of bureaucracy and pre-established routines to ensure all critical work gets done on time. I’d say about 1/3 to 1/2 of people I’ve ever worked with are in this “unable to function outside of command environment” tier.
Been reading Nassim Taleb’s Skin in the Game recently, and it has given me food for thought about a number of issues; somewhat related to this one in particular and hopefully not too Culture War-y is a heuristic he points out using an example of restaurants, where supposedly restaurants that win awards decided by other restaurateurs/industry professionals go out of business much sooner than you’d expect.
His explanation is that professionals in most industries are usually judged by some combination of peers and “exposure to the real world,” e.g. customers. However, this proportion varies widely by industry and maybe also by subdiscipline and even individual (some people are probably just motivated more by industry accolades and others by customer satisfaction). Unsurprisingly if you know Taleb/have been reading the book, he puts a lot more stock by the evaluation of customers as compared to the evaluation of peers. He points out that industries heavily reliant on peer review tend to accumulate a lot of BS, because they become self-referential status games, whereas industries with a lot of exposure to “reality” cannot afford to do so.
An interesting example of a field with a pretty high exposure to “reality” is medicine (especially patient-focused as opposed to research), because you have to deal with cold, hard facts like patient survival rates, whereas more academic fields are free to explore decades-long cul-de-sacs of knowledge, weaving elaborate theories designed to impress one’s peers rather than more closely, usefully map the real world. Claiming, for example, that macroeconomics is a more BS-heavy field than micro, he states “It’s much easier to bullshit at the macro-level than it is to bullshit at the micro-level.” And, in fact, my impression is that macro is more prone to domination by fashion and charismatic personalities like Keynes. Taleb is a big fan of Hayek.
Anyway, more to your point, this made me realize something positive about doctors: I had long been annoyed by what seemed to me the “unscientific” approach of many doctors. Doctors on TV do a bunch of tests, figure out you’ve got complicated-sounding-syndrome x and then administer the perfect therapy to zap x. Real doctors check google and say things like “have you tried just not moving your arm in that direction?” “Sorry that last psychiatric medication made you unable to sleep for 3 days. Maybe just try this other one? Some people seem to do pretty well on it.”
This relates to Taleb’s idea (paraphrasing) that “scientism looks more like science than science.” In other words, you might be more reassured about the state of the field of medicine if doctors acted more like TV doctors, but if they did, it would actually probably be a sign that they are BSing you, because the reality-facing doctor has learned that, when dealing with something as complicated as the human body, a bunch of experience-based heuristics are often much superior to complicated theorizing. That is, I may sometimes be disappointed with what seems like the “un-scientific” approach doctors take, yet I should be much more frightened if doctors talked, and judged each other, as do professors of literature.
Taleb mocks this comic because while the audience for The New Yorker prides itself on respect for the opinions of “experts,” Taleb’s point is that plane-flying is not comparable to the political and scientific fields the comic implicitly compares it to because politicians and many scientists, economists, et al. are much more insulated from bad consequences if their theories turn out to be BS than is an airplane pilot.
See also: “Would rather be governed by the first 2000 names in the Boston phone book than the 2000 members of the Harvard faculty” and the problem of even well-intentioned industry experts working closely with even well-intentioned regulators.
So, yes, everyone’s winging it and it is possible you are surrounded by idiots (or that you are unusually competent), but I think it’s also worth keeping in mind that sometimes the people who seem to be flying by the seat of their pants are actually much more trustworthy.
Those idiots are probably as smart as you; they’re just solving a different problem. Your self-imposed task is to e.g. do as good of a job as possible: acquit your client, lower your client’s taxes as far as possible, heal your patient as efficiently as possible, etc. There’s nothing wrong with that, but the problem your peers are solving is different; they just want to make as much money as possible without sacrificing their entire life to this task. They look at you, with your student loans and your after-hours work, and they think, “what an idiot!”
I am bemoaning the incompetence of supposed experts in their alleged core skill. When a rank amateur with internet access can outperform them, this is bad. Really bad. This is not on the level of memorizing the entire damn legal code; this is getting the facts wrong about one of the core documents of the current law, this is apparently not reading anything about medicine published since the 1970s, etc. Really basic things. From highly respected, well-credentialed people.
What do you mean by “bad”? Clearly, these people are still able to function as lawyers/doctors/whatever; if all their clients went to jail and/or died, they would be fired. Also, I bet that these people make more money than you or I.
So, from their personal perspective, they are doing the right thing: making the most money with the least effort. But one could argue that they are doing the right thing from the social perspective, as well. A full-course fine dining French cuisine meal is much tastier and healthier than McDonalds or a homemade sandwich; but if food was only available in fine dining establishments, people would starve. Your plumber doesn’t have a fluid dynamics Ph.D., but if we required plumbers to have Ph.D.s, we’d all drown in leaky faucet water. And no, you don’t need Dr. House to diagnose each case of common cold. It’d be a waste of resources.
No, I have the opposite problem. I instinctively assume that other people know what they’re doing.
Which can’t be true, because just for starters, people invariably disagree strongly on what the best thing to do is, and they can’t all be right. And for that matter, I’ve seen first-hand how the most cock-sure people can be completely and hideously mistaken. But I still have to struggle to remind myself of it.
I do, on the other hand, frequently feel surrounded by lunatics. Very smart and competent lunatics, though, which just makes it worse.
I used to be just like you until recently.
Biweekly Naval Gazing links post:
A Brief History of the Aircraft Carrier: A look at the evolution of the aircraft carrier from the first days of flying off ships through WWI and WWII to the present day, when they dominate the seas.
The Falklands War, Part 9: A look at the Exocet attack on HMS Sheffield. What was going on during the attack and why Sheffield’s crew failed to take action.
The First South Dakota class: The last of the classic American dreadnoughts, cancelled under the Washington Treaty.
Commercial Aviation, Part 3: A look at the various business models airlines use to move people about.
Electronic Warfare Part 1 – ESM: Electronic Support Measures is the term for passive reception of an enemy’s electromagnetic emissions. It’s a vital component of modern warfare, covering everything from gathering data on a new enemy radar to figuring out where someone who unwisely used a radio is.
Do you happen to know why it’s so odd for a carrier to have a port-side island?
I believe it had to do with the rotary engines used on WWI-era aircraft. Turning right pushed the nose down, while a left turn raised it. So the island went on the starboard side, where it wouldn’t be in the way if you had to go around. The Japanese built a couple of ships with islands on the port side, under the idea of operating them with right-hand patterns next to ships with left-hand patterns. It didn’t seem to be successful.
Ah, that makes sense. The mirrored carriers thing sounds nifty in theory but I’m not surprised it didn’t pan out in practice.
In addition to what bean says, I also read somewhere that aircraft had a tendency to pull left on landing (likely as a result of engine torque + landing maneuvers), and so port side carriers had higher accident rates, but I don’t remember the source.
It’s quite possible that I’m misremembering something. The comment above was put together based on a quick glance at how the precession worked and where the islands were placed. I can look in more detail when I get home.
This piece linked by Marginal Revolution is a good example of what we don’t know when it comes to macroeconomics, but not in the way presented by the author. I am leaning heavily toward the belief that we don’t even know what we don’t know in macro. Some examples:
We basically have it presented as “obviously major investments in a struggling downtown will lead to revitalization,” but Nashville wasn’t the only city to try this in the 90s, and funnily enough it wasn’t even the only city to build a music hall of fame and sports arena in such an attempt. In 1994 Cleveland opened a new baseball stadium (then Jacobs Field, now Progressive Field) and a new basketball arena (then Gund Arena, now Quicken Loans Arena), and the Rock & Roll Hall of Fame opened in 1995. These were pretty successful operations: Jacobs Field set the MLB record for most consecutive sellouts at 455 between 1995 and 2001, and the Indians went to the playoffs 6 times in 7 years with 2 World Series appearances. The R&R HOF drew 1.5 million visitors over its first two years and still averages around half a million a year 20+ years later.
These were also not the only tools that the Cleveland area had to work with. There is a quality higher-education cluster in the Cleveland Institutes of Art and Music and Case Western (CIA and CIM are typically ranked top 20 in the country, and occasionally top 10, for art and music, and Case is typically top 40), as well as a world-class hospital system in the Cleveland Clinic. There were some other notable developments (Key Tower’s completion in 1991, the rise of Progressive Insurance as a major employer in the suburbs, an expansion of the public rail service in the 90s), so suffice it to say that it was not simply a one-shot deal at trying to rebuild Cleveland.
Cleveland did not experience nearly the revitalization that Nashville has: since 1990 the population of Cleveland has dropped by 25%, with a quarter of that decline coming during the 90s.
You can tell either story here, that the proposed investments that cities often make won’t lead to revitalization or that the spending that Cleveland engaged in prevented it from becoming as bad as Detroit. I’m not trying to argue one viewpoint or the other here, but to highlight how difficult it is to approach the question of what cities should do to revitalize (and that is already assuming that is a good proposition on its own).
I always wonder if we are in a long-run post air-conditioning sorting(*).
I grew up in Pittsburgh-Youngstown-Cleveland and have plenty of family and friends still there. They are perfectly fine places to live short of the weather. (Well, Pittsburgh is better than Cleveland, and Cleveland is better than Youngstown, and ok maybe Youngstown is less than fine but whatever.)
But I live in Florida now. I would consider moving to Nashville for the right opportunity. I will never move back to Western PA/Northeast OH on account of the weather.
(*) For mid-sized cities, that is. NYC and Chicago and the like seem to be doing just fine.
I guess a bit, but there are some mid-size cities that are still growing despite not great climates.
Indianapolis
Columbus
Denver
Grand Rapids
Hell even Des Moines picked up a lot in the last decade.
I think Columbus is picking up what’s leaving NE Ohio. Grand Rapids might be doing the same with Detroit.
Denver might be a fair counterpoint to my hypothesis, but it might be just an exception. Cold weather, sure, but fun blue sky outdoor activity type of cold weather…as opposed to grey and dreary rust belt cold weather.
I don’t know enough about IN or IA.
You kind of have to look at these on a case by case basis to see what is going on, but here are some random first impressions. Cold weather is not necessarily a detriment to cities with other attractions, but when choosing between average cities without any real appeal the weather is going to play a bigger factor.
Cleveland is hurt by cold weather, lots of snow (lake effect), and sort of being in the middle of nowhere. Columbus and Indy are certainly cold, but still have noticeably better weather than Cleveland. I don’t know what Grand Rapids weather is like, but based on geography I imagine it is similar to Cleveland. Although it is growing fast, it is still the smallest city on that list by a fair margin, so we kind of need to see how far that growth goes before it gets lumped into this group. Grand Rapids also benefits from being close to Chicago.
Cleveland is basically on a geographic island. Short trips would take you to places like Pittsburgh, Detroit, or Columbus (all meh). On the other hand, in Columbus you swap Detroit for Indianapolis (a big win), add Cincinnati, and also get a huge state school that is massively popular in the state.
Denver, as actualitems mentions, is certainly cold and snowy, but has much more natural appeal than any of the other cities for both locals and tourists.
Grand Rapids has done a really good job of making its downtown areas more attractive and walkable. It also has attracted several big medical research type facilities which bring in more professionals. I’ve lived there on and off for 20 years now, and they are definitely doing a lot of the ‘right things’ for city growth. It helps that it is still surrounded by a lot of land where new developments can be put that are still within a 40-minute commute of just about anywhere.
Minneapolis and Saint Paul have significantly worse winters than any city listed above, and are both growing very well.
I was thinking of raising this one as well, but I figured people might dispute it because it is a large metro area. The Twin Cities have 3.6 million people in the MSA, Cleveland has 2 million.
Boston is growing, and has 4.8 million people, which is a lot, but substantially less than the major metros. You also can’t just say “oh, big city,” because Chicago really isn’t growing all that much in terms of population.
But these are all just arbitrary distinctions created to fit data, which I think is part of the problem Bacon is pointing out, and these explanations really can’t fit the broad pattern of data we see.
Slow cooker pre-treatment:
I have noticed that I probably should have seared beef cubes, tomatoes leave just their skin, and carrots shrink like a frightened turtle.
What are some things you do with ingredients before throwing them into the slow cooker and mixing for overnight heating?
Generally, I sear meat and caramelize onions/garlic as the major things. Tomatoes turn into liquid; I generally just get a can of crushed ones since it’ll happen anyway. I’ve got nothing for carrots; I kind of like the shrunken ones from a flavor/unit area perspective, but I’d like to know if anybody else has a scheme to manipulate that. If you’re working with beans, drain, rinse, and pat dry (counter-intuitive, but controlling liquid levels is important). Dried herbs go in at the start, fresh herbs go on top at the end.
Carrots might shrink, but you probably still want the flavor of at least sweating the vegetables a bit before dropping them in the crockpot.
The easiest option, and the only really “required” one IMO, is to sear the meat before dropping it in. Luckily the Instant Pot has a sear function, so it’s still one-pot. I sear the whole thing in steak form, cut it up, and then drop it in the crock pot.
If you are making something like stock (either chicken or beef), I’d definitely recommend roasting your carrots/celery/onion along with the bones/meat for an hour or so at 400, and THEN throwing it into the slow cooker.
It seems that some of these steps can be bypassed by using a pressure cooker instead. (I do not own a pressure cooker, cannot confirm.)
You still want to sear any red meat and caramelize onions, and tomatoes will still liquidize in a pressure cooker. Electric pressure cookers will allow you to do the searing in the same pot, but it’s still an extra cooking step (and personally I find it easier to do on the stove).
Thanks for the info! Is the searing just to keep meat cubes intact, as opposed to going for falling-apart pot roasts?
(Tomatoes liquidizing is a plus, for me.)
There are a number of different reactions going on with cooking. First, you are breaking down collagen to make tough cuts tender; this happens around 160°F, which a slow cooker can do. A pressure cooker will do it faster by using pressure to take the dish above boiling temperature. This is about texture.
The other big thing going on is the Maillard reaction, which happens above about 280°F, higher than a pressure cooker can reach (~240°F). This adds depth to the flavors, which is why a slow-cooked dish will taste better if you sear the meat and caramelize the aromatics first.
I read this book long enough ago that I don’t remember details, and apparently I liked it. But I’m somewhat of an aggro-introvert – or more correctly, an aggro-Aspie. I.e., I’d appreciate impractical polemic in favour of my kind of people, or new arguments in that line, or simply the fact that some best seller could be cited as supporting my self-serving opinions. So me rating it 4.5/5 could well be consistent with it being completely useless in terms of practical advice.
I think there is a place for books that basically tell people – “you are well within the range of normality, and not pathological at all, whatever other people may say.” They only help in practical ways to the extent that depression and self hatred is part of an individual’s problem, on top of whatever devalued minority status may be their issue. But a lot of people grow up thinking of themselves as defective, and/or putting a lot of emotional effort into defending their conscious mind from this belief.
There’s also certainly a place for “how to succeed in spite of …” books. But I suspect they are much harder to write – much more common is “how to cope with …”. Note the subtle difference. IMO, step 1 is learning to cope, and step 2 is finding something that matters, which you can do better than your competitors. And if there were a book handing people #2 on a silver platter, they’d all try it, and the niche would simply become overpopulated. Unless of course it’s something like “being tall”, which they can’t easily influence.
As a tall introvert–won’t comment on the IQ part–I can say I’d probably like to be an inch or two shorter and a good bit more extroverted.
I’m not sure there isn’t an extrovert cut-off point, though. Part of being an extrovert is usually being less comfortable being alone, and a lot of life requires some solitary time. But you definitely have an advantage if you enjoy networking, are eager to discuss your results with other stakeholders, are adept at persuasion, and so on, probably more so than excelling at introspection and being a master at settling down with a good book in a quiet, empty room.
As for height, in the outdoors, taller is probably better on net (caloric expenditure aside), but indoors, once you can reach the top cupboards that’s enough; much more than that and you need to beware of doorways, are cramped in vehicles, can’t recline on a couch, and so on. True that most women like you to be taller than them, but 6′ is probably as good as 7′ for that, and five and a half might even be enough.
Of note – increased baseline caloric expenditure is a good thing if you’ve got a First World living standard; cardiovascular disease is a way more serious risk than starvation.
Assuming hunger & willpower levels are constant.
Sure; in all things moderation.
Oh, I’ll totally take being 7 feet tall. At that point your childhood dreams of playing professional basketball are vastly more realistic, even if you end up playing in the D-League or overseas. Also, given how shitty the Bulls are right now, maybe they will be happy to have 12th man 7 foot ADBG.
Maybe you have a point regarding height. I shall update in favor of my privilege being greater than I thought.
It might depend on what exactly you’re talking about.
Something like 30% of the population have slight hand tremors, naturally.
If you want to start a profession carving grains of rice freehand and you’re part of the unlucky third… you’re probably not going to have much luck.
But if you just want to, say, learn to paint or something I see no reason why you couldn’t learn.
Indeed. I have hand tremors (diagnosed in my mid teens) but play keyboard in a band, teach fencing, and touch type in qwerty and colemak keyboard layouts (learned the latter in my mid-30s). As long as I don’t try to do all that after chugging Red Bull on an empty stomach after not sleeping in days, I’m usually ok.
I’m trying to spread the word here without sounding too much like I bought stock in a light therapy company (I didn’t). I was peripherally aware of seasonal affective disorder but never considered that it could apply to me because I didn’t feel sad in the winter (I mean, how its sufferers are reported to feel is right there in the acronym!).
While I’ve never had anything exactly like the proverbial “winter blues”, I have had a problem—going back at least a couple of decades, i.e. my entire working life—with randomly waking up at 3 or 4 in the morning and not being able to get back to sleep for a period of maybe one to three hours. This sometimes would happen several nights a week. And I definitely had symptoms of not feeling fully awake pretty consistently.
Last fall after a particularly bad spell I saw a sleep specialist. We ruled out any glaring apnea issues and I got a prescription for Ambien to use occasionally as needed. Then my sleep problems more or less spontaneously improved (not 100% but definite improvement) in the late spring/summer, and when they asked if I was still having issues I gave them an all clear. But come fall, and I’m starting to wake up in the middle of the night for no particular reason again, probably every other night on average.
This year I finally put two and two together that maybe this was a seasonal thing. I also realized that “feeling super drowsy all the time” is not normal, and is one way that depression can present itself. And having read some of the sleep-related articles on here things finally clicked that maybe I was suffering from SAD.
Anyway, I’ve bought one of those lamps (that one recommended on the Wirecutter) and have been using it pretty consistently for the last couple of weeks, and the improvement is basically 100%, both in terms of daytime drowsiness and in terms of being able to sleep through the night. I can’t say for sure if it will keep working, and I also can’t say for sure if spending $120 on some kind of placebo would have helped as well, but it sure seems to be doing the trick.
It’s worth noting that I live in the Pacific Northwest (~49°N and lots of clouds in the winter), and was working in a dimly lit home office with some, but not a lot, of natural light. So YMMV.
TL;DR Had sleep problems, started using a SAD lamp, now I don’t have sleep problems.
My fiancee bought one of those – do you not find it….way too bright?
Now that you have a system, you should probably stick with it, but if you want to try an experiment, try melatonin to regulate your sleep schedule: 0.3mg ~6 hours before bed. That post mentions a theory that SAD is caused by an unmoored circadian rhythm.
Certainly, if you’re ever tempted to use ambien again, try melatonin first.
I’ve heard that vitamin D tests are routine in PNW. Is that true? Have you had one?
My wife and daughter sometimes have trouble getting to sleep lately. I ought to review that melatonin info.
I raised a follow-on discussion to the melatonin topic asking about people’s progress on using it the way the evidence actually supports (a small dose hours before bed) instead of at bedtime. I forget where it is, but most of the comments were “yeah, I need to start doing that.”
I have re-resolved to try using it the proper way.
I also live in the Pacific Northwest and have chronic issues with daytime sleepiness. I’ve been using a SAD lamp for probably at least a year now. I think it has caused a slight improvement in my energy levels, but nothing nearly as dramatic as you describe. For me, it could just be the placebo effect.
I will second the value of light therapy in the winter. I have this model which is nice because it clips to the brim of a hat and you can wear it as you do your morning routine.
Also, of note, the right time to start is roughly in September, not the start of winter.
When do you (anyone, not just OP) use your lamp to get desired effects?
@knownastron:
I wear mine (see link above) in the morning for ~20–30 minutes (it is on a timer with two different brightness settings). I just go about my normal routine with the light on. I do this from September till sometime in March.
I thought the report button was still broken. Has it been fixed?
It sounds as though the most useful information for you would be which activities extroversion isn’t an advantage in, or at least isn’t a large advantage in.
Well, listening to extroverts whine, that would be anything that doesn’t involve spending most of your time interacting with other people.
It doesn’t matter whether or not extroverts can do solo activities effectively, if their morale plummets and they start looking for excuses to blow off those tasks. And if it’s true that only human interaction energizes them, it sounds like a long day of bookkeeping would leave the classic extrovert totally wiped out 🙁
An introverted but ADHD person would also not enjoy bookkeeping. But they’d enjoy writing software to automate bookkeeping far more.
@nameless1
The software engineer thing gets complicated. On the one hand, while an introvert can write code happily and well, advancing in the field requires a lot of dealing with people. On the other hand, it’s currently fashionable in Silicon Valley to cram software engineers into open offices with no visual or auditory privacy, where they are constantly “on stage” to promote something that executives and facilities people are pleased to call “collaboration”. Net result – my job is exhausting, and I’m fairly close to the introvert/extrovert border. (I identify as an introvert, but what I really am is an Aspie – no talent for interaction with normies, and enough bad history to dislike it intensely.)
Scott:
Is there a way to contribute monthly support to SSC aside from Patreon? I’m happy to support your work but will no longer do it through Patreon. I suspect I’m not the only one who will feel that way.
FWIW… Sam Harris sent this out today.
I encourage this, but the payment processors are also chokepoints. If I built freespeechtreon, how long would it take for Visa and Mastercard to decide I am a witch?
This is the culture war free open thread so I don’t want to get into it right now but something similar already happened with Hatreon.
That might have something to do with it calling itself Hatreon…
There might be a worse name for a business, but I can’t think of one.
What if I wanted to go into business selling Eevee headwear?
The Boring Company?
Or not, as they did the same to SubscribeStar.
Now I’m wondering whether there’s money to be made setting up a little empire of internet services catering specifically to groups unpopular enough that they get tossed out of regular forums. Make yourself unshutdownable by owning a lot of the infrastructure outright and where that’s impossible, deal with large organizations that everyone hates anyway, like Comcast. And charge your customers hefty rates for all of it, of course.
Ask voat. https://gizmodo.com/goodbye-and-good-riddance-to-voat-reddits-gross-clone-1795337099
The trick would be the ‘charge your customers hefty rates’ part. There are still plenty of Mastodon servers and similar open-source peer-to-peer tech for those communities. Just no real commercial eco-system.
Reports of voat’s death appear to be somewhat exaggerated.
What’s the current state of using cryptocurrencies for this sort of thing? Is it possible to make it not much more complicated or costly than the usual methods of taking payments?
You are not going to get 1000 people to give you $5 a month with cryptocurrencies. You are not even going to get 500 people.
Also, if you set up a separate payment system, you will attract a lot of real witches, mostly scammers. Scammers will flood the system.
Separately, if you want to get someone with a sketchy reputation kicked off a credit card processor, you can buy a bunch of stolen credit cards[1] and use them to donate to that person.
[1] It doesn’t matter very much if the cards are known bad, so it is not too expensive.
Yes, you might have to handle payments by direct bank transfers, or maybe handle them through retail outlets, the way gift cards are done now.
What sort of scams do you think of?
Anyone who is getting money fraudulently is always looking for a way to transfer it that looks as legit as possible. The fact that “Uh hi, this is your boss, I’m at a client site and I really need you to go buy some iTunes gift cards and read me the numbers on the back” is something that scammers actually do is a testament to how hard it is for them to fool Visa/Western Union/etc. into working with them.
I sympathize with you. I considered deleting my Patreon account and am held back mostly by raw greed – I get about $2000 a month from it, I donate a lot of that forward, and there’s no other tool that would work nearly as well.
(I also suspect that these companies’ hands are tied; ie if they didn’t delete these people, they would risk some kind of lawsuit or loss of relationship with credit cards or strong reputational damage. I have to be a lot more censorious on this blog than I would like and I don’t face anywhere near the institutional level of pressure they do.)
I don’t know of any other good way to support me monthly, but I encourage anyone who wants to leave Patreon to do so.
I think you’re correct that Patreon’s hands are tied when it comes to policing this sort of content. Its short-lived competitor “Hatreon” was founded as a direct response to Patreon’s moderation, and it died before ever taking off because Visa refused to do business with the site after it was enthusiastically adopted by white supremacists. Even if you believe that people have the right to monetize that sort of content, boycotting Patreon is just shooting the messenger.
That they suspended payment processing for a specifically witch-oriented site doesn’t imply they would’ve suspended a general-purpose site with a few witches on it. Not suspending the former is much harder to defend.
Sam Harris sells subscriptions through paypal. Maybe you should set that up without withdrawing from patreon. But I don’t know how hard it is to do. There are lots of downsides, such as higher transaction costs and lack of social proof.
Scott:
Please consider the PayPal route until something better materializes.
Counterpoint: I object more strenuously to PayPal than to Patreon, on the grounds that PayPal is bad at being a payment processor. Not that my opinion should matter, since I don’t donate – but PayPal being the only means of donation is enough, by itself, to stop me from donating to people and organizations. It’s also why I refuse to use Venmo.
Paypal kicks people off, too. One might think that Paypal’s decisions are better than Patreon’s, but going into the specifics seems very CW.
Would you apply the same standards to all the other vendors and services you use in your daily life? If the restaurant you like to eat at refused to cater an event for an organization because of a conflict in their values, would you stop eating there?
I feel like the real concern is not towards Patreon’s standards, but the standards being imposed on Patreon by their providers. You may be holding Patreon to an unreasonable standard. I assume you still use Visa or Mastercard?
Like, say, a same sex wedding? We all know how this turns out.
Honestly, I don’t understand why the banks themselves don’t provide patreon-like services. Why are we allowing Visa and Mastercard to be the arbiters of who deserves to get money and who doesn’t?
Banks do offer arbitrary wire transfers. The better question is why they cede the market to oligopolistic third-party providers (Mastercard, Visa) by not making wire transfers faster and cheaper. (Perhaps because wire transfers are still necessary for some purposes, and they can make more money on them if they don’t make them as cheap as a credit card payment. That still doesn’t explain why they can’t make them as fast.)
I suspect direct transfers are a complicated problem when there are a lot of banks, as there are in the US. In Canada, it’s fairly easy to transfer money between accounts because there are only a few major banks and all of them are on a system called Interac. A while back they introduced something called Interac e-Transfer that works over email.
I don’t know enough about Patreon’s process or this latest high-profile deletion to have much of an opinion of the rightness or wrongness of their decision. But I’m very sure that I don’t want to put Patreon (or Visa or Paypal or anyone else) in the position of deciding what writers/thinkers are allowed to receive support from donors, and which ones are not. It is way, way too easy for this to become a way to silence people whose views offend some powerful people.
Isn’t the whole point of going to a restaurant the warm and fuzzy atmosphere? If it were something else maybe I’d think twice, but I certainly wouldn’t eat at a restaurant I felt bad about supporting.
One difference is that there are a lot of restaurants, and very few payment processors. This is an easy way to create a censorship machine where someone (including the government) can just lean on a few people to cut you off, instead of trying to lean on thousands of vendors.
Good point. But I do think Patreon is more like the restaurant than being in a privileged and entrenched position like the payment processors it uses.
I suspect that, at least in part, it’s to do with them wanting cool points from lefties as well. The thing that touched off this latest campaign may have been a result of potential reputational loss, but…well, I can’t know for sure, but I don’t think it was the only thing going on there.
There’s always mail. You’d need to rent a PO box to conceal your address. The USPS doesn’t do witch hunts or censorship though, so that’s a plus.
Yet. The USPS doesn’t do witch hunts or censorship yet. If people start routinely using USPS for cash microtransactions, they are going to notice.
Just how risky is it to send cash through the mail? I don’t remember hearing much about people stealing mail.
The post office doesn’t care much if you are sending $20 a month to someone.
The problem is that postal workers, if they know what is going on, will steal the money, and it’s hard to catch postal workers who steal mail. The #1 prevention is the fact that nearly all mail is worthless to someone besides the recipient.
Patreon is a lot more convenient. I suspect the number of people who would actually bother to send cash through the mail is vanishingly small.
Yeah, the thing a Paypal or Patreon competitor has to offer is an auto-pay/recurring billing function where the customer pledges x dollars per month and x dollars deducts automatically. This saves the patron the trouble of remembering to do it, gives the content creator some idea how much money to expect next month, and also incentivizes the content creator to keep creating.
Despite all the volatility and initial barrier to entry, I’m surprised more people don’t use cryptocurrency for these purposes (apparently some, like Dave Rubin, are intending to do so), as it’s super easy to get e.g. a bitcoin address and post it on your site. Only thing is I don’t yet know of an app or currency that enables auto-pay functions like this, though in principle it should definitely be possible.
Transaction costs for Bitcoin are currently $140. Even if you are smart and roll transactions together, that is still going to swamp the actual donation amounts.

That doesn’t sound right. I’ve been out of the Bitcoin game for a while, but I remember them being ~20 bucks near the peak of the market last year, and I’m sure volumes have gone down since then (though hashrate has too). This claims they’re currently 19 cents.
@Nornagest
Does that include the cost of the audit when you cash them out?
No audits on mine, although the capital gains tax was painful.
I googled something to get my original number, and I can’t re-google it now, so consider my number trash.
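To see why the disputed fee number matters so much, here’s a quick arithmetic sketch. The $0.19, $20, and $140 figures are just the estimates quoted in this thread (not live fee data), and the $5 pledge is the example used above:

```python
# What fraction of each payment is eaten by a flat per-transaction fee,
# for a small recurring pledge? Fee levels are the thread's estimates.
PLEDGE = 5.00  # dollars per month

for fee in (0.19, 20.00, 140.00):
    overhead = fee / (PLEDGE + fee)  # share of the total transfer lost to the fee
    print(f"fee ${fee:>6.2f}: {overhead:.0%} of the transfer is fee")
```

At $0.19 the overhead is negligible (~4%), but at $20 it is 80%, and at $140 it is ~97%, which is the “swamping the actual donation amounts” point: the viability of crypto micropatronage hinges almost entirely on which fee estimate is right.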
If you donate a lot of that forward to entities you’re willing to name, you could cut out the middleman and name orgs you’re willing to accept additional donations to in your honor in lieu of Patreon support.
Maybe he should name specific charities, but his patreon does ask people to give to charity instead.
Maybe asking people to give in his name would provide some of the collective action social proof of the patreon, but probably only in a coherent community that was already paying attention to a small set of charities.
For some reason I suspect this would decrease the amount of money coming in.
Regarding the stricter “necessary” prong: I recently read Peter Elbow’s essay “The Believing Game or Methodological Believing” (pdf freely downloadable), and it made me wonder if all the angsting about political balance on SSC is really just symptomatic of a different issue.
I think a lot of people come here, not so much for “balance” but to deliberately expose themselves to smart people with very different political viewpoints, to see what these points of view look like from the inside. After all, there are many people out there with Scott’s extremely high critical reasoning skills, but few who are also remarkable for their ability to charitably describe interesting weird ideas he doesn’t share. I wonder if what some people are missing from the older days is not actually “balance”, but rather more “agreement game” techniques, where people try to probe other people’s viewpoints in a more cooperative way. Not saying this should replace arguing and poking holes entirely, just that it should be a greater portion of the kind of conversations people have here.
Deliberately posting this on the CW free thread so that it will remain at the meta-level and not degenerate into yet another rehash of which political side is nicer at the object level.
Is there anything stopping one from making a podcast of supreme court oral arguments in their entirety?
https://www.supremecourt.gov/oral_arguments/oral_arguments.aspx
The files are easy to upload. There would be little to no commentary, just selected arguments to feed curiosity about the court.
I haven’t been able to find anything saying I couldn’t do this. Does anyone know of any reason I couldn’t?
Official federal government records are generally in the public domain, not to mention a matter of strong First Amendment public concern, so I think you’re golden.
You don’t even need to upload the files. Just point the podcast at them.
They already have been uploaded to archive.org. This link IS a podcast that works in my podcast app.
Is there an analogue of the Kohlberg stages of moral development, but for the scientific method? (Epistemology is analogous to ethics in that they are both normative, i.e. they have “shoulds” and “should nots”.)
Pre-conventional
Stage 1 (relational): “The world is round because my teacher said so”
Stage 2 (objective): “I can feel myself that mixing hot and cold water makes lukewarm water”
Conventional
Stage 3 (relational): “These scientists who write books about evolution and quantum mechanics seem a lot more sensible than those other guys who believe in homeopathy or astrology, so I should trust what they say” [NB: The reverse judgement that the astrologers are more trustworthy would still be Stage 3, just done badly]
Stage 4 (objective): “I can find out what’s true by following these rules of the `scientific method’ that I’ve been taught. Let’s see, I need to start by coming up with a hypothesis…”
Post-Conventional
Stage 5 (relational): “There are actually lots of different potentially valid ways to do science, but some of them work better than others. We need to figure out how to structure our procedures so as to optimize results.”
Stage 6 (objective): Identification of universal principles of epistemology, e.g. Bayesianism. [But, “Go Team Bayes, they’re the smartest!” is still Stage 3]
Note that because these stages are acquired sequentially, anybody at a later stage should still have the ability to operate at earlier stages as appropriate. (If the theory is correct then it should not be possible to “skip” to a higher stage without first acquiring at least proficiency at each lower stage.)
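To make the Stage 6 endpoint concrete: the “universal principle” Bayesianism ultimately cashes out as Bayes’ rule, which one can run as arithmetic. A toy illustration (all numbers invented for the example):

```python
# Toy Bayesian update: how much should one observation shift belief
# in a hypothesis? All probabilities here are invented for illustration.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / marginal

# Prior belief that a coin is biased toward heads (P(heads) = 0.8): 10%.
# Evidence: one flip comes up heads (a fair coin gives heads 50% of the time).
posterior = bayes_update(prior=0.1, p_evidence_given_h=0.8, p_evidence_given_not_h=0.5)
print(round(posterior, 3))  # prints 0.151: one heads nudges belief from 0.10 up a bit
```

The point of the stage framework is that mechanically running this formula is Stage 4 behavior; recognizing *why* it is the right formula to run is what distinguishes Stage 6.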
This is probably not what you meant; stage 6 would really involve the development of synthesis tools, not epistemic tools; the scientific method largely limits itself to making synthetic a posteriori inductive claims, which a whole lot of epistemic reasoning goes into justifying in the first place.
Can you unpack your critique a bit more? I agree that the “scientific method” is a narrower topic than epistemology in general, e.g. there are other technical methodologies for discovering truths e.g. the historical method, the philosophical method etc, which are not best characterized as parts of the “scientific method”.
On the other hand I would expect stage 5 and 6 thinkers to be aware of the fact that science has fuzzy boundaries and cannot be completely decoupled from the question of what makes for good reasoning in general. Would you be happier if I had written “more universal principles”? To me, even the attempt to try to find more general unifying principles puts you at Stage 6, and if your proposed unifying principles are kind of bogus (e.g. Popper’s falsificationism) that just makes you a worse Stage 6 thinker.
BTW in my own high energy theory work, most of what I do is technically a priori, i.e. here are theorems (not necessarily rigorously proven) about implications of QFT and GR. These theorems would be valid in a mathematical sense even if QFT and GR didn’t describe Nature, but of course they are more interesting given that they do…
I would say that you should have an intermediate stage. I think 5 should be,
“I should learn which techniques are most robust for analysis and produce the fewest erroneous conclusions”
Followed by a 6 which is,
“I should think critically about the predictive power and fundamental truth value of my findings”
This leads to 7,
“People who reason in certain ways and have certain attitudes towards fundamental truth believe more things that are empirically borne out, so those attitudes are fundamentally correct, and a proposition can only be fundamentally true insofar as it can be reasoned about in this way.”
And 8,
“There are qualitative differences between propositions; different propositions require different methods of reasoning in order to determine their relationship to fundamental truth, and these methods are implied within each proposition.”
As a starting point I was assuming I was trying to match Kohlberg’s 6 stages as closely as possible.
Your proposed set of stages are interesting, but it is a further question whether people actually take this path to getting to your #8.
Also, not being a positivist, I don’t think I agree with your last clause in #7. At least some facts are “fundamentally true” without regard to our own feelings or reasoning in the matter. Arguably, that’s what “fundamentally true” means, at least about non-culturally constructed aspects of reality. In other words, you seem to be sliding between an epistemological claim (people who reason in certain ways get good results) and a metaphysical claim (that’s all we mean by truth). I don’t think the latter follows from the former.
I bet you would enjoy reading https://meaningness.com/ if you haven’t already (and if you can figure out how to navigate it).
Thanks for the recommendation! I’ve previously enjoyed reading some of Chapman’s other pages on Vividness, but I never went through it systematically.
Looking through his Meaningness philosophy, it looks like he would have me pegged pretty much automatically as an “eternalist” given that I’m a Christian. (Not surprisingly, therefore, his list of supposed flaws with eternalism comes across as a little strawmanny to me; as if all the theists he knows are incapable of incorporating acceptance of nuances and ambiguities into their worldview. The sort of Christian who hasn’t read Job or Ecclesiastes…)
So there’s probably a limit to how much I can let this particular “cruel angel’s thesis” into my soul, but there may still be some interesting insights here to explore.
Should the first two stages be switched? Children observe the world around them before they understand what they’re told.
Well, this is the way that seems to match Kohlberg’s stages better. That doesn’t necessarily mean we learn scientific reasoning steps in the same order that we learn morality, but it’s at least a starting point for the analogy.
But to be clear, I was thinking that Stage 2 is something considerably more sophisticated than just raw observation. It takes cultural transmission from adults to be able to look at something that just happened, and express using language a general proposition about how the world works.
Do you think that for morality, Kohlberg got his first two stages in the right order?
I hadn’t read about Kohlberg until just now. Yes, I think he got them in the right order.
I feel like this belongs in Expanding Brain format, with the last stage being “It is neither possible nor desirable for every man to re-derive all of human knowledge. We see further by standing upon the shoulders of giants. I believe what my teacher taught me.”
Ha ha! (or on this thread, Ho ho!)
As I said, the higher stages include the ability to use the earlier stages, including Stage 1, as appropriate. On the other hand, if your elementary teacher told you something that contradicts the scientific consensus, one hopes you’d be able to notice that fact…
More generally, once you have access to a library or the internet, you get to pick your teacher, and you need at least Stage 3 thinking to be able to pick the right one.
Maybe this isn’t the proper place to ask, but does anyone know how income compares to pair bonding in terms of long-term welfare?
That is, how much income would one have to lose due to pair bonding before the pair bond would no longer be of benefit to long-term welfare?
What metric are you using to measure welfare? Life expectancy? Reported happiness?
Reported happiness.
Happiness doesn’t really scale with income. It’s more like a threshold. If you are penniless, you are pretty much guaranteed to be extremely unhappy. Anything above subsistence level is going to be a mixed bag.
@Dack
In the U.S.A. the income satiation point for happiness is higher than subsistence and well above median personal income.
As far as I know, income is strongly connected to happiness, especially within countries, it’s just that there is an upper limit.
@Plumber
Three points:
L. didn’t ask specifically about the US.
The research very much does not agree with itself.
We’re not even looking for a satiation point. We’re looking for a break-even point between income and pair bonding and pair bonding has a much bigger effect on happiness.
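For what it’s worth, the break-even point is easy to compute once you commit to a model. Under the common assumption that happiness scales with log income, a pair-bond bonus of b points is offset by an income loss fraction of 1 − e^(−b/a). A sketch, where the coefficients a and b are invented placeholders rather than research values:

```python
import math

# Back-of-the-envelope break-even: how much income loss would exactly cancel
# the happiness gain from a pair bond? Assumes happiness = a*log(income) + b*paired.
# The coefficient values below are INVENTED placeholders, not research findings.

def breakeven_loss_fraction(a, b):
    """Fraction of income whose loss exactly offsets the pair-bond bonus b."""
    # a*log(income*(1-f)) + b = a*log(income)  =>  f = 1 - exp(-b/a)
    return 1 - math.exp(-b / a)

# E.g. if doubling income is worth 0.5 happiness points (so a = 0.5/ln 2)
# and a pair bond is worth 0.3 points:
a = 0.5 / math.log(2)
b = 0.3
print(round(breakeven_loss_fraction(a, b), 2))  # prints 0.34, i.e. ~34% of income
```

The punchline of the log model is that the break-even is a *fraction* of income, not a dollar amount, so it is the same for rich and poor; whether that assumption holds is exactly what the (disagreeing) research is fighting about.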
Have not read, but there is a trade-off between extroversion and introversion, because the set of tasks people perform requires both at some time. I spend a lot of time staring at spreadsheets, which would drive most people up a wall, and certainly anyone who is extroverted. Ditto tasks like balancing your checkbook or anything else that’s solitary (like any daily chore).
Also, don’t confuse “everyone’s work is of equal value” with “everyone is OF value.” Just because you don’t get to be Patton doesn’t mean your life is a waste. Introverts can succeed, which is demonstrated by, if nothing else, the fact that we don’t have 100 million poor homeless people, which is roughly the minimum number of Americans who are introverts.
Indeed, and I think we’ve lost sight of that today. Rather ironically, the disqualifying factors for being considered a huge success decreased, but we then raised the standards.
Perhaps we traded many local competitions for one big one, where nowadays it is easier to be a huge success, but also far easier to end up feeling completely downtrodden. In the past, a black man might not have a chance at president or CEO, but he knew it wasn’t personal, so if he took good care of his family, he achieved a local maximum.
Nowadays, he can theoretically become president, but the downside of this greater opportunity is that the achievement of being a good provider now gets way less respect, so he is only better off if he is above average. If he is an average guy, he lost status.
I argue that revolvers are obsolete. Who will challenge me?
First and very importantly, let me define “obsolete” as meaning ‘No longer appropriate for the purpose it was obtained due either to the availability of better alternatives or change in user requirements.’
(Source: http://www.businessdictionary.com/definition/obsolete.html)
Somewhat surprisingly, there is a small yet highly vocal faction of gun owners who vehemently argue that revolvers are not obsolete weapons, and that they have important advantages over semi-auto pistols, as well as niche applications that can’t be filled as well by anything else.
Arguments against revolver obsolescence usually hinge on a strict definition of “obsolete” that says a piece of technology only becomes obsolete if there’s an alternative technology that does EVERYTHING better or at least as well, followed up with the observation that revolvers have inherent reliability advantages over semi-auto pistols. I argue that the reliability argument has no real relevance since generic semi-auto pistols like Glocks routinely go thousands of rounds between malfunctions so long as they get regular, basic maintenance.
The only situation where a revolver’s inherent reliability advantage would come to bear might be in a post-apocalyptic world where people went indefinitely without access to the simplest gun cleaning supplies and had to use very old/poor quality bullets. But we are not living in that world.
A problem for the “reliability” argument is that, by the same reasoning, Derringers are also not obsolete, since they are even simpler and more reliable than revolvers: there are imaginable situations where a revolver would malfunction thanks to the cylinder misaligning with the barrel, but where a Derringer would still fire. Likewise, in a post-apocalyptic scenario where you had to use randomly found ammunition of unknown quality and gunpowder loading scattered across the landscape, a revolver would be better than a semi-auto, but in that same situation the Derringer might be better than the revolver, since it would be more robust and easier to fix. If the revolver advocate is unwilling to defend the non-obsolescence of Derringers in light of facts like this, he undermines his own argument, which, I point out again, hinges on a very strict definition of “obsolete.”
Moreover, while there are narrow circumstances under which a revolver would be better than a semi-auto pistol (it can fire multiple shots from inside a pocketbook; if an opponent presses his hand against the barrel of your gun, it would unlock the slide of a semi-auto but not affect a revolver), it’s impossible to predict when those circumstances will manifest themselves, they are surely very rare, and it’s much more likely that the combat situation will favor whoever has more firepower (biggest magazine + fastest reload), which is the biggest inherent advantage of the semi-auto pistol.
A niche application of revolvers that proponents put forth as proof the weapon is still relevant is as a lightweight defensive tool against bears and other big game. I believe that these large revolvers, firing .41 Magnum bullets or larger, have been made obsolete by bear spray. Though bullets of these sizes can kill bears with the right shot placement, in real-world bear encounters, a person’s fear combined with the heavy recoil of a magnum revolver severely hurts accuracy and the speed of follow-up shots. Being able to spray out a continuous stream of noxious gas and create a cloud of it is better, and a can of bear spray is cheaper and weighs less than a big revolver (there have been many documented cases where people used bear spray successfully). If you want to REPEL a bear and are agnostic about killing it, bear spray is the better choice.
And aside from indulging one’s machismo by taking unnecessary risks, there’s no sense in picking a large revolver instead of a long gun to go into the woods and hunt down bears (or any other large animal) for sport. While some people do successfully hunt bears with revolvers, they are vastly outnumbered by peers who use long guns. There are some people who like hunting bears with handguns, and there are also people who like climbing vertical cliffs without any safety ropes.
While it’s true that revolvers have aesthetic and historical merits, and there’s nothing wrong with shooters owning them for personal pleasure or collector’s appeal, the same things are true of muskets, and thus it doesn’t change the fact that revolvers are obsolete.
Are we counting revolver cannons? They still have a few niche uses, since they’re generally going to be lighter weight than gatling guns, and don’t need as much time to spin up.
I didn’t even know those existed.
No, they’re not the type of “revolvers” I was referring to.
States with mag limits.
I was thinking the same thing. You can impose limits on magazine sizes for semi-automatic pistols, but it’s possible to covertly make larger magazines. It’s a lot harder to do that with a four- or six-chamber revolver.
Sure, I’ll take you up on this.
I own a revolver. Here are the requirements that it must satisfy.
-I pull a trigger, bullet comes out
-Concealable
-I already own it
Let’s examine:
‘No longer appropriate for the purpose it was obtained due either to the availability of better alternatives or change in user requirements.’
I haven’t changed my requirements, and no better alternative exists, so it is not obsolete.
Edit to make clear that I am challenging Proyas, and not his Aspect Emperor.
I argue that the main points of owning a handgun (for most people) are a) recreational shooting, and b) deterrence in a personal defense situation; a revolver fulfills either perfectly well.
I’d like to add that no semi-automatic handgun can load .410 shotgun shells next to .45 LC, which is a point in a revolver’s favor. Sure, you don’t need it for bear, but home invaders can wear body armor too.
Revolvers aren’t much more reliable than semi-autos anymore, not with known-good ammo. They’re much less picky about loadings, though, which mostly doesn’t matter but does let you do some neat stuff like firing .38 Special and .357 Magnum out of the same gun. They’re inherently more single-shot accurate than short recoil pistols, thanks to the fixed barrel; not necessarily more than blowback, but then you’re limited to fairly low-energy rounds. On the other side of the spectrum, you can build a revolver much lighter than a semi-automatic if you want to be able to fire magnum loads. And having a grip that the magazine doesn’t go through can be useful for shooters with small hands.
On the other hand, they’re less concealable than semi-autos of comparable power, they have stiff recoil, high bore axis, and heavy trigger pulls that make them less user-friendly and worse at placing follow-up shots, they usually have lower capacity, and they’re a lot slower to reload unless you have speedloaders or moon clips and extensive training. These limitations mean that on practical grounds, I’d only seriously consider buying one new for handgun hunting or predator defense (yes, bear spray’s a better first option, but I’d like to have a backup). But some people are into the cowboy gun thing for aesthetic reasons, and that’s fine too.
Semi-auto pistols with fixed barrels are available.
Yeah, but they generally use blowback actions, which are limited to low energies and light bullets. .380 ACP or 9×18 Makarov is about as far as you can push it in a pistol form factor, unless you’re willing to compromise with a hilariously heavy bolt (which, to be fair, Hi-Point does, but you don’t want a Hi-Point).
Usually full-power pistols use short recoil actions, which implies a moving barrel mechanically separated from both the slide and the frame and therefore from the sighting system. There’s a few oddball exceptions like the Mauser C96 (which also uses short recoil action, but lacks a slide and so has its sights mounted directly to the barrel), but they’re rare and usually come with compromises elsewhere.
They’re inherently more single-shot accurate than short recoil pistols, thanks to the fixed barrel;
But how big is the accuracy difference, and is it big enough to make any practical difference? Again, it’s possible to conjure exotic scenarios, but they won’t be encountered in real life so what does it matter?
They’re much less picky about loadings, though, which mostly doesn’t matter but does let you do some neat stuff like firing .38 Special and .357 Magnum out of the same gun.
What is the practical benefit of this versatility?
.38 Special +P is identical to 9mm, and 9mm is considered the optimal pistol round for defense against humans. That being the case, why would you need to also shoot .357 Magnum bullets? .357 is overpowered for humans. It might be suited for hunting some types of animals under certain conditions, but that just brings us back to a point I made in my OP: handguns are inherently poor weapons for hunting because they aren’t accurate enough.
So yes, the same revolver can shoot .38 and .357, but so what? How does that give you any advantage over a revolver that can only fire .38?
On the other side of the spectrum, you can build a revolver much lighter than a semi-automatic if you want to be able to fire magnum loads.
Fair enough. I didn’t think of that.
And having a grip that the magazine doesn’t go through can be useful for shooters with small hands.
With VERY small hands.
It means you can buy a single revolver and load it with .38 Special (which is cheap and not as hard on the wrists) for practice or ordinary home defense, then load it with .357 Magnum if you expect a possible need to defend yourself against predators or humans in light body armor, or, for example, humanely kill large animals. If you bought two guns (say, a Glock and a shotgun) you could cover both of these niches better, but if you’re only going to buy one gun it’s a reasonable choice.
As to handgun hunting, I agree that in most circumstances a rifle or a shotgun is a better choice, but people do it, and I’m not going to tell them they can’t just because it offends your sense of aesthetics. With the right ammo and enough practice, apparently, you can make shots out to a couple hundred yards with a scoped revolver, which is about where round-nose bullets out of a .30-30 lever action (probably the most common American deer rifle) start getting inaccurate too. And for some types of hunting, like in dense brush, you just don’t need that much range in the first place.
It means you can buy a single revolver and load it with .38 Special (which is cheap and not as hard on the wrists) for practice or ordinary home defense, then load it with .357 Magnum if you expect a possible need to defend yourself against predators or humans in light body armor,
Why would you need to shoot someone who was wearing body armor? If such a person were assaulting you, the odds are 99% they will be a police officer, in which case you shouldn’t be shooting a gun at them.
or, for example, humanely kill large animals.
That requires some caveating. The .357 is barely powerful enough to kill a deer, meaning your shot placement must be excellent to compensate. That means your revolver will need a long barrel and ideally a sight of some sort. Those requirements in turn diminish the revolver as a home defense weapon since it’s longer and clumsier than it needs to be for shooting a human trying to break down your door. The added bulk also makes it unsuitable for concealment.
If you bought two guns (say, a Glock and a shotgun) you could cover both of these niches better, but if you’re only going to buy one gun it’s a reasonable choice.
I think a better choice would actually be to just buy a pump-action shotgun. It’s great for home defense against humans, can penetrate body armor better, and is a better hunting weapon. Depending on which type of ammo you use, you can hunt flocks of birds or deer. Such a shotgun wouldn’t be any more expensive or lower in magazine capacity than a .357 revolver.
If the choice is to live another day (probably literally) as an outlaw or be killed by the police right then and there, I’d choose living. Standing there and being killed because it’s the cops doing is required by law, but not by morality, IMO.
And it turns out ordinary criminals sometimes wear body armor too.
You know you can just buy body armor, right? I don’t really expect many criminals to, but I don’t really expect many criminals to be interested in a gunfight, either.
When I said “humanely kill large animals”, I was thinking less “deer at 50 yards” and more “hog at 3 feet, which you are going to cook for Christmas dinner”. A friend of mine has a funny story about a large hog and an underpowered gun.
There have been home invasions in my country with the criminals disguising themselves as the police, where they also wore bullet proof vests.
However, I would advise against shooting at people who look like cops.
I’d go with obsolescent, but the combination of niche advantages plus a broad range of applications where their disadvantages are minor will keep them in the running for some time to come. Others have done a fair job of addressing some of the niches, but I will add one that I haven’t seen but does still matter: Revolvers are easier for the mechanically disinclined to develop justifiable confidence in. For example, people really still do the thing where they accidentally shoot someone with a semiautomatic pistol because they removed the magazine and could not understand that the weapon might still be loaded. These people are not irredeemable idiots who are never to be trusted with the means to defend themselves; they can do just fine with a revolver whose mechanics are obvious and visible.
They also aren’t in the bubble of most people who post to SSC, but they do exist.
I’m skeptical of your claim that people with that poor knowledge of guns should own them. It seems like you’ve raised a self-negating point, if it could be called that.
I did not say that people with a poor knowledge of guns should own revolvers. I said that people with poor mechanical aptitude might benefit from doing so. These are two different things, and knowledge is not the right tool for this job.
Also, what alternative do you propose? If the answer is that nobody should be allowed to have a gun unless they can master the semiautomatic pistol, I see that as the same sort of elitist snobbery that would say nobody should be allowed to drive a car unless they can use a manual transmission, and just no.
If “obsolete” means “there’s no reason you should ever buy a revolver instead of a semiautomatic pistol” then I disagree. Revolvers have several advantages.
The first and probably least important is that they can fire more powerful rounds. While it’s possible to build an autoloading pistol that will fire “magnum” rounds, or even a “pistol” that fires rifle rounds, these involve considerable tradeoffs in size and reliability. Also, I simply disagree that bear spray is always better than a firearm, mostly because a fistful of pistol rounds will repel more wildlife than a can of bear spray. And a large revolver is still lighter and smaller than a small rifle.
Revolvers also tolerate much more variance in ammunition than semiautomatic pistols. This usually isn’t that important, but does mean that you can use one firearm for several different purposes more easily.
The most important factor I think is that revolvers require less knowledge to operate. If a semiautomatic handgun malfunctions, fixing this requires both hands and quite frequently at least a bit of informed reasoning about the workings of the weapon. If you pull the trigger on a revolver and it doesn’t go off, then you pull the trigger again. If it still doesn’t go off, reload. That’s it. The manual of arms for semiauto pistols is not brain surgery but for an owner who can’t or won’t get intimately familiar with his weapon a revolver is by far the superior choice.
Hammerless revolvers are also in my opinion a better choice for extremely confined spaces where the slide on a semiautomatic pistol might not have room to operate (such as, you might need to fire from your coat pocket).
These are admittedly niche applications, aside from their use by people with minimal knowledge of firearms, which I think is fairly important. However, they’re enough to justify the continued existence of revolvers outside of a purely recreational context. If .45-70 has survived this long on the basis of reloadability (and cowboy fans), then they’re not going anywhere.
“Tap, rack, bang” isn’t complicated and doesn’t take much knowledge of the weapon. Does take two hands, though.
You can’t fix a double feed that way, but I’ve never actually seen one in the wild. Squib loads might be a bigger caveat, but again they’re not something that occasional shooters are likely to run into, and frequent shooters should be trained in what to do with “BANG BANG pew” (which is, immediately remove the magazine, field-strip the weapon and take it to a gunsmith).
It’s still harder than “just pull the trigger again”. Not just physically harder, but easier to screw up by e.g. not pulling the slide back all the way or riding it forward. It requires more drilling, with ammo or snap caps, to get correct.
If someone decides they need a gun but can’t or won’t do more than “here’s how it works and a box of ammo, maybe fire ten rounds at a target” then a revolver is the way to go. I don’t endorse this attitude at all, but I’ve certainly encountered it and if that’s how it is then revolver > semiauto.
Tap-rack-bang fixes a smaller fraction of semiautomatic pistol failures than does a second trigger pull on a double-action revolver. And it requires a mental gearshift under extreme stress, which if you haven’t specifically trained for it is another distinct failure mode. Pulling the trigger again, is what you were planning to do anyway if the first shot didn’t work.
What’s the word for a more complicated thing that is harder to use and achieves the same results?
Because my single-word response is: Clips.
I despise loading clips.
More, in response to the notion that more firepower is better – if I -need- more than six bullets, my failure began long before I reached the point where I ran out of bullets. The pain-in-the-ass factor that clips are to me far exceeds the potential value of being able to shoot four extra rounds. I’m not a mall ninja, if I’m shaking too bad to hit the first three targets in the first six shots, odds are I’m not going to be able to get the clip in without dropping it anyways.
…
It is literally the worst linguistic error you can make.
Go forth, and sin no more.
The upside of mags is that all that tedious loading happens before you need the bullets rather than in the middle of a firefight.
For range days, I usually just load 10 rounds into 15 round mags. It’s just those last two or three that are always a pain if your mag has tight springs. They do make loading aids that can help but it’s mostly a rhythm / familiarity thing to get efficient at it.
I don’t need more than six bullets. And if I do, I really want a quieter weapon, or a much, much louder one. Reloading is something people do in movies. If I would need to reload, I’d rather be using that time getting closer and biting someone’s face. (Would you keep fighting somebody biting faces? I think not.)
Well I certainly wouldn’t just stand there and let you.
You are assuming an infinite, or at least adequate, amount of spoons/executive function/mental energy here. As someone who struggles with the basic maintenance necessary for daily life, I have often thought that if I ever get around to buying a gun I will end up getting a revolver for concealed carry and a revolver shotgun for a long gun, precisely because the last thing I want is another fucking chore in my life.
Honestly the amount of time/effort/manual dexterity to give a Glock or other modern polymer framed semi-auto a perfectly adequate cleaning / lubrication is basically the same as required for a revolver or pump gun (actually the pump gun is probably harder). My Springfield XD takes all of 5 minutes to clean after a range session. Plus gun solvent has an alluring aroma.
With modern firearms firing modern ammunition, you don’t need to clean and lubricate after every range session, and it isn’t actually recommended to do so except insofar as “every range session” is a useful Schelling point for people who aren’t going to keep a log or whatnot. For a Glock with the use model “500 rounds in the first month learning to shoot, then it stays in a locked drawer except for 50 rounds of target practice once per year,” it isn’t terribly unreasonable to have a cleaning schedule of a: once after that initial learning period and b: once when your grandson inherits it.
If you’re going to carry regularly, you do have to consider the carry environment as much as you do the shooting schedule.
Or, if you have a revolver, your cleaning schedule is “Hey, the cases are sticking and are hard to get out after firing and hot, it is time to clean.”
Which is to say, your gun tells you. And annoys you until you do it.
@John – sure, my intent was just to show that even the most rigorous maintenance schedule a gun owner who likes their guns put away clean would follow is hardly a burden at all.
@Thengnskald – sure, but a Glock can do basically the same. You could put at least thousands of rounds through it before any chance of failure due to lack of cleaning.
You are assuming an infinite, or at least adequate, amount of spoons/executive function/mental energy here.
There are hundreds of thousands of not-terribly-bright people in modern armies who learn how to use semi-auto handguns and carry them around safely for years. I’m sorry to say this, but if you can’t grasp the mechanics of how such a weapon works (I’m not talking about the fine motions inside, like springs (de)compressing and sears engaging), then you probably shouldn’t own any guns at all.
a revolver shotgun
I’ve never heard of such a weapon. Wouldn’t a double-barrel or single-barrel shotgun be even simpler and cheaper?
Ah, so it is elitist snobbery. Again, no. Also, go talk to some combat veterans about their experiences with semiautomatic pistols. Armies achieve reasonable handgun safety records in peacetime through the precaution of basically never allowing anyone to chamber a live round except on a target range under expert supervision. Sometimes with an exception for MPs and the like, but there aren’t hundreds of thousands of those. In wartime, ADs are disturbingly common and pistols are one of the biggest culprits in spite of their limited role in military operations.
They’re better for playing Russian Roulette with.
Playing devil’s advocate, I’ve heard people reference statistics where a small % of soldiers make the majority of kills in combat, and this would imply that mindset is the most important thing, which in turn might imply that mere coolness (insofar as it might shore up a person’s mindset) could be more important in some cases than technical efficiency.
In terms of ground combat, the majority of kills are made by artillery, and I don’t know how you’d measure which infantryman is responsible for who in a meaningful way. You might be thinking of SLA Marshall’s studies, most famously Men Against Fire. The main debate about his work these days is how much is semi-honest mistakes and how much is outright academic fraud.
That said, it’s definitely true for fighter pilots (or was during WWI/II), and I’d be kind of surprised if it wasn’t at least somewhat true for ground combat, too.
The 80/20 rule holds for most things like this.
I took my wife to the shooting range to try out and practice pistols.
She finds her .38 revolver easier.
Too long experience with plumbing fixtures and heating systems has taught me to trust proven over new (and as we used to tell each other at the motorcycle shop, “Never buy the first model year, especially Kawasakis”).
Plus, dirty conditions away from cleaning supplies are much of the world, not a “rare” case.
We’re keeping the revolver.
Sorry guys, but after reading all the responses (thanks), I’m still convinced revolvers are obsolete. The only niche where they might be the optimal weapon is for someone with very small hands who wants a concealed carry gun that they plan on firing from inside a purse or large coat pocket, and who maybe also needs a very simple weapon because they’re intimidated by the “complicated” mechanics of a semi-auto. Overwhelmingly, I think the people in this niche would be females worried about their personal safety. No need for magnum bullets, either.
@proyas,
All the “niche” reasons you list match my wife.
Christmas is nearly upon us – what are you planning for your big holiday meal? Any traditions that you’d like to share?
My wife and I (+baby) will likely have a Christmas lunch of prime rib. I’m still debating what sides to cook along with it, though I’m having another go today at roast potatoes to see if I can make some worthy of Christmas dinner. My dad always made excellent roast potatoes. We might invite some friends to lunch as well, as they just had twins and probably won’t want to travel or cook for themselves.
My wife is from south Texas, so we’ll be getting in tamales for Christmas eve, as was tradition in her family.
On potatoes – my girlfriend’s family dices them up and roasts them after mixing with olive oil and, of all things, pre-packaged dried onion soup mix; they’re absolutely delicious and almost literally couldn’t be simpler. I just add some fresh rosemary about halfway through the roasting process (~400F for ~45min).
I gave Kenji’s recipe a try, and was pleased with the results for a first time go. They were a bit dry, but I think that was my fault for not being ready with the rest of dinner when I needed to remove them from the oven. They weren’t as good as my dad’s, but I’m pretty sure his secret is goose fat, which I don’t have access to.
I might try your suggestion when I don’t want to go through the effort of parboiling them though.
Raclette is very popular during the holiday season here in France (the modernised version that uses a special tabletop electric grill). Other common holiday foods include smoked raw salmon, raw oysters, foie gras (liver paste made from a force-fed duck or goose; highly unethical, expensive, unhealthy, sounds disgusting, and is delicious), some manner of roasted poultry (depending on the number of guests: chicken, capon, or guineafowl — a whole turkey is much rarer) with various casseroles of mashed vegetables (potatoes, carrots and celery are common), and yule log (either the cake version as described in the link, or an “iced” version mostly made of ice cream).
A couple of years ago I was on holiday in Berlin in December. The raclette sold at the Christmas markets there was really great. Between that and the Glühwein I had a pretty amazing vacation.
There is a Raclette street fair vendor in Seattle.
Yum
If you are going to have a rib roast, you should have yorkshire pudding with it.
Ahhhh, Christmas dinner. Depends on where we go.
Our Friends Christmas typically has lasagna, prime rib, mashed potatoes, and a veggie of some sort. This year my wife made spinach lasagna rolls, and we had no vegetarians or anyone keeping kosher, so I made a mashed potato casserole with breadcrumbs and bacon on top. Veggie was braised cabbage, which is stupid easy, and everyone loved it.
We also have appetizers and desserts from friends, whatever they want to bring.
My Mom makes molasses cookies and beef bourguignon. It is delicious, but we don’t always go there. If we go to my sister’s, we typically get a roast of some sort, some lemon mint red potatoes, and Caesar/Greek salad.
My In-laws go out to Olive Garden. I am fine with this, as at Thanksgiving:
-I am pretty sure the stuffing was undercooked, since they did not use a thermometer
-They did not serve dark meat
-The store-bought gravy was not fully mixed
-The turkey was shoe leather tough
-Every vegetable was undercooked
-Sweet potatoes had no brown sugar
-“I spent 12 hours on this 7 layer salad!” No, you made jello. Jello is gross. Just because you made jello 7 times does not mean you cooked. It means you made 7 poor life decisions.
I might get pulled into cooking for their extended family Christmas, but I really need more details. If I do, I might make vegetarian lasagna and a massive, massive amount of stew (though they will likely hate my stew, which will include both thick cut bacon and enough red wine to slosh the entire house).
Seconded on the prime rib, paired with some excellent mac-and-cheese and crescent rolls (plus various vegetable dishes, of course).
Christmas Eve we do a feast of the seven fishes, and then Christmas Day is standing rib roast.
Two years ago we cooked a goose. I always wanted to do the “Christmas goose” thing. It was really good. Kind of like duck. But it’s a pain in the ass to get a goose and expensive. Was nice to do once though just so I can say I’ve done it.
The tradition is to make dumplings/potstickers/gyoza/mandu, but that’s on the cards for every holiday. You can never have enough dumplings!
Does anyone know of any good research around (or persuasive arguments for/against) companies that stop tracking sick time? I’m not talking about the policy some employers have of not tracking any kind of time off at all (which there were a bunch of articles about the pros and cons of a few years ago). I’m talking about a more limited policy, where people still get 2-4 weeks per year of vacation, but staying home sick doesn’t affect that time.
I imagine it would come down to the upside of “people don’t come to work sick and get everyone else sick” vs the downside of “people might abuse it and stay home a lot more.” I think/hope it will be a good thing on net, at least in a professional setting where people have careers and not just jobs, and I’m considering trying to get it enacted at my company. But I also don’t know much about it.
I don’t know about research, but I do know that many tech employers (especially in the Bay Area) have stopped tracking time off altogether.
They don’t specifically say the reasons AFAIK, but I think I know the answer(s). Attracting talent is job one for the companies. Time is actually not fungible with money. Asking someone to go from 4 weeks of vacation to two weeks (the normal “starting” vacation amount) is a hard ask. This makes it hard to attract people to switch to your company.
So why don’t they just give all employees 4, 5, even 6 weeks? Because all the unused vacation is accrued as a liability on the company’s balance sheet, which they do not like at all. This is why most companies have caps on the amount you can carry, even though anyone losing vacation at the end of year is usually very angry about it.
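As a rough illustration of why that accrued vacation is a balance-sheet liability (hypothetical numbers and a deliberately simplified model — real accounting also handles payout rules and pay changes), the liability is just unused days owed at each employee’s current pay rate:

```python
def vacation_liability(employees):
    """Accrued-vacation liability: unused days owed at each
    employee's current daily pay (hypothetical, simplified model)."""
    return sum(days * daily_pay for days, daily_pay in employees)

# 1,000 employees each carrying 10 unused days at $400/day
# puts a $4,000,000 liability on the books, and it grows with
# every raise, which is why companies cap carry-over.
print(vacation_liability([(10, 400)] * 1000))
```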
Yeah, a lot of people seem to think the “don’t track time off at all” model perversely disincentivizes people from taking vacations due to the fear of being seen to be among the most frequent time-off takers. And it would be off the table for us anyway. So I’m restricting myself to just sick time, and leaving vacation alone, as a half measure that would do some good and be more likely to get implemented.
I assure you that the major reason for “unlimited PTO” is not attracting talent, but rather not having an accrued vacation liability on the balance sheet.
That doesn’t jibe with the fact that it’s primarily tech firms that are driving this. There are other jobs with time pressures that end up with people having large PTO balances (medicine, for instance) and they aren’t at the forefront of this.
It allows them to tailor the amount of time off each employee gets without creating the liability on their books. If they gave everyone two weeks, those who want more would look elsewhere. If they gave them all 4+ weeks, they take on a huge liability. Letting them take the time they want then does become a selling point, at least for those who don’t want a low limit.
“Unlimited time off” also sounds really generous, without costing much if nearly everyone takes a really conventional amount of time off.
My impression is that most people aren’t all that keen on extra vacation. I can only remember one person bitching about having less than they wanted, and that was in the context of slightly different policies across countries. The Canadian subsidiary gave three weeks paid vacation while the Americans got four. Our manager said he simply was not allowed to give anyone a fourth week, as a matter of high-level policy, but he could offer an extra week of unpaid leave.
Well, duh, not openly. This is America, you can’t say things like that and expect to be successful. Gotta be a team player and a go-getter. Spout off about wanting more time away from the loving embrace of the company and management will give you a parental disappointed look that blocks advancement or gets you sacked.
This ethos feeds into itself and combines with the stratified economic conditions. The people who desperately need a break can’t afford it. The middle class have to keep up appearances – and a bunch honestly believe the rhetoric that vacation is for unambitious sissies. And the elite put in the work to stay in business but can go on golf trips or whatever whenever they like.
Something something “nation of temporarily embarrassed millionaires”
Non-American here. Could I confirm that this means that US companies normally take sick days out of your vacation allowance?
Depends on the system. Some companies use dedicated vacation and sick days, and might possibly tack on other days as well. I believe it’s up to state law and company policy to validate if you are truly sick.
Some companies used a combined “PTO” system, where you can use your personal bucket of days for whatever reason you like (sick or vacation).
Thank you. I have Opinions about this but I suspect that’s straying into territory that’s against the house rules of these threads. I was completely unaware of that fact until now.
Honestly I kind of like the PTO system and don’t get the gripes about it… from what I’ve heard most places that do “sick days” either get wildly abused (because the sick days are use-or-lose) or require annoying hoops to jump through (I feel like crap, I don’t want to have to go to the doctor for a note just to say I have a cold and shouldn’t be at work).
Add up however many sick days you were going to give me and add them to my PTO, and I’m happy.
Then again this is a “high trust” workplace where the company is pretty cool about you working whenever as long as you bill enough hours and are there most of the time in core business hours.
“Sick Days” might make more sense in low trust or more rigidly scheduled jobs (where “sick days” are the only allowed days off without substantial advanced notice). But sick days are disruptive, even if they are fake, and “use-or-lose” sick leave in a separate bucket from regular PTO practically guarantees a lot of fake sick days in the 4th quarter.
This is generally the rub. A generous company will make the PTO bucket larger than the previous vacation bucket. The version that draws howls is when they just relabel vacation as PTO and you have to draw your sick days out of it.
I’ve gone through a change from Vacation/Sick to combined PTO and heard anecdotes from others as well. Typically the combined PTO has 1-2 days less per year than the separate vacation/sick, and the transition is used as an opportunity to reduce the cap as well.
My company just announced a transition from separate sick/vacation to a PTO pool. They seem to have handled it pretty well. Everyone has the same number of days they did before (except 1st year employees, who got 2 more days, but that’s a rounding error). What they did is cap accrual at 1.5 years, where it was previously 2 years of vacation (comes out about the same in most cases) and something weird with sick leave that let people pile up a lot of days. (I think we got to keep half of any we had left over from the 10 we got each year, but don’t quote me on that.) That’s no longer possible. This annoyed people who were getting close to retirement and looking forward to a big payout from that (they do get what they already have, but nothing new), but I’m so far away, I don’t care. And we have more flexibility now, which I’m OK with.
My company has been good about allowing Sick days to include preventative medical appointments, or family medical appointments, or just “I feel like crap” days.
I work for the U.S. federal government, and they track vacation days and sick days as separate things. I’ve worked for private companies that do the same, and I’ve worked for private companies that count vacation and sick days as the same thing.
No one has ever asked me for proof that I actually had a medical reason to take a sick day, though I’ve had to sign things saying so.
It has been decades since anyone asked me for documentation proving I was sick. That even applied to a major surgery I had a while back. I told my boss when it was scheduled, she said “I hope everything goes well, see you in four weeks,” and nobody ever asked for any kind of documentation or anything.
ETA: Paid sick leave which I’d accumulated over many years.
The quick and dirty answer is: Every company does it differently. Nothing is guaranteed.
One more data point: Aerospace Corporation places no cap on the number of (paid) sick days, but anything more than five consecutive business days sick turns into “short-term disability” with doctors and bureaucrats talking to each other about how this is going to be handled. I assume HR would have some trick up their sleeve to deal with someone who consistently gets a four-day cold every week, but that’s not really a thing that comes up here (or I expect in most other high-trust white-collar jobs).
We’re also pretty flexible about allowing people to work from home on days when they are suffering something contagious but not debilitating, and/or watching over sick family members.
Do you also have unlimited/untracked vacation time? Or do you have a set amount of vacation (4 weeks or whatever) paired with unlimited/untracked sick days, with strings attached for lengthy or numerous absences?
If the latter, that is exactly what I’m hoping to propose, and I would appreciate any source you can find or hear about that talks about the pros/cons of it, or any study about it, or even just your impression of whether it’s generally considered to be an unalloyed boon.
(I should add that this is also a white-collar high-trust kind of place, and the point of doing this is that we are way more worried about an employee leaving because our benefits aren’t good enough than about an employee abusing the system to get extra pto)
Three weeks vacation for new hires, four weeks after ten years, plus the usual Federal holidays. Unlimited vacation is still a fairly rare thing.
The company I work for has unlimited vacation. Everyone takes very conventional amounts of time off, typically three weeks or so. I suspect if I asked around I could find someone who takes four.
Habits and ideas about what is normal change slowly.
Heh, in -theory- we have it, but it was delivered to salaried team members with the unofficial message “And if you take more than conventional amounts of time off There Will Be Questions Asked, Buddy”.
A few more data points (one person, multiple jobs):
-big box retail store, entry level, circa 2010-2011: one week of PTO a year but only after the first full year of employment. Can use PTO time to get paid for calling in sick but it will still be tracked for discipline per the attendance policy unless you have a doctor’s note.
-Casino, wage (hourly pay, overtime and labor laws apply): 10 days of PTO the first year, 15 days years 2-5, 20 days years 5-10, 30 days for employees with over ten years of employment. Up to 40 hours of PTO can be banked and carried over to the next calendar year, and once you have 40 hours banked you can trade up to another 40 hours on top of that for a one time cash payment once per year. The rest is use or lose. You can use PTO time to get paid for calling in sick, but it will still be tracked for discipline per the attendance policy. 2-5 day absences with a doctor’s note will be treated like a single day. Anything over five days requires medical leave of absence paperwork to be completed with Human Resources and doctor’s notes. Due to American privacy laws, HR bans supervisors from asking any questions about an employee’s health and discourages them from allowing employees to tell them anything.
-Casino, Salary, “Back Of House” (meaning most duties are in an office rather than on the floor with customers. fixed monthly pay, most OT/labor laws do not apply): “Flex Time Off”. No minimum or maximum amount of time off set. Instead, expectation is that it will not be abused and that all tasks will be completed in a correct and timely manner, and that senior personnel will track usage and discipline any team members abusing it or failing to manage their work load. This has translated into a rough guideline of “If you use more FTO time than an hourly employee with the same number of years of employment, you better have a good personal reason like family death, serious illness, etc.”. There is no attendance policy as such, but in most departments it is expected that unless you use FTO to cover a sick day that you will work the extra hours to make it up in that same two week pay period. It is also expected that 45-50 hours a week is the minimum salaried team members should work, and questions will be asked if you aren’t working that, even if you are getting all your work done. Same with schedule flexibility: In theory and on paper as long as you get your 45-50 hours in a week, handle all your duties, and are available whenever you’re needed for meetings/questions you can work whatever hours you like, but in practice it will be held against you if you are not in the office no later than 8-9AM.
Are sick days paid?
In Spain, the first three days of sick leave are not paid (so they discount that time from salary). After that, you get 60% (company pays for a couple of weeks, after that Social Security steps in).
Vacation time on the other hand is paid fully. Why would they be tallied together? How does it work?
That seems backwards.
Unpaid up front discourages frequent use and fraud, while having pay kick in for longer periods helps employees who have legitimate needs. Most people only want to call off for one day at a time, so making those days unpaid reduces how often that happens unnecessarily.
Short Term Disability plans typically do something similar, with a period (often 1 week) of unpaid time off before benefits start.
That sounds like the opposite of how people work to me. A short period of unpaid leave means people who abuse the system (i.e. claim lots of days) get far more out of it than people who only use the system when needed (i.e. a couple of days a year). It also pushes people to come in while sick, especially late in the year if they haven’t used any to date.
@baconbits9
I think you might be slightly misunderstanding. Under this system, if I call out sick for 2 consecutive days 10 times in the year (20 days), they would all be unpaid. This is the type of behavior I assume you mean by abuse.
If I call out for 4 consecutive days twice (8 days) I would have 6 unpaid days and 2 paid days (one day per 4 day absence).
It does push people to come in during brief illnesses because otherwise they won’t get paid, but it does not encourage abusing the system by calling out sick all the time, because the 3 unpaid days are per absence, not cumulative for the entire year.
I’m not saying that it’s a good system, as it has its pros and cons. It does push people to come in while sick, but for any system intended to reduce abuse, that’s going to be a fundamental feature. Deciding which policy to use will depend on company culture and such. Acymetric is also correct that the system appears to reset after every consecutive set of absences, such that there is no incentive to take off multiple additional times. That feature actually should significantly reduce the number of days that an abuser might take, since it will be weighted towards unpaid days. Someone abusing a longer leave in order to get more days paid will only be getting 60% of the wage, plus they need to deal with whatever attendance policy exists. Most companies are going to want to know why you are out and when to expect you back if you are taking more than three days in a row.
Usually, the unpaid days are for cases of the flu and such, when people frequently don’t even bother to go to the doctor. The worst part of the flu usually doesn’t last more than a couple of days for a reasonably healthy person.
For a longer stay at home, it is quite reasonable for the employer to expect that if a disease is lasting more than three days, you will go to the doctor, and while they check what is wrong with you, they also give you a slip confirming that you were indeed sick.
For long term sick leave you need to go through a medical tribunal.
Yes, this system does encourage people who have the flu to go to work. It is not perfect, but I am not sure what would be the better way.
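A minimal sketch of the Spanish pay rule as described above (the function name and flat 60% daily rate are my simplifications; in reality payment is split between the company and Social Security, and long-term leave works differently):

```python
def sick_pay(days_absent, daily_wage):
    """Pay for ONE continuous sick absence: first 3 days unpaid,
    remaining days at 60% of the daily wage (simplified model)."""
    paid_days = max(0, days_absent - 3)
    return paid_days * 0.60 * daily_wage

# The 3 unpaid days reset per absence, so two separate 4-day
# absences each pay for only a single day at 60%.
print(sick_pay(4, 100.0))
```

This matches the arithmetic in the thread: ten 2-day absences (20 days) are entirely unpaid, while two 4-day absences yield 6 unpaid days and 2 paid ones.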
That seems strictly worse than the “Sick leave comes out of PTO” policy, because you get “punished” (with lost pay) for getting a cold.
With a PTO policy, you really can’t abuse it (except insofar as your spur-of-the-moment leave disrupts the office). You don’t lose money if you get short-term sick. Most people end up banking some PTO every year anyway (or spending it frivolously because they are at the cap). And, at least at our place, you’re welcome to take unpaid leave voluntarily if that’s your thing.
If I were under the “first 3 days are unpaid” policy, and had paid vacation available, I’d be pissed I couldn’t use my vacation days to get paid for my sick day.
I don’t think you can have a policy where sick days are discounted from PTO days for all jobs.
School teachers have pre-defined days when they have to take vacations – whether they want to or not. So if they take days off, this disrupts the workflow in the school (in Spain, there is a policy of not calling substitute teachers until there has been an absence of three days).
Shift workers in factories have very pre-defined shifts. You can’t just take a vacation without informing them in advance, because this is very disruptive. My dad was a shift worker, and he frequently just changed shifts with other workers to avoid disrupting work flow.
People accumulating vacation days is not something that happens in those jobs. They have vacation days that are scheduled and set.
This system benefits those who get a serious, long disease that lasts weeks. Yes, people who get the flu may lose a couple of days of income per year. But that probably isn’t that much.
@ana53294
That is definitely not true in the US. In fact, I would guess that accumulation of untaken time is more likely among shift workers in factories because they are more likely to have their vacation requests denied (thus forcing them to accumulate, essentially). This matches both my intuition and my personal experience doing shift work in a factory (which probably influences my intuition). Shift workers still get vacation time and get to choose when they take it (or not take it), their vacation days are not pre-set by the company. Your point works much better for teachers, which are definitely an unusual case with regards to time off in a lot of ways.
Now, the “first couple sick days are unpaid” would definitely be a good incentive for shift workers not to call out if that is the policy, but you are more likely to find companies that have “sick time” and “vacation time” or a general “PTO pool” than to find the unpaid time off implementation in the states.
That said, there are all kinds of policies you can put in place to avoid people who abuse the pooled PTO. Just because sick time and vacation time are lumped together doesn’t mean it becomes acceptable to call out once a week. Same goes even for if you have a separate “sick time” pool.
I can only talk about how it works in one company (Firestone), because a lot of these things depend on collective agreements.
Shift workers ask in advance, and they are generally given vacations in the summer, although not necessarily when they prefer (the different workers take turns to get summer vacations between July and September). The company sometimes denies your vacation during the year if there are too many workers who are on vacation and they need you.
But vacation time cannot be accumulated, and it has to be taken. So, if you still have 4 weeks of vacation left by the end of the year, they don’t deny vacation days in December. For this reason, December is a slow time in the factory (and they close completely on Christmas and New Year’s).
Very rarely are there vacation days not taken; in that case, payment is arranged, or days are passed to the next year (the latest you can take them is January of the next year, anyway). In any case, this is avoided as much as possible.
So basically, if they haven’t given you the days during this year, they have to give you the extra money this year, and they can’t delay the payment indefinitely. So this extra money would compensate any unpaid sick days anyway.
I do know office workers who accumulate vacation days from year to year and don’t get the money or the days (until they get fired, when they take all the vacation days). So yes, this system may not be the best for office workers.
I am not saying this system is perfect; but it is not that bad.
In general, sick leave is paid for salaried employees, and it may or may not be paid for hourly employees depending on the employer’s policies. Policies vary quite a bit, since in most US jurisdictions, sick leave is only lightly regulated.
There are three brackets of what might be considered “sick leave”. What most Americans consider sick leave is just for brief absences (something like 3-10 days per year total, with more than a couple days in a row often requiring documentation). For longer absences, there’s short-term disability/FMLA leave (a few weeks to a few months), and then there’s long-term disability (semi-permanent).
For illnesses (either the employee’s or an immediate family member who requires the employee’s assistance) lasting longer than standard sick leave, there’s a federal law (the Family and Medical Leave Act, or FMLA) requiring most employers to offer at least 12 weeks of unpaid leave for qualified employees (i.e. you don’t get paid but you have a job to come back to at the end). Some states also have a short-term disability program which provides some money if you’re on FMLA leave or if you’re unemployed due to illness, and some employers offer private short-term disability insurance as a fringe benefit.
There’s a federal program for long-term disability for those who can’t work due to illness for over six months, as part of the Social Security system. And as with short-term disability, there are also private long-term disability insurance plans that some employers offer as a fringe benefit.
My company gives everyone the same amount of PTO, and they make it relatively huge (8 weeks). This works really well as it turns out in our industry/ecosystem. You are required to cash out all unused vacation every year.
I know lots of companies do PTO, but mine is one of the few I know about that will give a junior person with no experience 8 weeks of ‘vacation’ and do the same for the President. And this applies to exempt and non-exempt staff – you could be an engineer or work the loading dock.
It works fantastically, especially because any stress related to time off is always vectored into discussions of the actual problem – getting the work done on a particular project.
The only downside is that our salaries during offer negotiations can look non-competitive if the candidate does not take into account that they will be cashing out 4-6 weeks of vacation during normal years.
I can’t imagine what I’d do with 8 weeks off. I have trouble spending the 4 I get now, no idea what I’m going to do with the extra one I get this year.
Do you like your job that much?
I can’t speak for Cassander, but I don’t think I could trust being away for 8 weeks. I took Friday off and showed up 2 hours late on Monday and the sky was practically falling.
I have the same situation as Beta. US employers typically staff tightly, meaning that there are few to no backups for individuals in unique or limited positions. Baseline production types (assembly workers, CSRs at a call center, etc.) are fairly interchangeable, so covering vacations is just a matter of spreading out their use and having enough extra people to handle it.
Covering for technical positions, management, specialized positions and such often means doing your work ahead of time, catching up when you return, and being available for e-mail and calls while off.
Employers do seem to staff lightly, and there’s the Iron Law that the workload tends to fill the time allowed (with pressure to always exceed the typical workweek).
For me, I just don’t trust my coworkers. For example, HBC above notes that PTO is a liability. For about a solid year, our factory had it recorded as an asset. This was a colossal, multi-million dollar fuckup, and I figured it was 50-50 that our Controller would be fired over it (she was not). Also relevant to the “are you surrounded by idiots” question above.
Relatively few people use all 8 weeks. That is sort of the point and why it works so well. Since you are forced to cash it out at the end of every year, you are paying yourself whenever you take a day off, not using up some artificial pool of days.
Those 8 weeks are it for PTO, though. Even on federal holidays you are using it up if you don’t come in, so I guess it is more like 6 weeks.
Are they hiring? 😉
I’ve been digging around recently on the topic of caries vaccines. As best I can tell, there is 1 primary cause of dental cavities: lactic acid produced by S. mutans. Vaccinating against S. mutans seems like a plausible route to dramatically decrease or even eliminate the incidence of cavities. It seems like various lines of research have been pursued, but perhaps the most exciting is replacement therapy by strains of S. mutans that can’t produce lactic acid but outcompete the native strains. It seems that precisely this organism was created by Jeffrey Hillman at the University of Florida around the year 2000, who successfully conferred lifetime immunity to rats via one topical application of a genetically engineered S. mutans strain, BCS3-L1. This leads me to wonder: why do I and everyone I know still get cavities?
It seems that Jeffrey Hillman cofounded a company, Oragenics, to attempt to productize his discovery, and make a pretty penny in the process. Phase 1 and phase 2 trials were underway by the early 2010s, but in 2014 the phase 2 trials were shelved with vague justification about patent issues and regulatory difficulty in conducting trials. In 2016, however, Oragenics finally received a 17 year patent on their replacement technology, branded SMaRT (S. Mutans Replacement Therapy).
So it seems like for the past two years everything has been in place: a company exists and has a beautiful 17-year patent on a medical treatment that could revolutionize an entire field and save the medical system billions of dollars a year. Why haven’t they restarted trials?
That’s my question to you all. I am not a businessman, a lawyer, or a biologist. Perhaps somehow there is a perverse incentive structure for Oragenics. Perhaps Colgate or the Society of Self-Interested Dentists are paying them off not to do trials. Perhaps a patent isn’t enough to be confident in their IP position. Perhaps the Phase II trials showed no evidence of effectiveness, and Oragenics has just been covering it up. Perhaps Oragenics is just truly incompetent at fundraising, and has the opportunity of a lifetime but no investors to take them up on it. I’d be curious to know what you all think, and also if you know about any ways for me to get my hands on a purported caries vaccine. I do hate going to the dentist.
I haven’t had dental caries in 20 years. But I do have to keep going to the dentist.
As you get older, it’s not caries that gets you, it’s periodontitis. And the longer you let it go, the worse it will be to get it cleaned out, and the more painful it will become. Everyone gets it. Everyone. And even the strictest brush and floss regimen will not save you; you can’t get far enough under your gums with those tools, especially if you have developed any deeper pockets.
And even if you bought the right tools, you cannot use them on yourself, and you really don’t want someone who’s not trained, practiced, and certified to use them on you. I’m talking, of course, about the dental hygienists.
The actual dentist is there mostly because his guild demands it.
Go to the dental hygienist. And floss. Hard.
Yeah, very much all of this. Eventually you will get this if you stop going to the dentist, and it’s a pain in the butt to fix. I got this, and it took something like 4 treatments to fix. It also wasn’t exactly a fake thing; I went to two different dentists who identified all the same trouble spots at basically the exact same intensity.
Now, I haven’t gone to a dentist in a year, but that’s because I’m a bum. I do brush twice a day, and floss 4-5 times a week (because I am honestly too lazy to get to the full 7).
If you get deep enough pockets, they will not fully heal, and you will need to come in more often than that “every 6 months” canard.
The only treatment I know for periodontitis is gum grafts. I’ve had some, and should get some more. Is that what you were talking about?
There’s scaling and planing — that is, removing the plaque beneath the gumline and smoothing the tooth to discourage further buildup. I’ve had that for the lesser gum disease (gingivitis). Then there’s antibiotics, there’s a laser treatment which destroys the infected tissue, there’s a surgical treatment without grafts. Probably others.
The four treatments were probably scaling and planing; it’s common to do four sessions doing one quadrant at a time, though when I had mine done my dentist did two sessions (upper and lower).
Then I got a cavity a few years later. Not a cavity in my life (except in an unerupted wisdom tooth, which IMO doesn’t count), including for 20 years not seeing a dentist… I start seeing a dentist again and I get a cavity. Harumph.
Not sure how you’d make money off it. Stealing the treatment through saliva transfer is kinda gross but I can see parents doing it between their own kids at least, and I bet a lot of them would do it with other people’s kids.
We could try to hit two birds with one stone: saliva transplant dating.
You french kiss on the first date to transfer saliva and perhaps something more…
Is it weird that, before this thread, I had literally never heard cavities / tooth decay referred to as “caries”? It sounds like a typo / fake.
I still hadn’t heard of them referred to as “caries” after reading this thread. I had to go back to see where it showed up. Apparently my brain just translated it to “cavities” for me while I was reading the first time.
Agreed that it sounds weird.
Dentists in my country commonly use the word.
I had never heard it outside of a dentist’s office, but I knew it was the technical term.
Basically, it seems to me that the core point of her book is that, if there was a Khmer Rouge-style revolution that completely destroyed all pre-existing institutions and values in modern America and rebuilt them to be more accommodating to introverts, society would be better.
It would be better for introverts, sure, but better for everyone? I doubt it. Society is designed more around the needs of extroverts because there are more of them. (Though also, I guess, because they’re louder and better at making their needs known.)
Even as an extreme introvert who’s always felt out of step with society, I find myself instinctively annoyed with the wave of books and articles which try to paint introverts as some kind of persecuted minority and call for a restructuring of society to accommodate them. But then, I tend to be cynical toward those kinds of calls-to-action in general.
Are there really more extroverts, or do people present as more extroverted because society rewards it?
Humans are pretty good at adapting to demand.
I’m sure there are introverts who learn to pass as extroverts when necessary. Personality tests do tend to show extroverts being a majority though. How reliable personality tests are is a different question, I guess.
I am a bit skeptical of such tests because they often ask people to self-report their behavior, which is notorious for people giving socially desirable answers that don’t match their actual behavior.
Pursuant to the post above about history and its value:
What would be the SSC version of, let’s say, primary / elementary school history? Or everything a kid ought to know about history before puberty. Requirements:
– basic sketch of the sweep of conventional history (ca. 4000 BC to present)
– touches on all of the major events, countries, and people they’ll probably need to be aware of for cultural reasons (WWII, the Ancient Roman Empire, Herodotus, Africa, etc. – given that we live all over the world, we can expect to hit a lot of nifty stuff, including things I wouldn’t know unless I take history somewhere outside the US)
– categorizes periods and events in a way that minimizes confusion (e.g. state vs. nation vs. government; village vs. town vs. city; battle vs. war vs. cold war vs. long term economic conflict)
– preps them for more refinement on the events they learn
– preps them for the problem of resolving conflicting historical accounts
– any requirements you think vital that I missed above
?
You could probably do a lot worse than the cartoon history of the universe. Tack on a few great books for extra credit.
A lot of this should vary by culture; the Roman Empire, for example, is a lot more relevant for Westerners than other cultures. Also kids should learn their national history.
I’d start with human origins and the stone age.
Then, for Western or Middle Eastern students, I’d continue with the origins of agriculture, kingship, etc. in the Middle East.
Then, for Western students, Greece, with an emphasis on democracy. Then Rome; the transition from Republic to Empire, Jesus and Christianity, and the fall of the western Roman Empire.
Then the Middle Ages, with an emphasis on the students’ country if they’re in Europe, otherwise the country that matches their language. Then the Renaissance and the age of exploration.
Then the colonial era, with an emphasis on the students’ country if they had or were a colony. Also the origins of British democracy if that’s relevant to the students’ national history.
Then further national history, and further world history with an emphasis on things that were important to the students’ country.
I think “long term economic conflict” might be a bit abstract for children before puberty.
Instead of trying to cover everything superficially, which is what covering everything implies, might it be better to spend a lot of time on one or two or three historical societies? The students won’t end up with a grand pattern of history but I’m not sure we have one to give them. They might end up understanding that other societies were different and some of the ways in which they were.
The disorder of history itself seems only fair to inform the younglings of. Wouldn’t it be terrible if you thought it was all David Eddings until belatedly disabused? Better to be warned early and later pleasantly surprised, if you ask me.
One or two might suffice for that, and if the goal is to show possibilities of different worlds and lives, a survey of the welders and the nursery schools might produce a more vivid sense of how different things can be than something so forgotten and far away. Modern economic subcultures might be less different, but they are far more accessible than the ancient past. But if the idea is to produce a sweeping overview of history, then I think a sweeping overview, however shallow, has no substitute.
Part of my concern is making sure whatever we teach is something the kids will remember for years, or at least will make future history lessons that much easier even if they forget the detail.
Everyone seems to learn well from stories, so that’s the obvious starting point. The next point seems to be stories about something they could tie into history they learn later. So several stories about one society makes sense. One trap I want to avoid is kids insisting that the first version they hear is the only correct one. Another trap is kids thinking none of the stories are true if they conflict, or that they can pick which one they want to be true. I want them to get used to the idea that there are tall tales, slight embellishments, factual accounts, and ways to tell between them, and cases where it’s hard to tell.
For American kids, I think it would be better to focus more narrowly on American history, with some Canadian and Mexican history maybe mixed in as well. You can get more thorough with it, at least to the extent that that’s possible with elementary students. Once they get into secondary school, they can then do more specialized history classes that they find interesting.
I am a big fan of the John Green series: crash course world history. The perspective I like is not fact memorization or historical “truth,” but the presentation of different methods for historical analysis/competing theories of history with vivid illustration to make them clear.
Disclaimer: I’m not a historian and have no expertise to judge the claims made in the videos, but they seem plausible to me and accord with my own general knowledge.
Personally, I plan on first showing age-appropriate movies, TV shows, and video games to my kids and being available to answer any of their questions. After a while I plan on having them pick a certain location and time period and going in depth with it, as David Friedman suggested. I find the most useful thing about history is understanding humans on a large scale, so going in depth on one subject they are already interested in would be most effective.
However, I don’t expect this unschooling-adjacent program would generalize well to students who do not have a stay-at-home parent with comprehensive knowledge up through high school. If I’m affecting education policy widely, I would recommend stressing the great-men view of history. I personally think that it is not the most accurate way to understand history most of the time, but it is a very engaging way of learning history. Inspire kids with stories that make them imagine themselves in those situations, and they might remember them better. Honestly I think that’s close to what elementary schools do already, but maybe that’s for the best.
I would have a “Great Men of History” course of lessons starting with Scott Joplin, and continuing all the way to Charles Berry, with strong emphasis on Robert Johnson; W. C. Handy, John Lee Hooker, and Elias Otha Bates/Elias McDaniel/Bo Diddley.
For extra credit only include the Rolling Stones, Yardbirds, Led Zeppelin, Jeffrey Lee Pierce and The Gun Club.
Without just getting into political invective, can anyone in/familiar with British politics explain something that’s been confusing me for a while – basically, why does it seem like Corbyn in particular and Labour in general (and I suppose the Lib Dems) have just been a total non-issue in all the recent debates? Is this just a factor of me being an ignorant American, or has the minority party really been as feckless as it seems? Approximately 90% of the media I’ve consumed has acted as if the Tories are the only game in town, and the rest is vague on what, if anything, the left is trying to do.
I think this can be done in a culture war free way but I might be toeing the line. Let me know if so, and I’ll abandon this until the next fully open thread.
My impression, admittedly limited, is that Labour largely backed Remain. However, they (like the Tories) committed to going with the referendum results. Leave won the referendum. So Labour officially wants to Leave when most of its membership and politicians really want to Remain (or at least backed Remain previously). However, simply turning into Remainers would mean ignoring a democratic referendum, which would lead to them getting absolutely hammered. Meanwhile, they officially oppose the current Brexit deal but there’s not a lot of coherent reasoning behind why. Their main priority appears to be getting an election called and hoping they can win more seats again, maybe even a majority. But the Tories won’t give that to them because the last snap election went badly for the Tories.
Labour’s big effect on current politics is that they’re really the only party that can hammer the Tories. For example, they dealt a huge blow to the Tories by gaining seats in the previous snap elections. But they’re not anywhere near strong enough to actually form a new government. And while the Tories are in a coalition, it’s a coalition with a far right party. They are, if anything, more pro-Brexit than the Tories. So the only way Labour could get a majority is by uniting every other party against the Tories, including the parties to the right of the Tories. That is, needless to say, probably impossible. That’s why they’re so focused on getting elections.
If you’re referring to the recent vote of no confidence, Labour wasn’t involved because a vote of no confidence only includes the party members.
TL;DR: Labour is in the minority and Tories plus an ever farther right party have the majority. Meanwhile, Labour is twisting itself into knots over its promise to respect a referendum with an outcome they dislike.
Being pro-Europe but stuck having publicly committed to abiding by the result of a referendum that went the wrong way is basically the state of the entire British political establishment.
The Tories themselves backed Remain. While there is a wing of the party that is anti-Europe, that is not the wing of the party that was in charge leading up to the referendum. Hell, they’re not even in charge now. As far as I can tell, Theresa May is Prime Minister not because she leads the pro-Brexit Tories, but because she is very good at internal party politics and keeps her backstabbing knives real sharp. Personally, she is probably a Remainer. Thus the British government right now is a farce of pro-Europe politicians trying to implement an anti-Europe plan they don’t believe in because they like being in power more than they dislike Britain leaving the European Union.
And this is why smart politicians don’t hold referendums about anything.
At the very least, require the referendum to pass by supermajority. Then you can be sure you have a mandate.
This is basically it. Labour is further complicated in that Corbyn, unlike most of his party, would be in favor of a hard Brexit, so he is content to let the Tories stumble towards it.
Strong disagree on this. I think that’s mostly a lie told by people who dislike Corbyn for other reasons and want to slander him.
@fion all he has to do is come out and say otherwise, but he won’t for fear of actually having a stated position
This would be a more interesting question to pose in a hidden open thread, but a little bit of context to Corbyn’s leadership might help.
Labour party rules require a potential leader to be nominated by a certain percentage of MPs. Three centrist (“new Labour”) MPs were nominated, and Corbyn was floated as a more traditional candidate. As I remember it, it was a “it’s his turn to be the traditional candidate” sort of thing, and also a number of MPs were convinced to sign his nominating papers just for the sake of injecting a bit of variety into the leadership contest (though I may be misremembering, or have read that from a biased source).
After the candidates are nominated, it’s put to a vote by all members of the party. It just so happens that the previous leadership made it much easier (mainly cheaper, I think) for new members to join. Corbyn caught a wave of popular support from Labour’s traditional base. He ended up leader, despite probably never seriously expecting to win when he was nominated.
Corbyn has attempted to steer Labour more towards its roots, which has limited the number of MPs whose support he can rely on, as the more centrist MPs are not entirely happy with this direction. Corbyn actually lost a leadership challenge of the kind that May recently faced, however Labour rules didn’t require him to step down as the Conservative party would have required of May. Add this to the leave/remain divide and you end up with a very fractured party that has struggled to oppose the Conservative party effectively.
Not 100% accurate. Corbyn *won* the leadership election with the membership; he lost the vote of no confidence by MPs which triggered it. Labour requires only a few MPs to secure a slot in the election, which then goes to the membership. The Tories have a jungle primary among MPs followed by a top-two membership vote. Also, in Labour the incumbent leader can stand automatically, while in the Tories they are disqualified automatically.
Yeah, the key thing is that after Corbyn lost the PLP confidence vote he was automatically on the ensuing ballot that was sent to the party membership. Despite losing the confidence of the PLP he remained very popular with the membership so he won this second leadership election with flying colours.
What would you expect Corbyn to do? If you don’t have a majority, in most democracies you are powerless. The SNP and Lib Dems are even less relevant. It’s all a question of whether the Tories will keep supporting May or not.
I don’t want to violate the CW-free rule, but I think there is an argument to be made that the vast majority of people who work in the media dislike what Corbyn stands for and try to portray him in a poor light. I’d be happy to elaborate if this was discussed in a hidden thread in the near future.
But the point somebody else made about how the opposition is always pretty irrelevant in UK politics is also valid.
Hmm… There are like four things in there that I disagree with, but I think I’ll sit here and simmer and wait for the hidden threads to tell you how Wrong And Therefore Evil you are. 😛
Dungeons & Dragons discourse:
In my best friend’s Mythic Greece campaign, she’s learned to take the kid gloves off. I’m playing a Barbarian-dipped Moon Druid. Her husband is playing a Barbarian-dipped Moon Druid. We’re rounded out by an unoptimized 9th level Wizard providing damage and high flexibility and a Rogue (DPS that never needs a rest). So last session, the deity who hates us (Eris) set us up to fight a Chaos demon: in this case a Goristro, Challenge Rating 17. When it got a face full of tanks, it just ate the opportunity attack caused by running away from the tanks to charge the Wizard for 14d10+7 damage.
Now, the way damage works in 5E is that, unlike AD&D and 3.x/Pathfinder, negative numbers don’t exist. You are dying at 0 HP, but you get three death saves, and spending an attack on an unconscious or dying enemy doesn’t auto-kill them: it merely subtracts two of those death saves. So the DM played hardball for the first time and had it use two of its three attacks on its next turn to kill the Wizard, then backfist me with its remaining fist attack. However, having to spend attacks like that meant it soon went down, never managing to drop a Druid’s human body to 0 HP.
It was a dramatic, exciting fight that taught us a valuable lesson: just because 5E PCs have death saves rather than dying at -10 HP doesn’t mean we’re immortal. Only raging Moon Druids are, and we need to play smarter if we care (Bears) about keeping friends alive against encounters the rules call “deadly.”
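For anyone unfamiliar with the dying rules being discussed, here’s a minimal sketch of the 5E death-save loop in Python (the function name is mine, and this deliberately omits details like being hit while down, which adds extra failures):

```python
def death_save_outcome(rolls):
    """Resolve a downed 5E character's fate from a sequence of d20 rolls.

    Simplified sketch of the rules described above: 10+ is a success,
    a natural 1 counts as two failures, a natural 20 means you pop back
    up at 1 HP. Three successes stabilize you; three failures kill you.
    """
    successes = failures = 0
    for roll in rolls:
        if roll == 20:
            return "revived"   # natural 20: back up at 1 HP
        if roll == 1:
            failures += 2      # natural 1: two failed saves at once
        elif roll >= 10:
            successes += 1
        else:
            failures += 1
        if failures >= 3:
            return "dead"
        if successes >= 3:
            return "stable"
    return "dying"             # combat ended before it resolved either way
```

For example, `death_save_outcome([1, 4])` comes out "dead" — which is why a monster spending attacks on a downed PC (two failures per hit) closes the window so fast.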
Yeah, while I can understand why a lot of people were skeptical about the increased survivability for 5e PCs I have to say that IME it works just fine if you play the monsters intelligently and leave the dice as they fall. My girlfriend lost her first character recently and the party has been in serious danger of a wipe twice in “Deadly” encounters, and I’ve been running CR by the book rather than trying to deliberately increase the difficulty.
That said, I’m personally not thrilled with how your DM ran that fight.
Nothing is really out of character for a demon, and goristros in particular are living battering rams not known for their deep tactical thinking. That said, a coup de grace is a huge waste during a fight, to the point that I wouldn’t even have a mindless construct or undead do one. This goristro was getting whaled on for 6-12 seconds by a bunch of guys who could each individually wipe out its weight in lesser demons… for what?
Maybe if it had a grudge against that specific PC, fine, make sure the Wizard is dead and take the long way back to the Abyss. But otherwise it seems like a bad move even for a demon.
She did have a story reason for it, which grew organically out of our characters’ behavior rather than being a railroaded story element. We’d been spamming summon spells, so this was us getting to see what it feels like from the other side. We were summoned by a magic item Heracles popped on his quest to rescue Alcestis from the Underworld, and the being that he had reacted to by summoning 4 CR 9 heroes then counter-summoned something from the Courts of Chaos, which Eris arranged to be a goristro that had a personal grudge against our Wizard.
(Eris herself has a grudge against Atalanta and PCs who are her friends, because A. threw a monkey wrench into her Golden Apple dickery at the wedding of Thetis and Peleus, where we met in Session 1.)
In general, I too disagree with monsters engaging in the behavior of spending two attacks on killing PCs with 0 HP, as it puts them at a fatal disadvantage in the action economy when other enemies are still swinging. This is kind of a 5E Catch-22, since it’s not that hard for another PC to pop a dying one back into combat. It gets rather more like a Final Fantasy game than anything literary. =/
Alright, well that sounds better than I had expected. Not to malign your friend, it just sounded off.
On the one hand, it’s absolutely gamey and honestly hard to justify from a verisimilitude or narrative standpoint. Dying people can hang on for quite a while before they eventually die, and in fiction the important ones usually last at least long enough for a dying speech.
On the other hand, the same logic which makes coup de grace a waste also applies to healing a downed ally. Gaining that extra set of actions helps, but if he goes back down in the next round you’ve now spent an action to get zero-to-one more actions depending on initiative. It’s a dick move to let your comrades bleed out but ending the current threat ASAP is usually worth rolling a death save or two.
I have very little experience with 5E, but just from an armchair POV I think I’d be pretty happy with the “three death saves” approach. The biggest problem with 3.x’s death mechanics is how poorly they scale into higher levels; at low levels you’ll actually get some tense scenes where you try to stabilize a fallen party member, but past 7th or so you’re fighting stuff so powerful that the first hit that takes you into negatives will probably turn you into chunky salsa. That’s also about when raise dead starts being obtainable, so it’s not a game-breaker, but it does add some definite mechanical strain, especially if you’re minded to run a low-magic campaign.
You’d probably want to make raise dead and friends rarer with the three-death-saves approach, for the same reasons, but that’s IME a good idea for verisimilitude reasons anyway.
3.0 mindlessly carried over AD&D’s death mechanic, making no changes to account for the Feat system (Power Attack being the main culprit for how different Damage-Per-Round was) and monsters now being built the same as PCs (in AD&D, humanoid monsters were locked into one attack per turn with flat damage while 2E Warrior-types got Dual Wielding, extra attacks at Levels 7/13/etc. & Weapon Specialization). “Dead at -10 HP” was the result of thousands of hours of play-testing by Gary Gygax and the second generation of Dungeon Masters, so it presumably never became “irrelevant, you’re chunky salsa” unless fighting one of the many bestial enemies with 3+ attacks or getting swatted at low HP by a type of Giant who did 3d6 or higher with his only attack.
Even in AD&D there was plenty of stuff that could put out 20+ damage in an attack. Dragon breath especially, but also golems, remorhaz, sword spiders, the bigger giants, traps, falling, high-level spells.
But yes, it was a lot more of a problem in 3.x.
Duh, right, dragon breath. That was bonkers in 1E & the Basic line: a flat 80 damage (save for half) from the average Ancient Red Dragon.
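A back-of-envelope way to compare these numbers (the helper names are mine, not from any rulebook): each dS die averages (S + 1) / 2, and a save-for-half effect averages out between full and half damage.

```python
def avg_dice(n, sides, bonus=0):
    """Expected value of NdS + bonus: each die averages (sides + 1) / 2."""
    return n * (sides + 1) / 2 + bonus

def expected_breath(damage, save_chance):
    """Expected damage from a flat save-for-half effect (e.g. 1E breath)."""
    return damage * (1 - save_chance) + (damage / 2) * save_chance

# The goristro's charge upthread: 14d10+7 averages 84 damage.
print(avg_dice(14, 10, 7))        # 84.0
# A flat 80-damage breath with a 50% save chance still averages 60.
print(expected_breath(80, 0.5))   # 60.0
```

Either number is comfortably past most single-classed casters’ entire HP pool, which is the point being made about chunky salsa.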
But arguably the literary inspirations for D&D are settings where the fire of an ancient dragon would kill even a legendary hero. IMO “the ancient red dragon breathes fire on you” should essentially be a “you die” unless you have some custom plot-related item to protect you.
@arbitraryvalue: Yeah.
That also gets into the question of what a saving throw represents in the narrative or simulation. Is it like a serial protagonist’s save against what looks like certain death, or…?
I have much experience with players just saying “I save” and shrugging off the fact that I have no idea what that means in the world, so as not to bring play to a screeching halt. =/
Some “saves” make sense. A Reflex save is another word for dodging (usually), a Willpower save is overcoming something mentally attacking. Reflex save for half means you can jump to the edge of the breath or behind a barrier and only get partially hit (on games with figures and a grid, why you don’t have to move is…a good question). Save vs. Wand or Death never made any sense to me though. It’s clearly NOT the inherent possibility of the attack missing or failing, since it’s specifically something the player is doing to affect the attack.
Reflex save: You kind of mostly dodged the Red Dragon’s breath, and so are only a little bit on fire, on account of your great DEX.
Fortitude save: The poison that would have killed an ordinary man merely weakened you, on account of your high CON.
Will save: The vampire tried to dominate you with his gaze, but you said “No” and made it stick, thanks to all that WIS.
Modify according to circumstance. And a recommended house rule: Shields apply their normal AC bonus to most reflex saves (and against most touch attacks), but if this makes a difference they suffer whatever it is they protected the character from.
On the topic of playing monsters intelligently: The Monsters Know What They’re Doing.
I ran a campaign like that one time. One player was a wizard specialized in damage. Several combats went the same way: he would throw a giant area-effect damage attack, hitting all the monsters at once. Then they’d all come running at him and kill him, because he didn’t have any defense spells. Then his allies would mop them up, and they’d raise him from the dead.
I don’t think he enjoyed that campaign very much. After it ended, he never wanted to play a game with me again. :-/ I’ve sort of taken it as a life lesson: people don’t like it when you kill their characters.
What I’m saying is, either I hope your wizard player had fun, or else I hope that doesn’t happen to him frequently. : )
Let’s talk death. Specifically, ways of bringing characters back from it.
Raise dead is a 5th-level cleric spell in most versions of D&D, meaning that you can cast it at 9th level, halfway into your average character’s career and around the same time the party wizard learns how to teleport, paralyze monsters, and move objects with his brain. 9th-level characters are unusual but not truly exceptional by D&D standards — a few are expected to be in any large city, and by old-school rules it’s when you’d start to settle down, attract followers, and get involved in politics. So, rules as written, the equivalent of a Catholic bishop is expected to be able to do the Lazarus thing at least once a day, as long as the deceased has been dead for no more than a week or two, they didn’t die of old age or death magic, and the corpse is reasonably intact. This poses certain problems. There’s enough limits that it can’t truly transform society, but it’s enough to thoroughly break a lot of common storylines. City watches could keep a cleric on retainer to raise murder victims (there are plenty of gods of justice). Assassination plots would be a lot harder to pull off. Your destined love dramatically killed herself following a series of tragic misunderstandings? Pay off Friar Lawrence and he can make it all go away.
But wait, it gets better! You’re probably expecting me to talk about resurrection now, but no, that’s not much more broken. Reincarnate, however, is a fourth-level druid spell, meaning that you can cast it at character level 7, about the time your fighter buddy is learning to swing his sword twice in six seconds. (If you’d rather something less snarky, ACKS ballparks 7th level as “best in a county”.) It places the deceased’s mind in a newly created young adult humanoid body, with the same abilities (but the new body’s racial adjustments to them), and does not require the remains to be intact. It doesn’t bring back people who’ve died of old age, but there’s nothing stopping you from stabbing Grandpa on his deathbed and making a generous donation to the Sierra Club so that they’ll reincarnate him as a healthy 20-year-old bugbear. (Possibly a female one. Sex is unspecified by the spell description.)
The upshot is that there’s no reason for anyone who can afford 1000 GP in materials and a 7th-level druid’s services to ever die of anything besides death magic, level-draining undead attack, or freakish accident. Now we’re transforming society.
What can we do about this?
Yeah, Reincarnate bugs me because it breaks Earth-based fantasy, and why is it earlier than the Cleric’s equivalent? Grrr.
It would be less broken if instead of a young adult female bugbear growing out of the dirt with a departed loved one’s personality, all it did was let you find the reincarnation of dead character X and give them all the memories of their previous incarnation. That’s close to how a Dalai Lama is identified, and the biggest problem would be the player’s annoyance at the attribute mods for now being a newborn human. Better hope you roll “baby dragon” or at least “dryad” on the table.
I think the limits on Raise Dead and similar spells are supposed to be the cost of the materials used. The diamond required for the 5E Raise Dead spell costs about the yearly expenses of a merchant, skilled tradesman, or military officer. The city watch doesn’t Raise Dead murder victims because it would take a sizable chunk of their budget to do so (and they have Speak with Dead to find out whodunnit anyways).
It’s also interesting that the cost of the materials for these spells is expressed as a fixed GP value. You can justify it in-universe by invoking it as a sacrifice to the gods, and having the gods in a place where diamonds are plentiful require a much bigger diamond to be impressed enough to Raise your companion. On the other hand, this suggests that for a truly broken campaign, you can play games with the economy. I can see a Chaotic Stupid Good cleric launching a crusade to devalue the gold piece to make the cost of the diamond for a Raise Dead spell affordable to everyone.
Yeah, if you’re taking what passes for D&D economics seriously it’s probably not too broken for anyone short of merchant princes and high nobles. But most campaigns start playing fast and loose with those economics sooner or later, because player characters get rich fast and story incentives need to keep up. Any reward big enough to get the attention of even fairly low-level PCs is also big enough to pay a cleric or druid. So why didn’t the quest-giver just do that?
I prefer to keep magic, and particularly high-level magic, a bit restricted and mysterious, and part of doing so is making it something that cannot typically be bought with money. If you can’t cast Raise Dead yourself, don’t expect to go to a temple and just pay the high priest. Expect serious questions about why this person is worthy of resurrection, and you might have to do or owe the sect some great favor for what you are asking.
I’m aware this is very much a choice. Some people just go by the rates in the books, making resurrection a merely expensive mundane service.
I like this idea from a lore perspective, but that seriously limits the ability of a party to do difficult missions if they do not themselves have a priest class. Then you get into the problem of whether you can have difficult battles that might end in deaths or if the PCs chicken out and run away all the time.
It’s far more realistic, but it’s also taking away a lot of the features that make it a game.
I am totally fine with small adventuring parties that have a high impact on the setting needing a miracle-working holy (wo)man as a force multiplier.
Don’t feel like bringing your own thaumaturge? Bring lots of extra Fighters, like a realistic war band. The Cleric is a force multiplier, period. This works both on a game level and on the level of rational military strategy (“simulationism”).
I think you need to decide as a DM how close you want death to be, based on the mood of the campaign, and select appropriate house rules accordingly. My thinking is that death should be a big deal, but not completely absent. Having death at -HP, a healer as an inevitable feature of every serious party, and the DM pulling the occasional punch when the dice indicate an inopportune death, seems to work for creating a mood of simmering tension.
I think what it’s supposed to mean is that you need a diamond of a particular size and quality. But rather than actually tell the player the exact size and quality necessary, it cuts to the bottom line and says how much that diamond will cost. So if you pay a non-standard price for the same diamond it’s still the same diamond and will work just as well.
Logically the price of everything should change from one location to the next, but that’s a level of detail they don’t go into.
Because I set a long-running 3.5 campaign (level 1-22) in the Bronze Age, when diamonds were unknown, the PCs had to find a diamond mine and then set the value of a fairly small diamond at 5,000 GP (3.5 talents/208 pounds of silver) by persuading a king to pay that for it.
I’m annoyed with 3.x and its magic item economy for making that the only time players ever did anything economically interesting. I wish they’d let me run ACKS instead.
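As a sanity check on the conversion quoted two comments up: the ~27 kg figure for a Bronze Age (roughly Attic/Near Eastern) silver talent is my assumption, not something stated in the comment, but it does land on the quoted number.

```python
# Back-of-envelope check that 3.5 talents of silver comes out to ~208 pounds,
# assuming a talent of about 27 kg (my assumption for a Bronze Age talent).
KG_PER_TALENT = 27.0
LB_PER_KG = 2.20462

talents = 3.5
pounds_of_silver = talents * KG_PER_TALENT * LB_PER_KG  # ~208.3 lb
```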
If they sold it to the king, how did they then use it for the spell?
@bullseye: They acquired a bunch of diamonds at that site. They then had to sell one for 5,000 GP to create the market for diamonds before the spell would work. Rules As Written, if a diamond has no market value in gold, the spell won’t work!
Set up a plot line where it turns out the adventurers were hired to kill a rampaging beast by an insurance company trying to cover up the fact that they cheaped out and cast Reincarnate instead of Resurrection, and the newly minted [level-appropriate monster] is trying to get their money back.
Does 5E have a stance on what percentage of people go to the [Insert Alignment Here] Good Place when they die? The assumption could be that your afterlife is preferable to your life, and if you died while doing anything other than Going On An Epic Quest it’s not worth making the sacrifice. Sure your loved ones will miss you, but they know for verifiable fact that they’ll see you again one day.
Of course, over time this means that people who don’t share this value (and especially people who also don’t mind switching species every so often) are heavily selected for, so it’s not a stable solution to the worldbuilding problem. And a world where death is rarely seen as a tragedy still breaks a lot of stories.
I don’t know enough about 5E to answer that question as stated. Assumption in previous editions seemed to be that most people were some species of neutral, and the neutral afterlives are generally pretty boring. (The good afterlives are good on average, and the evil afterlives are bad on average, as you might expect.)
5E stepped away from answering those sorts of world-building questions for the most part. Partly that’s because 4E had built-in setting assumptions and 5E is a reaction to the backlash against 4E. Partly it’s because the system is supposed to be a return to pre-3E “Ask Your DM” philosophy instead of post-3E “Rules As Written.” And partly just because it’s easier for them not to have to think about it.
The closest to an official answer would probably come from the >300 Forgotten Realms novels, setting books and adventures, video games, etc. FR is for all intents and purposes the “official” setting of 5E, with most of the default options (minus Dragonborn) being designed to fit into the Realms lore.
I find Forgotten Realms hard to make sense of. Are edition changes part of the setting’s history/metaphysics, like DC Comics destructions and creations of the multiverse? If I’m playing the 5E Storm King’s Thunder module, how many years after the 2E Baldur’s Gate video games am I? Is that even a coherent question?
Yes, edition changes are actual in-universe events. The laws of magic explicitly changed at least three or four times, and each of those changes has a specific date attached.
You can find dates for when the events in novels, video games, and organized play are supposed to have taken place, but that’s harder for published adventures because they don’t want to lock you into one time period. That said, Storm King’s Thunder was originally part of the Adventurers’ League, 5E’s organized play, so I think that if you dig into the AL materials for it they’ll probably have a year.
Death gets to be two tragedies! One is the normal tragedy that comes from pain and loss to a loved one. Death still hurts, and sometimes it’s permanent no matter what you do. The other tragedy is that Raise Dead (or equivalent) is expensive, so you’ve got some really really bad class warfare fodder in knowing that someone could be brought back, but will not due to economic concerns. Imagine being the high level good priest talking to a grieving person, and saying – “Sorry, you can’t afford the spell, so I’m not going to bring back your loved one.” They can’t even save up for the spell, because it’s a limited offer, contingent on the corpse being whole enough.
Maybe it actually is a solution. It would help explain why your party can find a constant stream of bad guys at the appropriate level as you grow, when there really shouldn’t be that many 10+ level people anywhere. It’s the self-selected group that’s willing to sacrifice for extended life and has no hope of a good afterlife.
Re: the most recent OOTS, where a main character was resurrected, the cleric who raised him promptly killed him again with a flame strike, and then told everyone else in the party to chill out, because they have another diamond and will just raise him again.
That struck me as terribly wasteful, even for a wealthy person. She could have spent that diamond to resurrect someone else.
I have to wonder if you’ve ever been hurt like she was. To me, her reaction was understated, on the level of slapping him.
Also, she’s a cleric of Loki. Wastefully flamestriking people is a holy duty for her.
I’m not saying it didn’t make sense for the character. People are wasteful sometimes, and she was clearly acting on impulse rather than thinking it through.
That “slap” means that somebody else who could have been brought back won’t be.
@Mark Atwood
You read The Order of the Stick as well?
I’m a fan and I also read its forum.
On that forum I found a very impressive post on the history of D&D alignment (a definite “effort post”!).
From the same person a later post shows what seems to me to be very good taste.
Clearly someone who is very correct, good looking and has STUNNING HUMILITY!!!
Ohh, I have a campaign setting for you:
Raise dead (Evil Exclusive Or Justice) : At the cost of the life and soul of a sapient being, one person may be brought back from the dead. This is an Evil spell, unless used to bring back a murder victim using the duly convicted murderer.
Reincarnate (Nature): At the cost of the life and soul of a rabbit, one person or animal may be reincarnated in an appropriate form, which grows in a very large ceramic jug filled with a special stew over the course of two weeks (this stew is not super expensive, but it is 200 liters of stew, so… not free, either). Warning: the vast majority of people find this extremely disconcerting and will usually commit suicide from extreme body dysmorphia, so this is mostly used in place of Speak with Dead to let people put their affairs in order.
And to find new Adventurers.
Adventurer: Noun: A person with an extremely fluid soul. A small minority are born with souls that are highly malleable. Despite much effort, no reliable method is known to identify an adventurer prior to First Death. The hallmark of adventurers is their extreme tolerance for the process of Reincarnation – during times of extreme conflict, or even in cases like the Orc Trials (a particularly hard-line training regimen), adventurers have been known to go through over a dozen bodies in a year with no sign of soul-sickness. Adventurers are much sought after for all kinds of extremely hazardous work for this reason. In particular, all known civilized nations employ them to hunt down Resurrectionist Cults.
Resurrectionist Cult: What it says on the tin. Basically an evil insurance company that will murder some random citizen to bring you back to life if you die. Banned by all non-evil societies; the actual government of Evil ones.
Raise Dead has a 500 gp material component. If you use it on every dead peasant your village will go bankrupt.
It also can’t restore missing body parts, and can’t bring back someone who died of old age, and I’d expect the latter to be a much more common cause of death in a peasant village than among adventurers.
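For a rough sense of scale on that 500 gp: a hedged back-of-envelope, using the 5E PHB services table figure of 2 sp/day for untrained labor (the wage assumption is mine; the rest is arithmetic).

```python
# How many days of a common laborer's wages one Raise Dead costs.
SP_PER_GP = 10                 # standard D&D coin exchange
UNTRAINED_WAGE_SP_PER_DAY = 2  # 5E PHB services table, untrained hireling

cost_sp = 500 * SP_PER_GP                              # 5,000 sp
days_of_wages = cost_sp // UNTRAINED_WAGE_SP_PER_DAY   # 2,500 days
years_of_wages = days_of_wages / 365                   # ~6.8 years of gross wages
```

So one casting runs to roughly seven years of a laborer’s entire income, which is why the village goes bankrupt long before the graveyard empties.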
I’m getting a little peeved at my D&D group, though trying to control my reaction. We have (now) 4 players, and 3 of us (including me) are in our first campaign. However, 2 of our members seem to forget how to attack on every single round, constantly forgetting to add in their proficiency and dexterity bonuses (both play rogues). I didn’t mind so much at first, but it’s now our 6th session, and we obviously have a combat encounter or more in every single session.
They also make no effort to role-play and are constantly worried about traps. Hold on, the one character wanted to “role-play” which became a 1.5 hour information dump between her and the DM, and that’s the extent of the role playing. Now as for the traps, this typically involves performing different