Open Thread 49.5

This is an experiment with more open threads. Post about anything you want, ask random questions, whatever.


1,457 Responses to Open Thread 49.5

  1. E. Harding says:

    Still no race or gender?

    • Scott Alexander says:

      Use your judgment. And if your judgment is wrong, I’ll permaban you.

      • jaimeastorga2000 says:

        I actually like this policy. It’s honest and straightforward.

      • Cord Shirt says:

        It is more important that innocence be protected than it is that guilt be punished, for guilt and crimes are so frequent in this world that they cannot all be punished.

        But if innocence itself is brought to the bar and condemned, perhaps to die, then the citizen will say, “Whether I do good or whether I do evil is immaterial, for innocence itself is no protection,” and if such an idea as that were to take hold in the mind of the citizen that would be the end of security whatsoever.

        John Adams

    • Theo Jones says:

      It sounds like there was a big change in the comment policy I missed. Does anyone have a link to that one?

      • Anonymous says:

        From what I can piece together…

        I can’t remember when Scott said he was first trying out a reign of terror. I noticed that the Comments Page now has the simple policy “If any man shall exceed the bounds of moderation, we shall punish him severely”, hyperlinked to a review of Albion’s Seed. Relevant quotes from interesting Puritan facts:

        16. Wasting time in Massachusetts was literally a criminal offense, listed in the law code, and several people were in fact prosecuted for it.
        17. This wasn’t even the nadir of weird hard-to-enforce Massachusetts laws. Another law just said “If any man shall exceed the bounds of moderation, we shall punish him severely”.

        I think that a lot of it came under Scott’s judgment anyway – there were lots of attempts by lots of people (including myself) to lawyer the old Sufi Buddhist policy and no-one seemed to be able to agree how to interpret it. The new Puritan policy is at least more-or-less explicit in its vagueness and unlawyerability.

        • sufism: what islamic atheism looks like says:

          Everyone agreed how to interpret it: their comments were all necessary, no one else’s comments were necessary.

    • Airgap says:

      Only if you’re also discussing NeoR****ion. This is called “Shooting the moon.”

    • Wunderwaffle says:

      What is this question about?

      • HeelBearCub says:

        It used to be standard for Scott to include a proscription against starting OT threads about race or gender, as they led to a nasty downward spiral in the following comments.

    • SJ says:

      [inserts tongue into cheek]
      Obviously starting a discussion of the athletics of racing, and the possibility of a person of ambiguous or mis-identified gender participating in such a sport, is a subject not to be discussed.

      [removes tongue from cheek…how the heck did I talk with my tongue in that position?]

  2. SSCES says:

    There is a Greasemonkey plugin for Firefox that was announced on the /r/slatestarcodex subreddit here. This is purely experimental at the moment, but an open thread seems like a good time to test it. Clicking my name should lead to installation instructions.

    • Noumenon72 says:

      This must be different from the one that was announced a week or two ago that just let you hide annoying users. That one didn’t have upvote arrows.

    • gbear605 says:

      It appears to work with Tampermonkey on Chrome.

    • Douglas Knight says:

      I know what SSC stands for, but what do SSCES and SSCEC stand for?

    • MugaSofer says:

      I don’t know why we’re all upvoting this, nobody who hasn’t already gotten the script can see it 😉

      EDIT: bug report: this re-orders everything after the browser has already jumped to a submitted comment, meaning you’re dropped at a random location on the page whenever you submit comments.

    • Wunderwaffle says:

      I wish there were an extension to hide everything except top-level comments, like Reddit Enhancement Suite does.

  3. Dude says:

    I want to know why vegans are better people than vegetarians: http://www.ncbi.nlm.nih.gov/pubmed/27161448?dopt=Abstract

    • Sid says:

      Not sure if you’re taking the piss (nothing personal, just that people often are when they talk about vegans), but is this actually surprising? Veganism is harder than vegetarianism, so of the people who are vegetarian/vegan for ethical reasons, the vegans will tend to have stronger moral motives and/or better self control, so it’s not surprising that they are “more universalistic, empathic, and ethically oriented”. I’m sure there are some class-related explanations too, but the obvious explanation can go a long way.

    • Nels says:

      Just spitballing here, and I didn’t read the whole report, but wouldn’t a big contributing factor be that more conscientious people are “better”? I’m a vegetarian who’s tried to go vegan before and failed, because I struggle to hold principle above base desires for pleasure and convenience. More conscientious people who examine their morals and come to the conclusion that meat eating shouldn’t be supported as a culture skip the half-measure of being merely vegetarian and keep themselves to a commitment to avoid all animal products.

      • scaphandre says:

        That’s a tempting narrative. But the study looked at conscientiousness and found that it did not vary between their veggie and vegan populations (16.4 vs. 16.5, ns). Vegans seemed maybe a little more open, but these were not big differences.

    • Jason K. says:

      I wouldn’t count too much on that survey being accurate. Study on moral offsetting behavior. While the study is about the environment, the concept applies elsewhere. What people say they do and what they actually do tends to diverge.

    • Furslid says:

      Possibly because there are more vegetarian cultures than vegan cultures. Suppose half of the vegetarians are making an ethical choice and half are Hindus following their parents’ cue on food choice. I wouldn’t expect the second group to be much better than the average person. They aren’t making an ethical choice in diet, just garden-variety cultural conformity. So I’d expect any correlation between making an ethical choice in diet and empathy to be diluted.

    • Jill says:

      The Dalai Lama is an omnivore. Yasser Arafat, at least toward the end of his life, ate only boiled vegetables. Just sayin’.

      And then there is the issue that different people are, and feel, healthier on different diets. I wonder if people’s morality correlates to some extent with their health.

      E.g. if poor health makes you more empathic toward other sick people, or others in general, or the environment, then those people who are vegan but not getting enough protein or whatever to have good health would be more empathic. But in that case it could be because of their poor health, not because of their being vegan.

      • Stan le Knave says:

        I believe it’s a principle of Tibetan Buddhism to be an omnivore rather than a vegetarian. Something to do with the animals you are eating being the reincarnations of your former teachers.

        I can’t find any reference to this in 5 minutes of perfunctory googling, but the Buddha himself did apparently continue to eat meat all his life.

        • Anonymous says:

          It’s forbidden for the majority, and possibly all, Buddhist monks to be vegetarian, since they’re meant to beg for food scraps for their sustenance and therefore cannot refuse food, since that would create an imposition or guilt in the givers. They’re not allowed to eat anything prepared especially for them, however, especially not an animal slaughtered expressly. The underlying principle is that they shouldn’t create additional suffering in order to feed themselves, so any accommodation is abhorrent. Moreover, it was immediately recognized that letting regular devotees produce food expressly for the monks would lead to a wasteful arms race of ostentatious piety, with the rich competing to feed the monks the rarest and most expensive foods and so on, thus ruining everything for everyone and sending them to Buddhist Hell.

          Plus, many Buddhists don’t believe it’s worse to kill an animal than a plant as such, which makes vegetarianism philosophically nonsensical from their perspective.

          • Desertopa says:

            I don’t know about manifestations in other countries, but in China, Shaolin monks are the exception rather than the rule among Buddhist monks in being allowed to eat meat. They also do have food prepared for them at their monastery, and I suspect that they are not exceptional in this, since it’s probably not practical for larger monasteries to feed their entire population of monks through alms, at least if the monastery isn’t located in a large population center, which some are not.

  4. Chalid says:

    What outcomes would lead you to think that this experiment with more open threads was a success or failure?

    • Outis says:

      Success: Increased income, health, education level, and decreased rates of criminality and teen pregnancy amongst commenters.
      Failure: Increased rate of joke comments.

      • Except Tom Swifties. I think they’re considered successful comments.

      • Saint Fiasco says:

        That leads to perverse incentives. We could just be mean to poor, sick, uneducated criminals until they all leave.

      • Deiseach says:

        Success: Increased income, health, education level, and decreased rates of criminality and teen pregnancy amongst commenters.

        Plainly this policy works retrospectively as I did not get pregnant during my teenage years 🙂

        • Airgap says:

          You say that, but my time machine is almost finished.

          • Deiseach says:

            Dear Airgap, if you can successfully assail the fortress of my chastity retrospectively, you deserve not alone whatever awards are due to you for inventing a working time machine but a medal of valour and a triumphal procession.

            The disincentive in this case would be you very well might end up having to marry me, and that, dear sir/madam/other, is a fate you would rather jump into a volcano to avoid 🙂

          • Airgap says:

            You’re telling me you wouldn’t go for a guy with a time machine? Ha.

          • Deiseach says:

            Oh, you won’t catch this person napping! I’ve read Wells! Sure, you come on all “Come with me to see the wonders of the 55th Century” and then I end up as the centrepiece of a Morlock banquet! 🙂

          • Anonymous says:

            Ah, but had you read Wells when you were 15?

      • Enquiring Mind says:

        I understand the first items in Success, but how does one measure how many commenters are pregnant teens or criminals?

        • HircumSaeculorum says:

          Convenience sampling.

          • Deiseach says:

            Yes, but what about the pregnant criminals, teen criminals, and pregnant teen criminals? Are they all to be counted separately from “pregnant teen (non-criminal)” and “criminal (non-teenage, non-pregnant)”?

        • Randy M says:

          Use a proxy. Compare rates in US states with high internet use versus those in states with low internet access.

        • pneumatik says:

          Look for their user name elsewhere on the Internet and link them back to their real name, then check their Facebook account. Assume people you can do this on are a representative sample of the population. Publish.

  5. Wrong Species says:

    So the newest Captain America movie came out and one of the questions is about how the governments of the world should react to the existence of superheroes. Should they try to rein them in or would that possibly lead to tyranny?

    • Nels says:

      I haven’t seen the new movie yet, but it’s an interesting problem. One issue is the relative power of the superheroes. Can governments rein them in using their own resources, or do they have to depend on the aid of other sympathetic superheroes? They could also be beyond the reach of other superheroes, like Dr. Manhattan in Watchmen, and there is nothing anyone can do but try to influence his mind.

      If, on the other hand, a collection of superheroes working together is powerful enough to keep any other one at bay, they could form their own organization together that presumably would be stronger than any government on earth. Maybe it works for good, but maybe it becomes tyrannical too with no accountability besides the consensus of beings with no other accountability.

      Maybe the best idea would be to treat them as a valuable natural resource whenever they pop up and keep them accountable to governments. They can be kept in line by a combination of military resources and the country’s other superheroes. The governments of the countries with superheroes are kept in line by the presence of countries with other powerful superheroes, so now you have a weak form of mutually assured destruction and hopefully everything works out.

      There is always the question of the government using superheroes for bad things anyway, but this is always an issue with governments and anything that’s powerful like nuclear weapons, torture, biological weapons, secret police, espionage. I bet superheroes would be in a better position to stand by their ethics and go against their government, since they could hopefully use their powers to disobey and escape the situation. Using their clout as a superhero, they could then go on to explain their objections to the public, who would hopefully in turn keep the government accountable.

      Those are my preliminary thoughts, and I’m interested in seeing the movie.

      • Alex Zavoluk says:

        In most comics, you’re fundamentally relying on the motives of the heroes involved. They’re too powerful to get around that fact. Brandon Sanderson’s Steelheart takes a look at what happens when all the people with powers are interested in taking over and oppressing the fuck out of everyone else. And many people here are probably familiar with Worm, which involves a constant struggle between governments with powers and criminals with powers, as well as between different governments with powers.

        In most comics, the most powerful characters are basically good; even if they face ethical dilemmas or disagree on the best way to do things, they mostly have good intentions. In Steelheart, they’re evil. Worm is complicated, though the most powerful handful of capes are nominally good and most have good-ish intentions (probably the most powerful evil cape is the Siberian).

        Another key difference is to what extent normals can keep up with powered individuals. It’s completely futile in comic books, but they can be of use in Steelheart (key plot point) and Worm (particularly with cape support, such as PRT units with Dragon’s gear).

        It can also be interesting to contrast similar universes. DC also addressed the issue of “what happens when superheroes have too much power?” and the answer is similar (Batman, like Tony, is nominally a regular human with above-average smarts and technology) but also different (Tony involves the government, Batman doesn’t).

        • JDG1980 says:

          Another key difference is to what extent normals can keep up with powered individuals. It’s completely futile in comic books

          Really? If anything, the opposite seems to be the case. About half of superheroes and supervillains don’t have any specific super powers, and are ‘just’ insanely skilled inventors and/or martial artists.

          In comics, just about anyone who really wants power badly enough eventually gets it, one way or another.

          • Jiro says:

            No. In comics, everyone who wants power and has a story written about them can get it. With trained-normal or inventor type superheroes there’s a tension between “anyone can do this” and “this guy can do what nobody else can do”–they’re just normals, but on the other hand, they always do better than the police and catch the robbers, and the police and robbers are also normals. If everyone could do it, police forces would be made up of hundreds of Green Arrow types, which they’re not. They’re normals, but they’re better than other normals, somehow.

            (More modern comics sometimes introduce elite forces of police or government agents who do indeed have comic book levels of training or advanced technology. There still aren’t hundreds of them, but at least you stop wondering why the police can’t ever take advantage of this stuff that “normals” can do.)

        • MugaSofer says:

          >Worm is complicated, though the most powerful handful of capes are nominally good and most have good-ish intentions (probably the most powerful evil cape is the Siberian).

          Even ignoring Scion and the Endbringers as edge cases, Glaistig Uaine is the most powerful human alive. Bonesaw and Wnpx Fynfu are also immensely powerful, and can basically wander around killing people left and right.

          And there was String Theory, I guess. And the Thanda, although I guess you could argue the Thanda weren’t totally evil.

          • CommonPlebeian says:

            Scion || Endbringers != Capes
            Including Scion and the Endbringers would be like including Galactus as a supervillain.
            Glaistig Uaine heel-face turns by the end. Bonesaw was basically a persona created in service of Wnpx Fynfu. Excluding Wnpx Fynfu, most “evil people” in Worm are basically either self-interested individuals who have a lot to gain through breaking the law, people with power-exacerbated mental issues, or cosmic horrors.

            Rot13 is the cipher used for character names:
            http://www.rot13.com/
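
            For anyone unfamiliar: rot13 shifts each letter 13 places, so applying it twice gives back the original text. A minimal Python illustration using the standard library (the sample string here is arbitrary, not one of the hidden names):

                import codecs

                def rot13(text):
                    # Shift each ASCII letter 13 places; everything else passes through.
                    return codecs.encode(text, "rot13")

                print(rot13("Uryyb, jbeyq!"))   # -> Hello, world!
                print(rot13(rot13("spoiler")))  # rot13 is its own inverse -> spoiler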

          • Guy says:

            Wnpx Fynfu also isn’t particularly powerful. He happens to be extremely charismatic, and he uses that to control people substantially more powerful than him, but it isn’t part of his powerset. His powers that he gets sebz uvf funeq basically just consist of a long range cutting ability plus n yvzvgrq nagv-bgure-cbjre novyvgl that he doesn’t even notice.

          • Anonymous says:

            Gur punevfzn vf cneg bs uvf cbjre. VVEP uvf funeq jnf hfrq sbe pbzzhavpngvba orgjrra gur jbezf. Urapr jul ur’f va ab guerng va gur pbzcnal bs vafnar oheafpne naq pna vasyhrapr fpvba jvgu n fragrapr. Vg’f gur fnzr guvat nf Gnlybe trggvat nznmvat zhygvgnfxvat.

          • InferentialDistance says:

            Wnpx Fynfu’f cbjre vf oebnqpnfgvat. Ur oebnqpnfgf gur rqtr bs uvf jrncbaf va beqre gb phg guvatf ng n qvfgnapr, ohg nyfb uvf gubhtugf gb vasyhrapr gubfr nebhaq uvz. Uvf oynqrf pnhfr phgf orlbaq gurve rqtrf, uvf gubhtugf pnhfr npgvbaf orlbaq uvf frys.
            (source)

      • Jiro says:

        Civil War was loosely based on a comics storyline. In the comics storyline, unfortunately, the writers couldn’t agree on exactly what superhero registration meant, to the point where it made hash of the story (it actually matters if superheroes are being drafted by the government into combat).

        Unrelatedly, one aspect of many comic book superheroes is that they inevitably break the law in some ways–roughing up suspects, breaking into buildings, etc. The whole concept of the superhero is that the superhero has to work outside the law, and because of plot reasons the superhero actually can be trusted in doing this. Having the government control superheroes would mean either that superheroes are officially violating laws (bad, and it’s a lot less plausible that a government would only violate the rights of bad guys than that Batman would), or that the government keeps the superheroes from violating laws (eliminating the point of the superhero).

        • Julie K says:

          it’s a lot less plausible that a government would only violate the rights of bad guys than Batman would

          Why? Or is that part of the definition of Batman?

          • Jiro says:

            It’s easier to imagine one guy with supernaturally good judgment and motivation, than to imagine a couple of hundred guys with it, which is what it would require for the government.

          • John Schilling says:

            Also, one guy has a finite appetite for, well, anything, and if that appetite leans significantly in the direction of abuse we call him a (super)villain. A government is the sum of millions of appetites united in mutual back-scratching – do the math.

          • LHN says:

            Being superhumanly right (and nigh-incorruptible) is one of the things that the superhero fantasy rests on. That’s why they can be admirable despite being unaccountable wielders of extreme violence. The guy Batman holds over a ledge or punches repeatedly in the face is never an ordinary person in the wrong place at the wrong time. (Let alone someone Batman is unjustly suspicious of for socioeconomic reasons.)

            Occasional errors in judgment (followed by appropriately over-the-top guilt) may be allowable. But if they’re as brutal as a Batman-like figure (let alone a Punisher-like figure) and aren’t right nearly all the time, then you’re probably looking at a deconstruction of the genre.

            (Which, while occasionally done well, is kind of shooting fish in a barrel. Yes, costumed freelancers punching people to further their perception of justice with no one to answer to would tend to go south in a hurry without the support of genre convention. Relatedly, if you try to express strong emotion by engaging in an elaborate song-and-dance number, people will stare at you. And it’s unlikely you can support yourself and a small staff in a Manhattan brownstone by solving murders for walk-in clients on a regular basis.)

      • vV_Vv says:

        If the government doesn’t allow normal people to form vigilante/paramilitary/guerrilla groups that act outside the boundaries of the law, why should it allow people with “superpowers” to do so?

        The government may make an exception for god-like superheroes like Superman or Dr. Manhattan who are just too powerful to control: any attempt to do so could provoke their hostility, with catastrophic consequences, so it may be better to just try to keep them happy and hope that they will always remain benevolent. The real-world example would be the government of some small country accepting US interference in its internal business, because if you are a small country you can’t really do anything to oppose it, and it’s better not to provoke US hostility.

        • Mary says:

          Because
          1. You can’t stop them even if you want to, so what you “allow” is moot.
          2. If you do stop them, you have nothing to stop the supervillains with.

          This is an interesting essay on the topic:
          http://fantasticworlds-jordan179.blogspot.com/2011/03/how-superpowered-people-would-change.html

          • Airgap says:

            In my case it’s severe drug addiction, but you may be right on average.

          • Nornagest says:

            Well, the important part is you’ve found a way to feel superior to the political analysts here.

          • TD says:

            It’s not like we’re particularly homogeneous anyway. Is that Jill?

          • Nornagest says:

            Jill has better grammar, and usually isn’t openly aggressive.

          • Jill says:

            “it’s the heavy diet of anime, comic books, video games, and science fiction combined with utter certainty that one is correct and a deep need to blame and castigate, fruitlessly bitching to no end, with no sense of shame or human sympathy”
            May 16, 2016 at 7:51 am
            I think I’m beginning to understand what’s gone wrong with the political analysis here.

            YES. To you. The person with the symbol but no name. Is this board so scary to you that you feel a need to be nameless? That would actually be understandable to me.

            Some folks want to be the arrogant superhero who knows all and condescends to everyone else, playing dominance/submission games constantly.

            That was the point of the vox.com article about Norm Ornstein that I posted on a previous thread. The guy is Right Wing– works at the American Enterprise Institute. He’s not talking about ideology. He’s talking about the level of political discourse. He sees this as having gone downhill big time since Newt Gingrich in the 1990s.

            And he sees it as having gone downhill big time in those particular ways mentioned in that quote.

            If you build it, they will come. If you bash it, with personal and destructive criticism and arrogance, then everyone will just feel badly and nothing constructive will be accomplished. People might become very disheartened and lose all hope of constructive problem solving.

          • Ƭ̵̬̊ says:

            Speaking on behalf of all people who use a symbol rather than a name, I think there’s nothing weird about it. Stop culturally appropriating me.

          • Dirdle says:

            Ah yes, people liking things wrong. That’s the cause of our woes! How dare they!

            I can believe that different political tribes have different tastes in fiction. It would be surprising if they didn’t. Further, saying that there’s probably some feedback loop going on that grows specific divergent memeplexes also seems plausible, though I’d bet on underlying biological factors first. But really? “Them $NERD_INTERESTS make you all robotically unsympathetic and convinced of your own righteousness”?

            If you want to see why this is wrong, consider the symmetric “Your $ELITE_INTERESTS make you out-of-touch and convinced of your own righteousness” position*. If there’s any reason for dismissing it that can’t also apply to the original position, I’d be very interested to hear it.

            On the other hand, if you keep posting under that name, we’d have our own Mistake Not…, which would be cool.

            * – it’s the heavy diet of theater, free verse, indie films and postmodernist novels about middle-aged people having affairs combined with utter certainty that one is correct and a deep need to blame and castigate, smugly condescending without end, with no sense of fun or moral integrity.

          • Airgap says:

            @Dirdle:

            As an exercise, try to figure out why my riposte was so much more effective than yours. The nasty man with the long username will tell you it was, if he comes back.

          • Nornagest says:

            Is that a glyph for the symbol-latterly-known-as-Prince? I wonder what else is hiding out in the bowels of Unicode.

          • Anon says:

            re: the essay

            An important missing piece is that even a superman-tier fellow can’t PROTECT everything he desires. Sure, the military can’t harm him, but let’s say he forms a nation where he runs things as he wants – the military can just nuke it, or have some kind of dead man’s switch system should certain politicians get a bit too eyebeamed, etc. It would likely be more cold-war-ish than a strict “supes does what he wants”. Then again, this assumes the superperson desires a structure more nuanced than “I control everything”. I’m assuming they would, because running a planet seems like a lame job for someone with superpowers.

          • Mary says:

            “An important missing piece is that even a superman-tier fellow can’t PROTECT everything he desires.”

            He also can’t actually make you do anything.

            Witness that in both antebellum South and imperial Rome, a slave owner could punish a slave fatally without legal consequence — yet in both times, owners were known to pay slaves to work.

            Slacking off can’t be fixed by the knowledge that the person can be crushed for it. As witness the USSR.

          • Airgap says:

            Witness that in both antebellum South and imperial Rome, a slave owner could punish a slave fatally without legal consequence

            Not in the south. Look it up.

          • CatCube says:

            @Mary

            He also can’t actually make you do anything.

            An important distinction that new officers need to learn: You can’t make anybody do anything. You can only make them wish they had done it.

      • Wrong Species says:

        It’s interesting that you suggest mutually assured destruction. My thoughts were the exact opposite. Give all the superheroes their own nation and if they try to do anything terrible, team up and stop them. The problem with your suggestion is that the superheroes could secretly conspire with each other and manage to control the nations of the world without a fight. The problem with my idea is that humans may not be a match for all of them and without other superheroes as a check and balance, it may be impossible to stop them from imposing their will on the world.

        • Nels says:

          Yeah, I definitely think all the supers together is too tough for the rest of the world to deal with, so I can’t really see that system working well. Your last sentence was what I was trying to get at.

          When they are in their own countries, they are with their own countrymen in a national defense organization with a common goal of defending their countries from outside threats. I think it’s very reasonable that they start to identify more with their countrymen and the friends they work with – with that connection rather than the superpowered one. Hopefully this loyalty halts conspiracies, but even if it doesn’t and those conspiracies succeed, doesn’t that just naturally lead to the second outcome anyway?

      • Nornagest says:

        I like how the comments in this thread more or less recapitulate one of the subplots in Worm.

    • Loquat says:

      a) Trying to mete out vigilante justice can already get you sent to prison in the US. You may be well within your rights to shoot someone who’s threatening your own person or property, but if you go out looking for criminals to beat up you’re going to wind up on the wrong end of the law. A whole lot of what comic book superheroes do is just completely independent vigilante justice on a larger scale, and I can’t see any self-respecting government tolerating that for long. If they want to fight bad guys, they’re going to get integrated into the police and/or military.

      b) Basically all international stuff seems likely to be either banned or subjected to the same rules international military intervention is subject to now. Pretty much no sovereign nation is going to agree to a legal system under which foreign citizens can come conduct paramilitary operations in their country, endangering their citizens, without needing to get official permission first.

    • John Schilling says:

      It helps if you can pin down exactly what a “superhero” is.

      For most of comic-book history, a defining characteristic of superheroes is that they did not kill people, even in self-defense or in the defense of innocent bystanders. Meaning anything they do can in principle be undone if society disapproves. Also part of the definition is that a superhero doesn’t seek profit or political power, at least not by way of his superness. That takes out most of the motives for wrongdoing. And they turn their defeated foes over to the secular authorities for punishment, which means that while they may be defying the government w/re tactics, they are at least implicitly letting the government call the shots at the strategic level.

      The category for anyone who doesn’t obey that code is “supervillain”. Even if, as has occasionally been the case for people like Lex Luthor or Victor von Doom, they are presented as zealously championing the welfare of humanity or some subset thereof. They kill people in the process, become rich and powerful, and place themselves above the law, so they are villains in the black-and-white morality of the comic books.

      And as others have noted, since the late 1980s actual superhero comics have taken a more nuanced view of this, but movie superheroes still pretty much have the “no killing, no profit, submit to secular authority in the end” code. Which doesn’t really make for interesting stories, though the cinematic “Watchmen” captured at least some of the interesting bits of its dead-tree predecessor.

      If you’ve got a community of people powerful enough that the tools of mundane law enforcement are next to useless against them, well, I think Scott once noted that the defining characteristic of the Middle Ages was that one armored knight could defeat any number of peasants. Which is an exaggeration, but not enough of one to matter – for a while knights could and would, out of tribal self-interest, organize to preemptively break up any attempt at organizing peasants to overpower knights. So will superheroes, if it comes to that.

      Note that realistic human motives include, “I’d rather people see me as the Good Guy, so long as it doesn’t involve submitting to infuriating petty rules made by people who don’t understand what it’s like in the field…”. And also, “I’d rather find an excuse to see those really really powerful people as Good Guys than try to fight them with nothing but a pitchfork”.

      So, secular law is basically written by the knights. And there is a separate spiritual law based on shaming the knights (and everyone else) by categorically withholding their Certified Good Guy status if they go too far astray. Possibly something similar would emerge.

      • grendelkhan says:

        Ooh, relevant! “Finally, I have seen the day…”. The kind-of-nichey Grendel comics from the 1980s address this! It started off as a sort of Batman pastiche with more stabbing, then turned into this sci-fi mashup, far-future-seen-from-the-1980s, zaibatsus, flying cars, shoulder pads and superweapons.

        So he’s Bruce Wayne, complete with Dark Knight Returns broom moustache. And, frustrated at the injustice he sees every day, by night he dons the costume of Grendel and sets out to battle it on the streets… except he doesn’t. It wouldn’t make any sense to. He’s a powerful man; why abandon that for something much less effective? He fights his implacable enemy using the mechanisms of the society he’s highly placed within: a regulatory council, an inquiry to find illegally-held information in vast archives, visibility. He rejects outright murder, however much he despises his enemy, until the end.

        This takes its toll on the genre conformity, of course; there’s a secondary protagonist who does dress up in costume and stab people, but he’s mad and doesn’t really drive the plot, apart from thematically.

        You can’t really have plausible superheroes, if only because history doesn’t generally work on the Great Man Punches Things model. They’re symbols, and if you mistake an analogy for an isomorphism, you’ll end up asking pretty silly questions.

        • pneumatik says:

          Thanks for that link. I’ve bookmarked it to read in the morning.

          I also may end up rereading my Grendel collection. And trying once again to track down a copy of issue #29 (I think that’s the one I’m missing).

      • Eggoeggo says:

        Aren’t most of the really interesting super-villain plots based around stripping the heroes of the public support that legitimizes them?
        IIRC (and this is from when I was like 7), the old live-action superman TV show had a plot where Luthor had everyone convinced that Superman’s powers were causing environmental damage: heat waves, earthquakes, etc.

        That seems more interesting than “punching each other in the sky for 20 minutes like Neo and Smith in that one movie we just pretend didn’t happen”.

        • Nornagest says:

          Depends how “really interesting” looks to you, I guess, but I can’t think of too many famous or influential plots that use that hook, except for one of the threads of Watchmen.

          A lot of superheroes have trouble securing reliable public support in the first place, or don’t seek it at all. It’s probably most obvious in the X-Men line and its imitators (where the powers are a transparent stand-in for membership in one marginalized group or another, depending on the writer), but even Spider-Man, a fairly traditional good guy in most respects, has to deal with his civilian boss trying to smear his costumed identity.

          • Mary says:

            Hmm. “Hero is framed” is a relatively common plot in my experience. And others where he looks bad. Notable for the gullibility of the crowds.

            (A good version is The Cloak Society, Villains Rising, and Fall of Heroes by Jeramey Kraatz.)

        • JDG1980 says:

          IIRC (and this is from when I was like 7), the old live-action superman TV show had a plot where Luthor had everyone convinced that Superman’s powers were causing environmental damage: heat waves, earthquakes, etc.

          You’re probably thinking of “The Man of Steel Bars“, an episode of the 1990s Lois and Clark: The New Adventures of Superman TV series.

          • Eggoeggo says:

            November 21, 1993, so I was even younger than seven. That explains the really poor memory.
            Thanks for the ref!

          • LHN says:

            I understand the concept of the passage of time, and I recognize that Lois and Clark is now old enough that, for example, someone who was a small child when watching it for the first time can now be an adult.

            I nonetheless feel great pain at the idea that “the old live-action superman TV show” now references something that didn’t star George Reeves.

          • hlynkacg says:

            I know the feeling mate.

      • Jiro says:

        For most of comic-book history, a defining characteristic of superheroes is that …

        Something else I’ve found obvious but which a lot of people seem not to: Traditional superheroes only handle cases where a successful outcome would be morally unambiguous. A superhero will commit a crime to stop the Joker from killing. Pretty much everyone except the Joker thinks we’d be better off with the Joker not killing. A superhero will not commit a crime to prevent a particular politician from being elected, even if electing the politician would save 100 lives and catching the Joker only saves 10, because whether a politician is bad is not something everyone agrees on.

        Modern superhero writers have a tendency to use superheroes as author mouthpieces and end up violating this traditional superhero rule. This has the side effect of making superhero activities a lot more questionable.

        • grendelkhan says:

          Modern superhero writers have a tendency to use superheroes as author mouthpieces and end up violating this traditional superhero rule.

          This at least goes back to Grant Morrison’s run on Animal Man in the late 1980s. Wow, does he ever have non-subtle politics. But writers have slipped their politics into comic heroes for a long time, even before the Vertigo invasion.

          • LHN says:

            It goes back to Siegel and Shuster’s Superman, who spent a lot of his first year physically threatening politicians and captains of industry, or knocking down a slum (with just enough warning for the residents to grab what they could carry) to force the government to build public housing. At about the same time, Captain America was violating the Neutrality Act right and left in a way that looks fine to most of us, but presumably didn’t fly well with members of America First.

            But the genre was still coming into focus then. For most of the next few decades, superheroes were studiedly apolitical (within the Overton Window of the US, of course). When they started having political opinions again starting in the 70s, they were still mostly careful to only go after actual illegal activity. (Green Arrow might talk left-wing politics, but he was never going to engage in direct action against someone who wasn’t a clearcut criminal by any standard.)

          • Jiro says:

            Siegel and Shuster had a story where Superman beats up foreign athletes for expressing anti-American views. They turned out to be spies (their country Dukalia was apparently meant to be pseudo-Nazi-Germany), but he didn’t know that when he was beating them up. It shows exactly what is wrong with using superheroes politically–the creators of Superman, with perfectly good intentions and relatively liberal beliefs for the time, still managed to unintentionally have Superman be a fascist oppressor when he meddled in politics.

          • LHN says:

            Captain America Comics #2 likewise has Cap punching a random fellow airline passenger because Bucky says “he looks like a fifth columnist”. (He is, but not based on any evidence available to Bucky or Cap.)

          • suntzuanime says:

            As a rule of thumb, it’s best to wait until someone makes a mistake before you criticize the basis of their decisions. How do you know Bucky doesn’t have a solid visual model of a fifth-columnist?

          • TD says:

            @Jiro
            “It shows exactly what is wrong with using superheroes politically-”

            I don’t know. I find those early stories way more interesting than whatever Superman is doing now. There’s one story where he engages in social engineering by smashing up cars and assembly lines to send a message about road violence. How do you top that?

            The “Superman is a reluctant messiah” thing they keep milking is way less amusing. A fascist oppressor is more fun.

          • Eggoeggo says:

            Captain America Comics #2: “The Ageless Orientals That Wouldn’t Die!”
            Now that’s good comic writing. Team Fortress comics teased us with hippie punching for several issues, but I don’t think they ever delivered.

            What I wouldn’t give for a return to the Rainwater-&-Pure-Grain-Alcohol Age of anti-communist comics.

          • Jiro says:

            I find those early stories way more interesting than whatever Superman is doing now

            The problem is the combination of “superheroes never make mistakes, so it’s okay for them to break the law” and “superheroes meddle in politics”. If Superman is stopping Lex Luthor from making California fall into the ocean, I can easily believe that he has little chance of making a mistake. If Superman is beating up people for exercising free speech, it is much harder to believe that he’ll never make a mistake–especially when the justification he uses in the story is to our eyes a mistake.

      • MugaSofer says:

        movie superheroes still pretty much have the “no killing, no profit, submit to secular authority in the end” code.

        This isn’t really true.

        The current DC movie heroes kill (and are widely criticized for it), but all the Marvel heroes kill people as well and openly oppose the government (and Iron Man makes money off it, too), and the X-Men are on-and-off at war with the United States and stab people to death a bunch.

        Also, Batman has killed people in every one of his film incarnations except maybe the Adam West one, and they’ve made Watchmen and Kick-Ass films. Also, if we count films without comic counterparts, then there’s Chronicle and Hancock.

        In fact, the only movie superhero who doesn’t kill people is Superman, and he threw Zod down a chasm.

      • God Damn John Jay says:

        movie superheroes still pretty much have the “no killing, no profit, submit to secular authority in the end” code.

        Tell that to Zod’s snapped neck.

    • Outis says:

      My question is: why was Captain America the one who refused to submit to government authority? Isn’t he a soldier? And Tony Stark is an entrepreneur. It seems completely backwards.

      • Wrong Species says:

        It makes more sense if you’ve seen the previous movies. In the last Avengers movie, Tony Stark created an AI that went rogue, killing many. So he wants to atone for what he has done. In the last Captain America, it was revealed that one of the top spy agencies in the US had been infiltrated by a Nazi-like organization. So of course he’s concerned about what could happen when the government (or the UN in this case) tells him what he can and can’t do.

        For a summer blockbuster, the movie is actually fairly nuanced. Each side has good reasons for what they do and there is never a clear indication of who’s right. You have to decide for yourself. Considering that they manage to introduce two new characters and a good villain while giving everyone some screen time, it’s quite an impressive undertaking.

        • Siah Sargus says:

          This movie also did the “versus” story better than I’ve seen in a while. The formula is usually: two heroes are lured into fighting each other by a manipulative villain, only to discover that they are both on the same side, and then take on the villain in the climax together. Even in Spider-Man comics in the 70s this was understood to be largely played out.

          So… when the story introduced five new winter soldiers in the second act with a flashback, I rolled my eyes and expected a 3v5 to cap off the story. The way it wound up playing out, with Zemo killing all the soldiers on the rationale that he didn’t want the Avengers united against a common foe, was instead the capstone to the last wind of an extended fight between Cap and Iron Man that had been going on the whole movie. Not only did it make sense for Zemo to want them to continue their fight with each other after they had gone to face him, it also worked much better by keeping the conflict congruent throughout the whole film, instead of having a last-minute villain. And to complete this, Black Panther prevented Zemo’s suicide, ending the cycle of revenge that was started by Captain America himself.

          Alright, I’ll stop now, I’m gushing.

        • Anon says:

          It’s a bit odd that the movie itself didn’t directly address these obvious motivations (to my memory). It didn’t even seem like it was supposed to be some unspoken fact, it’s like they just forgot to put those lines in. A nod to Scarlet Witch’s family being “collateral damage” but still not trusting Stark (for that same reason) seems like an obvious unused angle for her being on the fence, too. Weird writing.

          I do like that I’ve heard people say “they only made one side even plausibly believable!” about both sides, though.

      • John Schilling says:

        Isn’t he a soldier?

        Captain America is, as the title implies, an officer.

        “I, Steve Rogers, do solemnly swear that I will support and defend the Constitution of the United States against all enemies, foreign and domestic; that I will bear true faith and allegiance to the same; that I take this obligation freely, without any mental reservation or purpose of evasion; and that I will well and faithfully discharge the duties of the office on which I am about to enter. So help me God.”

        There’s nothing in there about obeying orders or doing what the government says. Those can be pretty strongly implied in some contexts, but not universally so. There is something very explicit in there about defending the United States Constitution against its domestic enemies. And Rogers comes freeze-dried from an era when the Constitution wasn’t quite so liberally reinterpreted as it is today. The comics, at least, have pretty consistently portrayed Captain America as the sort of old-school patriot who doesn’t blindly trust or obey the government.

        Iron Man isn’t a terribly good fit for law and order, but the entire superhero genre basically creates characters that don’t fit that role and, when the story demanded a whole bunch of loyalists, he was the least-bad fit in the Marvel stable. Tony Stark, Crony Capitalist?

        • LHN says:

          Aside: Given that he leads the Howling Commandos, movie Cap is presumably an officer (and maybe even a captain). But I think comics Steve Rogers (who had a secret identity, unlike the movie version) spent the entire war as a PFC; I’m not sure if his Captain America identity had official military status or if people just went along with him because of course you would.

          • John Schilling says:

            Every reference I can easily find says that “PFC Rogers” was a cover story, but none of them specify a rank. Pretty sure he commanded soldiers more than once, so I’m going to go with the blatantly obvious.

          • LHN says:

            It may have been a cover story eventually. (Or that may be a post-revival retcon.) But I read the first couple of issues earlier this week (thanks, Marvel Unlimited!), and there’s no sign that he’s reporting to anyone. Meanwhile PFC Steve Rogers is peeling potatoes– and getting upbraided by his superiors for how long he’s taking, thanks to his unexplained absences.

            Which at one point include traveling across the Atlantic and crossing France into Germany. (Steve dresses as an old lady. Bucky simultaneously kicks both Hitler and Goering.[1]) That absence is long enough that Steve gets put into the guardhouse for a week. Which seems pretty lenient, given how long he must have been AWOL.

            [1] http://goodcomics.comicbookresources.com/2016/05/11/i-love-ya-but-youre-strange-that-time-captain-america-had-to-dress-like-an-old-lady-to-stop-the-nazis/

            So at least at the beginning, Captain America was much more of a freelance superhero guise than a military position, and one which inconvenienced PFC Rogers’ career rather more than Superman’s activities did Clark Kent’s.

      • Outis says:

        Wrong Species, John Schilling: Thanks, that explains it to my satisfaction.

      • Chalid says:

        Tony Stark is a *defense contractor,* he’s definitely got to stay on the good side of the government if he wants his business to continue operating. And having essentially worked to strengthen the US government his whole life, it makes sense that he would be convinced that that’s the right thing to do.

        Even if he was some other sort of businessman, it would be easy for the government to shut down his whole corporation.

        (Note: haven’t seen Civil War yet)

        • Guy says:

          Tony Stark *was* a defense contractor, but realigned his company after seeing the results firsthand. The guilt is strong in that one, which drives his actions both in Avengers 2 (which should have been an Iron Man movie, but I digress…) and Cap 3.

      • MugaSofer says:

        It doesn’t really make sense.

        Tony Stark has literally spent every film doing his own thing and flipping off the government, and has rejected suggestions like this onscreen twice before; and Steve Rogers literally spent the majority of his life dreaming about being a soldier. (Other characters don’t really make sense either – Black Widow is a Snowden-style whistleblower and implied war criminal, Hawkeye has a vulnerable family, etc.)

        In the film’s defence, however, Stark has been acting increasingly unstable and hostile to the idea of “superheroing” in his films – building robots to do it for him, suffering panic attacks, temporarily retiring twice – while Cap appears to operate entirely on the principle of “is this person a Nazi?” and has already opposed the US Government in his films (they turned out to be Nazis.)

        In the film’s offence, none of the legal details made any sense either and the superpowers were weird and inconsistent. Also, it has the same plot as Batman v Superman.

        • Outis says:

          I like how all the superpowers are pretty much exactly balanced. No matter the nature or source of their powers, any two superheroes end up pretty much evenly matched.

          • I don’t think we saw, e.g., Clint and Vision be matched. Heck, I think we explicitly disproved that thesis in Natasha v. Hulk in the first Avengers movie.

            What the teams had was a counter for each hero, and a bunch of other heroes who could act as force-multipliers. But it was clear that none of the heroes whose powers were based on Actual Physics could have stopped Vision for long.

          • InferentialDistance says:

            Don’t forget that both sides were pulling their punches to avoid lethal harm, so when a heavy hitter fights a softy, they use way less force.

          • Guy says:

            Not to mention – Hawkeye got the drop on Vision when he attacked the compound and set up very specific countermeasures and he still lost.

      • Jaskologist says:

        This wasn’t government authority, it was rule by the freakin’ UN. Of course Captain America didn’t go for that crap.

      • Fahundo says:

        Captain America has never been one to blindly follow orders. And in the movies they’ve shown him increasingly at odds with modern bureaucracy. Refusing to follow an order he disagrees with is consistent with his character. If you watched his previous movies, in the first one he only became Captain America after leading an unsanctioned rescue operation. And in the second one he found himself in the midst of a massive government conspiracy.

        Iron Man is pretty much on the government’s side due to personal guilt. He created an AI that almost destroyed the world and it continues to haunt him. He’s starting to believe that he has come too close to causing catastrophe and needs to be supervised, and he also projects that belief onto the other heroes, even though none of them have been as reckless as he has.

    • Mary says:

      You might find the Wearing the Cape series by Marion G. Harmon interesting.

    • Neanderthal From Mordor says:

      One of the oldest definitions of the state (going back at least to Weber) is that the state has a monopoly on violence on its territory.
      That means that the state may not be directly opposed to superheroes themselves, but it would be to their heroics.
      In practice, most states take preventive action against any possible challenge to their monopoly by controlling guns and militias, so I presume they would go after superheroes even if they were law-abiding.

      • States don’t have a monopoly on violence in their territory–if they did there would be no private murders, robbery, etc. I think the usual claim is something like “a monopoly on legitimate violence,” which rapidly becomes circular if “legitimate” means “legal” means “approved of by the state.”

        My definition avoids that by treating “legitimized” not as a moral or legal category but as a description of how people react to its acts.

        • Airgap says:

          I still think all the drug murders in the ghetto are secretly committed by agents of the Office of Management and Budget.

    • TheAltar says:

      I think there are a few ways things could go. Either the powered individuals become heads of state (like Black Panther is in Wakanda), as people willingly throw governing power at them and they decide they can do more good with more power, OR they form independent groups that supersede governments in power and become the new ruling bodies of the world.

      When I watched Civil War I mainly saw it as the governments of the world trying to pull off a massive power grab by convincing the Avengers that the population of the world should have control over powered individuals rather than the other way around. If the entire group decided to say “No” to the UN, then the UN would have no recourse and no real ability to do anything back.

      The Avengers being the new world rulers is prevented as a possible future due to Tony Stark still being very regretful of his own actions and no longer trusting himself after almost single-handedly killing all life on earth with Ultron. The Avengers directly accepting democracy and governments as the best method for directing safety policies for the world is prevented as a possible future because Captain America saw how easily SHIELD was infiltrated by Hydra (and could easily be infiltrated by someone like Loki as well) and knows the Avengers will almost instantly become helpless puppets to government leaders with selfish personal agendas. Adherence to a path of strong violence is also prevented by Vision being largely a pacifist and possibly being the strongest person there.

    • Aegeus says:

      They should be registered, for one simple reason: So that when Fireball Man robs a bank, the government can look in their database of supers who throw fireballs and see what comes up.

      I wouldn’t support anything beyond registration though, because trying to turn people into an underclass when those people can kill you with their brain is a stupid idea. Admittedly, a lot of people in the Marvel-verse are just racist against mutants, so you can’t expect rational responses. But I would still suggest that they not pursue policies that make mutants sign up with Magneto.

      I’d also probably support some incentives to keep supers out of trouble and stop them from turning to villainy to support themselves. Yes, this means you’re essentially paying people for being born, but I bet it pays for itself by not having supervillains tear up your cities.

      I would not be horribly afraid of government tyranny because the supers have a monopoly on force – if the Avengers don’t want to be evil, there’s no power on Earth that can force them to. Of course, by the same token, that means that there’s a risk of superhero-driven tyranny. But that’s sort of baked into the superhero setting – the only thing that can stop a supervillain is a superhero – which means that a government-backed super-team is probably the best option we can hope for. At least SHIELD has some sort of oversight.

      • MugaSofer says:

        The film doesn’t center around registration, for what it’s worth. According to the SHIELD series, supers are already kept track of (which makes sense.)

        Fireball Man could easily have only just triggered, though, or kept his powers a secret, especially in a world where supers are rapidly brought to justice.

    • Hummingbird says:

      I saw the movie yesterday. It started out on the themes of government control, vigilantism, and power. But then it embarked on a plot in which the faction opposed to government control was trying to stop a villain’s scheme, while the pro-control faction just mistakenly thought a member of the other faction was the villain. Both sides end up recruiting other heroes seemingly without ideological interest, from Ant-Man, who seems like he’s just happy to be included, to Spider-Man, a mercenary who takes a new suit as payment for his services to Tony Stark. The theme of government control is not addressed further, and is never resolved.

      It would make sense for governments to generally be ok with certain acts of heroism, like stopping runaway trains, defusing bomb threats, stopping bank robbers, or saving kittens from burning buildings. But if anything goes wrong in these scenarios – the train derails into an affluent neighborhood, the bomb goes off anyway because the superhero doesn’t know anything about how bombs work, a robber’s bullet bounces off the hero’s armor and into a civilian, or the burning building collapses and kills a lot of people because the hero punched through walls – then the government will jump at the chance to get involved. They can’t touch heroism as long as people support the outcomes. See the beginning of the movie “The Incredibles”.
      Or heroes are tolerated because they protect against threats so large that the government has to rely upon them. But this only applies to characters like Superman and Dr. Manhattan.

      In the “real world” though, with sufficient numbers of people with superpowers and a large enough power scale, lack of government control of supers would lead to a kind of superhero-warlord game of Risk, while government control of supers would probably lead to tyranny.

    • Paul Goodman says:

      One proposition I’ve seen in Worm and possibly elsewhere that seems plausible is that a world with superpowers will naturally tend towards feudalism. Feudalism arose from an environment (and in fact independently in more than one environment) where a small number of well-trained, well-equipped knights were a more significant military force than basically any number of peasant levies. Gunpowder weapons allowed any recruit to be trained and armed relatively cheaply into a valuable asset, tilting the balance in favor of democracy.

      But if you have people with superpowers who can’t be effectively fought except by other supers, that looks a lot like knights who can’t be effectively fought except by other knights. If the mass of the population has no appreciable military power, pretty soon Moloch will take away their political power too.

      • TD says:

        The gun is democracy, yes.

        Only there’s probably some lag involved. Citizens’ militias are no longer considered militarily effective (outside of militia fantasies), and haven’t been for over a hundred years, so something like feudalism could potentially come back around in our real world. One thing that stops it is that even though the military is vastly superior to citizens in any combat situation, the military’s sympathies are with the citizens, given that that’s who their friends and family are.

        Automated armies will take that away, and back to feudalism we go! Actually no, because feudalism requires that peasants be economically useful, so something new will happen. If everything is automated except high-level tech jobs, then no one else has any useful role in society. Looked at materialistically, this might imply that the great extermination is around the corner. In all previous eras, no true state supervillainy could exist on this scale. Even genocidal states could only go so far, as elites needed their people for the economy and needed the military to back up their power. When human beings stop being useful, all bets are off.

        Artificial intelligence is a superpower wielded by a superbeing, so groups like MIRI want it to be “friendly”. But achieving that goal requires a high degree of centralization, which is to say that as computers become more powerful and algorithms more versatile, anything that could potentially blossom into AGI, anywhere, must be contained. However, the consequence of that is that no one can defend themselves from being exterminated by those who control automation. Essentially, we need to solve the friendly human problem first.

        This is why I think we are fucked either way. The human era will be over soon.

        • Stefan Drinic says:

          Knightly dominance requires more than a knight being stronger than a bunch of peasants. Arming and training a knight was very expensive and time-consuming, which meant that anyone trying to start a rebellion against such types would have their revolt stamped out before it could go anywhere. Guns, on the other hand, are much cheaper and easier to use; you can take an average person and train them to be effective in combat much more easily today than you could when knights were still militarily relevant.

          • “Guns, on the other hand, are much cheaper and easier to use; you can take an average person and train them to be effective in combat much more easily today than you could when knights were still militarily relevant.”

            True of crossbows too, however.

          • Cliff says:

            Crossbows have an extremely limited range.

          • Early guns had a limited range too.

          • Airgap says:

            I guessed that guns had worse range than crossbows until rifling. I’m seeing figures of ~400 yards for medieval crossbows, 175 yards for the Brown Bess.

          • John Schilling says:

            I think you’re comparing the literal maximum range of crossbows with the effective range of muskets. I don’t have data on musket balls, but the Sporting Arms and Ammunition Manufacturers’ Institute indicates that the lethal range of a 12-gauge shotgun slug is about 1000 yards.

            You can’t expect to hit a specific target at that distance, but you weren’t hitting specific targets with a crossbow at 400 yards. And really, you weren’t shooting the crossbow at all until the target was within half that distance.

          • Ilya Shpitser says:

            Actual European warfare evolution was fairly interesting. Heavy cavalry charges (plausibly the domain of “the knight”) were decisive for a very long time, well into Napoleonic warfare, when cannon was also important, infantry had guns, and cavalry was no longer armored.

            The Siege of Vienna in 1683 was lifted partly by a celebrated Polish cavalry charge. Firearms were well into being a thing by then.

            Earlier, men-at-arms (some noble, some not) had a retinue of 3 to 10, called a “lance” (sort of like the old-school “squad”), from which armies were made. This retinue had archers and such.

          • Airgap says:

            I think you’re comparing the literal maximum range of crossbows with the effective range of muskets.

            No. The effective range of the Brown Bess is even lower.

        • CatCube says:

          Citizens’ militias are no longer considered militarily effective

          Seems to be working out for the Taliban.

          • Airgap says:

            Citizens’ militias are definitely more effective when you have coke-fueled whoremongers in the US Congress airmailing you RPGs. I know my local neighborhood watch misses the hell out of Charlie Wilson. The last time we caught a prowler we actually had to wait for him to bleed out from the gut shot. Goddamn waste of time if you ask me.

          • CatCube says:

            Have you talked to a neurologist lately? I was looking over old posts a couple of days ago, and I don’t remember you being insane. You should probably get that checked out.

          • Airgap says:

            Your memory of reading sane posts by me is almost certainly iatrogenic. Try suing your therapist, or if you don’t have one, Scott.

        • Nornagest says:

          Citizens’ militias can’t effectively project power, but you don’t need power projection to make life unpleasant for occupying troops that are already there. Those tactics don’t lend themselves to conventional victory — very few insurgencies have ever just straight-up kicked their opponents out, and the ones that have, have usually enjoyed conventional military support from foreign powers — but in most cases you don’t need victory, you just need to make the war unprofitable.

        • NN says:

          Citizens’ militias are no longer considered militarily effective (outside of militia fantasies), and haven’t been for over a hundred years

          Tell that to the citizens’ militia with a grand total of 2 members, neither of whom had any combat experience or training whatsoever, who shut down the 6th-largest metropolitan area in the most powerful country in the world for 2 days using fireworks and kitchen appliances.

          I am, of course, referring to the Tsarnaev brothers.

          Not too long ago, a somewhat larger and better-equipped citizens’ militia wreaked so much havoc in a single night that it brought the 7th most militarily powerful country in the world into a state of martial law for 6 months and counting. You might have heard about it.

          Or just go to the airport, take a look at a security line, and reflect on how having to take off your shoes and undergo a virtual strip search before getting on an airplane would have seemed like something out of a cartoonish dystopia story 20 years ago. Then reflect on how even these cartoonishly dystopian security measures still aren’t effective at protecting air travelers.

          And that’s in the developed world; in the developing world… Just open up Google News and type in any of the following search terms: “Syria,” “Iraq,” “Libya,” “Afghanistan,” “Sinai insurgency,” “Boko Haram,” “Donbass,” and “Mexican Drug War.” You’ll quickly see why the statement that “citizens’ militias are no longer considered militarily effective” is laughable.

          • Nornagest says:

            Then reflect on how even these cartoonishly dystopian security measures still aren’t effective at protecting air travelers.

            I read that as “preventing air travelers”. I think I like my version better.

          • Psmith says:

            To be fair, I’ve seen it mooted that successful asymmetric warfare depends on the militarily stronger side being unwilling to go full Colonel Kurtz, which strikes me as more plausible than the standard “lol but drones tho” line of argument.

          • NN says:

            To be fair, I’ve seen it mooted that successful asymmetric warfare depends on the militarily stronger side being unwilling to go full Colonel Kurtz, which strikes me as more plausible than the standard “lol but drones tho” line of argument.

            That may be true, but even when going full Colonel Kurtz works, it tends to take a long time and leave the territory under dispute in a very bad state. For example, the Russians pretty much ignored the Geneva Conventions during the Second Chechen War, and after 10 years and more than 7,000 military deaths they “won” in the sense that in the last 6 years, the conflict in the North Caucasus has only killed about as many people as the Northern Irish Troubles killed in 30 years. Similarly, Bashar al-Assad has paid little attention to humanitarian concerns during the Syrian Civil War, and he may eventually win, but even if he does there won’t be much left for him to rule over.

          • keranih says:

            successful asymmetric warfare depends on the militarily stronger side being unwilling to go full Colonel Kurtz

            More to the point, it depends on the militarily stronger side restricting legit violence to their military and preventing the sort of vicious ugliness of Bleeding Kansas, Bosnia, and the like that happens when civilian populations go to war on each other.

            So long as the military side retains the capability and will to engage in “a whiff of grape” at strategic points, insurrections can spend decades not getting anywhere, while political processes may go on elsewhere. When they do not, and both of the civilian populations become engaged, we get traditional (i.e., non-American) civil war.

            The forbearance of the stronger military does not promise success to the insurgents, of course – the Shining Path and the Palestinians are long-running insurgencies that still have not won. But it’s the combination of restraint and control of the violence that creates an environment in which the insurgency does not die quickly.

        • Corey says:

          When I expressed this position (elsewhere), someone pointed me to Jacobin’s Four Futures story, which labels this scenario “exterminism” and lays out some alternatives to it. Being Jacobins, they’re coming at it from an explicitly socialist perspective, but there are other possible endpoints once labor is no longer scarce: automated communism, ordinary socialism (say, robots + UBI), or rentism (everyone’s employed in intellectual property industries, so they can pay licensing fees on their replicators).

          Disclosure: I do still think exterminism is likely, because people are assholes, but it’s hardly inevitable.

      • JDG1980 says:

        One proposition I’ve seen in Worm and possibly elsewhere that seems plausible is that a world with superpowers will naturally tend towards feudalism. Feudalism arose from an environment (and in fact independently in more than one environment) where a small number of well-trained, well-equipped knights were a more significant military force than basically any number of peasant levies. Gunpowder weapons allowed any recruit to be trained and armed relatively cheaply into a valuable asset, tilting the balance in favor of democracy.

        In comic-book universes, technology can make up for the lack of superpowers. This technology usually ends up being restricted to the hands of a handful of superheroes, supervillains, and government agencies, but that seems like a result of the narrative requirement to keep the background setting at something approximating modern-day America, rather than a logical end point of the plot.

    • keranih says:

      I have seen the previous movies – and followed some of Cap and IM in the decades previous – and for me, the movie is going to have to do a metric $#!+ton of work in the first act to make me believe that Steve Rogers is on the anti-registration side against Tony Stark, the man who privatized world peace.

      I might be over-conflating Rogers with other similar heroes (Bigwig, f’instance) and with actual military men of that generation, but to me, Tony Stark is still the man who doesn’t trust any system he didn’t build himself – and in Age of Ultron ran smack into a brick wall of his own limitations.

      Having said that – in a system that had superheroes, those powerful individuals would have to be taken into account when governments attempted to order interactions between people. IMO, we’d much less likely see “government” trying to deal with supers than we would see supers concerning themselves with right government. The founding fathers, the Roosevelts and the Kennedys, Napoleon, the ambitious landed warlords of the Kyoto courts – these were all “superhuman” – possessed of skills above those of ordinary mortals. (We see this reflected in the existence of Batman – Bruce Wayne’s superskill might be sheer will, but he is what he is because of being gifted not only with grit, but also enormous wealth.)

      Superman is, imo, separate only in degree, not kind.

      • InferentialDistance says:

        but to me, Tony Stark is still the man who doesn’t trust any system he didn’t build himself

        That’s why he’s pro-registration, though: either he works with the system and has some input into the outcome, or he fights it and watches the government send death squads after all his friends. He has personal experience with how governments deal with things they perceive as threats (he used to help them with that), so he very much wants to make it clear that he and his people are not threats.

        Rogers was always a loose cannon; he was just lucky enough to have a good target to chase, so he avoided pissing off the powers that were. That his judgement is at least 2 standard deviations better than any government oversight is irrelevant, because the assorted political forces of the world demand accountability, and “it was the right call” ain’t gonna cut it (even if it was!).

        • CatCube says:

          “It was the right call” will usually work, if it is in fact the right call. It’s not uncommon for somebody to get away with violating regulations if it works out.

          If you make a habit of violating regulations, you’ll usually end up getting thrown against a wall eventually, because in real life nobody is right all the time.

          • InferentialDistance says:

            What I mean by “the right call” is not “successfully predict the future” but “choose the course of action, given the information available to you at the time, that is most likely to achieve good outcomes”. Because the future is uncertain, even intelligent decisions will fail. Sometimes you do everything right and still lose. And the public gives zero shits that you did everything right: you lost. Regulations are the excuse for losing, so you follow them even if it means you lose more often.

      • Whatever Happened To Anonymous says:

        I have seen the previous movies – and followed some of Cap and IM in the decades previous – and for me, the movie is going to have to do a metric $#!+ton of work in the first act to make me believe that Steve Rogers is on the anti-registration side

        He was literally against registration in the previous movie.

      • LHN says:

        @keranih I came in wondering about that, but the movie did a good job of sketching out the sides based on each hero’s previous MCU experience.

        Spoilers for attitude and previous movies, but not for specific incidents in Civil War:

        Cap’s experience with legitimate authority running things includes the World Council and SHIELD. The former decided to nuke New York while he was there. (Arguably a defensible call, but not one he’d have agreed with.) Both turned out to have been infiltrated and taken over by HYDRA. Nick Fury himself wasn’t corrupt, but defended the merits of compartmentalization, need-to-know, and necessary ruthlessness to Cap only to be completely blindsided by his own organization. Steve isn’t a loose cannon, but he’s never going to willingly outsource his conscience to an organization like the UN after that.

        Ever since Afghanistan, Tony has been desperately pursuing safety, and failing to achieve it. He got PTSD courtesy of the Chitauri, and his attempts to act on his own to improve the world keep backfiring more spectacularly (from Iron Monger in the first movie to Ultron, which he conceived as “a suit of armor around the Earth”), hurting other people more than himself. He doesn’t trust himself (or, by extension, the other Avengers, because of course he’s the one he trusts the most) not to hurt people unnecessarily while thinking they’re doing the right thing. By subordinating his judgment to the world’s, he’s trying to take his proven fallible judgment out of the equation, once again trying to make things safe. (Or at least to not make it his fault that they’re not.)

        There are other, personal issues driving the conflict as well, which contribute to the escalation from talking to punching. But at least within the context of the MCU, they’re both shown to have a point, and events are pushing them further towards their respective poles than they’d probably really feel comfortable with in a calmer place.

        While there are fridge logic issues with the plot, I thought the themes were really well handled.

        • MugaSofer says:

          >the World Council and SHIELD … Both turned out to have been infiltrated and taken over by HYDRA.

          When was the World Council infiltrated? They were just shadowy faces on some screens.

          • LHN says:

            Spoilers for Captain America: The Winter Soldier (2014) and Marvel’s Agents of Shield (and sort of for the first Avengers movie):

            In the first Avengers, the main Council guy Nick Fury argues with over things like the effectiveness of the Avengers and nuking New York is played by Powers Boothe. In Agents of SHIELD, Boothe plays Gideon Malick, part of a multigenerational HYDRA family who holds high rank. (Arguably their leader, but there seem to be rival splinter groups.) The show eventually established that Malick was in fact the Council member in the movie.

            Since the show takes input from the movies, but not vice versa, that’s still potentially ambiguous taking a film-only view. But we also have:

            In the Winter Soldier, Nick Fury’s close friend Alexander Pierce (whose life Fury saved once) is an ex-SHIELD operative, now probably a US Cabinet official (he’s called “Secretary”), who currently appears to be taking a leading role on the Council. (He also at some point declined a Nobel Peace Prize.)

            We learn he’s been a HYDRA operative since his SHIELD days. From his seat on the Council, he’s in charge of the entire mole operation that’s hollowed SHIELD out from within, tried to kill both Fury and Cap, and put HYDRA within hours of destroying twenty million enemies using the helicarriers SHIELD (ostensibly) built for the Council.

            Winter Soldier also establishes that HYDRA established itself in SHIELD right under the nose of Peggy Carter and Howard Stark, and that Nick Fury had zero clue up to the point they almost took him out. That’s pretty much everyone Steve might have trusted to have a clue. Now he’s going to place his activities at the disposal of the UN, which he presumably knows much less well?

          • MugaSofer says:

            Thanks! That’s very convincing.

  6. MawBTS says:

    Why did Pangloss say that noses exist to hold up spectacles? He should have said that gravity exists to keep spectacles on noses.

  7. noone says:

    There’s been an article going around about how acetaminophen may reduce empathy (From Painkiller to Empathy Killer: Acetaminophen (Paracetamol) Reduces Empathy for Pain).
    While not an extremely surprising result, has anyone with a background in the field gone through the article and determined whether the study was well done?

    • Berna says:

      these drug-induced reductions in empathy raise concerns about the broader social side effects of acetaminophen

      This is one drug I have experience with, and I’d be mightily surprised if the empathy-reducing effect of paracetamol weren’t more than offset by the empathy-increasing effect of being in less pain oneself.

  8. Anonymous says:

    Anybody else reading the Interfaces thing?

    I love it; it reminds me of Unsong sometimes, but way darker. Touches on some things that might be of interest to this community.

    Here are some favorites: 1 2 3 4

    (They are all short)

    • Noumenon72 says:

      It seems that they are parts of a continuing story. Maybe it would be best to click the user’s name and read them from the beginning? I’m not that interested. More background

      • Anonymous says:

        The first link in my comment goes to a collection of all parts, in order of chronological posting. The provided examples are non-spoilers.

        The author posts daily in relevant reddit threads, which is a pretty good strategy for making commentary and generally disturbing people in a good way.

    • HircumSaeculorum says:

      I second this recommendation – this is fantastic.

      > We must find and enter the narrow gate, but it will not be easy. In order to find it, we must sort through the many possible pasts to find the few possible futures which result in a humanity free to live and die as humans, and not as an unholy agglomeration of mindless flesh. Unfortunately, as we fight against the forces of slavery and death, it will be precisely our instincts towards the preservation of freedom and life that will lead us to destruction. In short, we live in precarious times.

      It does sound like he’s talking about something like Moloch.

      • Dirdle says:

        It does sound like he’s talking about something like Moloch.

        Exactly what I thought. “What if Moloch were a real outer god, rather than just a metaphorical one?” Which also drags in the Unsong parallel, I guess. “What if the sky cracked and the metaphors that returned to being real were the weird and horrifying ones, rather than the merely very scary good-and-evil ones.” Well, I suppose we’ll have to see where Unsong goes.

        Also strong elements of SCP, of course. Off the top of my head 093 and 3125 are broadly similar, but there’s no lack of random body-horror stuff on there. Maybe worth noting Frictional Games as another purveyor of high-quality SAN damage in the same general vein.

        Yeah this is a great recommendation, thanks Anon.

  9. Jutland says:

    Any other history students read SSC?

    • 1Step says:

      I got an undergrad history degree if that counts. I’m a software developer now though. ¯\_(ツ)_/¯

      I still enjoy reading and learning about history as a hobby. In my experience it isn’t the most popular topic, unless you include the WWII buffs.

    • Eggoeggo says:

      History minor. Fell totally out of touch with the field once I lost access to journals. You take it for granted at university 🙁

    • HircumSaeculorum says:

      Well, I’ll be starting a history degree next year. Why?

      • Jutland says:

        Well, sometimes Scott writes about history or history related stuff. And I think he has interesting things to say. Certainly he has very different priors than the people who get published in American Historical Review. I asked mostly because I wanted to see what other history people think about his history related writing.

        • Urstoff says:

          I follow a few early American history blogs, and I wonder how race and gender studies came to dominate that area of history (do they now dominate every area of history?). Is it simply that the political / economic / etc. areas of history have been completely mined? Is it shared political biases or interests? Or is it just a historical / institutional accident that those areas are now dominating?

          I ask because I like to read books on the founding fathers, and those books tend to be derided as “founders chic” by lots of historians (whether the books themselves are written by historians or not). It just seems odd to be bothered by what captures the public imagination. As a philosopher, I’m not really bothered by what philosophy books are publicly popular; I don’t expect hardcore analytic philosophy to ever be popular with the public, and I’m ok with that.

    • Good Tea Nice House says:

      I’m reading a pretty good biography of Robert E. Lee. Does that count?

    • E. Harding says:

      I got a 5 on the APUSH and AP World exams, and am currently writing a long blogpost on how the parties switched geographical positions. Does that count?

    • birdboy2000 says:

      No longer a student (well, at least not a formal one, reading never ends) but do have an undergrad degree in the subject.

    • Chris H says:

      Took graduate classes before I decided academia wasn’t worth all the bs to get to a good job (plus the decreasing likelihood of ever making it even if I did stick with it). Got an area of specialization/a preferred historical methodology? I tend to prefer structural histories myself (I’m still not reconciled with the idea that post-structuralism was a real advance in the field).

      • Jutland says:

        Area: modern Europe (mostly France, but I’m trying to branch out)
        Methodology: I just discovered microhistory and Carlo Ginzburg which I’m very excited about. In general I mostly study social history with occasional forays into intellectual history and (very rarely) military history.
        Post-structuralism is hard to like. Derrida was worthless for sure. Foucault is I think worth reading but not taking too seriously. I’ve never really heard of structuralist history, except for one book about Nazi Germany by Ian Kershaw that mentioned something called that. But it sounds like you mean French structuralism and not whatever Kershaw was talking about. Did you just apply like Lévi-Strauss and Althusser to history?

    • I don’t have any degrees in history, although a few of my publications deal with historical issues such as the doctrine of the just price. But I am interested in history and do historical recreation, including cooking from cookbooks back to the tenth century.

      • Stefan Drinic says:

        The tenth century? I am intrigued, and am going to want to see that now.

        • The tenth century cookbook.

          My and my wife’s self-published cookbook, which includes some recipes from that one.

          • Stefan Drinic says:

            Ahh, that does make sense. A tenth-century European cookbook would have surprised me a lot more. It does look neat.

          • There’s a late Roman cookbook (Anthimus) but I haven’t worked from it. And there is a sixth-century letter from a Byzantine physician to Theoderic, king of the Franks, from which you can squeeze a few more-or-less recipes that I have used. But al-Warraq is the earliest real cookbook I’ve worked from.

            The earliest post-Roman western European cookbooks we have are 14th century or perhaps late 13th c., but there is one 13th c. cookbook that hasn’t survived but has left daughter manuscripts, one of which is the source of one of our favorite recipes.

            I don’t think there is any reason that there couldn’t be a 10th c. western European cookbook, but so far as I know there isn’t.

          • Stefan Drinic says:

            There’s a number of Roman cookbooks and recipes in general, yes. Back in high school I’d turned in a number of assignments past their deadline, and my teacher found it funny to give me extra homework consisting of a text from Apuleius to translate, which was a recipe of his for the preparation of pig vulva.

            As for tenth century Western Europe.. It’s probably a case of all the wrong people being literate enough to write such a thing. The thirteenth and fourteenth century are after the commercial revolution in northern Europe had started off, as well as the cities in Italy hitting their stride; merchants with clerks are very well able to produce such literature. (Pre) feudal kings generally aren’t.

      • Jutland says:

        What field do you work in?

    • Chevalier Mal Fet says:

      I have a Master’s degree, and a minor in Classics.

      I teach history at the high school level.

      I generally enjoy Scott’s writing on history, since he does have a fresh perspective, even if it is sometimes adorably naive, like his credulousness towards Albion’s Seed.

      • Could you expand on that? His review left me a bit suspicious of the book, but I haven’t read it.

        • Chevalier Mal Fet says:

          Overall, Fischer’s work is really strong – there’s a reason his book is still a text in graduate-level American history courses, which is where I first encountered it. His research is top-notch and his history of folkways (he breaks them down into all manner of things like wealth ways, marriage ways, learning ways, child-rearing ways, etc.) is lots of fun to read, as Scott showed with his “interesting facts” sections. And as a history of the cultural development of the early American colonial regions, it can’t be beat.

          Fischer’s main weakness, though (and this was brought up in the comments of Scott’s review, which is why I felt no need to comment), is that he wants to stretch his thesis too far. You can’t explain modern America using his 4 British folkways, as SSC commenters quickly discovered when they started trying to sort each other into the various cultures and found that it was more complicated than it seemed. Fischer almost totally ignores, for example, the effect of race on American culture, of immigrant traditions beyond the English (many commenters recommended City at the Center of the World as a useful antidote to this particular failing), of subsequent waves of immigration to the United States, and of the environment upon culture.

          Oh, he makes token acknowledgements of these things – saying of course material conditions can affect culture, of course cultures grow and change – but usually forgets about them as soon as he moves on to the next interesting fact. As a result, the book has a mess of contradictions. Culture is ever changing, but the 4 founder cultures are immutable. Culture is a “human instrument,” often a tool of the elite to control the masses, but it’s also apparently genetic (Patton). Culture is at the mercy of its environment, but even though the influence of the South’s wide, slow rivers and pleasant climate on Southern agriculture and its peculiar institution has been noticed for decades, Fischer never develops the idea at all, preferring instead to concentrate solely on the Cavalier culture.

          As a result, the ways that cultures can change over time are totally ignored, since Fischer prefers to play around exploring the colonial founder cultures (with good reason, as they are indeed fascinating!). Hence you get his cursory treatment of topics like the interplay between race, climate, and culture in the South – he even says “race slavery did not create the culture of the southern colonies, that culture created slavery.”

          Finally, Fischer’s exhaustive research lets him play around a bit with the evidence. At times he can cherry-pick examples to support his point, while ignoring counter-examples. Thus, he offers isolated examples like Wethersfield, Connecticut to prove a New England culture of nuclear villages but ignores the several hundred plats presented by Joseph S. Wood (author of a cultural history of the New England village) to prove the opposite point. Fischer is not being deliberately dishonest – his evidence is copious throughout the book, and not solely based on anecdotes – but he’s not giving the full story, even in 900+ pages.

          Basically, American culture is a complicated story, and Fischer provides a good tool for starting to grapple with it, but he can be (and often is) oversimplified.

    • Zaxlebaxes says:

      Probably replying too late, but I did my B.A. in history and then finished a master’s in social science with a history concentration about a year ago. I wrote a comment at one point about how refreshing Scott’s take is because he actually reads and evaluates historical arguments in the way scientists read and evaluate findings. He seems to recognize that history is about actual facts, and that some facts may actually be interesting and have discernible implications that follow logically from them. This is as opposed to treating history as just a perpetually fraught hermeneutical exercise where you can never make a statement that’s more right than any other, and your objective is never to determine facts anymore, since we already know what happened.

  10. Nadja says:

    I need help thinking through a problem. There is an elderly 80+ lady whom I have casually known for decades, and whom I sometimes help with small tasks, such as letters or translations (she doesn’t speak English). She lives in a different state. She has worked physically all her life, the last 30 years as a cleaning lady in a small clinic. The job is all the social interaction she has. All of her old neighbors moved out, and new immigrants, speaking a different language, moved in. So, recently, her boss of 30+ years retired, and shortly thereafter she was laid off. I suspect it was that boss who kept her around for so long, because he knew how important the job was to her. She is now asking me to call her HR department and plead with them to take her back part time. The problem is that I’m a very introverted person who is terrible at talking to people or at any sort of persuasion. But I want to give it my best shot. I really want to help this lady. Any suggestions as to what would be the most effective way of talking to her HR department for me to even stand a chance of convincing them to help? Additionally, my understanding is that the lady and the HR head dislike each other. Thanks for reading =)

    • Eggoeggo says:

      That… almost sounds like effort would be better spent finding her new avenues for social interaction, especially if that was the most important part of the job for her.
      How is her income, do you think? Is she religious? Volunteering for church functions is an excellent way to get in touch with people, especially if you may need community assistance down the line.

      • Nadja says:

        Thank you for responding. I think she’s fine financially. She’s very frugal and will be receiving social security payments. Great idea about volunteering at church. I will ask her if she’s considered it.

        Still, I’d feel guilty if I at least didn’t give her HR department a call. I wish there were negotiators for hire that I could use… Are there? Anyone’s ever used one? (I’m going to be googling this.)

        • brad says:

          Lawyers are a kind of negotiator for hire, but only for certain types of negotiations. Same for sports and entertainment agents (who are sometimes, but not always, lawyers). I don’t think there’s any such thing as an at-large negotiator for all circumstances, walk-ins welcome.

    • Jason K. says:

      Calling HR is going way above and beyond, and probably won’t be effective (especially with the dislike between the HR head and this person).

      As she no longer has any connections to the area she is in now, my first suggestion to her would be to move to a location where she can make connections. Perhaps find some nearby ethnic enclaves for her?

      In fairness, she has been here 30 years and still doesn’t speak the language? That sounds like almost willful helplessness. Be careful. People who are willfully helpless can become massive energy drains. Sure, you will feel good for a while helping them, but it can easily create an unhealthy dependency-based relationship, and the demands for your help can easily spiral out of control. My second piece of advice for her would be directions to a local ESL class.

      • Nadja says:

        Thank you, Jason. It’s a good idea to look around for some ethnic enclaves or at least organizations (maybe a church) that she could use for social support.

    • vV_Vv says:

      Isn’t she a bit too old to work as a cleaning lady? Even if she can do it now, would she be able to do it in a couple years? If she slips, trips and breaks her hip on the job, wouldn’t her employer get into trouble with the insurance lawyers and/or government regulators?

      It looks like she’d better consider herself retired and look for some other forms of social interaction.

      • Nadja says:

        It wouldn’t surprise me if her employer were concerned about the very same thing. Thanks for your comment!

    • Nonnamous says:

      I think I agree with everything other people already said about calling HR being likely to fail and maybe even not such a good idea.

      But, assuming we still want to give it a try, here is what I would do: Pick a friend who you think has good people skills, explain the situation to them, and ask them to do this as a favor for you.

    • Airgap says:

      Find a more extroverted friend and convince them to plead the old lady’s case for you. To convince the friend, you can use a combination of emotional blackmail, bribery (e.g. offer sexual favors), and outright threats (it’s pretty easy to obtain a rifle in most of the US, and they’re much easier to aim than pistols). Good luck!

    • gmu frosh says:

      Does she understand she has the right to work below the minimum wage?
      The employer may be receptive to such acts of liberty.

    • SolipsisticUtilitarian says:

      If you do end up making the call, try the Door-In-The-Face technique: After explaining how important the job is to the old lady, ask them to take her back full time, and if they say no, ask for part-time.

      The anchoring effect combined with people’s aversion to saying ‘no’ two times in a row will work in your favor.

  11. Izaak Weiss says:

    My favorite part of the sequences has always been 37 ways words can be wrong. What are people’s thoughts about this?

    I ask because I’m taking a class next semester on semantics from the linguistics department at my school, and I’ve been wondering if this area of the sequences has been less criticized because it’s right, or whether it’s just been overlooked. (Or perhaps I’ve never seen the criticism but it is out there.)

    • suntzuanime says:

      It’s right. It’s also basically just Wittgenstein, but there are worse things to be. Note that Scott has written on a similar topic at https://slatestarcodex.com/2014/11/21/the-categories-were-made-for-man-not-man-for-the-categories/

      • Dan Peverley says:

        I’ve heard the “Less Wrong is Analytic Philosophy lite” thing a few times in different ways. Suggested entry point for Analytic Philosophy, Wittgenstein in particular?

        • Airgap says:

          I recommend reading Wittgenstein in the original German, as this will help prevent you from falsely assuming you understand what he means. If you can read German, try to find his works translated into, say, French.

        • John Buridan says:

          I would suggest Ray Monk’s biography of Wittgenstein if you want to follow Ludwig into Analytic Philosophy. The chiding comments about Wittgenstein being incomprehensible are cleared up very well by Ray Monk. That book was my gateway into AP.

          http://www.amazon.com/Ludwig-Wittgenstein-Genius-Ray-Monk/dp/0140159959

          Less Wrong is kind of like an AP lite, but Less Wrong has a cultural outlook which makes for some very good public discussion. A professional philosopher could often look over at Less Wrong and find quaint people puzzling over questions that have been discussed by ‘professionals’ for 50 years, but the same philosopher has no analytic-philosophy community equivalent to Less Wrong. I think Less Wrong peeps are an awesome experiment in philosophy without input from the Tower.

          For example, I think “37 ways” is an extremely misleading article and needs amendment. My Tower education allowed me to read a lot more on the topic than most people ever will, so I’m predisposed to find most explanations about words to be woefully incomplete, and thus misleading in some ‘crucial’ way.

        • Philosophisticat says:

          Wittgenstein is a kind of odd figure in Analytic philosophy – everyone agrees he was brilliant but nobody is really sure what he was saying. This is common in the continental tradition but rare in analytic philosophy, where tremendous value is placed on clarity and people don’t like to bother much with interpretation. So I think looking at Wittgenstein as your introduction to analytic philosophy would give you a misleading picture of the discipline. That said, I think his Philosophical Investigations is a work of great beauty as long as you recognize that it’s not representative.

          I think a better entry point into analytic philosophy would be to pick an area that interests you (ethics, epistemology, etc.) and read a textbook or compilation, or some plato.stanford.edu articles. Or, if you need some names, Bertrand Russell, David Lewis, Robert Nozick are all much clearer and more representative writers in the analytic tradition.

          • Earthly Knight says:

            Offhand, Peter Godfrey-Smith’s Theory and Reality and Mark Colyvan’s Introduction to the Philosophy of Mathematics are good textbooks on philosophy of science and math, respectively. Here is a decent introduction to formal epistemology and confirmation theory.

        • jaimeastorga2000 says:

          Less Wrong is not Analytic Philosophy lite; Analytic Philosophy is Less Wrong lite.

      • Peter says:

        Well, the later Wittgenstein, Rosch, Lakoff and successors, and picking certain bits from the later Wittgenstein rather than trying to swallow the whole thing whole (a difficult task – as Airgap says, no-one can agree on what it all means).

        An uncharitable person would say, “it’s the sort of thing you get if you think that ‘cognitive’ is a magical search term that means ‘the good stuff’ and then try some searches to find some good linguistics”.

        • Airgap says:

          Probably the best thing to do with Wittgenstein is to decide in advance what you want to argue, and then skim the Investigations for quotes that appear to support it. Whether your position has anything to do with Wittgenstein’s is immaterial. I understand Saul Kripke’s doing pretty well for himself.

          • Earthly Knight says:

            You have a distinctively Positivist view of citation practices. You should instead think of them in terms of citation games, or forms of citation life. I recommend that, whenever you find yourself saying something which you think is smart enough for Wittgenstein to have agreed with, you add a footnote as follows:

            1. Cf. Wittgenstein CR 107.3.

            It makes absolutely no difference if the acronym or the numbers represent anything, as it is hopelessly passé to understand language in terms of representation.

          • Airgap says:

            Not sure why the “smart enough for Wittgenstein to have agreed with” condition is necessary, but otherwise solid [1].

            [1] n.b. Feferman 1979, pp -7 to pi.

      • Pal says:

        Doesn’t this article basically resolve the sand-heap Sorites Paradox? How is this not the resolution? I.e., that categories are constructed rather than inherent, and therefore whatever we define as a heap or pile is what a heap or pile is.

        • Vox Imperatoris says:

          That, in itself, does not solve the problem.

          We don’t just arbitrarily choose to call some things a heap and other things not a heap. We call them heaps because of what they’re objectively like, independently of what we call them. Otherwise, calling them heaps would be meaningless because you could call anything a heap. Yet the term is not meaningless, since in order for there to be a heap of something, there has to be a good-sized amount.

          On the other hand, I don’t think it’s a terrible, insoluble problem, either. Ayn Rand’s answer (in Introduction to Objectivist Epistemology) to this kind of boundary-drawing problem was what she called the “objective theory of concepts” (in contradistinction to the “subjective” and “intrinsic” theories). Concepts are created by human beings, yes, but there are some instances where the classification according to a particular concept is mandatory (for the goal of achieving both “unit economy” and an accurate picture of reality), some where it is impermissible, and some where it is optional.

          There’s some point, e.g. one or two grains of sand, where the concept of a “heap” is completely inapplicable. There’s some other point, e.g. two tons of sand, where it would be absurd to call it anything other than a heap. And there’s some point where, depending on your purposes, you may want to call it a heap or you may not.

          It’s the same question with red and orange. There’s some point at which a shade is unquestionably red and not orange at all, some point where it’s orange and not red. But there’s some point in the middle where you can call it either orange or red, make a new concept to delineate that particular shade (e.g. “coquelicot“), or just describe it by a circumlocution like “reddish orange”.
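          To make that three-zone structure concrete, here is a minimal sketch in Python (the function name and numeric cutoffs are mine, invented purely for illustration; nothing in the argument hinges on where the cutoffs sit):

            # Three-zone classification for "heap": clear non-heaps, clear heaps,
            # and a borderline band where either usage is permissible.
            def classify_heap(grains: int) -> str:
                if grains <= 2:            # one or two grains: "heap" is inapplicable
                    return "not a heap"
                if grains >= 100_000:      # absurd to call this anything but a heap
                    return "heap"
                return "borderline"        # call it either way, or coin a finer concept

            for n in (1, 500, 2_000_000):
                print(n, "->", classify_heap(n))

          The point of the sketch is just that the middle branch exists and is genuinely optional; the outer branches are where the concept does objective work.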

          • Theo Jones says:

            We don’t just arbitrarily choose to call some things a heap and other things not a heap. We call them heaps because of what they’re objectively like, independently of what we call them. Otherwise, calling them heaps would be meaningless because you could call anything a heap. Yet the term is not meaningless, since in order for there to be a heap of something, there has to be a good-sized amount.

            I’m not sure of this. Concepts like “heap” exist to simplify expression, at the expense of precision. I.e., instead of saying “there is greater than 1kg of sand in the sandbox” you can say “there is a heap of sand in the sandbox”. There is no reason you would expect such a concept to have an objectively delineated and exactly precise definition. The definition of such a concept is therefore going to be imprecise and context-dependent.

          • Vox Imperatoris says:

            @ Theo Jones:

            As I tried to express in my post, I’m not saying that a “heap” has to have an “exactly precise definition”. Any theory of concepts on which it would need one has a serious problem.

            But neither is the term “heap” completely imprecise, so as to be meaningless! If there’s one grain of sand lying on the ground, it’s not a heap. No matter whether you say it is or not. You can redefine the term “heap” of course, but the point is that it does have a generally accepted meaning that applies to some things and not others. There are some things that are unquestionably heaps, some things that are unquestionably not heaps, and many things in the middle.

            That’s different from saying a heap is just whatever any particular individual chooses to call a heap. The term is able to communicate information because the things people call heaps have certain features in common.

          • FeepingCreature says:

            I think the answer is that “heap” brings an image or a shape to mind, a roughly pyramid-shaped … it’s hard to describe this without calling it a “heap”. Like a mound? A bunch of instances of the thing lying on top of each other.

            I don’t think it’s related to number at all.

          • Agronomous says:

            I don’t think it’s related to number at all.

            I agree:

            Fifty sandbags is enough to make a heap.

            Fifty sand grains is not.

      • Zippy says:

        I don’t think I like “37 Ways That Words Can Be Wrong” very much because I’ve been introduced to the ideas therein slowly and thus been inoculated.

        “The Categories Were Made For Man, Not Man For The Categories” is interesting to me, because I think it accidentally (seems to everyone as though it) argues against the beliefs that Scott actually holds, which were detailed on his tumblr recently (I link you to this, reader, under the implicit agreement that you will not make it go viral, because that may make Scott sad). This would not be overly surprising, people change opinions and whatnot, but Scott cites his “The Categories Were Made For Man, Not Man For The Categories” in that same tumblr post as supposedly the same position. Is this just me?

        And now, a slightly related quote from 37 Ways That Words can Be Wrong:

        You can claim, if you like, that you are defining the word “fish” to refer to salmon, guppies, sharks, dolphins, and trout, but not jellyfish or algae. You can claim, if you like, that this is merely a list, and there is no way a list can be “wrong”. Or you can stop playing nitwit games and admit that you made a mistake and that dolphins don’t belong on the fish list.

    • Outis says:

      I don’t think “humans are mortal” is supposed to be so by definition.

    • Airgap says:

      I prefer the little sci-fi stories.

    • Earthly Knight says:

      2. Your argument, if it worked, could coerce reality to go a different way by choosing a different word definition. Socrates is a human, and humans, by definition, are mortal. So if you defined humans to not be mortal, would Socrates live forever? (The Parable of Hemlock.)

      This is amusing. It doesn’t follow from the fact that Tom is a bachelor that Tom is a man, because, were we to define a bachelor as a one-eyed raven, we could then infer that Tom is a one-eyed raven, which Tom is not? Uh, okay.

      4. You know perfectly well that Bob is “human”, even though, on your definition, you can never call Bob “human” without first observing him to be mortal.

      Let’s suppose (falsely) that, necessarily, Bob is a human only if Bob is mortal. Why on earth would that generate an obligation to observe Bob being mortal (…how?) before calling Bob a human? Surely I can call Neptune a planet without violating any norms of assertion even if I’ve never observed Neptune doing anything planetary.

      5. The act of labeling something with a word, disguises a challengable inductive inference you are making. If the last 11 egg-shaped objects drawn have been blue, and the last 8 cubes drawn have been red, it is a matter of induction to say this rule will hold in the future. But if you call the blue eggs “bleggs” and the red cubes “rubes”, you may reach into the barrel, feel an egg shape, and think “Oh, a blegg.”

      Suppose that I see an animal at the zoo with gray skin, large, floppy ears, tusks, and a trunk. I use the conventional name for this creature, “elephant,” and infer from past experience with elephants that, if it is female, it in all likelihood gives birth to live young. Isn’t this how inductive reasoning is supposed to work? How is this different from the blegg story?

      5. You try to define a word using words, in turn defined with ever-more-abstract words, without being able to point to an example. “What is red?” “Red is a color.” “What’s a color?” “It’s a property of a thing?” “What’s a thing? What’s a property?” It never occurs to you to point to a stop sign and an apple.

      We had better not use any terms whose referents we can’t point to, then! Wait, scratch that, that’s a terrible idea.

      17. You argue over the meanings of a word, even after all sides understand perfectly well what the other sides are trying to say. The human ability to associate labels to concepts is a tool for communication. When people want to communicate, we’re hard to stop; if we have no common language, we’ll draw pictures in sand. When you each understand what is in the other’s mind, you are done. (The Argument From Common Usage.)

      There are, of course, substantive issues which hinge on the meanings of words. Suppose that we wish to know if a certain politician perjured himself– knowingly told a falsehood under oath– when he claimed never to have had sexual intercourse with a certain intern. To show perjury, we must first show falsehood, and whether his statement was a falsehood or not depends on what the words “sexual intercourse” actually mean.

      27. You wouldn’t feel the need to say, “Hinduism, by definition, is a religion!” because, well, of course Hinduism is a religion. It’s not just a religion “by definition”, it’s, like, an actual religion.

      Hinduism is just so obviously a religion, is it? Take two Hindus, say, a village Brahmin c. 2000 BC sacrificing a goat to Indra to ensure a good crop, and an illiterate, vegetarian Vaishnava from Bangalore whose faith consists in a brief prayer over a lotus flower every morning. Would they recognize one another as fellow-practitioners of the same religion?

      Between the repetition, the truisms, the errors, and the inconsistencies, that's about all I can stomach for tonight.

      • Anon. says:

        The first one is perfectly sensible; it’s making the same point as section 2 of Quine’s “Two Dogmas of Empiricism.”

      • tern says:

        It’s true there’s a lot of repetition on the list, and many of the points seem rather trivial. However, I would like to offer some responses, as your criticisms seem to miss the mark:

        2: The argument “Tom is a bachelor and therefore, by definition, a man” is in fact invalid – if the arguer’s actual information about Tom is only “Tom is a ‘bachelor’”, the fact that one definition of ‘bachelor’ is an unmarried man is not helpful. The information the arguer needs is what definition of ‘bachelor’ was used to classify Tom. If the arguer themselves classified Tom, what “Tom is a bachelor, and therefore a man” really means is “I am led to believe Tom is an unmarried man, the combination of which features I call ‘bachelor’,” but obviously that’s a bit of a mouthful.

        4: The objection is that the hypothetical arguer is equivocating between definitions. Clearly mortality isn’t that important for the definition of human if the arguer already knows Bob’s human without determining his mortality. If one knows that Neptune is a planet, the definition of planet being used does not require or necessarily imply qualities that one doesn’t know about Neptune.

        5: Of course the arguer can make that inference. But pretending that it’s not an inference is a way to be wrong. In other words, “It must be heavy, it’s an elephant!” is an easy way to make the speaker and listeners forget that the creature may not be an elephant, but instead an animatronic elephant-like robot made from balsa wood and helium.

        Granted, in most real-life cases, people recognize “X, as an elephant, …” and “X is an elephant, so…” as challengeable claims, but it’s more relevant when the classification uses one of several substantially different definitions, or one unnatural to the speaker or listeners.

        (other 5, I think you mean 6): You agree that the point here is not meant exclusively literally, right? I find it hard to imagine having a useful word that can’t possibly be given an ostensive definition, but perhaps one exists (maybe particles or articles?). I’d be intrigued to hear a counterexample.

        17: In that case you’re not arguing over what sexual intercourse means, you’re arguing about what the respondent understood by it – an empirical question, albeit extremely difficult to prove. In fact, once you understand what the respondent was trying to say, you’ve solved the issue (as a matter of law, perjury in the US requires the alleged perjurer not to believe the statement in question, and to affirm it’s true, among other things). In the more general class of things that depend on the meaning of a word, I believe most are actually about what the speaker/writer meant by the word. There are also arguments about what most people would understand by a word (e.g. marketing, PR, etc.), but those aren’t about what the word means, rather what the audience thinks it means.
        Arguments about what a term means to Bob or to the IEEE standards committee are explicitly deemed ok; Eliezer just thinks it’s wrong to argue that there is one true definition of a word everyone has to line up behind in all contexts. Not very profound, I grant you.

        27. We understand Christianity to be a religion as well, even if many practices among different sects are quite distinct, and commonly include sects which don’t regard each other as true Christians. Further, taking a practitioner from 4000 years ago as an example is, I think, rather in bad faith; the example is that Hinduism is “obviously a religion”, not that it has always been so, nor that it has always been the same religion unchanged. Finally, and most importantly, I don’t think your objection really strikes to the meat of the point – whether or not Hinduism is a single religion seems rather irrelevant to the question of whether the non-central fallacy is actually wrong.

        I acknowledge there are commenters here who don’t agree that the non-central fallacy is a fallacy, and consider it a valid argument, but you’re not making that argument, you’re just picking a fight over a non-critical example.

        • Earthly Knight says:

          The argument “Tom is a bachelor and therefore, by definition, a man” is in fact invalid – if the arguer’s actual information about Tom is only “Tom is a ‘bachelor’”, the fact that one definition of ‘bachelor’ is an unmarried man is not helpful. The information the arguer needs is what definition of ‘bachelor’ was used to classify Tom. If the arguer themselves classified Tom, what “Tom is a bachelor, and therefore a man” really means is “I am led to believe Tom is an unmarried man, the combination of which features I call ‘bachelor’,” but obviously that’s a bit of a mouthful.

          I have no idea what you’re trying to say here. If you’re saying the inference from Tom being a bachelor to Tom being a man is not logically valid, i.e. true in virtue of logical form, of course it isn’t. If you’re saying that there are other, marginal uses of the word “bachelor,” of course there are, but we’re not talking about those (if you like, substitute some word which has a unique meaning for “bachelor”). Yudkowsky seems to be claiming that, even if A’s being an F entails that A is a G, we are not allowed to infer from A’s being an F that it’s a G because, were we to define “F” differently, this would allow us to infer a falsehood. This is ridiculous.

          Clearly mortality isn’t that important for the definition of human if the arguer already knows Bob’s human without determining his mortality.

          This doesn’t follow at all (ignoring the poorly chosen example). Suppose that having property X is a sufficient condition for being an F, and that having property Y is a necessary condition for being an F. You might determine that A is an F by way of its having property X, and glean from this the important fact that A also has property Y. For instance, if someone tells you that you have an enclosed figure whose edge is at all points equidistant from its center, you can draw the useful and novel inference that its area is pi times the square of its radius. Or do you think that the relationship between the radius and area of a circle isn’t an important feature of circles?
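
          In symbols, the chain is just: X(a) ⇒ F(a) by sufficiency, F(a) ⇒ Y(a) by necessity, hence X(a) ⇒ Y(a).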

          None of this has anything to do with observation, of course, which is the problem in the original claim.

          But pretending that it’s not an inference is a way to be wrong.

          I agree that one should not believe that one is not drawing an inference when one is, in fact, drawing an inference. Note, though, that this is a truism and has nothing to do with misuses of language.

          I find it hard to imagine having a useful word that can’t possibly be given an ostensive definition, but perhaps one exists (maybe particles or articles?). I’d be intrigued to hear a counterexample.

          Good lord. “Hadron.” “Twelve.” “The Big Bang.” “Natural Selection.” This is restricting ourselves to nouns, once other parts of speech are brought in there’s not even a glimmer of hope that we might systematically define them by jabbing our fingers around. Put a little thought into this, please.

          In fact, once you understand what the respondent was trying to say, you’ve solved the issue

          This is wrong– perjury requires that the statement be actually false, which depends on the actual meanings of the terms involved, as noted. A man who believes he is testifying falsely but, in fact, is not does not commit perjury.

          whether or not Hinduism is a single religion seems rather irrelevant to the question of whether the non-central fallacy is actually wrong.

          Oh, sure, whatever. I was just pointing out that we can add Hinduism to the list of things the author doesn’t know anything about.

          • Anon. says:

            Yudkowsky seems to be claiming that, even if A’s being an F entails that A is a G, we are not allowed to infer from A’s being an F that it’s a G because, were we to define “F” differently, this would allow us to infer a falsehood. This is ridiculous.

            The point is that just because you defined all Fs as being Gs doesn’t mean that’s what is actually happening in reality.

          • Earthly Knight says:

            Here is what you seem to be saying:

            –If A is an F yet lacks property X, our definition of F-hood should not include property X.

            This is true, but not at all illuminating. It amounts to the requirement that definitions be accurate.

            As best I can reconstruct, Yudkowsky was trying to make a slightly different claim:

            –If A is an F and has property X, but possibly A could be an F while lacking property X, our definition of F-hood should not include property X.

            This is a more sophisticated truism– it amounts to requiring that definitions be true in all possible worlds. In any case, Yudkowsky apparently didn’t have the conceptual vocabulary to express himself properly, and wound up saying something ludicrous instead.

          • tern says:

            If you’re saying the inference from Tom being a bachelor to Tom being a man is not logically valid, i.e. true in virtue of logical form, of course it isn’t

            On a list of “ways words can be wrong” it seems appropriate to include “using a word this way is logically invalid.” I believe your second interpretation (from your most recent post) is close to my understanding of it, though emphasized differently.

            Suppose that having property X is a sufficient condition for being an F, and that having property Y is a necessary condition for being an F. You might determine that A is an F by way of its having property X, and glean from this the important fact that A also has property Y

            If X is a sufficient condition for being an F, observing X logically entails observing Y and, what’s more, knowing that X is sufficient and Y necessary requires us to already know X entails Y (so we also know we have observed Y).

            The original claim, quoted here: “You unconsciously slap the conventional label on something, without actually using the verbal definition you just gave,” is actually just “equivocating on definitions is bad.” If we introduce an atypical definition of human, clearly we ought to actually use that definition; if we don’t know if Bob fulfills the necessary conditions, we can’t (in this context) honestly claim Bob is an example of a human. This is sort of like point 5.

            “Hadron.” “Twelve.” “The Big Bang.” “Natural Selection.” This is restricting ourselves to nouns, once other parts of speech are brought in there’s not even a glimmer of hope that we might systematically define them by jabbing our fingers around.

            Is “jabbing our fingers around” intended for rhetorical effect, or meant to be taken literally? I agree many meaningful concepts exist that cannot be physically pointed at.

            I think we must understand different things by the phrase “ostensive definition.” With enough examples of groups of twelve varied objects labeled “twelve” and groups not numbering twelve labeled “not twelve,” we could arrive at an ostensive definition of twelve. We can, similarly, give actual examples of natural selection, both real and fictional. Adjectives, verbs, adverbs, and prepositions can be demonstrated.

            As a single event, we can do better than ostensive for “The Big Bang” and move up to a proper extensional definition. Hadron we can also define extensionally, barring major discoveries in physics.

            This is wrong– perjury requires that the statement be actually false, which depends on the actual meanings of the terms involved, as noted

            Behold:

            (1) Having taken an oath before a competent tribunal, officer, or person, in any case in which a law of the United States authorizes an oath to be administered, that he will testify, declare, depose, or certify truly, or that any written testimony, declaration, deposition, or certificate by him subscribed, is true, willfully and contrary to such oath states or subscribes any material matter which he does not believe to be true; or

            (2) in any declaration, certificate, verification, or statement under penalty of perjury as permitted under section 1746 of title 28, United States Code, willfully subscribes as true any material matter which he does not believe to be true

            Perhaps it is different in other countries. In any case, establishing the actual falsehood of the statement would involve proving that what the respondent meant was false, not recourse to an “actual” definition. I can see where someone might want an “intentionally mislead” clause, too, to include cases where the respondent uses bizarre meanings for their words, but then the argument would be not about an “actual” meaning but about what the questioner might reasonably be expected to infer from the answer given.

          • Earthly Knight says:

            On a list of “ways words can be wrong” it seems appropriate to include “using a word this way is logically invalid.”

            I suspect you may be in too far over your head for this to be a productive conversation. I have no idea what you think it would mean to use a word in a way that is logically invalid. Logical (in)validity is a feature of arguments or inferences, not words. I get that you feel compelled to defend Yudkowsky’s honor or whatever, but you’re not doing him any favors by piling a new set of confusions on top of his.

            If X is a sufficient condition for being an F [and Y a necessary condition], observing X logically entails observing Y

            Let me stop you there. This is obviously false. Suppose I observe of a polyhedron:

            X: It has exactly six faces, all of them square.

            I infer from this that the polyhedron is a cube, and, therefore:

            Y: Its volume is the cube of the length of any edge.

            My observing X certainly does not entail my observing Y.

            As a single event, we can do better than ostensive for “The Big Bang” and move up to a proper extensional definition. Hadron we can also define extensionally, barring major discoveries in physics.

            Earlier you said:

            “I find it hard to imagine having a useful word that can’t possibly be given an ostensive definition”

            So can you point to the big bang or a hadron, or can’t you? I’m not interested in listening to you hem and haw and introduce new terms you don’t understand. Just answer the question.

            Perhaps it is different in other countries.

            Actually, it is as I said it was in the US. Here is US v Gorman:

            “To support a conviction of perjury beyond a reasonable doubt, the government had the burden of proving that (1) the defendant, while under oath, testified falsely before the grand jury; (2) his testimony related to some material matter; and (3) he knew that testimony was false. 18 U.S.C. § 1623.”

            And here is the statute:

            “Whoever under oath […] in any proceeding before or ancillary to any court or grand jury of the United States knowingly makes any false material declaration or makes or uses any other information, including any book, paper, document, record, recording, or other material, knowing the same to contain any false material declaration, shall be fined under this title or imprisoned not more than five years, or both.”

          • MugaSofer says:

            @Earthly Knight:

            You appear to have misread your own quote:

            “Whoever under oath […] in any proceeding before or ancillary to any court or grand jury of the United States knowingly makes any false material declaration or makes or uses any other information, including any book, paper, document, record, recording, or other material, knowing the same to contain any false material declaration, shall be fined under this title or imprisoned not more than five years, or both.”

            This expressly states that you must know your claim is false for it to be perjury.

          • Earthly Knight says:

            This expressly states that you must know your claim is false for it to be perjury.

            Yes– I glossed it in my initial comment as “knowingly t[elling] a falsehood under oath.” tern disputes that falsity is required: he claims, apparently in error, that making a statement you believe to be false while under oath suffices for perjury.

          • tern says:

            Is that your invitation to commence the traditional sniping and hostile reading back-and-forth now? I can’t say I’m very interested in that, sorry.

            I have no idea what you think it would mean to use a word in a way that is logically invalid. Logical validity is a feature of arguments or inferences, not words

            I thought the meaning was fairly clear – the arguer is using the word to make a logically invalid inference, and not in the trivial sense of “there are words in that inference.” The invalid inference escapes notice because the word is implying facts not observed or justified.

            Perhaps you prefer the phrasing “using a word this way encourages logically invalid arguments.”

            My observing X certainly does not entail my observing Y.

            Yes, you could say that you haven’t directly observed Y in such a way, I suppose. But you will have observed a fact that necessarily implies Y, and when you observe X you know it necessarily implies Y. There’s no mystery as to whether Y is true. One could say you’ve observed the truth of Y, instead, if you like, or incontrovertible evidence of Y. I don’t know what significance you attach to this – are you taking issue with the use of the word “observe” rather than “verify with certainty”? If so, then this is again a criticism which doesn’t seem to engage with the concept that the item is trying to communicate (“don’t equivocate”). I mean, it’s fine to criticize the given examples, I just thought you were disagreeing with the substance of the point.

            can you point to the big bang or a hadron, or can’t you?

            One cannot physically point in a meaningful way at a hadron or the big bang, if that’s what you mean. Considering that I’ve repeatedly questioned the relevance of literal pointing and received no answer, I assume you either a) don’t mean that but don’t wish to explain what you mean or b) use ostensive in a different way from me that implies this question is relevant.

            I was taught to use the word “ostensive” to refer to definitions provided by example and counterexample only, whether the examples were accessible to direct observation or not. Looking up definitions available online, that may not be what you understood by the word.

            If we take “ostensive” to exclusively apply to definitions in terms of the kinds of things one can directly observe and/or physically point at (apparently fairly common usage), then I freely admit there are many things one cannot define ostensively.

            I further admit, after having gone back to re-read the linked article and try to figure out what exactly Yudkowsky meant, I find this particular item (6) in the list to be distressingly unclear in intended application, and only tangentially related to the article it links to. I’m not sure I agree with your interpretation, but I’m not confident in my initial reading of it, either.

            Here is US v Gorman… [a]nd here is the statute…

            I see. So depending on whether a grand jury or court is involved, we must sometimes prove (knowing) falsehood and sometimes not (though I’m not a lawyer, so there might be precedent establishing that requirement for 1621, too, I guess). If you look within the court case you cite, you’ll see that the determination of falsehood was not dependent on an “actual” definition, but rather based on what the defendant understood:

            When a word has more than one potential meaning, it must be examined in context to determine the meaning the defendant ascribed to it. United States v. Williams, 536 F.2d 1202, 1205 (7th Cir. 1976). As Jamarkus correctly points out, the fault for unclear, ambiguous, or vague answers rests with the questioner. Bronston, 409 U.S. at 360. But… precedent dictates that even when a question or answer is ambiguous, a conviction may still be upheld if a jury has been called upon “to determine that the question as the defendant understood it was falsely answered ….” United States v. Scop, 940 F.2d 1004, 1012 (7th Cir. 1991)

            In other words, once the jury understands what Jamarkus meant when he denied “having” the car, there’s no further argument about the definition of “have”, just whether he “had” it or not in the way he denied.

          • Airgap says:

            Is that your invitation to commence the traditional sniping and hostile reading back-and-forth now? I can’t say I’m very interested in that, sorry.

            What? He’s callin’ you out, bro. Don’t be a pussy.

          • HeelBearCub says:

            @Airgap:
            I think this trolling persona you are using is fairly annoying. It’s also an exploitation of the commons, in that if everyone did this, the comment section would be completely unreadable.

          • birdboy2000 says:

            For what it’s worth, I find Airgap entertaining.

          • Earthly Knight says:

            I thought the meaning was fairly clear – the arguer is using the word to make a logically invalid inference, and not in the trivial sense of “there are words in that inference.” The invalid inference escapes notice because the word is implying facts not observed or justified.

            Perhaps you prefer the phrasing “using a word this way encourages logically invalid arguments,”

            Stop using words you don’t understand. Yudkowsky’s claim, I take it, entails that you could not infer from Tom’s being a bachelor that Tom is a man, because, were you to define a bachelor as a one-eyed raven, you would then be able to infer that Tom is a one-eyed raven, which he is not. Do you endorse this principle, and agree that you cannot infer from Tom’s being a bachelor that Tom is a man?

            Yes, you could say that you haven’t directly observed Y in such a way, I suppose.

            So you’re on board that, even if property Y is a necessary condition for F-hood, you need not, in general, observe A exhibiting Y before dubbing it an F?

            One cannot physically point in a meaningful way at a hadron or the big bang, if that’s what you mean.

            So you now agree, contra Yudkowsky, that it is sometimes acceptable to “try to define a word […] without being able to point to an example”?

            In other words, once the jury understands what Jamarkus meant when he denied “having” the car, there’s no further argument about the definition of “have”, just whether he “had” it or not in the way he denied.

            No, the statement still has to be false on at least one of its actual meanings to be a candidate for perjury. Suppose Tom robbed a gas station but never a bank, mistakenly believes that a gas station is a type of bank, and testifies, under the misapprehension that he is lying, “I never robbed a bank”. Tom does not perjure himself under the statute I cited, because his statement is true regardless of whether he takes it to be.

          • Whatever Happened To Anonymous says:

            in that if everyone did this, the comment section would be completely unreadable.

            This sort of Kantian thinking would greatly impoverish the comments section.

          • Fullmeta_Rationalist says:

            Yudkowsky’s claim, I take it, entails that you could not infer from Tom’s being a bachelor that Tom is a man, because, were you to define a bachelor as a one-eyed raven, you would then be able to infer that Tom is a one-eyed raven, which he is not.

            This is not Yudkowsky’s claim. He’d certainly agree that we could infer that “Tom is a raven”, had someone defined “bachelor” to mean “raven”.

            The problem is when people debate definitions because they think the conclusion of their debate-over-definitions will causally exert an empirically-measurable impact on reality, rather than redraw their opponent’s model of reality.

            A better example EY uses is when atheists argue that religion is the root of all evil, and theists counter “but atheism is a religion too”. If the atheists take the bait, they’ll get sucked into a futile dispute over whether “atheism is as much a religion as traditional religions”.

            This is merely a definitional dispute. Whether we define Atheism as a religion distracts from the original empirical question of whether religions like Christianity or Hinduism beget {net disutility, excessive warfare, insert your own metric}. The original question is potentially useful insofar as it will help us decide whether to encourage or discourage theism in posterity.

            It’s kinda like the Is-Ought Divide. But between inductive and deductive.

          • tern says:

            Do you endorse this principle, and agree that you cannot infer from Tom’s being a bachelor that Tom is a man?

            I don’t think that’s the claim, but probably not. I’m not sure, in this instance, if by “Tom is a bachelor” you mean “Let Tom be a bachelor, whatever that is,” “Tom is an unmarried man,” or something else. If it’s the second, then definitely not, that’s a reasonable inference.

            The claim, I believe, is that the label one uses for a concept (e.g. the literal string “bachelor”) should not be confused with the actual properties of the concept. Tom is not a man because he’s labeled a bachelor, but because of his relevant characteristics. When they do get confused, you get trouble as in FMR’s example.

            So you’re on board that, even if property Y is a necessary condition for F-hood, you need not, in general, observe A exhibiting Y before dubbing it an F?

            If Y is a necessary condition for F-hood, your certainty of the truth of F-hood can only ever be as good as your certainty of Y, and not better. So whatever degree of certainty you need before you officially dub A an F, you need it for Y – if that requires direct observation for some reason, then you’d need to directly observe it, but I don’t see why that would be the case in general.
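
            (Put probabilistically: F-hood entails Y, so P(A is an F) ≤ P(A has Y); no evidence can make you more confident of the former than of the latter.)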

            So you now agree, contra Yudkowsky, that it is sometimes acceptable to “try to define a word […] without being able to point to an example”?

            Without being able to literally point a finger at it, yes. As I said, I have no earthly idea whether Yudkowsky meant the statement to specify literal finger-pointing, though his example does use it.

            No, the statement still has to be false on at least one of its actual meanings to be a candidate for perjury.

            Of course it does, but from my quote of the case you cited, it seems that one of the “actual” meanings the jury considers is what they judge the defendant to have meant. Quoting more concisely,

            a conviction may still be upheld if a jury has been called upon “to determine that the question as the defendant understood it was falsely answered ….” United States v. Scop, 940 F.2d 1004, 1012 (7th Cir. 1991)

            Gorman earlier cited Bronston, which held that “literally true” but misleading statements are not grounds for perjury. But here we see in the Gorman decision the court says that it’s not enough for a “literally true” interpretation to exist; if the jury decides the defendant was not operating under that interpretation, it’s still perjury.

            In your example, Tom understands the question to mean “did you rob a bank or gas station” – if the jury believes to the required standard of evidence that that’s what he understood, and that he robbed a gas station, it’s sufficient to make his testimony knowingly false, according to that passage.

          • Earthly Knight says:

            If it’s the second, then definitely not, that’s a reasonable inference.

            It’s clear from what Yudkowsky says that he is objecting to the following argument schema:

            –A is an F, Fs are, by definition, Gs, therefore, A is a G.

            He objects to this schema because, were F to be defined as ~G, it would yield a falsehood. But “Tom is a bachelor, bachelors are, by definition, men, therefore, Tom is a man” is an instance of this schema. If you accept that we can infer from Tom’s being a bachelor that Tom is a man, you reject Yudkowsky’s claim.
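
            (Formally, the schema is just universal instantiation plus modus ponens: from F(a) and ∀x(F(x) ⇒ G(x)), infer G(a).)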

            if that requires direct observation for some reason, then you’d need to directly observe it, but I don’t see why that would be the case in general.

            Good. So if you believed that mortality were a necessary condition for manhood, that would not give rise to any obligation to observe someone being mortal before judging him to be a man?

            Me: No, the statement still has to be false on at least one of its actual meanings to be a candidate for perjury.

            tern: Of course it does,

            So we are agreed that a witness perjures himself only if his statement is actually false, and whether it is actually false depends in part on the meaning of the words involved, and not exclusively on the witness’s mental states?

            In your example, Tom understands the question to mean “did you rob a bank or gas station” – if the jury believes to the required standard of evidence that that’s what he understood, and that he robbed a gas station, it’s sufficient to make his testimony knowingly false, according to that passage.

            I do not see where you are getting this from. The opinion is saying that if a word has (say) six candidate meanings, and on some of those meanings the alleged perjurer’s statement is true and on others it is false, the jury may attempt to reconstruct from context which one of the senses the alleged perjurer intended. But there is no sense of the word “bank” on which it includes gas stations; consequently, a man who has only ever robbed a gas station and testifies that he has never robbed a bank does not perjure himself, no matter what he thinks the word “bank” means.

          • tern says:

            It’s clear from what Yudkowsky says that he is objecting to the following argument schema:

            –A is an F, Fs are, by definition, Gs, therefore, A is a G.

            He objects to this schema because, were F to be defined as ~G, it would yield a falsehood.

            Again, I don’t agree that’s precisely the claim. It doesn’t follow from the fact that the label “F” is applied to A that A is a G, it follows from the relevant properties of A (here, G-ness). The detour through “is an F, and Fs are by definition G” serves only as a place where someone could equivocate. Instead, they ought to just say “these qualities of F’s indicate G-ness.” In other words, if Tom is sitting next to you and a stranger asks “Are you sure that thing beside you is a man and not a raven?”, it would be unhelpful to say “he’s a bachelor, and therefore by definition a man,” instead of just “I asked him and he said he’s a man,” “I have sworn testimony from four zookeepers he’s no raven,” or simply “yes”.

            So if you believed that mortality were a necessary condition for manhood, that would not give rise to any obligation to observe someone being mortal before judging him to be a man?

            Such an obligation would exist under the colloquial definition of observe, which many people use to mean “to come to realize or know,” “to notice,” “to be or become aware of,” etc.

            In a stricter sense like “perceive by an unaided physical sense,” no. It’s obvious you took it in this sense, and what Yudkowsky meant is, again, beyond me, but even if he did mean it in the informal sense above, there are other, better words that would not risk such confusion.

            So we are agreed that a witness perjures himself only if his statement is actually false, and whether it is actually false depends in part on the meaning of the words involved, and not exclusively on the witness’s mental states?

            No, because even under section 1623 the defendant’s mental state at the time of the alleged offense counts as a relevant meaning of the word. Really, the relevant meaning, because a statement the speaker believes to be true can’t be perjury, and (as I contend below) a statement that the speaker believes to be false, provided he’s not mistaken about the facts of the situation, is perjury. In my interpretation, once they’ve proven what the speaker meant, there’s no more question about what the word means.

            I do not see where you are getting this from. The opinion is saying that if a word has (say) six candidate meanings, and on some of those meanings the alleged perjurer’s statement is true and on others it is false, the jury may attempt to reconstruct from context which one of the senses the alleged perjurer intended.

            My understanding is this: if there is inherent ambiguity, or someone claims there to be ambiguity (and gives enough evidence to support it), the defendant can be found guilty if all meanings provided are false – crucially, one of those meanings can be Tom’s, if the court believes it. If some of those meanings are true, the jury can attempt to conclude what Tom meant, and if that’s false they can convict. Obviously the second step is superfluous if they’ve already been convinced of Tom’s strange definition.

            Consider the case where Tom’s quirk is well known – literally everyone in the court has always known that Tom thinks gas stations are banks, and thinks that the phrase “gas station” refers to fuel depots for spacecraft, and doesn’t understand that other people don’t use his definitions. But still, no one else uses the words that way, and the definition doesn’t appear in dictionaries, urban or otherwise. Do you really think Tom gets to answer “no” without it being considered a false statement under 1623?

          • Earthly Knight says:

            Again, I don’t agree that’s precisely the claim. It doesn’t follow from the fact that the label “F” is applied to A that A is a G, it follows from the relevant properties of A (here, G-ness).

            Suppose you are in a bar, you overhear a stranger mentioning that he is going to Jaime’s bachelor party tomorrow, and infer from the stranger’s utterance in conjunction with the definition of bachelorhood that Jaime is a man. Don’t you thereby learn a new and important piece of information about Jaime, solely based on facts about meaning?

            crucially, one of those meanings can be Tom’s, if the court believes it.

            You seem to be reading your wacky semantic views into the opinion. Here’s the complete passage from US v. Scop:

            “An answer relating to a term susceptible of various meanings can be found perjurious so long as the jury can adequately assess from the testimony the defendant’s understanding of the term and thus the falsity of his or her statements with respect to that term.

            The possibility that a question or an answer may have a number of interpretations does not invalidate either an indictment or a conviction which requires the jury to determine that the question as the defendant understood it was falsely answered in order to convict.”

            When the court talks about “meanings” they’re talking about actual meanings of the sort you might find in a dictionary, not idiolectic meanings that only exist inside the defendant’s head. This passage wouldn’t make any sense otherwise: all terms have various meanings and all questions and answers have a number of interpretations if “meaning” and “interpretation” are understood psychologistically.

        • Airgap says:

          @HeelBearCub

          If everyone else did it, I wouldn’t have to.

    • FrogOfWar says:

      As a heads up, unless your linguistics department works differently from what I’m used to in U.S. universities, a semantics class is unlikely to address any of the topics you link to.

      What semantics does is construct models that are meant to explain certain facts about meaning, such as entailment relations and the productivity of language (the fact that language users are capable of understanding indefinitely many sentences despite having finite minds). In the model theoretic tradition, this requires doing three things:

      1) Specifying tree structures for sentences illustrating syntactic relations.
      2) Assigning each word a meaning, which is generally a function–written in lambda calculus notation–that takes the meanings of other expressions as arguments.
      3) Specifying a set of rules for combining the meanings of various expressions given their context within the tree structure.

      Together these things allow you to calculate the “meaning” of the whole sentence, which will be a truth value in extensional cases or a set of possible worlds in intensional ones (rather, the characteristic function of that set). And by “calculate”, I mean calculate; a computer can do it. It has the feel of a logic or math class more than anything else.
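
      To make the “a computer can do it” point concrete, here is a minimal sketch of such a calculation in Python (a toy two-word lexicon and a single function-application rule of my own devising, not any standard linguistics package):

          SMOKERS = {"socrates"}

          lexicon = {
              "socrates": "socrates",            # type e: denotes an entity
              "smokes": lambda x: x in SMOKERS,  # type <e,t>: entity -> truth value
          }

          def interpret(node):
              # A leaf denotes its lexical entry; a branching node is interpreted
              # by functional application: whichever daughter is a function is
              # applied to the other daughter.
              if isinstance(node, str):
                  return lexicon[node]
              left, right = map(interpret, node)
              return right(left) if callable(right) else left(right)

          print(interpret(("socrates", "smokes")))  # True

      Real fragments add quantifiers, intensionality, and many more composition rules, but the mechanical character of the calculation is the same.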

      The worries on the LW page are either stipulated not to be present for simplicity, shoved off toward the discipline of pragmatics, or unaddressed.

      • Peter says:

        I have some background in computational linguistics, so my knowledge of “real” linguistics classes is limited, but:

        The model-theoretic thing is certainly a thing, and I’ve worked with projects involving large collections of computer code to do exactly that. There’s a distinction between lexical semantics (step 2) and compositional semantics (step 3), and different ambiguities involved in each step. One example sentence with a compositional ambiguity: “every smoker is the nephew of a dragon” – do you mean that every smoker is a nephew of the same dragon, or that each smoker has his own draconic uncle? In symbolic notation, the first reading is ∃x(dragon(x) ∧ ∀y(smoker(y) ⇒ nephew(x,y))) and the second is ∀y(smoker(y) ⇒ ∃x(dragon(x) ∧ nephew(x,y))). The point is that you can do all of this and all sorts of exciting logic without, for example, knowing that “nephew” means “sibling’s (or sibling-in-law’s) son”.
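
        If it helps, the two scopings can be checked mechanically against a toy model (the sets and pairs below are my own invention, chosen so the readings come apart):

            smokers = {"alice", "bob"}
            dragons = {"smaug", "fafnir"}
            uncle_of = {("smaug", "alice"), ("fafnir", "bob")}  # (dragon, nephew) pairs

            # Wide-scope existential: a single dragon is uncle to every smoker.
            reading1 = any(all((d, s) in uncle_of for s in smokers) for d in dragons)

            # Narrow-scope existential: each smoker has some draconic uncle.
            reading2 = all(any((d, s) in uncle_of for d in dragons) for s in smokers)

            print(reading1, reading2)  # False True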

        The lexical semantics in step 2 often seem to boil down to selecting the correct sense from a smallish inventory of word senses, turning “bank” into “bank_financial” and “bank_river” – there’s no consideration of the niceties of which sorts of financial institutions count as banks (or which institutions are financial enough to count).
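
        A toy version of that sense-selection step (the senses and clue words here are made up; real inventories and disambiguators are far messier):

            inventory = {
                "bank": ["bank_financial", "bank_river"],
                "lamb": ["lamb_animal", "lamb_food"],
            }
            clues = {
                "bank_financial": {"money", "loan", "deposit"},
                "bank_river": {"river", "water", "fishing"},
                "lamb_animal": {"field", "flock", "bleat"},
                "lamb_food": {"roast", "dinner", "mint"},
            }

            def disambiguate(word, context_words):
                # Crude selection: pick the sense whose clue words overlap the context most.
                return max(inventory[word], key=lambda s: len(clues[s] & context_words))

            print(disambiguate("bank", {"river", "fishing"}))  # bank_river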

        The thing that mucks things up is that I and various others have become skeptical of neat little word sense inventories – they tend to be either so small and coarse-grained as to be missing important senses or so large and fine-grained as to make it too hard to tell which sense a word is in. A lot of word sense ambiguity is due to regular polysemy (example: animal-food polysemy, i.e. “fish” and “chicken” and “lamb” and many other words can refer to live animals or meat derived from those animals); there’s a lot of confusion over where things like metaphor stop and word senses start, and so on.

        My overall impression was of people building a complicated brittle thing which was kind of amazing in its own way, but which seemed to be missing some key insight.

    • Peter says:

      Being the sort of nerd who buys textbooks and other dry academic books for fun: I’d recommend Cognitive Linguistics (Croft and Cruse, 2004) for a not-too-out-of-date summary of some of the relevant bits of linguistics, and Women, Fire, and Dangerous Things (George Lakoff, 1987) is a big seminal work with lots of important ideas. With the latter… Lakoff’s linguistics IMO are better than his philosophy (or for that matter his political commentary, but that’s not relevant), and apparently there have been some discoveries about the Australian aboriginal language he uses as his big example (Dyirbal) that kind of undermine some of his case, but still it’s an important work and gets cited loads.

      So if you want something reputable (and more clearly written than Wittgenstein), then have a look at those. I read the LW linguistics stuff a bit after I’d read lots from those sources, and lots of the LW stuff checks out, at least as far as an amateur like me can say. That said… linguists are a notoriously fractious bunch and I’m sure you can find people in different camps who would complain as much about the texts I’ve recommended as about LW.

  12. Eggoeggo says:

    Posted this (very late) last time, but sadly didn’t get a response. Hopefully second time’s a charm:

    I have a quick research request inspired by a conversation on the subreddit and browsing some trans-rationalist tumblr blogs.
    Can anyone link a male to female transsexual who posts/blogs/talks about masculinity and masculine men in a positive way? Specifically indicating attraction to them. One would do, but more would of course be better; you can never have too many hot-guy photoblogs in your bookmarks.

    I have a hypothesis, and this seems like an easy falsification test. The only problem is that I’ve got none of the contacts or cultural knowledge I’d need to find the info myself.
    Someone else on the reddit said they were also interested in this for other reasons, so it sounds like it’s at least a vaguely useful line of inquiry.

    • Anonymous says:

      What hypothesis? You should have said the first time.

      • Eggoeggo says:

        Well, you have to test your sample against the population before you can do any responsible theorizing.

        Right now I’m sitting here with a big pile of red and green M&Ms. Before I go analysing them, I should at least check if there’s actually such a thing as blue M&Ms, and that I didn’t accidentally pick up a special edition “christmas colours” bag with an unrepresentative sample.

    • vV_Vv says:

      There is Blaire White, who is a male-to-female transgender MRA/anti-SJW. I don’t know if she speaks about being attracted to masculinity (I just watched a couple of her videos), but at least she seems sympathetic towards it.

      I suppose she is not a redpiller though, if that’s what you were looking for.

      • I’m just a simple country chicken, but isn’t there more than a little daylight between “posts/blogs/talks about masculinity and masculine men in a positive way, including considering them worthy of attraction” and “redpiller”?

        Seriously? Isn’t there? I honestly don’t know who’s defining terms for whom any more.

        • Eggoeggo says:

          Everyone’s talking about “men’s issues” and “redpills”, and I was just hoping for a link like “my-sexy-firemen.tumblr.com”, or maybe info on a transsexual bara artist.

          Considering men attractive can’t be considered some radical fringe position… can it?

          • I’ve seen online discussion of the extent to which heterosexual men *never* get told they’re attractive– I think it’s part of the system which requires them to make the first move.

          • Eggoeggo says:

            Right, but… there’s porn of men out there. Gay dudes can’t be the only ones reblogging and talking about it, can we?
            I know a bunch of mtf transsexuals who make and reblog porn of women, and one who writes (really hard) erotic fiction about women.
            Surely there must be some “straight” ones who make porn of men?

            I just figured people were going to post like twenty ero-tumblrs run by mtf transsexuals, and I’d have to reevaluate my biased sample and have a heartwarming revelation, etc. etc.

          • Airgap says:

            There are much easier ways to find porn on the internet than asking for help on SSC.

          • Anonymous says:

            Wouldn’t you expect transgender women to exhibit female sexuality? So read & write sexually explicit fanfic rather than drool over crotch shots?

          • Deiseach says:

            Considering men attractive can’t be considered some radical fringe position… can it?

            I shouldn’t have thought so but then again, seeing the amount of grimacing about “cis hets”, who knows?

            And then yet again, attraction is a very individual and subjective thing. The guys I find attractive might do nothing for you, and vice versa. JUST REALISED NECESSARY DISCLAIMER: Sorry, I’m not trans or (apparently) sufficiently queer (being aro/ace doesn’t count, it would seem) to be able to help you in your quest, I’m one of the horrid cis hets with all that unexamined and unearned privilege 🙂

          • Leit says:

            You’d be surprised.

            I hate to use chans as an example, but there’s a persistent meme in places like /fit/ that gainz (+fit) are an escalator toward non-heterosexuality, and it’s largely because while the other guys in there might appreciate your body, the ladies seem to appreciate it only from the distance of a TV screen or magazine.

          • Airgap says:

            Candidate theories:

            1. It’s possible to spend so much of your life on the internet that you avoid discovering that women like physically fit men.

            2. *chan boards produce a signal through your monitor which causes brain damage, like in Videodrome.

            3. /fit/ posters are mostly closet homosexuals to begin with.

            Obviously the next step is to seek an NSF grant to determine which (if any) are false. I welcome help from anyone familiar with the application process.

          • Leit says:

            /fit/ posters are largely non-closeted homosexuals to start with, or at least pretending very convincingly for the sake of humour.

          • Airgap says:

            @Leit

            4. Gay men have no idea what women find attractive because they don’t care.

            @Deiseach

            Remember that the unexamined privilege isn’t worth having.

      • Eggoeggo says:

        OMG, Blaire sounds awesome. Mostly seems to discuss serious issues/memes, but I’ll see if there’s any relationship talk in the longer ones. Thanks!

    • Siah Sargus says:

      Sorry for not finding anyone… I got distracted.

    • Zorgon says:

      Wow. I can actually hear the hackles rising from across the Interwebs.

      • Eggoeggo says:

        It’s too late to edit the original post, but could you explain a bit? It sounds like I poked a sore spot? Someone told me to look up Alice Dreger last night, and hoooo boy…

      • Eggoeggo says:

        Aaaaand I found out why
        http://physicshead.blogspot.com/2016/02/alice-dreger-bailey-case.html

        James Mead/Andrea James/Jokestress evidently has a penchant for attacking young children, calling a nine year old girl a “cock-starved exhibitionist” and a five year old boy a “precious womb turd.”

        I’m not touching this subject with a ten foot pole any longer. My curiosity and tolerance are both fresh out.

    • xzd7rys9ao says:

      Myra Breckinridge is a novel by Gore Vidal which is essentially all this, if you can get through all the satire and megalomania. It quickly became a go-to book for queer theorists back when it was published.

      • Airgap says:

        Except that Gore Vidal is a gay man, not an MTF transsexual, and everyone already agrees that they’re into masculinity.

        As an aside: I didn’t anticipate the plot twist in Myra Breckinridge when I read it because I was so distracted by how much Myra reminded me of an ex-girlfriend of mine. The ex claimed to have been pregnant before, but on reflection, I can’t be sure this is true.

      • Peter says:

        The complaints I’d heard from trans circles are that the character of Myra Breckinridge is… a cis gay man’s idea of something, essentially a device for exploring ideas of gender and sexuality, but not in any way an attempt to explore what trans-as-it-occurs-in-the-real-world is like.

        It does not surprise me at all to hear that queer theorists were all over it; literary theory has a tenuous relationship to the real world at the best of times; at its best, it’s not so much about what the world is like as what writers and readers think it’s like when writing and reading. (Also I have a low opinion of literary theory in general and queer theory in particular, so it doesn’t surprise me to hear of a whole bunch of them having been dead keen on a problematic text. There’s a certain… not quite Schadenfreude, but similar, to it.)

    • multiheaded says:

      I don’t talk about *attraction* to men, but I do sometimes attempt to talk about men’s issues from a sympathetic and personal perspective on my tumblr blog. (I have a tag for it, a lot in it is just reblogs from others tho.)

    • rttf says:

      You should try asking on 4chan’s /lgbt/ board. It seems like the perfect place for your question.

    • Vox Imperatoris says:

      I don’t know any specific blogs or anything, but trans women are at least as often straight (i.e. attracted to men) as gay (i.e. attracted to women). The belief, in fact, for a very long time was that the only “real” trans women were attracted to men, and that those attracted to women were suffering from “autogynephilia”, or an attraction to a sexual fantasy of themselves as a woman, and not truly transgender at all (and therefore, shouldn’t be given hormones or sex reassignment surgery). As a result, this is a sensitive subject.

      You could read the memoirs and interviews of famous transgender women. For instance, material about Christine Jorgensen, the first American to have sex reassignment surgery.

    • Adam says:

      I just attempted to post something and got filtered, but if you look for porn instead of activist bloggers, I think you’ll find what you’re looking for.

    • arbitrary_greay says:

      I can’t give any specific suggestions, but I think you’d have better chances by searching fandoms in which men are the objects of desire (Supernatural, MCU, One Direction, Free!, etc.), and then looking for the MtFs in those fandoms. Browse through the social media of the more prolific fanfiction authors, since they tend to spill a little more about their personal lives, or have a good number of fans in their Tumblr asks/replies saying “I’m [of this demographic] and was inspired by your fic!” and such.

      Alternatively, find trans geek spaces, and then narrow down within those spaces to fandoms in which men are the objects of desire.

  13. ton says:

    When I was a kid, I read a lot more Serious Books for Curiosity and Fun than I do now (as in educational; books that Scott has reviewed are a good proxy for what I mean). (I’m 20 now.)

    I notice that Serious People I respect (Scott, Eliezer, people I know personally) tend to read lots of such Serious Books, which makes me feel bad for not reading as many (also, the reviews make them sound fun, I guess). It’s not like I don’t have the time; it just gets spent on the internet or other reading. I’ll go hunt down a rabbit hole of legal sources or whatnot, and it’s just as intellectually satisfying, but looking back it doesn’t feel the same as having read long educational books.

    Is there some simple technique I can use that will help me read more, or optimize better for what I really want (which I’m not actually sure of)?

    Apologies if this isn’t terribly clear, I’m hoping someone recognizes my description in themselves or an earlier version and can help without further explanation. I don’t particularly understand what’s behind my dilemma beyond my description, except that there’s definitely a there there.

    • Bassicallyboss says:

      I’ve had problems with this, and it seems like standard akrasia/time-discounting/intellectual-preference-revealed-preference-difference. The easiest way is just to do it. Get the book, read the book. It’s not hard to do, but it’s easy not to, especially if you spend a lot of time, e.g., reading online, or generally in flow states that don’t have room for reflection. Anyway, here’s my suggestion for how to do it more easily:

      Go to the library often. Take a reading list of Serious Books you might like to read. While you’re there, you might as well get something–why not something from your list? And since you have to return the book when it’s due, you’ll feel pressure to finish it by then. Don’t be afraid to quit without finishing if you don’t like a particular book; sure, it feels like you’re giving up on your goal, but the whole reason you have that goal is because you enjoy reading Serious Books.

    • Charley says:

      This is probably stating the obvious, but putting physical distance between me and my distractions is very helpful. The best version of this is going to a cafe without my phone or laptop, but even just putting my internet-capable devices in another room partially removes them as an alternative that my brain might decide is preferable to reading.

      I also become better at doing the things I’m supposed to do for the remainder of the day after I exercise. (I’m aware I’m doubling down on trite advice.)

    • Airgap says:

      You think Eliezer is a serious person? Have you met him? I’m betting not.

      • ton says:

        Serious as in intellectually serious, which I notice by the number of references contained in their writing.

    • Anon. says:

      Put aside a specific amount of time every day for reading. Not the time right before you sleep. Get away from electronic devices. Drop boring books instead of trying to finish them. Always have the next book ready.

    • Depending on your budget (and/or technical know-how and disregard for intellectual property laws), I recommend getting a Kindle. When you are carrying around a library of Serious Books in your pocket at all times, it’s really easy to sneak in 15 minutes of reading here and there when you’re in line. In addition, the activation energy to move onto the next Serious Book is very low; you can pre-load dozens, or easily purchase a new book just from the Kindle interface, and be reading it in minutes.

      I’d personally recommend a Kindle Paperwhite; the first-gen Kindles are flimsy, and the Paperwhites are fairly cheap now.

      (This reviewer was provided with nothing for his fair and unbiased review.)

      • Ton says:

        This was part of the problem. I got a kindle and now have more books on it than I can ever finish, so each book feels smaller.

        • Simon says:

          One thing I found that helped to concentrate the mind, rather like Johnson’s prospect of being hanged, was to roughly calculate time per book against estimated days of life remaining. Once I had an approximate upper limit number, time spent on frivolous reads felt more wasteful.

          Thus if you manage one a week for 60 remaining years, your upper boundary is only 3120 books. Of course, due to reading speed and free time, your mileage may vary.
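
          The same back-of-envelope sum, for anyone who wants to plug in their own numbers:

              books_per_week = 1
              years_left = 60
              print(books_per_week * 52 * years_left)  # 3120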

          Hopeful transhumanists can disregard this method.

  14. HeelBearCub says:

    I get the sense that SSC has attracted a fairly substantial cohort of people who think that college is useless, merely a “subsidized tulip”.

    My question is, how deep does this go? How many people wouldn’t advocate teaching children anything at all? (I assume no one?) How many would teach their children to read and nothing more? (I assume very few, if any?)

    So, at what point does education cross-over from useful to useless?

    • CatCube says:

      It’s not that education is bad on its own. It’s that you have to mortgage your life for it. There’s nothing wrong with teaching kids, but the current system of making them get a credential that has little to do with the job they’re likely to have and going massively into debt for it is a problem.

      So, law, medicine, engineering, etc., aren’t a problem. Those programs are valuable for teaching you to do the job you intend. Getting a degree in liberal arts so you can work in HR is actively destroying value.

      • HeelBearCub says:

        Perhaps I need to also loop in the many recitations of the idea that IQ is largely determinative of outcomes and not affected by education to give a fuller picture of what I mean.

        Can I get you to expand on what you mean by “destroying value”?

        • CatCube says:

          Spending a lot of money for something that doesn’t help you do the actual job, and is only a piece of paper to make the person screening your resume put you into the “good” pile.

          I loved going to school, and could see myself doing it for the rest of my life. I got my degree in engineering, joined the Army as an officer, eventually got into a position to use my degree, found that I love it as much as when I was taking classes and decided to get out to keep doing it. College was *great* for me, especially because I have no debt. However, there are a lot of people, probably most people, that find being stuck in a classroom while someone yammers at them to be soul-sucking. Requiring those people to get a degree just to get a degree is what I mean by “destroying value”. They’re taking on a huge pile of debt to learn something they don’t much care for to get a job that doesn’t really use the information they suffered through classwork to get.

          • HeelBearCub says:

            This seems related to the Fundamental Attribution Error. Most people who go to college seem to really like college. You liked college because you found it intellectually stimulating and it led you to a satisfying career. But other people must have only liked college because of parties or something; they must have found the classes soul-sucking and useless.

            This seems very flawed to me.

          • Wrong Species says:

            I don’t understand why there isn’t a way to have the “college experience” without college. Shouldn’t it be simple to get a bunch of 18-24 year olds together in one place and have them be involved in clubs while having parties on the weekends?

          • HeelBearCub says:

            @Wrong Species:
            Flexible schedule, small geographic footprint, created tribal affiliation, young. Hard to get all of that outside of college. Even the military only has two of them, or maybe three (but the small geographic footprint tends not to last).

          • John Schilling says:

            Shouldn’t it be simple to get a bunch of 18-24 year olds together in one place and have them be involved in clubs while having parties on the weekends?

            One critical ingredient of the “college experience” is that one’s living expenses, and a whole lot of party-related infrastructure and organization, are paid for by parents, the government, or subsidized and guaranteed loans. In exchange, one has to not actually flunk classes, but as the product that colleges and universities are selling is “students who didn’t flunk”, that’s not a terribly high bar.

            If you have to work a job where your actual economic productivity covers the costs of living in a college-equivalent environment, with the experience and discipline of a self-taught 20-year-old, you may not have much time left over for partying.

          • CatCube says:

            *shrug* Could be. I guess my question is: do they like college for the classwork/learning opportunities, or the social life? As David Friedman has pointed out before, his daughter went to school with a bunch of people who were happy when class got cancelled. Depending on the class, of course, I felt the same way getting my bachelor’s. After being in the Army for 7 years, when I was working on my master’s I felt differently. When you think about it, it is pretty messed up to be happy that a class you’re paying thousands of dollars to attend has been cancelled.

            I’ve had more than a few friends who were at college with no major, or general studies, because they didn’t know what they wanted to do. At the time, I was like “Hey, at least you’re in college.” Now, I realize that’s pretty screwed up. The default should be you don’t go to college if you don’t know why you’re there, but we’re in this prisoner’s dilemma where middle-class people feel like they have to attend even with no clue what they want to do, while they ring up debt that gets skimmed off by academia (whether faculty or administration) to build their own empires.

          • Error says:

            @HeelBearCub:

            Datapoint: I was in the students-aiming-for-job-ticket-with-vacuum-attached-to-soul category. It’s a real thing, though college wasn’t nearly as bad as grade school in that respect.

            (also I never actually got the job ticket, though I’ve done okay anyway)

          • Guy says:

            @CatCube: Just on the “paying for classes you don’t attend” thing –

            Sometimes you’re paying for lecture. More often, in my experience (engineering/hard science), you’re paying for a reading list, a set of practice problems, a designated place to ask questions, and finally some proof that you did the reading/practice. Some people don’t notice the first two and use the third to just ask directly for the fourth. These people are idiots in every sense of the word. The people who do take the first two seriously, and for whom the last is relatively trivial, are the people who benefit from college.

            (I’d generally include lectures and lecture notes mostly with reading, and somewhat with the “question zone”, but I’ve generally found that the associated text is more important/useful. That might just be me, though.)

          • Tibor says:

            @HeelBearCub:

            American campus universities are quite a strange place from a (continental) European perspective. Of course, students also party and do all the other “college experience” stuff, but the university does not do much to provide them any special environment for it. It is usually not expected that all students live in a student dormitory; I don’t know the exact numbers, but I would say that probably the majority live in regular apartments, usually with other students. What the Unis in Europe do provide, and what is arguably unnecessary, are sports facilities. Other than that, though, it is just the stuff actually related to studying (and subsidized mensa food, but do not expect any culinary marvels; the purpose of the mensa is to have a place where people can fill their stomachs between morning and afternoon lectures without having to go too far from their lecture rooms).

            I’ve never been to the US, so maybe I have a distorted picture, but from what I read and hear, the US campuses are much more like a strange hybrid between a kibbutz and a hotel. It also seems much more expensive to run it. European universities are usually run by the state and either fully or mostly paid by taxes but they still seem to be able to provide their services cheaper – possibly because they do not provide as many luxuries to the students.

            Still, I would go much further and rebuild higher education in the language-certificate fashion. You can have a research centre which also provides certification. So when you’re ready to take an exam in whatever you want to learn, you come there, pay for the exam and only for the exam, and wait for the results. Each centre structures its exams into programmes, so that if you finish the list (and possibly do a more intensive project, corresponding to the current bachelor/master theses, for which you would find someone in that centre to guide you – and obviously pay them for the consultation hours), they give you a Bachelor or a Master diploma. You can decide who gets the right to issue these diplomas in two ways – either the state certifies the research centres, or you could do it in a decentralized way via rating agencies, similar to how it is done with many things today. You trust the agency because it has a good reputation; if it starts cheating (accepting bribes), it will soon lose that reputation, which is unlikely to be worth it – reputation is really hard to earn and even harder to regain.

            In either case, how you actually obtain the knowledge needed to pass the exams is up to you. And you can then simultaneously have people who go to traditional universities (and pay the traditional price), which include all of the “college experience” in their package. Or you can learn online/with books and perhaps pay for a consultation every now and then when you don’t understand something (there are many online courses which are actually done very well, and they’re generally free). Or you can do something in between. The point is decoupling the “college experience” from the actual learning, so that you can have the same amount of learning (hopefully we agree that that is the primary thing) for a much lower price.

            I think that when people criticize college, they generally criticize its inefficiency and wastefulness rather than the concept itself. In the same way, people criticize lower education – I am not a fan of standard schooling at all and in this case I think that even the concept is wrong (since everyone is supposed to learn the same thing and at the same time in a very unnatural environment).

          • NN says:

            American campus universities are quite a strange place from a (continental) European perspective. Of course, students also party and do all the other “college experience” stuff, but the university does not do much to provide them any special environment for it. It is usually not expected that all students live in a student dormitory; I don’t know the exact numbers, but I would say that probably the majority live in regular apartments, usually with other students.

            The difference with regards to dormitories between US and European colleges probably has a lot to do with the fact that the US is a lot larger than any European country except Russia, meaning that a lot more students will be going to colleges far away from their hometown. From what I gather, living in dormitories is less common at US colleges that mostly only attract local students. For example, at the University of New Orleans, which mostly educates New Orleans residents who are willing and able to go to college but are either unwilling or unable to leave the city, 90% of students live off campus.

          • Anon256 says:

            The US “college experience” is essentially being sheltered from Moloch for a few years so you can do fun and interesting things (and for many people it’s the only time they get to do anything interesting in their whole lives). Sheltering people from Moloch isn’t cheap, but by latching on to certain signalling arms races (credentialism and the idea that learning certain things is inherently virtuous and high-status) the universities have gotten other parts of society to mostly pick up the bill. Attempting to decouple credentials and learning from the rest of the college experience would expose students to Moloch; governments and parents would become less willing to pay for the full college experience, and employers less willing to tolerate as long a period of goofing off on resumes. Is the brief period of freedom between childhood and employment not worth defending from Moloch by whatever rationalisations and fences are available?

          • Dr Dealgood says:

            No, it isn’t.

            Maybe this is sour grapes, because I spent my time in college actually preparing for my career, but having a four year bacchanalia with other people’s money is pretty much the precise opposite of what young adults should be doing.

            Even assuming a 100% consequence-free environment, how exactly is drinking until you puke three nights a week supposed to be a worthy enough endeavor to justify the tens to hundreds of thousands of dollars it costs? There’s no reason you couldn’t take one hundredth of what you would have spent getting a degree in critical flower arranging theory and backpack across Europe with it. If anything you’d come out way ahead, since you’d in all likelihood have a more interesting experience while incurring less debt, and wouldn’t be over-credentialed for the work you’ll actually end up doing.

          • TheAncientGeek says:

            The stripped-down, non-residential approach has been implemented successfully in the form of the Open University.

          • Tatu Ahponen says:

            “The difference with regards to dormitories between US and European colleges probably has a lot to do with the fact that the US is a lot larger than any European country except Russia, meaning that a lot more students will be going to colleges far away from their hometown.”

            In the Nordic countries it’s a matter of cultural pride for young people to move away from their parents and often to a wholly different town as early as possible, so while I don’t have the statistics, I’m not sure the rates of people living away from their hometown (their parents, really – that’s what’s being discussed here, no?) are any different.

          • Tibor says:

            @Anon256: I see two problems – this is more or less poor people subsidizing rich people’s kids, so that they can have a few years off, while the kids of the same age who do not go to college actually have to work. This is especially the case in continental Europe where higher education is either fully or mostly paid by taxes, less so in the US where it is mostly covered by the parents (but the system still makes it more difficult for poor people to get the higher education because of unnecessary costs which have nothing to do with actual education).

            The second problem is that it is simply inefficient. If we want people to have a year or two during their late teens or early twenties to “find themselves” or whatever, then let them do that without the distraction of courses etc. It might even come off cheaper.

            And of course – as long as there is a way to get the same kind of education (or rather a certificate which is equally valued) without the luxuries then by all means, keep this kind of a system as well, have the rich kids enjoy the comfort of the campus if their parents are willing to pay for it.

            @Tatu: I was talking not so much about leaving town as about not living on a campus. Campus-style universities are quite uncommon in continental Europe as far as I know (also, I think that in most European countries, kids do not want to go too far from their hometown, although they usually do not want to live with their parents – though I read an article about Italians the other day according to which it is quite normal in Italy for 30-year-old men to live with their parents… mamma cooks and looks after the boy, so why should he move out? :)) ).

          • Anon256 says:

            @Tibor: It’s unfortunate that poor people are currently less likely to have the opportunity to spend a few years sheltered from Moloch, but this doesn’t seem like a reason to tear the system down, any more than the poverty of third world countries is a reason to tear down Western welfare states. Work to expand access to more people, not fewer.

            “have the rich kids enjoy the comfort of the campus if their parents are willing to pay for it” – The trouble is their parents won’t be willing to pay for it if there are socially-acceptable alternatives. Indeed, parents would soon find they had little choice but to take the cheaper alternatives in order to afford to compete in positional-goods bidding wars with other parents who had done the same.

            While it’s possible to imagine a better system than the status quo, we can’t just decree new social norms and have them stick. The current system co-opts a status arms race to subsidise youth freedom and basic research (without much impact on the Laffer curve). Collapsing this system is unlikely to lead to youth freedom and basic research being subsidised in more efficient ways; the money/resources will likely go to other positional goods instead.

            @Mark Atwood: What would you consider “social need”? Society is a tool which exists to increase freedom and fun, and college is where it does this most successfully. Mocking people who spend time on fun and self-reflection reminds me of the rats from the Moloch post who would mock those who spent time on art and leisure rather than devoting themselves single-mindedly to maximising reproduction.

            “I must study Politics and War that my sons may have liberty to study Mathematics and Philosophy. My sons ought to study Mathematics and Philosophy, Geography, Natural History, Naval Architecture, Navigation, Commerce and Agriculture, in order to give their children a right to study Painting, Poetry, Music, Architecture, Statuary, Tapestry and Porcelain. My grandchildren ought to study these things in order to give their children the liberty to find themselves while getting drunk three nights a week.”

          • Tibor says:

            @Anon256: I generally don’t see subsidizing someone else’s luxuries as a desirable thing. I generally do not like the welfare state because I think its flaws outweigh the benefits, but I can at least imagine a scenario where I would support some kind of welfare (something along the lines of a negative income tax). But I don’t see why I should subsidize someone’s leisure, especially if he’s rich, but even if he’s not. Sure, leisure and hobbies (even if those sometimes at least partly overlap with work) are essentially the ultimate goals of everyone, so we should not frown upon someone who likes to enjoy them – we all do. But even if you like subsidizing things, you cannot subsidize everything, and there are hundreds of things more worth subsidizing than hotels for moderately to highly intelligent people in their twenties.

            Also, I don’t think they really need much of a subsidy. The university can consist of the lecture rooms and everything that is necessary for the research (of course that does not concern bachelor students much, but if you want researchers there, you need it), and that’s it. That would be a lot cheaper. Of course, then you cannot have the Uni in the middle of nowhere; you have to have it in at least a medium-size town (100k people, let’s say). But all the leisure stuff the students like, they can still do – they just pay for each thing separately instead of a package deal like in an all-inclusive hotel on the French Riviera. They can even live together with other students; it is very common at least in German or Czech cities for students to share a big flat between 5–6 people. There are student dormitories too, but it would work without them as well (the prices in the dormitories are slightly subsidized, but the price difference between a place in the dormitory and a corresponding flat/room in a regular house is usually not more than 20–30%… sometimes you can actually get a better deal outside the dormitory, although then it is usually a really small room in a shared flat).

          • Anon256 says:

            @Tibor: I don’t mean to argue that it should be subsidised by taxes/government; my claim is more that the US social norm where middle-class adults buy their children (or past selves via debt) these expensive “packages” is better than the likely alternative. (Dorms vs grouphouses is an implementation detail that I don’t think matters much for this.)

      • grendelkhan says:

        I hope you’re not including CS in “engineering”. It’s notoriously hard to teach people to code (let alone to engineer software or to administer systems!), which means that frequently self-taught people do as well as the formally-educated.

        Honestly, it looks from here like everything short of traditional apprenticeships is just a kind of pale shadow thereof. Like, we know how to do education one-on-one, and apart from that it’s kind of a crapshoot. (It seems like too much of a coincidence that the field that has yet to be professionalized is the one where you can do a good job learning on your own.)

        • Outis says:

          Studying CS at university, especially in postgrad, teaches you things you would be unlikely to pick up on your own, but it does not teach you how to code. In terms of utility as a software engineer, it goes like this:

          Self-taught coder + formal CS education
          Self-taught coder
          Formal CS education
          “Hacker school” grad
          Vagrant
          Some of the PhDs I end up having to interview

          • Airgap says:

            “Vagrant” should be above “Formal CS education,” but otherwise reasonable.

          • Teal says:

            I agree with your list, BUT the problem with self taught coder + formal CS education is that they don’t know that they still suck.

            I can’t tell you how many times I’ve heard “I’ve been coding professionally for 8 years!” from a 22 year old. Daddy’s friend paid you $20 to make a website for him when you were 14, you want a cookie? Combine that arrogance with the “did I mention I went to STANFORD” and the resulting insufferable density risks forming a black hole. No, we aren’t letting you rewrite the core product in Scala or Haskell or Clojure. Now shut the fuck up and fix some bugs.

            The only way to become an excellent software engineer is to be a software engineer for a good period of time.

          • grendelkhan says:

            Teal, I’ve interviewed candidates who’ve been employed in the industry for ten years who can’t do FizzBuzz. (The struggle is real!)

            For me, it helped to occasionally touch code that I couldn’t understand, e.g., the Linux kernel, and remember that I might be the most competent engineer in the room, but it was a very small room. (I eventually moved to a bigger firm, where I’m a much smaller fish, and I’m quite happy here.)
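
            (For the curious: FizzBuzz is the whole of the exercise below – a minimal sketch in Python; the 1–100 range is the usual convention. That the task is this small is what makes failing it notable.)

                # FizzBuzz: print 1..100, substituting "Fizz" for multiples
                # of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both.
                for i in range(1, 101):
                    if i % 15 == 0:
                        print("FizzBuzz")
                    elif i % 3 == 0:
                        print("Fizz")
                    elif i % 5 == 0:
                        print("Buzz")
                    else:
                        print(i)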

          • HeelBearCub says:

            @Teal:
            Time in most jobs makes you far better at the job (with diminishing returns plus a “burn out” factor taking away from that). Even for already experienced workers.

            However, the question is why employers/other employees should spend their time making you more productive. Usually it’s because they have been satisfied you are worth the effort. You only really, really know once someone has been in the job a while, but, given that starting employees are colossally expensive to bring up to speed, one wants to maximize the likelihood that they will actually do so.

    • Eggoeggo says:

      >College
      >Children

      Boy, the infantilisation of college students got internalized fast.

      • HeelBearCub says:

        College is just the tail end of the mindset I am referencing. I’ve seen plenty of extension of this to HS, and there are also plenty of references to IQ determining outcomes irrespective of education.

        • vjl110 says:

          I feel pretty comfortable labeling college (at least in many cases) an inefficient potlatch.

          As the discussion starts moving back towards elementary education, I am not as confident being a skeptic. That said, it does seem like formal education might be a complete waste of time for natural autodidacts… which might explain why the ‘anti-education’ idea finds so many believers here.

          • The Nybbler says:

            I think it just looks that way because US primary and secondary education is so bad. Only the most autodidactical of autodidacts won’t benefit from some direction on what to learn and where to go looking for it, at the very least.

          • “Only the most autodidactical of autodidacts won’t benefit from some direction on what to learn and where to go looking for it, at the very least.”

            Yes. But that direction doesn’t require sitting in a classroom for many hours a day. It can be provided by conversations with people who know stuff, most obviously one’s parents.

          • Emily says:

            Well. If your parent is Milton Friedman, definitely.

          • Airgap says:

            I saw “Free to Choose.” David: Do you think it’s too late to change your dad to Tom Sowell? He seems much cooler.

          • @Air:

            I don’t think I look the role. And he’s only fourteen years older than I am, which makes it unlikely although not impossible.

          • Airgap says:

            You’ve clearly never seen “Trick Baby.”

          • ChetC3 says:

            Intelligent autodidacts are the people most in need of something like college. At least if they aspire to produce something other than outsider art.

    • Good Tea Nice House says:

      I don’t think sitting in a room for an hour or two with other people and learning things from a wiser person at the front of the room, perhaps based on material by an even wiser person, is an inherently bad idea. I do that every time I attend a conference, for example.

      But if that’s all college was, then it’d be harder to say college is (mostly) useless. The problem is, college is mostly NOT that.

      • HeelBearCub says:

        Wiser seems to be doing a great deal of lifting there. Or maybe not.

        What do you see college as then, exactly?

        • Good Tea Nice House says:

          Disclaimer: I am probably not a typical SSC reader. I’m not into the EA thing, I believe in God, I’m socially conservative, etc.

          When we say “college” I’m assuming we’re mostly talking about undergrad, and some amalgam of the educational experience of both STEM and non-STEM majors. Dorm life and campus activities are included too. The typical interaction with college sports (attending a few games, purchasing branded merchandise, sharing class with student athletes) might be considered on top of that. And we’re also talking about college as it’s situated in our society: as a thing that costs a lot of money and usually puts you into a mountain of debt, takes 4 or more years of your life to complete, and spits you out at the end with a piece of paper that a lot of places won’t hire you without but which doesn’t mean a whole lot in terms of your knowledge or preparedness for the practical challenges of everyday work.

          Sitting in a room with other people and learning things from a wiser person at the front of the room is a very small part of all that, and sometimes that wiser person is teaching things that are useless or worse than useless.

          The biggest problems with college are really part of the problem of how we treat education more generally.

          BTW, this isn’t to say that I think people only need to learn by going to the library and reading books (as at least one other commenter seems to be saying). I’m a big fan of the apprenticeship model, though it has its shortcomings too. I’m also not sure there is a less bad model when I consider what mass immigration and automation have done to a lot of the jobs that would otherwise be done by the huge proportion of people who simply will never be smart enough to earn meaningful college degrees (or in many cases, high school diplomas).

      • Glen Raphael says:

        @Good Tea Nice House

        I don’t think sitting in a room for an hour or two with other people and learning things from a wiser person at the front of the room, perhaps based on material by an even wiser person, is an inherently bad idea

        In high school I think it is an inherently bad idea.

        For the typical SSC reader, lectures are a waste of time because one could learn the same material faster and better by just reading a good book on the subject. If you look for a good book, you can find one written by one of the best book-writers in the country – not even limited to currently-living ones – and that book has been edited, so it’s not full of ums and ahs and mistakes and unplanned digressions.

        In finding a teacher, we’re stuck looking among the pool of people who live within easy commuting distance of the school and are otherwise unoccupied, and we get the best lecture they’re capable of that day; in finding a book, we can cast a national or even global net and get a product that took years to refine. That means the average book is orders of magnitude better (compared to other books) at accomplishing its instructional goals than the average lecture is (compared to other lectures). (And heck, if the lecture WERE better, you could transcribe it, edit it to tighten it up a bit, and… put it in a book!)

        So SSC readers would rather just read the book and ignore the lecture.

        Another problem is that lectures can only efficiently convey information when people are really interested and motivated to learn it. If a third of the class is behind and lost, a third is ahead and bored, and the remainder that might conceivably benefit isn’t terribly interested in the material and thus isn’t paying much attention, that teacher is lecturing mostly for their own benefit. Schools then become a jobs program for teachers and a daycare program for kids, with conveying information almost an accidental byproduct.

        When you go to a conference, your group is highly selected for being interested in that particular material taught that particular way and the conference company has market feedback that gives them an incentive to get rid of boring bad instructors and instruction methods. Schools don’t have such feedback or such highly qualified and pre-selected students, so what works in conferences might not work as well in schools.

        Lectures were invented in a time when books were so expensive it made sense to have one guy stand up and read from a single book to the class. Now that books are cheap, I’m not sure the value proposition is still there.

        • Salem says:

          Books have been cheap for almost 600 years. That is plenty of time for the market to adjust to the new “value proposition,” yet lectures in particular, and in-person tuition in general, are very far from dying out.

          My local library is full of fascinating textbooks on every academic subject under the sun. Why bother going to university, when you can just go to the library? If you spent 40 hours a week reading those books in the library and working through the examples, I’m sure you’d learn much more than if you spent the equivalent time going to lectures and tutorials, but I’m also sure that that section of the library is deserted. Lots of people are checking out the James Patterson and Danielle Steel novels, though.

          If you think a book is a near substitute for a lecture, could it be that you are missing an important part of what a lecture provides? I know that I don’t consider them near substitutes, and I am far less certain than you on the views of the mythical “typical SSC reader.”

          Perhaps a closer substitute for a lecture is a youtube video of a lecture. That scales well, can be rehearsed and edited, and consumed at a time convenient for the customer – all the things you consider so important. This kind of thinking is behind MOOCs, etc. We will see how it goes, but I predict mostly failure. A youtube video of a lecture seems to me too much like a book, and too little like a lecture, to be successful.

          • Tibor says:

            I ditched a lot of lectures during my bachelor’s and mostly learned the stuff from books. I studied “general mathematics” (which included the basics of mathematical analysis, algebra, probability theory, statistics and numerics, as well as a bit of computer science) in my bachelor’s, and probability theory during my master’s (there the lectures were more specialized and a bit less basic, so also more local… but even then I probably only came to something like 75% of the lectures). The main reason for me to come to the lecture was to make myself actually go through the material before the exam period. But if I had more willpower, I could probably have ditched two thirds or even more of the lectures and read everything from books, plus an occasional consultation with the lecturer (I usually read and tried to understand all of the lecture material, then wrote a list of questions and arranged a consultation with the lecturer where I asked them).

            I think that the main reason we have the same kind of lectures people have had for hundreds of years is that somehow it is perceived as “less serious study” to just read books, even though usually the opposite is true.

            There are some obvious exceptions – if you have to work in a lab in your field, then you are back in the scenario where study material is expensive (or impossible to get legally as a private person – certain chemicals or dead bodies are obvious examples), and so it makes sense for people to meet in a room with a lecturer. Also, one-on-one studying is undoubtedly superior to just reading books, but this only happens during the PhD; undergrad lectures cannot really be tailored to specific individuals.

          • grendelkhan says:

            Why bother going to university, when you can just go to the library?

            Aren’t we pretty sure the answer is ‘signalling’? Reading books and getting an education doesn’t get you a credential, and that’s what the education system is.

            Teachers pretend to teach; students pretend to learn; employers pretend that it really matters that you have a well-rounded liberal arts education. A closed-loop Human Centipede of lies and fakery. It’s common for educators to complain about it.

          • Salem says:

            I heartily agree that credentialism is part of the answer. One of the things that the lecture series provides is a credential at the end.

            It’s not the whole thing, though. One of the big things they’re selling is motivation. Even Tibor writes:

            The main reason to come to the lecture for me was to make myself actually go through the material before the exam period. But if I had more willpower, I could probably ditch two thirds or even more of the lectures and read everything from books…

            Unfortunately he can’t take that thought any further, and falls back on “oh, we go to lectures because that’s how it’s always done.”

          • CatCube says:

            I personally much prefer lectures. I *can* learn from a book, but it goes much more easily when I have an expert walk me through examples, and can fill in gaps in response to my questions.

            I don’t even like webcast lectures; I much prefer in-person.

          • HeelBearCub says:

            I heartily agree with @Salem here.

            One of the things that bugs me about the SSC zeitgeist around education is that it assumes we can ramp up “autodidact” to a whole population with, as near as I can tell, zero evidence.

            Scott links me to a post where the best study on unschooling he can find is N=12, and never mentions selection effect at all. If this were a post citing a study on the latest research on teaching “grit”, I think we know what Scott would say.

          • Stan le Knave says:

            I find that for learning concepts rather than facts, a one-to-one chat is many, many times more useful than reading a book, to the extent that a five minute chat with a lecturer has brought me to understanding when hours of reading the material was insufficient.

          • “If you think a book is a near substitute for a lecture, could it be that you are missing an important part of what a lecture provides?”

            Yes. But after thinking about the question for several decades I have not yet figured out what it is.

            My view of the subject amounts to Glen’s argument plus the puzzle you raise – why the mass lecture survived the invention of the printing press. Small classes make some sense because of interaction, but a lecture with three hundred students in the room seems in all important ways inferior to a book. Yet they are still given.

          • Surely the point of lectures, as opposed to books, is to help people overcome akrasia. If you have a book in your possession, you can start reading it any time you like, and once you’ve started you can stop reading any time you like. The scheduling of your study sessions is entirely up to you. But if you’re going to lectures, you have to keep to the lecturer’s schedule. Even if there are no penalties for you if you miss a lecture, the lecturer’s schedule is a Schelling point.

            I’m a university student. If I had been learning from books alone instead of lectures, I think it’s possible I might still be about as knowledgeable as I am now about the subject I’m learning, but I think my knowledge would be more focused on the particular areas of the subject I’m interested in, and patchier in its general coverage. That might be a worthwhile trade-off; I don’t know. But I’m probably an unusually conscientious student. I think most people probably end up learning more via lectures than they would if they just read books.

          • Vox Imperatoris says:

            I just wanted to chime in that I always got a lot more out of college lectures than the books. Even when I actually read the books, which I often didn’t because the assigned sections are often too long to reasonably do in combination with everything else, kind of boring, and not very helpful for getting an A on the exam.

            For instance, I think I got a much better understanding of Aristotle’s or Immanuel Kant’s philosophies from listening to a professor talk about them than from trying to read that stuff. This has some relation to the fact that what’s usually assigned for the readings are primary sources, which I tend to think are of…dubious value, especially as introductory material.

            But I’ve always considered myself more of an “auditory learner”, as much as that distinction is valid. I tend to remember what I’ve heard significantly better than what I’ve read, and I definitely find it much easier to focus on auditory material. That said, I can’t stand watching videos on the internet explaining how to do something because the people are too long-winded and talk too slow.

            In any case, I always hated every kind of “participatory” or “hands-on” project in school. My favorite type of education has always been the “sit quietly while you’re being lectured to” model. With questions allowed, but not when it turns into the students talking more than the teacher.

          • Airgap says:

            I don’t see why you kids are trying to explain mass lectures and other educational artifacts rationally when the real explanation is path dependence. Big lecture halls are what we’ve always done. If there was a better way, and if the system were subject to pressure (e.g. market forces) to find the best way to impart knowledge, we would change. We haven’t. Determining which of the above conjuncts is false is left as exercise for the reader. Anyhow, that’s all for today. See you next class. Don’t forget to study for the midterm next tuesday!

          • Glen Raphael says:

            By “typical SSC reader” I mainly meant “person with >130 IQ who enjoys reading”.

            So… consider a typical math class in front of 30 kids. The teacher is not as smart as the brightest kids in the class to begin with, and that teacher is presenting a lecture which has to cover the material slowly enough that the class doesn’t get lost and confused, which means repetition and overlap. You start each year with a week or more of review that assumes kids don’t remember anything they learned the prior year (which is not true for some) and move on from there at a pace such that the kids in the bottom third of the class can keep up. This means kids in the top third are bored out of their skulls. I survived most classes by ignoring the lectures as I sat in the back and read books hidden under the desk, occasionally glancing up at the board to see if I was missing anything. At the time I thought I was getting away with something, but looking back I’m sure my teachers knew and didn’t mind as long as I wasn’t disruptive and at least occasionally pretended to be paying attention.

            In math class, the teacher discussed the same material that was in the book only less efficiently; the best use of my time was to either read the book *or* pay attention to lectures, not both.

            I feel like I learned much more despite my classes than because of them – I could have learned more, and faster, without them.

            But this argument applies much more strongly to grade school and high school than college. And it doesn’t apply at all to classes teaching things you can’t learn directly from a book, like pottery or drawing, or to classes where the teacher is for some reason unusually gifted. But it applies to the typical class at the typical high school.

            It applies the most strongly to situation where the teacher sees their job as spelling things out – elaborating on how to apply what’s in the book – before students who don’t need that service because they read the book and understood it.

            Regarding “motivation”: one reason it’s hard to motivate kids to learn is that you’re teaching them something they don’t yet need to learn. If they wait and learn it later when it’s relevant to something they need to do, it’ll be easier then.

            The good news is indeed that online universities are likely to disrupt this. It’ll be interesting to see how it all plays out.

          • “But this argument applies much more strongly to grade school and high school than college. ”

            Your example was math. For the most advanced math classes I took (nominally grad students only), there was no book. I took notes from the blackboard, made no attempt to keep up or follow in any detail what the professor was teaching, learned the course in the week or so before the final from my notes. It would have been a little easier from a book.

          • Glen Raphael says:

            @David:
            Clearly you take better notes than I do! 🙂

            One reason I thought my argument applies somewhat less to college is that fewer college students are legally required to be present in the room. If everyone who really doesn’t want to be there either doesn’t sign up for the class or doesn’t bother to attend, the teacher can teach faster. Another is that college professors might somewhat more plausibly have unique insight into the material or be teaching something for which the definitive book hasn’t yet been written.

          • “Another is that college professors might somewhat more plausibly have unique insight into the material or be teaching something for which the definitive book hasn’t yet been written.”

            It doesn’t seem likely. Most fields don’t change that rapidly, and only a small fraction of college professors are actually original thinkers with unique insight.

            The one clear advantage of a class is interaction, but that doesn’t apply to large lecture classes, which are the ones whose survival I find puzzling.

          • HeelBearCub says:

            I asked questions in all, or at least most, of the large lecture classes I had. I have to think I wasn’t the only one who got benefit out of that, if for no other reason than they made lectures less predictable.

            My father used interactive feedback devices to make his large lecture Econ classes interactive. He regularly used that method to do things like market simulation.

            Large lectures can most certainly be interactive, to some extent.

          • Tibor says:

            @Salem: I think you could set up a better mechanism than going to the lecture, though, and so motivation really is not the main reason lectures are held. A typical lecture lasts 90 minutes. In those 90 minutes I would usually mostly copy the blackboard into my notes (and I would not even do that when there were good lecture notes provided by the lecturer, which was sometimes the case). Maybe some people are better at concentrating, but to me this means paying less attention to what the lecturer is actually saying than if I just listened. This hardly seems like the optimal way to motivate people to study.

            If nothing else, especially in the big bachelor-level lectures, the presence of the lecturer is almost unnecessary. For example, you could arrange scheduled meetings of the students where they read the provided lecture notes together (they could then also meet in much smaller groups). Then some questions could be answered by other students, and for the remaining ones you could have a consultation lecture every few weeks where the lecturer would answer these. When there were good lecture notes provided, this was often more or less what I did (except that I usually read them alone or maybe with one other student), and I would come to the lecture from time to time mostly to show up, because I thought the lecturer would not like a student who never comes to the lecture and then only shows up for the exam. And even though I did go to some lectures in order to study at a steadier pace (as opposed to learning everything during the exam period), I still had to do most of the work at home; being at the lecture gave me a rough idea of what the lecture was about, but I still had to go through the notes afterwards. 90 minutes at the lecture saved me on average something like 45 minutes of reading on my own.

            Exercise classes were very different, not just because attendance was often required but also because it was actually useful to be there. I could also usually get a more clear understanding of the concepts from the lecture there. 90 minutes at the exercise class could save me as much as twice the time studying at home.

            In my master and in the few lectures I took during my PhD there were usually not more than 5 people attending the class, on two occasions there were only two of us attending the class. In these quantities, one can really do things a book cannot, you can also stop and explain something every time someone does not understand it.

            I also have another problem with lectures – I am usually able to follow the line of argument, but to put everything together into a bigger picture I have to go back and reread the whole thing. It happened to me quite often that we were proving a really complicated and technical theorem, and at the end of the proof I had forgotten what we were actually trying to prove and would have to backtrack a bit. Other times, something was explained in more detail than I thought necessary, because the argument was (to me) obvious. This happens even in small lectures, but there you can usually actually stop the lecturer and ask. If everyone did it at a lecture with 200 students, it would take twice as much time and half of the people would get bored quickly.

            At the end of the day, like my PhD advisor says – you don’t really understand anything before you start doing it. Unless you actually start applying what you learn yourself (at least in the form of a bachelor/master thesis), your understanding will be very superficial at best (more likely, you’ll forget a lot). Of course, in maths you get to apply a lot of the basic stuff while learning the more advanced things, so this is less an issue with calculus or linear algebra but more with more advanced (and specialized) material. I think this might be even more true in other fields where things are less built on top of each other than in maths.

          • HeelBearCub says:

            @Tibor:
            In regard to note taking, I believe there is good research that, for many/most people, hearing something and then writing it down leads to far better memory retention than just listening. The same is also true for reading, but I seem to recall it not being as effective.

            Take that with a grain of salt, because I certainly didn’t read any primary papers on the effect.

          • Tibor says:

            @HeelBearCub: I think that’s true; it definitely works for me, although I suspect that the writing bit is the most important. I can still recall one time in grammar school when I was either ill or on vacation with my parents for a week or two and then borrowed my classmate’s physics notes to copy them (I think we did not have a scanner at home back then). I wanted to be examined afterwards, so I planned to copy the notes and then learn the material. But to my surprise, I had already learned it while rewriting it from my classmate’s notes.

            I also used to write theorems and proofs during my bachelor and master while studying for the exam instead of just reading them, or rather I would first read it and then try to recreate the theorem and the proof on paper (eventually, I would write the main ideas of the proofs only since I was confident I could work that out if needed, but especially in the bachelor when I was a complete beginner, I would usually write the whole thing while studying).

            Even today, when I read a paper for example, I like to make “paper notes” if the paper is technical and complicated.

            I also find colloquium talks or conference talks useful. Well, some of them… especially other PhD students often give quite horrible talks, full of formulas and technical details which one simply cannot follow in the 30 or so minutes a typical conference talk has. But there are amazing talks sometimes, when after 30 minutes I feel like I understand all the ideas perfectly and all that is left are boring technicalities (these are usually talks by much more experienced people). But lectures are different. There you actually do all the boring technicalities, and I have a horrible attention span (I can sort of pay attention for one hour; after that I keep zoning out without noticing, and sometimes it happens even sooner), so often I would realize I had missed the last 5 minutes while thinking about something completely unrelated (without even realizing it) and could not keep up with the last argument because of that.

            If lecture talks were structured more like good conference talks – omitting the details, giving the main ideas and the big picture and assuming that the rest would be filled in by the students themselves, perhaps using the provided lecture notes (where it would be written in detail), then they would be a lot more useful. But this is very rarely how things are done in practice.

        • Skivverus says:

          Lectures are, I think, still useful for the same reason newbies in online games ask their questions in chat rather than consulting the manual – substitute “textbooks” for “manual” and “academic subjects” for “online games”.

          A textbook will (probably) contain the information you’ll need for your specific inquiry, but it also contains a lot of other, less-immediately-relevant information, and however trivial a skill “checking the table of contents and index” is, very few people master it to a degree where it’s faster than just asking the expert in residence. Plus, some people just learn better with their ears than their eyes.

        • dimestoreinjun says:

          I’m wary of generalising these observations for three reasons: (i) I’m from Ireland, which features heavily subsidised university education and isn’t the poison debt pit of the US, (ii) I studied undergraduate law, which is as much a craft as it is an academic subject and (iii) I think a lot of these arguments are being made in the context of the US, specifically undergraduate liberal arts courses of dubious utility. That said, here are some generalised observations.

          In the first place, I think it’s a bad position to start from the assumption that many or most people who receive a university education are similar to the aloof latter-day philosopher kings that populate the SSC comments section. I would’ve been heavily in favour of the approach whereby people simply read books to learn, as opposed to having those books effectively read out to them, but having subsequently met a large number of people who would describe that process of self-learning as a literal (figurative) hell, I’m sympathetic to the idea that lectures work, and work well, for a lot of people.

          Going to lectures and following them through demands a degree of rigour and thoroughness that I don’t think I would’ve supplied myself had I been allowed to study at my fancy. When I didn’t attend lectures I produced some very good stuff on esoteric and interesting topics but would find myself struggling to remember precisely the basic ingredients for negligence in tort. I felt very unmoored from the boring but important workaday elements of the law.

          Being lectured by somebody who has good practical experience in a subject was invaluable. It provided a context and savvy professional awareness of the law that simply couldn’t have been replicated by the experience of sitting in a room reading the law reports cover to cover.

          Lectures don’t work particularly well for people who are going to work at the top of their field or in an academic position producing new scholarship in that field. But I don’t think that’s their ambition. They’re designed to force large numbers of students to remember and engage with large volumes of boring but necessary information while providing some minimal level of personal interaction and reassurance as well as the ability to ask for in-person clarification of particular problems that the student might be experiencing. Combined with tutorials, they’re quite good at doing that.

          • “Being lectured by somebody who has good practical experience in a subject was invaluable. It provided a context and savvy professional awareness of the law that simply couldn’t have been replicated by the experience of sitting in a room reading the law reports cover to cover.”

            How about sitting in a room reading a book written by someone with good practical experience–whatever person among those with such experience who does the best job of writing such a book? What is it the lecturer can give you in a lecture that he can’t give you in a book, and how does it make up for the fact that you get to be much more selective in books than in lecturers?

          • Anonymous says:

            Have you really never heard of the typical mind fallacy?

            This is like Rafael Nadal going around a tennis forum and incessantly asking people why they can’t just hit the ball 120 MPH. It’s easy!

          • Hlynkacg says:

            You can’t ask a book questions, and books generally don’t tailor their output to the audience.

          • dimestoreinjun says:

            “How about sitting in a room reading a book written by someone with good practical experience–whatever person among those with such experience who does the best job of writing such a book? What is it the lecturer can give you in a lecture that he can’t give you in a book, and how does it make up for the fact that you get to be much more selective in books than in lecturers?”

            As the guy below said, I think you might be generalising from your own experience as a very smart, autodidactic person.

            Keep in mind that you’re dealing with undergraduates, who for the most part have little to no purchase on the academic intricacies of the subject that they’re studying. So while you might be able to sort good scholarship from bad, relevant from irrelevant, contemporary from outdated, most people entering into a subject haven’t the faintest idea where to start. Without the guidance that some form of knowledge curation (like lectures) provides, it can be a very disheartening experience to plunge into a new field of knowledge.

            One might say that these goods could be provided by a Youtube series, and I’m very sympathetic to that argument; I’m sure there are many other factors at play, but considering that the core goods sold by a university are brainy people talking in a room full of people and brainy people thinking silently in a room alone, it seems ridiculous that it should cost so much.

            At the same time, I think it underestimates the importance of in-person interaction with a lecturer and the sense of shared competition/suffering/enlightenment that comes from going through lectures with a peer group.

            I suppose the classic example in law would be a professor saying: “The law is X, though in my years of practice I have found that it applies with corollary Y, or is only relevant where Z is pleaded.” Perhaps they give you a broader context within which a regulatory scheme applies. Or they give you an anecdote from practice which provides insight into the life of a corporate lawyer. You can imagine. Surely as a lawyer you’ve come across legislation or a legal anomaly which seems ridiculous on its face but made sense once someone contextualised it for you?

          • “Surely as a lawyer”

            I’m not a lawyer. I’m an academic economist who teaches in a law school.

            So far as your general point, a good book, like a good lecture, points the reader at the things he needs to learn, so I don’t see the advantage of the latter. I’m not suggesting that people replace a lecture with a research program, which is what you seem to be describing. And I still don’t see why the sorts of information you describe in a hypothetical law school lecture couldn’t be provided just as well in a book.

          • smocc says:

            @David Friedman

            Do you have some examples of the good books you’re talking about? My impression is that you’ve gained basic competency in several fields through self-study. Each time you did that did you use a single book, or multiple? Did you have to try bad books before finding a good book? How did you know which books were the good ones?

            I ask partly because I’m not sure I have ever learned an entirely new skill from a single book. I learned more from Taylor’s Classical Mechanics than from my lectures, but that came after several years of studying physics and math, and I learned group representation theory from a book, but only after two very good lecture courses on math foundations and abstract algebra. Honestly I’m having a hard time imagining the sort of good book you’re talking about.

          • HeelBearCub says:

            @David Friedman:
            Perhaps you have put me on mute, but I would still like you to address the idea of whether you can confidently assert that lectures and books are perfect substitutes for each other.

            Paring knives and butter knives are far more similar to each other than books and lectures, yet each is very much unsuited to the task done by the other.

          • Glen Raphael says:

            @smocc:

            Do you have some examples of the good books you’re talking about?

            I ask partly because I’m not sure I have ever learned an entirely new skill from a single book.

            I learned the new skill of counting cards at blackjack from a single book, Million Dollar Blackjack by Ken Uston. It’s not a great book but it was the first one on the subject I randomly came across. I had learned that skill once I read that one book, spent a bunch of hours practicing, then spent some hours playing in a casino.

            (Since then I’ve read a dozen other blackjack books – my favorite of the later set was Blackbelt In Blackjack by Arnold Snyder – and subscribed to newsletters and forums and revised my strategies many times, but that first book got me started.)

            Heck, even when I joined a blackjack team which required learning a new system and mastering new rules about money management and checkouts and such, that stuff all got written down. The info I needed was provided in a text file and emails, not a classroom lecture.

            Or as a second example: for both libertarian-ish political theory and economics, my gateway book was Free To Choose by Milton Friedman. (sometime after that I went through a phase of reading economics textbooks for fun, but FtC instilled the economic way of thinking and gave me the idea there was something worth knowing in there.)

            I don’t think I’ve ever read just one book on a subject I’m interested in, but I’ve often had the experience that the first book changes the way I think about that subject, primes me to want to read more, and provides pointers to other relevant books or related topics.

            I could list lots of other specific books that worked like that for me but honestly if you just go to any bookstore or library, wander through the section related to the topic you’re interested in, pick up and skim texts at random until you find one that looks kind of interesting and seems to be written well…that’s the one you want. Read it all the way through (it helps to be a completionist) and you’re on your way to understanding that subject better than you did before. Now find and read ten more and you’ll start to have a sense of the field.

            (this method has served me especially well in bookstores near a college campus or in a subject-specific university library, but it probably generalizes to wherever you are.)

          • Glen Raphael says:

            @HeelBearCub

            Perhaps you have put me on mute, but I would still like you to address the idea of whether you can confidently assert that lectures and books are perfect substitutes for each other.

            Didn’t David directly answer that question here? He said there might be some advantage to lectures but (after much thought) he doesn’t know what it is.

            As for me, I can think of a few conceivable advantages to lectures. For instance:

            (1) There is a theory that some people are auditory learners – they learn better from hearing than from reading. Maybe they get something out of vocal tone or body language that helps retain info better, or maybe they’re poor readers or have especially good hearing or especially poor eyesight or need glasses. Those people should attend lectures or perhaps read books-on-tape.

            (2) There is a theory that teachers aren’t really there to convey knowledge at all. “You teach who you are” is the phrase. The teacher is there to be a positive role model, to provide the students with an example of a functioning adult person who they can emulate as they themselves try to become functioning adult persons. The material is only an excuse to bring students and teacher together. From this perspective, when the teacher digresses from the topic to discuss, say, where they went on vacation or what they had for lunch, you might be learning more of what really matters than if they stick to the material!

            (3) There are (very very rarely) some subjects in which scholarship is changing so quickly that a good book on the subject doesn’t exist yet.

            (4) There are some subjects in which the lecture is inherently hard to write down, perhaps because there are physical demonstrations involved (eg, physics experiments).

            (5) Some lecturers are really entertaining speakers or are really famous, such that students have fun or like to be able to say they met or learned from that person in person, regardless of whether they learned much. Or students might hope to glean additional nuance on a topic they already know really well. (It’s kind of like the difference between a live concert and listening to the album.)

            I am sure #1 is true for some people, though it didn’t apply to me when I was in school. I have been motivated by #5 and #2 a few times.

            Upshot: it’s easier to explain the persistence of lectures if one drops the premise that the sole point of a lecture is to impart knowledge. Because in the main, books do that better.

          • HeelBearCub says:

            @Glen Raphael:
            Thanks for pointing me at that comment. I had read it, and it predates my question to him. It’s a little unclear to me, but it seems to me he is only engaging in epistemic humility, allowing that he might be wrong about lectures and books being substitutes. He does not seem to be saying he thinks there might be advantages to lectures.

            And my question is, again, why would we think that two things that are so very different from each other would not have comparative strengths and weaknesses? Most college classes use both.

            As to your points, that seems like a good list, but I think you are really underselling #1. Universal US literacy (let alone world literacy) is, what, less than 50 years old? Whereas homo sapiens verbal communication is at least 100,000 years old? And primate language of some sort is probably much older?

            I know which one my money is on for being more prevalent in terms of learning styles.

            I also think that missing from your list is the concept of reinforcement. Most learning requires repetition. Ideally, you read an assignment before the lecture, you hear the lecture, you write down the lecture in the form of notes and you discuss the lecture in study groups or sections. Not only are you repeating the information, you are doing it in different ways, and I believe research supports that doing it in different ways is better. If you could smell the information, it would be even better. 😀

          • Good books:

            My standard example is _The Selfish Gene_, which teaches evolutionary biology and makes it fun. I thought of that as in part a model for my _Hidden Order_, which tries to do the same thing for price theory.

            I learned quite a lot from Marshall’s _Principles_, the book that really put modern economics together a little over a century ago.

            If I hadn’t studied physics in class, Feynman would probably have been a good way of learning it.

            Poetry, one of my interests, I learned almost entirely by reading poems, not either by taking classes or reading books about poetry.

            _The Elements of Style_ is a pretty good book for learning to write better. But I think most of my training in writing was doing it. For some years I was the token libertarian columnist for a conservative student magazine. I had a word limit, I think 800 words, and fitting what I had to say into it was very good training.

            Law I have learned in part reading articles, in part teaching courses, which sometimes involved reading the textbook.

            But a lot of what I learned was from talking with people and teaching courses. When I came to VPI as an assistant professor of economics, I got assigned, I suspect deliberately by Jim Buchanan, to teach a large part of the syllabus. Teaching things is a good way of learning them.

            More generally, I think my learning was more something that happened in my head, although books helped. When I started writing my price theory text, if you had asked me how many pages it would take to explain the subject I would probably have guessed fifty or so. Actually doing it forced me to think through the ideas underneath the ideas, and the book ended up at something like ten times that length–teaching me quite a lot.

            So I’m not that good an example for my argument. I sometimes say that you never really understand an idea until you have invented it yourself.

          • @HeelBearCub:

            I don’t think I’ve put anyone on mute–I’m not even sure how to do it. If I find comments uninteresting I just skip over them.

            Books and lectures are not perfect substitutes. My argument is that the book is a superior substitute for the mass lecture, not always for the class small enough so that most students have substantial interaction with the teacher.

            And I recognize that the continued use of mass lectures is evidence against my argument. That’s a puzzle to which I have not found an adequate solution.

          • HeelBearCub says:

            @David Friedman:
            Thank you for clarifying. My brief thought on why the mass lecture continues in higher education is simply that it is a gate preventing those who have not been exposed to the introductory concepts from taking higher level courses. Gates of course don’t only prevent entrance, they also mark the entrance, so one can think of mass lectures as marketing for the field of study.

            It’s not clear to me, however, what this has to do with college writ large, as mass lectures are significantly in the minority of courses taken.

          • Glen Raphael says:

            @HeelBearCub:

            It’s not clear to me, however, what this has to do with college writ large, as mass lectures are significantly in the minority of courses taken.

            Wait, they are? How are you measuring that?

            My undergraduate degree was at UC Berkeley (which had 30,000 students at the time); nearly every class I took was a mass lecture. (As an undergrad at Berkeley if you want fewer than 50 people in the room, that’s what discussion sections or office hours are for.)

          • suntzuanime says:

            That may be a feature of Berkeley specifically; I went to a large state school, but the huge lecture halls were mostly just for introductory classes. After the first couple years all my classes were in reasonably-sized classrooms, maybe thirty students at most. I’m really struggling to see how you would hope to teach upper-level undergrad stuff in a mass lecture.

          • HeelBearCub says:

            @Glen Raphael:
            73% of Berkeley’s undergraduate classes have fewer than 30 students.

            Perhaps this was a function of your particular major? When did you graduate and with what degree, if you don’t mind me asking?

            If your experience of college was entirely 400-student lectures, well, you didn’t have anything like the typical experience.

          • Glen Raphael says:

            HeelBearCub:

            When did you graduate and with what degree, if you don’t mind me asking?

            Graduated UCB in 1990. I completed a major in Computer Science and most of a minor in Philosophy. So the big classes were in CS, Philosophy, music, various electives and core requirements.

            At the time, CS was “an impacted major” (meaning more students than they could reasonably handle wanted to get in). It still is today. Intro to CS is the largest class at Berkeley, serving ~1100 students in a lecture hall with 750 seats.

            There is a fantastic video illustration using pills to show class sizes at the bottom of this article.

            The claim “73% of classes are <30 people” has an obvious implication which is false due to selection effects – it doesn’t mean an average student experiences mostly small classes.

            UPDATE: hang on, let’s do some math on that. Suppose we divide Berkeley classes into two types we call “popular” and “unpopular”.

            An “unpopular” class is one taught by an unpopular professor on an unpopular subject, connected to an unpopular major. To make the math easy, let’s say the average class size is 10.

            A “popular” class is one that serves an especially popular subject or major or is a basic intro to something important, or it is a class taught by a superstar professor (say, a philosophy class taught by John “Chinese Room” Searle). These classes overflow the room, have students standing in aisles and turned away due to overcrowding. Let’s say the average class size is 500.

            73 “unpopular” classes of 10 = 730 students.
            27 “popular” classes of 500 = 13,500 students

            Given that math, at any given time about 5.1% of students are in the small classes and 94.9% of the students are in the big classes.
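
            (A quick sketch of that class-weighted vs. student-weighted arithmetic, for anyone who wants to check it; the 73/27 split and the class sizes of 10 and 500 are the illustrative assumptions above, not real Berkeley data.)

            ```python
            # Toy model of the class-size selection effect described above.
            # All numbers are illustrative assumptions, not Berkeley statistics.
            small_classes, small_size = 73, 10   # "unpopular" classes
            big_classes, big_size = 27, 500      # "popular" classes

            small_seats = small_classes * small_size  # 730 enrollments
            big_seats = big_classes * big_size        # 13,500 enrollments
            total_seats = small_seats + big_seats     # 14,230 enrollments

            print(f"{small_classes / (small_classes + big_classes):.0%} of classes are small")  # 73%
            print(f"{small_seats / total_seats:.1%} of enrollments are in small classes")       # 5.1%
            ```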

            Students who spend extra time, effort, and creativity studying the ratings and strategizing out how to get into especially good classes would be expected to spend even more time in large classes. Students who pick really popular majors would also be in a large fraction of big classes – I did both those things. On the other hand, students who pick an especially obscure major might get to spend nearly all their time in small classes, seeing the same few people all the time in them.

            I’ve exaggerated the case a bit for dramatic effect, but not all that much. Possibly the best thing I learned from Berkeley was the value of persistence. When you really want to crash a 600-student class which you didn’t get because it’s “full”, if you just keep showing up eventually the other dozen or two people trying to add will give up and leave. A week or two later you’ll have no trouble adding the class, the Prof will just sigh and say “are you still here? Sure, I’ll sign the add form.”

          • HeelBearCub says:

            I completed a Math Degree (which was really a Comp. Sci. degree) from UNC in ’91.

            My experience was absolutely nothing like that. Large lecture classes were reserved for intro to Psych, Econ, Geography and maybe a few others. Comp. Sci. courses topped out at about 50 (maybe 100 in Intro to Comp. Sci?) and got smaller the more advanced they were. Math classes were around 30. Physics was around 50. I took a variety of general college courses in a variety of other departments, all (or almost all) of them 30 students or less. I recall that all my large lecture classes also had sections which met weekly as well, which were maybe 20 or 30 people.

            Our required course of study didn’t let us take mostly large, 100+ lecture classes. The required course work routed you through mostly smaller classes.

            My daughter is currently at a school of 800 where most of the classes are 15 or 20 people. I don’t think they even have a large lecture hall.

            Edit:
            And that article seems particularly tied to programming being integrated into more and more other fields of study, leading to lower level courses being swamped with demand. That’s not an example of what a typical bio, history, poli. sci., psychology, etc. major would experience.

          • suntzuanime says:

            Yeah, that sounds like a problem relatively unique to Berkeley. I remember when I was shopping around for colleges I visited Berkeley and thought the place seemed dingy and poorly maintained. Maybe they just don’t have the funding to do a proper job teaching their students? I wouldn’t generalize to the average undergrad experience based on that.

        • Anon256 says:

          Note that adult second-language classes (particularly for immigrants) are subject to few of the distortions present in other forms of education (the people paying are generally the ones benefiting, they can mostly tell whether they’re learning what they should, government involvement is minimal, the focus is on useful skills not credentials or abstract virtue). But these also often take the form of small lectures.

    • E. Harding says:

      Reading, writing, and arithmetic are useful. Algebra is sometimes useful in daily life. Geometry is rarely useful in daily life (at least, for me). Everybody forgets their science classes, but everyone should know some basic science stuff (you leave more sweat on your shirt in humid air, man evolved from apes, enzymes are proteins, stuff like that).

      I don’t know if I can decide where education crosses over from useful to useless. Theology is definitely useless, electrical engineering is definitely useful.

      • Wrong Species says:

        Is electrical engineering useful to the guy who doesn’t want to be an electrical engineer? Sure, they might be better positioned to do something in the event of an outage but we don’t give mandatory plumbing classes.

      • Friday says:

        Presumably someone thinks theology is worthwhile.

    • Jason K. says:

      The problem isn’t education per se, but that the education we teach tends to be low value. Most of formal schooling is little more than an information cram, which isn’t very helpful for most people. It is like this because it is easy to write tests of memorized data. The important lessons are (almost) never formally taught. Instead we are expected to acquire them along the way.

      Perhaps I am wrong, but how many people that went through the school system learned (in class):

      1: How to manage your finances.

      2: How to complete basic repairs/maintenance.

      3: How to negotiate.

      4: How to identify manipulation and lies.

      5: How to interact with the government (police, social services, etc).

      6: How to choose a life partner.

      7: How to become employed.

      8: How to choose a profession.

      These seem to be absolutely essential things for any person to learn to make the best out of their life, but I wager not 1 in 20 people covered more than 2 of these things in ‘school’.

      • Outis says:

        1. Do not spend more than you make. From a Dickens book where a character says that, for a man with an income of 20 pounds, spending 19 pounds and 19 shillings is happiness, but spending 20 pounds and one shilling is ruin. Or something to that effect.
        Also basic economics and accounting, in high school.

        2. No, and I haven’t felt the need for it.

        3. No, although you could argue that there are enough examples in history.

        4. Not explicitly, but yes.

        5. Basic civics, yes.

        6. Not as such, but I read enough writers’ opinions.

        7. Yes.

        8. Sort of.

        But I don’t think these things necessarily need to be taught in school, especially at the most basic level, which seems to be what you have in mind. School is not meant to take in feral children and spit out functioning members of society. There is a (fully reasonable) expectation that parenting and other out-of-school learning is going to happen.

        • Jason K. says:

          “School is not meant to take in feral children and spit out functioning members of society.”

          Excepting ‘feral’, this seems to be largely what it is sold as and what a lot of people expect from it.

          I agree that it isn’t what it actually does, nor was it the original design purpose.

        • Agronomous says:

          Micawber, in David Copperfield, Chapter 12: “Annual income twenty pounds, annual expenditure nineteen pounds nineteen and six, result happiness. Annual income twenty pounds, annual expenditure twenty pounds nought and six, result misery.”

          (I could have sworn it was a Chesterton quote, to the point where I was surprised David Friedman hadn’t corrected you yet 🙂 ).

          @Jason K.:

          I agree that it isn’t what it actually does, nor was it the original design purpose.

          Paul Graham talks about how you can tell the real purpose of an institution by looking at what it makes easy and what it makes hard. He notes that it was easy to go all day without learning anything at his high school, but next to impossible to leave campus without being caught.

          • suntzuanime says:

            Paul Graham is too much in love with a pithy quote and too little in love with the truth. It’s easy to avoid getting a driver’s license at the DMV and hard to convince the pretty clerk to sleep with me, but one should not conclude that the DMV exists to keep me sexually frustrated.

      • HeelBearCub says:

        Where is the evidence that, at a population level, unschooling can actually work? N=12 has to be bound by selection effect to a ludicrously high degree.

        • You might want to look at the literature on Sudbury Valley School. N is considerably greater than twelve.

          Whether it works for most people is hard to tell, since unschooling is pretty rare at present.

          • HeelBearCub says:

            So, I see Sudbury costs $8700 per year to attend, which is really close to the $10,700 average cost per pupil of public school in the U.S.

            Pretty much blows most of Scott’s post out of the water.

            Now, I was a Montessori kid early on, as were my kids, and my mom is a Montessori pre-school teacher, so I am actually a big fan of self-directed learning. But it does require resources and guidance. And it doesn’t necessarily work for every kid.

            I watched my oldest kid just stop working when she hit fourth grade, because akrasia is a mother.

          • Ptoliporthos says:

            @HeelBearCub
            I know I’d like a 20% cut in my property taxes, perhaps it’s worth a look?

          • Glen Raphael says:

            So, I see Sudbury costs $8700 per year to attend

            The original Sudbury School has a fabulous physical location and some prestige value; tuition is $8700 for the first child per family, $7400 for the second, $6000 for each additional child. (You can find much cheaper Sudbury-type schools in other locations.)

            which is really close to the $10,700 average cost per pupil of public school in the U.S

            Er…the per-pupil cost of public school in that district (Sudbury, MA, typical for that state) is $14,246 (source).

            So even the first kid per family saves 40%. Presumably if that schooling model became more popular there’s room for economies of scale beyond that, but it seems like a good start, no?
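
            (Checking those percentages, a quick sketch; the tuition tiers are the ones just listed and $14,246 is the cited district per-pupil figure.)

            ```python
            # Savings from the listed Sudbury tuition tiers vs. the cited
            # $14,246 per-pupil cost of the local public schools.
            public_cost = 14246
            tuition_tiers = {"first child": 8700, "second child": 7400, "each additional child": 6000}

            for label, tuition in tuition_tiers.items():
                savings = 1 - tuition / public_cost
                print(f"{label}: {savings:.0%} cheaper")  # 39%, 48%, 58% respectively
            ```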

          • HeelBearCub says:

            Scott’s whole post is premised on the idea that the entire cost of public schooling and college can be recouped as GBI, which it doesn’t seem unschooling actually allows you to do.

            The cost of Sudbury also doesn’t deal with selection effect, so it’s not clear to me whether one can actually do it at that price if you try to roll it out to the whole country. Perhaps economies of scale would take effect, perhaps things would be more expensive when you start trying to unschool everyone. And that assumes that unschooling is actually effective for families who don’t “opt in” to it.

            I also am unsure if Sudbury has any sort of non-tuition based funding. For instance, do they have an endowment of any kind?

            In any case, unschooling may be less “expensive” than current public schools, but what is blown out of the water is the idea that unschooling is free.

          • Glen Raphael says:

            Scott’s whole post is premised on the idea that the entire cost of public schooling and college can be recouped as GBI, which it doesn’t seem unschooling actually allows you to do.

            True unschooling is illegal or nearly-so in many (most?) metropolitan areas due to truancy laws. The Sudbury Valley School is first and foremost a school so they have to pay the cost of dedicated staffing and real estate. If you want to send your kids to “a school” that practices unschooling and meets all the state regulatory criteria for fitting the description of a school, there aren’t a lot of options for that and the few institutions doing it face a fair number of inherent costs.

            If unschooling were more socially acceptable then people could do the exact same thing at home (or in a random neighbor’s backyard or garage or at the park…) without all the parents involved being at risk of their kids being taken away from them. That’s where we could get the other 50% savings from.

            The cost of Sudbury also doesn’t deal with selection effect

            “selection effect” could go either way – one reason parents pick private schools is that their particular kids do badly in public schools. Also, a lot of “problem kids” in the public schools aren’t a problem under Sudbury-type methods. Public school social pathologies often have to do with the weird learning environment – segregating kids by age, giving them no control over what they learn or when, expecting them to sit still and be quiet and stay in a particular place on cue, expecting them to learn at a particular pace and order. The Sudbury method plausibly scales across a wider variety of learning styles.

          • HeelBearCub says:

            @Glen Raphael:
            So now you want everyone from pre-K through college to be completely without any dedicated adult or teacher support for 8 to 10 hours per day? Because that is what your suggestion seems to actually boil down to.

          • Glen Raphael says:

            So now you want everyone from pre-K through college to be completely without any dedicated adult or teacher support for 8 to 10 hours per day?

            Heck no! I just want something vaguely resembling that option to be legally allowed. I wouldn’t inflict it on everyone. Right now it’s almost completely illegal, and I think there are probably stages in between “X is illegal” and “let’s force everyone to do X”.

            “I seek the staid, moderate middle ground between prohibition and compulsion.”

            Remember, a lot of people here – or perhaps their parents – grew up in a time when it was socially allowable to hire a local teenager to babysit your kids and it was socially allowable for kids to play in the backyard or the front yard or even the street without a grownup present and it was socially allowable for relatively little kids to ride their bikes long distances. Or get on a plane by themselves, to be met at the other end. Or ride a bus on their own. Or arrive home before their parents do and look after themselves for a couple hours (aka “latchkey kids”).

            If you relax the constraints that enforce the current model of schooling, that doesn’t mean everyone at every age ends up completely without any dedicated adult or teacher support. Rather, it seems likely to me the median result is that some kids get some unstructured and unmonitored or less-monitored time to play or read a book or learn on their own.

            Maybe you have a dedicated adult or teacher part-time, like you see them only on certain days or at certain hours of the day instead of them being there all the time. Or maybe you have older kids responsible for younger kids. Or maybe kids work with their parents or take a side job or volunteer or apprentice somewhere to learn stuff that way.

            (I still get the feeling you’re not grokking how self-directed learning works, but I’m not sure where the disconnect is.)

          • HeelBearCub says:

            Self-directed learning? I’m fairly familiar with it, as my mother is a Montessori pre-school teacher and I and my kids both attended Montessori through at least some of grade school. I have zero problem with self-directed learning, and I think it would be good for it to be more widely available.

            And I completely agree that we need our kids to have more freedom in their day generally, especially when we consider the WHOLE day, not just the school day.

            But Montessori isn’t any cheaper than bog standard US public schooling. And I’m not sure unschooling is either. For instance, do you think home unschooling would work if one parent wasn’t stay-at-home?

            In any case merely cheaper is quite a bit different than what Scott is asserting in his post. He claims that all of the cost of public schooling can be captured to be used for GBI with no resulting loss of society wide function. I find this claim extremely dubious, to put it mildly. His address told a putative entire graduating class that they gained nothing from schooling, and in fact laughed mercilessly at anyone who thought that education had helped them learn “how to think”.

            You can say he is merely asking people to “look at the tradeoffs” but his thumb is solidly on the scales. We know what Scott thinks about the proposition.

          • Glen Raphael says:

            @HeelBearCub:

            I think that there are any number of (perhaps most) people who learn better through a combination of lectures, reading, writing and testing, than through self-motivated individual study with no provided curriculum.

            It seems like you enjoy strenuously asserting things that nobody has disagreed with.

            For instance, do you think home unschooling would work if one parent wasn’t stay-at-home?

            I do think that, yes. I’m not at all convinced you need a parent nearby in order to learn – I learned just fine reading books at the library – but even if you did (for what?), why would it need to be your parent? Couldn’t any other parent in the neighborhood work just as well?

            It seems like your model of unschooling might be “parent intensively homeschools their own kid, but allows some amount of autonomy in choosing subjects.” No, figuring out what you want to learn and how you want to learn it is the whole deal. If a parent has to be home all the time to do it, that’s probably just homeschooling, not unschooling. (Not that you really need parents home all the time for that either – lots of homeschooling uses self-paced workbooks – but that’s another topic.)

            In any case merely cheaper is quite a bit different than what Scott is asserting in his post. He claims that all of the cost of public schooling can be captured to be used for GBI with no resulting loss of society wide function. I find this claim extremely dubious, to put it mildly.

            Scott does not claim this. You might want to work on your steelmanning skills; there’s some straw peeking through. Scott didn’t say with no resulting loss so if you want to assume he did, why not assume he threw a qualifier in there somewhere at the time?

            If you only think there would be SOME “resulting loss of society-wide function”, you haven’t yet disagreed with Scott. If you think there would be some and that loss would be substantial enough so as to endanger his thesis then you should (a) state that point clearly, (b) defend it.

            His address told a putative entire graduating class that they gained nothing from schooling,

            He really didn’t. He said they gained a whole year compared to a Sudbury-type approach. That’s not nothing. It’s just not enough of a something to be worth what we’re giving up.

            and in fact laughed mercilessly at anyone who thought that education had helped them learn “how to think”.

            Okay, yes, he did do that. And if you have an argument to the contrary, maybe you want to…present it? Or at least give us a hint?

          • HeelBearCub says:

            @Glen Raphael:
            David Friedman is asserting that self-directed reading is as good as lectures. That’s who I am disagreeing with in that quote.

            The reason I brought up a parent being home is that it represents, in opportunity cost, the implicit expense of home unschooling, regardless of how much or little instruction they do. Assume one parent stays home in each of the roughly 10 families whose 15 children would, at standard teacher-to-pupil ratios, otherwise share a single teacher.

            Going back to the original post I made in this OT, I’m fighting the contention that college (and perhaps all schooling) is useless. As I noted early on, this is related to both frequent comments I see about school or college here, and also the repeated assertions made about IQ being unmalleable and determinative.

            If you do not think instructor-aided education is useless for most (or all) people, then I don’t think we are in disagreement?

            If you are saying that for some children instructor-led, or even more strongly just instructor-aided, education is unnecessary, I don’t disagree with that! If the contention was “Well, school was useless for me, but maybe not for you” my stance would be quite different.

            But that isn’t the claim I see being made, nor is it the hypothetical I originally posited. As to your point about me giving a hint as to how college (or schooling in general) helps people learn how to think, you actually pulled a quote from a post where I did exactly that.

            I contend that the practicing of collecting, condensing, assessing, analyzing and summarizing information is learning how to think. It might not be enough to optimize your thinking, but that would be a different contention.

          • Glen Raphael says:

            @HeelBearCub
            I don’t think anybody knows precisely how much official formal schooling we could profitably get rid of, only that it’s quite a lot. The optimum level of that thing is so much less than what we have now that movement in the direction of less seems like a good idea, no?

            I’m fighting the contention that college (and perhaps all schooling) is useless

            When Scott claims the education system we’ve got now isn’t collectively worth what it costs that doesn’t mean it has no worth at all. Doesn’t mean it doesn’t benefit some people in some specific ways. All it means is that the costs outweigh those benefits. You can’t defeat a cost-benefit argument by saying “but there are some benefits; it’s not useless“! Given the mindbogglingly large amounts of time and money we spend on schooling it’d be astounding if you couldn’t point to anything that in isolation looked like a benefit, but what matters is the net benefit, the benefit in excess of the costs. I didn’t see costs even mentioned in what you’ve been saying so far.

            Revisiting this quote:

            I think that there are any number of (perhaps most) people who learn better through a combination of lectures, reading, writing and testing, than through self-motivated individual study with no provided curriculum.

            It’s true as far as it goes, but you’re stacking the deck if you think it’s relevant to the current argument. One problem is the false dichotomy. There’s nothing that prevents individual study from employing reading, writing, testing and, heck, even lectures. My own self-directed study efforts have included all those things! There’s also nothing stopping individual study from including other-motivated elements, or finding and making use of an existing curriculum.

            The difference is that the self-learner *chose* what lectures and curricula they wanted to follow rather than having this inflicted upon them by outside agents. And in doing so – in figuring out what resources to use to meet their goals – the self-learner is likely to learn how to learn at a deeper level than someone who has to wait and be told by others what their learning plan is.

            So if you tell the self-studier they can’t use an existing curriculum or tests or writing or…whatever seems useful to achieve their goals, you’re asking them to learn with a hand tied behind their back.

            Another small problem with your quote is that phrase “learn better” which prompts the question “learn what better, and what do we mean by better?” There’s no argument that if you already know exactly what specific thing you want to learn it could be useful to have somebody come along and tell you how to learn that and hold your hand while you do. You might learn that specific thing faster. But while you did, you wouldn’t be practicing the skill of learning how to learn as much as if you figured out how to learn it yourself. And you might not be learning the right thing!

            Figuring out what’s worth learning and figuring out how to learn things on your own can be frustrating and slow. It might be easier to just have a bunch of assorted info spoon-fed to you. But frustrating and slow and difficult things are worth doing too. Like exercising a muscle.

          • In California, home schooling is effectively unregulated, so you can unschool your own kids pretty easily.

            So far as selection effects, my impression in the very small Sudbury model school that our kids went to for a while was that a fair number of the kids were ones who had problems in, and were probably problems for, the public school.

          • John Schilling says:

            For instance, do you think home unschooling would work if one parent wasn’t stay-at-home?

            I do think that, yes. I’m not at all convinced you need a parent nearby in order to learn […] but even if you did (for what?), why would it need to be your parent? Couldn’t any other parent in the neighborhood work just as well?

            Most children are highly motivated to not disappoint their own parents, and if possible to gain their express or at least tacit approval. Random adults from the neighborhood, not so much.

            I’d wager this would substantially affect the way an unschooled child chooses to spend their time, and it would be an interesting experiment to conduct. Preferably for someone with more free time and accessible test subjects than me.

          • HeelBearCub says:

            @Glen Raphael:
            If you took anything I said to mean that I was disparaging your education, or the education of any individual who went to Sudbury or did some other form of unschooling, I apologize. This was not my intent. I am only attempting to speak to population level effects. I’m in favor of a variety of schooling methods, matched to the children, “Follow the child.”

            I will note that this argument works the other way, as I found my traditional schooling to be quite valuable, and observed the same in my children.

            I don’t think anybody knows precisely how much official formal schooling we could profitably get rid of, only that it’s quite a lot.

            I am asking for evidence of this at a population level. You seem to be asserting this without evidence. I have, via the cost of Sudbury and the opportunity costs of parents, shown why it is naive to assume that unschooling is free or near free or even a great deal less expensive than traditional schooling. Perhaps one can prove that there is a great deal of savings, but you are the one asserting this, so it is on you to prove.

            The question is, if you take people in the middle of the bell curve of “traditional schooling is a good match for them” and force them into unschooling, how effective will that be? Right now, selection effect should predispose us to think that people in unschooling are dominated by those on the tail end of that bell curve. Would we get good educational outcomes if we populated an entire cohort of people from the middle of the bell curve into unschooling? Not one or two among a cohort of those not suited to traditional school, but a cohort dominated by those who are of average suitability for traditional school.

            As to comments about “straw peeking through” and the like, I don’t think that improves the discourse.

    • eh says:

      I tentatively believe that education is better regarded as something that happens naturally while doing useful or fun things, than as a big chunk of life during which you can’t do anything else. I’d be happy to teach children to read, but as an instrumental value for getting information and enjoyment from books. I’d happily subject 8 year olds to lessons on trigonometry and programming for the purposes of making a game, or drive them to sports if they wanted to play them, or anything else that’s both productive and something they actually want to do.

      Most of my childhood was spent sitting at a desk surreptitiously trying to read interesting books while the teacher pretended I was doing actual work, interspersed with a series of increasingly violent and increasingly fun games of football at lunch and recess, bracketed on both ends by daycare. I was one of the smartest kids in my class, so I had the luxury of stolen time, and I had parents with interesting hobbies who encouraged me in mine, so school didn’t totally ruin my education, but for most others the joy of learning was completely killed by the process of formal education. I would imagine many SSC readers would have had a very similar experience, and might have drawn similar conclusions to me. Studies previously posted here and elsewhere bolster this view.

      Kids aren’t much more stupid than adults. Why, when you learn something as an adult, is the process so different from learning as a child or a university student?

    • Theo Jones says:

      It’s not that education is bad. It’s that there are good and bad reasons to get a college degree.
      Reason 1, the education provides a skill set that is useful to you in the future
      Reason 2, the education provides a way to signal to the world that you are the type of person who has a college degree

      Reason 1 improves human capital and is generally good (as long as the value provided by the knowledge exceeds the cost of the education). Reason 2 gets society a $50,000 white elephant.

    • Glen Raphael says:

      I really find the idea of unschooling appealing. Ideally, we’d let children learn stuff as and when it becomes interesting to them rather than inflicting a particular planned curriculum upon them. And sure, if a few kids happen to do especially well with traditional guided sit-at-a-desk-and-be-talked-at style instruction we can keep some of that around, but I wouldn’t be inclined to make that the default.

      If we could somehow separate school and state completely – get a real free market in education – I’d expect to see lots of different approaches to teaching. Good ideas could become standardized, turn into franchises. Like McDonalds, you might see a branded local branch of a national chain providing a cheap standardized education product. That’d be another way to go.

      • Guy says:

        What does the private school sell? To whom are they selling? It seems unlikely that businesses will pay schools for their graduates, which means that the schools will almost certainly be paid by the parents of the students. This, then, means that schools will sell “these employers will hire our graduates”. This will be the same as “our graduates are the best possible employees” only insofar as it is cheaper to actually train people than to provide something that looks like training, e.g., a degree in “general studies” or some such thing.

        Education that works is a hard problem because consequences are largely decoupled from payment. Payment happens first, and outcomes are so uncertain and so hard to measure that evaluating them isn’t worthwhile for any particular potential buyer; in any case, the outcome is very far removed in time from the actual payment. Education is therefore paid for by people other than the eventual direct beneficiary, regardless of the surrounding structures.

        Apprenticeships are the exception, but apprenticeships are largely incompatible with unschooling because of the necessary commitment on the part of the apprentice.

    • keranih says:

      So, at what point does education cross-over from useful to useless?

      At the point where the information and patterns established in the student by the process become counter-productive to the student and/or their society.

      To judge that point, we have to establish what patterns are actually being learnt, and which of these are nonproductive. On top of that, we need to decide if bad-for-society-but-good-for-the-student outcomes are okay, perhaps +/- public funding. We are also going to run into the problem where student a and student b go through the same education process and one turns out a decent citizen and one becomes a crook.

      And then we need to take action to remove the student from the “bad” education process. All of which takes a lot of work, deep knowledge of the specific student, and imperial power over the student.

      If only there were one or two independent adults we could trust to monitor the development of every student, really getting to know that individual, and watching how they respond to different stimuli, and making changes in the education process when the student starts to go astray…

    • Dr Dealgood says:

      I’m part of the cohort I guess, so here goes:

      Higher education is useful pretty much to the extent that it is an apprenticeship. So the only useful part of undergraduate school is in setting up internships, as well as meeting the requirements for graduate and professional schools which are much more useful. Presumably trade schools and the like would also be useful.

      Foundational knowledge is important but, frankly, if you’re the kind of person who can learn from a college or high school lecture you’re also the kind of person who could have picked it up on your own time. Teach everyone the three R’s by the time they hit puberty, then let the kids work as unpaid labor in their chosen field for a decade before you ask for a thesis or masterpiece.

      • John Schilling says:

        you’re the kind of person who can learn from a college or high school lecture you’re also the kind of person who could have picked it up on your own time

        I strongly disagree with this, and see it as a sort of autodidactic snobbery.

        Some people will, given access to a library, acquire a college-level education on their own time. Other people won’t. I would guess that about two-thirds of the people who are capable of acquiring this level of education, will require more than just time and books (or the internet) to do so.

        Some people are wired to learn by hearing or doing, far better than they can by seeing or reading.

        Some people need the formal structure of lectures TTh at 10:30 and homework every week, to avoid endless distraction and procrastination.

        Some people need the social support of a peer group that is going through the same thing, even if only passively but even better if they study (or just gripe) together after.

        Some people will get hung up on, frustrated, and eventually give up over some small stumbling block that could be resolved in a minute or two with a bit of Q&A with an expert.

        And some other people can dismiss all of these people as “not the kind of person who can learn from a college lecture”. Except, if you put them in college lectures, they learn.

        • So far as the “make sure you do it” element of lectures, one could have a course taught from a book with a quiz every week or so which would accomplish the same thing.

          When Betty (now my wife) went to Oberlin, she took a calculus class that met at eight in the morning but had unit tests which you could take at any time. She stayed two weeks ahead of the class, verified by the tests, with the incentive of not having to get up early to go to class.

          • HeelBearCub says:

            This seems very much like “one size fits all” which is supposedly what you are fighting against.

          • I’m not arguing that unschooling is best for all kids, only for some.

          • HeelBearCub says:

            Certainly some number n of students benefit from unschooling.

            But the argument I see being made either explicitly or implicitly is that n is close to the size of the student population, which is a quite different proposition.

          • Skivverus says:

            @HeelBearCub

            An alternative, more charitable take on that: “n is certainly smaller than the size of the population, but it’s also significantly larger than [American] society currently gives it credit/money for.”

          • Anonymous says:

            I don’t think we can afford bespoke, artisanal, hand crafted educations for all 70 some odd million children in the US. At least not until the moshiach comes / Gutterdämmerung / The Singularity / automation makes work obsolete. Unless the N is big enough and/or clustered enough the current homeschooling exceptions will have to be a sufficient safety valve.

            Frankly I don’t see the problem. David Friedman doesn’t see the need for lectures or classes and claims he can learn everything he needs to learn out of books. Looking at his CV it appears he was able to secure professorial appointments without ever formally studying the subjects he teaches and writes about. So the system worked, no?

          • HeelBearCub says:

            @Skivverus:

            “College and HS are worthless warehousing of young people” is very different than “perhaps more self direction in all education would be a good thing and 20% of people might be able to school almost completely self-directed”.

            If people were making the second argument, I would find no fault in it, except that I would want to know why they thought 20% was the number.

          • “I don’t think we can afford bespoke, artisanal, hand crafted educations for all 70 some odd million children in the US.”

            Unschooling takes less teacher time than the conventional method, not more. The hand crafting is being done by the student. All the teacher (parent) is doing is pointing the student at resources for exploring what he is interested in–and a lot of that is done by the student or other students.

            “So the system worked, no?”

            The system wasted about a thousand hours a year of my time for ten years, sitting in a classroom being bored. Not my definition of working. It didn’t prevent me from getting an education, but it only occasionally helped the process.

            Somewhat less true of college and graduate school.

          • “College and HS are worthless warehousing of young people”

            The argument for that claim is the evidence that a large fraction of those who go to college don’t know significantly more coming out than going in. That doesn’t tell us whether an alternative approach would work better.

            The arguments for unschooling, at least mine, are:

            1. Arguments about what is wrong with the present K-12 system–that it selects from the universe of knowledge a largely arbitrary subset just about large enough to fill K-12 and insists on everyone at least pretending to learn that subset, and that it teaches in a way which has serious problems.

            2. The claim that for at least some kids, there is a much better alternative.

          • Anonymous says:

            Unschooling takes less than 65 typical teacher/parent hours a year? Color me skeptical.

            65 ≈ 180 school days × 6 hours per day ÷ 16.6 pupils per teacher
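
            (A one-line check of that estimate; the 180-day year, 6-hour day, and 16.6 pupil-teacher ratio are the assumptions the figure seems to rest on.)

            ```python
            # Conventional teacher-hours available per pupil per year,
            # under the assumptions implied in the comment above.
            school_days = 180          # typical US school year
            hours_per_day = 6          # classroom hours per day
            pupils_per_teacher = 16.6  # approximate US pupil-teacher ratio

            print(school_days * hours_per_day / pupils_per_teacher)  # ≈ 65 hours per pupil
            ```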

    • I guess I’ll speak up and say that I, at least, found my undergraduate to be tremendously useful (I did a degree in physics, and I’m doing a PhD in physics right now). I learned a lot from my degree, and not just in physics – I came out of it a much better problem solver and a much better thinker. I think there’s a “trial by fire” component to a rigorous STEM degree that’s hard to replicate through self-teaching. It’s one thing to try to figure something out because you want to; it’s something else to have to figure it out, because you have an assignment due in five hours that needs to be done (and then three more assignments due the next day after that).

      I’m guessing people who are negative on universities would say that I’m an exception, and that most people don’t have that kind of experience with their undergrad degrees – that most degrees don’t teach much of anything, that they only exist for signalling purposes, that they’re completely useless. And fair enough, maybe they are. But even if I’m an outlier, I do exist, and so do a bunch of people who actually got something from their degree.

      (You could also say that I got something from my degree, but that it wasn’t anything useful – that I learned only esoteric skills and knowledge that are great for an ivory tower but of little use in the real world. Why should the government subsidize me learning about quantum fields and Lorentz transformations and all that just to satisfy my weird curiosity? I don’t think this is true, though – I really did come out of my degree a better thinker, and post-degree me would be immensely more useful to an employer than pre-degree me)

      • BRST says:

        I second this comment. I also did physics and math, and I think a lot of the value came from putting me together with other hard-working, intelligent people, and forcing us to do more work than was comfortable. (I slept so little my second year…)

        Ok, but that could be done without lectures. And when my lectures were worthless, I didn’t go! I made notes from my textbook, handed in homework and showed up for tests. But other lecturers really were excellent, and added a lot of their thought processes and personal insight.

        In addition, I definitely think it made me a better thinker. When I think back to the type of problems I had difficulty with my first and second year in undergrad, it appalls me that I could have ever found them difficult. I don’t think this is just domain knowledge; the methods of reasoning became far more familiar.

        I suspect many people who study theoretical math have a similar experience where it ‘clicks’, and from that point onwards, proofs just make sense (this happened to me early in my third year). The mathematical style of reasoning has been incorporated into your own reasoning. As I grade undergrads, as a TA, I have to remind myself that the mathematical style of reasoning does not really come naturally, and has to be built.

        • “But other lecturers really were excellent, and added a lot of their thought processes and personal insight. ”

          Why couldn’t a book do the same thing? You have the option of using the best book on the subject ever written. You don’t have the option of a lecture by the best lecturer in the world.

          • Stan le Knave says:

            But face-to-face communication is a much more information-dense form of communication. And a relationship with the person you are communicating with allows them to tailor the delivery of information to you.

            In trivial terms, you can’t ask a book a question.

            I may be biased as I studied a very niche subject with a lot of small class size lectures.

          • HeelBearCub says:

            @David Friedman:

            Why couldn’t a book do the same thing?

            Honestly, why would you expect them to do the same thing?

            Have you ever tried to butter bread with a paring knife? Have you tried to peel an apple with a butter knife?

            When it is something very simple, people seem to intrinsically understand that small differences in characteristics can make for really big differences in outcomes. But when we start talking about education, all of a sudden things as widely divergent as books and lectures are considered perfect substitutes.

          • BRST says:

            A few small points: because people think about things differently/learn differently, there is no ‘best’ book in the world on a subject. There will be a collection of highly regarded books with different perspectives and levels of sophistication. Analogously, there is no best lecturer in the world; so I could be reading a high-quality textbook with one perspective and hearing a high-quality lecture with a different perspective. In addition, writing a book takes a lot of time, so there may very well be plenty of lecturers who are very good and have a unique perspective but haven’t bothered to write a book yet.

            But more generally… I’m not sure why, but it’s just not the same. A good lecture generally helps you ‘get it’ faster. The two types of learning really aren’t equivalent.

            A quick note: all of this relates to subjects where the concepts are difficult to grasp. This may just be my arrogance showing through, but there very well may be no point to some Biology lectures. If a subject, at least at a certain level, is purely memorization of facts and structures, then lectures are probably equivalent to, or worse than, a good textbook.

          • “But face-to-face communication is a much more information-dense form of communication.”

            As I think I have already said, my puzzle isn’t the survival of small classes with lots of interaction but the survival of large lecture classes with almost no interaction–a few students out of hundreds asking questions that get responded to. From the standpoint of all the students who don’t ask a question, a book that included responses to the questions the author had found students ask would provide the same level of interaction.

          • “so I could be reading a high quality textbook with one perspective, and hearing a high quality lecture with a different perspective.”

            Or you could read two books with different perspectives.

            What you are ignoring is that a high quality lecturer means, at best, the best lecturer in your university–say one of the thousand best in the world if you are lucky. A high quality book can mean the best book on the subject that has ever been written.

            In addition, the lecture has to go at a rate that almost everyone in the audience can follow. With a book, you can skim rapidly over things you already understand, read things you find difficult three times over. In effect, the book reading is tailored to the individual student, the lecture is not.

          • smocc says:

            I wonder if we are having a slight problem of talking about different things.

            Does “traditional lecture” mean a professor with a huge audience and a totally prepared presentation? Or does it mean a professor with <30 students who stops and takes questions?

            My college experience (physics major / math minor, early 2010s) makes me think that the former format has not survived. The only classes I took with very large lectures were general ed overview or introductory courses. That means that large lecture courses were significantly less than a quarter of my college education. Admittedly, I remember almost nothing from most of them.

            The other non-lab classes I took (the great majority!) were either discussion or lecture based with smaller classes where you could stop and ask questions or argue with the teacher / your colleagues. I found this very helpful, and not something that could be replaced by a book.

            My physics department was especially zealous in trying different formats and techniques, e.g. electronic clicker participation stuff and “flipped” classrooms.

            I did go out of my way to avoid large lecture courses when fulfilling my non-major requirements, and someone who didn’t take as much care as me might have ended up in more lectures. But I think it’s important to make a distinction between the large lecture format and classroom education in general. They can be very different.

          • keranih says:

            My college experience makes me think that the former format [a professor with a huge audience and a totally prepared presentation] has not survived.

            From a first professional degree in the 2000s, and then a master’s in the last five years – the format is still alive and thriving.

            It is true that most of the courses taught with this method (and I was one of those who would interrupt with stupid questions and ask the instructor to back up and repeat when I didn’t get it, for which I was thanked repeatedly by students who were more shy/less resigned to being thought an idiot) were basic intro courses, and yes, I cut a lot of them.

            But I wish I hadn’t. There was information to be learned there and I just refused to accept the pain required to get it. In later, smaller, more hands-on courses (and exams, and practical work) the skimming of earlier courses has come back to bite me.

            In a more ideal world, the book/lecture learning would have been melded with practical work to the extent that I (I-me, not I-any student) would have willingly dug into the lecture prep so as to be able to master the cool practical stuff faster. This would, however, have required instructors who were able to handle multiple students at varying levels of basic knowledge, each progressing at different rates, and all with the patience of Job.

            The only classes I took with very large lectures were general ed overview or introductory courses. That means that large lecture courses were significantly less than a quarter of my college education. Admittedly, I remember almost nothing from most of them.

            I think that for the time/money trade offs mass lectures serve enough students well enough that we are stuck with them. (Like, say, having people on assembly lines in order to have lots of cheap autos.) But lordy do they ever suck.

          • HeelBearCub says:

            At least some of the large lecture classes I took also included much smaller TA-led sessions of about 20 to 30 students. Does that not happen anymore?

            @David Friedman:
            I wish you would respond to my point above about the fact that lectures and books cannot be considered substitutes for each other.

          • “Does “traditional lecture” mean a professor with a huge audience and a totally prepared presentation?”

            My puzzle is why that sort of lecture continues to be given. But I include the case where the presentation is prepared but the lecturer takes an occasional question from the audience.

          • smocc says:

            @Keranih

            I think that for the time/money trade offs mass lectures serve enough students well enough that we are stuck with them. (Like, say, having people on assembly lines in order to have lots of cheap autos.) But lordy do they ever suck.

            Agreed. Unless or until MOOCs take over, though I personally hope they don’t.

          • Paul Brinkley says:

            I increasingly suspect that the answer to David Friedman’s implied question – what explains the continued presence of mass lectures today – is that the obvious replacements, books written by the foremost experts of the respective fields, do not in fact exist yet, for a great many of the mass lecturing topics.

            If true, this implies that book writing is harder than first assumed, or that the need for such books and the ease of providing them has not yet reached critical public awareness. Pursuant to the latter, I notice that the WWW is only about 20 years old, and the first purely educational websites aimed at a general audience are necessarily younger. Meanwhile, high-speed internet penetration in the US is only about 75%.

          • “books written by the foremost experts of the respective fields”

            I don’t think the author has to be one of the foremost experts, any more than a lecturer has to be. It has to be someone who understands the subject and is a good writer.

    • dndnrsn says:

      There seem to be two problems with university (when I say university, I mean 3/4 year undergraduate stuff and up) and two with high schools.

      Universities, first, face the problem that most programs (honestly, most stuff that isn’t applied sciences) either prepare you for further studies (I know a history bachelor’s isn’t going to prepare you to go be a historian; the same is true for the social sciences, and I imagine it’s true for many sciences besides comp sci and engineering – and weren’t the original universities basically vocational training for priests?) or are a way for the affluent to become more interesting and make connections. But now universities have become, in large part, a way to establish that a person is basically literate and competent enough to handle white-collar work of some variety or another. There’s also this idea that they’re a means of social mobility: “the well-off have university degrees, therefore, someone who gets a university degree will become well off” is fallacious – but might have been true for a while, before a quarter or whatever it is of the population went to university. But now you’ve got people taking on heavy debts for something that isn’t very useful.

      Second, as a result of the above, a lot of people who for one reason or another aren’t suited to university end up there anyway. Maybe they’re just not bright enough, and the result is a growth of less-demanding programs and a lowering of standards in those programs. Maybe they just don’t have the right temperament, but have been thrust into university because it’s expected that everyone of a certain level of intelligence, certain social class, whatever, has to go to university, otherwise they bring shame upon their family. Both kinds of people benefit less from university, would probably be better off doing something else, and are probably paying a decent sum for the experience.

      With regards to high schools, the first problem is that having the same standards across the board for everyone doesn’t really seem to work. Some people are better at some things than others, and some people are smarter than others. Trying to ignore this leads to wackiness like trying to legislate that all children be above average. If a kid is just not that smart overall, or is just not good at a certain subject, it’s inhumane and unfair to blame and punish them/their parents/their teachers/their school administrators/their neighbourhood for their failure to pass grade 11 math or whatever. And I say this as someone who took the easier of two options for grade 11 math.

      Second, as people have noted, there are a lot of important life skills that high schools don’t teach. Everyone should know how to cook a meal and deal with their finances.

      • HeelBearCub says:

        “But now you’ve got people taking on heavy debts for something that isn’t very useful.”

        Find me some non-fringe examples of companies exploiting this to make a profit. By which I mean, if university is not useful or only a very little bit useful, companies should be able to hire HS graduates at a significant discount over their college graduate peers.

        We should see companies competing with colleges for the HS graduates who have the least to gain from going to college. Outside of the military, is there anyone who does this in any appreciable numbers?

        • dndnrsn says:

          This is a good point. Thoughts:

          A. Perhaps the employers haven’t caught on yet. For the longest time, a university degree really did prove someone was probably smarter and more competent than someone without.

          B. Perhaps there’s an advantage in perception: it looks better to clients if all your desk jockeys have BAs. Again, the market hasn’t caught on yet, in one way or another.

          C. People with heavy student loans are desperate enough to work for not much more than a HS grad would get for the same job.

          • HeelBearCub says:

            A. Does not comport at all with university being worthless for very many people.

            B. The vast bulk of people employed, especially college grads, have no direct contact with the public. Even when they do, can you quantify how many contacts with external company employees you have encountered where you knew whether they even attended college, let alone graduated? This seems to require a great deal of work to prove, and doesn’t seem to meet the available evidence.

            C. This also does not comport with college providing no value. You would have to think that the market price for a college grad was equal to or below that of a HS grad, due to their desperation. But then the HS grad should happily accept the job at the college grad wage and not take on the debt. Again, the market competition should exist.

            The only way you get around this is if we say the minimum wage is interfering, but then still the HS grad should happily take the job at the same minimum wage, but fewer benefits.

            All of this is confounded by the fact that there are many professions where 4-year college is not the route to employment. Certified Medical Assistant requires only one year of post-secondary education, as just one example.

          • dndnrsn says:

            Honestly, you might be right, and I’ll have to update my opinions on the value of a university education accordingly.

            If a university education is still worth it, though, why all the fuss about student debt, and so forth? Anecdotally, I know a lot of people with degrees from good universities who have a hard time finding a decent job.

            Is it just a lack of decent jobs, in general, then?

          • Anonymous says:

            It’s a form of B: ass-covering. If a guy hires a high school graduate and then something goes to hell with that guy, that currently gives management a hook to blame him for the hire. If instead he hires a BA in gender studies and this person shockingly turns out to be a mental invalid, the hirer “did everything right” and isn’t to blame — even though in practical fact the outcome was much more predictable in the second case. Therefore people in charge of hiring default to sorting out people without a degree because it’s a “risk”.

            (There’s probably also more than a little class bias in this, where middle class people cement their status position by preventing those who couldn’t afford the entry ticket from obtaining well-paying and/or prestigious jobs, but the ass-covering explanation is sufficient, and HBC seems to be arguing from a perspective of not believing in economic irrationality for the purposes of this discussion)

          • Garrett says:

            My understanding is that Once Upon A Time prospective employers would have applicants go through their own testing process (frequently an IQ test). This was fast and cheap. It also had racially disparate results. This may-or-may-not have been the goal. The result is that this type of testing was struck down by the Courts, unless it can be shown to be directly related to the job at hand. E.g., someone applying for a computer programming job may be given a programming skills test, because the test is evaluating skills or requirements specific to the job. It’s nearly impossible to show that you can only be effective at a job if your IQ is 110, but not 109.

            However, it is still legal to reject applicants who don’t have a degree. So having a degree is being used as a proxy for intelligence, which is being used as a proxy for likely-to-not-be-a-complete-waste-of-my-time.

          • HeelBearCub says:

            @dndnrsn:

            Honestly, you might be right, and I’ll have to update my opinions on the value of a university education accordingly.

            This is the highest honor possible to bestow in the comments of SSC. I am humbled.

            If a university education is still worth it, though, why all the fuss about student debt, and so forth?

            Two hypotheses spring immediately to mind:
            1. Expected return on investment vs. actual return on investment. Especially in regard to the phenomenon wherein losing something you “have” is more painful than not getting something you don’t have.
            2. Trend lines. College debt, if it continues on current trend lines, will stop being worth it. Wage stagnation, even real wage loss, if it continues on present trend lines will reduce us to [insert some perceived bad state]. Trend lines of course rarely do continue, but it feels like they will.

            And your point about the job market is certainly true as well. This is mostly occurring post-2008, which, we should not forget, was really, really bad, economically speaking. The worst in 80 years. So, the amount of general economic angst should be expected to be pretty high, especially among all those current college grads who are now competing with all of the PhD grads who wouldn’t be PhDs except for 2008.

          • Vox Imperatoris says:

            @ HeelBearCub:

            For what it’s worth, I just wanted to chime in that I agree with you as well. It would contradict everything in economics for everyone to be scrambling toward college education without its being a good deal for those who get it.

          • HeelBearCub says:

            @Garrett:
            I can still list my HS, my GPA, even my SAT scores on my resume, all of which I am assured are also good indicators of IQ.

          • HeelBearCub says:

            @Vox:
            Well, that isn’t really my point here. It is the same argument from the opposite side. It would contradict what we know about economics for employers to fail to compete for HS grads if they were as skilled as college grads.

            I mean, I think it could be shown they are intrinsically related to each other, but different sides of the same coin. But, for example, in the event of any pointless, credentialist regulatory capture, one could easily see the training being “worth it” without imparting much, if any, true skill.

          • dndnrsn says:

            A thought: even if the value of a university degree is based on irrational decisions (in status-seeking, ass-covering, or due to a prohibition on better ways of choosing employees), that’s still value.

            Additionally, I know a lot of people (yeah, more anecdotal evidence) who instead of seeking a job out of undergrad pursue further degrees in the hope of being more employable and/or in the hope that the job market will get better.

          • Edward Scizorhands says:

            However, it is still legal to reject applicants who don’t have a degree

            Only because no one has brought it to court. Read the text of Griggs:

            The facts of this case demonstrate the inadequacy of broad and general testing devices, as well as the infirmity of using diplomas or degrees as fixed measures of capability. History is filled with examples of men and women who rendered highly effective performance without the conventional badges of accomplishment in terms of certificates, diplomas, or degrees. Diplomas and tests are useful servants, but Congress has mandated the common sense proposition that they are not to become masters of reality.

            We need someone to push the issue in court, at which point SCOTUS precedent is clear.

          • Watercressed says:

            The universities might have enough affirmative action to make requiring a degree okay.

        • Emily says:

          Looking at the example of the military may give some insight into why this isn’t a common model. First, the military has done a lot of research into who is successful in the military. (It’s definitely not HS graduates who have the least to gain from going to college. They are recruiting mostly mid-tier high school graduates.) They’re good at it. For other organizations which may be less good at it, it may make more sense to just use education to sort candidates. Second, the military trains the heck out of their recruits both to acclimate them to the service and to their particular career field. I’d imagine that beginning extended, intensive training programs for new employees would be daunting. Third, the military is willing to invest in training candidates in part because those candidates sign contracts that make it difficult for them to leave for some number of years. That’s not a common employment model. I don’t think candidates for other types of jobs would be as willing to sign those kinds of contracts, and I’m not sure you would be able to make them as legally binding.

          • HeelBearCub says:

            All of which matches the idea that attending college does indeed improve the skillset of those who attend.

            The military spends massive amounts on education. What does that suggest?

          • Nornagest says:

            The military spends enormous amounts of money on training. Four-year college degrees consist largely of job training for a few majors (engineering, pre-med, journalism for the tiny fraction that manage to land a job in the field), but that is not the norm. And even within those majors, there are usually breadth requirements or elective classes that take up a lot of the student’s time.

            Associate degrees are almost entirely job training, but they’re not what most people are thinking of when they say college education.

          • HeelBearCub says:

            You say “training” I say “education”, either, either, neither, neither, let’s call the whole thing off.

            Employers clearly value whatever it is that colleges and universities impart and aren’t willing to spend the time to impart that themselves. The military is a good example of what it looks like to employ people directly out of HS. Learning various trades might also qualify.

            So, given that employers aren’t employing people straight out of HS in lieu of college grads, what is college imparting?

          • keranih says:

            A college degree certifies that the applicant was rigorously screened by the college application process and successfully sat through 4+ years of being directed towards information and producing reports/assignments on that information to the specifications given by various instructors.

            All this for *free* to the company doing the hiring, in a labor law environment with a hard floor on how much a non-college student’s value can be discounted.

            (You need a pickup truck. Your options are either a Peterbilt semi or a bicycle. In many situations, the bike could be free and you still couldn’t make the math work to take it.)

          • For what it’s worth, I just had a medical test done and chatted with the technician. She had not gone to a regular college but had taken a four-year course in her specialty. Obviously she was employed.

            So that’s one alternative that apparently does work in some fields–replace the liberal arts degree with some sort of professional qualification reflecting the fact that you have been trained for a particular profession.

          • Nornagest says:

            Employers clearly value whatever it is that colleges and universities impart and aren’t willing to spend the time to impart that themselves. […] So, given that employers aren’t employing people straight out of HS in lieu of college grads, what is college imparting?

            The whole point of the signaling model is that it doesn’t need to be imparting anything as long as it correlates well with traits the employer finds valuable: probably some combination of intelligence, conscientiousness, ability to follow directions, and middle or high SES and attendant cultural markers.

            You could probably design a hiring system that extracted the same information from a high school graduate’s public data (transcript, SAT scores, etc). I bet you could even do it without running afoul of the disparate-impact laws that right-wingers like to complain about. But that would take more work, and employers have no reason not to just hire college grads.
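
            A minimal sketch, in Python, of what such a screen might look like; every field, weight, and threshold here is invented for illustration, not an actual hiring rule:

            # Toy screen built only from a HS graduate's public data.
            # All fields, weights, and the threshold are invented assumptions.
            def screen(gpa, sat, attendance_rate):
                # Normalize each input to roughly [0, 1], then combine.
                return 0.5 * (gpa / 4.0) + 0.3 * (sat / 1600) + 0.2 * attendance_rate

            print(screen(gpa=3.6, sat=1350, attendance_rate=0.97) >= 0.8)  # True: interview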

          • “producing reports/assignments on that information to the specifications given by various instructors.”

            There’s a good bit of cheating in college, though I don’t have a feeling for what proportion of college students cheated so much that their degree is meaningless.

          • HeelBearCub says:

            @Nornagest: (whoops misidentification)

            But that would take more work, and employers have no reason not to just hire college grads.

            You appear to be ignoring the cost of the good as a factor in micro-economic decisions.

            The reason to go through the additional work of screening HS grads is that their cost as a good is much, much less. They will happily be employed for much less.

            Either we are only talking about the lowest-rung employees, whose wages and benefits are set by things like minimum wage laws, which does not describe most college-graduate hires, or the idea that it is not “worth it” for employers to hire HS grads in lieu of them going to college needs far more support than you are giving it.

          • Nornagest says:

            Is that actually true, though? Can you get full-time, permanent work from high school graduates as smart and conscientious as you can get college grads, in this labor market, by paying them less? And can you get enough of it to make it worth the trouble of setting up a new, totally untested hiring process? There’s internships, but they’re rarely full-time or permanent, and I don’t know how much the kinds of work that interns are assigned actually contribute to the bottom line.

            (Well, I know how much they do in tech, which is “not very much”: it’s more a form of charity that sometimes makes coffee and might be hired to do useful work someday. I don’t know how it works in other industries.)

            To make working right out of high school attractive to college-bound high school graduates, here and now, you need to make it competitive with the college path — and probably not just to the students themselves, but to their parents, who at this point in life usually have a great deal of influence over their kids’ decisions even if they’re formally adults. It seems to me that this implies salaries within spitting distance of conventional entry level, even if not quite as high; at the margins, you might be able to wedge off a few with less, but at the margins you get marginal people.

            Remember also that salary is only a fraction of total cost of employment.

          • HeelBearCub says:

            @Nornagest:

            can you get full-time, permanent work from high school graduates as smart and conscientious as you can get college grads

            That is my whole point, restated.

            The fact that we don’t see these hires means that college actually has a substantial effect, not merely on status, but on the skill set of the graduates. College-bound HS graduates are not being recruited in lieu of college graduates, meaning that college has had a positive effect on these students.

            As to your point about career path, 4 years of experience plus four years of salary plus no college debt should look pretty attractive to everyone if, in fact, college is not doing anything for these students.

          • It seems to me there is also some defect-defect PD going on here too. College-bound students could shirk the college degree program to get the salary plus training instead of debt, roughly balancing things out, but they would lose out to the other students who did go the college route, because that is who employers will currently hire. Employers could hire college-bound students with that attractive offer and come out ahead on net, but they’ll defect and hire college grads instead, because those are better.
            (Note: I think this still holds even if college degrees turn out to be worthless or even negative in real knowledge/skills, so long as both sides think they have value. It just changes what the offer for would-be-college-bound hires should be.)

          • Nornagest says:

            The fact that we don’t see these hires mean that college actually has a substantial effect, not merely in status, but on the skill set of the graduates.

            No, it doesn’t. College doesn’t need to be adding any relevant skills whatsoever for everything in my last post to be true. It does need to be a good proxy for certain desirable attributes (smart and conscientious are not skills), but that isn’t the same thing.

            I don’t think you can get equally good workers at scale by hiring high school grads cheaply. But that isn’t because there are no competent high school grads, it’s because the social norm for competent high school students (that don’t want to enter the trades or a family business) is to go to college. If you’re asking for long-term, professional work from those students in lieu of college, you’re asking them to buck the norm, which means you need to get their attention — probably by offering near what their starting salary would be afterwards, if not more.

          • Adam says:

            There has to be some middle ground here. I didn’t find college useless, at least not all of it. Spending a great deal of time putting together and presenting detailed arguments and research reports, formulating problems in abstract, mathematical terms and then finding ways to solve them using familiar algorithms and design patterns, even learning how to use specific toolsets to implement assignment requirements, are all things I’ve had to do professionally as well. Maybe the modal student really does just spend three years drinking and going to frat parties, but I didn’t and neither did any of my friends.

            On the other hand, it was a pretty wasteful and cost-ineffective way to do this, I definitely could have taught myself much of what I learned, and I could have done without things like the mandatory by law poli sci 101 class that the professor bragged only existed because he and his friends knew how to lobby the state. I also can’t see a good reason we needed to be providing a free farm league to professional sports associations.

            For whatever reason, people badly overplay the ‘haha queer studies card,’ though. The DOE tracks this stuff. Overwhelmingly the most popular degree paths are business and healthcare services. The majority of students are a lot more practical than they get credit for.

          • The Nybbler says:

            Adam, “business” is first but second is “social sciences and history”, although to some extent this depends on how you break it down and different sources (from the same agency, even) break it down differently.

            https://nces.ed.gov/fastfacts/display.asp?id=37

            Which degrees are “practical” is hard to figure out. I don’t think either a bachelor’s level psychology degree (the “default” degree for women, for some reason) or bachelor’s level business degree (most popular for men and popular for women) is actually all that useful other than signalling “ready for white collar work”.

          • Adam says:

            That’s actually the source I came from, but somehow I overlooked that social sciences and history were 10,000 ahead of health, though that changes drastically if they include associates and certificate programs instead of just four-year degrees. I honestly have no idea what an undergrad business curriculum even teaches, but it at least seems that the intent of most students is obviously something along the lines of “I want to be employable, but I’m not good enough at math to do engineering.” That isn’t everybody, but it’s hundreds of times more people than ever take an ethnic studies or art history class, to pick out what seem like the two most popular bugaboos of people who think college is completely pointless.

          • HeelBearCub says:

            @Nornagest:
            Hold on, somewhere between $50K and $100K less in debt or remaining savings, plus 4 years of full time salary (even at a reduced rate) and benefits, plus four years work experience is NOT extremely attractive?

            Someone, many someones, should be hiring people into jobs with that kind of package. They should have a “Junior Associate” position which automatically promotes to Associate (the level at which they hire their college grads) after four years.

          • Nornagest says:

            Hold on, somewhere between $50K and $100K less in debt or remaining savings, plus 4 years of full time salary (even at a reduced rate) and benefits, plus four years work experience is NOT extremely attractive?

            Not when you’ve had it preached at you for the last N years that going to college is cruise control for money and status, and when you lack the context to interpret most of those numbers. It might be attractive to your parents, but your parents probably suspect shenanigans (because, again, this is an antinormative move). You might, too: since you’re a teenager in this scenario you probably have poor temporal discounting, but you’re probably also inclined to distrust strong, unusual adult claims.

            Now, as college becomes more ubiquitous, its signaling value erodes and that sort of preaching becomes less credible. I don’t expect that to continue indefinitely. But I’m not talking about some hypothetical equilibrium state. I’m talking about a lone company trying to do this now, since that’s what matters on the margin.

          • HeelBearCub says:

            @Nornagest:
            But that is a really attractive economic package right? Are you saying if companies like Hertz, Ernst and Young, Walmart, or Bank of America, all of whom are among the top employers of college grads today offered that kind of program that they wouldn’t be able to get 1000 or 2000 HS students to accept offers?

          • Nornagest says:

            At least some of those would probably work, but it couldn’t be just any company: it has to be one with the reach to get the word out, the money to ensure it isn’t going to evaporate and leave its employees high and dry, and the prestige for prospective hires to take them at their word. And companies that well established also tend to be highly risk-averse.

            You couldn’t get the same effects on a smaller scale.

          • The Nybbler says:

            Hertz and Ernst and Young already list their minimum requirement as an Associates rather than a Bachelors for some positions. Perhaps they could hire high school students for slightly cheaper, but would the cost of the additional screening or training they’d have to do exceed the difference in pay?

            Bank of America actually doesn’t require a degree for some of their positions, so they’re already doing it.

          • HeelBearCub says:

            but would the cost of the additional screening or training

            They shouldn’t require any extra training at all, unless my statements about college actually imparting value (past mere signalling) are correct. Again, I am fighting the contention that college is “useless”, as I have seen asserted frequently on these boards. Not useless to some particular individuals, but useless full stop.

          • John Schilling says:

            If almost all of the white-collar employers in your community insist on offering jobs only to people with Genuine Masonic Signet Rings, which the Masons will sell to anyone on their eighteenth birthday for $100,000 but which thereafter can only be purchased on the black market for $200,000, almost all intelligent people who plausibly aspire to white-collar careers will buy such a ring on their 18th birthday. Probably with their parent’s money, freely given, but if not bankers will be lining up to offer them loans on reasonable terms.

            People who aren’t qualified for white-collar jobs will be much less motivated to purchase rings because, even if they want such jobs and the ring lets them get one, they’ll probably be fired before they can recover the cost. Bankers and parents will have an opinion here, as well.

            This economy is in almost every way inferior to the one without the silly obsession about Masonry. But in that economy, the set of people without signet rings is going to be heavily dominated by people who shouldn’t be trusted with white-collar jobs. There will be exceptions, but you’ll probably go broke trying to find them even if you can hire them on the cheap.

            If there were nobody but qualified white-collar workers with rings and qualified white-collar workers without rings, the market equilibrium might be for the wage premium of ringbearers to exactly match the NPV of the ring. When you add in the cost and risk posed by the large pool of unqualified applicants masquerading as the ringless qualified – the search costs time and money, and some of the false positives will have decidedly negative value – that no longer holds.
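
            A toy version, in Python, of the worker’s side of this arithmetic. The two ring prices come from the story above; the wage premium, career length, and discount rate are invented assumptions:

            # Should an 18-year-old buy the ring? Ring prices from the story;
            # the premium, horizon, and discount rate are made-up assumptions.
            RING_AT_18 = 100_000
            RING_LATER = 200_000   # black-market price later in life
            PREMIUM = 12_000       # assumed annual wage premium for ring-bearers
            YEARS, RATE = 30, 0.05

            def npv(annual, years, rate):
                # Present value of a constant annual amount over `years`.
                return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

            print(npv(PREMIUM, YEARS, RATE))  # ~184,470: above the $100k price at 18,
                                              # below the $200k black-market price

            On these made-up numbers, buying at 18 is positive-NPV and buying later is not, which is exactly the pattern described.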

          • HeelBearCub says:

            @John Schilling:
            “If almost all of the white-collar employers in your community insist on offering jobs only to people with Genuine Masonic Signet Rings”

            The question is why not one of the white-collar employers hires people who can demonstrate the ability to get a loan for a masonic ring, at a lower salary.

            People (an-cap libertarians, even!) seem to me to be ignoring everything they think about markets in other situations because they have decided, absent good evidence, that schooling is useless.

          • Nornagest says:

            I don’t think schooling is useless — well, okay, I think high school is useless, but college has value. I just think that, for many majors, most of that value consists of being an honest signal of intelligence, conscientiousness, and cultural competence, and has relatively little to do with what’s actually taught.

            At this point I’m starting to wonder if you’re even interested in engaging with this line of thinking. I haven’t seen anything that looks like a challenge to it, only a lot of assertions that of course college must be imparting value, because markets or something. Only markets can accommodate this kind of equilibrium just fine in the short term, especially when they have culture and policy leaning on them.

          • HeelBearCub says:

            @Nornagest:
            I have engaged with it elsewhere, but this sub-thread was about the economic argument, which is where it has stuck. For instance, I maintain that (in-person) lectures and individual reading are not perfect substitutes for each other, and that they are, in fact, markedly different. Also, no one asked.

            I think that there are any number of (perhaps most) people who learn better through a combination of lectures, reading, writing and testing than through self-motivated individual study with no provided curriculum. I think many workplace training curricula, backed by research on how to get people to effectively retain information, will show this to be true.

            Much of the work of white-collar employees is the collection, condensing, assessing and analyzing of a variety of information. The end product of this process is frequently some written document, which is then used to make decisions. Those decisions then need to be executed and the results tracked.

            Many of those same skills are also used to complete a college curriculum. In addition, the white-collar workplace is not static but dynamic; therefore being able to do this in brand-new areas of information is paramount. This is why a well-rounded education is quite helpful. It forces you to perform the same types of tasks, but in many areas with which one is not already familiar. The typical liberal arts education seems very well suited to building skills that are of use in the white-collar world.

          • Psmith says:

            they have decided, absent good evidence, that schooling is useless.

            The evidence is that many of us went to college and didn’t learn very much that we actually use.

          • John Schilling says:

            @HeelBearCub: The question is why does not one of the while collar employers hire people who can demonstrate the ability to get a loan for a masonic ring at a lower salary.

            Because the person who has the ability to get a loan for a masonic ring doesn’t want a lower salary, and he doesn’t have to accept it. He can just take out the loan and buy the ring instead. The best he can realistically expect is to break even or slightly better, with the NPV of the decreased career salary approximately equal to the NPV of the loan payments on the ring. If there are too many negative-value employees hiding with him in the pool of “Look, I don’t have a ring but trust me I could totally get one if I had to” applicants, he can’t even break even once the employer discounts for his expected costs.

            The equilibrium where everybody who can get a ring does get a ring, and the pool of ringless job-seekers is dominated by the useless and worse-than-useless, is stable. In that stable equilibrium, it is not rational to hire from the pool of mostly-worthless-and-worse-than-worthless applicants just because they work cheap.

          • HeelBearCub says:

            @Psmith:

            The evidence is that many of us went to college and didn’t learn very much that we actually use.

            The fact that there exist some autodidacts that do not gain useful skills or information from attending college isn’t the question though, is it? I’m talking about population size (or industry work force size) effects. You can always find examples of things on the tails of distributions.

            Second, do you engage in collection, condensing, assessing and analyzing of a variety of information? Do you write documents or otherwise communicate summaries of this information? Do you encounter a variety of new problem spaces in your work? Do you dispute that these are skills which are exercised by a traditional college education?

          • HeelBearCub says:

            The best he can realistically expect is to break even or slightly better, with the NPV of the decreased career salary approximately equal to the NPV of the loan payments on the ring.

            That doesn’t follow, or I am not following your argument.

            Obviously, the employer needs to offer more than the expected salary of someone who does not have a ring for this to be enticing to the prospective employee, but less than the expected salary of a ring holder. The expected salary difference should be roughly equal to the NPV of the ring loan. If the difference were much higher, then loans would be increasingly easy to get (much as home loans are cheaper when housing prices are rising).
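
            A sketch of that break-even condition in symbols; the labels are editorial, not from the thread ($w_R$ and $w_0$ for the ring-holder and ringless wage paths, $r$ an assumed discount rate, $T$ a career horizon):

            \[
              \sum_{t=1}^{T} \frac{w_R(t) - w_0(t)}{(1 + r)^t} \approx \mathrm{NPV}(\text{ring loan})
            \]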

            You also seem to have smuggled in an assumption that Masonic ring purchasing imparts skills, rather than being simply a measure of the net worth of families, since you have somehow concluded that those without Masonic rings are bad at doing jobs. I don’t know where you are getting that from. Perhaps you mean that if I already know I can’t do the job, I will not invest in a ring? That seems to imply a discontinuity in the skill curve, one that is easily detected by job seekers themselves.

          • Psmith says:

            The point is that this is a common experience. Do you not know lots of people who use very little of what they learned in college?

            Second, do you engage in collection, condensing, assessing and analyzing of a variety of information?

            I’m an academic economist. Modus ponens/modus tollens, I suppose.

            I won’t say that I use absolutely nothing I learned in undergrad. I probably use about two classes’ worth, out of the 52-54 classes I took. Everything else I learned on the job or taught myself.

            I enjoyed many of the other 50-52. But I don’t use what I learned in them, they didn’t make me appreciably better at anything except the specific topics they covered, and they weren’t worth the price I paid in time and money. On the other hand, being able to credibly signal to employers that I’m worth hiring is worth the price. So I signalled. In my experience, this is quite common, except that most people I know who aren’t in academia use less of what they learned in school and more of what they learned on the job or taught themselves than I do.

            [merging threads]

            Stating flatly that college is useless isn’t a very good way to figure out what it’s actually doing, how to do it better or how to reduce the cost.

            Take whatever road leads you to satori, I guess. But I’m not trying to do anything by posting here but while away an idle hour–certainly not to fix all the social ills associated with the US educational-governmental complex.

            you have somehow concluded that those without Masonic rings are bad at doing jobs.

            People who aren’t qualified for white-collar jobs will be much less motivated to purchase rings because, even if they want such jobs and the ring lets them get one, they’ll probably be fired before they can recover the cost.

            (And the IRL equivalent of the discontinuity shows up in less-prestigious schools, two-year programs, etc.).

          • HeelBearCub says:

            @Psmith:
            So, assuming that people in the market to purchase Masonic rings can easily detect whether they should or not, what makes this signal hard to detect for the prospective employers?

            In other words, if the difference is easy to detect, why do we need the employee to spend an absurd amount of money to assert their own intrinsic competence? Why can’t we negotiate the middle man out of the picture and capture the wasted value?

          • John Schilling says:

            You also seemed to have smuggled in an assumption that Masonic ring purchasing imparts skills

            No, only that it signals skills. And only in the same way that a peacock’s tail signifies fitness – if you haven’t got the skills to back it up, you’ll go broke trying to afford the signal. There’s no requirement for the Masons to test for skill, though it strengthens the argument if they do.

            The $100k for the ring is a pure deadweight loss, leaving a $100k surplus to be divided between any competent-but-ringless employee and any employer willing to hire them. But the premise is that such employers are very rare. So:

            1. If competent-but-ringless employees are also very rare, the occasional surplus an employer gets when hiring them will be outweighed by more frequent losses when they hire an incompetent faker. Being the employer who hires the ringless is a sucker bet.

            2. If competent-but-ringless employees are common, essentially all of the negotiating power goes to the employer willing to hire them, meaning they will claim essentially all of the surplus.

            3. Second-order effects, like the sunk cost of starting a ringless job-hunt, the employer’s continued risk of hiring an incompetent faker, and the cost penalty for trying to rejoin the ringbearers after trying the competent-but-ringless route, further increase the employer’s bargaining power and will drive the wages below breakeven.

            4. This makes pursuing a white-collar career while ringless a sucker bet, so almost nobody will do it, pushing us back to 1. Enter the cycle wherever you like.

            The stable equilibrium is for every competent white-collar worker to borrow money to buy a ring, and for everyone who needs a competent white-collar worker to hire a ringbearer. Also, peacocks still have massively impressive plumage.
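
            A toy computation, in Python, of the employer’s side of cases 1 and 2; the competent share, the surplus split, and the faker cost are all invented numbers:

            # Employer's expected value of one ringless hire, under the two
            # cases above. Every parameter is an illustrative assumption.
            SURPLUS = 100_000     # deadweight loss avoided by skipping the ring
            FAKER_COST = 150_000  # assumed loss on hiring an incompetent faker

            def employer_ev(share_competent, surplus_to_employer):
                return (share_competent * surplus_to_employer
                        - (1 - share_competent) * FAKER_COST)

            # Case 1: competent ringless applicants are rare -> a sucker bet.
            print(employer_ev(0.2, SURPLUS / 2))   # -110000.0

            # Case 2: they are common, so the employer claims nearly all of the
            # surplus and the worker nets little for bucking the norm.
            print(0.05 * SURPLUS)                  # 5000.0 left for the worker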

          • HeelBearCub says:

            @John Schilling:
            We are talking about an inherently unstable job market though, because old people are leaving the market and young people are entering it. Ringless entry level employees are very common, because they are ALL ringless, but they are not without leverage, because we have already stipulated that they are able to buy a ring.

          • John Schilling says:

            So, assuming that people in the market to purchase Masonic rings can easily detect whether they should or not, what makes this signal hard to detect for the prospective employers

            That would be the part where people lie.

            And sometimes they lie to themselves, so it’s never going to be a perfect signal. But if there’s $100K on the line when it comes to whether I can e.g. code C++ or do organic synthesis or whatnot, I can do a pretty good self-assessment. And if there’s $100K on the line when it comes to whether I can convince you that I can do those things, I can tell a pretty good lie.

            It will always be easier for me to know what I am capable of than for you to know what I am capable of.

          • HeelBearCub says:

            @John Schilling:
            Now you are claiming that it is very easy for me to know whether I can do “masonic ring work” (which I have never done), but very hard for my employer, and, well, how exactly is that going to work?

          • HeelBearCub says:

            @Psmith:
            Do you teach freshmen econ 101 courses? How about senior level degree seekers? Do you advise PhD candidates?

            Think of the PhD candidates you advise and compare them to the freshmen. Do you think the freshmen who will go on to seek a PhD would be capable of working on their thesis on day 1 of their freshman class? If not, why not? If so, … well, I actually don’t see you answering this in the affirmative, but … if so, why don’t they?

          • Psmith says:

            I’m just a reg monkey, not a professor. No Ph.D., and I sure as hell don’t advise anybody. (I may have exaggerated for effect by not mentioning this above. My bad. On the other hand, I don’t see the actual tenured and tenure-track research professors using much of what they learned in undergrad, either.). Still, just going by what I see around me…

            Do you think the freshmen who will go on to seek a PhD would be capable of working on their thesis on day 1 of their freshman class?

            Yeah, pretty much. To the extent that they aren’t, as far as I can tell, it comes down to some combination of
            1) reading a bunch of papers and books on their own,
            2) one-on-one conversations with economists,
            3) a stats/econometrics class, preferably one that teaches enough R or Stata that they can teach themselves from there (this was one of the two or so classes that taught me something useful, and it might well be superfluous for someone smarter or more driven than I am), and
            4) deciding that they want to be economists and not business majors or doctors or whatever.

            Maybe the math-heavy guys need more formal math education, but it sure looks to me like the ones I know can pretty much teach themselves what they need to know, and that the ones who aren’t sufficiently hot shit to teach themselves sort into other fields.

            I worked minimum wage in a big residential kitchen for several years while I was in school. I got more out of that than I did out of most of my classes in terms of everything–pay, health, socialization, character-building and work ethic, organization, practical skills, even enjoyment compared to all but the most fun quartile or so of classes–except the metaphorical Masonic ring.

            (Of course, “is someone capable of performing the tasks of the job without going to college?” is a very different question than “will we, the professional gatekeepers, allow them into the profession if they haven’t gone to college?” Which is exactly my point. There are reasons that people continue to go to college and employers continue to hire college graduates, but it’s not because going to college causes people to be more able.).

          • HeelBearCub says:

            @Psmith:
            Well that is at least a consistent answer with your previous stance.

            I find the contention that the average freshman who will eventually seek a PhD is capable of starting work on their thesis on day 1 of freshman year bizarre, though. I realize I am asserting that without offering evidence.

            Anecdata, I am a programmer employed in industry. In college I missed the switch to object-oriented languages by a few years. I still, 20 years later, wish that I had experience with OO back then, as I tend to approach things procedurally. The piecemeal way that you pick up new techniques once you start working is just different than 4 years of dedicated focus on developing skillset and knowledge.

            I’m a better programmer than most, but I can tell the difference.

          • Psmith says:

            I find the contention that the average freshman who will eventually seek a PhD is capable of starting work on their thesis on day 1

            It’s not that they don’t get anything at all out of college–it’s that they don’t get anything that they wouldn’t get from steps 1-4 and working a minimum-wage kitchen job. And partly I think it’s just a matter of getting older, too.

            Anecdata, I am a programmer employed in industry.

            Fair enough. I’m a little surprised–it seems like the most vocal autodidacts (or informal apprentices, etc.) are programmers.

          • HeelBearCub says:

            @Psmith:
            Both of my parents, and several of my aunts, uncles and cousins are teachers. My father, a university economist, spent a good chunk of his professional life working on more effective means of teaching economics. I’ve also done a fair amount of development and delivery of training as part of whole-life cycle development of large custom applications (single company user bases in the 1000s). I have done a fair amount of sales engineering/support as well as first line and second line tech support for products I have developed, especially when working in a small startup with single-digit employees.

            So I have bias. I am exposed to teachers who I have respect for and my sense is that it is a true and necessary profession. But I also have experience with trying to get the average computer user to understand how to use what they have been given. Trying to convey that in writing without them being able to ask questions is much less efficient and effective than simply delivering a presentation and/or a demonstration.

          • I think part of the problem with this discussion is not distinguishing among different sorts of students.

            Most of the students in an introductory freshman economics course are not actually interested in learning what you are teaching and most of them don’t–they memorize enough to get past the final. My guess is that for many of them the same is true of their other classes. They are in college for some combination of getting a credential and having an enjoyable four years socializing with their age peers. I wouldn’t expect them to graduate knowing much more about what they have had classes in than when they arrived.

            A minority of the students in the class are actually interested and likely to learn something. The PhD candidates are drawn from a subset of that minority.

            And it isn’t clear to me that the members of that minority learn more from the lecture class than they would from being assigned to read a good book on the subject, with some sort of feedback–perhaps quizzes and discussion sections.

            The employment puzzle is why the degree significantly increases the income of the pupils who were not interested and didn’t learn.

          • HeelBearCub says:

            @David Friedman:
            1) You are positing a binary measure of interest with a discontinuous distribution. I’d posit that the more likely distribution is a bell curve over a less/more interested axis. You can call the center of that distribution “not interested” but I think you are almost certainly wrong.

            2) You further posit that the distribution of preferred/optimal learning styles is concurrent with the distribution on the less/more interested axis. I believe that research into optimal learning styles fairly well debunks that notion. There are a substantial number of people who are audio/visual learners. Given that universal literacy is very young, and audio/visual learning has existed as long as humans (or longer), this should not be surprising at all.

            Anecdotally, as someone who would have been on the far right side of the “more interested” axis, I needed the structure of class sessions and lectures. I was rarely able to make myself crack the textbook before a lecture, and mostly learned in college through lecture and note-taking. The textbook was a reference for me, but rarely was it the source of my knowledge. And I learned a heck of a lot in college.

            The available evidence is not, I think, in favor of your theory. Some people simply read the book and teach themselves, but not most.

          • “There are a substantial number of people who are audio/visual learners.”

            If that is the explanation of my puzzle, videos, in class or online, ought to be a cheap substitute for lecturers. But the mass lecture survived that technology too.

          • HeelBearCub says:

            @David Friedman:
            Production costs on high-quality audio/video are high compared to the cost of the individual lecture. How many large lectures is a university going to replace before needing to redo the media? You could posit some cross-university program, but that runs into substantial other issues.

            And given that you won’t be eliminating very many of the classes offered at the college, what have you gained? And it comes at the loss of having Jr. faculty practice lecture skills, the loss of the structure that class schedules bring, etc. Anecdotally, I wouldn’t have watched the required video before my section, either. The loss in value provided by the structure would have been very high for me.

            Not to mention that people actually do ask questions and interact with the professor, even in large lectures.

        • brad says:

          We should see companies competing with colleges for the HS graduates who have the least to gain from going to college. Outside of the military, is there anyone who does this in any appreciable numbers?

          The students themselves have a strong incentive not to go with such a company. Because if they do, they might get paid well for a year or two, but then what? They are now “off the path” and somewhat damaged goods if they want to get back on it. Aside from anything else that means they aren’t in a good bargaining position with their employer.

          The military is somewhat of an exception because it is big and prominent enough that people all across society recognize it as a special case.

          • HeelBearCub says:

            If I work for two years at a job that normally requires a college degree, that will count as “relevant experience” in lieu of a degree. However, if I find this to be a stumbling block, I can still go to college afterwards.

            I’m not even asking why everyone isn’t doing it. I’m asking why no one is doing it.

            Given the amount of money lying there, we should be seeing someone take advantage of it.

          • Ken Arromdee says:

            Going to college after working a few years is a bad signal compared to going to college right after high school.

          • HeelBearCub says:

            @Ken Arromdee:
            Gap years are currently gaining in popularity, which would seem to argue against this. A graduate from college is a graduate from college, and it’s not at all clear to me how “Graduate from same State Uni in ’18, but with two years of intern experience under their belt” looks worse than “Graduate from State Uni in ’16”.

          • The Nybbler says:

            People ARE doing it. In tech, for instance, there are people with no degree at all in software engineering jobs. But it’s generally a bad bargain because not having a degree remains a stumbling block even after the first job, which means depressed lifetime earnings. This means that to tempt a rational college-bound student to skip college and go right to work, you can’t pay as much less as you’d think. Usually the incentive is a shot at the “startup lottery”.

          • null says:

            Among what segments of the population are gap years gaining in popularity?

          • Edward Scizorhands says:

            I wish the gap year were a good signal. I wish that everyone took a year after high school to work in the real world and figure out whether they want to go to school for 4 more years.

            But right now I suspect the gap year is a negative signal, unless you are an Obama.

          • Anonymous says:

            I don’t think a gap year is much of a signal either way. It’s just a minor variant of the usual thing.

            It’s when you are 30, have eight years work experience, and a fresh bachelors degree that employers have to figure out how to think about you.

          • HeelBearCub says:

            It’s when you are 30, have eight years work experience, and a fresh bachelors degree that employers have to figure out how to think about you.

            And that is only if the work experience you have is markedly different from the work experience I would expect you to have if you are applying for a position. So if you have 8 years of accounts receivable experience, and the job is in accounts receivable, the fact that your BA is newly minted won’t matter. Actually it may even look like a plus.

          • Anonymous says:

            Think about investment banking. The standard paths are:
            bachelors degree -> analyst -> associate -> VP -> MD
            or
            bachelors degree -> ??? -> MBA -> associate -> VP -> MD

            Where ??? can be analyst but doesn’t particularly need to be.

            If you had 2 years as an analyst and then got a bachelors degree, should you be applying for an associate position or an analyst position when you graduate?

        • Ken Arromdee says:

          It works as a type of signalling. Signalling is useful to an individual in the sense that an individual who fails to signal or who fails to look at signals is comparatively worse off. However, it is useless in the sense that the population as a whole is not better off if they all signal/use signals compared to if none of them do so.

          Pointing out that employers don’t make money by hiring non-graduates shows that it is useful in the first sense, but not in the second sense.

          • Psmith says:

            Right. And there are two important things to remember in connection with that:
            1. Education is a very reliable signal, not just of intelligence, but also of things like conscientiousness and conformity. Any employer who wants to make bank by hiring people who didn’t go to college still needs some way to avoid hiring schmucks. Wonderlic tests will get you pretty far for intelligence, but the other two are pretty much by definition difficult to credibly signal in a short, cheap, easy-to-administer test.
            2. Education signals (at least in large part) a positional quality. The point of hiring college graduates is not to hire people who meet some Platonic threshold of “good enough”; it’s to hire people who are better than the competition. So anybody who wants to hire people who didn’t go to college but still wants to make sure they’re on the right sides of the relevant bell curves is going to have to come up with some kind of criterion that, to a plausible first approximation, weeds out about as many people as getting a college degree does.

            To put it another way, I think the problem is not that there are lots of people who would have great jobs if they only went to college; the problem is more that the people who are going to get good jobs have to burn several years and a pile of money first. So any employer trying to pull off this proposed arbitrage is going to have to find a way to distinguish the people who would have gone to college and successfully completed a degree, thereby signalling their underlying ability, from the people who wouldn’t have made it in college and are attracted to the job for this reason.

          • “Any employer who wants to make bank by hiring people who didn’t go to college still needs some way to avoid hiring schmucks.”

            Less of a problem if it’s easy to fire people who turn out to be schmucks.

          • HeelBearCub says:

            There are one- and two-year study programs already in existence if all that is required is to signal that you can be conscientious. $100k or $200k per student is a lot of money sitting there waiting to be claimed.

            If nothing is really going on in those four years, someone is going to come along and claim that reward, assuming you believe in markets that operate with a modicum of efficiency.

            Walmart, GE, some really big company should be able to completely revamp corporate HQ. Small startups should be able to run circles around big companies that haven’t gotten with the program. Someone should be able to take advantage of the massive inefficiency that is being posited.

          • Psmith says:

            There are one- and two-year study programs already in existence if all that is required is to signal that you can be conscientious.

            These do not signal employability as strongly as a four-year degree, for much the same reason that a 6:00 mile does not signal running ability as strongly as a 4:30 mile. Unless they’re very difficult–and in those cases I think you see exactly the dynamic you claim. (The guy who completes his bachelor’s in two years is in fact going to get hired before the guy who completes the same degree plus a little bit of extra coursework in four.)

          • HeelBearCub says:

            “6:00 mile does not signal running ability as strongly as a 4:30 mile.”

            You have to actually do the work to be able to go from 6:00 to 4:30. This example largely proves my point, rather than fights it.

            But, let’s do something different. Let’s say it is all signalling. Jobs need people who can run marathons in 3:00, so they recruit from people who actually run marathons in 3:00. That means it will take at least 3 hours to conduct their job screen.

            What you are saying is that the businesses need people who prove that they can do 4 years of undergraduate work, which takes 4 years of undergraduate work. Even then (and I am only granting this for the sake of argument), college isn’t useless. It shows that the job screen is super-expensive to run.

          • Psmith says:

            What you are saying is that the businesses need people who prove that they can do 4 years of undergraduate work, which takes 4 years of undergraduate work.

            Basically, yeah.

            Even then (which I am only granting for sake of argument) college isn’t useless.

            Multiple senses of “useless” here. I think college is useless in that it imparts mighty few useful skills for the time and money. This has been my own experience and seems to be somewhat widely shared. That doesn’t mean you shouldn’t go to college. It is absolutely individually rational to go to college (if you want a high-paying white-collar job), which is why people keep going.

            It shows that the job screen is super-expensive to run.

            Exactly, and that’s the problem. Figuring out a way to make it cheaper would save a lot of time and money.

          • HeelBearCub says:

            If we were forming a team of people to investigate whether ancient peoples really did run large herbivores to exhaustion, and you wanted to be on the team, and they were looking at people who had a documented time in one of the hundreds, nay thousands, of marathons organized around the country, would we say, “Running a marathon to prove you’re a candidate for the team is useless; it imparts nothing and has no relationship to what you will do once you are on the team; it is useless status seeking”?

          • Psmith says:

            Mm, no, but we’d probably say
            1) If you want to prepare to be on the team, rolling out of bed and trying to run a marathon as fast as you can is probably not very useful. Instead, you should do some shorter work at various paces and perhaps a long run at an easy pace once a week. Going out and running the marathon will do very little to develop the abilities you’ll need to be part of the team. (I haven’t followed the state of the art in running for several years, but this was more or less the orthodoxy back when I was paying attention.)
            2) Usefulness has to be balanced against cost. If running a marathon cost a few hundred thousand dollars and four years, we’d probably want to use some combination of VO2 max testing, a Wingate test, blood draws to determine lactate metabolism, etc., instead. (I claim that there is essentially no analogous battery of testing that would substitute for college, that anybody trying to implement such a battery faces adverse selection issues, and that the existing policy environment militates against the development of any such substitute.)

            I am not saying that the problem of higher education (namely, that we spend too much time and money on it) is an easy problem to solve. If it were, we would have solved it already. In the status quo, it is individually rational for businesses to limit their search to college graduates and it is individually rational for prospective employees to attend college. That’s why it’s a difficult problem to solve. But I contend that it is, nevertheless, a problem.

          • HeelBearCub says:

            @Psmith:

            But I contend that it is, nevertheless, a problem.

            Given the framework I set up, here is how I would describe your contention: after spending four years working at “running marathons”, you are no more prepared to run down a large herbivore than you were before you started.

            Given that we have many companies who engage in the business practice of running down large herbivores, and they hire almost exclusively from those who have done the 4 years of training and competing, I think it’s reasonable to ask you to come up with evidence that these companies can succeed in running down large herbivores by hiring people who have only been training for and running 5Ks.

          • Psmith says:

            I think it’s reasonable to ask you to come up with evidence that these companies can succeed in running down large herbivores by hiring people who have only been training for and running 5Ks

            Sure. Like I say, this is a problem for a reason. If it were easy to solve, we’d have solved it already. So a big part of our agenda (insofar as there is an “us” and insofar as we even have an agenda, ~passivism will win~) is coming up with institutions that will do what college does, but better and cheaper. In the absence of some suitable alternative mechanism, like I said, it makes sense for most people (who want high-paying white-collar jobs, anyhow) to go to college, and it makes sense for most employers (who want high-paid white-collar employees) to hire college graduates. I don’t intend to march up and down outside Stanford with a sign saying “Dissolve the Monasteries Now.” But I think, for example, that “people who go to college are richer, therefore everyone should go to college” is false, and the falsehood of that statement ought to inform policy.

          • HeelBearCub says:

            coming up with institutions that will do what college does, but better and cheaper.

            Something that does “it” better and cheaper is always nice, if you can get it.

            I guess the big issue I have is that you usually don’t get there by asserting that “it” doesn’t exist and you can get “it” for free (whatever the “it” is).

            Stating flatly that college is useless isn’t a very good way to figure out what it’s actually doing, how to do it better or how to reduce the cost.

        • eccdogg says:

          Bryan Caplan is finishing a book on this subject, to be released soon. His answer:

          At this point, you may be thinking: If professors don’t teach a lot of job skills, don’t teach their students how to think, and don’t instill constructive work habits, why do employers so heavily reward educational success? The best answer comes straight out of the ivory tower itself. It’s called the signaling model of education – the subject of my book in progress, The Case Against Education.

          According to the signaling model, employers reward educational success because of what it shows (“signals”) about the student. Good students tend to be smart, hard-working, and conformist – three crucial traits for almost any job. When a student excels in school, then, employers correctly infer that he’s likely to be a good worker. What precisely did he study? What did he learn how to do? Mere details. As long as you were a good student, employers surmise that you’ll quickly learn what you need to know on the job.

          I see that others have already hit this point.

          • Airgap says:

            I’ve had a vision of the future in which after publishing the book, Caplan is falsely accused of sexually assaulting one of his students who’s unhappy with the ideas he’s promulgating, has his tenure revoked, and loses the down payment on the new house he was going to buy for his family. He’s also begun to resemble William H. Macy for some reason.

    • Corey says:

      An under-appreciated (as far as I know) aspect of the “college experience” I’ve been thinking of (as a mid-forties college grad with an 18-year-old nephew facing the choice of whether and how to do college): the soft start on adulthood.

      You get some subsidized time to get used to living on your own (usually with roommates), managing parts of your own finances, etc. without the immediate pressure of “overspend or screw up and you’re homeless.”

      • HeelBearCub says:

        I think it would be fair to say that the journey from preschool to college is replete with exactly these sorts of increasing rights/responsibilities.

      • NN says:

        In my experience, I think that this aspect of college may have done more harm than good for me. In particular, having a meal plan and living in a dorm room about 2 blocks from a school cafeteria was definitely not a good preparation for living on my own and having to prepare my own food most days.

    • LPSP says:

      As someone who dropped out of college and would probably conform to most people’s idea of a “tulip subsidizer”, and also as someone who primarily educates children, I gotta say those are two unconnected things.

    • dndnrsn says:

      A thought: is this discussion leaving out the social opportunities in university?

      There are a lot of professional career paths where, beyond the “guild membership” aspect of it, your time in university is a great time to build connections. Even if a state or province or whatever let people sit the exam to become a lawyer, those who went to law school would still benefit from connections – perhaps enough to make the price tag worth it. The same is true of many other professions and career paths.

      Beyond that, there’s the personal aspect: university is where a lot of people make their closest friends, meet partners, etc. These things improve quality of life.

  15. John says:

    Assuming there is no deity responsible for the universe, what was the probability this blog would exist? If there were a deity and rational design were valid, would the probability increase or decline?

    • ton says:

      1. Roughly 0, unless a multiverse exists

      2. Whether it increases depends on whether a multiverse exists

      Your favored theory should be whichever of “multiverse, god” you consider simpler given our world.

    • E. Harding says:

      1. 100% (or close)

      2. No.

    • Airgap says:

      The existence of SSC is proof of God’s existence. Checkmate, atheists.

    • Aegeus says:

      1. Almost by definition, we have no way to tell whether the universe could have been created in a different way than it was (the universe is everything we can observe, so there’s no way to get outside it and look at multiple universes). And we certainly have no way of knowing if that different way would lead to this blog existing or not. So we can’t really say. But we’ve observed one universe that exists, and it contains SSC, so… 100%?

      2. Even if the universe was intelligently designed, we have no idea what that design is. We have no idea whether SSC’s existence furthers God’s plan or not, so again, we can’t say. But again, we’ve observed that our universe contains SSC, so if we assume God created the universe, this blog must be part of the divine plan.

    • JuanPeron says:

      1. Minuscule, though non-zero (for any given universe).

      2. Increase to a decent probability – well under 1%, but not in the “treat as zero” realm of probabilities.

      But this persuades me of nothing because we’re reinventing the anthropic principle. We don’t get to claim cosmological constants as evidence for anything, since we’re actively limited to observing samples in which the constants are similar to what we do in fact observe. That pushes our observation point into the last 10,000 years to start making observer-dependent claims.

      The version of 1) that’s comparable to 2) is something more like “given that no deity is responsible for the universe, and that universe supports life and human-like sapience, what was the probability that this blog would exist?”

      And the answer to that is about the same as the answer to 2): namely, “Low, but not astronomical. It’s one of many plausible blogs which could exist.”

  16. CrashSite says:

    After reading the “Skin in the Game” post, I got to thinking about a topic that has been annoying me for years: how do you balance helping some people a lot who are far away from you against helping someone close by a smaller amount?

    When I was volunteering at a charity book store there were two events which highlighted this issue. The first: I had a manager who was a proper geezer (this is England, by the way). He was a very nice guy and I enjoyed working with him. However, he was not the best at his job; he wasn’t bad, just not the literal best. Eventually the organisation made him reapply for his own job and he didn’t get it. I got the feeling that this job was a large part of his life, and he seemed devastated at having lost it. The next manager managed to increase profits in the shop and was also nice, but I couldn’t help but feel like the organisation had failed to help this person near at hand. But obviously by increasing profits they were helping some of the most vulnerable people on the planet, so it seems weird to complain about this one person they didn’t help.

    The second example: we had an old guy who used to come into the shop. He was a very intelligent man (he spoke fluent French and had a great knowledge of a surprising number of topics). However, he seemed to have some sort of neurological problem, and he was prone to outbursts and to acting strangely in general. He wandered the streets all day, no matter the weather, and so when he came into the store we gave him a cup of tea and allowed him to sit in a chair for a couple of hours. However, after the change in management he was more or less told that he could no longer come in and sit in the store, since his outbursts were scaring off customers. Once again I felt uneasy, even though it was only a single guy compared with helping some of the worst off in the world.

    There were even little things, like how we could use expenses to buy snacks and tea; surely we should have forgone those, since the money we spent could easily save someone’s life. It seems so hard to balance between these two issues, and I can’t really say I have an intuitive response one way or another. I would love to hear other people’s input.

    • John Buridan says:

      These are great examples. The first one seems very grey. It would be a tough decision to make either way. How detrimental to the organization is he? What agreements do members of the charity tacitly have with each other and management? It’s possible that he had to go… have outcomes been that much better since he left?

      I feel a similar unease about the guy getting laid off as you do. An organization has a responsibility to treat its employees well, yet it does not seem he was actually mistreated. However, maximizing profits and the organization’s output should not come at the expense of those who sacrifice for the organization. If you are working in a charitable organization, as it seems we both do, there are often people “out there” who could fulfill the same role better, and some current peers could achieve financially better outcomes for themselves by going elsewhere. Yet there is an ethos of solidarity; a certain amount of self-sacrifice gets made and opportunity gets lost on behalf of the charity, either for the nobility of the cause or the fulfillment of the individual. Those continued acts of solidarity should play a large part in the calculus of who is told to go. Too often we overlook the bonds of trust and cooperation which make our battle for the Good Side a battle worth fighting.

      In your second example, of course, it would depend upon how disruptive the guy was, and how regularly. Nonetheless, those with mental disorders need to be treated compassionately insofar as is possible by the organization. Some people are not perfectly developed and in peak physical/mental condition. So what? That’s a fact of life we should learn to deal with patiently so long as we can’t cure it. It sounds to me like the guy wasn’t a terrible menace. If he truly was, then perhaps asking him not to come around makes sense.

      We have a special needs guy who is not intelligent at all. He comes around every two weeks or so, asks really loud questions, sometimes soils our bathroom, and takes a worker away from his/her task for a few minutes. It’s not the end of the world. We treat him kindly and respectfully and help him on his way. He is not a blight upon humanity, nor such a detriment to our operations that clients have a right or reason to be afraid.

      Not everyone gets to be a rationalist, but everyone (who is not working day and night for the Dark Side) should receive compassion and be extended patience, even if they are in the wrong job position or in a store they shouldn’t be in.

      Maybe that’s a non-answer. But that’s how I see it.

    • houseboatonstyx says:

      @ CrashSite

      On Munchkin fora, often there’s a hypothetical about a Paladin captured by an Evil Whosis who threatens evil to some hostage, unless the Paladin breaks his vows. Imo the real* Paladin thing is, don’t waste time or energy wrestling with conscience; concentrate on finding a clever, practical way to rescue the hostage.

      * Silent movie Lawful Good cowboys, first generation Superman, etc.

      That’s the focus I’d recommend. Trying to persuade the cold-hearted Managers in your comment (who are right on logical grounds anyway) very likely won’t work (or shouldn’t). Instead, try to help the geezers in some direct practical way, like connecting them with some Senior Center etc.

    • eh says:

      Maybe we’re missing externalities. Biscuits and tea might help retain volunteers, helping a mentally ill man might generate social trust or in-group cohesion, etc.

      I’ve always thought of raffle tickets and door to door chocolate sales as ritualised tests of loyalty combined with bonding exercises, where junior members of an organisation are tasked with going out and earning money by representing the group well.

    • onyomi says:

      I’ve been thinking a lot about this lately, too. I feel that recently in America we’ve moved toward a culture of economic efficiency (what I’d probably call “neoliberalism” if I were a different kind of academic) which says you always fire the not-super-efficient-but-this-job-really-means-a-lot-to-him guy. The idea that this will hurt morale or whatever doesn’t usually enter into the picture, much less that you should just give preference to people who are already invested and whom you already know over theoretically better strangers.

      I encountered something that made me think of this in academia recently. An older colleague, hearing I was looking for a book publisher, offered to recommend my manuscript to his prestigious book publisher. I was like “sure, that sounds great!” Not much later he apologetically told me that book publishers no longer welcome recommendations. They want everyone to do it sort of on their own merits. This is no doubt a result of intense competition and a desire to avoid appearance of partiality (plus, if you can get a recommendation, before long everyone will have one and it will be useless).

      This relates also to the memes making fun of older generation people saying things like “why don’t you ask your friend so-and-so if they can get you a job at their company.” This kind of personal connection thing just doesn’t work, at least in the US, like it used to (it is still very much a thing in China, and probably a lot of other places, based on my experience, and it can definitely sometimes be infuriating and inefficient to have to cultivate personal relationships for every business deal and for people with connections to get ahead of the more talented).

      Yet I also think there’s something psychically extremely beneficial about not trying to totally divorce business from personal life. It might even make up for itself in some cases through increased morale and devotion to the company (loyalty to companies and organizations today seems almost nil, and deservedly so, in my experience). But I’m not sure where the balance lies. I feel right now we’ve gone too far in the “fire the nice, slightly inefficient guy to whom this job means a lot” direction.

      • brad says:

        I think it’s turtles all the way down. I read a post or op-ed or something once about why flying is such an unpleasant experience. The basic answer was: look in the mirror. When vacationers book a flight, the first thing they do is go to Expedia or Kayak or whatever, put in their pair of cities and dates, and sort by price. They don’t have a favorite airline, they don’t run down the configuration of the plane, they just sort by price. If you are going to do that, then companies are going to fight each other to get that cheapest spot. Scott’s Moloch article discusses something similar.

        It’s interesting to see it in the non-profit sector and think about how that aligns with effective altruism. It seems like the EA answer is for the non-profits to act like the most aggressive, rapacious, profit-hungry capitalists, so that they can maximize the money going to the most effective use per dollar. Something like compassionate leave could only ever be justified by hard data. There seems something off about that to me, but then I’m not an EA.

        About that hard data: the morale-and-devotion point seems like something we might wish were true, but that only means we should be more skeptical about data tending to show it’s true.

        • Garrett says:

          My problem with this is that from the consumer’s perspective, there is no noticeable relationship between the amount paid and the service obtained.

          The TSA makes the non-flying part horrible (which is why I no longer fly).

          Next, I did a quick search on Priceline for flights between Pittsburgh (closest airport to me) and Las Vegas (a popular destination). For the dates selected, more than a month out, including Saturday-night stay:
          * There were 40 different options.
          * The cheapest, fastest, shortest fare was a direct flight.
          * All other options cost anywhere from ~1.5x as much to ~4x as much. There was no way to evaluate seat pitch or width, or other costs other than baggage fees and taxes.
          * The cost differences didn’t seem to relate to any obvious pattern of flight times, etc.
          * There isn’t any way to pay extra for just what you want. E.g., I’ll pay extra for more seat width, and up to 2″ of extra legroom. I’m willing to make extra stops or vary my flight times if I save money, as long as I get in by 10pm.

          • John Schilling says:

            I did a quick search on Priceline for flights between Pittsburgh and Las Vegas…

            Well there’s your problem right there. You did a quick search on a service named Priceline, and you were expecting to get anything but the cheapest possible price for a service that can be legally advertised as an airline flight from Pittsburgh to Las Vegas?

            There absolutely are ways to evaluate seat pitch and the like. Priceline isn’t one of them. There are ways to purchase a few extra inches of width and legroom, often even on the same flight. Priceline won’t tell you about them. Because 95% of the flying public doesn’t care, or more precisely doesn’t care enough to spend even $30 extra for it.

            If you’re looking for a simple one-stop aggregator that ranks seating and amenities as well as price, TripAdvisor.com might be a place to start.

          • bean says:

            Garrett:

            * The cheapest, fastest, shortest fare was a direct flight.

            Well, yes. The last two are tautologically obvious, and the first is pretty much to be expected. These days, airlines are under intense pressure to fly their planes as full as possible. Indirect routing takes two flights instead of one, so there’s not much incentive to do it unless they have less capacity on the direct route than on both of the legs of the indirect one. That’s pretty rare for a major hub like Vegas.

            * All other options cost anywhere from ~1.5x as much to ~4x as much. There was no way to evaluate seat pitch or width, or other costs other than baggage fees and taxes.

            Pitch and width are to be found online, as are things like airline food cost. The fact that they aren’t all bundled together just raises the research bar slightly.

            * The cost differences didn’t seem to relate to any obvious pattern of flight times, etc.

            I find the opposite. When I’m shopping for tickets, it’s usually cheaper to take the first flight of the day or get in late at night.

            * There isn’t any way to pay extra for just what you want. E.g., I’ll pay extra for more seat width, and up to 2″ of extra legroom.

            As John says, they call it ‘Premium Economy’ and usually offer it as an upgrade when purchasing the ticket, instead of putting it on Priceline. I’m sure the marketing people have a good reason for that.

            I’m willing to make extra stops or vary my flight times if I save money as long as I get in by 10pm.

            Extra stops often don’t make economic sense for the airline. Varying schedule is called ‘standby’, and doesn’t guarantee you a seat. The airline can only pass so many people a day over a given route, and varying schedule is a logistical nightmare. Often, the plane is going to be full anyway.

            John:

            There are ways to purchase a few extra inches of width and legroom, often even on the same flight. Priceline won’t tell you about them. Because 95% of the flying public doesn’t care, or more precisely doesn’t care enough to spend even $30 extra for it.

            I don’t think that’s the reason. I think that the marketing people have discovered that a significant number of people, when given a choice between $150 with 29-inch pitch and $180 with 32-inch pitch, go for the 29-inch pitch, but when told during seat selection ‘and for only $30 more, you can get an extra 3 inches’ go for the extra legroom. The last few flight-comparison websites I’ve used have generally listed three categories for each flight: economy, refundable, and first. So they’re clearly catering at least somewhat to niche customers.

        • bean says:

          That’s actually why Southwest doesn’t list on aggregator websites. They want people flying with them to be doing so because they like them, not because it was the cheapest. (Although if you’re flying with lots of luggage, they are almost always the cheapest.) But to a large extent, you’re getting what you pay for. A lot of airlines will sell you all the perks you want. You must not fly much if you don’t know about the conventional carriers’ ‘premium economy’ seats. Airplane configuration is (on most domestic routes) not really a variable. You get either a Bombardier/Embraer or a 737/A320, and put people in it in one to three classes. Which you get is determined by route and capacity.

          • brad says:

            An A320 can be set up like Spirit Airlines does or like JetBlue does. Same plane, very different experience.

            Premium economy makes it even more complicated to comparison shop. How much lower does the base fare need to be in order to make Spirit Airlines’ “Big Front” seat a better deal than a standard JetBlue seat? If instead of doing this kind of comparison shopping the vacationer picks the cheapest fare and then springs for the premium economy during checkout, that only increases the incentive to have the absolute cheapest seats available. Got to keep that funnel filled.

          • bean says:

            An A320 can be set up like Spirit Airlines does or like JetBlue does. Same plane, very different experience.

            I misread you there. I was thinking of ‘favorite airplane’ as opposed to looking at the seating layout. I agree that the seating layout the airline selects has a lot more to do with the experience than the airplane itself.

            Premium economy makes it even more complicated to comparison shop.

            I believe that’s called ‘marketing’. Sure, JetBlue may be cheaper than United’s Economy Plus (and the seats are almost exactly the same), but the customer doesn’t realize that when they upgrade on United.

            How much lower does the base fare need to be in order to make Spirit Airlines’ “Big Front” seat a better deal than a standard JetBlue seat?

            People aren’t that good at doing explicit utility calculations. And those that understand the air travel game aren’t really Priceline’s targets.

        • onyomi says:

          I do think there’s a definite “tyranny of the consumer” at work, as well (which is why it surprised me in the last thread that Heelbearcub assumed I wouldn’t blame the consumer for making shitty choices). I think it’s kind of a defect-defect vicious cycle which can get created when companies and customers start treating each other antagonistically. This seems especially severe with intellectual property lately, though in that case I side almost wholly with the consumer.

          I am probably somewhat guilty of seeing another nail for my libertarian hammer here, but I personally think having a ton of regulations for “consumer protection” and the like ironically encourages a situation where both buyer and seller take the maximum advantage within their legal rights. Because so much is illegal, what isn’t illegal must be okay.

          Re. airlines, point taken, though I don’t think it’s a great example. For one thing, there isn’t much difference among carriers in the US, and the difference between the least expensive flight and the second-least expensive flight is often hundreds of dollars for no apparent reason. If some of the companies were actually consistently a lot better than others I might feel some loyalty, but mostly I find they all have their good days and bad days and average out to a consistent mediocrity.

          Like, if there were two companies both flying where I wanted to go and one was $50 more expensive but I knew they had nicer planes, better foods, and arrived on time more often, I’d definitely pay the extra. Maybe even more. But the choice you are usually faced with is “you can buy the cheap, direct flight on crummy airline A, or, for an additional $300 you can buy a flight that does a bunch of weird layovers on slightly less crummy airline B.”

          • I think there are large differences in international flights. I flew to Asia on Singapore Airlines a while back, and it was impressively good.

            And, on the whole, I find Southwest at least a little better than other airlines, although that’s more about features such as free checked baggage and no penalty for changing your flight than about in-flight amenities.

          • Dan T. says:

            Some airline pricing seems downright illogical, like when a flight from A to B, booked separately, is actually more expensive than a flight from A to B and then B to C booked together, where you wind up on the exact same A-to-B flight as the person who booked the first option. Because of this, people actually find it desirable to book flights with extra legs that they don’t end up using, leading to increased problems of seats not being as full as the airlines would want.

          • onyomi says:

            “large differences in international flights”

            Oh yeah, definitely. I usually choose a non-American-based carrier whenever I can. They are usually (though not always) much nicer in terms of service.

            The fact that non-US-based carriers are not allowed to operate within the US, btw, seems like a huge sop to our crummy airline industry, and is arguably an example where trade barriers are making the great majority of Americans worse off to protect one non-competitive local industry.

          • bean says:

            I think there are large differences in international flights. I flew to Asia on Singapore Airlines a while back, and it was impressively good.

            Singapore is commonly rated among the world’s top three airlines.

            Oh yeah, definitely. I usually choose a non-American-based carrier whenever I can. They are usually (though not always) much nicer in terms of service.

            International flights are a rather different thing from domestic flights. Emirates, for example, has a good reputation for international work, but a rather poor one for its (small) domestic operations.

            The fact that non-US-based carriers are not allowed to operate within the US, btw, seems like a huge sop to our crummy airline industry, and is arguably an example where trade barriers are making the great majority of Americans worse off to protect one non-competitive local industry.

            It’s actually an international law thing. Air traffic rights are regulated by international treaty, and are called freedoms of the air. That said, there’s already a lot of competition in the domestic market, and if you’re comparing international services by non-US carriers to domestic services by US-flag carriers, you’re not really looking at the same thing. To the best of my knowledge, flights inside the European air travel block are not that dissimilar to those in the US. The big difference is the presence of Ryanair and EasyJet, which take packing customers aboard to extremes. Spirit is doing the same thing here, but hasn’t achieved the same penetration of the market.

          • onyomi says:

            “already a lot of competition in the domestic market”

            I disagree. The strong tendency has been for mergers and for each airline to focus on a couple of hubs. So while there are a fair number of airlines out there, most of the smaller ones are subsidiaries of the few big ones, and on any given trip, there are usually only 1 or 2 reasonable choices.

          • bean says:

            I disagree. The strong tendency has been for mergers and for each airline to focus on a couple of hubs. So while there are a fair number of airlines out there, most of the smaller ones are subsidiaries of the few big ones, and on any given trip, there are usually only 1 or 2 reasonable choices.

            This depends a lot on where you live. For reasons I don’t understand, some markets are poorly served relative to their size, and thus are a lot more expensive and have fewer options than others. But I’m not sure that the US market could support that many more carriers. Big carriers can run lots of routes, and trying to route through little carriers would be a logistical nightmare. And there are significant economies of scale in operations. Airplanes take a lot of work to keep flying.

          • arbitrary_greay says:

            So it’s less that we need more airline-to-airline competition, and more that we need competition from high-speed rail?

          • ReluctantEngineer says:

            The number of routes on which high-speed rail could effectively compete with air travel is very limited (particularly in the US, where our two biggest metropolitan areas are 2,800 miles apart).

      • houseboatonstyx says:

        @ onyomi
        Yet I also think there’s something psychically extremely beneficial about not trying to totally divorce business from personal life. It might even make up for itself in some cases through increased morale and devotion to the company (loyalty to companies and organizations today seems almost nil, and deservedly so, in my experience). But I’m not sure where the balance lies. I feel right now we’ve gone too far in the “fire the nice, slightly inefficient guy to whom this job means a lot” direction.

        Obvious distinction: The nice, slightly inefficient guy doesn’t freak out the customers like the weirdos off the street do. The nice inefficient guy may please some customers, as being easier to talk to, as well as just plain nice.

        So I’d get rid of the weirdos (as nicely as possible) but spend as much time and thought as necessary with the inefficient guy to make the job a better fit for him. Or just appreciate him as he is!

    • ediguls says:

      To transform Newtonian ethics into utilitarian ethics, simply multiply your intuitive compulsion to help by a factor r, which is the physical distance between yourself and the person you want to help. Then renormalize.

      Maybe you have moral anisotropy though, in which case you need to use vectors and multiply with an appropriate anisotropy tensor as well.
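
      (Taking the joke at face value, a sketch in LaTeX of one way to read “multiply by r and renormalize”, where c_i is the intuitive compulsion to help person i and r_i is their distance; the symbols are my own gloss, not anything standard:)

      w_i = \frac{c_i \, r_i}{\sum_j c_j \, r_j}, \qquad \sum_i w_i = 1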

  17. Good Tea Nice House says:

    What is the hypothesis you’re testing with your experiment, Scott?

    (I only ask because I would like to be a data point that helps prove true a hypothesis like “more frequent open threads would be a good thing for SSC and SSC readers, especially those who refuse to go back on heroin get a Reddit account.”)

  18. Dan Peverley says:

    Stellaris is bad. It’s got potential, but at the moment it’s buggy, unbalanced, illegible, and doesn’t deliver on its promise.

    When I say it’s illegible, I mostly mean the combat system. Other Paradox Studio games were much better in this regard. Stellaris has fancy 3D models on the map and ties what’s happening in the combat to actual stuff you can see, but most of the particulars of how these onscreen movements and choices are made are mysterious and unexplained. How does the game choose what ship to put at the front of the inexplicable triangle formation? I’d like to be able to choose that, but in the absence of that, knowing the rules would be nice. How exactly do the AI modules attached to the ship affect behavior in the locked-in battle mode? Why do my corvettes occasionally just sit around doing space donuts as they get fired upon by enemies outside their range? Is it because of the longer-range destroyers I mixed in with them? Are they sticking in the formation with them out of some sense of misguided loyalty?

    In the hours I’ve played, I found the behavior of the battle AI so frustrating that I coordinate my fleets to warp right in on top of the enemy at the edge of the gravity well, creating an immediate mosh-pit of flashing lasers and exploding ships with no opportunity for the computer to dick around.

    In EU4, the system was very simple (rows of squares with symbols on them, lined up to mash against each other in two recurring phases of combat!), but it was completely legible what was happening. You could see what was fighting what, squares changed in shade as the armies they represented depleted, and you could see the dice rolls for each round of combat. Infantry out front, cavalry towards the sides, artillery at the back when there’s enough infantry for the front line, and so on.

    In Stellaris, who even knows! I’ve built a fleet and had my missile-blocker corvettes out front, where I want them to be, at the very back, being useless as torpedoes hammer into my front ranks, and in the middle. I’ve seen gifs people upload of swarms of ships seemingly stuck like Buridan’s ass between two different fixed enemy stations, vibrating into a huddle as they are hit by missiles from outside of their own range. The different weapons and ship types are unbalanced at the moment, but those are just minor scripting changes, a change to a value here and there; the battle AI is a serious problem which infringes directly on fun.

    The user interface is a nightmare of mixed messages, missing functionality, and clutter. They can do better, and have with their other games in the past; the only explanation is a complete rush job and lack of development time. The mission system (“Situation log”) is filled with tens of identical “collect debris” research projects after every war, you have to flit through menus in unintuitive sequences to access necessary game functions, and ship designs are unsegregated and unsorted in a long linear list!

    Weirdly enough, I’m still planning on playing more of it, though. I’m having fun with the buggy and broken aspects of the current system, trying to figure out how to abuse its inadequacies before they’re fixed, and while a lot of material has yet to be implemented, the pop system they put in place is a good backbone for a game of this sort, though more ways to influence Ethos values and whatnot would be appreciated. Thoughts from other people would be nice.

    • reytes says:

      I haven’t played it yet (my computer is very bad), but how does it compare to, like, release EU4 or release HOI3 or pre-HOD Vicky 2? Because the complaints that I hear from people sound valid, but they also sound like recurrent problems for Paradox games generally, especially before they’ve had time to refine their models and experiences. I’m not saying people shouldn’t complain, but like, at a certain point it’s really not novel for Paradox to need time to refine things like AI and automation, and the fact that they’ve apparently built really solid basic systems into the game seems positive to me.

      I mean, like, Vicky 2 is probably my favorite Paradox game, and I would say it STILL has problems with message spam, opaque subsystems, and it not always being clear what the player’s motivation should be, even after 2 DLCs. Of course that’s kind of a generation old at this point but still.

      • suntzuanime says:

        The problem with Stellaris is that it doesn’t have the real-world history flavor to fall back on to paper over the issues.

      • Samuel Skinner says:

        The issue is that while some things, like the combat system, seem like math-balancing problems (and resolvable), others don’t.

        For example, the closest I’ve seen to an inherent flaw would have to be planet management/sectors.

        On planets, you assign pops to tasks. Sectors let you (and are required to, once you control a certain number of planets) automate things with the AI. If the AI can do a good job (that is, it is a solvable task), why is there even a planet screen? If the AI can’t, playing will be a constant struggle of microing to make up for the feature designed to reduce micro.

        That’s… not good.

        Ethos are weird; while individualist/collectivist makes sense and just needs a bit more bug fixing/balance, pacifist/militarist is really policy and shouldn’t be an ethos, xenophobe/xenophile is straight-up good/evil, and materialist/spiritualist is… odd. It doesn’t make sense in-universe (since this is one where mental powers exist, so the test for the new pope presumably includes making the Papal Throne hover), and it has the issue of balancing research versus any other input. That is really hard to balance, especially since the research rate is set by bonuses, due to the increasing tech cost from population.

        I’d personally throw out all but individualist/collectivist and replace them with
        xeno-nationalist – xeno-assimilationist
        continuous – discrete
        concrete – virtual

        Xeno-nationalist versus xeno-assimilationist would be about how to integrate aliens. Do you go the multicultural route, going as far as to set up a millet system so that each alien species has its own law system and leadership underneath your government? Or do you attempt to make everyone an equal citizen of your state, ensuring all have a common background and interact and mix with others no matter their species?

        The former would make it easier for pops to migrate in and out of your worlds as well as insulate your pops from the ideologies of alien populations; the latter would have the opposite effect. Xeno-assimilationists would also be in favor of spreading their ideologies to other empires and spreading their genetic traits to others, while xeno-nationalists would prefer having aliens live on planets they are adapted to.

        Continuous versus discrete deals with change: do they like radical measures, or go slow and steady? I’d have it deal with the probability of getting certain techs, as well as the way happiness works (getting a bonus for each percent or for passing certain levels).

        Concrete versus virtual deals with transhumanist specification, wireheading, and other major modifications.

        Of course, to make these work (in the sense that the flavor matches their effects), you’d have to add in new mechanics. I doubt they’ll change the game to be anything like this.

        • Eggoeggo says:

          Watching sector AI crap itself in the release streams was the dealbreaker for me, yeah. The best were the robot-populated tomb-worlds whose governors insisted on putting them all to work farming the lifeless hellscape.

          Couldn’t stop imagining some poor robot standing with an apple like Eve Tempted, going “what the hell am I supposed to do with this thing?!”

          Either let us optimize planetary economies ourselves, or abstract it to some “labour, land, capital” level where it’s not essential to carefully place individual buildings and staff them with exactly the right species of slave in order to have a productive planet.

          • Samuel Skinner says:

            The best part is that we had Victoria 2 modders laying out ideas of what they would do if they ever made a future space version of the economic engine. I’m puzzled as to why they didn’t go with that and instead built a totally generic economy.

            https://forum.paradoxplaza.com/forum/index.php?threads/thinking-about-programming-my-own-game-a-victoria-2-like-space-empire-game.653367/

          • reytes says:

            @ Samuel Skinner

            I mean… I love Vicky 2, but the Vicky 2 economic system was insanely complex, very obtuse, and often had weird issues (see, for instance, capitalist AI decision-making). Also, I’m fairly sure that no one currently working at Paradox actually understands how the underlying code works. So I’d imagine they didn’t go with a Vicky 2-style economic system because it would have been really, really hard to make, would have taken a huge amount of work, probably wouldn’t have worked that well, and would have made the game much more complicated.

            It does sound like the sector AI is a problem in terms of striking the balance between automation and player control/optimization. It reminds me of the similar issues people had with automation in HOI3, though at least it’s less central to the game here than it was there, and hopefully Paradox will get it sorted out. It does seem like the kind of thing that would naturally be ripe for DLC.

    • anon says:

      Yep, I’ve played it a lot now, and Stellaris very much seems to follow the Paradox tradition of “Is it done yet? – No, but we’ll fix it after release.”

      At least it doesn’t crash to desktop every half hour like HoI3 did, but so, so many systems everywhere in the game are clearly unrefined, unbalanced, or just plain broken. The midgame lacks the flavor of the early and late game, and they have outright admitted to shipping without the entire system that was supposed to fix that (colony events), just because it wasn’t done yet. Right now, if you are not among the biggest hardcore Paradox fans (in which case you probably already have it…), avoid.

      That being said, there is a lot in Stellaris that shows real promise. The flavor and writing of much of the game is brilliant, like Alpha Centauri-level brilliant. At its core, the gameplay is fun. It really feels like you inhabit a living universe. Consider this game again after a few patches and expansions.

      • Jordan D. says:

        I would agree with this comment. If you’re not so into the genre that you’ve already determined to pick it up, wait half a year for a sale. By that time, it’ll have at least two patches and possibly a mini-expansion and it should be a pretty great game.

        Things the game does well-
        – Exploring is satisfying
        – Ship design is pretty easy and meaningful
        – The choice between hyperdrive and wormhole gates is interesting and meaningful
        – Game events and event chains are intuitive, surprising and well-written
        – Species/ethos design is natural and complex
        – Uplifting primitive species is well-done, challenging to do but extremely useful once mastered

        Things the game sucks at-
        – All diplomacy
        – Borders are, by default, magical force-fields
        - Fallen Empires are really terrible at their jobs
        - Keeping your fleet updated is a huge PITA, and the auto-update seems to prioritize making your ships really expensive over making them any good
        – Corvette spam is the alpha and the omega
        – Sectors are a cool idea, too bad they suck
        – No matter how advanced you get, the coolest Fallen Empire stuff (ringworlds, sealed planets, etc) lieth forever beyond your power to build
        – You can scrupulously avoid using dangerous technology if you like, but there’s no way to tell if some idiot empire on the other side of the galaxy is gaily creating superhuman AI and then enslaving it while going ‘Whatcha gonna do, wipe out all organic life? You won’t.’
        – Warp drives are terrible
        - No matter how advanced you get, your options for actually interacting with all the systems you had such fun exploring remain limited.
        – The non-empire alien factions are boring.

        But my biggest complaint is a constructive one- Stellaris had the chance to do something which no empire-building or 4X game has done since Alpha Centauri. In AC, being science-focused was fun, and not just because of the benefits that it brought you. In AC, discovering the true nature of the Planet and choosing how much to adapt to it vs. adapting it to you was a storyline- when you did a lot of research, it told you a complex tale of how your technology changed you. It was like a subgame, an RPG contained within one of the strategy elements, and playing through it rewarded the player with cool options and a unique end-game goal.

        For a while, I really thought Stellaris was going to do that. It tricked me several times: discovering the precursors’ trails, researching the enigmatic nature of the Void Clouds, finding the Tree of Life, finding a shattered ringworld… each time I thought this was going to be the first part of an epic journey of discovery, researching phenomena scattered across the galaxy to learn a hidden story. But instead each of them petered out after a few events, dumping a handful of resources into my lap and then vanishing.

        And that’s what disappoints me most. That disappointment is probably why I’ve only been able to bring myself to play for about six hours each day.

  19. Carinthium says:

    Hi. I have a problem, and the mental health lines aren’t working.

    I’ve been trying to get my parents to stop interfering in my life. I thought that getting a job where I pay my own way, plus the fact that I’m renting my own flat, would be enough, but it isn’t. Last time I lost my credit card (which had only $20 on it), I wanted to go to the bank and fix the problem myself. But Mum and Dad ganged up on me, said I couldn’t do that, insisted on searching my flat for it on their own, and wouldn’t even trust me with my own passport (Mum made a Freudian slip about me ‘borrowing’ my passport) for fear I’d lose it. It’s mine, and I want to take that risk, but Mum and Dad seem to insist on never giving me the chance to fail, no matter how much I need it to grow.

    I’m 24 now, and I’ve said I want to be a full adult by 25, which isn’t unreasonable no matter how autistic I am. But Mum and Dad keep insisting it’s not possible.

    People I phone on mental health lines keep telling me to negotiate with my parents, but I just can’t do it. It’s a struggle just to achieve even one thing (converting from atheism, etc.). Then another issue comes up, my parents seem to have ignored the lesson from last time that I can handle it myself, and the same thing happens all over again.

    This is particularly frustrating because my lack of self-confidence keeps me from getting a girlfriend. My parents chide me over it, but I ask how I’m supposed to be self-confident when I know I can’t stand up to them over dropping out of university, over them making me do it half-time, over them forbidding me from dating for years, etc. This makes it very, very hard to keep my cool around them.

    • Jason K. says:

      Addressing something like this isn’t going to be easy. I am going to start from the assumption that this is simply a case of overprotectiveness.

      Do you have a cell phone? If not, can you get one?

      If so, the tack I would try would be to tell them (essentially):

      ‘Mom, Dad, I love you and know that you want to protect me, but you aren’t going to be around forever. I need to learn how to function on my own and your constantly trying to solve every problem for me robs me of that. I need to learn how to manage on my own, both for your sake and mine. I promise to ask for help if I think I need it and you are welcome to offer advice, but I need you to not interfere unless I request it.’

      Now, a word of warning: Some people are of the sort that they may flip hard when told something like this. I don’t know enough about your parents to make a judgement call as to their behavior.

      If you have tried this, or they continue to interfere, it may be that you just have to not tell them about any problems unless you want them to interfere.

      • Carinthium says:

        I’ve tried this, and it does sound like good general advice. The problem is that Mum and Dad’s counter-strategy is: yes, I very well should be an adult, but I’m just not ready yet, but I will be before I’m thirty, they’re sure of it.

    • Outis says:

      Some more information would be helpful. How did you end up in this situation? When you say “autistic”, do you mean you have actual autism or Internet autism? Apart from your passport, what direct control do your parents have over your practical life?

      • Carinthium says:

        Back when the diagnosis was still called Asperger’s Syndrome, I was formally diagnosed with it. I think that when I was young my parents got used to the idea that I needed to be protected and looked after. At 18 I assumed I was going to be an adult, but they claimed I misunderstood how life worked and that, since I was an Aspie, they were going to look after me.

        My parents don’t have direct control, but I have reason to be afraid of them. Mum and Dad think they have the right to stop me from using prostitutes, and the last time that came up Mum threatened to hit me. I don’t have much in the way of technical skills to fix my own things should they break, so I need Dad for that. They also have an irrational drive to make me go to university (and I don’t understand why the people at uni who know my parents are basically forcing me to go don’t have a problem with it).

        Yes, I know it’s a bit pathetic, but for years my parents and everyone around me told me this was normal (even the professional psychologist I went to said I had to negotiate with my parents for freedom, and didn’t see a problem with them ordering me, as if I were a child, to go to university part-time). Thus I lack a lot of the life skills I need, and I never learned to stand up to them well enough.
        (And yes, I considered suing him, but Richard Eisenmeier is very prestigious in matters of Asperger’s, so I fear it wouldn’t work.)

        (EDIT: Finally, it really didn’t help that for years, every time I tried to achieve something on my own toward independence, my parents would at the bare minimum strongly discourage it. It’s hard enough to learn new skills on my own without that kind of pressure.)

        • shrubshrubshrub says:

          Can you go no-contact, or move to a different city? Maybe see if you can go on exchange somewhere for a semester?

          I get that this would be difficult to do, but given the pattern of consistently ignoring your boundaries I have a hard time believing mere words will be enough to change their behaviour.

          • Carinthium says:

            Right now I’d love to go no-contact, so I thought about that. The problem is that right now I don’t even have the skills to arrange an airport trip or accommodation.

            I’ll try to get past that hurdle on my own, then maybe I’ll try it.

        • Outis says:

          1) I can tell from your accent that you are from Australia, and I understand that prostitution is not illegal there. However, it is not something you discuss with your own mother. You should not go with prostitutes until you understand and internalize why that is.
          By the way, if you do have sex with a prostitute, it will not give you self-confidence or get you over “that obstacle”. The obstacle is you. And sex does not actually feel all that good (although I guess it’s possible that drugs fucked me up). The main thing is to feel like you are desirable as a man, and obviously prostitution won’t help with that.

          2) If you break your things, there are professionals that can fix them for you. Or maybe a friend or coworker.

          3) What are you going to university for? What are you working as? Why don’t you want to go to university?

          4) Try to make steady progress in improving your life skills. Prioritize them by how fucked you would be if your parents stopped helping you with them. Protip: getting a girlfriend is not even close to the head of that queue. Yes, I know Maslow put sex at the base of his pyramid, together with food and drink – that’s because he was a quack.

          • Carinthium says:

            1: I need to discuss it with her. I know this isn’t normal, but I’ve internalized a fear response around doing things my parents don’t like. If I don’t confront them and say openly what I’m doing, I’ll never get over my fear.

            2: I don’t have any friends with good enough technical skills; I don’t get out much socially. I could try to look, I suppose.

            3: Originally, I did want to go to university. My father is a high-ranking politician, and most of the people I know are very high flyers (a famous lawyer, a VCAT judge, a prestigious doctor, etc.). For a while I was in denial about my own capacities and thought I could equal Dad in life, then eventually discovered I couldn’t. I was doing an Arts degree with the intent to get into law.

            When I realized I couldn’t, I wanted to drop out of university and do something else. This is where Mum, Dad, and my psychologist came in and started telling me that them forcing me to go to university was normal.

            4: The question is how to speed up the process. My parents have fucked me up a lot (not letting me use the trams for a long time, not letting me date, etc.). I want to get back on track so by the time I’m 25 I can start living a normal life.

          • James says:

            1: I need to discuss it with her. I know this isn’t normal, but I’ve internalized a fear response around doing things my parents don’t like. If I don’t confront them and say openly what I’m doing, I’ll never get over my fear.

            But consider making an exception for this activity. “Getting over your fear” is only a means to an end, right? If you can find other ways of doing those things, that’s just as good. So consider doing this one in secret and, if necessary, using some other thing your parents don’t want you to do to get over that fear.

            I think most parents would probably have a (pretty strong) aversive reaction to the idea of their child seeing a prostitute, for reasons that are complicated. A reaction strong enough (and for reasons so complicated) that it’s probably best just to avoid the whole hornet’s nest by keeping it from them. (Outis’ other point about why it still might not be as good as you hope also stands, though.)

            I want to get back on track so by the time I’m 25 I can start living a normal life.

            This is a worthwhile project and you should work on it as much and as fast as you can, but having a fixed deadline (“by the time I’m 25”) might not be the most helpful thing for you here, primarily because “living a normal life” is something you’re likely to approach asymptotically rather than clearly and definitely arrive at.

          • CatCube says:

            @Carinthium

            I can’t help on any of the boundary issues, but I want to echo James on not setting a fixed timeline.

            IIRC, you’ve said you’re 24. Expecting to solve all these issues in less than a year is extremely unrealistic. Not only are you likely to disappoint yourself, but trying to force it that hard will probably make the situation worse.

            It took the rest of us a lot of time to grow up. It sucks that your parents took that time from you, but you won’t do yourself any good by going too fast in the other direction.

          • keranih says:

            What Catcube and James said. Take your time.

            I don’t know about you, but I distinctly remember one summer when I was 8-10, and was watching adults deal with a crisis in a pretty competent manner, and thinking to myself I can not WAIT until I am grown up and know all the answers and what to do when things go wrong.

            It took me most of another decade to realize that no, actually, adults don’t know all the answers either. Decades later, and I’m still faking the hell out of it with everyone else.

            A slightly different take – I used to get into screaming fights with my mother because she would deliberately not tell me about things (usually family health things, like my grandmother going into open heart surgery) “because I thought you’d get too upset and I didn’t want to tell you until it was over.”

            It took some pretty severe head-butting for us to reach a compromise: when she told me something *even a little bit early*, I would take a deep breath and say “thank you for telling me about this. Please keep me updated,” and then go off and have my freakout elsewhere. Eventually we got to the point where now I have to remind her that she’s already told me about Cousin Bob’s car accident four times.

            So my suggestion is to try for a compromise with your parents: you will tell them about issues (like the lost card), but on the strict condition that you get at least one free try to fix them without their interference, even if they think it will cost you money or won’t work. On your side, you have to tell them, and you will have to own up if your attempt to fix it doesn’t work and go back to them for help.

            I hope this helps. Good luck.

          • Outis says:

            Carinthium: your father is a high-ranking politician?! Then why are you even thinking of going with a prostitute? Do you want a scandal?

          • Airgap says:

            That’d show him for interfering, right?

    • Stefan Drinic says:

      I, too, was diagnosed with Asperger’s. The difference is that this happened as late as last year, when I was twenty-one, so for better or for worse my parents never quite managed to develop much of a controlling streak.

      That said..

      Your post reads as if you’re saying ‘doing this thing I want to do is hard; can it be easy?’

      It just can’t, and I feel like somebody should be telling you that.

      If I’m not reading the situation terribly wrong, your parents simply view you as incapable of functioning independently and aren’t going to be swayed from that view easily. What do you expect to do about that? Have a dramatic scene as if this were a movie, three people crying and resolving their differences? If there were an easy path to swaying their minds, I’m sure you’d be smart enough to have taken it by now. The only option beyond asking, or manipulating things into going well, is making things go your way.

      If your parents can’t respect your boundaries regarding your apartment, stop allowing them in/take away their keys. If they can’t let you make decisions on your own, stop consulting them, and break off any conversation where they try to do so anyway. It isn’t easy, but it is necessary. If you want to prove you can be independent, you need to act like it.

      My own mother has on occasion tried the same sorts of manipulative tactics on me, if thankfully to a less extreme degree, and there has to be a point where you stand your ground and refuse to play ball. For me, it was when I was twenty and she’d decided that, after a number of years in which I’d occasionally cooked our family dinners, I had become incapable of cooking rice (not hyperbole). You seem to have gone past that point. It is, in a twisted, upside-down sort of way, the reverse of a parent taking away their kid’s phone and games and sending them up to their room until they stop misbehaving, but it’s the one thing I’ve personally seen work.

      Take this with all the grains of salt you otherwise would from an internet commenter, of course.

      • What Stefan said. If they can’t behave, don’t let them in.

        They will pitch a fit and try to force you. You may feel very distressed in the process–most people want to be on good terms with their parents, after all. But there’s a good chance your parents can learn, eventually.

        Do you have any close friends (preferably IRL) you can bounce ideas off of/go to for help/advice? Such a friend can be invaluable for those moments when you’re not sure, “hey, is this thing they’re saying totally unreasonable, or is it just me?”

        Good luck.

      • Carinthium says:

        I’ve tried to do this before. The problem is that I just don’t have the willpower. People say you can just do these things, but willpower is finite. I have nobody whatsoever to support me in person, while my parents have large numbers of friends and neighbors to back them up (and tell me I have to go to university). Even many of the help lines I call tell me to negotiate, even when I mention the time my mother threatened to hit me. I can’t do this alone.

        At first, all my sources of information were telling me that it was normal to live at home and not be independent, and that I had to go to university, even when I desperately wanted to move out. After years of being told that, it’s hard to get past it.

        Mum and Dad consistently say they will respect my boundaries, then the next time a conflict comes up they ignore them and forget what they said. They cite reasons why what I’m doing is hard in order to intimidate me, and I can’t stand up to them and press through genuinely hard difficulties at the same time.

        • Vitor says:

          Well, have you considered going to the police? If I understand correctly, you are legally an adult. Your mother can’t withhold your passport, it belongs to you. That’s theft, maybe even coercion and blackmail. A serious crime is being committed against you. Internalize those words.

          Once you manage to get the police involved, all the emotional manipulation in the world won’t make them go away. The cold hard facts are on your side. The tricky bit is getting them to take you seriously. Don’t tell them that your mother won’t give you your passport; tell them that it was stolen, and when they ask for details, fill them in, vaguely but truthfully. Provide details only when asked. That way, the police are already involved before any kind of prejudice about your autism, or about it “just” being a family squabble, has a chance to make them dismiss you. Maybe wait for an opportunity where the situation is very blatant and clear-cut before you act.

          If it comes down to a he-said-she-said scenario, it’s hard for the police to prove anything and take action, but at least you can get new documents issued and lock them away. If all of this seems too daunting, hire a lawyer, if possible, to help you take the right steps and say the right things. A lawyer is literally paid to be on your side. And again, remember: a crime is being committed here, no matter what pretty words are used to justify it, so getting a lawyer involved is in no way overblown or dramatic. It’s the thing reasonable people do when others infringe on their rights.

          Same basic idea with your apartment keys. Ask nicely for your keys back. If your parents won’t comply, change your locks, report it as theft, do whatever it takes to bar them from entering your place against your will.

          Once you have those two things taken care of, it should be much easier to gradually distance yourself from them, to make your own choices without asking for permission or approval.