Non-Dual Awareness

Seen on Lauren’s Facebook: How Does Academia Resemble A Drug Gang?

Their answer is that both academia and drug gangs are marked by an endless supply of foot soldiers willing to work in terrible conditions for a small chance at living the good life. In drug gangs, the average street-corner dealer makes $3-something an hour; given that he’s got a high chance of being arrested or shot, why doesn’t he switch to McDonalds instead, where the pay’s twice as good and the environment’s a lot safer? The article suggests one reason is that drug gangs offer the chance of eventually becoming a drug kingpin who is drowning in money.

(I’d worry they’re exaggerating the importance of this factor compared to others, like wanting to maintain street cred, or McDonalds jobs being much more regimented in both the application process and performance expectations, but they’re the ones who have talked to anthropologists embedded in drug gangs, not me.)

Academia has the same structure. TAs and grad students work in unpleasant conditions for much less than they could make in industry, because there’s always the chance they could become a tenured professor who gets to live the life of the mind and travel to conferences in far-off countries and get summer vacations off.

The article describes this structure as “dualization” – a field that separates neatly into a binary classification of winners and losers.

This concept applies much more broadly than just drugs and colleges. I sometimes compare my own career path, medicine, to that of my friends in computer programming. Medicine is very clearly dual – of the millions of pre-med students, some become doctors and at that moment have an almost-guaranteed good career, others can’t make it to that MD and have no place whatsoever in the industry. Computer science is very clearly non-dual; if you’re a crappy programmer, you’ll get a crappy job at a crappy company; if you’re a slightly better programmer, you’ll get a slightly better job at a slightly better company; if you’re a great programmer, you’ll get a great job at a great company (ideally). There’s no single bottleneck in computer programming where if you pass you’re set for life but if you fail you might as well find some other career path.

My first instinct is to think of non-dualized fields as healthy and dualized fields as messed up, for a couple of reasons.

First, in the dualized fields, you’re putting in a lot more risk. Sometimes this risk is handled well. For example, in medicine, most pre-med students don’t make it to doctor, but the bottleneck is early – acceptance to medical school. That means they fail fast and can start making alternate career plans. All they’ve lost is whatever time they put into taking pre-med classes in college. In Britain and Ireland, the system’s even better – you apply to med school right out of high school, so if you don’t get in you’ve got your whole college career to pivot to a focus on English or Engineering or whatever. But other fields handle this risk less well. For example, as I understand Law, you go to law school, and if all goes well a big firm offers to hire you around the time you graduate. If no big firm offers to hire you, your options are more limited. Problem is, you’ve sunk three years of your life and a lot of debt into learning that you’re not wanted. So the cost of dualization is littering the streets with the corpses of people who invested a lot of their resources into trying for the higher tier but never made it.

Second, dualized fields offer an inherent opportunity for oppression. We all know the stories of the adjunct professors shuttling between two or three colleges and barely scraping by on food stamps despite being very intelligent people who could succeed in high-paying industries. Likewise, medical residents can be worked 80 hour weeks, and I’ve heard that beginning lawyers have it little better. Because your entire career is concentrated on the hope of making it into the higher tier, and the idea of not making it into the higher tier is too horrible to contemplate, and your superiors control whether you will make it into the higher tier or not, you will do whatever the heck your superiors say. A computer programmer who was asked to work 80 hour weeks could just say “thanks but no thanks” and find another company with saner policies.

(except in startups, but those bear a lot of the hallmarks of a dualized field with binary outcomes, including the promise of massive wealth for success)

Third, dualized fields are a lot more likely to become politicized. The limited high-tier positions are seen as spoils to be distributed, in contrast to the non-dual fields where good jobs are seen as opportunities to attract the most useful and skilled people. This reminds me of the other article I read today comparing academia to drug gangs, which was where Paul Krugman theorized that the reason so many criminals have horrible tattoos in inappropriate places is that they serve as a conspicuous symbol of criminality! He says that since these people’s tattoos mean they can never get a job in legitimate industry, other gang members and black market contacts can trust them to keep their bargains, since they’ve got no option other than continuing to work in the criminal underworld. Krugman writes (h/t Nathaniel Bechhofer):

The author, Diego Gambetta, adds a wonderful parallel: according to his account, Italian academics, who do a lot of horse-trading in appointments etc., cultivate a reputation for incompetence at actual research, again designed to reassure those with whom one deals.

“Being incompetent and displaying it,” he writes, “conveys the message I will not run away, for I have no strong legs to run anywhere else. In a corrupt academic market, being good at and interested in one’s own research, by contrast, signals a potential for a career independent of corrupt reciprocity. In the Italian academic world, the kakistocrats are those who best assure others by displaying, through lack of competence and lack of interest in research, that they will comply with the pacts.”

(wait, this argument sounds kind of familiar. KRUGMAN, HAVE THEY GOTTEN TO YOU TOO?!)


What dualizes some fields but not others?

Originally I was going to make a simplistic comment about licensing and regulation, but this doesn’t exactly capture it. Certainly the fact that medicine requires an MD has some effect on the dualization of medicine (alternatively, insofar as medicine isn’t entirely dualized, it’s because you can get less lucrative positions, like naturopath or therapist or nurse practitioner, without an MD). But we can imagine a system in which there were more than enough medical schools for everyone, anyone who applied to one got in, there was a glut of doctors, and the good doctors got good jobs and the less good doctors got less good jobs. So we might more soberly blame it on scarce licenses – for complicated reasons I won’t get into here, the number of residency spots is much lower than it should be, leading to a bottleneck where only a limited number of people can make it through to practicing medicine.

What about tenure? We can imagine an alternate universe where academia is populated with various PhDs on equal footing. Since there would be a glut, their salaries would be very low to start, but low salaries would mean easy employment, and colleges would find a lot of room for them to do one-on-one tutoring, or low-level research, or something like that. Eventually some of them would become a bit more prestigious in their fields and could demand higher salaries from hiring institutions, and a few superstars like Nobel Prize winners and the like could demand millions. At no point would there ever be anything called a “tenure track”. It seems like the main difference between this universe and our own is that tradition plus the reasonable desire of professors to be free from political interference has created this dichotomous variable called “tenure” and caused it to replace the continuous variable of salary as the prize for success. In favor of that theory, top professors seem weirdly underpaid compared to eg top athletes or top artists, even though I would expect having one of the world’s top scientists or historians to be a big draw for a school. According to the List Of Highest Paid Professors, only five professors in the US make more than a million dollars a year, and all of those are professors of lucrative medical subspecialties or of finance, who presumably are being paid that much to compensate them for teaching instead of participating in the high-paying professions they are otherwise qualified for.

What about drug dealers? I think there might be “licensing” at work here too. There’s no such thing as a mid-level independent drug dealer, because – if the three seasons I’ve watched of Breaking Bad are accurate – if you try this, the other drug dealers will shoot you. So you need a scarce “license” from the drug lords – basically El Chapo giving you the rights to a big piece of “turf” – instead of a license from the government. Whatever; I’m liberal Monday Wednesday Friday and libertarian Tuesdays and Thursdays; today is a Tuesday so all organizations that rely upon the use of force look the same to me.

But what about lawyers? Sure, there are regulations on who can practice, but the dualization in the legal field comes after graduation from law school. Here, have some statistics: [chart omitted: the distribution of starting lawyer salaries, a broad plateau on the left and a very sharp spike on the right]

The first step of what’s going on isn’t a mystery – the people on the very sharp mountain on the right are hired by big law firms on the “partner track” (note the similarity to “tenure-track”) and the people in the more gradual plateau on the left are everyone else. But there’s still a lot to be explained. Why isn’t there a law firm that hires people almost as good as the people on the mountain, for $100,000? This article suggests that “Not paying the standard top-tier salary [of $160,000] is a tacit admission that you’re no longer top-tier”, but you could say that about any industry where quality isn’t 100% obvious. How come chefs don’t have a salary graph that looks like that? How come engineers don’t?

It seems possible that top law firms act as a de facto licensing system – picking out a couple of excellent young law school grads as Officially Excellent – and that sufficiently big corporations then refuse to use anyone except those. But once again, I don’t know why law would develop this structure and other professions wouldn’t.

So if I had to figure out what all of these have in common, it would be an idea of privilege. Some people get guaranteed an unexpected privilege over and above the continuous measure of salary. The people who have to subsidize this privilege resent it and try to limit access to it. People start competing for scarce access to the privilege instead of having normal competitions for salaries, benefits, working conditions, et cetera, and all of those other things go out the window.

This is interesting because of how well it maps on to some other issues. For example, minimum wage creates a dualized system between workers and the unemployed. If there were no minimum wage, we would expect a sort-of-continuous wage distribution from $0.01 an hour all the way up to whatever Taylor Swift makes for an hour’s performance. Instead, we guarantee everyone the privilege of $15 per hour. Employers resent this and (in theory) try to limit access to the privilege by lowering workforce, automating, etc, as much as possible. This creates a dualized system with an upper tier (employees with high wages) and a lower tier (unemployed with nothing at all).

Or how about benefits? If there were no benefits, we’d expect a more continuous spectrum of people working 40 hour weeks, 30 hour weeks, 20 hour weeks, and so on. Instead, we guarantee everyone who works X hours the privilege of good health care. Employers resent this and try to limit access to the privilege by hiring people to work X – 1 hours per week, or hiring independent contractors, or so on. This creates a dualized system with an upper tier (real employees) and a lower tier (people working 29.999 hours a week or whatever who don’t quite qualify for the benefits).

If you really want to stretch it, think about urban growth. If there were no zoning or regulation, desirable cities would have a continuous distribution from rich people living in nice mansions with lots of surrounding green land to poor people in apartment projects. Instead, we guarantee people living there certain privileges like “never having their view blocked” and “never having to worry about congestion”. This creates a two-tier system of current residents with the privileges, and non-residents who can’t live in desirable cities at all.

This raises a question of – assuming we want to give people privileges – or assuming we’re political realists who understand it’s going to happen anyway – are there ways to do it with a minimum of dualizing? It seems possible to imagine some solutions along those lines – for example, instead of mandating full health care for people who work more than 30 hours per week, we could seek systems where companies give health benefits that scale up with the number of hours worked. Instead of giving tenure, we could seek systems where it becomes progressively harder to fire academics the longer they’ve worked for you.
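The scaling idea can be sketched with toy numbers. All the figures below (the wage, the benefit cost, the 40-hour pro-rating) are assumptions of mine for illustration, not anything from an actual proposal:

```python
# Toy comparison of a benefits "cliff" (full health benefits at 30+
# hours/week) versus benefits that accrue pro rata per hour worked.
# All numbers here are made up for illustration.

WAGE = 15.0            # hourly wage (hypothetical)
BENEFIT_COST = 120.0   # weekly cost of full health benefits (hypothetical)
THRESHOLD = 30         # hours/week that trigger full benefits

def weekly_cost_cliff(hours):
    """Employer's weekly cost: wages plus an all-or-nothing benefit."""
    return hours * WAGE + (BENEFIT_COST if hours >= THRESHOLD else 0.0)

def weekly_cost_scaled(hours):
    """Employer's weekly cost: wages plus benefits scaled to hours worked."""
    return hours * WAGE + BENEFIT_COST * hours / 40.0

# Under the cliff, the 30th hour costs $135; under scaling, it costs $18.
print(weekly_cost_cliff(30) - weekly_cost_cliff(29))    # 135.0
print(weekly_cost_scaled(30) - weekly_cost_scaled(29))  # 18.0
```

With the cliff, the marginal cost of the 30th hour jumps by the entire benefit cost, which is why scheduling people at 29.999 hours becomes attractive; with scaling, every hour carries the same small increment, so there is no magic number to stop just short of.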

Other cases seem harder – you can’t give half of a medical license to a doctor who finishes two years of med school, and the idea of a half a minimum wage defeats the whole point.


617 Responses to Non-Dual Awareness

  1. Professor Frink says:

    I think the long term goal of Obamacare is decoupling health care from employment, which is similar to your non-dualizing proposal.

    • Partisan says:

      I’ve heard other people make this claim, and I don’t quite understand it. The employer mandate directly works against this goal, and the individual mandate indirectly does.

      Indexing the cadillac plans to inflation rather than the growth of health care costs would certainly work toward the goal in the long run – is that what you mean? But wasn’t another goal of PPACA to “bend the cost curve” to make those two growth rates similar? And isn’t there a non-trivial chance that the tax on generous plans gets postponed indefinitely, a la the “Medicare Sustainable Growth Rate?”

      (I don’t want to get into a “is PPACA good or bad” thing, really – I just am curious how something that seems to really reinforce the employment-health insurance link is somehow working against it)

      • Professor Frink says:

There are incentives to move employees from employer based group plans to defined-cost subsidies on the individual exchanges. That’s why companies with large pension obligations (IBM, Time Warner, etc.) are moving their retiree health care to the individual exchanges, and a lot of companies are phasing out group plans in favor of subsidies (Walgreens, etc.).

    • onyomi says:

      Or they could just eliminate the tax subsidy for employer-provided healthcare plans. But that would make too much sense to be politically viable, I guess.

      • orangecat says:

        That was basically McCain’s plan, which Obama and the Democrats immediately demagogued as “taking away your health insurance”.

        • onyomi says:

          It’s definitely one of those “takes 3 minutes to explain why it’s a good idea but only 10 seconds to demonize” things which make politics so miserable.

          • Pku says:

I can see the benefits, but on the other hand, it still seems like you’d end up leaving a lot of people without health insurance (which people exactly depends on what restrictions insurance providers have on setting costs). Now if you did that while also offering a government plan and making health insurance mandatory, it seems like it could work (combining the best parts of both Obamacare and McCain’s plan).
            (Are there any obvious reasons this would be a terrible idea that I’m missing?)

          • Richard Gadsden says:

            Can’t reply directly to Pku, but I think they just described the most-common European healthcare system, as used in e.g. France, Germany, Netherlands.

          • Edward Scizorhands says:

            McCain’s plan was to replace the tax subsidy with a flat tax credit if you had insurance. And, for obvious implementation reasons like reducing fraud, the tax subsidy would be paid directly to the insurance company and knocked off your bill.

Obama’s ad campaign described this as “he wants to give your tax money directly to insurance companies.” (It was one of those ‘big words on white background’ ads, and I couldn’t quickly find it on YouTube, but someone more patient may be able to, thus fact-checking me.)

          • Edward Scizorhands says:

            G** f**** d****t. It’s even worse than I remember.

          • onyomi says:

            It’s generally forgotten amidst all the feel-goodiness, but Obama ran pretty nasty, disingenuous campaigns, especially the second time. I really wanted him to lose the second time not so much because I liked Romney, but because I thought he deserved to lose. Don’t believe me? Watch the debate between Biden and Ryan. It even made some of my progressive friends uncomfortable. And no, Romney and Ryan were not equally bad. Just boring and milquetoast.

        • shemtealeaf says:

I’ve gained new respect for McCain. Is anyone proposing something like this these days?

      • Creutzer says:

        Someone please explain to an economically illiterate non-American what the consequences of this would be.

        • Airgap says:

          When you subsidize stuff, you usually get more of it. So when you stop you get less. You also have more tax money to blow on some other retarded idea. For example, you can bomb foreign countries until their citizens are no longer able to ask us economics questions over the internet because they’re too busy fighting over scraps of food and dying of radiation poisoning. Coincidentally, this is the main plank of the Airgap for President foreign policy platform. Securing our Comment Sections…For America!

          • Publius Varinius says:

            > When you subsidize stuff, you usually get more of it. So when you stop you get less.

            I guess by the word “someone”, Creutzer meant “someone who is economically literate”.

          • Airgap says:

            Is it too late for you to edit this post and replace snark with counterexamples?

          • Rick Hull says:

            Airgap > If we arbitrarily decide to give non-Capricorns a health insurance discount, how does the original problem get worse? There was some mention of a ‘cycle.’ Can anyone describe it? It’s not obvious to me.

            This is a flawed analogy because people cannot choose or change their Zodiac sign while they can change their employment status. But let’s assume people can change their Zodiac sign.

            Already, it’s much more preferable to be a non-Capricorn. If you make it even nicer to be a non-Capricorn, then Capricorns will have more incentive to switch to the favored group. Then the Capricorn group is populated with idiots and ne’er-do-wells, who couldn’t be arsed to change their sign.

            We can make it slightly more analogous by guarding the zodiac sign change process with a formal application, health evaluation, and limiting the number of permits. In that scenario, the Capricorn pool gets noticeably less healthy and insurance correspondingly more expensive.

            Clearly dualization is exacerbated.

          • Airgap says:

            > This is a flawed analogy because people cannot choose or change their Zodiac sign while they can change their employment status.

            Then don’t continue to use it because it’s just confusing. I used Capricorn as an example because I was asserting that for the purpose of the precise question at issue, the ability to change status was not relevant.

            It’s worth noting that just because “Capricorn” is a predicate identifying a relevantly-different group doesn’t mean it’s an insurance pool. Maybe the Capricorns’ years of oppression by the Geminis was sufficient to establish them as a protected category, and charging them higher premiums is a crime.

        • bluto says:

          By having taxes subsidize health insurance only when purchased via employment, you create a situation where essentially many of those buying insurance outside of employment is highly likely to be a costly insured, which means the prices for health insurance outside of employment are very, very high which creates a cycle and broken market.

          • Airgap says:

            Why is the price difference for health insurance outside of employment likely to be higher than the amount of the tax subsidy?

          • Edward Scizorhands says:

            Because buying in employment is so preferential, so anyone buying on the individual market is Weird.

          • Airgap says:

            Would you care to amplify this?

          • Jaskologist says:

            The population of people who are unemployed is specially selected for people with a whole lot of problems, many of them health-impacting.

          • Airgap says:

            Okay; I was thrown by the lack of a ‘because’ between ‘where’ and ‘essentially’ in bluto’s comment.

            Even so, it’s not clear that it works the way some posters make it sound. If Capricorns mysteriously have more health problems than other signs, their health insurance is going to cost more. If we arbitrarily decide to give non-Capricorns a health insurance discount, how does the original problem get worse? There was some mention of a ‘cycle.’ Can anyone describe it? It’s not obvious to me.

          • bluto says:

            From an insurer’s standpoint medical spending looks like this:
Charge 1,000 people $10,000 each in premiums ($10,000,000 in revenue).
The costliest 10 people will need about $1,700,000 in benefits.
The next 40 people (excluding the 10 above) will need about $2,550,000 in benefits.
The next 800 people will need about $4,250,000 in benefits.
The remaining 150 will spend nothing.
            I’m assuming 15% (in line with the ACA limits) for administrative costs and profits.

            Under an employer plan, all 1,000 people buy insurance and the government effectively pays about $2,500-$3,000 of their premium via tax breaks.

            When those same 1,000 people become unemployed, that group of 150 people who spend nothing are likely to include a decent number who are willing to take a chance that they won’t need an organ transplant, get cancer, or need really expensive viral drugs during their unemployment period and drop insurance (while not buying insurance would be an exceedingly foolish decision for the 50 people who account for half the spending–a payday loan for the COBRA premium would be economically rational).

            If the 150 people who spent nothing can self identify and forego insurance, the whole group breaks down as all the costs remain, but a meaningful fraction of the premiums are gone, so the insurance company raises prices. As a result, some of the 800 people in the middle look harder at their unemployed insurance costs and decide to forgo (while the prices still are a very good deal for those top 50 beneficiaries), so more of the healthy part of the pool drops out, and prices continue to rise, leading to more of the healthy part of the pool leaving and higher prices (the cycle I was referring to).

            By breaking the tie between employment and health insurance, there’s less incentive for the healthy to wait for employment to return which means insurance available outside of employment doesn’t become such a foolish bet for the insurance company and prices are closer to group averages (which means the pool of healthy people is again large enough to cover the small number of policies that will eat most of the cost).
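bluto’s cycle can be run as a toy simulation. The group sizes and benefit costs below follow the illustration above; the drop-out rule (anyone whose expected benefits fall below the premium opts out, which only the healthy can exploit because they can self-identify) is my own simplification:

```python
# Toy adverse-selection spiral using the numbers from the comment above.
# Each tuple is (number of people, expected annual benefits per person).
pool = [(10, 170_000.0), (40, 63_750.0), (800, 5_312.50), (150, 0.0)]

LOAD = 0.85  # insurer pays out 85% of premiums, keeping 15% for admin/profit

def premium(groups):
    """Break-even premium: expected benefits per enrollee, grossed up
    so that benefits come to 85% of premium income."""
    people = sum(n for n, _ in groups)
    benefits = sum(n * cost for n, cost in groups)
    return benefits / people / LOAD

p = premium(pool)
history = [round(p)]
while pool:
    # Everyone whose expected benefits fall short of the premium opts out.
    staying = [(n, cost) for n, cost in pool if cost >= p]
    if staying == pool:
        break  # stable pool: no one else wants to leave
    pool = staying
    if not pool:
        break  # the market has collapsed entirely
    p = premium(pool)
    history.append(round(p))

print(history)  # [10000, 100000, 200000]
print(pool)     # [] (even the sickest won't pay the final premium)
```

The premium goes from $10,000 to $100,000 to $200,000 as each healthier tier leaves, and then the market empties; that is the “cycle”, and why stable groups like employers (or a mandate) keep it from unraveling.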

          • Airgap says:


            I don’t really understand what’s going on. It’s like you got confused and thought I asked you to justify Obamacare or something. Your argument does not appear narrowly tailored to the question: Why does making health insurance cheaper for employers cause a greater difference in the cost to the employed and the unemployed than the amount of the subsidy?

            It’s common, but not universal, for employers to offer benefits as part of a compensation package, rather than allowing employees to buy health insurance through them at the subsidized rate as a perk. In the case of an employee in the typical situation, why does the price my employer has to pay being lower increase the likelihood that I will forgo purchasing health insurance when I’m between jobs?

            It would seem to mean that more employers would purchase health insurance for their employees, and that therefore more of the healthiest people would be insured more of the time. Which would lower the overall cost of insurance. It’s been noted previously that I’m an economic illiterate, but I would think this would mean a greater share of the 150 ubermensch would elect to purchase private insurance while between jobs. It’s possible I’ve missed something.

          • Creutzer says:

            Thank you, John Schilling. Now I see the point.

          • bluto says:

Because the market for insurance outside of employment is small, and the massive information asymmetry (most people can make a pretty educated guess about whether they’re highly likely to spend nothing next year, or to have 6-7 figures of medical costs) means the people who want insurance outside of employment are more likely to be in the very high cost group, which means the market becomes exclusively the high cost group (as all other groups leave).

            If there’s no choice, the information asymmetry stops being valuable (you can’t opt out in a good year or opt in for an expected bad year) so the price drops dramatically, because the insurer is getting the whole group all the time.

            It’s like a game where you agree to pay me $216 dollars if I roll 3 sixes under a cup, but I only pay to enter the bet after I’ve looked at 2 of the dice (and you don’t get to look at any) vs a game where you pay me $216 dollars for three sixes but neither of us can look under the cup til we agree on how much I’ll pay you to play.
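For what it’s worth, the arithmetic of the dice game checks out; the `fair_price` helper below is my own framing, not part of the comment:

```python
# Fair entry price for the $216 three-sixes game, as a function of how
# many dice the bettor peeks at (entering only if the peeked dice are sixes).
from fractions import Fraction

PAYOUT = 216

def fair_price(dice_peeked):
    """Fair price when `dice_peeked` dice are already known to be sixes."""
    p_win = Fraction(1, 6) ** (3 - dice_peeked)
    return PAYOUT * p_win

print(fair_price(0))  # 1  (no peeking: win probability 1/216)
print(fair_price(2))  # 36 (two sixes already seen: win probability 1/6)
```

Letting the better-informed side choose when to play multiplies the fair price 36-fold here; that asymmetry is the same one an insurance buyer who knows their own health enjoys over the insurer.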

          • Airgap says:


            One of us is having a very bad day, reading comprehension-wise, and I think it’s you. Did you miss the first paragraph of my last comment, or was your last comment written with it in mind?

            Say we force everyone to have health insurance all the time, but we give employers a tax break on health insurance because God loves honest labor and it’s possible to sell it to the public as “Keeping America Healthy” rather than “Tax Breaks for Big Corporations” even though it comes to the same thing in the end. You have been arguing that by having this tax break, we are driving up the cost of health insurance for the unemployed. This seems to me to be impossible, except in a trivial way (inflation or something).

            I think the most likely explanation is that you didn’t realize you were arguing for this point. In your mind, mandatory insurance and elimination of employer tax subsidies are tied together as part of a larger strategy to do good to mankind somehow, so much so that it didn’t occur to you to decouple them even when repeatedly asked to do so. Verily, Politics is the Mind-Killer.

          • bluto says:

If we force everyone to have insurance, there’s no difference in price, because forcing them takes away choice and the ability to use the buyer’s access to inside information, just like an employer’s decision for all their employees does.

My explanation has always been using only the assumptions from Creutzer’s question. The tax break strongly pushes people onto a market for insurance where the employer chooses whether a group buys or does not. In doing so, the removal of the individual’s choice (who has a lot of information about likely health care costs in the short run) changes the price far more than the value of the tax subsidy.

Removing the tax subsidy would require other institutions becoming the groups that decide, but most potential replacements do not change the way employment does.

          • Airgap says:

            @bluto: I give up. Feel free to reread my comments and take another stab at responding later though.

          • Anthony says:

            Why is the price difference for health insurance outside of employment likely to be higher than the amount of the tax subsidy?

            Because, as Bluto tries to explain, employer-based insurance groups are not selected (usually) for likeliness to need health care. So by insuring a large group of people, you can assume that they will be overall close to the average in terms of costs.

            However, so long as individuals have any choice in the health insurance market (even if they are required to buy insurance, if they have a choice of coverage levels this still works), those people who know they are likely to need more health care are those more likely to buy insurance (or the better coverage plans) on the individual market. So an individual seeking insurance on the individual market is a greater cost to the insurance company than the average person insured through their employer. This is true even if the medical risk profile of people not offered insurance through their employer is the same as that of people who are offered insurance through their employer.

          • Airgap says:

            > employer-based insurance groups are not selected (usually) for likeliness to need health care.

            I’ve grasped this point by now.

            > So by insuring a large group of people, you can assume that they will be overall close to the average in terms of costs.

            I keep asking for arguments against cheaper health care for employed people, and you keep giving me arguments for the individual mandate. Stop that.

          • RCF says:

To take over from Airgap, who has become frustrated at the conversation, IMO justifiably so:

Airgap has asked why, when comparing “everyone has to have insurance, and there’s a subsidy” to just “everyone has to have insurance”, the latter would result in higher prices for people not buying health insurance through their employers. Bluto then responded with a comment which compared “everyone has to have insurance, and there’s a subsidy” to “there’s no subsidy, and people don’t have to have insurance”. So Bluto really isn’t addressing Airgap’s question.

          • Adam says:

            I only read the last comment here by RCF, but isn’t the answer for why employer-purchased healthcare is cheaper the greater bargaining power of bulk purchasers?

          • Doctor Mist says:

            isn’t the answer for why employer-purchased healthcare is cheaper the greater bargaining power of bulk purchasers?

            Not entirely. Another reason is that the pool of employees usually includes lots of young, healthy people who don’t incur a lot of charges.

    • This would be the same law that penalizes corporations for not offering health insurance?

  2. Anonymous says:

    Did you have to post that graph on the evening of the first day of the bar exam? 🙂

  3. LTP says:

I’m not sure tenure is necessary at all, rather than just a wholly bad thing. Let’s be honest, academia is already plenty politicized. If a tenured professor says something sufficiently politically incorrect, or does something sufficiently wrong (even if not illegal), they will be “pressured” to step down or resign by administrators and the public. If a professor is viewed as a crank, they probably didn’t get tenure in the first place. These things happen on a semi regular basis. And, I’m not sure every single one of those cases is wholly unjustified either.

    And, really, how often has the protection of tenure been used for noble and pure intellectual purposes in recent years, anyway?

    • onyomi says:

      Tenure is about 5% protection of academic freedom and 95% compensation for the fact that top professors get paid so little.

      • Mary says:

        And how does it compensate? If it’s not protecting their academic freedom, it must be protecting something else, and they must deem this protection valuable.

        • onyomi says:

          Free time, low pressure, steady income, sabbaticals, get paid to talk about interesting things and go to conferences where you schmooze and eat out on your institution’s dime.

          • Mary says:

            What does that have to do with tenure?

          • onyomi says:

            It’s what tenure offers you?

            Arguably all academic jobs offer some of these benefits, but because of how hard it is to get good academic jobs, and the fact that one cannot save a ton of money on an academic salary, one does not feel secure until achieving tenure and must, therefore, spend all the free time writing books, articles, etc. until that happens. Therefore, you don’t get to enjoy free time, low pressure, retirement plan, security for the future, etc. until you get tenure.

            I don’t think the above account is at all unusual. Many academics have some degree of anxiety disorder, and if you didn’t have one when you entered grad school, there’s a decent chance you’ve developed one by the time you’re up for tenure. As the writer states, tenure for many is all about emotional security.

          • Mary says:

Then what you should have said is that it’s protection from having to do your job.

          • onyomi says:

            To put it bluntly, yes. Academia is weird: everyone supposedly goes into it because they’re passionate about the work and don’t want a boring office job, but as soon as they get there the ultimate goal is job security so strong you can’t be fired even if you hardly do anything.

I think the reality is that academia attracts people who don’t really like to work at all, but who like sitting around chatting about ideas while consuming wine and cheese. I certainly count myself among such (it’s why I’m posting here). But then, most people don’t really like to work; academics just found a socially respectable way to avoid it without getting bored.

          • onyomi says:

            To be a little less glib, I think part of what makes academia so subjectively stressful that people seem to desire extreme job security at the expense of pay is the creative nature of research.

It’s sort of like being a writer or artist where you have the constant pressure to think in innovative ways and produce work on your own (rather than in accordance with someone else’s direction), but unlike being a writer or artist, even the most successful academics usually don’t make much, if any, money off their work directly. Even if you write a book hailed as revolutionary in your field, it may sell a few thousand copies at most. It’s like being in the position JK Rowling was in, but even if you manage to write a Harry Potter, you get only prestige and no money.

            But what most universities are willing to offer their top researchers instead of big money is really strong job security. I think creative types feel very insecure on some level, because the creativity tends to dry up one day, and one never knows when that will be. If one can’t get a secure income on the basis of a few impressive projects it’s like being an NFL player who is only paid enough each year to make ends meet that particular year: come 40 you’re going to be regretting it.

          • RCF says:

Regarding the SMBC cartoon: but isn’t a professor’s putative job to teach, not to write articles? Isn’t it a bit odd that those two roles have been conflated?

          • onyomi says:

            On some level, yes, it is weird. Teaching and researching are two separate jobs, and one need not be good at one to be good at the other. That said, I personally find the two jobs to be mutually reinforcing: one never understands a subject nearly so well as when one is required to explain it in simple terms to others.

          • Troy says:

            That said, I personally find the two jobs to be mutually reinforcing: one never understands a subject nearly so well as when one is required to explain it in simple terms to others.

            In my experience there are good teachers who are not good researchers, but rarely are there good researchers who are not good teachers. And, in partial defense of our current system, this matches up with the jobs out there pretty well: there are many teaching-only (or mostly teaching-only) jobs, many teaching-and-research jobs, but few research-only jobs.

          • Mary says:

            Richard Feynman observed that having a task you actually had to do relieved the pressure to produce in terms of research.

            He observed this after a spell where he was completely dry, resolved that he wasn’t going to bother about it, and one day idly watched a frisbee in midair and got an idea.

            It helped that he taught even when he wasn’t bothering.

        • Anthony says:

          Job security is worth something. Knowing that you can count on your income stream for the next twenty years, that your employer can’t fire you and isn’t going to go out of business, provides a great deal of psychological security, and a decent amount of financial security as well.

          For example, if you have tenure (or its equivalent), it is easier to take the risk of buying a house you can barely afford, because you know you’re not going to end up out of work and scrambling to make the payments, or having to move and taking a loss when you sell the house. At the same income level, without the same job security, you should buy a less-expensive house, to leave yourself a little more cushion should you be fired, or transferred, or your employer goes out of business.

    • John Schilling says:

      And academia is about 95% leftist and 5% libertarian+conservative; get rid of tenure and that goes to 99% leftist and 1% falsely proclaiming to be leftist.

      Exaggeration, yes, but not by enough to make me eager to abolish tenure.

      • onyomi says:

        Why would abolishing tenure preferentially scare off conservative and/or libertarian academics? I would think libertarians would be all about making more money for more work (I’m assuming we replace tenure with a system which allows top professors to make more money to compensate them for the loss of absolute job security).

        • Mr. Eldritch says:

          [fires point-of-view gun at self]
          I believe he’s claiming that liberal control of academia is so great that libertarian/conservative professors are only allowed to exist because tenure means they’re impossible to fire. If tenure were abolished, professors would be fired as soon as they displayed wrongthink.

          • Steve Johnson says:

            Which neglects the selection effects – the 5% non-progressives (granting the hand-waving number purely for the sake of argument) in professorships aren’t people who converted after getting tenure – they’re professors who were able to hide non-progressive beliefs long enough to get tenure.

            It brings to mind Dalrymple’s quote about the purpose of communist propaganda – the professors who went through with ketman for all those years are now emasculated liars.

          • onyomi says:

            As a pre-tenured academic who keeps his libertarianish views pretty close to the chest in professional contexts myself, I do fear this may be true. I am definitely looking forward to being a bit more open about my libertarianism when I get tenure. If I thought I could *never* safely be open about my views while remaining in this profession it might indeed drive me to pick another field.

To what extent my paranoia is justified, I’m not sure. Most professors I know would never admit, either to others or even to themselves, that they would allow a person’s political views to color their evaluation of their scholarship, but I fear they may be unconsciously biased, especially in a world where vague notions like “fit” end up being important to employment and tenure decisions (i.e. “we have way too many qualified candidates competing for the position, so it will be decided on the basis of whom we’d like to have coffee with”).

          • Eli says:

            Of course, that involves believing that academia is nothing but a political, Tumblr-SJW-pseudo-left popularity contest, and that all those signs on the doors saying things like, “Physics”, “Cognitive psychology”, “Sculpture”, “Economics”, “Computer science”, “Mathematics”, “Oncology”, “Entomology”, and “Literature” are silly lies, as are the actual courses themselves.

            Plainly I was being fnorded when I sat through all those lectures about Turing machines, and the real purpose was to program Professor Barrington’s Dem-voting mild suburban liberalism into my social-democratic brain. It was that which turned me Marxist a year later, and not the experience of holding a job I hated.

          • Nornagest says:

            I always assumed that when people talk about academia’s leftist leanings in this context, they’re talking about those forms of academia which bear on temporal politics — an ever-expanding set, but one centered in the social sciences and those fields (also ever-expanding) influenced by critical theory. My (computer science) adviser was a pretty outspoken libertarian kind of guy who used to tell stories about going out on a mortar range with some Marines and recreationally blowing up random pieces of terrain, but I’ve never taken a class from, or even heard of, a practicing sociologist or political scientist to the right of Bernie Sanders. Many are far to his left.

            (I still haven’t figured out how Marxian archaeology differs from regular archaeology. Loved digging up shell middens, though, even if I was maybe doing it under a red flag.)

            This might be related to the Two Cultures thing. Economists seem to get a partial pass for some reason, at least on issues you can describe in economic terms.

          • onyomi says:

            “Of course, that involves believing that academia is nothing but a political, Tumblr-SJW-pseudo-left popularity contest…”

            No it doesn’t. It just implies that academics are overwhelmingly liberal, and that they may allow that bias to color their evaluations of conservative scholars’ scholarship, teaching, and/or “collegiality.”

        • John Schilling says:

Mr. Eldritch has got it. The conservative/libertarian professors I know are either A: dinosaurs from an age when the left’s dominance was less complete, B: people who learned to keep their mouths shut for about a decade and now enjoy being able to speak freely, and C: people who were liberal at 20 and conservative at 40. Absent tenure, some of them would still keep their positions in engineering and the better econ schools.

If the pendulum swings the other way, effect C no longer applies, and I’m not sure leftists are as good at keeping their mouths shut as conservatives, but there’s a generation or two of liberal dinosaurs in the pipeline, so as long as tenure prevents them from being purged we should be OK.

      • Why would this happen? It’s not as if schools don’t know who’s conservative when making decisions on whether to offer tenure or not. I suppose it allows for professors who happen to change their views on politics after receiving tenure, although my guess is that (at least in fields where politics may matter), this would be incredibly rare.

It’s easy to envision a model where tenure hurts conservatives. For instance, liberal schools would be willing to tolerate conservatives to a degree, but, knowing that once given tenure they can never be removed, are unwilling to grant tenure to conservatives. This in turn scares away talented conservatives from tenure-track careers and into industry instead.

        • Douglas Knight says:

          How would they know? I know a mathematician who was directly told by his advisor to keep his politics and devoutness hidden until he got tenure. (You might question the value of political diversity in mathematics. But at least it is possible.)

          • Pku says:

Do you have any details? I’m in math and I’m fairly liberal, but I still get bothered by how liberal American math departments feel (most of the people are decent enough that they’d probably avoid conscious discrimination, but still).
            One story that kind of annoyed me: last year we went to a conference, and one of our people said math departments needed to be more liberal, as evidenced by the fact that all seven of us going to the conference were male. I found this bizarre for several reasons, mainly that this (fairly brilliant) guy didn’t notice any of the obvious flaws with that argument.

          • Vaniver says:

            Pku, I am reminded of a thing I heard in a discussion of white people who adopted black children from Africa: “I had no idea how many of my friends were racists.”

            I mean, how would you know whether or not the people in your department would consciously discriminate against conservatives?

        • CatCube says:

          One of the bloggers on the old Volokh Conspiracy site (I can’t remember which one, but I think it was Juan non-Volokh) explicitly said he used a pseudonym because if the tenure committee found out his true political alignment he’d be blackballed.

        • Troy says:

          It’s not as if schools don’t know who’s conservative when making decisions on whether to offer tenure or not.

          They often don’t. Those of us with heterodox political views and no tenure don’t tend to advertise those views.

          • Walter says:

            Wise of you. Although it brings to mind the “Get Smart” episode where the entire gang was filled with law enforcement infiltrators. Any possibility that the people you are hiding your conservative beliefs from are, themselves, hiding their conservative beliefs from you?

          • Troy says:

            Any possibility that the people you are hiding your conservative beliefs from are, themselves, hiding their conservative beliefs from you?

            I often suspect that academics I know who do not discuss politics — either in real life or on Facebook — are conservatives (or moderates, or libertarians). But it’s hard to be sure, and it can be dangerous to risk “coming out” to someone if you’re not sure. Perhaps we need some kind of secret signal or password to communicate with each other.

I remember reading somewhere (can’t recall the exact source) that the only way one can get to do research in parapsychology is by having tenure. Apparently, parapsychology is such a “taboo” subject that one cannot get away with studying it otherwise. At least that’s what parapsychologists say. So without the protection of tenure we might not have had such groundbreaking studies as Daryl Bem’s paper on precognition. And yet, one does not hear (as far as I am aware) of tenured professors being pressured to step down for wasting time on what is widely derided as an empty subject that has earned little to no respect in the scientific community. Perhaps this is out of recognition of the need for a control group for science?

      (There may be elements of sarcasm in this comment.)

  4. DanielLC says:

    > you can’t give half of a medical license to a doctor who finishes two years of med school

    Why not? A medical license covers a lot of stuff. Why can’t you let them do just the easier stuff? Why can’t you just give them a grade, and let people decide how good of a doctor they need for a specific thing? Maybe more expensive insurance will pay for better doctors.

    • Douglas Knight says:

      You could design the curriculum so that the first half is a nursing degree and the second an add-on for physicians. But the current curriculum does not make sense if you truncate it: the first half is book learning and the second is hands-on. (Which is terrible for lots of reasons, not just truncation.) Also, nursing covers stuff not in medical school.

    • Oliver Cromwell says:

      It already happens anyway, called a nurse, or with even fewer restrictions, some kind of alternative medicine practitioner. Indeed it’s fully legal to offer absolute quackery with no training at all; what’s not legal is to offer actual medical care with non-approved competency testing, even totally official and above board testing that happened to take place in a different country. This is too perverse for an honest observer to see anything but price gouging at play.

    • Airgap says:

      I think in practice wannabe doctors who can’t hack it wash out into pharmacology.

    • Murphy says:

      Tradition, mostly.

      It’s still organized in a medieval fashion similar to old-style guilds with masters/journeymen/apprentices where you can only be a real doctor once you’ve reached the master level.

      It’s not in the interest of the people who are already doctors to reduce barriers to entry or to farm work legally reserved for masters out to other groups who haven’t had to make the same level of investment.

      • Cadie says:

        I’m not so sure. Having physician assistants of some kind who can handle the simple cases might seem to be a bad idea for full medical doctors, but a side effect of that system would be that doctors make more money. Because the assistant handles the easy patients with straightforward diagnoses and simple prescriptions, the physicians can fill their time with more complex cases that yield bigger bills to the patient or insurance company. So it would be somewhat harder work but the same number of hours and a bigger paycheck.

        • Vaniver says:

          the physicians can fill their time with more complex cases that yield bigger bills to the patient or insurance company. So it would be somewhat harder work but the same number of hours and a bigger paycheck.

Presumably for fewer doctors, since there are only so many complex cases.

        • Tom Womack says:

That seems to be a system that works until the first person turns up with leukemia presenting as persistent elbow pain, and their heirs sue the pants off you for having let the leukemia get to stage four while the physician assistant treated it with ibuprofen gel and physiotherapy.

          • Marc Whipple says:

            Same argument for lawyers vs. paralegals vs. that guy who hangs out in the courthouse pretending to be a lawyer who’s seen so many cases adjudicated he could probably beat me in a criminal case.

However, AI is going to make a right hash of that for both medicine and law. WebMD already knows that persistent elbow pain could indicate leukemia; it’s just not as good as humans at ruling it out. Give it a few more years.

          • Murphy says:

            Some of the medical expert systems are actually quite remarkable already and can regularly beat human experts but they face significant legal and regulatory barriers.

Because of history there are rules that protect doctors if the opinions they give are defensible and reasonable, but expert systems don’t have the same protection. If an AI makes a mistake, you can be sure the company is going to get sued no matter how reasonable the mistake.

          • Anthony says:

            This sounds like an opportunity for a doctor in private practice to buy/rent one of those expert systems, and save himself a bunch of work, while still getting to bill for the work he’s taking liability for.

  5. Douglas Knight says:

    Look at the expansion of residencies in medicine. I think they are longer than they used to be. But they are also trying to expand to more people. It used to be that they were for specialists. But then they invented the Internist, a GP with a residency certifying superiority. But now they’re trying to make GPs get residencies.

    • Scott Alexander says:

I’m not sure what you’re talking about. In the US, there’s no such specialty as “GP”. All doctors have to do residency. Doctors in internal medicine and family medicine are the closest equivalents to what European countries call GP, but those are both residencies. In Britain and Ireland, AFAIK, GPs do have to complete some specialized post-medical-school training, but there’s nothing in Europe that exactly corresponds to the idea of a “residency”.

      • Douglas Knight says:

        It varies by state. You can take step 3 right out of medical school, but not all states will let you practice without a 1 year internship.

        A lot of GPs do 3 years of residency, just as much as FPs or Internists, but with no board certification to show for it.

        • Scott Alexander says:

          I’ve never heard of this. If it’s true, it’s certainly a well-kept secret.

        • Devilbunny says:

          I’m going to echo Scott’s suspicion, and add that my limited experience with family practice residencies (my medical school had one, but its residents did their hospital training at an unaffiliated hospital, so we never saw them after internship) is that they tend to be bottom-heavy.

          If you look at people who want to practice in a more rural area, there is essentially no advantage to finishing residency and being board certified, so they tend to get hired away after internship (or after two years of residency).

          At least in my state, the major exception is for those who did not graduate from an American or Canadian medical school. They are required to complete a residency of at least three years in order to get an unrestricted license to practice medicine, so they’re in it for the long haul.

          As far as I know, no state allows those who have not completed an internship to practice medicine unless they were grandfathered in.

      • Shenpen says:

        Wow. So without a GP who does the kind of first screening, telling people to go see a specialist in the hospital vs. take two aspirins and fuck off? Also, who does the basic stuff like injecting vaccines, putting people on sick days when they got a cold / flu so that they don’t spread their germs in the office, or renewing the painkiller prescriptions of the chronically ill?

I think the GP system is very cost-efficient primarily for the screening, so that we don’t have a lung specialist checking everybody who just coughs because he has a cold or flu.

        • youzicha says:

The US still has primary care doctors who do screening, refer people to specialists, and all the other things you mention. But (as I understand Scott) those doctors have had the same number of years of training as other doctors; they are just trained in “family medicine” instead of e.g. “lung medicine”. In the UK, the primary care role is served by “general practitioner” doctors, who have 5 instead of 8 years of training.

          (Note that this is a UK thing rather than a European thing, e.g. Sweden is more like the US in this respect.)

        • LHN says:

          Increasingly, the vaccines and other routine stuff tend to devolve on nurse practitioners or other non-physician personnel. I think the extent to which they have to be supervised by a physician varies by activity and state, but in practice that seems often to involve there being a doctor somewhere abstractly in the process, but not where the patients are going to run into him or her.

          (If there’s a doctor involved somewhere when my workplace does its annual flu shots, I don’t think they’re on site.)

        • Mary says:

          My PCP (primary care physician) does all that, he’s just in internal medicine.

  6. Daniel Speyer says:

    When I consider becoming a grad student, it’s never from a “this is an investment that might get me cushy tenure” perspective. It’s always from a “this is a job with crappy pay but great opportunities to work on interesting problems alongside good people” perspective. That seems to kill the analogy.

I’m also noticing a trend that people in my circles are spewing a lot of poorly-considered mud at academia. I assume there’s a signaling opportunity to be had, but I’d appreciate it if they’d stop.

    • LTP says:

      As somebody considering going to grad school, I totally agree and second this post.

      “I assume there’s a signaling opportunity to be had, but I’d appreciate it if they’d stop.”

      It’s a weird contrarian thing going on right now, particularly in the rationalist community, to have almost reflexive and unqualified disdain and dismissiveness for mainstream academia in general. I don’t actually think it’s a healthy attitude.

      • Vaniver says:

        It’s a weird contrarian thing going on right now, particularly in the rationalist community, to have almost reflexive and unqualified disdain and dismissiveness for mainstream academia in general. I don’t actually think it’s a healthy attitude.

        There is a rationalist technique called “replace the symbol with the substance.” When this is done with academia–replacing the things that we think are true about academia with the things that we notice are true about academia–the results are not pretty.

        This isn’t a reflexive or dismissive process; if anything, that seems more descriptive of the cognitive processes used to defend academia. But we won’t get anywhere if we argue that “no, the other person is biased,” and we may run into other trouble if we consider academia directly, instead of an analogous case.

        So let’s talk about being a soldier. Why do people choose to become soldiers? The job is dangerous, tasks and treatment are often demeaning, and it seems appropriately described as boredom punctuated by terror. This is even before we get into the question of whether or not the military is a positive force in the world. (This is the ‘substance.’) If you don’t know many soldiers or come from an anti-militaristic part of the culture, this might even be your first impression of being a soldier.

        But people become soldiers, and not just poor people who don’t have any other options; people dream about it. Why? Because of things like honor, glory, national service, family traditions, and so on. (This is the ‘symbol.’) One of my coworkers, for example, is the son of a general and wanted to join the military himself, but came of age when the American military was shrinking and had to go into industry instead.

        One can repeat this analysis for academia. If you would shudder to talk about ‘glory’ as a reason to be a soldier, you may want to be less willing to talk about ‘interesting problems’ as a reason to be a grad student.

        • LTP says:

I don’t buy this analogy. Working on interesting problems *is* part of the substance of academia. If somebody wanted to be an academic for the symbolism of having “Dr.” as a prefix and getting to go to prestigious conferences and be seen as an intellectual by others, that would be more analogous.

          • Vaniver says:

Working on interesting problems *is* part of the substance of academia.

            I agree that grad students work on interesting problems, but they also work on boring problems. People in industry also work on both interesting and boring problems. It is not clear that grad students are more intellectually stimulated than consultants, to use an example elsewhere in this thread, and so I am dubious of how well the comparison actually stands up.

            But you’re correct that the last comparison I make is not very tight, and the one you make is closer. The trouble is that it is difficult to attribute signalling motivations to others without it seeming overly negative. If I say a major motivation to go to grad school is the belief that “Smart people get Ph.D.s,” that’s harder for people to endorse and see as meaningful than “I will get to work on interesting problems.” (Though “glory” is a less polite version of that than “service,” so I probably should have done that anyway to make it fairer.)

          • TrivialGravitas says:

            @Vaniver: I think there’s a big difference in that academia offers you the opportunity to work on problems that are interesting to people who fund academic work, while industry offers you the opportunity to work on problems that are interesting to stockholders and/or really big government agencies. Depending on the problems you personally find interesting one or the other is going to be far more suitable.

          • PDV says:

In practice, there is a lot less freedom to investigate interesting problems in academia than it seems, because you need to hustle for grants and such to be able to keep going. At the top end, if you have tenure and some prestige, you can probably investigate topics of your choice some of the time, but still not all. And in many fields, there are just as many interesting problems to be had outside academia as within it. (For computer science and economics, things are near equal in and out of academia; for biology or mathematics it will be somewhat less even; in linguistics or sociology you’ll probably see a significant gap; in English lit. there will probably be a huge gap, since there’s not really anywhere to study English literature, as such, outside academia.)

            ‘Interesting problems’ looks like substance, and feels like substance, but it’s mostly symbol in many fields. Depending on your field, you’re giving up much less in the way of interesting problems if you go into industry and collect sweet cash prizes for that decision.

          • LTP says:

            Many fields can only be researched for money in academia, that’s why interesting problems *is* substance. You want to study literature, philosophy, anthropology, theoretical physics, pure math, history? You can only do that for pay in academia.

            Yeah, if you’re a programmer or an engineer things are different, but for many fields, possibly most, there is no “industry” to flee to.

          • Tibor says:

Speaking for myself, as a Maths PhD student – My reason to go into academia (in the sense of doing the PhD) was first to find out what “real research” is like and whether I would like to do work like that, and second, whether academia is a good environment for me to do it (one can do research and work “on interesting problems” in a private company as well…albeit usually the research tends to be more applied, structured and supervised, which has its pros and cons).

            I see the pros of Academia as follows:
            – you are able to do whatever kind of research you like to do (more or less)
            – you have a very, very flexible time schedule; basically you work whenever you want to work. Of course you need some results at the end of it, but the time distribution is entirely up to you (actually, this is not true either. If you want the work done, you need to work when you have an idea…sometimes that means working till late on Saturday night while also calling it a day on Wednesday at 3pm, because you just spent several hours with absolutely no results or even an idea where to go…this could also be due to my inexperience though)
            – you probably have a higher societal status (among most people at least) as a professor (when you get there) than as a researcher at a private company

            Pros of the private sphere:
            – You know that what you do is actually being put to some use and your research is more focused, which is in a sense less stressful and also puts you into less self-doubt (“am I actually doing anything worth doing?”)
            – If your leaning is libertarian and you live in Europe (sans UK) where most universities are publicly funded, you might have some issues with being paid by tax money (This couples with the “Am I doing anything useful?”…being paid by tax money is one thing, being paid by tax money to do something of no use to others is another thing)
            – On average, you earn more money. I think this is not as important as it seems, since tenured professors have good salaries too, plus other benefits as well. However, as the article Scott links to notes, becoming a tenured professor can be quite hard sometimes. I should mention I’m doing my PhD in Germany, which, according to the article seems to be a smart move…but if I decided to continue in academia, staying here would not be such a great idea.

            Two things I really do not like about academia, and that will very likely lead to me seeking a (hopefully research-level) job in industry after my PhD, are the politics and the horribly long and insecure tenure track. By politics I don’t mean what is usually mentioned here, although even German academia (I guess this is true generally) is more socialist-leaning than I would like. Then again, in maths this is not a big deal… or any deal, actually. Except for extreme regimes like Nazi Germany and Soviet Russia, which actually managed to politicize even mathematics, this is about as apolitical a field as it can get. And one can generally filter the rest of the university out. What I mean by politics is getting grants and things like that, which involve a lot of what can only be called “political work”: convincing the granting committees that your project is the right project for funding. I find that very irritating, and although I don’t have to deal with it much yet, I would have to if I continued my career in academia.

            The tenure track is long. In Germany at least, one is expected to get a PhD, then do a PostDoc at a different university, then another PostDoc at yet another place… and only then can you first hope to get a tenured position. Those PostDocs should be 2–3 years each, so altogether you can hope to get a first full-time job at the age of almost forty… this is the biggest turnoff for me.

            Oh, and sorry for the length 🙂

        • Anonymous says:

          I don’t think this is what “replace the symbol with the substance” means here. Whatever “rationality” is, it’s not inconsistent with a desire for glory or an impulse to national service. To “replace the symbol with the substance” here would be to /define/ “glory” and “national service”–if you can do that there’s nothing wrong with wanting them.

        • Saul Degraw says:

          I think the issue is that there are tons of different academic subjects. I am an arts and humanities guy. My M.F.A. is officially a terminal degree. I could have gotten a PhD in Dramatic Lit or Performance Studies, but it is not strictly necessary. I wasn’t so much interested in research as in teaching, and specifically teaching at a university, for the overly-romantic reasons I wrote down below. My fantasy dream job of being a professor is largely about charm. I would gladly take a medium salary if it meant being able to direct plays and teach Wednesday afternoon seminars on dramatic lit for my livelihood in a charming building. But I was smart enough to realize that this was a daydream.

        • SFG says:

          There’s also the fact that being a soldier is about the only socially-approved way to fire guns and kill people. For a lot of young guys with high testosterone, that’s appealing, and has been since we entered the great ape family.

          And plenty of countries have to draft people for the big wars, so it’s evidently not always sufficient.

        • LTP says:

          EDIT: Responded to the wrong comment, oops.

        • Adam says:

          I was both a Soldier and a grad student, and I think the reality is our decisions are somewhat about the stories we tell ourselves and sometimes about something else we may or may not even be aware of. At the time I joined the Army, it was about the best-paying option out there, and stable with good benefits. I joined as a commissioned officer, though, opting to do ROTC in school, and watched many of my fellow students struggle to find jobs afterward and move back in with their parents, while I started fairly low but got promotions and raises pretty quickly, got a house almost immediately thanks to the VA loan not requiring a down payment or PMI, and had the general prestige factor on top of that.

          Somebody or other I can’t remember regularly polls Americans on which professions they respect the most and the top three are always medical doctors, firefighters, and military officers (lawyers are always last, so have fun, law schoolers). Everywhere you go in uniform, people are friendly to you, ask you for stories, thank you for your service, airlines upgrade you to first class if they have seats available.

          Additionally, while your buddies out there in the corporate world are delivering actionable insights to build a marketing strategy for some shitty business with a crappy product that is going to go under in ten years anyway, you get to deliver actionable insights that win or lose wars. The wars themselves might be pointless and unwinnable, but hey, that’s a little detail, the politicians’ fault, and one way or another, you get to play a role in history (a walk-on part in the war rather than a lead role in a cage).

          So how much of it was that and how much of it was the ROTC recruiter looked really happy and claimed he could retire right then and there just off all the rental properties he owned? I don’t know. We all tell ourselves bullshit about why we did what we did.

      • Eli says:

        There’s a weird contrarian thing going on right now, particularly in the rationalist community: an almost reflexive and unqualified disdain and dismissiveness for mainstream academia in general. I don’t actually think it’s a healthy attitude.

        I’ve taken my licks as a grad student. I have some coherent, from-the-inside critiques of academia and the way it operates, and in fact, so does almost every single aspiring or young academic I know. We even have a lot of the same criticisms in common. If you want to know what we think, you can roughly start by reading PhD Comics, or just watching the film Piled Higher and Deeper — that’s sort of our “Dilbert”.

        However, I can’t help but notice that the Bravely Contrarian Rationalist Community Critique of academia has little to nothing in common with our we’ve-been-there-and-seen-it critique of academia. In fact, if I may look at contrarian clusters, it looks an awful lot like an, “I’ve been working in industry my whole life and never run a controlled experiment, processed statistics on data, proved an original theorem, published a paper, or submitted a thesis, but I want to be Of Science too, so let me denigrate all the work and methodology that goes into their mainstream science to make my less-mainstream x-rationality look better.”

        Looks quite a lot like a textbook case of tribal-pride biases at work — increasingly so as the semi-official “x-rationality” organizations work more and more closely with mainstream academia.

        • The Anonymouse says:

          Perhaps “tribal-pride bias” could be reduced if there wasn’t name-calling involved?

          Is there any evidence that our resident detractors of academia are posing as “Bravely Contrarian” rather than just, y’know, “ordinary disagreers with Eli”?

        • It may be bias on the part of the followers, but it’s (instrumental) rationality on the part of the leaders.

        • Ilya Shpitser says:

          I agree with Eli and TheAncientGeek.

          A lot of it is also “twentysomething omniscience.”

          It would be very curious to see what happens to the rationalists as their median age reaches 30-40.

        • grendelkhan says:

          I can’t help but notice that the Bravely Contrarian Rationalist Community Critique of academia has little to nothing in common with our we’ve-been-there-and-seen-it critique of academia

          Could you be more specific about this? It’s entirely plausible–some of the founding documents of the Rationalist Community involve dismissing out of hand a lot of academia (“the mountains of philosophy are the foothills of AI”, for example).

          But! You’re going to have to do better than just saying that the people with bad views must not have a point. Scott cites his sources below; if anything this looks like members of academia defending their sunk costs and ridiculous arrangements by saying that the unwashed outsiders just don’t get them.

          (tl;dr no, you’re being a dirty dirty tribalist)

          • Eli says:

            See, what strikes me as very weird about those dismissals is… it’s not as if mainstream academia doesn’t do Bayesian models of cognition. I mean, hell, the first paragraph of the PhD thesis of my favorite research scientist in that group goes:

            We would like to build computing machines that can interpret and learn from their sensory experience, act effectively in real time, and — ultimately — design and program themselves. We would like to use these machines as models to help us understand our own minds and brains. We would also like to use computers to build and evaluate models of the world that can help us interpret our data and make more rational decisions.

            Did you notice how the author basically says he wants to build self-modifying AI in the first sentence? And how this is an award-winning PhD thesis at a prestigious institution? The man runs his own lab now, pushing forward the subfield he helped his advisor found. They study how to build minds by looking at actually-existing minds.

            I would class them as “LW’s dream come true”, and yet only a small handful of people on LW itself seem to actually know this lab exists, or that their results might be useful or significant. The Virtue of Scholarship is neglected indeed!

            And if there is one single thing mainstream academia is incredibly, pathologically good at, it’s making you read the fucking existing literature on a problem before you go around reinventing the wheel. I mean, all of PhD school up to and including your qualifying exams is basically the Virtue of Scholarship magnified to such a degree that it becomes a kind of hazing ritual — but also a way to make absolutely sure that you are genuinely capable of having an original thought because you know which directions of search are fruitful and which aren’t.

            So basically, I’d really prefer that “rationalists” criticize academia from a position of being well-informed about what it’s up to, rather than dismissing it from the outside without actually knowing what gets accomplished in academia. That way lies crankery.

          • Samuel Skinner says:

            “I would class them as “LW’s dream come true”, ”

            A bit of a nitpick, but MIRI is more focused on friendliness. They also appear to believe that the first AI will probably be created in industry, and that only the first one or two matter.

            Also, I’m pretty sure they and their members are aware that people in academia are attempting to build an AI; I’m pretty sure they talk about approaches and assumptions that were tried previously.

        • LTP says:

          I totally think there are valid critiques of academia, and I am totally aware of many of them having read lots of first-hand accounts myself. It’s the fact that it seems so reflexive and absolutist in the rationalist community, that many seem to decide that academia is entirely corrupt and useless a priori and then look for evidence to support that, which you’ve gestured at in your post.

        • Jaskologist says:

          When one sees that a system is broken, there is a temptation to want to discard the system altogether and latch on to anything outside of the system. I think most people have the sense that academia is broken, and Rationalists are latching on to that.

          But usually, the system is actually bent rather than broken. It is no longer functioning as it should, but it is still functioning somewhat. It still contains a lot of habits and institutes that limp towards its ostensible goal, and a lot of the vestigial checks and balances which keep it from going too awry are still hanging around. When you latch onto something outside the system, you’re basically taking a random solution to whatever problem the system tried to solve, and most things in solution-space are wrong.

          Let’s continue the analogy of a man with a limp. He still achieves locomotion, but very slowly. This is bad! We want him to move faster. But when we try painting flames on his sides, that’s not actually going to help much. If we try setting him on fire “because that’s what rockets do,” we’ve made the situation much, much worse.

          It is usually very hard to distinguish between bent and broken. We can see the thing that is wrong very clearly. It’s hard to notice the hundred things in the background that are still working properly.

          There can be good reasons for dumping the current system entirely. Maybe it really is beyond redemption, or so thoroughly co-opted by an enemy tribe that it is better to “let it burn” and rebuild afterward. In this analogy, the man is not limping, he is punching you in the face. In fact, let me stress this: if the system really is that broken, the only rational thing to do is to gravitate to something completely outside of it. This choice will probably still be a bad one (the system has absorbed and defanged those who would otherwise be useful to your cause), but nothing says that, given a list of choices, one of them will be good.

          TL;DR: Rationalists are the Donald Trump of academia.

        • onyomi says:

          I think it is super tempting, and sometimes (though more often not) rational, to dismiss a formidable body of information, such as that presented by academia today or, say, traditional theology. If someone you trust says “there is nothing of value in the Keynesian school of economics,” that saves you a lot of energy if you feel you can justifiably ignore any claims coming out of that quarter. Yet the people devoting their lives to this stuff, ideologically blinkered and wrong though they may be, are also probably (though surprisingly, not always) very familiar with the facile objections you use to dismiss their whole school of thought.

          Human knowledge is really vast and intimidating. Anyone who says “ignore all that and focus on this” is providing a very valuable service if, in fact, they are right–but it’s very hard to know who’s right about these things without studying them yourself, and the more sweeping the dismissal, the more justification required.

          I think academia itself can, paradoxically, encourage its own dismissal by lay people by being intentionally territorial and dismissive of non-expert opinion. Not only are academics not going to take your interpretation of Shakespeare seriously; there is also a subset of English literature specialists who will not take your views on Shakespeare seriously if your dissertation was on Milton.

    • Pku says:

      I’ve heard this described as “investing part of your salary in having an enjoyable job” – whether a job is beneficial in terms of working conditions or just pays more, it’s still the same kind of benefit. (Also, agree with your point about the mud-slinging – some criticism of academia is called for, but a lot of it does seem to be more about signalling than anything else.)

      • Creutzer says:

        This works in some fields of academia and not in others. When you’re being worked to death in a lab, your investment has failed. My casual impression is that it works better in those fields that are at least historically considered humanities.

    • Professor Frink says:

      So one thing to watch out for is that priorities shift. It’s really hard to quantify 5–7 years of opportunity cost. Yes, you’ll get to work on interesting problems, and that is great. But you’ll also make very little money and see little economic gain, while the people who skipped grad school buy homes, advance in their careers and have children.

      Whereas, going to industry job you can probably find problems ALMOST as interesting and have a middle class life.

      This is coming from someone who went to a good graduate school and did very well, but still found himself just shy of 30, driving a 20 year old car with no real savings, applying for jobs he could have done straight out of undergrad.

      • LTP says:

        “Whereas, going to industry job you can probably find problems ALMOST as interesting and have a middle class life.”

        Not if you are interested in the humanities or even some of the social sciences and “pure” hard sciences (e.g. theoretical physics).

        • Professor Frink says:

          I didn’t say “the same problems”; I said problems “almost as interesting.” I did pure, no-application mathematics in graduate school and now do consulting.

          I don’t work on the same problems, but they are almost as interesting. My friends who did hard sciences work in grad school now work in similar jobs to me.

      • Anon256 says:

        I was a math grad student for six years. I didn’t work very hard and enjoyed my free time; my funding (which did require some part-time work for the university) was sufficient to live comfortably and not feel financially constrained. The opportunities to think and learn about math were also nice but I think the socially acceptable excuse/cover for spending most of my time on leisure was a more important benefit for me. Eventually I dropped out and got a software engineering job, which I probably could have done coming out of undergrad if I had been motivated to do so. I do not regret going to grad school, and hope that I’ll be able to find similarly good ways to exchange money for leisure time in the future.

        • SFG says:

          It gets harder as you get older and people don’t want to hire old people.

          BTW, you are not crazy; many cultures around the world value quality of life over wealth, but it is difficult to achieve that here.

    • Scott Alexander says:

      This isn’t some random contrarian thing I’ve absorbed through Less Wrong. This is a vibe I’m getting from articles like Don’t Become A Scientist, Graduate School In The Humanities: Just Don’t Go, An Aspiring Scientist’s Frustration: A Resignation, The PhD Bust, Why Biomedical Scientists Suffer More Than Others, Is Graduate School In Chemistry Bad For Your Mental Health?, Professors On Food Stamps, and of course PhD Comics, all backed by the experience of my friends in those fields.

      If you’re going to disagree, tell me why you disagree instead of just rolling your eyes and saying something about signaling.

      • Pku says:

        I think most of those articles make good points but overemphasize the worst of things, and give an overly one sided view (most of the people I know in academia seem to enjoy it overall). Not so much “they’re wrong on any specific point” as “they’re arguing very well for one side, but the arguments on the other side (about how academia has a lot of advantages for the right kind of person) are also pretty good and kinda balance it out for a lot of people”. I do agree that they should train fewer PhDs, though.
        There are people who seem to be signalling more than anything else – a lot of criticism of academia is legitimate, but the people who go as far as saying “academia can’t do anything right and everything there is terrible” seem like they’re mostly signalling.
        One thing I really love about academia, and which really stresses me out about getting a real job (which I’m considering doing), is the idea of having to wear “serious clothes” to be taken seriously. Right now, the people I work with know I’m smart, so I can get respect even though I like sweatpants. I worry that in most real jobs, being “the smart guy in the sweatpants” would automatically make me the nerd people make fun of. For a guy who hates formal clothes about as much as you hate noise (not to mention how much I hate the concept of an environment that would judge people by clothes), this is huge.
        In general, I think the accusations of academia being overly political are mostly wrong – it gets somewhat political, but it seems like one of the few places where, if you’re good, you can get by without constantly schmoozing and politicking (even grant applications are mostly done in writing, so you can just describe your research in fancy terms, which is a lot more comfortable for most researchers than having to deal in personal politics).
        Of course, this may be very different in less hard-sciency fields. (Humanities seem like a pretty lousy place to be.)

        • Dues says:

          I agree with you that my disdain for academia has a lot to do with signaling. But for me, it’s signaling with a purpose. People like me who don’t want to be stuck paying for college want employers to be open to alternatives to academia, like the 3-month programming camps that seem to prepare programmers for the workforce about as well as a 4-year computer science degree does, because college degrees have too wide a focus.

          I won’t deny that the non-academic world is terrible in its own ways.

          • Professor Frink says:

            So be careful with this. Bootcamps are great if you want to write CRUD apps forever, but if you want to work on cryptography, data science, or any statistical or numerical programming then boot camps aren’t a substitute for a degree.

            Coding skills are much easier to pick up on the job than mathematical skills.

          • Airgap says:

            It’s true that bootcamps are less prep than a degree, but:

            1. People have always done crypto, data science, numerical programming, etc. by doing self-administered bootcamps over longer timescales. Me and lots of people I know, for instance.

            2. Bootcamps, at least the more selective ones, tend to guarantee you’re getting highly-motivated, learn-shit-and-make-it-happen-somehow people, whereas CS degrees attract a lot of people who just solve whatever problems the teacher puts in front of them the way they’re told, and then go home. These people are worthless unless you’re throwing armies of them at massive enterprise software (in which case they’re still mostly an impediment to the good people in the organization, but may have positive value on balance), and you have to filter them out yourself. Give me a guy (it’s usually a guy) with initiative and the ability to learn quickly over one who slept through his 8am Operating Systems 101 class any day.

            3. Coding skills are a species of mathematical skills. Say that over and over until you believe it.

          • Anonymous says:

            > 3. Coding skills are a species of mathematical skills. Say that over and over until you believe it.

            I read a while back about some testing that IBM did when they were recruiting for people without previous experience to teach them programming. They found that skill in logic and skill in natural language were correlated with success in the course (with significant weight for each factor), but general mathematical skill was not.

            Logic is of course a branch of mathematics but it’s not clear that coding skill is likely to be strongly related to math skill in general.

          • Creutzer says:

            I have heard it said that logic requires a different thinking style than the rest of mathematics. From my very limited personal experience, this seems plausible.

          • Professor Frink says:

            @Airgap – in my experience, even highly rated bootcamps mostly turn out people who have a good handle on UI and CRUD stuff but not much else. Maybe they’ll be able to learn something else eventually, but I have basically no use for them now. The exceptions are the rare bootcamp participants with strong math backgrounds before getting hired.

            Give me a math major who has never coded in their life over a bootcamp grad with no math background. I can get the math major coding faster than I can get a bootcamp grad up to speed on statistics and data analysis.

          • Airgap says:

            > They found that skill in logic and skill in natural language were correlated with success in the course (with significant weight for each factor), but general mathematical skill was not.

            For this to be true, it would have to be the case that skill in logic, natural language, and general mathematics were not correlated, but they are. Most likely explanation is IBM has funny definitions for these skills.

            > I have heard it said that logic requires a different thinking style than the rest of mathematics. From my very limited personal experience, this seems plausible.

            From my decent personal experience (upper-division university courses in math & logic, extensive programming experience), it does not.

            > Maybe they’ll be able to learn something else eventually, but I have basically no use for them now…I can get the math major coding faster than I can get a bootcamp grad up to speed on statistics and data analysis.

            Besides the fact that this depends on the quality of the math major (there are math majors who can solve math homework problems and can’t do anything else useful), it sounds like you need a specific set of skills that most bootcamp grads and most CS grads (the natural comparison for bootcamps) won’t have, so this is kind of a silly quibble.

          • Publius Varinius says:


            For this to be true, it would have to be the case that skill in logic, natural language, and general mathematics were not correlated, but they are.

            Why? Correlations are not transitive.

          • Airgap says:

            > Why? Correlations are not transitive.

            There are lots of ways to look at this. The one that makes most sense to me is “Yes they are, but correlation isn’t binary, and weak original correlations can damp the transitive correlation all the way down to ‘not correlated.'” I suppose maybe IBM found that logic skill correlated with programming skill, but only barely. Does this sound right to you? It doesn’t to me.

          • Derelict says:


            Where did you go to university that Operating Systems was a 101 class? I can’t imagine a first year student working on anything near that complex.

          • Airgap says:

            A place where lower division classes were called “Programming 1”, “Calc 2”, etc.

            Also, I know people who did OS dev professionally in high school. You probably mean “The typical first-year student who doesn’t know jack shit, and probably never will.”

          • Publius Varinius says:


            But correlations really are not transitive!

            It is generally true that study time spent on solving homework is positively correlated with GPA.

            Time spent on rote memorization is also positively correlated with GPA.

            However, time spent on solving homework problems is negatively correlated with time spent on rote memorization.

          • Airgap says:

            > However, time spent on solving homework problems is negatively correlated with time spent on rote memorization.

            Not true. People who do less of one tend to do less of the other. Take retards, for instance. You should have just stolen Terry Tao’s example.

          • Adam says:

            Airgap, you might note that a CS degree from an elite school is effectively the same thing you’re saying about web-dev bootcamps, signaling that the person was motivated and smart enough in the first place. You get the added benefit that they already know more than just how to build a web site using one framework, even though they also had to take English Composition and whatever else you consider less useful (note even this isn’t always the case – my wife went to a pure engineering school, doubled in EE and CE, and managed to get away with two terms of Video Game Design as her only required ‘humanities’ classes).

          • Publius Varinius says:

            > Take retards, for instance.

            Since you claim that you were a student of an elite school, I have to conclude that you are being intentionally dense. This wouldn’t be the first time you did that.

            1. You presumably know that by “taking retards”, you can’t conclude anything at all about the correlation in question. If you didn’t know it, well, let it be known to you now; but in that case, I really wouldn’t advise anyone on statistics.

            2. You know now that correlations are not transitive. You did the extra bit of Googling, so you presumably also understand why.

            3. You know now that your “most likely explanation” about IBM is therefore not likely at all.

            *tl;dr* Stop acting like a retard.

          • Airgap says:

            > Since you claim that you were a student of an elite school

            I don’t recall doing so.

            > You presumably know that by “taking retards”, you can’t conclude anything at all about the correlation in question.

            As I recall, the retards who were ill-advisedly integrated into my secondary schools spent very little time on either rote memorization or solving homework problems (because they spent most of their time screaming and drooling). In general, people who study spend more time on both than people who don’t study. It’s a bad example.

            > You know now that correlations are not transitive.

            I don’t. I disagree with Tao, although as I’ve pointed out it’s something of a matter of interpretation.

            > I have to conclude that you are being intentionally dense.

            I am in fact intentionally sounding dense, but trying to not actually be. For reasons beyond the scope of the present discussion.

        • Someone from the other side says:

          Getting used to wearing a suit takes a month or two. Just do yourself the favor and get GOOD, WELL-FITTING suits.

          I say that as someone who often runs around in sweatpants (mostly I wore khakis and a t-shirt in university, shorts in summer) but wears a suit daily.

          • Creutzer says:

            I agree. Your expectations about how physically uncomfortable well-fitting suits and ties are may be overblown. I know mine were. (I don’t have occasion to wear them frequently, but I wouldn’t mind now if I did.)

          • Pku says:

            I suspect it’d bother me a lot less than I expect if I actually tried it, but I do have an irrational distaste for environments that expect you to dress nice (my instinct is that somewhere that requires you to be well dressed is either overcompensating for feeling like what they do doesn’t matter by dressing importantly). I’m aware that I take this irrationally far, but everyone has things that bother them to an irrational degree, and that’s mine. (I don’t have a problem with anyone else wearing suits or whatever, I just don’t like people who try to social-pressure me into dressing like them.)

          • FJ says:

            @Pku: “my instinct is that somewhere that requires you to be well dressed is either overcompensating for feeling like what they do doesn’t matter by dressing importantly).” What’s the other option?

            I think you are close to the correct attitude: uniforms (and business suits are just as much a uniform as a soldier’s fatigues) exist to impress upon people the importance of the work that is being done by the uniform-wearer. You’re wrong to think that this is exclusively self-directed, though: a doctor’s white coat can have physiological and psychological effects on her patients. I agree that people tend to take the work that they do more seriously if they are wearing a particular uniform, but I don’t see why you assume that this is “overcompensating.” I’m sure that someone somewhere has a job that really doesn’t matter, but even the night-shift manager at Wendy’s is responsible for preventing foodborne disease, fostering the careers of the shift workers, and earning a return on investment for the pension funds that own Wendy’s stock. They also serve who only wear ties and wait.

        • Airgap says:

          In academia, as in real jobs, “the smart guy in the suit” is like the smart guy in the sweatpants except he looks much cooler. If you can’t grasp this, you are that nerd, and the fact that you’re surrounded by other nerds doesn’t fix this. Don’t you want to look cool? Of course you do. You just think you won’t succeed. Except it’s not hard: you get yourself a suit. Bam. Instant badass. Other researchers stop and ask “Why are you wearing a suit just to program the STM?” You cock your head and smile. “Because I’m worth it.”

          Paid for by Airgap and Sons bespoke tailoring.

          • Creutzer says:

            My experience is that wearing a suit and tie in academia would not be advisable for most people. Even jacket and tie is problematic. This seems to have something to do with academia wanting to signal that they’re different from the private or government sector and care only about ideas. If you do it, there is a chance that you will be perceived as someone who has bad ideas but tries to mask it by looking respectable.

            What Airgap suggests, it seems, is to do it on principle and as an eccentricity. This is the only way it can work, but it’s not for everybody, because, within academia, this is, in fact, a sort of countersignalling.

          • Airgap says:

            No, we’ve just redefined “Respectable” as “Dressing like a nerdy slob.” But maybe I’m wrong. Maybe von Neumann was trying to mask his shitty ideas by being the best-dressed mathematician of all time. Maybe he wasn’t even that good at math. We’ll probably never know.

          • Creutzer says:

            No, we’ve just redefined “Respectable” as “Dressing like a nerdy slob.”

            Well, yes. But that doesn’t change my point.

            Von Neumann lived several decades ago. You did, I assume, notice the changes in customary dress within and outside of academia since then? And even ignoring that, he was one of the most brilliant minds of the 20th century, of course he could pull off the countersignalling involved.

          • Airgap says:

            To describe von Neumann’s fashion choices as countersignalling is a vicious slur on the whole tradition of intellectual inquiry. The correct term is “Killin’ it, baby!” If I were in charge of whatever institution you’re connected to, I’d have you expelled immediately, unless you had tenure, in which case I’d have you shot at dawn.

            I did notice that fashion has changed, but that isn’t a complete argument, and I think if you try to complete it you’ll find it doesn’t go anywhere.

          • Oliver Cromwell says:

            von Neumann was an outstanding genius who could do whatever the hell he liked. Most academics are team-enabled office workers, who need to signal some level of conformity with those around them. You can actually get away with dressing in suits if you make it very clear that the purpose is to look fashionable rather than just respectable, and that fashion is a deep personal interest of yours. Otherwise it will look like you got lost on the way to your office in HR or accounts. You have to signal that you are an individualist by conforming to an image of what an individualist looks like (I realise the irony).

          • Creutzer says:

            Airgap, please read more carefully what I write. I’m not even seeing where you supposedly disagree with me. I didn’t say I’m in favour of current academic dress codes.

            Perhaps Oliver Cromwell phrased the point better than me.

          • Pku says:

            Not sure what level of sarcasm you have here, but no, I really don’t particularly care about looking “objectively badass”. I care a lot more about the respect of the other nerds than your “cool people in suits”. And if you can’t grasp this, you need another level on your theory of mind. (Not that I judge people for wearing suits, but I don’t like people who’d disrespect you for how you dress.)

          • Creutzer says:

            Not that I judge people for wearing suits, but I don’t like people who’d disrespect you for how you dress.

            I would hazard the guess that that would be just about all of humanity…

          • Nornagest says:

            @Airgap — The trouble is that if you are a nerd, many of the people you want to impress on a regular basis are other nerds, and they’re likely to have weird countersignaling norms that don’t necessarily correspond with your idea of what looks good on you. Ignoring them is charitably described as “daring”; rumor has it that wearing a suit to a Google interview leads to an instant disqualification on culture grounds, for example. In practice it’s probably not that extreme, but I still wouldn’t call it a good idea.

            This can lead to some weird inversions of conventional formality rules. Personally I like good tailoring, and so I often dress more formally for clubs or parties than I do for work. But when in Rome…

          • Airgap says:

            rumor has it that wearing a suit to a Google interview leads to an instant disqualification on culture grounds, for example.

            Not even close. Trust me.

          • Anon256 says:

            Not true at Google, true at many startups. And even when not an “instant disqualification” it’s likely to hurt at the margin.

          • Daniel Speyer says:

            When I conducted interviews at Google one candidate showed up in a suit. I docked him points. If he’d been really strong, he could still have gotten hired, but if he’d been borderline it could have tipped against him (in fact, he wouldn’t have made it anyway).

            There was no official policy on this, but I think most of my co-workers would have done the same.

          • Airgap says:

            No wonder nobody good wants to work for Google anymore.

          • HeelBearCub says:

            I will say that this idea that wearing a suit to work to “kill it” strikes me as the kind of thing that comes from someone who is either naive, pompous, unconcerned with the actual work, or destined for middle management.

            1) Get your work done. 2) Dress how you like.

            If you are trying to “kill it like a badass”, let me point back to #1. Suits are just clothes.

          • Deiseach says:

            rumor has it that wearing a suit to a Google interview leads to an instant disqualification on culture grounds, for example

            What about when Google want to hire accountants and “non-creative” staff? Wearing a good suit for an interview is ingrained in most people; would even Google hire a payroll technician who turned up in bunny slippers and his or her pyjama pants to the interview?

            I can well imagine the founders of companies getting to be counter-cultural in the things they wear (apart from Bill Gates and his suits) but even there I’m pretty sure only Da Boss who is the main man gets to wear the T-shirt and Bermuda shorts ensemble when attending a board meeting, and that they’re very damn expensive designer classily tailored T-shirts and Bermuda shorts.

            In my last job I did notice the differences in how people dressed; when we were interviewing teachers for the secondary school and tutors for night classes/continuing education classes, you could tell the art teachers a mile off by their casual clothing (and complete incapacity to manage anything resembling paperwork, but they were all lovely people even if they had no idea how to fill in a form when you were trying to make up their hours for their paysheets).

            I only ever saw one art teacher who wore a suit to interview, and he eventually ended up becoming principal 🙂

          • veronica d says:

            When I interviewed at Google, they specifically instructed candidates to dress casually. It was in the email they sent me. So I dressed casually. Easy peasy.

            Among the software engineers I work with, fashion choices vary, but almost no one wears suits. Which, I wish more of the men did. Suits look cool. I like cool looking people. But whatever. Most folks seem comfortable with their clothes. I’m cool with that.

            The idea of “docking points” for wearing a suit is a dickhead move. I would never do that. Like, OMG how fucking childish!

            I mean, I would not even dock some guy for wearing a fucking fedora. It doesn’t matter. It’s just a stupid hat.

            Are they friendly? Can they communicate? Do they listen and seem to care what others are saying? And (most of all, obviously) can they fucking code?

            That is all.

          • grendelkhan says:

            Another example of an eminent figure who dresses better than he has to is Vint Cerf: suits, frequently three-piece, even though he works for Google.

          • Creutzer says:

            Being so old as to have grey hair definitely helps with pulling that off.

          • onyomi says:

            There is a strong regional culture at work here, I should point out: in the US, the South is more formal than the Northeast, which is more formal than the West. I think this is a cross-industry trend, and applies to academia as well.

            And as much as I love the idea of being comfortable at work, something about the West Coast disdain for formal clothing rubs me the wrong way, as does seeing tourists in cargo shorts at fancy New Orleans restaurants (locals will not do that).

            I’m probably biased having grown up in the south, but if you can’t wear nice clothes for a job interview or a wedding or a special night out, then when *are* you going to look your best? And yes, men look better in suits and ties than short-sleeved collared shirts and jeans, as women look better in dresses and pearls than yoga pants and tank top.

          • Nornagest says:

            What about when Google want to hire accountants and “non-creative” staff?

            Dunno. I’m a techie and I was talking about the technical side of things, hence the word “nerd”.

            I would speculate that sales and finance candidates at Google would dress more formally on average than technical candidates but less formally than sales or finance at, say, Intel, but that’s based entirely on my read of company culture; I’ve never seen an interview on that side of things.

          • J Witt says:

            There was this lawyer who went from a bank to a startup, and on his first day he obviously wore a suit. His boss pulled him aside, and said “There are only two reasons for you to wear a suit. Either we’re going bankrupt, or getting acquired. Either way you’re scaring the talent.” And then sent him to a store to buy $200 jeans.

        • lumenis says:

          Data point for you based on my perspective from 9 years working at well known software companies:

          The dude* in the sweatpants is given a large prior for being odd when he shows up on his first day, yes, but I think most of the smarter folks also think “Hey, I bet I can learn a lot from this guy. He must be solid if he got this far without the clothes thing in the plus column!”

          * Gender: I’ve never worked with a lady who does the sweatpants thing, FWIW. I truly hope we’d give her the same benefit of doubt.

        • Anon256 says:

          My impression/experience is that casual or eccentric clothes are more common and acceptable as a professional software engineer than they are as a grad student. Amongst software engineers people are if anything judged negatively for wearing suits, and it’s rarely seen; by contrast I know lots of academics who wear suits regularly. I’ve also heard academics and grad students talk about having to dress more formally while teaching to be taken more seriously by undergrad students (or not be mistaken for one), and have never heard anything of the sort from software engineers. I honestly think you have it backwards (assuming your alternative to academia is software engineering and not, say, law).

          • Creutzer says:

            That is interesting. What field and part of the world are you in?

          • Anon256 says:

            Math; eastern US. Also I don’t mean to suggest that suits are the norm in my corner of academia, and those I do see tend to be rather shabby looking, but they look much less out of place than they would amongst software engineers.

          • Creutzer says:

            Thanks. Computer scientists (and, for some reason, linguists) are probably at the extreme end of badly-dressed academia, although I am surprised to hear of math grad students dressing better to be taken more seriously by the undergrads.

          • Tom Womack says:

            While working in the scientific-software industry, I deliberately wore a suit to conferences because I liked being underestimated by people who assumed they were talking to a marketing person; they then generally seemed to appreciate that the person trying to sell them the software had written a third of it.

          • James Picone says:

            As a professional software engineer working in large, moderately-bureaucratic companies, I don’t think I’ve ever felt like I had to wear a suit/tie/nice pants. I usually just wear a nice polo shirt and some nice jeans. I also have substantially longer hair than is common for people of my gender. I don’t feel like either of those things has ever caused me problems. So there’s some more anecdata that software engineers can be casually dressed.

            “Never trust a programmer in a suit” is a bit of received wisdom in some programming cultures.

            That said, I still dress more formally for interviews, and the most casual place I’ve worked at was an organisation dedicated to doing scientific research, with a strong scientist culture.

          • Airgap says:

            “Never trust a programmer in a suit” is a bit of received wisdom in some programming cultures.

            We call those cultures “1x Engineers.”

          • James Picone says:

            Seriously? I think it’s more a strain of ye olde hacker culture, recently incarnated in tech startups and the like. Not the kinds of people who are likely to be engineers, and at least some of them are very good programmers.

        • Ralph says:

          I think your fear of having to wear serious clothes might be outdated, depending on your industry. I wore shorts and flip-flops to my job trading bonds in one of the most historic buildings in the Chicago financial district. So did my senior-level boss.

          The entire tech industry is casual and they are basically driving the world economy as well as arguably doing some of the most advanced intellectual work around.

          Maybe just like private sector types mock academia without merit, academics have some strange misguided ideas about the seriousness of private sector work environments.

        • Jonathan Paulson says:

          One thing I really love about academia, and which really stresses me out about getting a real job (which I’m considering doing) is the idea of having to wear “serious clothes” to be taken seriously. Right now, the people I work with know I’m smart, so I can get respect even though I like sweatpants. I worry that in most real jobs, “the smart guy in the sweatpants” would automatically make me the nerd people make fun of. As a guy who hates formal clothes about as much as you hate noise (not to mention how much I hate the concept of an environment that would judge people by clothes), this is huge.

          Random observation: I dunno about other fields, but as a programmer, this is a non-issue. Everyone wears very casual clothes, and there is probably even a stigma against formal clothes.

          • Pku says:

            programming actually sounds pretty good in a lot of ways (and I’m currently thinking about looking for a programming job when I finish my PhD).

        • onyomi says:

          I am definitely in academia at least partially for somatic reasons. In particular, I find putting on nice clothes at the same time each morning and driving to a place full of coworkers to whom I have to present a professional facade for 8 hours a day incredibly tiring. I think I actually get much less done under such circumstances as compared to a job I can do in my pajamas, at home, early in the morning, or late at night (as I can with research; teaching, less so).

          Also, summer off is huge. The amount of vacation time considered standard in the US is just soul-crushing to me. Not that I’m doing nothing related to my job during the summer: as with my weekends and nights, they are often filled with research and writing I couldn’t do while teaching, learning new things tangentially related to my research which may lead in new directions, etc. I guess I just feel a very strong need to have a lot of unstructured-but-still-productive time, as well as flexibility to work late into the night if the mood strikes me or take off a whole morning if I sleep really badly, etc. etc.

          For all this I pay by having no truly “off” time and also by being paid much less than people with similar levels of education and time investment, but who don’t mind the whole office thing.

          • walpolo says:

            Yes, summer vacation is huge. There is almost no other job in the world that offers that kind of time off (at holidays as well). And it’s significant on the question of pay, as well. I don’t make as much money in a year as my friends in the corporate world, but if you factor in the fact that they work 50-55 hrs a week with maybe a couple weeks vacation a year, our income per hour has to be pretty close to the same.

        • Chalid says:

          I had a bit of a similar fear when going from academia into a large formal investment bank. What I discovered, and I think what was true for a lot of people, is that I didn’t actually care about the casual clothes. What I *really* cared about was not having to put any effort into thinking about clothing and stressing over all the weird incomprehensible social signalling associated with it.

          And the wonderful thing is that, if you are a man, you still don’t have to think about that stuff! Men’s business attire is very boring. You get a bunch of basically interchangable dress pants, and a bunch of basically interchangable dress shirts, and you can just grab a random set out of your closet every day without thinking about it at all and no one will care.

          (If you are a woman I suspect that this does not apply. Also doesn’t apply to e.g. sales roles but that doesn’t sound like the sort of thing you’d want to do.)

          • Troy says:

            Men’s business attire is very boring. You get a bunch of basically interchangable dress pants, and a bunch of basically interchangable dress shirts, and you can just grab a random set out of your closet every day without thinking about it at all and no one will care.

            Although if you’re married, your wife will tell you that most of them don’t match.

          • Edward Scizorhands says:

            If you’re married your wife will help you shop for clothes and make sure they all match.

            There is a very small set of rules I need to know with my current dress wardrobe, like “the grey shirt must be worn with black pants, not khakis” and “only tuck in the shirt if you are wearing a belt.” In fact, that might be the complete set of all my rules.

          • Creutzer says:

            Well, there are a few more rules about (not) mixing patterns, which may become relevant when you wear a tie. But you can wear “business attire” without ever encountering those, too, because you can have a lot of monochrome things.

            Also, there is no such thing as a grey shirt.

          • Pku says:

            Thanks. That actually does make me feel a lot better about my non-academic career prospects.

        • Tibor says:

          Agreed with the politics, not with politicking. I am a maths PhD student, and in the field there is zero politics in terms of what is usually understood by the word “politics” here. The politics of getting grants and convincing grant committees, i.e. politicking: setting up things like research training groups of “interdisciplinary researchers” that “intensively collaborate” even though this is in reality mostly wishful thinking at best, and so on…well, that is another thing. As a PhD student, you don’t get so much involved in this, but you already notice that it is a part of being an academic. On the other hand, big corporations are probably not much better in this respect. Sometimes something leaks to me from other departments, overall university policies, or indeed from the presentations for the grant committee (for example, when our RTG made a big deal about us being about equally male and female, which I do not see as a research-relevant good, or bad, on its own…though it sure is more pleasant that way :)). Mostly you can ignore these things though.

          As for clothes, engineers at Google do not seem to be quite “suit-friendly” kind of people and generally, unless you are planning to do a job in management or something where you have to communicate with customers in person, I don’t think people will force you into suits. Not in high-qualification jobs. This might be different in the US, but I doubt it.

          I share the suspicion towards humanities…but I probably am a bit biased against them (as are most people here) too.

          Regarding the number of PhDs: well, their number may be too high or not, but in any case, I think it is a mistake to look at the PhD as something exclusively academic. My point is that the skills you learn during your PhD, namely the ability to do original research more or less independently, are useful outside academia too (ok, you have a PhD advisor, without whom I would probably be quite lost for example, but still a dissertation is way more original and independent than a Master’s thesis). This is quite a difficult thing to do in maths (or there is also a possibility that I am simply not smart enough :)) and if you want to do some sort of research anyway, this helps.

          To be entirely honest, this also depends on other things. Here in Germany, the decision to start a PhD is quite a no-brainer as long as you can find a position (the system is different than in the US: first, you start your PhD only after finishing the Master’s degree; second, the positions are advertised by the professors who offer them and you apply for them as you would for a regular job…or you write an e-mail to a professor to ask whether she is hiring people in the near future, like I did with my advisor). As mentioned in the article Scott links to, you are an employee of the university with the same legal status as any other employee (that is, including health and social insurance, etc.) and the pay is not bad. Sure, I’d get more if I started a job in industry right away after my Master’s…but now I am getting paid AND improving my qualification at the same time; most of the work I do actually consists of exactly that: improving my own qualification. However, a PhD student position in the US, for example, where one sometimes even has to pay tuition as a PhD student (unless you get a good enough grant for your project), at least that is how I understand it (correct me if I am wrong), would be an entirely different thing, and I would probably think about it twice.

      • Daniel Speyer says:

        I’ll read those links and write a real response tomorrow.

        But the initial post that triggered this was “ACADEMIA IS LIKE DRUG DEALING (in a way that contains none of the reasons drug dealing is bad)”. This came on the tails of the University vs Scientology comparison, which found a set of resemblances no better than could be found for any other life plan.

        The standard of discourse with which our community bashes academia is less than that which we demand for any other topic.

        • Creutzer says:

          The Scientology comparison is mostly specific to US universities. It doesn’t apply outside of the US and it’s not really about academia. There are two separate fields of complaints here.

        • Scott Alexander says:

          I certainly don’t think the article meant to imply that academia was as evil as drug gangs, just that there was a common economic dynamic at work.

          • randy m says:

            It is sort of a clickbaity title that makes you wonder if Harvard profs order hits on Yalies.

      • Troy says:

        I agree with Daniel that being a grad student is often a pretty nice gig. You said in your post (summarizing the linked article):

        TAs and grad students work in unpleasant conditions for much less than they could make in industry, because there’s always the chance they could become a tenured professor who gets to live the life of the mind and travel to conferences in far-off countries and get summer vacations off.

        On the contrary, I say: students don’t get paid that much, but their working conditions are just fine. Students generally already get summer vacations off and at better programs they can get conference travel paid for as well. And they’re already living the life of the mind: if you’re a student, your time is spent reading, writing, and teaching. For those inclined towards such pursuits, this is much more pleasant than most industry jobs that students could get instead.

        Being a student really is not that bad. Even the pay is usually enough to get by if you live frugally, get a roommate, etc. Students who complain about how bad being a student is and how oppressed they are by academia usually don’t have much experience working in industry to compare it with.

        Adjuncts do have it worse, I think; they teach many more classes to receive about the same as a grad student stipend. Still far from the worst job out there, but definitely much worse than being a regular professor.

        • Saul Degraw says:

          What I loved about grad school was that it was the only time in my life when I was able to devote most of my time to theatre. I worked for money (for a small publishing company and then for a non-profit) but most of my time was spent in class or rehearsal or doing other arts-based things. It was great!!!

        • Tibor says:

          Summer off? I dunno how it is in the US, but I “only” get 30 days of vacation a year. In practice, it does not matter. If I don’t show up at my office one day, nobody is going to care (and due to my habit of sometimes thinking out loud and walking while I am thinking, and the fact that I share my office with other people, I sometimes stay at home and work from there instead). The important thing is to have some results at the end, so whether you get 10, 30, or 90 days a year of vacation on paper does not really matter much.

          • Devilbunny says:

            That’s very job-specific; many jobs require physical presence. Professors have to teach classes, doctors have to see patients, and welders have to weld. Opinion writers can work from the beach, but most can’t.

          • Tibor says:

            Devilbunny: Sure, I was talking about being a PhD student. And even as one, I still have to be present sometimes, though this is almost exclusively during the semester rather than during the exam period/student holidays. And also obviously, if you’re doing a PhD in chemistry, medicine, experimental physics etc., you need to work with lab equipment and things that you often don’t have at home (that is, as long as your freezer does not go to almost 0 kelvin and you don’t own a small nuclear reactor 🙂 ).

      • Eli says:

        Scott, there’s a big difference between saying that Science is a shitty career field right now (which is an entirely legitimate statement and a very real problem), and saying that mainstream science is silly and foolish compared to our superintelligent deheuristicized-and-debiased x-rationality and we’re all better off without mainstream science anyway because it’s all really politics. The latter has been the bailey (for instance, in the form of MetaMed) and the former the motte (in the form of all those things you linked).

        Hence these comment wars.

        • SFG says:

          I agree. Science is a shitty career. Science has brought humanity out of the Middle Ages. These things are not mutually contradictory.

            I.e., the thought process responsible for almost all advancement in human living standards over the past five centuries is not a good way to make a living for the average or slightly-above-average or moderately-above-average person in the United States of America at the turn of the third millennium.

        • Brian Donohue says:

          I thought Contrarian Rationalists were critiquing academia, and, within academia, hard science least of all.

        • Anonymous says:

          I agree with your comment as a criticism of some of Eliezer’s writings and of some of the comments here, but I think the parenthetical about MetaMed is unfair. MetaMed was never based on disdain for or opposition to mainstream science/academia (notwithstanding things Vassar might have said when he got carried away trying to be inspirational). Its main product was literature reviews of existing mainstream academic biomedical research. This was based on the idea that the process by which such research normally filters through to doctors is sometimes slow or unreliable, and that the existing structure of the medical industry makes it difficult for very rich patients to efficiently pay their doctors to spend more time on research review. Neither of these ideas is in any sense an indictment of mainstream academia; they don’t even really need to be indictments of the mainstream US healthcare industry for a niche for such a company to exist. As it turned out, the niche wasn’t large enough.

      • Aneesh Mulye says:

        I do not disagree.

        For a small subset of people (the link is to Matt Welsh’s reasons for leaving Harvard), (some) jobs actually provide a greater ability to work on interesting problems than a university.

        I’ve found this to be true in my experience working at Google. Though I’m not doing anything at the scale and complexity of what (I presume) Matt does, my job is still interesting enough to me now, and I now think that the career path of a software engineer allows me to continue to work on problems that I find interesting.

        Further, my job is light enough that I can do it in ~5 hours every day. I’m also starting to set aside ~2 hours every day to learn/study/hack, for which I’ve created a curriculum that contains a pretty solid assortment of topics (the details aren’t relevant here, though I will post it on my blog later), and I expect to have covered most of it in ~3 years. After that, I think I’ll probably be able to spend those 2 hours learning more specialised topics, or doing various interesting things, all of which I can do without constraints of the kind academics face.

        Stability and a good life are both important to me, and academia provides none of that. Nor does it provide the freedom that it promises; I’ve seen the lives of doctoral students and untenured professors, and they aren’t pretty, but more importantly, they aren’t at all free. Nor, oddly enough, are the lives of tenured professors. I suspect that it’s not until you’re both tenured and senior that you have real freedom – at which point your prime years have passed you by.

        Some lucky few may escape this, but I’d rather work on interesting things right now at my job, and have the time and freedom to work on projects and problems and learn things in addition (without compromising other parts of my life), and have a good life while doing it, instead of gambling everything on a slim chance for a bygone freedom in an academia that looks nothing like its old self.

        • Tibor says:

          Well, I am currently in the middle of doing a maths PhD, but my idea of what to do afterwards is very similar to what you are doing. Hopefully I will be able to get a position like that, but that is still up in the air right now.

      • Daniel Speyer says:

        Disorganized thoughts after reading all of that…

        Yes, pay and job security suck. Anybody who didn’t know that going in should have.

        Abusive treatment from advisors… clearly happens sometimes. I see a lot about it online and hear very little in person. I suspect it’s pretty rare but amplified by publication/resharing bias.

        Too much politics, not enough research, or not enough truly original research: should be better, but still seems to beat the alternatives, and I bet there are some good pockets.

        Overall, is there a case to be made against going to grad school? Of course. Are there people who need to hear it? Probably.

        But the case should be made well.

        I think I actually hit the most important points in my off-the-cuff response. This article compares academia to drug dealing. It could have picked music or acting. Those exhibit exactly the relevant pattern, do so less ambiguously, and the audience is more familiar with them. But they aren’t mud. Associating something with music doesn’t score points against it in Ethnic Tension.

        And this is a pattern. Look at that Scientology infographic. You could make one as good about having a job, or living under a government. It’s not an argument – it’s a cheap shot. Likewise the bingo cards. Likewise describing college budgets as containing “black holes that suck down money” (I don’t know where most of the money goes, but I daresay nobody is literally throwing cash into a gravitational singularity, and I’m pretty sure the author of that line didn’t investigate to see where it is going and whether that’s worthwhile).

        We usually maintain higher standards of discourse than this. It’s really sad to see them thrown out here.

        • TheNybbler says:

          “This article compares to drug dealing. It could have picked music or acting.”

          Yes. But drug dealing is a much more interesting problem.

        • Tibor says:

          Music (not so sure about acting) does not seem to be like this. It is not just the bottleneck; the important factor is also that the ones who make it to the top more or less rely on those at the bottom. A successful musician does not need unsuccessful musicians to produce his work. A drug kingpin needs the foot soldiers to make and distribute the drugs…and a professor needs the PhD students and part-time faculty etc. to be able to concentrate on his research. The main difference seems to be that, in some fields, what you learn during your PhD is/can be useful outside academia as well.

          • Edward Scizorhands says:

            The successful musician also has no particular reason to keep other musicians down. Many of them can be successful at the same time, while there is generally a fixed number of faculty positions, and marginal increases in the quality of faculty aren’t going to change that.

      • 4bpp says:

        The impression I remember consistently having when reading those articles is that they seem to be very keen on making incorrect assumptions about my value function and then telling me how I would be much better at maximising my utility if only I were to drop my idealistic misconceptions and give the career path of their choice (which is usually industry or startups) a fair consideration.

        The most common wrong assumptions are that I really want to own a house (mental overhead for maintenance, location binding, etc.; really not that attractive without the seemingly axiomatic value US culture assigns to home ownership) and that, as a computer scientist, I must really just want programming problems meatier than designing spreadsheets, and that the world-changing startup of the day (or exciting consulting job, or whatever) will offer me intellectual stimulation nearly as good as, if not better than, what I can obtain as a researcher. Conversations I have about problems with people in industry rarely last 15 minutes before I get the first sour “Why would you focus on that aspect of it? That has like zero relevance to our market” face, and the experience of my field colleagues (theoretical CS) in that regard seems broadly similar. It is probably true that STEM PhD programmes nowadays contain many people who would be better served outside of academia, but that this is true even for the majority does not mesh with my sample of reality.

        My pay (between $25k and $30k a year, depending on whether I TA continuing-education classes over the summer break) is more than sufficient, to the point where I have been seriously wondering if I shouldn’t start devoting more time to figuring out what to do with what I don’t use up; I live in a modern apartment a 10 minute walk away from my workplace, get to set my own hours, collect all the social and infrastructural benefits of a university campus, get free travel (through conferences), and TAing is pretty much the only context in which I ever have to spend time on anything I don’t want to spend time on. What is supposed to be the problem?

      • grort says:

        The argument you seem to be making is that some grad students can become tenured professors, and tenure is pretty nice, therefore all grad students must be in academia because they want to be tenured professors. (More: that all grad students must really dislike academia, but they’re slogging through it because they hope to receive tenure at the end of it.)

        Let’s make an analogy to artists. Some artists can get really famous, and can sell their art for lots of money, and that’s really nice. Therefore, is it fair to assume that all artists must be making art because they want to get really famous?

        I think if I said that to an artist — if I said “statistically speaking most artists don’t get really famous, you’d be able to make more money if you went into something more commercial like advertising” — they would be upset with me. Artists are artists because making art is what they want to do. And, likewise, many grad students are grad students because academia is where they want to be.

        — Now, I’ll happily agree that some disciplines in some universities are probably sweatshops. I think it probably correlates with the availability of Real Jobs in the field in question. For example I’ve heard that it’s super hard to get a Real Job with a bachelor’s in biology or chemistry, so I could believe that most bio and chem grad students are probably pretty desperate.

        But I went to grad school in computer science, and we all were very aware that we could get good jobs outside of grad school. We were all in academia because we liked being in academia, and I think very few of us had any interest in a lower-paid professorship whether it had tenure attached or not.

        And it — it kind of makes me sad, a little, when people suggest that my time in grad school was an error I made because I was foolishly hoping to get tenure.

    • Janne says:

      I went to grad school and got my PhD just out of pure curiosity. I never intended to stay in academia afterwards (and I only did in the end because of personal reasons).

      To me grad school is a tropical island: Somebody comes up to you and says “You want to live on a tropical island? I’ll pay all your expenses, and give you pocket money to live on a beautiful island for five years, no strings attached! Work on your novel, learn to paint, do whatever you want! How about it?”

      It’s a once-in-a-lifetime opportunity to learn a subject you love in real depth, to really dive in and focus on a fascinating problem just for its own sake, to travel to conferences and meetings* and talk with like-minded people. Seen that way, it was an easy choice to make.

      This, by the way, assumes you have funding for grad school. Don’t accept a position where you need to fund yourself through loans or something; that’s a sucker’s game. The project budget should cover grad student expenses, and for the last year or two your work will be a net asset to the lab. Where I did my degree, graduate school is a job (if a low-paid one), with pension benefits and all.

      * Most attendees at science conferences are grad students and postdocs, not tenured professors. They’re too busy finding grant money and managing the lab to do their own research or travel much.

      • LTP says:

        If you don’t mind me asking, what subject did you get your PhD in and what job do you have now?

        I am strongly considering going into graduate school with this mindset, but I’m worried I won’t be employable afterwards because I’ll be “overqualified” and have gotten my degree in something that is “useless” (philosophy).

        • Janne says:

          I had an M.Sc in computer science, and did my Ph.D. in cognitive science – the subject sits on the border of computer science, linguistics, psychology, neuroscience and philosophy. We belonged to the department of philosophy, so technically my degree is in theoretical philosophy. Nobody has ever cared about that, though; they look at what I did for my thesis.

          I work in Japan as a computational neuroscientist, mostly on doing large-scale brain models to better understand cross-area functional relationships. A lot of the work involves developing or porting software tools for large-scale simulations on clusters and supercomputers.

          Am I overqualified? Rather, I’m oddly qualified. I have gotten unsolicited job offers so clearly not everybody is put off by my somewhat specialized mix of talents.

        • Brock says:

          Been there, done that. Spent five years in philosophy grad school, and dropped out with an ABD when I saw how dim the career prospects were. Also, the dissertation was going nowhere.

          Best career move I’ve ever made. Luckily, this was in 1997, when getting an entry-level job in IT was a matter of being able to fake it.

          My advice: make sure that if you fail, you fail quickly, or fail gently.

          One way to do this: apply only to top-10 schools. Don’t apply to a fallback. Philosophy is very much about academic pedigree, so your chances of a successful career fall off rapidly as the prestige of your school decreases.

          Or apply to a terminal masters program as a fallback, which might serve as a springboard to a top-10 PhD program, or might just convince you that it’s not what you really want to do, with only two years spent on the masters. Don’t do a masters thesis, though; you need to spend your time polishing the writing sample you’ll submit to the PhD programs.

          Internalize your recognition of the sunk-cost fallacy. At the end of every year, reevaluate your decision to pursue the PhD, and if your only reason for staying is that you’ve put in so much time already, get out!

          And have a backup plan. Mine was “get a job working with computers”. If you don’t have a backup plan, you end up staying just because you don’t know what else to do, and that’s not good.

          That said, the first three years of grad school were very enjoyable. I took classes from and with some really smart people. It’s when you’re done with classwork and start the dissertation grind that it really becomes a living hell.

        • Troy says:

          I will second Brock’s advice to get a terminal M.A. first if you’re not sure whether or not you want to do a Ph.D. or whether you want to be a philosophy professor. That will give you an idea of how much you enjoy graduate work in the field and give you a degree after a shorter period of time that means just about as much in the non-academic world as a Ph.D. in philosophy. An M.A. in philosophy also puts you in a better position for applying to top-level Ph.D. programs.

    • eqdw says:

      I’m not familiar with how academia works, and I’ve always wondered this. Maybe you can clear it up for me.

      Whenever people talk about why they stay in academia, it’s always from a “this is a job with crappy pay but great opportunities to work on interesting problems alongside good people” perspective.

      Why do you think that academia is the best way to do that? See, when I think “how do I get opportunities to work on interesting problems with good people”, I think in two steps. Step one is “make a ton of money”. Step two is “now I can do whatever the fuck I want”.

      And, looking around me, this doesn’t seem that unrealistic. I’m in the Bay Area, right. There’s a lot of Burning Man folks here. And about a third of them did exactly what I just laid out. They made assloads of money doing whatever smart people do, and now they’re spending it on things. All sorts of things. All sorts of weird applications of niche science that I would never even think of.

      To me, the burner route seems a much more viable (not to mention comfortable and stable) way to get those opportunities. Am I missing something?

      • Pku says:

        I think a lot of it is the culture you come from. In the Bay Area that seems like the default idea, but there are a lot of places where academia is what smart people expect to do. I didn’t even really think of a “real job” as a valid possibility until I moved to the States.

      • Professor Frink says:

        One thing a lot of the “make a lot of money then do what the fuck I want” crowd misses is the equivalent of a science apprenticeship (which is basically what grad school is).

        As a result, they have no real clue how to set up a controlled experiment, break down a research problem and actually accomplish science. I hope this doesn’t change anytime soon because consulting for projects like this has been fairly lucrative.

        • eqdw says:

          I guess my confusion is, why does that have to happen in a university?

          • Professor Frink says:

            It’s where the people with the most experience doing science all work?

          • Tibor says:

            Statisticians from our department (we are an institute of mathematical stochastics) often give consultations like those Professor Frink mentions – to people in other departments. They said the medical doctors are the worst…the others don’t understand statistics either, but the doctors think they do 🙂 I am at one of the best (or let’s say top quarter to be safe…I dunno the rankings exactly) universities in Germany, so this does not seem to be a problem of the university itself.

            So I dunno, my impression is that outside fields like math, physics or computer science, the actual understanding of statistics is really really bad (ok, still better than that of your average Joe the accountant, but I mean among people doing some kind of research). This is worrisome, because a lot of research is crap because of this. But it does not seem to be exclusively a problem of independent scientists.

        • Airgap says:

          It’s possible that this inability is not caused by a lack of grad school but by them being idiots. Grad school filters out idiots to some extent. Do you know anyone who arrived in grad school with no idea how to set up a controlled experiment and acquired said idea there? NB: Not “Refined their skill at running experiments.” I doubt this, but do tell.

          • Professor Frink says:

            Almost everyone I knew in grad school was a dilettante who thought they were deeply knowledgeable when they arrived.

            Science is very difficult, and requires a great deal of experience. Ask any grad student late in their career why their first year was unproductive, and the answer is always “I had no clue what I was doing.” I was particularly terrible at the actual science aspect of science, so I actually switched to a closely related area of mathematics so that I could avoid it.

          • Eli says:

            Everyone thinks they’re really good at rigor, methods, and domain knowledge when they arrive at graduate school. That basic non-idiotness is why you were admitted.

            It’s not even close to sufficient to get you through to successful experiments (empirical sciences), proofs (formal sciences), or prototypes (engineering).

        • Shenpen says:

          Does science require experience? I thought it is a highly formal field where the “recipes” for doing all these can be put into books.

          • Vaniver says:

            > Does science require experience? I thought it is a highly formal field where the “recipes” for doing all these can be put into books.

            Ideally, science does not require training. In practice, there are so many fiddly bits that it is impossible to put all the fiddly bits into writing, especially in biological fields.

          • Peter says:

            Well, seeing how the first year of my PhD was basically a write-off and my final year was pretty productive – and loads of people I’ve talked to have had the same experience, yep, it requires experience. There are some areas where you can just follow the protocols and grind away (I’ve heard of a few areas of biochemistry where there are projects to automate it), but who’s going to write those protocols?

          • Eli says:

            There are recipes. However, they are not per-field or per-subfield recipes. They’re per-lab recipes. The recipes are passed down by oral tradition from adviser to student as the adviser brings the student into their own tradition of research.

            This has the advantage that you don’t have to wait for New Textbook Edition Time (which in postgrad-level material can be 13 years away) to learn how to do research in a rigorous, up-to-date way, with the disadvantage that it relies on your adviser knowing how to do that, caring enough to do it, and being properly held to account for rigor and recency by the journal editors in his field.

            When that doesn’t happen, you just get a lot of very depressed grad-students.

            Source: myself, my girlfriend, and one of our other friends… all our grad-school time.

          • Marc Whipple says:

            In theory, there is no difference between theory and practice. In practice, this is not so.

          • Tibor says:

            I can only comment on maths, which strictly speaking is not science, but I think that in one way or another this holds more generally. The problem is this: you do your bachelor and master (in Europe, the PhD only starts after the master, which takes 2 years, while the bachelor is mostly only 3 years and almost everyone continues to the master) and you learn the theorems and everything. You even understand the theorems on a certain level – that is, you can reproduce their proofs, and you probably even know which assumptions are needed for what. You can even prove some simpler statements without seeing the proof beforehand, but mostly those proofs consist of formally manipulating things you already know.

            Then you start your PhD and suddenly you have to actually apply those theorems – mostly to make new theorems – and you realize you don’t understand them at all. That is, you understand them at a formal level, but rarely on a more intuitive level. This is hard to teach (although it can be done, I think…you need a lot of pictures above all) but vital to learn if you are going to be successful at working with those things. So you revisit a lot of what you learned before and try to get a better intuitive grasp of it (I am not fully satisfied with mine yet), and you try it with the new stuff too. And with the new stuff, it is especially hard. Basically, if your intuition about something you are trying to explore in maths is right, then you are halfway there.

            This is where experience kicks in. The people who have worked on hundreds of problems, some of which were “sort of like this thing, except for this and that”, can “guess” the right approach to a problem much faster than an inexperienced PhD student. This is also probably why in maths it is common to meet your advisor every or almost every week, whereas in other fields (it was a surprise to me when I learned about it), students may see their advisor just once a month.

            The only way to get this kind of insight seems to be through experience. By working on a lot of various problems, you get a sense for the right approach to other problems. And you also learn to give up sooner on an approach that turns out not to lead anywhere and switch to something more promising. For example, I was working on a problem recently, wrote 2 or 3 lemmas that were slowly moving me in the right direction, and spent a couple of months focusing on that…only to give up on it eventually, take a step back, and realize that a very different approach is far more efficient and elegant. Had I had more experience, this process would have been much faster.

        • Smoke says:

          “consulting for projects like this has been fairly lucrative.” Are you able to consult for science projects in general, or just those in your subfield? If the former, is there any book I can read to learn about this generalized “doing science” skill?

      • Anon256 says:

        Depending on what you want to do, being a grad student may allow you to skip the hasslesome “make a ton of money” step and jump straight to doing whatever you want. Also following your two-step plan there’s some risk of getting stuck one way or another at step one and never making it to step two, though I think this risk is probably overblown.

        • cgag says:

          I’m curious why you think it’s overblown? I’d expect that the vast majority of people never reach step 2.

      • Adam says:

        The Bay Area is a unique place where a handful of billionaires have set up shop to throw cash around like darts at a dartboard, because 1) they can afford to lose the money anyway, and 2) failing nine times out of ten is still profitable if the tenth one funds the next Uber. That gives people in the Bay Area an unrealistic view of how feasible “make a lot of money, then do whatever I want” is as a life plan for a randomly selected human from anywhere in the world.

        • John Schilling says:

          It gives people in the Bay Area an unrealistic view of how feasible “make a lot of money, then do whatever I want” is as a life plan for people in the Bay Area as well. The billionaires aren’t going to make you even a millionaire on a secure salary, it’s probably too late to parlay an affordable starter home into millions in the Bay Area real estate market, and your stock options are 90% likely to be worthless in the end.

          • meyerkev248 says:

            This This This.

            It’s called the miracle of dilution.

            FOUNDERS have a shot. They get points. Significant ones. 50% of some 8-figure startup with 20 employees split across 3 founders is a lot of money. 50% of Facebook is Hoo Boy.

            Employees… not so much.

            (Please note that all of these numbers should be taken in the context of Bay Area housing prices, which have both doubled since I arrived here and are in “HAHA, this 800 sq. ft. condo with a 45 minute drive to work sold for $800K cash and this 1500 sq. ft. ranch home sold for $1.938 million in cash” territory. It’s an odd catch-22: you have to be here to make the moolah, but you need so much more moolah to be here.)

            Best case, you get a quarter. Of a percent.
            More likely, you get a dime (.1%) or a nickel. I’ll use a dime because it’s easy math.

            So let’s be clear. If you have a dime, and your company IPO’s for *Dr Evil* ONE BILLION DOLLARS *End Dr Evil*, you get one million dollars. At which point, under current Federal and CA tax law, you lose about half if it’s income and about a third if it’s capital gains. So half a million.

            My general impression is that it’s going to take 7–10 years for a company to IPO at a billion dollars, barring major exceptions, and that assumes you’re lucky enough to work for the 5% that actually DO. (Sadly, while VCs can invest in 10 companies, you can only work for 1.)

            So call it 70K/year post-tax. Except that as part of “the startup thing”, you probably took a 30K post-tax pay cut.

            So $30K/year for a 1 in 20 chance of making $700K. Which isn’t even enough to buy a house. Since all-cash is the only way to go these days.

            THEN Dilution kicks in.

            You see, every time you do a new round of funding, they release more stock. Which means that you started out with a dime and ended up not having 2 cents to rub together.

            Barring major, major exceptions, waiting for your company to IPO will cost you money. And even in the case of the major, major exceptions, the region is so expensive these days that you still won’t be able to afford your own house, or an air conditioner in the tiny cramped apartment that you’re spending the median national wage on to share with roommates.
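            The back-of-the-envelope math in the comment above can be sketched in a few lines. Every number here is the comment’s own illustrative assumption (a 0.1% stake, a $1B IPO, roughly a third lost to capital-gains tax, ~1-in-20 IPO odds, ~7 years to exit, a $30K/yr post-tax pay cut), plus a hypothetical ~20% dilution per funding round – none of it is real data:

```python
# Sketch of the startup-equity gamble: expected post-tax payout of a
# small diluted stake, minus the salary forgone versus a BigCorp job.
# All parameters are illustrative assumptions from the comment above.

def startup_ev(stake=0.001, ipo_value=1_000_000_000, tax=1/3,
               p_ipo=0.05, years=7, pay_cut=30_000,
               rounds=0, dilution=0.20):
    """Expected net gain of the startup gamble vs. a steady salary."""
    diluted_stake = stake * (1 - dilution) ** rounds   # each round shrinks it
    post_tax_payout = diluted_stake * ipo_value * (1 - tax)
    return p_ipo * post_tax_payout - years * pay_cut   # EV minus forgone pay

print(round(startup_ev(rounds=0)))   # no dilution: already negative
print(round(startup_ev(rounds=4)))   # four rounds of dilution: worse still
```

            Under these assumptions the expected value is negative even before dilution, and each funding round only pushes it further down – which is the comment’s point.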

            So… how DO you make enough money in Silicon Valley?

            It’s called the acqui-hire.

            1) Get hired about 2 years before they go under. This will cost you $60K in post-tax income.
            2) Get a five-figure check on the acqui-hire, a nice $10K/year pay bump, and another 5 (barely 6) figures vesting over a few years. You’ll lose nearly half of this to taxes, since it will be added on top of your BigCorp salary.
            3) As soon as humanly possible, NOPE your way out of the Bay Area through inter-office transfer. Bonus points if you keep your Bay Area salary.

            So because of the startup pay-scale plus enforced savings, you’ve lived like a relative pauper, at which point you’ve managed to save up 100 grand over 5–7 years.

            Will this make you rich? No.
            Will this pay for about half of your house off the stock grants when you move back to the Midwest? Yes.
            Will it at least break even on average compared to sticking it out back home and doing the career thing? Yes.

          • Devilbunny says:

            meyerkev248, this is why I live in Mississippi, despite its many problems. I make more than I would in California, I pay less in taxes, and at the end of the day, a house is a house – mine would probably run about 3-4x what I paid in a similar LA neighborhood and would be out of sight in SF.

            The logistics of living in flyover country are pretty good.

    • grort says:

      I went to (and finished) grad school and “become a professor and get tenure” was never a goal for me.

      — In fairness, I don’t think I’d rationally considered the economics. My main thought was that I was enjoying being in school and learning things, and I wanted that to continue, and I’d worry about the money later.

    • Mark says:

      I strongly agree. When I worked, a good portion of my free time went to studying the subject I am now paid to learn.

      That said, you need to know what makes you happy.

  7. “Employers resent this and (in theory) try to limit access to the privilege by lowering workforce, automating, etc, as much as possible.”

    That’s a very strange way to describe the change in employment that might come after a minimum wage increase.

    • Adam says:

      That is definitely a weird way to describe it rather than ‘a $40 budget constraint naturally places different hard limits on how many $1 burgers versus $15 burgers you can buy.’ Sentiment doesn’t need to play a role.
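      The budget-constraint arithmetic can be made concrete in two lines; the $40 budget and the burger prices are the comment’s own illustrative numbers:

```python
# A fixed budget mechanically caps purchases at each price level;
# no sentiment about workers or employers is needed for the effect.
budget = 40
for price in (1, 15):
    print(f"${price} burgers: at most {budget // price}")
# $1 burgers: at most 40
# $15 burgers: at most 2
```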

  8. Alexander D says:

    >I’m liberal Monday Wednesday Friday and libertarian Tuesdays and Thursdays

    And weekends?

  9. onyomi says:

    I’m currently in academia and hoping to get tenure, but I think it’s a pernicious system overall.

    First, it creates a real sense of first- and second-class citizens. No matter how much people try to pretend otherwise, people look at you differently when you’re on “the tenure track.”

    Second, it’s strongly self-reinforcing: people who supposedly have promise to produce a lot of research get on “the tenure track,” which gives them light class loads, opportunities for research leave, etc. all designed to give them the time and resources to do that research. If you really want to do research, but are unlucky enough (and it really is a pretty big crap shoot, at least in my field) to miss that “tenure track” window, you will likely be stuck teaching 4 classes a semester and therefore not having time for the research that would get you on the elite track.

    Third, the vast majority of professors don’t use it as an opportunity to do controversial, innovative research. They just use it as an opportunity to slack off. And I can’t begrudge them that, because the run-up to getting tenure is 10 to 20 years of pure stress. If it were more like “for each new book you publish you go up a paygrade” rather than “publish two books and reach the promised land where you’ll never have to do anything again,” there would be more incentive for older professors to keep doing something.

    • houseboatonstyx says:

      > If it were more like “for each new book you publish you go up a paygrade” rather than “publish two books and reach the promised land where you’ll never have to do anything again,” there would be more incentive for older professors to keep doing something.

      Something other than teaching students? From way out here in early twentieth century Autodidactica, tenured professors looked nice: relaxed among peers on modest secure income above the rat race, teaching students the tone of a liberal education.

    • Shenpen says:

      >First, it creates a real sense of first and second-class citizens. No matter how much people try to pretend otherwise, people look at you different when you’re on “the tenure track.”

      Are you instinctively egalitarian, or why do you dislike it? I would love being a noble looking down at serfs; I would also love being a serf hoping that one day I will be a noble looking down on the other serfs. I would even be okay with being a favorite serf of a noble and thus being above other serfs.

      This all sounds like a far more interesting game than egalitarianism; that stuff sounds like a boring game with nothing to win. Without a real first- and second-class division, what kind of game do you even play?

      And this is even meritocratic non-equality, so more defensible if you care about stuff like ethics.

      • Tom Womack says:

        Serfdom really isn’t a system with a promotion process: serfs don’t become nobles. Very, very rarely a serf became a soldier, and so notable a soldier that he became a noble (see Shakespeare’s _Henry V_ and the origin of the noble surname Bowes; but 15th-century Britain didn’t really have serfdom).

        It is conceivable with perfect play that a sixth-generation descendant of a serf might be a noble, via extremely careful pushing of a succession of beautiful daughters to exploit the various kinds of permitted idiosyncrasies of nobles – the path from actress to noble has occasionally been followed, and serf actresses were not unprecedented.

        • Shenpen says:

          Do you mean feudalism was more rigid in England and France than further East? That would be an interesting proposition.

          Because from Poland to Hungary to even Transylvania whole villages were routinely ennobled. Given that serfs paid taxes but did not serve in the army – it would be foolish to trust the oppressed with weapons – every time they needed soldiers more than money they just picked a village or two that was really brave at fending off robbers or something and ennobled everybody.

          They were still peasants, not magically rich, although now they got to keep their taxes – and spend it on arms, so the financial outcome was not necessarily positive but of course they enjoyed the status elevation.

          There are whole ethnic groups and subgroups in the region where the standard model was being an ennobled (and with that, entitled to bear arms) peasant or herder, an ex-serf – Cossacks being the most famous, or the hajdús in Hungary, and even a common pattern amongst the székely / szekler people of Transylvania: they were originally settled there to guard the border, hence they paid taxes in blood, not money, the noble way, i.e. allowed to keep arms and required to do service. Sometimes they were called noble, at other times “free” – it meant the same thing.

          It is possible that England had a more rigid system, since England allowed, even required, serfs to own bows, and regularly raised levies, which is a bit mind-boggling; just how they didn’t murder all the lords I do not know.

          • Richard Gadsden says:

            Nobility was much rarer in Western Europe.

            Most of those bowmen were yeomen, not serfs – they owned their own land and weren’t tied to the village.

            A serf could become a yeoman. A man-at-arms (from a yeoman class) could be knighted on the battlefield. But none of those made you a noble.

            The whole of England never had a full thousand noblemen. There are only 817 nobles even now (plus 725 life peers), and that’s including Scotland and Ireland. There were rarely more than 200 in the actual feudal era – from a population of about 2 million, so 0.01%.

            Poland had about 10% of the population in the szlachta. A noble peasant was a contradiction in terms in England; a peasant szlachcic was normal in Poland. “Szlachcic na zagrodzie równy wojewodzie” (“a nobleman on his farm is the equal of a voivode”) would be simply untrue in England – a gentleman who owned a farm was not the equal of an earl.

            It’s not that the West’s feudalism was stricter or less strict than the East’s; rather the West had more gradations. In the West there were a whole bunch of steps between serf and noble – yeoman, gentleman, esquire, knight, baronet, at the very least. And there were five gradations within the nobility (baron, viscount, earl, marquis, duke) before you get to the royals (prince, king). In Poland there were serfs, free peasants, and noblemen (plus citizens of the towns and cities).

      • onyomi says:

        America has taken all the joy out of feeling superior, as we see with the lifestyles of Mark Zuckerberg and Warren Buffett. Asians still understand how to act aristocratic:

        • SFG says:

          I’d argue it’s a good thing–less money spent on expensive toys that have only signalling value, though of course nerds love their gadgets as much as the frat-boys love their yachts.

      • Eli says:

        Suffice to say there’s a lot of us who just don’t like the whole experience. Dealing with steeply unequal social situations is, to me, about as pleasant as consuming rotten, stinking (but not actually poisonous) food.

      • Deiseach says:

        A serf, by definition, can never become a noble. Pride of blood, heritage and breeding. It’s like saying “A donkey can hope one day to become a thoroughbred racehorse”.

        Only in times of massive upheaval can the common folk aspire to become great, by leading armies or amassing enough money to buy their way into the upper classes. And even then – see the Peasants’ Revolt and what happened to Wat Tyler and Jack Straw.

        • Airgap says:

          Serfs can aspire to be freemen. Freemen can aspire to be gentlemen. Gentlemen can aspire to be ennobled. Social climbing, like other sorts of climbing, is done one rung at a time. A ladder isn’t broken because it’s not an elevator.

          • Deiseach says:

            every time they needed soldiers more than money they just picked a village or two that was really brave at fending off robbers or something and ennobled everybody.

            That is not ennoblement, that is enfranchisement. A village of serfs may be declared yeomen and given certain privileges in return for certain obligations, but that does not automatically make them all barons and earls. And I’d imagine if any noble did eventually come out of it, it would be more likely to be the village headman or elder who got the rank – even a village of serfs will have a pecking order!

            It will take a long time for a serf to become a yeoman to get on the ladder of advancement; probably long down the line one of his descendants may indeed be ennobled, or make enough money to marry into the impoverished aristocracy (although aristocrats are very good at not being impoverished; look at the Duke of Devonshire or the Duke of Westminster, who owns chunks of very valuable real estate in high-status areas of London which bring in considerable rents).

            The marriage trade where American heiresses married land-rich, cash-poor European (including British) nobles, was the primary way of commoners attaining the rank to go with their money – see how Jenny Jerome, daughter of a wealthy American financier, married Lord Randolph Churchill (third son of the Duke of Marlborough and – personal note of animus here – a poisonous little toad, and as a third son with no prospects of his own apart from making a wealthy marriage to support a political career), and if you don’t think there’s a difference between arrivistes and nouveaux riches and those who are the ton, why do you think we have such terms in the first place? 🙂

          • Richard Gadsden says:

            Deiseach, this is a translation problem.

            “Szlachcic” is not the same as “noble”, but that’s the conventional translation.

            It’s more like the old Roman or Athenian idea of citizen – the szlachta were all equals to each other (where western nobility had a hierarchy of ranks) and were the enfranchised class – the only ones allowed to own land outside of towns, the only ones entitled to vote, etc.

  10. moridinamael says:

    > Why don’t chef or engineer salaries look like this?

    My guess is just culture. Having been in academia for way too long myself, I understand the cloying psychological effect of getting used to the idea that Professors are more than human, and that it is right and proper that they possess great power and prestige and corresponding pay.

    In contrast, as a practicing engineer, I/we as a group view this sort of hero worship as horse shit. Nobody is so good that somebody half their age can’t contribute to their work or even correct it. So while there is a sliding scale that corresponds roughly to what you might call “resume impressiveness” there isn’t any pervasive myth that Level 2 Engineers are just better and more valuable and deserve more categorically.

    There’s also an entirely separate argument from economics. If you view engineers as equipment with high maintenance cost, you want to optimize that human equipment for productivity while maximizing net profit, which implies a not-too-low but also not-too-high level of compensation. There’s no signaling, there’s only results. Universities etc aren’t restricted by the same economic incentives as engineering firms. A huge component of their economic model is signaling how awesome and prestigious their professors are, and part of that signaling involves paying them well.

    • Pku says:

      Is this really that common? I’ve never really seen this in math culture (well, except for Terry Tao). But then, I have heard that math culture is really different from a lot of academic culture. (I do think my advisor is ridiculously smart, but that’s less a function of him being my advisor and more a function of having seen him pull some amazing stuff out of his sleeve.)

      • LTP says:

        Possibly it’s different because math is the only subject where the quality of your work can 100% speak for itself to other experts with no real room for alternate interpretation or ambiguity.

        • Douglas Knight says:

          The quality of work is almost entirely whether it is interesting, which is just as subjective in math as anything else. There is a difference in math that because the questions are precise, a question can be endorsed as interesting ahead of time, but very few papers are complete solutions to pre-specified questions. Usually a paper will claim to be progress towards a popular goal, but it requires judgement to decide whether it actually is.

      • Creutzer says:

        I think it varies a lot by scientific field, scientific sub-field, and even country. I’m in culturally a math-like sub-field myself, but I’ve heard stories from the neighboring sub-field that suggest it’s not like that. I will not speculate on whether the fact that my field is also more math-like in terms of subject matter plays any role in this.

      • James Picone says:

        Boasting aside: I’ve worked at the same company as Trevor Tao (Terry’s brother), just across a hallway. Talked with him a few times – he’s a really friendly guy. He also used to solve a 4×4 Rubik’s cube in the cafeteria line for fun.

    • Shenpen says:

      Re: hero worship: it depends on which tier you are talking about. A 140 IQ person in a 120 IQ team is no magic, but a 120 IQ person in a 100 IQ team is magic. Hero worship is wrong in the first case (Silicon Valley super-smart innovators) but right in the second case (a thinking person in a team that does everything by routine).

      When I work with average .NET business logic programmers or SAP consultants or that bunch, I often yearn for someone to show ANY sign of intelligence, as in, not just running through trained routines but show some kind of spark of problem-solving, creative skills.

      I think you work in the higher tier, the first case. The kind of companies, like Microsoft, who use creative problem solving tests at hiring.

      I tend to work in the second tier, because it is safer money. The kind of tier that hires by years of experience. Here, an actually intelligent person IS a hero.

      • Creutzer says:

        A 140 IQ person in a 120 IQ team is no magic, but a 120 IQ person in a 100 IQ team is magic.

        I’m inclined to speculate that you have insufficient experience with people 15-20 IQ points smarter than yourself if you think this becomes less magical the higher you get on the scale. But why would you think that, anyway? In either case, one person can think thoughts, and see and do things that are inaccessible to the other.

        When I work with average .NET business logic programmers or SAP consultants or that bunch, I often yearn for someone to show ANY sign of intelligence, as in, not just running through trained routines but show some kind of spark of problem-solving, creative skills.

        This suggests you believe these people to be around IQ 100, which I find highly unlikely. Maybe I’m hopelessly miscalibrated, but I don’t expect IQ 100 people can make a living writing computer programs.

        • veronica d says:

          Honestly, I’ve met some remarkably “dim” .net-type programmers, and these people seem qualified to kinda-sorta figure out a basic CRUD app. They know where the pieces go and how to put them in place. They can even, through sheer force of will, pound out something kinda complicated, but in a very basic “just keep adding code” sort of way. If they need new logic, they add a new if-then block. Add and add and add. At no point do these people realize that they can step up a level of abstraction.

          So yeah. I doubt these are IQ 100 people. I’d guess maybe they’re in the 110-120 range, perhaps a bit more. But they’re just — they know what they know and that is it.

          I’m manifestly different from them.

    • Adam says:

      I’ll also say this somewhat plays out in the military. While a young officer makes a salary comparable to or even better than a lot of his peers in industry, when you start getting high up, like a brigade commander or especially a general officer, you make a pittance compared to someone with comparable corporate authority. But god damn, they make up for it in the way they get treated by everyone around them. Their day-to-day lives look like a parody of Henry VIII, at least when they aren’t getting called before Congress.

      This seems to play out in fields where the upper limit of compensation for star performers is capped in some way – by public-sector budget realities for division commanders, or, for doctors, by the fact that medical treatment can only scale to as many patients as you can treat within discrete chunks of time; unless you’re a psychiatrist for movie stars or a surgeon for NBA athletes, you can’t feasibly charge infinity dollars for your service.

  11. Janne says:

    I honestly don’t think tenure itself is the defining mechanism of the two-tier structure in academia. Many countries don’t have such a hard cut-off in the academic career, and you still get a two-tier system in practice. The reason is rather that funding is limited, and it tends to flow to those that already have funding. Those that can pull in resources get permanent positions; those that can’t will work on a per-project basis for the people that got the funding, getting paid from that funding in turn.

    And as much as I enjoy griping about the horrible state of academia, the fact is, the “have-nots” in this particular world aren’t really suffering. I’m a serial post-doc, and have long given up any idea of a permanent position. And sure, I’ve gotten job offers with three times my current salary; compared to my non-academia peers I’m poor. But compared to people in general I have nothing to complain about.

    My post-doc salary is a bit above the median for people my age in my country, and it lets us live comfortably, go on vacations, save for retirement and so on. Sure, I have to find a new place to work every few years, but it’s not as if industry work sets you up for life either. At least I won’t ever lose my job at a month’s notice because someone decided to pull the plug on a project. I can come and go whenever I want and do side projects for myself as long as I do my job, and it’s much easier to actually take vacation days than at a company job. In absolute terms, an academic “have-not” existence is still a pretty good place to be.

    • onyomi says:

      Your position sounds a lot better than that of many of the academic “have-nots” in America. I have one friend who teaches five classes a semester at three different schools. For each of these classes he is paid something like $3000. Meaning he makes about $30,000 a year for doing way more actual teaching than tenure track academics do in 2 or 3 years. $30,000/year where he lives is barely above poverty level.

      Yeah, it beats being a coal miner, and maybe beats working at Starbucks in terms of personal fulfillment, if not salary and benefits, but still kind of sucks.

      • Janne says:

        Note that I’m talking about research positions, not teaching. Also, I’m not in the US, so comparisons may be imperfect. US postdocs are listed as having a median yearly salary of $43k, which sounds pretty comparable to my experience.

        While 30k is not a huge amount it’s not exactly chicken feed either. I just looked it up for the US; and it’s about the same as an administrative assistant, an optician or a paramedic – jobs that people can and do support themselves with.

        Postdoc salaries are about the same range as that of a high school teacher or an accountant. Not a bad life.

        • Fallenscien says:

          Direct comparisons of salaries will lead you to incorrect conclusions, because cost of living varies significantly by location.

          $30K a year is enough to live comfortably in a small town. And if you’re a high school teacher or optometrist, you can get a job in one quite easily. The demand for high schools and vision clinics is pretty evenly distributed. An average one-bedroom apartment in a small town is probably going to run you less than a grand a month (based on current, personal experience), but you’d probably pay less than that, because you’d buy a house. Real estate is cheap in small towns.

          As a PhD working in a university lab, you are restricted to working within commuting distance of a large university. $30K a year isn’t enough to pay rent on an average one-bedroom apartment in New York (currently around $3040 a month based on a cursory web search). It’s going to be a tight budget for even fairly small universities, because real estate prices are driven up by the large, captive student population and comparatively wealthy professors, as well as the tendency of universities to be right in the middle of cities.

          • AJD says:

            Do universities tend to be right in the middle of cities? I guess when I think of high-profile American universities, plenty of them are in rural areas (UMass, Penn State, Indiana, Dartmouth) or suburbs (Michigan, Stanford, Princeton) rather than cities (Columbia, Penn, Georgetown, UCLA).

          • Janne says:

            A postdoc pays a median of $43K, not $30K. And that’s a median; big city universities will tend to offer higher wages to offset the expensive location.

          • PDV says:

            Universities aren’t all in the middle of big cities, but just by existing in a location for a while, they almost all build up a small city, or at least a large, wealthy and expensive town. Take UMass Amherst; that campus is in the middle of a rural area, but it’s on par with cheaper urbanized areas for cost of living, because there are five colleges’ worth of college students there and they bring a lot of money into the local economy.

          • Anthony says:

            AJD: housing near Stanford.

            Housing nearer Stanford, in a much safer area.

          • AJD says:

            I didn’t say it was cheap to live there; I just said it wasn’t in a big city.

      • grendelkhan says:

        For each of these classes he is paid something like $3000.

        This is really a different topic, but wow is that some serious exploitation there, in the technical sense. Handwaving a class size of twenty, that’s $150 per student per class, probably $50 per student per credit hour. Costs per credit hour seem to range from $250 at the low end to $1,289 at, say, NYU.

        So your friend is capturing at most twenty percent of the value they’re bringing in, probably considerably less. If any of the classes they teach are giant lectures, the math gets even more horrifying. Rising tuition, falling wages. Ugh. I can see why there’s a ha-ha-only-serious sentiment of “burn it down” from the people getting screwed over by this arrangement.
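
        Since the numbers are all right there, the back-of-envelope above can be checked directly. A minimal sketch in Python – the 3-credit class is my assumption for illustration; the other figures come from the comment:

```python
# Adjunct pay vs. tuition revenue (illustrative numbers; 3-credit class assumed).
pay_per_class = 3000        # what the adjunct is paid per class
class_size = 20             # handwaved class size
credit_hours = 3            # assumed: a typical 3-credit class

pay_per_student = pay_per_class / class_size          # $150 per student per class
pay_per_credit_hour = pay_per_student / credit_hours  # $50 per student per credit hour

# Tuition per credit hour: $250 at the low end, $1,289 at NYU (per the comment).
for tuition in (250, 1289):
    share = pay_per_credit_hour / tuition
    print(f"tuition ${tuition}/credit-hour -> adjunct captures {share:.1%}")
```

        At the cheap end the adjunct captures 20% of the tuition revenue; at NYU-level prices it drops below 4% – and bigger lecture sizes only push the share lower.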

    • Professor Frink says:

      Your postdoc situation is substantially better than that of US academics, where the pay is generally a bit lower than the average college graduate’s, and you get no paid vacation. Postdocs are also often on soft money, so if a grant doesn’t come through they can lose their job on short notice.

      Adjuncts are even worse off. It’s common for people to teach 4 or 5 classes at multiple schools for $2k–3k each. After driving expenses, health insurance, etc., you are talking 50 hours a week for poverty-level wages. You’d make more, with better benefits, as a cashier at Wegmans (a grocery store).

      • Janne says:

        As I said in the other comment, postdoc salaries seem to actually be about the same as here. Much lower than if I’d skipped grad school, but still a fairly good, livable salary in absolute terms. And yes, it’s soft money – I go from project to project, leaving whenever a project ends. I’m no less secure than I’d have been working in some small IT shop.

        Adjuncts aren’t researchers, so they’re not directly comparable to the tenure/non-tenure research division.

    • Creutzer says:

      I’m pretty sure that despite the fact that in some places, being a post-doc is financially fine, people suffer from the perpetual uncertainty about whether there will be a next position for them, and especially from the need to move every two years. It sucks to have to find new local friends all the time, and let’s not talk about relationships. “Is he/she also in academia?” – “Yes…” – “Oh dear, sorry to hear that!”

  12. Liskantope says:

    I agree with most of the points I’m hearing against dualizing systems, but I also see a certain attraction in the certainty it brings once one has either reached the upper tier or entirely failed to reach it. Of course, this advantage is directly reduced when one considers all the time and energy required to unambiguously pass that fork in the road. For instance, I would rather have tenure and be guaranteed to be reasonably well off (while not being able to exceed an annual income of $100,000 or so) than find myself in a more flexible system where it’s possible to move as high as I like on the ladder depending on how well I do, but where there are no guarantees. The main problem is that, even after graduate school, I still have no idea whether I’ll be able to achieve tenure, and I expect many more years of pretty stressful positions before I can get it.

  13. Carl Shulman says:

    “and the idea of a half a minimum wage defeats the whole point.”

    Wage subsidies like the EITC scale smoothly. If under the minimum wage X people would get employed (at the minimum wage or higher), you can get rid of the minimum wage and set an EITC such that more than X people earn (in wages+EITC) at least the value of the old minimum wage, plus additional people earning less.
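
    A toy sketch of how that smooth scaling works, with entirely hypothetical numbers (the $7.25 floor and the 40% phase-in rate are illustrative only; real EITC schedules also phase out at higher incomes):

```python
OLD_MINIMUM = 7.25   # the repealed minimum wage, $/hr (hypothetical)
PHASE_IN = 0.40      # EITC-style subsidy rate on earned wages (hypothetical)

def earnings_under_minimum_wage(market_wage):
    """Binary outcome: hired at your market wage if it clears the floor, else unemployed."""
    return market_wage if market_wage >= OLD_MINIMUM else 0.0

def earnings_under_subsidy(market_wage):
    """No wage floor: everyone works, and wages + subsidy scale smoothly with the wage."""
    return market_wage * (1 + PHASE_IN)

for w in (3.00, 5.20, 7.25, 10.00):
    print(f"${w:.2f}/hr market wage: with floor ${earnings_under_minimum_wage(w):.2f}, "
          f"with subsidy ${earnings_under_subsidy(w):.2f}")
```

    With these numbers, anyone whose market wage is at least 7.25 / 1.4 ≈ $5.18 takes home at least the old minimum in wages plus subsidy, while people who would have been priced out entirely under the floor still earn something – the shape of the claim above.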

  14. Douglas Knight says:

    There are two things going on in academia that I think people mix up: adjuncts vs post-docs.

    Post-docs are temporary researchers who are working up the pyramid to a research professor job. They are full-time jobs that pay better than grad school. Grad students are all paid about the same across fields, but you can tell the health of a field by how well the post-docs are paid. People spend a lot more time as post-docs now than a few decades ago, but most professors go through this stage, so it’s a plausible lottery ticket.

    Adjuncts have lost the lottery and refuse to admit it. At most they have a chance at a stable teaching job, probably at a community college, not “travel to conferences in far-off countries.” They could accomplish pretty much the same thing by teaching high school.

    • Eli says:

      Except that universities have been cutting back on professorships and replacing them with more postdocs and grad-students, while also making professors, as such, do more management and administration than actual research. So actually, what many scientists now want is to eliminate post-docing entirely in favor of “perma-doc” positions, in which at least some funding and job security is permanently assured but the researcher can do research.

      • Creutzer says:

        Yes. France is the only country I know of where such a thing exists, and there is consequently much competition and the positions are badly paid. I’d still take one in a heartbeat.

  15. mico says:

    It’s also possible that drug footsoldiers’ labour is only worth $3/hr and so they can’t work for McDonald’s because it is subject to minimum wage laws. Another reason is that working for McDonald’s is less likely to get you a girlfriend.

    Neither of those apply to wannabe academics – they can work legally in the regulated market at a price they are worth, and increase their dating market value at the same time.

    • Vaniver says:

      Another reason is that working for McDonald’s is less likely to get you a girlfriend.

      Is this actually true? I see conflicting reports, and I’m reluctant to trust the glamorous version over the ‘two cultures’ version.

      • Oliver Cromwell says:

        Could you show me a report that suggests an entry level McDonald’s worker has higher sex appeal than a drug gang footsoldier?

        • merzbot says:

          I think there are a lot of people who would rather be in a relationship with someone who isn’t a gang member and career criminal than someone who is.

          • Tarrou says:

            They’re called “men”. The stereotype of women loving “bad boys” is so ubiquitous as to defy argument. Even Sheryl Sandberg is out there telling women to go out and bang a pack of ex-cons before belatedly, after many years, seeking out a stable househusband to drain resources from and dominate emotionally.

            Give a woman under thirty a choice between the average male poster on this forum and a career criminal, and the split will be greater than 9/1 in favor of the con. You ain’t gotta like it, but you do have to deal with it, unless you are lucky enough to be gay.

          • Troy says:

            Give a woman under thirty a choice between the average male poster on this forum and a career criminal, and the split will be greater than 9/1 in favor of the con.

            My wife is under 30. This is not true of her (as evidenced by the fact that she married me, I was her only boyfriend, and I am not a con). I am confident that it is not true of any of the other under-30 married women I know, of which there are several in my neighborhood, and I’m skeptical that it’s true of many of the single under-30 women I know in my neighborhood (many of whom are grad students).

            Consider that the women who PUAs pick up at bars are not a random sample of women.

          • Tarrou says:

            @ Troy,

            I don’t want this to sound dismissive, we all have our own experience. You have a data point. I have many many dozens of them. I don’t track “PUA” or whatever, but consider these two men:

            19, slim, sensitive, intelligent. Reasonably athletic (track and soccer), top of his classes. Deeply careful with women, solicitous of their feelings. A “good guy” in every sense, polite and conscientious.

            27, fifty pounds heavier. Eight years infantry veteran of three combat tours. Deeply misanthropic, anger issues, hypervigilance and a heavy alcoholic. Tattoos, scars, missing half a hand. Completely mistrustful of women, sexually selfish, socially callous.

            One of these men rarely got laid, and had one girlfriend in five years. One of these men never spent a weekend alone, dated a different girl every month. Both of them were me, at different stages of my life. Now, I didn’t much like being either of those guys, but women definitely liked the latter over the former.

          • Pku says:

            The statistic I’ve heard is that men who score higher on dark triad tests have more short-term relationships and one-night stands but have a much harder time finding stable long-term relationships. This seems to correspond pretty well with you guys’ stories.

          • Troy says:

            You have a data point. I have many many dozens of them.

            Well, yes, that is the difference. Men who have lots of — *ahem* — data points will be sampling from a different population than those of us who (by choice) just have one.

            Less euphemistically, women who have sex with someone they’ve just met or even someone they’ve dated for less than a month (whether that’s the first or second man in your description) are a much different group than women who want to keep their virginity until marriage. Your experience is with the former. Mine (as a Christian, who went to a Christian college as an undergrad and has largely Christian friends) is with the latter.

          • Airgap says:

            as evidenced by the fact that she married me, I was her only boyfriend, and I am not a con

            She just lied about the 2 dozen cons she banged before she married you. All women are like that. You should read this blog called “Heartiste.” It’ll open your eyes, man.

          • Jaskologist says:

            Troy, I think your parenthetical is key here. You picked your wife from a subculture that still promotes healthy, long-term mating strategies. The wider culture does not, and those who didn’t grab a spouse in or shortly out of college do not have a lot of good prospects to choose from.

          • Oliver Cromwell says:

            An entry-level McDonald’s worker isn’t just a stable provider as against an exciting badboy; he’s also a complete loser within the stable-provider economy. The girls who want a stable provider aren’t going to be impressed by his job at McDonald’s. By comparison, plenty of girls will be impressed by the edgy lifestyle of a drug dealer, no matter how badly it pays.

            Also, being a drug dealer isn’t just edgy, it’s powerful. These drug gangs are really militias which are the de facto governments – at least day-to-day – of the areas they patrol. In a military dictatorship, a member of the military has real official status as well as a badboy image. For sure this means nothing outside of their patch, especially in areas which are controlled by and loyal to the official United States government, but most girls these people meet live in the same patch.

          • Tarrou says:

            @ Troy

            You are absolutely correct that we are picking from different pools. But if you think the sexually pure, long-term-relationship-focused pool is the majority, and not an insanely tiny self-selected minority, you are mistaken. I stand by my prediction.

            Nine out of ten go for the criminal.

            If you’ve found that one in ten, congratulations, and I wish you all the best. But I do believe my experience is far more generalizable than yours.

          • Troy says:

            Sure, there are different subcultures here, and I don’t at all deny that there are many women like you describe. But I think the 9 to 1 figure is crazy hyperbole. I certainly don’t think that only 1 out of 10 women are in my subculture, at least not in the United States. According to Wikipedia, some 71% of Americans are Christians, and some 26% are white evangelicals. (Women tend to be more religious than men, so this is an underestimate of women.) I’m confident that a large majority of the latter group (who were most prevalent where I went to undergrad) fit my experience better than yours. Among other self-identified Christians, many are nominal, but I still expect that about half would fit my experience better than yours.

            I’m very skeptical that your 9 to 1 number is even true among educated non-Christian white women, of whom I know a fair number, but even if 100% of these women are as you describe, the demographics suggested above would make your figures impossible.

          • Tarrou says:

            @ Troy,

            “Claimed to be a christian on a survey” and “actually practice the sexual mores of the sort of christians who attend christian college” are not the same thing at all. We both know the vast, overwhelming majority of those self-styled “christians” aren’t very serious about it. Given the divorce rates and out-of-wedlock birth rates, it’s pretty unlikely that more than a couple percent of the total population actually follow christianity as a primary guide to their sex life.

            And while I’m at it, I’d point out that the christian girls at christian colleges aren’t as resistant to the charms of people like me as you seem to assume. I grew up very religious, and one of my best friends attended Cornerstone U, a very conservative christian college here in Michigan. So there are a couple of those dozens of data points from the very sort of pool, self-selected for sexual continence, that you’re talking about.

          • SFG says:

            Curiosity: do you think, for us nerds who are not rich, converting to evangelical Christianity is the way to go?

          • Troy says:

            “Claimed to be a christian on a survey” and “actually practice the sexual mores of the sort of christians who attend christian college” are not the same thing at all.

            Right, that’s why I said that a “large majority” of white evangelical women and “about half” of the rest of (self-identified) Christian women would not pick the con, as opposed to making universal claims about either group. (I think the former claim is warranted inasmuch as, while there are lots of Christmas-and-Easter Catholics who will say “yes, I’m Catholic” on a survey, most self-identified evangelicals are serious about religion.)

          • Troy says:

            Curiosity: do you think, for us nerds who are not rich, converting to evangelical Christianity is the way to go?

            Do you mean, in order to find a mate?

            You’ll certainly find more women who are interested in getting married and settling down while fairly young, as well as having children, if those are traits you’re looking for. That’s not unique to evangelical Christianity, though; in particular, you’d find similar traits among conservative Catholics.

            Now, I’m not sure I can in good conscience recommend conversion for such mercenary motives (although I can recommend conversion for non-mercenary motives, such as the truth of Christianity). But thinking of it purely from an instrumentalist perspective, you might want to ask yourself just how congenial you would find the evangelical subculture to be. For example, there’s an anti-intellectual strand in American evangelicalism that most posters here would probably find off-putting. (Catholicism is a better choice from this perspective.)

            However, perhaps I am taking a question meant in jest a little too seriously. 🙂

          • SFG says:

            No, actually, it wasn’t a joke. More of an exploration.

            I would be willing to go to church and pay the fees in order to avoid the nasty ‘marry the guy for five years, divorce him, and collect alimony’ scam. I would be faithful to my wife and raise however many children we agree on. What happens in my mind is between me and God, and it’s not unknown for people who start going for instrumental reasons to start believing later (as Lewis said, ‘when men lower their knees to pray, their minds often follow’). I am not hostile to Christianity.

            Thing is, I am older (mid-thirties), so it might be too late for me. Reasonably good job (no more details, because this is the Internet.)

          • Jaskologist says:


            Of course. It’ll do your soul good.

            As for your marriage prospects, eh, maybe. The dating pool is certainly of a higher quality, with stronger cultural forces pushing towards marriage. But being attractive is still going to matter. Christian women are still women; they don’t stop being attracted to the various things women like in men anymore than Christian men stop being interested in boobs. And they are still exposed to the influences of the wider culture urging them to put off marriage and sleep around more. So your odds get better, but it’s not a panacea.

          • Troy says:

            I don’t think mid-thirties is necessarily too late, although it’s getting up there. I suspect that if you became regularly involved at an evangelical or Catholic church you could meet a woman of the kind you’re looking for (although Jaskologist’s comments are also on point; you should still make an effort to be attractive, obviously). I will say, that if you don’t now believe, or don’t now fully believe, a Catholic parish, even a more conservative one, may be a friendlier place. There is often more pressure in evangelical churches to talk about one’s personal religious experiences and the like, and doubt tends to be looked at with more suspicion (although I should stress that this is far from universal). By contrast, if you participate in the rituals of the Catholic Church, most Catholics (in my experience) are less concerned with what goes on inside your head.

            That said, there are many evangelical-type churches, especially (in my anecdotal experience) mega-churches, that try to be very open to “seekers,” and so you may be able to be more honest and open in that environment as well.

            And even if you don’t find a mate, you will in many churches find a welcoming and loving community different from what you’ll find anywhere else.

          • Jaskologist says:

            Personally, I don’t find Evangelicals anti-intellectual. What they are is ignorant, which is to say they’re just like any other average American. I’m doing what I can on that front, which is why I teach courses in church on Augustine’s Confessions and Early Church History. I see it as helping to restore to them a heritage that was stolen from them.

            Someday we’ll tackle Boethius. Someday…

          • Svejk says:

            Give a woman under thirty a choice between the average male poster on this forum and a career criminal, and the split will be greater than 9/1 in favor of the con.

            This is nonsense. 99% of the value-add of the PUA lifestyle is that they approach vastly more women and are de-sensitized to rejection.
            All those supposedly ballin’ ghetto cons the meek misogynists fantasize about are sleeping with the same relatively small pool of ‘loose’ women. They all know each others’ baby daddies because they all sleep with the same men. Derrick who owns the body shop is not taking their mess. The ballers are ballin’ within a relatively small and almost completely overlapping circuit.

            As Troy states, there are at least two pools of women with little overlap, and it is easy for shy men to mis-estimate the relative sizes of these pools. The modal woman has had relatively few sexual partners, and it is not at all difficult or statistically unlikely for a marriage-minded man to find partners from the few-to-none pool, especially if he is a member of certain religious communities.

          • CJB says:

            Errr- who do you know where the only women having casual one nighters, serial monogamy and the rest are the trashy hoes?

            Ever been in, say, Washington DC? Or any college town? Or NYC? Or read/listened to stories women tell about their sexual history? Or sororities? Or….

            It’s not just trashy ghetto/trailer park girls. It’s….well, it’s most women. At least in my experience of drinking with lots of young professional women who talked like college frat boys.

          • Svejk says:

            Errr- who do you know where the only women having casual one nighters, serial monogamy and the rest are the trashy hoes?

            For the majority of the college-bound female population of the US, serial monogamy entails a high school boyfriend, 1 or 2 college boyfriends, and 1 or 2 post-college boyfriends. Their partners are drawn largely from the same socioeconomic class. Many women have even fewer partners; they tend to be religious.
            I’m not arguing that these women are cloistered, merely that they have no more experience than their focal male marital pool, and show no preference for sociopaths. However, if you adopt a sociopath’s resilience to rejection, you too can find a partner from this pool. This is the golden nugget in the PUA dross.

          • Tarrou says:


            I’m not quite sure where to go with that claim, other than to say that all of my experience, and the experience of my friends, brothers, co-workers, fellow students, bandmates, inmates and cell mates says otherwise.

            And this is not misogyny or “PUA”, or whatever you want to call it. I like women just fine, and what’s more, I like them for who they are, not who I think they are or wish they were. This isn’t sour grapes, I do just fine, and it’s not the ramblings of a congenital manwhore. I settled down with a wonderful young lady some years back and am quite happy.

            This is not a judgment on or condemnation of women, this is only observation. In my experience, and in the best experience I have heard and observed over thirty-five years of a very colorful life, there are your sexually aggressive girls, and your sexually prudish, marriage-focused girls. Each group is maybe five percent of the total. The other 90% may take long trips into relationships, but on the side or in between, they get a lot of mileage. And when they do, it’s generally not with the chess club.

          • Fake Name says:

            > Curiosity: do you think, for us nerds who are not rich, converting to evangelical Christianity is the way to go?

            @SFG If you’re willing to go that far, I’d suggest spending time working abroad as another way of suddenly, vastly increasing your attractiveness. A ~45-year-old American programmer of my acquaintance has spent most of the last ten years here in Beijing despite being able to make far more money at home. One of the reasons he loves it here is that he can date girls in their 20s, whereas at home he is, as he put it, restricted to divorcées with kids. The increased attractiveness is due to a variety of factors: culture groupies who really, really like America/the West, white/black fever girls who just don’t find Asian men attractive, gold diggers, and green card hunters. Also just people who meet you and find you attractive, but that’s slightly less cynical and partially covered by culture groupies.

            Gold diggers are easily avoided because they either pump you for information about your wealth fairly quickly or only hang out and do things that cost a lot of money. And if you’re not attracting them now you wouldn’t in any first world Asian city. I never had a conversation about emigration with anyone I was dating, but again, in first world Asian cities it’s unlikely to be a huge deal.

            What is a huge deal is being white or black. When I got here I thought it was all about money. Then I met a guy who was 60, a high school teacher at an international school, whose girlfriend was a 30-year-old architect who had a UK Master’s degree, not attained through correspondence. After that I just accepted that white and black guys can punch well above their weight here. The final thing that persuaded me was a friend who went to Japan and said it’s very much the same despite Japan being a first world country.

            Go get a foreign Master’s degree or do your job abroad for less money or teach English. Alternatively mail order marriages have lower rates of divorce than the US average.

            On Tarrou’s view of women, particularly those in their 20s, and PUA more generally: there are many, many lovely women out there, with limited sexual histories if you care. They just don’t spend that much time in bars or clubs. You can get 90% of the value of the PUA package by joining a gym, dressing relatively nicely, talking to lots of women and asking lots of women out. If you advertise you’re looking for a wife you won’t be single long either. Plenty of women want someone like you.

          • Airgap says:

            With all the places on the internet to discuss PUA/Manosphere bullshit, do we really have to do it here? Look, all you folks are welcome to hold whatever opinion on the matter you like, I really don’t give a shit. I just think it tends to lower the tone of SSC.

          • SFG says:


            I was thinking there would be less bullshit here than elsewhere; PUA/Manosphere is notoriously full of self-promoters (quite logical as a large aspect of the technique is, in fact, promoting yourself fanatically!)

            It’s not my blog, but if anything I feel this is good, since I doubt the people here will be quite as unethical as Heartiste and the like in a relationship, and I’d rather see Scott’s fans pass on their genes than the Nazis at Heartiste, wouldn’t you?

          • Pku says:

            They’re not too good at passing on their genes though (except by accident) – even accepting their boldest claims about the number of one night stands they have, they seem extremely bad at finding serious long-term relationships which would lead to kids. So the best way to increase future gene quality would be to promote better contraception methods on heartiste.

          • SFG says:

            Pku: my argument is that PUA discussion is likely to convert a celibate less-wronger into a one-child less-wronger, improving the gene pool in that fashion.

            The Heartiste guys don’t care what people say here, and in any event are too busy complaining about the downfall of the white race anyway. I’m serious–it’s gone from Louis-Ferdinand Celine to Don Black.

          • Svejk says:

            Also, I do not mean to be harsh or dismissive toward ‘ghetto’ women. Even in the meanest projects in the US, you will find churchgoing women who are holding out for a ‘godly’ man. They might eventually have a child or two with the best of their prospects. If the relationship fails, they will enter the ‘single mother’ statistical pool, but they are qualitatively different from the received stereotype.

          • Autonomous Rex says:

            “Give a woman under thirty a choice between the average male poster on this forum and a career criminal, and the split will be greater than 9/1 in favor of the con. ”

            You think you oversold your bias a little bit there?

            I love this blog but it draws (along with a number of super humble individuals able to entertain self-doubt) a lot of super rigid paranoids selling self-evidently precise gut feelings that won’t be walked back an inch.

            If you look back at the blog comments before the gamergate thing, it’s different.

          • Deiseach says:

            seeking out a stable househusband to drain resources from and dominate emotionally

            Tarrou, you make it sound so appealing, I felt a definite twitch of interest – and I don’t even want to get married, nor have ever done so, having declared my intentions to avoid marriage, husband and family at age nine 🙂

            I think part of the situation may be that women are now taking the opportunity to sow their wild oats as men used to do. Before they settle down with the spouse and kids and house in the suburbs, have fun and get it all out of their system. Complaints about “Why is this guy with a new woman every month while I can’t get a steady date?” seem not to include in their reading of the situation that, from the woman’s point of view, the man may indeed only be for the month; the bad boy she has a fling with but would never dream of marrying or even getting into a long-term relationship with. That for the woman, she may for her part be having a string of “a different man every month”.

            As Queen Medbh said, “I was never without one man in the shadow of another” 🙂

            Messengers came from Find mac Rosa Rúaid, the King of Leinster, to sue for me, and from Cairbre Nia Fer mac Rosa, the King of Tara, and they came from Conchobor mac Fachtna, the King of Ulster, and they came from Eochu Bec. But I consented not, for I demanded a strange bride-gift such as no woman before me had asked of a man of the men of Ireland, to wit, a husband without meanness, without jealousy, without fear. If my husband should be mean, it would not be fitting for us to be together, for I am generous in largesse and the bestowal of gifts and it would be a reproach for my husband that I should be better than he in generosity, but it would be no reproach if we were equally generous provided that both of us were generous. If my husband were timorous, neither would it be fitting for us to be together, for single-handed I am victorious in battles and contests and combats, and it would be a reproach to my husband that his wife should be more courageous than he, but it is no reproach if they are equally courageous provided that both are courageous. If the man with whom I should be were jealous, neither would it be fitting, for I was never without one lover quickly succeeding another [lit. without a man in the shadow of another].

          • Troy says:

            Personally, I don’t find Evangelicals anti-intellectual. What they are is ignorant, which is to say they’re just like any other average American. I’m doing what I can on that front, which is why I teach courses in church on Augustine’s Confessions and Early Church History. I see it as helping to restore to them a heritage that was stolen from them.

            On the subject of evangelicals and the Church Fathers, you may find this essay by a Wheaton professor interesting, if you haven’t seen it already:


          • walpolo says:

            There is some weird false-dichotomizing going on in this discussion.

            My sense is that the typical woman who seeks out one-night encounters does it as a way to pass the time between serious relationships, and is pretty serious about her relationships once she finds someone she’d actually like to be with long-term. Pretty much the same as the way a lot of guys approach one-nighters.

            So the fact that a woman fucks a lot of guys doesn’t say as much about her overall outlook as some of you seem to be suggesting.

            I do agree that there’s a common issue among young women of going for assholes. Not ex-cons, but guys who are dickish or arrogant.

      • Tarrou says:

        Well, it works well enough that women will leave a stable, middle class life to shack up with a low-level drug soldier, career violent criminal and accused rapist. Then work as a stripper to support that man’s coke habit, and leave their children in his care while they do it, with predictable if tragic results.

    • Tarrou says:

      You’re on the right track, but think broader. Why would someone do illegal work for less than minimum wage? If all you think matters is money, that’s more risk for less money; it makes no sense.

      Then factor in things like:

      Status! – Here you’ve started, with the girlfriend nod. But it’s bigger than that: you can’t brag on the street about your hardcore McDonalds crew. Your McDonald’s coworkers do not have your back. And waiting on people is embarrassing for a lot of young men. Not something that enhances your status. So, to attract the would-be drug gang foot soldier, McDonald’s needs to pay enough to offset these social handicaps, which is unlikely.

      Effort! – Selling drugs may be extremely high risk at a few moments, but most of the time it’s just sitting around, with your friends, possibly sampling your own wares. You can wear whatever clothes you want, you can start when you want, quit when you want etc. It’s not just that McD’s pays $7.50 and doing this pays $3, it’s that wearing a shitty uniform, showing up on time and slinging grease in a hot, noisy environment may not be worth $4.50 extra an hour. People’s free time is worth utils, if not actual money.

      Self-concept – I think this is a big deal, especially with the underclass. It relates back to status somewhat, but is more internal. Since I live in a very poor town and work in a job mostly with the underclass, I run up against this all the time. Men and women who have a pathological sense of pride that will not allow them to submit to basic authority. One man constantly brags about how “no one tells [him] what to do”. His jobs only last until his boss first criticizes him. I had a female client go on at great (and greatly uncomfortable) length about the list of jobs she would rather “suck dick on a streetcorner” than take. She then propositioned me.

      If you start with a list of jobs you will not take under any circumstances, add in a refusal to accept that bosses will yell at you and customers are shitheads, and finish it up with the fact that no bottom-level job is going to pay people more than their free time is worth to them (counting in status modifications), you get a relatively permanent underclass to which the only obvious answers are terrible.

      1: Continue to subsidize a vast and growing segment of unproductive, hypercriminal people.
      2: Slash benefits and let people starve in the streets. A few decades of that and you can starve the pride and status right out of folks.

      No civilized person could countenance the second and no society can fund the first indefinitely.

      • Pku says:

        Do you know how universal this is? Freakonomics has a story about a low-level drug dealer crowd for whom getting a job as a university janitor was a big opportunity (this is an anecdote from a researcher in the eighties, but do you know if it generalizes? If getting a low-level real job is hard even for the people who genuinely want to, I can see why a lot of people would stop trying, and start taking pride in not trying.)
        (Also, aside from that, this is actually a pretty sound argument for minimum wage: pay too little, and it won’t be worth people’s time to show up over whatever it is those people do).

        • Tarrou says:

          I couldn’t say how universal it is. I can say that in my town, black men, when employed at all, tend to get work as far from customer service as they can. This also goes for underclass whites to a lesser degree. For instance, most restaurants have kitchen staff that is solid majority black male. I’ve only seen one black male waiter, ever in this town, and he was very noticeably gay. It’s my general feeling from living here and knowing these guys that they’d gladly take hard, dirty jobs so long as they don’t feel “disrespected” or embarrassed, so the janitor thing might work.

          All this can be seen as an argument for the minimum wage only if you accept a raft of fairly ridiculous priors, such as:

          1: The government is really, really good at understanding underclass culture, quantifying vague concepts into monetary values and adjusting these quickly when they change.

          2: The minimum wage will not serve as a basic entrance barrier for the slightly less skilled, creating the duality Scott was talking about. Those who can get jobs might be better off, but there will be fewer of them.

          3: Governments are better at understanding what people will work for than businesses.

          4: The profusion of benefits to the poor does not serve as a de facto floor “living wage” that disincentivizes work. (In reality it does: at the margin, once you start making money, you lose benefits. Making slightly more than the benefits isn’t worth it, because you lose more than you gain. You need much more money to make spending your days hard at work worthwhile when you get tens of thousands of dollars for nothing.)
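
          The benefits-cliff arithmetic in point 4 can be sketched with a toy calculation. Every figure below (benefit level, exempt amount, phase-out rule) is invented for illustration and does not correspond to any actual US program:

          ```python
          # Illustrative benefits-cliff arithmetic; all figures hypothetical.
          # Suppose benefits are worth $15,000/year and phase out
          # dollar-for-dollar once earned income exceeds a $5,000 exemption.

          def net_income(earned, benefits=15_000, exempt=5_000):
              """Earned income plus whatever benefits survive the phase-out."""
              phased_out = max(0, earned - exempt)       # benefits clawed back
              remaining = max(0, benefits - phased_out)  # benefits kept
              return earned + remaining

          print(net_income(0))       # 15000: no work at all
          print(net_income(5_000))   # 20000: a little part-time work
          print(net_income(18_000))  # 20000: ~2,000 hours at $9/hr nets
                                     # nothing over the part-time case
          ```

          Under these made-up numbers, a full-time minimum-wage year leaves you no better off than a few hours a week, which is the disincentive the comment describes.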

          • AJD says:

            Your observation about what kind of jobs young black men “tend to get” seems to me more likely to have to do with the kind of jobs employers prefer to put them in than with the kind of jobs they prefer to have.

          • Marc Whipple says:

            Did you miss the whole long part of the message where they explained why it is they think young black men tend to get those kind of jobs? Hint: It involved a lot of expressed and implied preferences.

          • Tarrou says:

            Well, that’s the common reply, but I don’t think it holds any water. The town I live in is very majority-black. Many of the restaurant owners are themselves black, so I doubt you could make a plausible case for evil racism keeping people out of certain jobs. Many of the places are national chains, so they have a pretty strictly open hiring process. I only have deep inside information on two restaurants, but in both the managers tell me black men don’t apply for front of house positions. One specifically said he wished he could have some black waiters because so much of the clientele is black, and when there is a disagreement, they tend to shout “racism”, but he just can’t hire any.

          • Autonomous Rex says:

            Man you got the line on the blacks AND the whores. How do you rate the black whores in your town?

          • Deiseach says:

            Content warning: coarse and vulgar language.

            I don’t know if I’m underclass, Tarrou, but I wouldn’t take a customer service job. Two reasons: one, I’m hopeless at dealing with people and two, I did have a job in retail years back.

            Retail/low level service work – people do treat you like shit (excuse the language). Some people treat you like you are their personal servant. Other people treat you like you’re not even a human. My two favourite anecdotes are:

            (1) The Guy In A Suit (obvious office worker/some kind of ‘professional’ man in his 30s or so) who came in to buy a newspaper, was chatting with his friend and kept his head averted from me all the time, and didn’t even hand the money to me, just threw it down on the counter. Not even the common courtesy to look at me while he was handing the money to me.

            (2) The woman who complained to the manager that I didn’t smile and chat to her while serving her. This was on Friday evening when the shop was going full-blast with everyone doing their after-work weekend shopping. There was a long queue behind her. Me and all the till staff were going as fast as we could to serve people. If I stopped to have a nice little chat with this woman, can you imagine how pleased (not) the people waiting in line would have been at the extra delay when all they wanted was to pay for their groceries and beat the traffic rush to get home?

            That’s not counting the people who expect you to take cheques or coupons or foreign money (honest: I had English customers try to pay me with English money and not understand why I couldn’t just take it because it was good money) and who do not seem to realise that I was the monkey, not the organ-grinder, and had absolutely no power to change store policy.

            You get yelled at, threatened that “I’ll tell your supervisor about this!” and abused simply for doing your job.

            Oh, I was nearly forgetting my absolute favourite anecdote: there was a rash of thefts from shops in the local area involving threats and even the production of weapons (not guns, I hasten to add; at that time, petty Irish criminals had not yet reached that level of American sophistication).

            The manager told me, in all seriousness, that if someone tried to get me to open the till for them I should be prepared to get stabbed rather than do this.

            This was the same manager paying me shitty money at the lowest end of the scale, who paid staff ‘under the table’ for working overtime rather than pay the tax and social insurance on the wages, and who regularly fired staff when they had enough service to qualify for a pay rise up the scale as mandated by the national chain of supermarkets (we’re not talking huge money here by any means).

            I may have gone “Yes sir, no sir, three bags full sir” but you can bet I was thinking “Fuck this, you think I’m going to be maimed or even killed for the sake of your money? When I know you wouldn’t even buy me a bunch of flowers if I ended up in hospital stabbed like you want me to do? Anyone comes in here waving a knife, I’m standing well back and they’re welcome to take what they can get”.


        • Anthony says:

          Universities generally pay everyone well, except adjuncts. Getting a job as a janitor for a university means a reasonably well-paying job for life. It certainly pays better than mopping up the back (or front) of a McDonalds, or being the contract worker cleaning offices in a commercial office building. And unless you get assigned to the dorms, you’re not cleaning up particularly nasty messes – the things that happen in bio or chem labs get cleaned up by grad students or hazmat teams first.

      • Mary says:

        3. Start enforcing laws about unfit parents and lower the bar. Figure out the demographic of disaster (such as parents’ criminal records) and take all the kids as newborns, before their parents (or mother and current boyfriend) can damage them to the point of being unadoptable.

        Given all the false positives this would produce, it is only marginally easier to stomach than #2.

        • Pku says:

          Scott’s posted a bunch of studies showing that parenting doesn’t affect the child’s outcome that much though… It probably would help improve things, but genetics would limit how much.
          Giving everyone IUD/vasectomies, OTOH, might work pretty well (whether or not this is ethical is another story).

          • Airgap says:

            A simpler solution is to make being on semipermanent birth control an eligibility condition for welfare. If the state is paying for the upkeep of your children, you don’t get to have any more until you can pay for the ones you already have. This (arguably) eliminates most of the ethical issues and catches most of the people you filthy eugenicists want to forcibly sterilize, and is a lot easier to sell as a political policy.

          • Pku says:

            I’m not sure it’d be easier to sell – everyone who uses the term “welfare leeches” would support it, but I think it would draw incredibly vehement criticism from almost everyone left of centre and become bogged down in interparty argument, with Republicans who want to signal moderation, or who represent districts that are religious or have a lot of people on welfare, coming out against it; so even a heavily Republican congress probably couldn’t pass it. At least if it’s for everyone you wouldn’t have the social justice people up in arms.

          • Mary says:

            Those who actually go on having children knowing they will lose them instantly should receive psychiatric examinations to see whether they can control themselves. (Since they do pose a risk to others, namely the babies.) Perhaps if you have two kids that have to be taken, the presumption that you can’t should be irrebuttable. We have institutions for them.

            I suspect that most will turn out capable of controlling themselves with that incentive.

          • Protagoras says:

            Those studies continue to confuse me. Studies seem to show that divorce has a lot of long lasting consequences for kids, and childhood abuse has a lot of long lasting consequences for kids, but other studies show that what parents do doesn’t matter? It makes me worry that what’s happening is that the studies that show parenting doesn’t matter are ending up showing that once you control for all the things parents do that matter (whether they’re being controlled for deliberately or via poor experiment design that only compares excessively similar parents), what parents do doesn’t matter.

          • Airgap says:

            Studies seem to show that divorce has a lot of long lasting consequences for kids, and childhood abuse has a lot of long lasting consequences for kids, but other studies show that what parents do doesn’t matter?

            It’s not like you can take parents disinclined to divorce or abuse their children, have a random half of them get divorced and beat their kids regularly, and observe the results. If your parents beat you, you were carrying child-beating genes on day 1.

          • Airgap says:

            Pku: I think you mean it wouldn’t be easy to sell. If you actually aren’t sure that it’d be easier to sell than forcibly sterilizing people at the discretion of the Federal Department of Parental Fitness, I’m inclined to say you aren’t qualified to discuss politics, but maybe you could say why?

            For the religious districts, you hint at a married-couple exemption (most of the worthless human scum you’re trying to remove from the gene pool are unmarried anyway), and in practice throw in lots of fine print like void if you’ve been divorced or had children out of wedlock or whatever, which may well win you more religious voters who’ll fail to notice it means more government-sponsored sin. From the other side of your mouth, you argue that you’re talking about a class of people who are set on a course of sexual immorality to begin with, and it’s just a matter of whether it’s the kind with negative social consequences or not, reference Matthew 5:29 without actually making a theological argument, etc. Remember, you don’t have to fool all the Christians, just enough of them to carry the district.

          • Earthly Knight says:

            “Studies seem to show that divorce has a lot of long lasting consequences for kids, and childhood abuse has a lot of long lasting consequences for kids, but other studies show that what parents do doesn’t matter?”

            The problem is that the “lasting consequences” children of divorce and abuse face tend to be things like poor impulse control, aggression, various psychopathologies, and failed or abusive relationships of their own. This leaves it ambiguous whether the causal pathway is parenting or heredity– it might just as well be that people who are genetically predisposed to be impulsive and aggressive both a) tend to get divorces and abuse their children and b) pass those predispositions on to their offspring at the moment of conception.

            “It makes me worry that what’s happening is that the studies that show parenting doesn’t matter are ending up showing that once you control for all the things parents do that matter”

            The strongest evidence that parenting doesn’t make much difference to adult personality or intelligence comes from adoptive twin studies, which don’t depend on dubious controls. The lesson of these studies is basically this: if you take two identical twins and raise them in different families, they will be eerily similar to one another as adults. On the other hand, if you take two unrelated children and stick them in the same household, they will grow up to be as different from one another as two strangers on the street. This, at least, was the consensus view in psychology fifteen years ago; the pendulum might have swung back the other way a bit recently.

          • Troy says:

            It makes me worry that what’s happening is that the studies that show parenting doesn’t matter are ending up showing that once you control for all the things parents do that matter (whether they’re being controlled for deliberately or via poor experiment design that only compares excessively similar parents), what parents do doesn’t matter.

            I think most of the studies are comparing opposite-sex two-parent households in which the children are not beaten. And often when I hear people summarize these studies the summary is “if you don’t abuse, starve, or leave your children what you do beyond that doesn’t make much difference.”

          • Mary says:

            “It probably would help improve things, but genetics would limit how much.”

            Remember that some of these people are responding to incentives. Theodore Dalrymple wrote of a prisoner, in for assaulting his mother and his girlfriend, who said that he didn’t need prison, he needed anger management classes. Dalrymple responded that he didn’t have any trouble managing his anger in prison. The prisoner retorted that he didn’t want the punishment, thus letting the cat out of the bag.

          • Anthony says:

            Studies seem to show that divorce has a lot of long lasting consequences for kids, and childhood abuse has a lot of long lasting consequences for kids, but other studies show that what parents do doesn’t matter?

            Protagoras – my theory is as follows:

            People have some level of genetic potential. There are lots of bad things which can happen to them which can leave them with worse outcomes than their potential would indicate, but those things have to be pretty big, and are more likely to have a permanent effect if they occur during childhood. Near-starvation for a few months in childhood *will* permanently lower your IQ, for example. Living in an active war zone for a while will affect your personality.

            However, in most of the “first world”, very few children will be exposed to this level of damage. Here in the U.S., the poor are fatter than the rich, indicating an absence of severe malnutrition, and even in the poorer fringes of Europe or North America, there’s very little actual starvation. (Even in South America and China there’s not much *anymore*.)

            However, there are two things which happen to children even in the rich world, which are capable of leaving permanent damage: child abuse and divorce. Figuring out how much of the bad outcomes of those is due to the event rather than the underlying personalities is difficult, because people who are likely (in the current environment) to divorce are different from those who don’t, and therefore their children will have inherited some of those tendencies; child abuse has the same issue.

          • Jaskologist says:

            Addendum to Anthony’s hypothesis:

            Our society is a meritocracy, yes (with merit mostly measured by IQ). But I’ve started to think that we don’t reward merit/IQ so much as we punish its absence.

            Many social institutions were built up to simulate the effects of IQ, binding marriage being one of them. In destroying the ability to legally make a life-long commitment, we eliminated the ability of most of those in the lower class to get the benefits of a life-long marriage. Those benefits will now accrue only to those who are able to maintain the composure not to hit the self-destruct switch at any moment of their entire lives.

            This is, in effect, the removal of safety nets. Now, your ability to succeed is driven far more strongly by innate qualities than before. If you lack those, you once could have fallen back on the other institutions built to catch you. Now you just fall.

            Picture a rock wall. There are some people who can free-climb all the way to the top. Many more could do so with the help of some ropes, but the free-climbers have cut all of those. If they ran a study, they would find that the ability to reach the top really is driven entirely by climbing ability. Tying a rope around your waist is found to have no effect (and nobody realizes that in the old days, ropes had the other end attached to the mountain). So at the top of the cliff, you have a lot of free-climbers strutting about, congratulating themselves on how evolved they are to not use ropes, as their fellow climbers splatter on the canyon floor below.

          • onyomi says:


            This is a very interesting point.

            I wonder if the same doesn’t also apply to morality in the way Voltaire worried about.

          • onyomi says:

            It seems like most studies show that you can screw up your kids in a variety of ways, but you can’t really turn them into super-kids.

            Therefore, as long as you don’t abuse the kids, get divorced, subject them to malnutrition, or fail to vaccinate them, they will probably do about as well as they were ever going to do.

            As for all the enrichment classes, listening to Mozart in the womb, etc.: it probably doesn’t make much of a difference.

            This, as Bryan Caplan argues, may actually be a big relief for today’s parents.

          • Earthly Knight says:

            Could someone provide citations to twin studies supporting the claim that child abuse and divorce have a significant causal effect on adult personality, intelligence, and psychopathology? Most of the studies I can find on google are equivocal, report only a very small effect, or both.

          • Earthly Knight says:

            Poking around a bit, it looks like the best studies in the vicinity focus on child sexual abuse. They consistently find that, where one adult twin reports being sexually abused as a child while the other doesn’t, the first is between 1X-2X as likely to also report some history of psychopathology. See e.g. here, here, and here.

            These seem to be the best studies available, but they are not by any means good: they don’t properly control for the possibilities that a) children genetically predisposed to mental illness are more likely to be targeted for sexual abuse, or more likely to put themselves in situations which might lead to abuse, b) the twin with a history of mental illness is more likely than the twin without to remember the same childhood experience as sexual abuse, and c) most of the excess psychopathology occurred in the short-term aftermath of the abuse, and had cleared itself up by adulthood.

            All in all, I still feel confident enough to conclude that if you molest your children this will probably increase their odds of developing a mental disorder later on in life a bit. So if you want to be a good parent, don’t do that. I would still like to see good evidence that lesser forms of abuse or divorce have a significant impact on adult outcomes once genes are taken out of the picture, though.

        • Derelict says:

          Aren’t those the most likely to complain about their “parental rights” and scream that the “gummint came and took mah baby” when they actually do it?

          • Mary says:


            That’s the problem.

            Of course, if you admitted that as evidence that they could not control themselves. . . .

        • Tarrou says:

          This is a pretty indirect way of getting at the problem, and has no guarantee of having any effect. Once again, the question is how good is government at telling who is a “fit” parent?

          Government is always a blunt instrument. You have to account for that. It’s one thing to propose (for the sake of argument) that all welfare, medicaid, WIC etc. be eliminated. This would be an extremely crude method and would hit millions of people who aren’t generational underclass. It would have a host of very obvious downsides. But it is reasonable to expect that given enough incentive, a government could conceivably implement it.

          Expecting a government, any government, to reliably distinguish between “good” and “bad” parenting is to have more faith in the abilities of institutions than sanity warrants. Just look at the parents getting arrested under our current laws for allowing a child to play in full view of their parents for a half hour or so.

          • Airgap says:

            This is one of the many reasons I propose a solution whereby the government only has to distinguish between “Is on welfare” and “Is not on welfare.” They’ll probably fuck that up too, but not as much.

          • Jaskologist says:

            Also, taking kids from their parents means putting them into the foster system. I’ve known some people who went through that, and from their reports, it did not sound like an improvement.

          • Jaskologist says:

            That’s assuming “the government” wants to distinguish. Disability is exempt from your test, right?

          • Airgap says:

            It depends how the disabled tend to vote.

          • Autonomous Rex says:

            You can assure us 90% of women are whores based on your own experience and past conversations with cellmates

            yet “Government is always a blunt instrument”.

            This binary switch from dogma to doubt, rabidity to reason, silliness to sobriety IS libertarianism.

          • Mary says:

            ” taking kids from their parents means putting them into the foster system”

            No, you have them adopted. That’s why you take ’em as infants, before parental abuse can make ’em unadoptable.

            Well, that’s one reason. The main one. There’s also the aspect that if you take the woman’s living doll before she gets bored with it, you are depriving her of the reason she had the baby. This will incent her to not have a second, since she won’t get the benefit.

          • Tarrou says:

            @autonomous rex,

            I have assured no one of anything of the sort. If you prefer to call normal women “whores”, well I guess I can’t stop you. A bit revealing, actually. Are you claiming that a woman with a few long term relationships who also has a few short flings in between is a “whore”? It’s all rather sexist of you, seems to me.

          • Edward Scizorhands says:

            Tarrou, you might remember that user’s first appearance here:


            This is Scott’s blog, not mine. But if I like a place, and it has someone that I’ve decided is showing up merely to hinder discussion, I run something like this Javascript

            var cites = document.getElementsByTagName("cite"); for (var i = 0; i < cites.length; i++) { var c = cites[i]; if (c.innerText == "Autonomous Rex" && c.getAttribute("class") == "fn") { = "none"; } }

            and move on with my life.

          • Autonomous Rex says:

            …like eavesdropping on Dean Wormer and Lt. Niedermeyer hatching their plan to put Delta House on “double secret probation”.

            Here we have Tarrou, Scissorhands, Mary, Derelict, John Schilling, Onyomi, Airgap, Jaskologist, Suntzuine et al running comment field drills in advance of Scott’s next post:


            It hurts to be criticized or teased, but I hardly think it calls for banning. Especially on a blog where liberal voices are heard less and less.

            And here’s something that separates you and Scott besides a capacity for critical doubt: humor. As educated liberals are cursed with timorous mugwumpery, educated conservatives are by humorlessness. Thus Niedermeyer, your archetype. Frank Burns on MASH works here too. (I didn’t think you’d mind being compared to a TV character since you only deal in cartoons when speaking of “Government”, Liberals, “Leftists”, Women, Blacks, Unions, Feminists, Academics or Scientists et al.)

          • Jaskologist says:

            Tarrou, Scissorhands, Mary, Derelict, John Schilling, Onyomi, Airgap, Suntzuanime et al,

            I am writing to you in order to express my delight at having been accepted into your long-running and illustrious SSC Comment Conspiracy. Regretfully, my man-servant appears to have misplaced my initial acceptance letter; I had to learn of my new position third-hand. He can be quite the dunder-head sometimes, but there’s nothing for it now. I shall make him eat a bowl of spider webs as punishment.

            When shall we meet to discuss our vile plans? I normally oppress the proletariat between 5 and 10 on Mondays and Thursdays, but I can move that around as needed. Put your people in touch with my people and we shall make whatever arrangements seem most fitting.

            Jaskologist, Esq.

          • FacelessCraven says:

            @Autonomous Rex – “Here we have Tarrou, Scissorhands, Mary, Derelict, John Schilling, Onyomi, Airgap, Jaskologist, Suntzuine et al running comment field drills in advance of Scott’s next post:”

            Having viewed the clip, the inference one might draw is that the steel-helmeted horseman is attempting a useful activity, if perhaps doing it badly, while your stand-ins with the golfing kit are in fact just wrecking shit for their own selfish amusement.

          • John Schilling says:

            Indeed, the first words of the “heroes” and the chosen title of the clip make it clear that the ROTC officer’s actions, contemptible in the context of training soldiers to win or at least survive a war, would be considered laudable if performed for the selfish amusement of some fraternity brothers.

            Several of whom turn out to be rapists of one sort or another, IIRC. It is a testament to the genius of John Landis and a most illustrious et al. that I continue to enjoy that movie even understanding that the protagonists are a thoroughly contemptible group of people who achieve almost nothing in the way of redemption.

          • Troy says:

            I am insulted to not have been included in the Conspiracy.

          • Jaskologist says:

            Troy, though some may say our membership roster looks arbitrary and downright random, I prefer to think of it as highly selective, not unlike academia or a drug gang.

          • Airgap says:

            > Here we have Tarrou, Scissorhands, Mary, Derelict, John Schilling, Onyomi, Airgap, Jaskologist, Suntzuine et al running comment field drills drills in advance of Scott’s next post

            @Rex: I was going to give you the benefit of the doubt, as I don’t recognize your name, and why would I trust the others to say you should be ignored? But you’re pushing me in the direction of thinking they are right. It’s not too late, though.

  16. Demosthenes says:

    I’m sorry, but the whole “academia is rough” narrative just screams privileged, white people problems.

    • meyerkev says:

      You work 80 hour weeks for ~$20K/year, and hopefully the ability to do interesting problems.

      I mean, I’d much rather be in grad school than say, construction on a 100 degree day, but it’s probably down below the 50th percentile in terms of awesome jobs. Unless, as above, you really love your interesting subject.

      /Every person in my family: “When will you do grad school?”
      Me: “2 years of grad school is one year of industry. For which I actually get paid.”

      • Oliver Cromwell says:

        I get $42k for 40 hours, which is worse than I’d earn in industry, but not enough worse to be a huge deal, or to not be worth the more interesting project and greater freedom. Of course it depends on your field, and perhaps location.

        For me, it’s the academic late game that is a let-down compared to industry. A really successful professor becomes a bureaucrat like a senior manager in any other field, but doesn’t get the financial compensation of a bureaucrat in the private sector or even in other parts of the government.

        Contra Scott’s post, to my mind an academic ‘winner’ is only a winner relative to being a younger academic. He is a loser compared to other 40-50s senior managers outside academia, with whom he more fairly ought to compare himself. A grad student, on the other hand, can be a winner compared to a 8-6 entry level accountant, without necessarily destroying his private sector end game options in certain fields either (economics, comp sci, to a lesser extent other numerate subjects).

        Humanities grad school without funding, by all means leave on the shelf.

      • Liskantope says:

        For what it’s worth, I’ve known at least one grad student who has also worked construction jobs, and said that a lot of the time he was tempted to choose construction work over an academic job.

    • Pku says:

      Isn’t it one level more privilegey to complain about other people complaining about things? (I mean, I guess that makes this a level 3 privileged complaint, but at least I’m not hypocritical about it).

      • Decius says:

        A meta to the side: What’s wrong with privilege?

        • Zykrom says:

          Controlling for success, the less privilege you have the more virtuous you are.

        • Pku says:

          I was trying to be snarky about saying there’s really nothing wrong with making privilegey complaints in this context. (Also, does this count as a level-four privileged conversation? or are we just on an aside within the same privilege level?)

    • Janne says:

      It isn’t rough in absolute terms. The field could certainly benefit from having less of an “up-or-out” approach to career advancement. But that has to do with the funding structure rather than with tenure itself.

    • ThirteenthLetter says:

      You’re… kidding around, right? There are huge numbers of nonwhite people in academia in the US.

      • Demosthenes says:

        Not really what I meant. I was getting at the supposed go-to dilemma of the “underpaid, winner-take-all, maybe nice lifestyle of academia” vs. the “great money in i-banking, law, or medicine.” It’s a conversation a bunch of New York and New Jersey undergrads at George Washington are having.

    • Airgap says:

      Privileged white people are people too. If you prick them, they bleed. Try it some time. It’s fun!

    • Richard Metzler says:

      Keep in mind that we’re talking about highly talented, highly educated, highly motivated people. It’s understandable that they complain about a system that rewards only a minority of them with a decent job after a decade of low-pay, high-pressure work.

      • David Moss says:

        A decent job after a decade of low pay high pressure work *at best.* I know a fair few people (here in the UK) who got their thesis published as a book, had between 3 and 8 publications (no mean feat in humanities/social science), and still just can’t get a job.

      • Demosthenes says:

        Of course. Just don’t be surprised when I roll my eyes.

        • Peter says:

          Likewise, don’t be surprised when the people you’re rolling your eyes at roll their eyes right back.

        • Peter says:

          In fact, come to think of it – do you have anything constructive to contribute here at all? I was going to make some quip about us having been rolling our eyes at your sort of rhetoric since before you turned up, then I thought I should check to see if you haven’t been here since the beginning, and unless you’ve been using other pseudonyms, it seems that the sum total of your contributions here has been this objection to the problems of the academic career ladder being discussed, and to gripe about Scott’s use of punctuation in a different post.

          I realise I’m in terrible danger of hypocrisy here, but could you explain how exactly your comment “not only contributes something to the discussion but contributes more to the discussion than it’s likely to take away through starting a fight”?

    • Mary says:

      Minding whether something ” just screams privileged, white people problems” — just screams privileged, white people problems.

    • SFG says:

      Yeah, but it’s a scam I see friends falling for, so I feel moved to comment.

      Snowballing fees for unpaid fines, check-cashing outfits, and crappy food, to list a few of the ways the poor get screwed, I am less familiar with.

  17. Michael Watts says:

    Some comments related to particular examples of yours:

    First, a disclaimer. I have no personal experience working in game development. And I never will, because all you ever hear about working in game development is that it’s a bleak hellscape of very low pay, very long hours, and high pressure all the time (“it’s crunch time every day”). Nobody’s keeping the game developers in their terrible jobs, though; they stay voluntarily. The only explanation I’ve seen advanced for this is that game development is a high-glamor career (not among the general public, but among programmers). This looks like a good analogy to grad students to me, and suggests that even if there was no “winner” status, most people on the academic track would still be “losers” working terrible jobs in terrible conditions for terrible pay. There’s no tenure in game development.

    I read a book once, possibly “M.D.: Doctors Talk About Themselves”, which contained the following piece of paraphrased analysis: “Reagan tried to lower health care costs by increasing medical school enrollment [or some other similar method of graduating more doctors per year]. This was a textbook case of an intervention by someone who had studied economics but was unfamiliar with medicine. They thought more doctors meant doctors would work for cheaper. But what they didn’t realize is that more doctors means more procedures. Once the doctors were licensed, they went out and made work for themselves, so medical spending went up instead of down”.

    My own take on that analysis is this: it comes from the conflict between what the people want (lower prices for medical care) and what the government wants (lower overall spending). Right now the government meets the demand for lower medical prices by paying for a lot of medical care. That means that if an intervention causes prices to fall, and spending rises as a result, that intervention has failed from the government’s perspective, since it is now spending more dollars on health care than it used to be. You might think that a family getting care they wouldn’t have gotten before, because the price has dropped to where they can afford it (or heck — because a doctor fearing competition has set up shop nearby where before there was no doctor at all), represents an improvement in society — but not everyone sees it that way.

    So anyway… that’s one reason M.D.s might be artificially scarce.

    • suntzuanime says:

      There might be glamor (in some sense) in academia, but it’s a dualistic glamor. There is no glamor in being a grad student.

      • Michael Watts says:

        That’s an exact parallel to game development. There’s no glamor in being a developer either. It’s all in the industry. This isn’t dualistic, because everyone involved is losing. But it has all the same problems as a system where a few people are winners, just less upside.

      • Saul Degraw says:

        I don’t know about glamorous but I loved sitting around and discussing plays in the afternoon or directing my actors over working in a cubicle.

    • Oliver Cromwell says:

      It’s very lucky for lil’ ole’ society that this doctor and his friends at the cartel were able to set the number *just right* to neither underprovide nor overcharge, something crotchety old Reagan was clearly too stupid to realise. We are a happy nation indeed to be ruled by such people, who of course have no thought whatsoever of themselves.

      • Michael Watts says:

        Well, obviously, they’re both underproviding and overcharging, because that’s the effect of supply restrictions. But I think it’s still worth noticing that because those effects run in opposite directions, we can be spending less money overall on overpriced medicine, since there isn’t as much of it. File it as an example of “think very carefully before you decide what to measure”.

        • Oliver Cromwell says:

          We might be, but I don’t think that’s obvious at all. It’s entirely possible for the price of doctors to drop because of more open access to the field despite the supply of doctors remaining constant. More people could apply to medical school, doctor salaries could drop, and then applications peter out, leaving us at a new equilibrium where doctors are paid more like engineers but where there are no more of them than there were before. Now I have no data to contradict the book’s anecdote, so the book might be right that this isn’t what would happen, but I don’t take the source at face value.

    • Loquat says:

      I’d suggest a slightly different explanation for the crappy status of game development jobs – not so much “glamour” as “doing something one loves”. Lots of smart young people with computer skills really love playing computer games, and therefore believe that making computer games will be much more fun and fulfilling than, say, debugging accounting software. Still a fairly decent analogy to grad students and adjuncts, though.

      • onyomi says:

        It does seem like many “dualistic” professions share the “passion” element. Though that alone only explains perennial oversupply, and not necessarily the strange distribution of rewards.

        • Mary says:

          For instance, acting in movies. You have the superstars, and you have a lot of people doing bit roles.

    • Marc Whipple says:

      This is a scaled up version of the principle that many towns are too small to support one lawyer, but almost any town can support two. 😉

    • brad says:

      This was a textbook case of an intervention by someone who had studied economics but was unfamiliar with medicine. They thought more doctors meant doctors would work for cheaper. But what they didn’t realize is that more doctors means more procedures. Once the doctors were licensed, they went out and made work for themselves, so medical spending went up instead of down.

      A way of putting this that an economist will instantly get is that medicine follows Say’s Law. This also applies in automobile repair. I’d imagine there’d be some way to generalize it to a whole class of industries based on a certain type of information asymmetry.

  18. zz says:

    My mom has been working as a physical therapist at a nonprofit that treats special-needs preschoolers for quite some time (>10 years), and has recently earned her Doctorate of Physical Therapy* with the intention of leaving that job to be an adjunct professor; reasons include “stupid paperwork”, “my knees are giving out”, and “I’d like to do research”.

    And now, I’m reading things like “adjuncts have lost the lottery and refuse to admit it”, and I’m unsure if she should be counseled against moving to a position usually taken by people who’ve lost.


    *As I understand it, one now needs a DPT to practice, but practicing PTs were grandfathered in when this change was made, which is how she practiced physical therapy before earning a DPT.

    • Oliver Cromwell says:

      Weird; a doctorate is a research degree, why would a practitioner need one? Or is it a title-inflation honorary doctorate, like JD or MD?

      • zz says:

        Near as I can tell, title-inflation

      • Nicholas says:

        Legally, you need the degree to qualify for the license.

        • Oliver Cromwell says:

          But is it a real research degree, i.e. is the licensing requirement to perform physical therapy to have carried out original research in physical therapy? If so that seems really bizarre, even by regulatory standards. MD and JD are just courtesy titles.

          • Richard Gadsden says:

            JD and MD aren’t courtesy titles, they’re a different sort of doctorate, that was historically more common than the research doctorate.

            With the rise of research doctorates in the nineteenth century, professional doctorates in fields other than law and medicine have largely disappeared (there are a few D.Mins still being issued in religion). There has been a huge expansion of professional doctorates in medical fields, and DPT is one of them.

            In at least some places, there was (before the eighteenth century) an ancient distinction between doctors, who could practice on the general public, and licentiates, who could teach in the university. Then licentiates started doing research, and the idea of a higher qualification in research came about – which became the Ph.D. back when science was still “natural philosophy”.

          • Airgap says:

            > MD and JD are just courtesy titles.

            It’s true; I never went to med school, but I have an MD because Vivek Murthy is my father.

          • LHN says:

            The history of the JD is weird. Even as US law transitioned from an apprenticeship system (“reading the law” under an attorney, as Lincoln did) to a professional school system, most schools followed Harvard in giving out LL.B. (bachelor) degrees, despite it being postgraduate. There was a push for JDs in the early 20th century, led by the University of Chicago, but it burned out outside Illinois by the 30s.

            Then there was a new push in the 60s, which took– Yale was the last to make the change in 1971. E.g., my dad graduated Cornell in the early 60s with an LL.B. They later sent him a diploma that retroactively turned it into a J.D. (While I don’t have corroboration, he told me that it had in part to do with federal compensation laws, and it being easier to justify a given salary grade with a doctoral degree. But I’d guess that MD-envy and dissatisfaction with a mere bachelor’s as a professional degree in an era of increasing credential inflation also had something to do with it.)

            So now law schools give out JDs, LL.M. degrees (master of laws, often for specializations like tax), and (in some cases) S.J.D. (Doctor of Juridical Science) as a research doctorate. Though the last is much less common among law faculties than Ph.D.s in other academic fields, since you only need a J.D. to teach.

          • Deiseach says:

            I don’t know if this holds for all of the US or if it’s a local custom, but I was very boggled to read a news story or article or some damn thing (forgive me, it was a while back) about some lady who used the appellation “Esquire” after her name.

            At first I thought this was rampant snobbery run riot, but apparently it’s a title that you can use if you’re a qualified lawyer (I don’t know if it’s the equivalent of a barrister)? Like the idea of “letters after your name”.

            Which tickles me that Americans (at least in some parts) have retained the notion of esquires by office while the usage is more or less dead in the British Isles, but with the twist that now, apparently, it is a gender-neutral term 🙂

          • Adam says:

            Yes, attorneys in the U.S. get to put Name, Esq. on their business cards, although I feel like I see ‘Attorney at Law’ more often.

          • Marc Whipple says:

            “Esquire” is a title of custom among American attorneys, but there’s no official meaning to it. (Though presumably a non-licensed practitioner of law who called themselves that would be easier to sanction for it.)

            We all add it to our names for a year or two after we pass the bar, it’s just a thing we do. Most of us grow out of it fairly quickly.

          • Airgap says:

            I typically sign myself “Airgap, Esq., Esq., Esquire, Esq.” I may add additional Esq’s if I’m feeling particularly respectable that day.

          • Marc Whipple says:

            Oh, and by the way, Deiseach, we don’t make a formal distinction between barristers and solicitors here. We’re all lawyers or attorneys (the two terms are legally equivalent in this context, although formally we’re usually referred to as “attorneys”) the same. Law licenses are issued by the Supreme Court of each state – so we’re all authorized, literally, to practice before the highest court of our jurisdiction. (And don’t get me started on how the highest court of New York is the Court of Appeals. The hierarchy there is Supreme Court – Supreme Court Appellate Division – Court of Appeals.)

            That being said, while any licensed attorney can be admitted to the Federal bar, you need a separate qualification to actually appear as lead counsel in a Federal case, and yet another to appear before the United States Supreme Court. People who do that sort of thing tend to do it almost exclusively. That’s as close as we come to the idea of a “barrister.” While I’ve been in court a few times, it’s not my gig – I do transactional and regulatory work. I’m much closer to the idea of a “solicitor.”

            Those Federal qualifications are not called “licenses,” they’re called “admittances.” The Federal government doesn’t issue general Federal law licenses, although one of its agencies does. Namely, the US Patent Office, which offers a separate bar exam. While I’m meandering, it is interesting to note that the qualification to sit for the Patent Bar is not a law license, but a degree in a qualifying science or technology field (or equivalent experience.) A person who passes the Patent Bar but has no state law license is called a Patent Agent, and has a limited license to practice law before the US Patent Office, including its Trial and Appeals Board, and even the Federal courts in patent matters (but not in state court, even if the case relates to a patent, e.g. a contract dispute over a patent license.) A person who has both a registration to practice before the US Patent Office and a state law license is called a Patent Attorney. It used to be somewhat common for engineers, etc, at highly innovative companies to sit for the Patent Bar so they could work on patent applications for their employers. It was worth some prestige and a bump in salary. It’s much less common today.

            And the final bizarre twist is that the Patent Office is a branch of the United States Patent and Trademark Office: even patent lawyers usually just refer to it as the PTO. However, to practice before the Trademark Office, you have to have a state law license – but it can be from any state. Or even from a foreign country which is a signatory to the appropriate treaty regime.

          • Marc Whipple says:

            Airgap: My personal heuristic is that if I see “Esq.” after a person’s name they’re a relatively new attorney, if I see “J.D.” after their name they’re a bit insecure and they’re trying to impress people, and if I see both, they’re probably kind of a tool. If I see “Esq., J.D., B.S., M.S, CIIP, ETC” either their firm requires them to list all that crap or they’re a complete tool.

            It’s not a strong heuristic and it’s more for my amusement than for any practical purpose – I never base my actual response to someone on it – but it’s not bad for a first approximation.

  19. Academia resembles drug gangs in another sense: Selling something that requires increasing dosages to get the same effect.

  20. Second, dualized fields offer an inherent opportunity for oppression. We all know the stories of the adjunct professors shuttling between two or three colleges and barely making it on food stamps despite being very intelligent people who ought to be making it into high-paying industries.

    I doubt very many people go into academia for the money, since there are much easier ways to make money.

    • suntzuanime says:

      People go into academia for a basket of things that hopefully includes at least some money.

    • Nicholas says:

      I was going to say I couldn’t think of any easier ways, then realized I volunteer to teach people things for free. So of course I’m a teacher; I monetized my hobby.

  21. Wrong Species says:

    We don’t have half the minimum wage but there are interns. How bizarre is it that people think working for free is a reasonable thing to do to kickstart your career but will freak out when someone is getting paid 6 dollars an hour?

    • Deiseach says:

      How bizarre is it that people think working for free is a reasonable thing to do to kickstart your career but will freak out when someone is getting paid 6 dollars an hour?

      Because the implicit bargain there is “Get experience, get your foot in the door, and you’ll have the chance to get on the first rung of the career ladder”. The carrot to go with the stick is “This is a lucrative career, once you get a real paying job in this industry/field, there are all kinds of perks”. If the result of “working for nothing” means “get a low paying job after that”, then the game is not worth the candle.

      Working for minimum wage or low hourly pay is not a career, it’s a job (and probably not one with great prospects of promotion, stability, or permanence). Internships are supposed to be work experience – government departments of employment love them for getting unemployed people off the live register and into ‘work’ with the promise that once they’ve gained experience in a relevant field and demonstrated that they’re hardworking, ambitious, and talented that a real job will come out of this; naturally employers are not going to turn down workers who will do the basic stuff that needs little or no training but needs to be done by someone and who will work for free on top of that.

      How many of the interns actually get real paying jobs in the industry is another matter.

      • Cadie says:

        Minimum wage for interns would be better. I’m sure their work can be set up so that they’re productive enough to earn $7.25 an hour. And that would open up those job fields to people who are unable to take an unpaid internship because they need actual money now. Minimum wage isn’t much, but it’s a LOT better than nothing. There are many young people who could survive on minimum wage, renting a room or sharing an apartment, but since they don’t have someone to pay their bills, they have to take a job they get paid for.

        • FJ says:

          Have you ever actually had an intern? I’ve had a couple in 2015 so far. They were good kids and worked hard. They were also so totally clueless (understandably!) that I spent twice as long supervising them as it would have taken for me to do the work myself. Their work product has severe quality problems, and for ethical reasons I have to double-check everything they do. In my field (law), interns have negative value-added to the tune of tens of thousands of dollars a year.

          Which is fine: we use internships as essentially extended interviews-cum-training periods, and a substantial number of our entry-level full-time hires are former interns. It’s worth it to squander my time on shepherding them for a few months. But if we tried to make the interns produce positive value-added, we’d have to give them very different responsibilities. No more writing legal arguments and workshopping them; instead, they would fetch me lunch. Don’t get me wrong, I love that idea. But it would no longer be a very good screening-and-training program.

          • Me Peter You Jane says:

            “Have you ever actually had an intern? I’ve had a couple in 2015 so far. They were good kids and worked hard. They were also so totally clueless (understandably!) that I spent twice as long supervising them as it would have taken for me to do the work myself.”

            Well, yes, I’m sure the McDonald’s manager can make the same argument about his new hires, too. Yet a good proportion of the population would like to force him to pay them $15/hr anyway, while somehow letting you get away with rationalizing why you should be paying them absolutely nothing.

          • John Schilling says:

            What about college students? Should we pay them minimum wage to sit in classes and labs where they will learn skills that will make them economically productive in the future?

          • FJ says:

            @Me Peter You Jane: We actually do give our interns a (below-minimum-wage) stipend, and I’m open to the notion that we should pay them more than that for reasons of justice, efficiency wages, etc. I’m not trying to justify our desire to pay our interns poorly. I was attempting to respond to Cadie’s claim that “I’m sure their work can be set up so that they’re productive enough to earn $7.25 an hour.” Yes, you can maybe set up an intern’s work to produce positive value-added, but she wouldn’t be an intern — she’d be a gofer.

            Also, regardless of whether unpaid internships are prohibited, I hope that the First Amendment will always protect my right to rationalize why they ought to be permitted. Rationalizing terrible things is basically my only hobby.

          • Deiseach says:

            See, you’re doing what internships are supposed to do; take young people interested in the field, but with no relevant experience, and give them on-the-job training (but not in a real job for all the reasons you list) which will (a) let them know what it’s like to work in that job (b) give them a chance to find out if they’re able for it or not (c) give your firm a chance to see if any of them are potential hires when they finish their education and qualify.

            More and more though, it’s switching to “You’re already qualified, you know enough or have worked in similar jobs before to be useful, instead of an entry-level job we’re using unpaid labour and holding out the lure of a real job after this without committing ourselves to offering you a real job”.

          • Marc Whipple says:

            If you are educating your interns and getting no work from them, you don’t have to pay them and you shouldn’t have to pay them.

            OTOH, I had a fantastic intern two years ago who I paid because I wanted to be able to tell her to go and organize my files while I thought of something more interesting for her to do. Did she produce positive value? Eh. But that way I didn’t have to worry about it one way or another.

          • Adam says:

            I’ll posit a prima facie conjecture without evidence that there is a difference in the amount of time it takes practicing before someone can contribute positive value to a law firm doing the things an intern expects to do and the time it takes to contribute positive value to McDonald’s doing basically anything they have a job description for.

  22. Richard Metzler says:

    I’m not sure I agree with the premise of “dualized” vs. “non-dualized” fields; there are plenty of fields that have people working badly paid jobs under miserable conditions in the hopes of hitting the jackpot, but lack an established “congrats, you’ve made it” goalpost. Actors, musicians, athletes, writers… to some degree, even managers working their way up through the corporate hierarchies, although they’re reasonably well-paid along the way. So I see it more as different degrees of more or less fat-tailed success distributions, with some artificial cutoffs built in for some fields. What determines if a field has a wide distribution? Either there must be a deep hierarchy, for the few people on top to skim off the gains of the hard work of the lower layers (drug dealers, corporate management), or there must be extraordinary demand for the very best performers. That clearly applies to writers, musicians etc., but also to some extent to doctors, lawyers etc. Maybe the dualization by way of formal qualifications becomes relevant for these fields because, while you really want a doctor or lawyer who knows their shit, it’s hard to evaluate their performance before you hire them (and then it’s too late), whereas with artists it’s more WYSIWYG?
    ETA: Of course, some fields have rewards that go beyond monetary aspects – fame, glamour, doing what you love – and those may attract a broad base of applicants competing for the very few available well-paid jobs… it would be interesting to compare the income distributions in “glamorous” occupations versus “boring” ones, but it’s probably tricky to take into account all other relevant aspects.

  23. suntzuanime says:

    I think it’s a bit unfair to characterize employers not wanting to pay large amounts of money for things as “resentment”. If I only get 2GB of data on my cellphone plan before I have to pay extra overage fees, I’m not being “resentful” in trying to stay under the cap, I’m just being frugal.

    • anon says:

      Don’t worry, I’ve enough resentment about cell phone billing structure for both of us.

    • Autonomous Rex says:

      Great moments in libertarianism.

      Beneath some long threads about the whores and the blacks
      one little libertarian piped up, taking umbrage at the implication that employers “resent” the minimum wage.

      This is definitely a gut feeling-based objection.
      What does the evidence say?

      What does one hundred years, hundreds of millions of dollars invested in innumerable trade associations, think tanks, front groups, for lobbying, for public education, issue advocacy, etc. say?

      Besides “we are deeply and inauthentically concerned that someone will be laid off if the minimum wage is raised. Forget for a moment how ridiculously opposite that is from our real concern…

      • FacelessCraven says:

        I went to that site. It listed a whole bunch of quotes of people saying minimum wage laws were a terrible idea. Searching the rest of the site indicates that they think the quotes are wrong, but a (very) cursory search didn’t find any explanation as to why, beyond saying that the minimum wage is an excellent policy that helps the lower end of the economic spectrum immensely.

        The rest of your statement seems to be that the minimum wage is obviously a fantastic idea, and the only people who disagree are stupid or evil. It doesn’t seem obvious to me, for a couple reasons, starting with the self-checkout machines I use to buy my groceries. Maybe you could elaborate?

      • Brian Donohue says:

        To some people, it is inconceivable that any opposition to a minimum wage can arise from motives that are not mean.

        I think they hold a picture in their head of some guy working two minimum wage jobs to support a family or something. But that’s mostly a myth. Only 4% of minimum wage earners are older than 24. If you think for one minute about the relationship between skills and compensation, this is completely unsurprising.

        To others, this is the most elementary Econ 101. They see a minimum wage as cutting off the bottom rungs of a ladder that doesn’t exist for the cerebral crowd here, the creation of an artificial “two-tier” system with the unattractive properties of dualized systems that Scott describes.

        More generally, many European countries with strict labor laws have experienced this dualization across their economy for a generation, and they have seen huge and intractable youth unemployment as a result. When governments make companies do their social policy bidding through employment rules, this sort of baggage always comes along for the ride.

      • Adam says:

        I don’t doubt that anyone forced to pay more for something than they think it is worth resents that. What I doubt is that the marginal impact of employers not hiring people because they resent the wage they command is greater than the marginal impact of employers not hiring people because they can’t afford it.

        If we’re going to talk about McDonald’s in particular, they only own 18% of their stores. They make their money off licensing and probably don’t give a shit what wages their franchisees pay their employees. The franchise fee isn’t going to change. The franchisees themselves probably own a few stores at most on average and are sure as shit going to employ every last person they can afford to employ rather than shut down, whether or not they resent it.

  24. Navin Kumar says:

    Econ is probably the exception to the dualization of academia. Everyone with an econ PhD gets a job. If you’re a good economist, you can work at a research institute. If you’re a medium economist, you can work at a teaching university or for the government. If you’re a crappy economist, you can always write for the New York Times.

    • Who wouldn't want to be anonymous says:

      Aaaaw snap! I love the stones involved in calling a Nobel* prize winner crappy.

      * Caveats may apply.

      • suntzuanime says:

        There’s kind of a running joke in reality about how all the Nobel Prize winners are racists or sexists or war criminals or whatever. The Nobel Prize apparently selects for the worst dregs of humanity.

        • Autonomous Rex says:

          …and you just protested it was unfair to characterize the mood of employers as “resentful” towards wage controls.

          My other favorite posts:
          the guy who said he didn’t find evangelicals to be “anti-intellectual”. the guy who said it would be cool to be a serf. the one who chimed in that it’s simple to make your way up the ladder from serf to lord if you set your mind to it….

          • FacelessCraven says:

            @Autonomous Rex – “…and you just protested it was unfair to characterize the mood of employers as “resentful” towards wage controls.”
            Why $15 an hour? why not $50?

            “the guy who said he didn’t find evangelicals to be “anti-intellectual”.”

            How many Evangelicals do you personally know?

            “the guy who said it would be cool to be a serf. the one who chimed in that it’s simple to make your way up the ladder from serf to lord if you set your mind to it….”

            Having lately read through those comments, it seems kinda like you’re hearing what you want to hear, not what people are actually saying. Judging from your other posts, you seem gobsmacked to be encountering views out of line with those you typically encounter. If you think people are wrong, disagree. Around here, pointing and laughing just makes one look vicious and dumb.

    • Ben says:

      It’s not just economics; it’s also true of most engineering fields.

      The key question is: how valuable is a PhD in field X outside of academia? In engineering, the answer is pretty valuable, so folks who don’t want to deal with academia can go work in industry research, still get to do interesting work (although potentially less interesting), and make good use of their degree.

      The problem is, in many academic fields, including the hard sciences, there isn’t high demand outside of academia creating the second tier glut Scott is talking about.

  25. Harald K says:

    “If there were no minimum wage, we would expect a sort-of-continuous wage distribution from 0.01$ an hour all the way up to whatever Taylor Swift makes”

    No, we would not. This is silly. There comes a wage below which working isn’t worth it, because you’ll starve anyway (or would anyway have to rely on whatever non-wage-employment option you have to not starve). There are natural thresholds in life. We really don’t want to push people over them if we can help it.

    This isn’t just a theoretical concern either, I live in a country with no _legal_ minimum wage, and of course there are no $0.01 jobs.

    If you want peace and stability, avoid pushing people over those thresholds where desperate high-risk strategies become sensible.

    • Oliver Cromwell says:

      I lived in a country with no minimum wage in the 1990s. When a minimum wage was introduced, one of the outrages used to justify it was that severely disabled people were being paid about 30c/hour. Not 1c, but not far off. Of course, they were *also* receiving welfare for being disabled, and the change introduced by the minimum wage was that they lost their earned income and their self-actualizing job, and kept just the welfare. No one was made better off, but I suspect many, perhaps most minimum wage supporters will regard this as a good outcome.

      • AlphaGamma says:

        The US certainly has an exception to the minimum wage for severely disabled workers if the business can prove that they can’t be as productive as someone who isn’t disabled (this involves them attempting to do the job under monitored conditions).

        The UK, by contrast, has (or used to have) government-owned factories where severely disabled people could work and be paid the minimum wage due to their wages being subsidised.

        • Oliver Cromwell says:

          I’m sorry my comment segued into a critique of the minimum wage, but that wasn’t my main intention. My point was just that people who can only earn a few cents per hour in a true market economy do exist, and that it is not necessarily a bad idea for them to work. Of course when one moves very far from the mean productivity the people involved will stop being seen as “normal” (in both directions), but this is not really a salient distinction. If someone earns poorly because they’re just really bad at concentrating or have very low intelligence without any organic brain damage the same would apply.

        • Adam says:

          The U.S. has that, too. It’s called Lighthouse for the Blind, and all GSA clients are required to purchase office supplies from them, even though they can be purchased cheaper through non-subsidized producers that don’t employ blind people.

      • Loquat says:

        See also: this flap about Goodwill (a major US charity-thrift store) doing basically the same thing. Per AlphaGamma’s comment, we’ve got a law that says it’s legal, and the workers subject to it typically are receiving some sort of welfare/disability as their primary income. Goodwill claimed most of its disabled workers earned near or above minimum wage, and that the individuals earning less than $1.00 per hour were rare outliers with incredibly low productivity (guy spends 15 minutes working and then freaks out and won’t work for the rest of his scheduled shift, that sort of thing). Naturally, lots of people were outraged when the story broke, but hardly any of the outraged people had an actual answer for the question of exactly what should be done with a person who’d like a job but just isn’t capable of doing even half as much work as an average non-disabled person. “Pay them the same, you heartless bastard!” isn’t really a good answer when the employer has a budget and performance targets to meet.

      • Autonomous Rex says:

        ” No one was made better off, but I suspect many, perhaps most minimum wage supporters will regard this as a good outcome.”

        “…people who can only earn a few cents in a true market economy per hour do exist and that it is not necessarily a bad idea for them to do so”

        You don’t even believe what you’re insinuating. This kind of rhetoric implies that the speaker who favors the most brutal domestic policy is the one who really cares, while those who seem to care actually set out to ensnare the poor.
        This is a weirdly popular libertarian conceit, most paradoxically presented by those who could not care less about the poor in any other instance you can think of. Only when it is proposed that something be done for the benefit of the weak do they speak up with righteous fury ON BEHALF of the poor, in full knowledge that it is against the policy preferences of the poor (who conveniently do not post on libertarian blogs to disagree).
        You will never see a libertarian actually go to a soup kitchen or homeless shelter or anywhere else to actually communicate this message to the poor.
        If they believed it they would try, but they’re just playing games.
        This is so easily done that lately I’ve noticed the coldest-hearted conservatives and libertarians have been gleefully testing it out. After all, who’s going to take the time to argue you don’t really care and are just interested in labor, unorganized and as cheap as can be?
        Thinking, “can I actually get away with this bullshit? posturing like I actually care? claiming that nine people getting a raise and one getting laid off is a tragedy when all ten, who live lives beyond my imagining, who emphatically do not want the right to work for less, whose real representatives are fighting for the precise opposite of what I want? apparently I can…”

        • FacelessCraven says:

          @Autonomous Rex – I grew up dirt poor, and remained so until relatively recently. His position makes more sense to me than yours does. As someone with significant life experience with poverty, please stop using the misery of deprivation as a substitute for actual argumentation.

    • Scott Alexander says:

      I have friends who do video captioning jobs online that make them like 50 cents to $1 per hour. It’s not very good if you want to support yourself, but a lot of people have support from family or benefits or something and want to feel like they’re productive.

    • Urstoff says:

      If that were true, no one would take those Mechanical Turk jobs.

      • disciplinaryarbitrage says:

        From my understanding, having used Mechanical Turk for survey research in school, reasonably efficient MTurk users can make around or above minimum wage just from being picky about the tasks they take. The ones that only pay pennies typically can be completed very quickly, whereas longer tasks (like filling out surveys) demand substantially higher payments.

    • Tracy W says:

      When I was at high school in NZ there was no minimum wage for youth workers like us (I think the age cut-off was something like 18 years) but no one I knew amongst my friends had jobs paying less than about $4.50 an hour. Oddly the high-end shops and supermarkets paid less than the lower-end.

  26. Emile says:

    A stab at an explanation:

    That kind of dual distribution is what you get in a winner-takes-all situation, and we can divide that up into two rough patterns:

    * When the nature of the field is inherently competitive: drug markets, startups

    * When “quality” is not very easy to measure directly (at least by those making the important decisions), and therefore people settle on an indirect sign of quality; and the most memorable sign will be binary – this would be the case for tenure, law firms, etc. who rely on convincing uninformed “outsiders” (clients, potential students, grant-givers) of how good they are. The one who has the best reputations gets more clients and so gets a better reputation, etc.

    … or something like that. I think “information” is a key factor, in that a “smoother”, more gradual system would also be harder to understand, and so might not work as well (or at least, having it work well would require sorting out the information issues).

  27. vV_Vv says:

    It seems that information asymmetry may be a factor.

    If you are a tech employer in need of a computer programmer, you can probably accurately evaluate the skills of prospective employees. But if you are a patient in need of a doctor, or a student in need of a professor, or somebody in need of an attorney, you can’t evaluate the skills of the prospective professionals by yourself. You have to rely on the expert opinion of other professionals in these fields, and this intrinsically creates a licensing system, whether it is enforced by the government or not.

    To some extent this probably also applies to the illicit drug market. Whether you are a customer or a high-level producer/distributor, you only want to trade with “reputable” dealers, because if you trade with unknown people they may sell you crap or rip you off (and then you have no legal recourse) or they may be cops and arrest you. A gang neck tattoo may be the criminal underworld equivalent of a doctor license hung on your office wall.

    • FJ says:

      The drug world employs a variety of strategies to get around information asymmetry. My favorite is branding: a pack of heroin bought on the street might be stamped with the Batman logo, Mickey Mouse, or some other symbol. Different brands have different reputations, and sellers advertise which brands they offer. As the examples attest, drug dealers have little regard for IP law, but they enforce trademark infringement among themselves.

  28. Z.Frank says:

    “Originally I was going to make a simplistic comment about licensing and regulation, but this doesn’t exactly capture it.”

    I like the play on words. Thumbs up.

  29. David Moss says:

    Your description of academia-without-tenure sounds pretty much just like UK academia. You still see hordes of PhD students competing for a few very low paid jobs with a view to competing for even fewer (relatively but not really) “cushy” jobs. You don’t see everyone getting a very low paid job or that “colleges would find a lot of room for them to do one-on-one tutoring, or low-level research, or something like that”, instead you see people fighting tooth and nail over a small number of very low or unpaid positions (I’ve seen a few universities invite applications for unpaid positions), because the market is so glutted and competition is so intense.

    This makes me think a lot of the description of academia and drug dealing is simply accounted for by the glutted market, with many more people competing for a small number of positions. That gives you a lot of people in really terrible conditions fighting for a small number of relatively much better positions.

  30. Deiseach says:

    others can’t make it to that MD and have no relevant whatsoever in the industry

    I’m just imagining every nurse, radiographer, med lab tech, phlebotomist, occupational therapist, etc. reading that and leaning back going “Reallllly? No relevance in the field – sorry, industry? Oh well, I suppose I’d better shift my irrelevant ass out of here and let you relevant doctors give patients their meds, take the x-rays, write up the individualised diet and nutrition plan for the dialysis patients and scrub the bedpans”.


    • Scott Alexander says:

      Nurses, etc, are not people who tried for an MD and failed to get it.

      • Tom Womack says:

        Aren’t they? Obviously some people who fail to become an MD end up saying that the health-industry grapes were sour anyway, but I’d have thought half a training course in being a doctor was a really good start for any other health-industry career; they can’t all end up working for health insurance companies.

        • Devilbunny says:

          Well, in the American system, very few people fail out of medical school, and most of them do so during their first semester. (And that was at a low-prestige state school that would take a chance on iffy students if they thought they would stay and practice in underserved areas.)

          Those who do so are welcome to apply to nursing school, but other than already having taken the prerequisites they have essentially no advantage over any other applicant with similar grades – possibly a disadvantage.

          • Adam says:

            I think this is kind of by design in American pre-med programs. I wasn’t even a pre-med student, just straight bio, but since more than half the students in my first few classes were pre-med, the classes largely consisted of memorizing oodles of minutiae about every single subcellular system in the animal kingdom, and the professors all but told us it was to weed out the pre-meds who would never make it through medical school. I almost felt like Bio 100 was harder than senior-level courses, but of course some of that is the fact that it’s also new when you’re first starting.

          • Devilbunny says:

            Adam, relative difficulty is probably true. I took freshman biology as a senior, having nearly completed a BS in chemistry (decided to go to med school right before senior year), along with one other chemistry senior (wanting to be a high school teacher, she chose the harder path and double majored in chemistry and education rather than just minoring in chem – pretty sure she’s great at the job). We were very experienced at how to learn scientific subjects, extremely skilled at writing lab reports, and despite our habit of wandering in somewhat disheveled and partly hung over, had little difficulty in the class. If you survive physical chemistry, what’s Bio 101?

            Also, if your high school chemistry teacher is now around 40 years old and has the initials T.R., ask her what she and her lab partners did before the last physical chemistry class of the year. 😉

        • Graeme Sutton says:

          In the Canadian system at least, it is vanishingly rare for anyone to fail out of an MD program. Those who drop out do it during pre-med, but these are much more likely to go into law or academia (in my experience at least) than into a different health care career.

      • Deiseach says:

        Having been at the tender mercies of (for instance) a hospital doctor who tried three times to take my blood pressure and still couldn’t do it, or doctors who call nurses to draw blood, take blood pressure, etc. for them because “the nurses are better at it”, I think maybe medical schools should fail a lot more 🙂

        • Tracy W says:

          I think practice counts for a lot. I’ve had a lot of blood tests and it’s normally a matter of ages of the nurse fiddling about trying to find a vein and me getting big bruises. Then I attended a big London hospital where they have a whole room of nurses who just do blood tests. Your number is called, you go in, sit down, paperwork, strap, in the needle goes, out comes the blood, band-aid, goodbye. And in my case no bruising ever.
          Apparently doing the same thing every day makes you really really good at it.

        • keranih says:

          Physical hands-on skill vs knowledge base and rational reasoning. It’s not the same thing.

          • Deiseach says:

            Physical hands-on skill vs knowledge base and rational reasoning. It’s not the same thing.

            “As for living, our servants will do that for us”, eh? Yes, physical practice isn’t the same as intensive knowledge base, but on the other hand knowing every vessel in the circulatory system and every molecule of every reaction, but not being able to take a pulse when push comes to shove, isn’t much good either.

            “I’ll be able to swoop into action with my huge brain and vast store of knowledge once someone finds out for me if the patient’s heart is beating too slowly or too fast, because treatment will differ depending on which it is, so I have to wait here until a nurse can spare five minutes to fix on the automatic blood pressure cuff for me”.

            Probably I am just an ignorant peasant, but I didn’t have much faith in what the doctor was telling me about my condition after he couldn’t even manage to take my blood pressure 🙂

  31. Murphy says:

    I’m lucky enough to have gone into the side door of academia. I’m better paid than the postdocs though without the academic career path towards professorships etc.

    A lot of academic track people get conned with promises of “it’ll look good on your CV”. It’s become a running joke in the office with the postdocs because it’s a sign that someone is trying to get you to do extra work for no money.

    The best answer of course being

    You know what looks best on your CV? A big pile of cash.
    The best way to be perceived as valuable is proof that others have already paid you well for being valuable.
    Anyone who tries to convince you differently is trying to con you into working for free.

    • Oliver Cromwell says:

      Do you write all your past salaries on your CV alongside the job title and description?

      Now I think of it, this is actually a very effective way to communicate pre-selection information, and much harder to bullshit than just about anything else that goes on a CV.

      • Creutzer says:

        Might be effective in industry, where comparisons are meaningful, but in academia, it would convey extremely little information because unless perhaps you’re hired as a full professor, you have very little influence over your salary. You can’t negotiate, you don’t have many options, none of them are financially great, and within what little variation there is, you may be balancing salary against living in a certain place because your girlfriend lives there or because they have an awesome department in your field. Academics also move around a lot between areas with vastly different costs of living. So the information in salary figures is negligible.

      • Murphy says:

        Lots of employers and recruiters want to know your last salary.

        Often you can get away with lying but many people are honest.

        It’s one of the reasons that negotiating first salary has such an impact later in your career.

        If you’re earning piles of money already it’s easier to negotiate from strength for your next salary.

        Creutzer is incorrect: there’s plenty of academia where salaries can be negotiated. (The idea that there’s no room for negotiation is handy as a negotiating tactic for pushing down salaries though.)

        Even in the civil service starting on a higher spine point or grade has a huge effect and can have a huge impact on lifetime earnings.

        So I’ll add: people who tell you there’s no negotiating possible are also out to screw you.

        • Creutzer says:

          You’re right, some professors do get to negotiate their salary. I thought I had a caveat to that effect in my post, but apparently I didn’t. But especially in Europe, non-professor-salaries are usually set by the agency that hands out the grant on whose money they are employed. I don’t know what it’s like in America, but I’ve never heard of, say, a post-doc being able to negotiate for anything. Even professors’ salaries are, as far as I know, standardised to a significant extent in many countries.

          • Murphy says:

            They’re only non-negotiable if you don’t negotiate. A family member working in the civil service related some stories of colleagues who had applied for posts yet somehow believed that they weren’t allowed to negotiate. They were too stuck on the idea that somehow someone else was handing down immutable law to them. It doesn’t matter if you’re a professor, a doctor, a nurse or an admin. If they actually want you and there isn’t a line of 100 people equally qualified looking for the job, you can negotiate.

            Normally the ad gives a salary range or a band not a fixed number. If it says 33-40K they’ll try offering you 33 but you can negotiate it up. If it says band 6 you can negotiate to start on a spine point half way up band 6.

          • Creutzer says:

            I’ll keep my eyes open for post-doc advertisements with a salary range, because now I genuinely can’t remember whether or not I have ever seen such a thing, and I’m curious. I might have overlooked or misinterpreted them. For professors, I’m sure you’re right that you can negotiate where in the range you start.

            What I know is that for non-professorial positions, in many cases the professors who make the hiring decisions have no discretion over the rates because they are set by the university or grant agency, so they themselves would have to go to whoever is in charge of the money and ask “can we pay this guy/gal more than the standard rate?”. It’s like that at both my current lab and my previous one.

          • AJD says:

            This job posting with a salary range isn’t a post-doc per se, but it’s a one-year limited-term teaching contract, which fulfills a similar ecological niche in the academic job market.

      • Douglas Knight says:

        Salaries are not helpful on CVs or resumes. What is helpful on both is dollar figures brought in for the employer. The job of a professor is to bring in grant money. Many do put such figures on their CVs.

  32. Whatever Happened to Anonymous says:

    >I’d worry they’re exaggerating the importance of this factor compared to wanting to maintain street cred and McDonalds jobs being much more regimented both in the application process and performance

    Doesn’t this just make the analogy more apt?

  33. Captainbooshi says:

    Or how about benefits? If there were no benefits, we’d expect a more continuous spectrum of people working 40 hour weeks, 30 hour weeks, 20 hour weeks, and so on. Instead, we guarantee everyone who works X hours the privilege of good health care. Employers resent this and try to limit access to the privilege by hiring people to work X – 1 hours per week, or hiring independent contractors, or so on. This creates a dualized system with an upper tier (real employees) and a lower tier (people working 29.999 hours a week or whatever who don’t quite qualify for the benefits).

    It seems like this should be pretty easy to test by looking at all the countries with different public benefits systems. Do the countries with public healthcare have a more continuous spectrum than those that don’t? Does anyone here already know the answer?
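The mechanism described in the first paragraph can be sketched numerically. This is a toy simulation, not data: the 30-hour threshold, the per-hour cost of benefits, and the hours distribution are all made up for illustration.

```python
import random

random.seed(0)

HOURS_CAP = 30        # hypothetical benefits threshold, hours/week
BENEFIT_COST = 8.0    # assumed employer cost of benefits, in $/hour terms

def offered_hours(desired):
    """If the benefits cost outweighs the value of the extra hours, the
    employer caps the schedule just under the threshold ("29.999 hours")."""
    if HOURS_CAP <= desired < HOURS_CAP + BENEFIT_COST:
        return HOURS_CAP - 0.001
    return desired

desired = [random.uniform(10, 50) for _ in range(10_000)]
offered = [offered_hours(h) for h in desired]

# A smooth input distribution comes out with a gap above the threshold
# and a spike piled up just below it: two tiers instead of a spectrum.
spike = sum(1 for h in offered if HOURS_CAP - 1 < h < HOURS_CAP)
gap = sum(1 for h in offered if HOURS_CAP <= h < HOURS_CAP + BENEFIT_COST)
print(spike, gap)
```

The cross-country comparison suggested below would test whether real hours distributions show this spike-and-gap shape where benefits are tied to employment.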

  34. Anonymous says:

    >they’ve got no option under than continuing

    I think this is a typo, you mean “other than continuing”, right?

  35. Jos says:

    Academic grad students get a free education, right? As opposed to med, law and business students, who pay a fortune for an education that uses similar resources.

    A few weekends ago I was chatting with an M.D. who mentioned that her friend got an M.D./Ph.D., and saw the major advantage of that as getting an M.D. tuition free, and “all she had to do was teach some classes.”. (And do a dissertation, but I think if you get the MD and leave the PhD ABD, you’ve still got an MD.)

    • Professor Frink says:

      And spend several extra years in school. Doctors can make 200k a year out of residency, more if they are a specialist.

      So if you do an MD/PhD to cover your school costs, you are spending an extra 4 years in school (since an MD/PhD takes about 8 years on average), trading off against about $800k in income. You are doing this to avoid maybe $150k in loans for medical school. Even appropriately discounted, this is likely a bad trade.
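The “appropriately discounted” step can be made concrete with a quick present-value sketch. The $200k salary, $150k of loans, four-year delay, and 5% discount rate are all rough assumptions from the comment above, not real data:

```python
def present_value(amount, years_out, rate=0.05):
    """Discount a future cash flow back to today at the given annual rate."""
    return amount / (1 + rate) ** years_out

# Foregone attending salary: four $200k years the straight-MD path would
# have earned, arriving roughly in years 4 through 7 from now.
foregone_salary = sum(present_value(200_000, yr) for yr in range(4, 8))

# Avoided loans: roughly $150k that would otherwise come due around year 4.
avoided_loans = present_value(150_000, 4)

print(round(foregone_salary), round(avoided_loans))
```

Even after discounting, the foregone income comes out to several times the avoided debt, consistent with the comment’s conclusion that it is likely a bad trade.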

  36. Brian says:

    I’m not certain that the law firm structure really fits your thesis (former big firm lawyer, here).

    Yes, you are correct in the way things work out for law students. If you don’t land a big firm job, or have someone else pay for law school, you’re essentially fucked – law school is prohibitively expensive and there are no windfalls for lawyers (there’s no equivalent to a start-up, for example). Consequently it can be difficult to simply survive for non-big firm attorneys fresh out of law school. It’s a well-known problem; people write articles about it. It’s why I tell anyone who is debating whether to go to law school not to do it unless they get into a top five school.

    But you seem to be arguing that this structure arose because of a purely internal question of competition. The problem I’m having is that you’ve discounted the influence of the client on the structure of large law firms. You write:

    It seems possible that maybe top law firms act as a de facto licensing system – picking out a couple of excellent young law school grads as Officially Excellent, and then if you’re a sufficiently big corporation you refuse to use any except those? But once again, I don’t know why law would develop this structure and other professions wouldn’t.

    First, legal representation is vastly different than other services. There is a LOT more at stake. And the promise of large law to corporations has been this: we have the best and the brightest attorneys on earth, who will work ungodly hours for you for their entire working lives, using our infinite and up-to-date resources, and we have hundreds of these attorneys, should you need them. There is literally nothing too large for our firm to take on, and we will NEVER make a mistake.

    That was the pitch, anyway. And when the economy was going strong, it was a tempting one to corporate interests – hell, it was a sign that the corporation had made it, if they had a top-shelf law firm partner on speed dial. I imagine it gave them a sense of security – if you’re in charge of hiring a law firm in a moment of crisis for your company (or for a vital merger, or whatever), and you’ve picked one of the best, you can rest easy. You won’t be blamed if something goes south.

    Moreover, nobody really WANTS cheap legal representation, any more than they want a bargain basement surgeon. This is important stuff, right? So, when things were going well, corporations typically hired the “best” (read: most expensive) firm they could. Less cynically – as a lawyer who has practiced in large and small firms – there is a LOT that large firms can do, but small firms can’t.

    But that takes us to my second point: this isn’t actually a self-sustaining model, which I think you’re implying. It was a blip in the history of legal representation. The system has completely fallen apart, and legal salaries are starting to more closely resemble your chefs and engineers. After the recession hit, large corporate interests began taking a serious look at what they were spending their money on, and began to realize that they didn’t need a giant, expensive law firm every single time they needed a lawyer (or realizing that they may not need as many lawyers as often as they thought). Smaller firms became more attractive for most things. And when they did hire the large law firms, they began to ask uncomfortable questions: why am I paying $250 an hour per attorney for ten 25 year old kids to sit in a basement and review documents 12 hours a day for 9 months? Can’t that be done as well by contract workers, or a computer, for a fraction of the price?

    Large law firms had massive layoffs or closed down outright, and they haven’t started hiring people back. I think we’re going to increasingly see a flatlining of salaries for new attorneys, and smaller, more reasonably-priced firms taking a larger share of what remains of the market.

    • Ed says:

      As a current Biglaw lawyer (industry term; I personally prefer Scalia’s term “tall-building lawyer”), I will add that the ‘prestige’ phenomenon you described still exists: it has just shrunk. Profits at the very top firms (Wachtell, Davis Polk, etc.) have never been better because there’s a small group of clients for whom money is still no real object; it’s the pretenders (firms like Dewey) that have struggled to keep up with the Joneses as more companies are taking a closer look at their legal bills.

      You are generally correct that this salary distribution comes about because of reputation: more than most other fields, hiring a law firm is mostly about reputation (you can’t really point to your previous work if most of it is confidential), and so you literally cannot afford any implication that your firm is unable to hire the very best law students.

      However, the bimodal salary distribution has already become antiquated: it used to be that the top firms would all match each others’ bonuses as well. The very top firms have begun shifting more and more of their pay into bonuses, bonuses which lower firms have not been matching. So there is some smoothing-out of the salary curve taking place already, not to mention the fact that non-NYC markets were never on the $160K scale anyway.

    • Frayed Knot says:

      I thought I’d chime in on this as a new lawyer (well, almost, anyway—I just took the bar exam earlier today).

      A few thoughts:

      1) Looking at starting salaries for lawyers might exaggerate how dual law is. Many firms start at 160k but then don’t raise associate salary on the same lockstep scale. Others match base salary, but not bonuses (as Ed mentioned). Even those that keep to the lockstep salary scale for associates are likely to have very different partner compensation—at least judging by published profits per partner. Plus, a lot of the “middle tier” jobs in terms of salary are jobs that don’t hire straight out of law school (federal government, in house, etc.). I don’t think good data on lifetime earnings exists, but I bet it would be much more of a bell curve—that is, less bimodal—if it did.

      2) Even for starting salaries, the bimodal curve is a recent development. As recently as 1991, the starting salary curve was a more traditional bell curve. Before we work too hard to explain why lawyers fall into this bimodal/dual pattern, it might be that we’re looking at a temporary blip.

      3) Law is a bit different from academia in that most people aren’t working hard to get to the next level. Sure, there may be some lawyers out there who have plans to “lateral up” to a higher paying firm. For the most part, however, lawyers who are on the left-hand part of that graph missed the boat for making it to the right hand side. So you don’t have the same dual dynamic that you see in academia (or drug gangs, I guess) where people are working hard to make it to some promised land.

  37. Shenpen says:

    Law is very dual, and so is politics, and politicians tend to be lawyers. Sort of partially explains it. Their basic instinct and experience is to set up everything that way.

  38. TomA says:

    Instead of dualism as a descriptor of these unique labor categories, I would suggest the term “artificially skewed labor distributions.” Their common denominator is government-induced labor controls that inhibit the market from functioning naturally. In some professions (such as medicine), this intrusion at least appears to have a public-interest intent (i.e. minimizing incompetent practitioners who may cause acute harm to patients). In other fields (such as education), the intrusion is more likely about politics and control over indoctrination practices.

  39. Lumifer says:

    It seems to me you’re conflating two different phenomena in your post.

    One is the situation where a threshold/bottleneck/gate separates two groups of people. The move from one group (“low”) to the other (“high”) is a discrete, significant step, not a smooth continuous curve. Typically the high group is the gatekeeper and limits access for its own benefit. This is similar to medieval craftsmen’s guilds. Becoming an MD is in this category, for example.

    Note that there is not much risk involved. If you have a sufficiently high IQ (say, LW level), are not obviously broken, and want to become a doctor, the chances are high that you will succeed. On the other hand, if your IQ is around 70, your chances of becoming a doctor are minuscule and there is little uncertainty about that.

    The other situation is winner-take-all, I usually call them “lotteries”. Start-ups are such a situation. More importantly, professional sports and entertainment are, too. There are a lot of amateur singers or actors who do drudge, underpaid work in the hopes of making it big. Very few do — and the uncertainty about the outcome is very high.

    Law schools, though, are an interesting combo. Being admitted to the bar is similar to getting an MD — if you have enough desire, IQ, and money, the gate is not difficult to get through; it’s not a matter of chance. But getting a job at a top-tier (say, NYC white-shoe) law firm is much more of a winner-take-all case.

    • onyomi says:

      I think this is a good point: the lottery is a different model from the bottleneck. Actors, athletes, singers, artists, racecar drivers mostly know that they are entering an occupation where almost everyone makes a modest income at best, but with a possibility of making it big. It is not nearly as hard to get tenure or an MD as it is to play in the NBA, but the very top doctors and professors also don’t get the huge jump in compensation that comes with making it into the NBA.

  40. Urstoff says:

    I have a fantasy that some day professors will be rewarded (monetarily, prestige, or otherwise) for teaching, instead of or in addition to research. Unfortunately, that would require the signalling model of education to be false, when it’s probably true. Right now, professors are assumed to be experts in two skills that don’t always correlate, and there’s very little oversight for teaching or pressure to get better at it, in clear contrast to research. Perhaps the good teachers at universities could fill in that middle income gap; they’re much better at teaching than adjuncts or research professors, but they don’t confer the prestige or grant money on universities like research professors. But again, this assumes that students and universities care about learning, which doesn’t seem to be the case.

  41. Phil says:

    I actually don’t think academia is a very good example of this phenomenon

    I can think of several better examples, which somewhat shift the larger conclusions one should draw from this phenomenon.


    Hollywood is probably the most obvious example that springs to mind: a small number of stars make astronomically large amounts of money, while people trying to break into the industry wait tables.

    An interesting tidbit: it’s not entirely clear how well talent is correlated with success.

    Depending on how much credence you give to various theories of the ‘casting couch’, it appears that oppression/corruption might be fairly rampant.

    I don’t see any particularly established licensing or credentialing aspect to this.

    There are limited means of production – it used to be the case that you needed a studio or production company to get any sort of traction in the industry – though with the low cost of technology that might be changing.


    Other industries like that which come to mind: political campaign professionals, publishing (if The Devil Wears Prada has any basis in fact – I have no knowledge of that industry outside of having seen that movie), sports coaching.


    As a parent, I have a hard time advising my child (my kids are 4 and 2, so it’s still hypothetical at this point) towards one of these dual professions; it seems like a recipe for being exploited.

    what I think might be a useful compromise is to maintain the means of your own production

    i.e. it’s not terrible to try to be an actress in a giant movie, but understand that if the casting agent says you’ve got to sleep with them in order to make that happen, you can tell them no, because there are lots of paths to success. Sometimes shooting low-budget iPhone movies and making them popular on YouTube is the better, albeit humbler, way to go.

  42. John Hayes says:

    Looking at these different career paths I don’t think there’s a general theory of dualizing because there are more specific dynamics.

    Law – there is high variability of outcomes; there are prestigious and practical firms, and many lawyers go on to become corporate counsel or move into politics. In that case, sorting is pretty much done at entrance to law school, basically along intelligence lines. A partner’s goal is to hire the best associates so they don’t have to spend time doing the work themselves and their customers are satisfied. Associates work hard because they’re paid to work hard and it’s the only way to get exposure to broad areas of the law within a single lifetime.

    Academics are actually somewhat similar: successful academics aren’t the ones who get tenure, they’re the rainmakers who consistently get funding for their projects and can fund a larger number of grad students. In both cases there’s a positive feedback loop: better funding means better grad students (even if they are only paid incrementally more, facilities are greatly improved), and better grad students mean you can have more of them, because they can work independently while you focus on rainmaking. Tenure is just another permanent source of resources that enables longer-term projects.

    The problem they both suffer from is that lawyers and academics are monocultures; they don’t know how to use people outside of their field. A law firm is lawyers until you hit someone who answers the phone or vacuums after hours. Academia is the same, and that greatly limits their organizational skills outside of a few “big physics” projects, which are actually run by private corporations.

    Private corporations are very diverse in their skills and processes and can form much larger organizations much more efficiently. For example, a corporation would organize academia to separate teaching and tutoring, then break down research into fundraising, technicians, writers and editors, publicists, recruiting, accounting, and inter-project coordination (business development). Most people are only going to be good at a couple of things in their life, and academics spending time on a dozen jobs they’re mediocre to bad at is really inefficient.

    As a comparison, venture capital raises $20-30 billion a year – that’s spent on R&D across a lot of industries, but a pretty big chunk is actually spent on marketing, and then some on all other expenses. So that’s probably closer to $7-10 billion on R&D for all new companies combined. The average startup engineer is paid 5x the average grad student, so that’s a pretty small population doing a big chunk of the tech innovation in the world.

    You can pull from articles on the end of academic funding that the federal government’s budget for R&D (not including state, loan guarantees, endowments, charities or other private sources) is $136 billion, according to the AAAS. A big chunk of that is probably DOE/DOD, but there are individual agencies with larger R&D budgets than the entire VC sector.

    Drug dealers are actually hyper-corporate, but “regulations” have forced them to be inefficient. It is necessary to keep customer service, money control, inventory control, and lookouts staffed at all hours. Most of these people are just standing around. Compare to a retail store: when there aren’t many customers, a retail store will cut back staff, because security cameras and inventory tagging do most of the work. In the drug industry you cannot employ these devices, because there can’t be any records of either the customers or the inner workings that could be used as evidence.

    This happens at every level, so it’s a painfully inefficient distribution system, and yet street prices for drugs pretty consistently drop and product quality goes up. Drug dealers have the same motive for quality: they want to be reliable for their customers, especially as most of their customers don’t really plan ahead. So imagine you have to run an advanced just-in-time delivery system with very low inventory levels, high transaction costs and no advance ordering. Imagine running Walmart but only being able to use carrier pigeons for communication.

    Back to the employees: this is not a high-skill job – everyone has to do one thing, but most of the time they do zero. Area managers will overstaff just in case, but most of their employees probably couldn’t get a job at McDonalds even if they wanted to; they’re simply not skilled enough. Corporations are very good at finding creative uses for low-skill people at the right price (that was the entire Chinese economy 8 years ago). A law firm or a research lab doesn’t know how to deal with someone who isn’t already a lawyer or an academic. Minimum wages mean that instead of investing in turning a $3/hour worker into a $10/hour worker, they just have to hope for a supply of $10/hour workers.

    Back to the medical profession: there’s a little of the lawyer experience, with a lot of useful breadth. The twist is that government is choking demand, so teaching hospitals bid down the price of residents. The profession’s practice of commitment testing is part of it; the rest is plain incentives, supply and demand. If I were to think of a solution to increase the supply of doctors and improve the salaries and working conditions of residents, it would be to remove all government funding.

    Right now the government pays about 2/3rds of the cost of a doctor’s education – a lot of money, but it’s foolish to so deliberately direct money towards a future upper-middle class. The worse effect is that Congress now controls the supply of doctors by choosing how much it’s going to match. As much as I think lawyers have problems, no one ever says there aren’t enough lawyers, and they proliferate just fine without government funding. So getting doctors to act more like lawyers and invest in associates (residents) would be a huge improvement for doctors, residents and patients. Getting both of them to act like corporations would be even better, but baby steps …

  43. Zykrom says:

    OT: Does it make sense to call the writing style where the writer goes over the thought processes that led to their insight so you can sort feel like you’re coming up with it yourself “hardcore insight porn?”

  44. jjbees says:

    Medicine is non-dual when you look at compensation by specialty.

    An orthopedic surgeon or cosmetic dermatologist can make multiples of what a family medicine or peds doc makes, and getting into those residencies is pretty tough: 8 years of education and hard work for specialties where only 60-ish% get in, and the other 40% get to waste a year or scramble into a lower-paying specialty.

  45. Anon says:

    since they’ve got no option under than continuing to work in the criminal underworld

    Other than?

    Also, startups may be dualized, but the difference here is that there’s no arbitrary authority like academic administrators or drug lords determining whether you make it big. The closest thing to that in the startup business is venture capitalists (VC’s), and their decisions are less arbitrary than most (provided they’re high-profile and respected). Even then, a startup doesn’t require VC funding most of the time, and VC funding doesn’t determine your success: the market does.

    • James Picone says:

      Does that matter? If it’s a bad outcome, it’s a bad outcome, regardless of whether it’s market-driven or regulation-driven.

      • Murphy says:

        Many people would prefer to gamble on an otherwise fair coin toss, where they can get on the right side by competing fairly with others in the same position, over a gamble where whoever wins does so because their daddy made some phone calls – even if technically 50% of people win in either case.

        • Saint_Fiasco says:

          People underestimate the risks they believe they can control, like traffic accidents.

          It’s like, failing at a startup and dying in a car crash happens a lot. To other people.

  46. Jacobian says:

    Before I send a link to this post to my grad student girlfriend, I’d like to mention that there are downsides to working in a non-dual field as well. I work adjacent to the general field of “back office finance”: compliance, corporate finance, internal accounting, reporting, etc. It’s a field that’s very easy to get into, and even if you fail, your BA in finance or business administration or accounting will get you an OK job with a very high probability. Once you’re in, you’ll mostly keep floating around between huge financial institutions for 45 years if you like it, with a range of salaries somewhere between $50k-$200k, determined mostly by number of years in the industry. There are no financial regulatory reporting superstars, and few corpses littering the side of the road. I think most public sector office jobs fall in the same category.

    So what’s the result? An industry of people who are conscientious and competent but (at least on average) unmotivated, uncreative, and very risk averse (at least compared to drug dealers or grad students!). Hard workers are rewarded with more tedious work, innovators aren’t rewarded at all. The parking lot is a row of Camrys, and so is the building. The safety is nice, but it doesn’t necessarily make working in this type of industry great. And some people are thinking how much money they’ll need to save to afford to risk a PhD 🙂

    • SFG says:

      Sounds nice–I’m smart but very risk-averse. How do you get in? What’s the maximum age you can enter at?

  47. Guy says:

    This is interesting because of how well it maps onto some other issues. For example, minimum wage creates a dualized system between workers and the unemployed. If there were no minimum wage, we would expect a sort-of-continuous wage distribution from $0.01 an hour all the way up to whatever Taylor Swift makes for an hour’s performance. Instead, we guarantee everyone the privilege of $15 per hour. Employers resent this and (in theory) try to limit access to the privilege by shrinking the workforce, automating, etc., as much as possible. This creates a dualized system with an upper tier (employees with high wages) and a lower tier (the unemployed with nothing at all).

    I’d expect this to have been dualized already, between those who make enough money to live and those who do not. Ideally, a minimum wage is just a way of saying “full time jobs that do not provide enough money to live on are illegal.”

    • John Schilling says:

      Except that this rapidly transforms to “full-time jobs that do not provide enough money for a family of four to live on are illegal”, and “live on” becomes “at least lower middle class and no food stamps”, and “full-time” becomes “not more than 40 hours/week”.

      Which seems likely to open a huge unbridged chasm between the lower middle class and the permanently unemployable. And since jobs that can support a lower-middle-class family of four on 40 hours/week tend to require more skills and experience than are likely to be acquired in high school, winding up on the right side of that chasm means finding someone to subsidize the extra years of education that one can no longer pay for by their own labor.

      We aren’t there yet. I don’t think $15/hr quite gets us there. But it is I think dangerously close. There are better ways to deal with the problem of impoverished families than demanding that absolutely every job be sufficient to support a middle-class family.

      • Murphy says:

        Which is why you also need a decent education system in the country which can get people up to the standard needed to allow them to compete in the labor market without leaving them as eternal debt-slaves.

        If you only do it half-arsed and set a minimum wage but keep only a partial education system, then you’re stuck with the unbridgeable chasm.

        • John Schilling says:

          A decent education system will almost certainly include some amount of on-the-job training, albeit possibly hidden somewhere. Cross-thread, we have the discussion of internships where the interns have zero or slightly negative real productivity but it’s still worth a company’s while to invest some training time in them.

          It is not realistic to expect that most people will jump directly from sitting at a desk in a classroom, to generating $20/hr in actual value. If, during the years when they can realistically produce only $10/hr, we can only pay them either $0/hr or $20/hr, that’s not going to work. Or, maybe it is going to work but the actual mechanism will be hidden and we won’t see the problems.

          • Murphy says:

            No, I think you’re using the paint by numbers simplified model which doesn’t really correspond to reality but sounds reasonable.

            Typical employees, even those with experience, typically have negative value when first hired. Even an experienced employee doesn’t break even for about 5 months. [source: Peopleware: Productive Projects and Teams]

            Companies absorb these costs because unless they’re *really really* incompetently managed with a high staff turnover they still make a profit after that initial period.

            A fresh graduate has a longer breakeven period even at a lower graduate salary.

            It used to be normal for companies to hire highschool grads and to then invest the equivalent of years of training to get them up to the point where they could work as accountants or managers. It’s only in recent years that the culture has changed and companies have moved towards investing less and less in employees and hence investing less and less in employee retention.

            It’s perfectly possible, it’s just become unfashionable.

            It can be hard for a company actually willing to invest in training its employees when all the other companies are burning the commons and defecting in the prisoner’s dilemma to get a tiny edge, but if regulatory agencies don’t allow their competitors to defect then they can still do fine on a level playing field.

          • houseboatonstyx says:

            @ Murphy


            I hear about a similar pattern in many sectors. It’s like Atlas Shrugged, but with the … 1%? … gutting their own businesses from the inside, then disappearing with the capital.

    • suntzuanime says:

      In that case, forget about a $15 minimum wage, even the one we have right now is way too high. (If you consider the homeless to be dead I guess there are some metro areas where you couldn’t cut it much, but that’s an issue of housing policy more than anything)

  48. Brian says:

    Some comments on law…

    First, it’s dualized twice; once at the law firm hiring point, and a second time at the partner stage. So in the long term, you usually get a triple distribution as follows:

    Top tier: Hired by BigLaw, made partner. Makes at least $400K in a solid position for life.

    Middle tier: Hired by BigLaw, didn’t make partner, took moderately lucrative job (low 6 figures) as in-house counsel for a major company, law professor, or counsel at a major law firm.

    Bottom tier: Everyone else–solo practitioners, contract lawyers, government lawyers, etc.

    In law, however, the biggest cause of the dualism is a combination of competition and finding clients who will pay. The big firms are the only ones with the reputations to charge mind-boggling hourly legal fees, and they get the big corporate clients and rich individuals who are the only ones willing to pay those legal fees. If you’re good enough to get clients to hire you specifically for those prices, you make partner. If you’re good enough to do the work but don’t have the qualities needed to get those clients to seek you out personally, you work as an associate, counsel, or in-house. And if you’re not that good, you charge lower fees to the rest of the world who will have trouble paying even that.

    So the answer in law to “why isn’t anyone paying $100K to almost as good lawyers” is because corporate clients will pay $300/hr to associates to get the top partners to work for them, individual clients won’t pay more than $50/hr to junior lawyers, and the clients don’t want to pay anything in between.

  49. Benoit Essiambre says:

    What? Computer science is known to be very dual, often relying on terms like “10x programmer” or “rock star programmer”. There are even (albeit debatable) studies about the two-hump camel distribution in CS class performance.

    My experience is that about half of CS graduates are pretty useless in their field of study, though they can sometimes get jobs in things like tech support or server maintenance. But this is more like becoming a nurse instead of a doctor. For some reason, MDs can’t fall back to nursing with their diploma.

    • Smoke says:

      Computer science is known to be very dual, often relying on terms like “10x” programmer or “rock star programmer”.

      There’s no question that there’s a good degree of innate programming ability, but I suspect the 10x thing is a myth. In any case, we don’t seem to see 10x job desirability/compensation differentials for the most part.

      (I actually suspect the 10x thing has promoted the false idea in the programming world that programming ability is very innate and if you aren’t just good at it, or you aren’t passionate about it, there’s nothing you can do and you just suck. I found that by thinking moderately hard about how to improve as a programmer I was able to improve substantially. It was also very helpful for me to concentrate on the act of programming itself instead of worrying about how I was performing relative to others. A few years ago I adopted the attitude that I’m very smart and I shouldn’t be intimidated by big words because I can learn almost anything if I put my mind to it, and that attitude has been really helpful for me.)

      There’s even (albeit debatable) studies about the two hump camel distribution in CS classes performance.


      • Airgap says:

        I suspect the 10x thing is a myth.

        Where do you work and are they hiring?

        I found that by thinking moderately hard about how to improve as a programmer I was able to improve substantially.

        Would you believe me if I told you that there are people who do this and fail to improve substantially? Do you see where this is going?

        It was also very helpful for me to concentrate on the act of programming itself instead of worrying about how I was performing relative to others.

        This is another common trait among competent engineers.

        A few years ago I adopted the attitude that I’m very smart and I shouldn’t be intimidated by big words because I can learn almost anything if I put my mind to it, and that attitude has been really helpful for me

        Oh for christ’s sake…

        Ladies and Gentlemen, I give you the autosolipsist! He firmly believes that he does not exist! Truly the crown jewel of our intellectual carny freakshow!

        • Smoke says:

          There are innate differences in programming ability, and I’m lucky that I’m good at it, but I think the differences are exaggerated. If it’s possible to improve through focused self-improvement effort, that is evidence for a trainable aspect.

    • Airgap says:

      1x Devs still get jobs, unfortunately, which means you get to spend lots of time cleaning up after them. Programming could probably stand a lot more dualism because unlike medicine, our mistakes don’t just die.

  50. M says:

    Heh, definitely experienced that duality as an incentive not to try exploring other parts of the academic world. In the parts I’m more involved with, there are still a lot of people at the “bottom” who will never be professors, but at least money and working conditions are reasonable (this is around mathematics-like stuff). So when thinking in the past that it might be interesting to switch paths at some point to more natural-sciences-like stuff, the seemingly unavoidable labor rights abuse made for a not very appetizing prospect. Wonder what the parallel to this is in the drug industry…

    Re: Italian academics, after being around similar situations in southern Europe more often than I’d like, I think the “conspiracy” theory is a bit exaggerated, maybe fuelled by the author being very frustrated by all the corruption around. I’d judge it more likely that the people who do a lot of politicking do very little research because there is only so much time available, but if they somehow got a time-turner and had an OK research output, that wouldn’t be a net negative.

    Re: health, it’s strange to see stuff being talked about as “benefit” after growing up and living in contexts where it’s seen as a “right”. The peoples of the world, I guess.

    • Eggo says:

      How can you have a “right” to something that your body and fate will inevitably take away from you?

      • Pku says:

        That’s a fully general argument against the concept of “rights” in the first place (with which I kind of agree in principle), but if you accept the concept of natural rights it seems fairly standard.

      • Zykrom says:

        The word ‘health’ is being used as a stand-in for ‘(some forms of) healthcare.’

      • Graeme Sutton says:

        He misstated it slightly: there is no right to health, but there is a right to healthcare. What exactly “healthcare” entails is a whole other shitstorm.

  51. Saul Degraw says:

    I seem to come to these threads relatively late.

    1. You are thinking about tech from the background of someone who just wants to code but I still think the economics of start-ups don’t make sense as compared to other industries. There are still a lot of tech companies out there that manage to attract a lot of investment despite operating at losses for years. I can’t think of any other business or industry that is given this much leeway with its creditors/investors. Wikipedia says GoDaddy is still operating with a net income of negative 143 million dollars.

    2. The Arts are much worse than academics in terms of people who don’t make it and are willing to suffer to make it. I am not necessarily talking about fame and money. There are a lot of working artists who make very little money but are happy because they are working in the arts. A lot of working artists have the luxury of coming from money and/or marrying into money. There is an Off-Broadway director named Trip Cullman. He is the heir to a fortune. His loft was featured in the NY Times this spring. He is a talented director but I can guarantee that his theatre work does not pay for his loft.

    3. I considered going into academics briefly. I decided not to because I saw the writing on the wall way back in the mid-aughts. The reasons I was thinking about going into academics was because I like school and it seems nice. The concern was not really money but lifestyle. My fantasy of being a professor was teaching at a college in a charming Northeastern or Northwestern college town or city in a building with great architecture and lots of windows (something like my SLAC alma mater or the University of Washington). Doesn’t it sound better to be teaching a seminar on Shakespeare on a Wednesday afternoon in a sun-filled room of a converted colonial over being stuck in a windowless cubicle in an office park? It does to me at least. I just knew this was a fantasy. Professors don’t get paid well but they get a great lifestyle if it works out.

  52. Saul Degraw says:

    I can speak to law because I went to law school instead of getting a PhD.

    There are about a million types of law and lawyers. The alleged brass ring for law students is getting an offer for a partner-track position at a big firm that does corporate litigation and transactional work. These are brass ring positions because they pay a lot out of the gate (around $160,000) and seem prestigious based on clientele.

    I never wanted to be a corporate lawyer. I always wanted to work for individuals. Most law students do not get the brass ring jobs. One of the reasons for the law school scandal is that schools, especially mid- and lower-tier schools, cooked their books to make starting salaries look higher and implied that lots of students get the Big Law jobs. Big Law actually represents only a small fraction of lawyers in the United States.

    Fewer people make partner at these firms. Those that do can theoretically make millions a year. The really wealthy lawyers are people who founded their own successful personal injury practices. These cases can be lucrative, but they are gambles because they work on a contingency fee: you might spend years and a lot of money on a case and get nothing, or you can make a killing. Associates at plaintiffs’ firms have smaller starting salaries but can theoretically go on to start their own firms and bring in the money. Other lawyers get successful by finding a niche. There is a lawyer in the Bay Area who styles herself as the tree lawyer. There is no such thing as tree law; tree law is just property law. However, she taught herself everything there is to know about trees, wrote a book about it, and just focuses on doing stuff with trees, like defending your right to keep a tree or helping you get rid of a tree.* I don’t know how much she makes, but I am sure she is pretty comfortable.

    *You can pay to cut down a tree or part of a tree on someone else’s property under certain circumstances like if it poses a danger to your house and safety. My parents had a neighbor with a diseased tree and that tree would have come crashing down on their kitchen.

    • Deiseach says:

      There is a lawyer in the Bay Area who styles herself as the tree lawyer. There is no such thing as tree law. Tree law is just property law.

      Based on what I’ve seen in the job so far, you could make a tidy little living (not huge riches, but enough to live on) from that kind of specialisation in property law.

      Neighbours who lived happily alongside one another for years – and then – rows over hedges, trees, someone painted an adjoining wall the ‘wrong’ colour, your dog is/kids are always trespassing in my garden – then it’s no talking to each other, complaints to the council, threats of going to the police, then going to law 🙂

    • Adam says:

      The best-off attorney I ever knew wasn’t practicing at all; he was retired at 32 because of a gigantic class-action medical case he won, and he continually posted obnoxious photos of private helicopter tours he kept taking of tropical islands on the USC rivals forum. I mean, good for him, I don’t resent the guy and live a decent enough life myself, but that is just a pure crapshoot. There’s no way you can possibly know in advance that a case like that is going to fall into your lap.

  53. Marc Whipple says:

    I can tell you that your musings are largely correct: law is now (it always has been but the polarity has gotten much sharper) a field of winners and also-rans.

    The thing is, though, that while I understand medicine is hyperspecialized, and superspecialists make the big bucks, GPs can make a reasonable living, especially if they’re willing to live in an underserved community (not a poor one, just a smaller city that has trouble attracting good doctors). Law doesn’t work like that. Either you work for a big firm that does big stuff, or you work for a placement company or a podunk (or a vanity) firm that scrabbles for cases. There is essentially no in-between.

    And yes, big companies use big firms, for multiple reasons including some not-so-obvious ones. Smaller companies don’t use smaller firms: smaller companies rarely use lawyers to any significant degree. So again there’s not that middle tier customer to use middle tier lawyers.

    • Saul Degraw says:

      That’s not quite true. Most firms that handle stuff like family law, immigration, personal injury, employment, and tons of other stuff are not that big. There are still lots of solo practitioners out there who make good livings. I admit they tend to be older.

  54. Troy says:

    The numerous academics who have chimed in to say that “you know, academia is actually pretty nice,” coupled with the (mostly) non-academics or ex-academics writing articles about how terrible academia is, reminds me of the Fiddler on the Roof quote: “They’re so happy, they don’t know how miserable they are.”

    • Pku says:

      That quote is fantastic. I’m going to start looking for opportunities to use it now.

    • Urstoff says:

      It’s a good gig if you can get it.

    • Murphy says:

      I recently moved from industry to academia and I’m finding it much less oppressive and I still got a decent salary boost from the move.

      I could probably get a few extra K if I went into the finance industry instead but my girlfriend has commented on how much happier I’ve been since the move.

      It partly comes down to what makes you happy. I enjoy learning new things a great deal.

      A culture where stepping out of the office a couple times a week to go attend an interesting lecture isn’t just allowed but is approved of and encouraged works for me. In industry expensive training is jealously guarded. In academia, partly because the college is already optimized for training it’s cheaper and is given out like candy.

      On the other hand my earning curve if I stay in academia isn’t as sharp as it would be in industry.
      If cash makes you happier finance might be for you.

      But then I entered academia through the side door; I’m on a higher band than many of the postdocs with PhDs 7 or 8 years older than me who are still fighting their way up the greasy pole.

  55. Saul Degraw says:

    I think being a chef is highly dualized. As I understand it, chefs don’t make money until they rise to executive positions and/or open successful restaurants of their own. Otherwise the hours are long and the pay is very low, because restaurants, including very popular, high-demand, and expensive ones, operate on tight margins. The hours are still long if you have your own restaurant.

    A friend of mine worked as a pastry cook at a Michelin starred restaurant (I don’t know if it was starred when she worked there but it was high-demand). She worked really long hours and was a kind of second in command at times. She still made peanuts. Pastry cooks are also the first to go when restaurants need to scale back because it is easy to outsource. She told me once that to really make money, she would essentially need to open her own bakery.

  56. Graeme Sutton says:

    “minimum wage creates a dualized system between workers and the unemployed. If there were no minimum wage, we would expect a sort-of-continuous wage distribution from 0.01$ an hour all the way up to whatever Taylor Swift makes for an hour’s performance. Instead, we guarantee everyone the privilege of $15 per hour. Employers resent this and (in theory) try to limit access to the privilege by lowering workforce, automating, etc, as much as possible.”

    My experience in the modern minimum-wage work force is that in practice, the distribution comes in the form of how many hours a worker can get.

    Also, a factor you seem to be overlooking in the dualization of medicine is that the public expectation of standard of care kind of rules out allowing a sliding scale of doctor competence relative to wages. People expect Doctors to have a bare minimum of competence and the professional organizations that license doctors see a compelling interest in maintaining public trust in the medical profession, hence the sharp cut-off.

  57. Albatross says:

    I’m not worried about almost-doctors, almost-lawyers, or almost-academics. As long as they have their undergrad degrees, they aren’t apex predators, but they’re only just short.

    However, dual fields do have lots of hidden costs that damage the field itself and consumers. If drugs were legal, consumers would get obvious benefits, and there would be better jobs for low-ranking dealers. France has more doctors than the US, and more population density, and so it has both a higher quantity of great doctor jobs and better care for consumers. Eventually adjuncts will just unionize and colleges will be forced to pay market rate. Or maybe the best and brightest go to work on Wall Street instead. Dual fields handicap themselves.

    I do look forward to $15 fast food workers though. Fordism rules and I bet service will be great. Might need GNI for the unemployed though…

    • Urstoff says:

      Service will at least increase in quality in the short term (until the mass automation) as the lowest skill workers are displaced by more skilled workers whose labor is worth $15 (compare and contrast: Wal-mart employees and Costco employees).

  58. Deiseach says:

    I don’t understand the snark about $15 per hour fast-food workers. Whatever your position on the minimum wage, at least I presume everyone understands that people on low wages pay the same prices as everyone else? That big cities (and maybe I’m mistaken, but isn’t the $15 only a New York wage?) are more expensive? There isn’t “We charge so much per gallon for petrol for people on $30 per hour, but if you’re only making $9 per hour we have a discount price”. Utility bills cost the same. Rent is the same, which is why you have the limited choice of where you can afford to live.

    I think the attitude seems to be that only teenagers working after-school jobs work in fast food places, and they’re living at home being supported by their parents, so they don’t need that kind of “real” wage. But even if you’re scoffing at the kind of loser who is too dumb and uneducated to get a proper job and so is reduced to working as a burger-flipper, they still need to be able to make a living. Or else they will turn to crime, even if being a low-level drug runner doesn’t pay well.

    What’s the solution? Leave jobs like fast food workers as low paid, don’t impose the minimum wage, have workers there who are only temporary and being supported by someone else’s income? Those who are capable will get a real job that pays a living wage? And what of those who aren’t capable – what happens to them?

    While we’re all fantasising about a universal guaranteed basic income, what happens to the people who can’t get better jobs in the meantime until that happens? A mix of welfare and crime to live on? Have we no better answer? I’m serious, someone tell me what can be done, because I don’t know.

    • Edward Scizorhands says:

      There isn’t “We charge so much per gallon for petrol for people on $30 per hour, but if you’re only making $9 per hour we have a discount price”

      Stop giving them ideas.

    • FJ says:

      “What’s the solution?” is always a good response to someone who objects to a particular proposal. That said, I think the big objection to a $15/hr minimum wage is that wages generally cannot exceed the worker’s marginal productivity. If you assume that a fast-food worker’s marginal productivity is below $15/hr (I’m agnostic on this, but it’s not obviously crazy), then you might as well set the minimum wage at $1,000,000/hr. Yes, it would be fantastic for fast-food workers to work 40 hours a week and receive $600 every week. But it’s not clear whether raising the minimum wage to $15/hr will actually cause fast-food workers to take home $600 a week. If raising the minimum wage actually *reduces* weekly take-home pay, then you’re actively pushing them further away from a living wage.

      I don’t know how to cure a rattlesnake bite. But if your proposal is to have the snake bite him a second time, I will probably urge you to reevaluate your treatment protocol.
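
      A minimal numeric sketch of the point above (all figures hypothetical, not taken from the comment): if a worker’s marginal product is below the new wage floor and the employer responds by cutting hours, the higher hourly rate can still mean lower weekly take-home pay.

```python
# Toy numbers, purely illustrative: weekly take-home pay before and
# after a wage floor, when scheduled hours get cut in response.

def weekly_pay(hourly_wage, hours_per_week):
    """Gross weekly pay at a given wage and schedule."""
    return hourly_wage * hours_per_week

# Before: $9/hr at a full 40-hour week.
before = weekly_pay(9, 40)   # $360/week

# After a $15 floor: suppose the employer offers only 20 hours,
# because the worker's marginal product doesn't justify 40 at $15.
after = weekly_pay(15, 20)   # $300/week

print(f"before: ${before}/week, after: ${after}/week")
```

      Whether hours actually fall that far is an empirical question; the sketch only shows that a higher hourly rate and higher weekly income are not the same thing.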

    • Albatross says:

      My outlook on the minimum wage is complicated. I am definitely in favor of raising wages because I believe labor is priced too low and as an investor low prices mean it is time to buy. Fordism is my favorite method. Walmart and Costco workers spend a lot of money at the stores they work at. It is better if customer employees make $10 or $15 because they will buy more of your stuff. The effect doesn’t apply at funeral homes, but on average raising wages should raise sales. For complex reasons I predict there won’t be much inflation from wage increases (federal interest on bank deposits and the gap between inflation on luxury goods vs low prices on food, gas and clothing).

      Private companies increasing pay is my favorite mechanism, and many successful companies are doing just that. Costco, QuikTrip, and Chipotle wages have pulled Walmart and McDonald’s forward, with a cascade into other businesses like Target.

      I prefer minimum wages to be local. Each state and major city should set an appropriate level. $15 isn’t high enough for San Fran and is way too high for rural Mississippi. Minnesota linked theirs to cost of living, which is a good long-term solution. But Fordism advocates hope wage increases will lead to higher sales and profits and make such laws irrelevant. Americans work very hard and are very productive, and higher pay isn’t at odds with capitalism. The fad of keeping wages low hurts companies. Nobody can afford a brand new truck. Nobody can afford to remodel their kitchen. We need more money for companies to get more of it.

      • CatCube says:

        “Pay our workers more so they can pay more for our products!” sounds an awful lot like the logic underlying a perpetual motion machine. I’m not sure how you can keep in business relying mostly on your own workers to buy your stuff, because you need to make up both your workers’ wages + other inputs in your pricing structure.

        • John Schilling says:

          It’s a common myth as to how and why Henry Ford turned automaking into a relatively high-wage, high-profit industry. As you intuit, it doesn’t work. And it isn’t why Ford paid well; he needed a higher quality of worker to support his business model so he had to pay more to get and keep them; if some of his employees bought some of his cars, that was purely coincidental.

        • Albatross says:

          My wife works as an adjunct at a private college. They don’t pay her enough for me to get an MBA there or send our kids to that college. Other adjunct roles pay similarly, but most of those schools are cheaper. Considering how big a role parents’ education plays in children’s educational attainment, the decision to price above academics’ means is curious.

          When I worked retail, workers were our best customers. A $1 raise is going to increase sales. Thus paying employees $15 an hour doesn’t cost $15 because the company gets a % back as sales. Obviously this doesn’t work for pay at higher rates due to diminishing returns. But with American wages stagnant over decades there is plenty of room. Full time at $15 is $30,000 a year. American productivity compares well with Japan and Germany which both support higher median wages by having fewer extremes. At a certain point American wage cost cutting dampens prices, sales and growth because profits to Walmart and McDonald’s go to rich stockholders who don’t buy anything at Walmart or McDonald’s. Wages are an expense, but set too low they will hurt the company. Just like every other expense.

      • Adam says:

        The best way to get your employees to buy your own stuff is to offer an employment discount, in the cases of Walmart and Costco that sell the same thing as anyone else, or to make a better product in the case of places that actually sell differentiated products. Just paying them more is pushing money in circles like CatCube pointed out. The best scenario for any single employer is every other employer pays their employees well enough to buy a lot of your product, but you pay your own employees next to nothing. Of course, that doesn’t work in practice, either, because no one will want to work for you. In practice, you get the exact equilibrium we actually have. Pay people in some shitty place that sucks next to nothing to produce your product, then sell the product somewhere else with a thriving middle class.

    • CatCube says:

      Well, some of the snark about $15 fast-food workers is being driven by the legal change in New York State (NB: The whole state, not just the city.) There, they now have a minimum wage exclusively for fast food of $15. If you’re working at a bookstore, you still have the old minimum wage. If you are a cashier at a grocery store, you still have the old minimum wage. If you’re at McDonald’s, $15/hour.

      Also, this applies to (again) all of the state, including places where more skilled jobs might be around $20/hour.

    • Urstoff says:

      We already have a better answer: the EITC. The benefits mostly go to the worker, and there is no risk of disemployment as there is with a minimum wage (indeed, it is structured to incentivize work on the supply side).
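      The phase-in/plateau/phase-out shape alluded to here (“structured to incentivize work on the supply side”) can be sketched in a few lines of Python. The rates and thresholds below are illustrative placeholders, not the actual IRS schedule, which varies by year and number of children:

      ```python
      def eitc(earned_income, phase_in_rate=0.40, max_credit=6000,
               phase_out_start=20000, phase_out_rate=0.21):
          """Stylized EITC: the credit rises with earnings (phase-in),
          plateaus at max_credit, then tapers to zero (phase-out).
          All parameters are illustrative, not real IRS values."""
          credit = min(earned_income * phase_in_rate, max_credit)
          if earned_income > phase_out_start:
              credit -= (earned_income - phase_out_start) * phase_out_rate
          return max(credit, 0.0)
      ```

      The key supply-side property is that the credit is zero at zero earnings and grows with the first dollars earned, so, unlike a flat benefit, it never pays more to not work.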

      • Kevin says:

        Now we just need to expand the EITC to people without children and people without jobs.

      • Devilbunny says:

        Unfortunately, the EITC is usually spun as “corporate welfare”. It is a very efficient way to put money in the hands of poor people, but like many economically efficient things, it is easily spun in a negative direction.

      • Dude Man says:

        The problem with that is that Republicans will offer raising the EITC as a counter-proposal while, at the same time, including cuts to the EITC in their budget proposals. The EITC may be a better policy than a minimum wage increase, but its proponents are not negotiating in good faith.

    • Adam says:

      Most likely, since momentum seems to be on the side of these laws, I think what will happen is that fast food workers end up getting paid $15/hr and fast food places raise their prices, but at that point McDonald’s is competing with Five Guys or an even better place, won’t be able to produce shitty food any more, and will either go out of business or change; something roughly equivalent to Five Guys will be about the shittiest restaurant that still exists. Whether that’s a better world or not I don’t know. I guess it depends on the number of people buying from McDonald’s because it’s really fast but who would gladly pay more or incur inconvenience if they had to, versus people who literally can’t afford anything not on the dollar menu.

      • mico says:

        MCD and other chains will be replaced by independent immigrant-run hole-in-the-walls like in Germany, where the chains are unionised but there is no minimum wage.

      • FJ says:

        Isn’t automation the most likely medium-term outcome? Perhaps burger-flipping robots haven’t been perfected yet, but plenty of retail food establishments are already introducing self-service ordering kiosks, and this brochure explicitly touts the kiosks as cheaper than paying minimum wage. Rising wages (whether legally mandated or simply the result of a tight labor market) will encourage employers to substitute capital for labor. This substitution effect is, after all, the justification for high taxes on cigarettes, gasoline, and (via tort law) defective manufactured products… there’s no reason to assume that the substitution effect will suddenly stop working in the fast-food industry.

    • Ben says:

      The problem with the $15 minimum wage is that it doesn’t actually make sense in much of the country, including, as it happens, the vast majority of New York state by land area and about a third of the state’s population. I have no doubt that a $15 minimum wage is very important in NYC and in a few other high-cost metro areas in the country (SF/Bay Area, LA, DC, Seattle), but in much of the country it’s utterly nonsensical.

      I live in upstate NY, one of those wonderful, dying, snowy, rust-belt cities. The median hourly wage here is around $17.50; for comparison, the median hourly wage in NYC is $21.70. It doesn’t take an economist to realize setting the minimum wage to 85% of the current median isn’t particularly stable. Also note that Rochester, NY is one of the better-off parts of upstate NY: drive an hour to Buffalo and the median hourly wage is $16.70, and a couple hours away in Binghamton it’s $16.
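      The minimum-to-median comparisons above (the ratio is sometimes called the Kaitz index) can be sketched as follows. The wage figures are the ones quoted in the comment, and the function name is just for illustration:

      ```python
      # Kaitz index: a proposed minimum wage as a fraction of the local
      # median hourly wage. A ratio near 1.0 means the minimum binds on
      # a large share of local jobs.

      def kaitz_index(minimum_wage, median_wage):
          """Minimum wage expressed as a fraction of the median wage."""
          return minimum_wage / median_wage

      # Median hourly wages as quoted in the comment above.
      median_wages = {
          "NYC": 21.70,
          "Rochester": 17.50,
          "Buffalo": 16.70,
          "Binghamton": 16.00,
      }

      for city, median in median_wages.items():
          ratio = kaitz_index(15.00, median)
          print(f"{city}: $15.00 minimum is {ratio:.0%} of the median wage")
      ```

      The same $15 floor that sits at roughly 69% of the NYC median lands at 94% of Binghamton’s, which is the commenter’s point about one statewide number fitting very different local economies.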

      I know you can live on $10-11 per hour in Rochester; I know because I do it, and I know other people who do it too. Several of my fellow PhD students support a non-working wife, and in some cases a child as well, on our stipends, which work out to $10-11 per hour based on 40 hours per week, 50 weeks per year. It’s not living the high life, but it’s entirely livable. One of the “benefits” of the whole dying-city thing is that housing is incredibly cheap. You can rent a 2-bedroom for less than $900 per month, and a friend recently bought a 3-bedroom house for less than $120,000.

      The $15 per hour minimum wage fits into a broader theme of New York state laws designed for NYC that create real problems for cities upstate which are completely different economically. More broadly, the people in those high-cost metro areas have a valid point about cost of living, but the Fight for 15 movement doesn’t make a ton of sense in huge swaths of the country where it is currently being pushed heavily.

      Like many people here I think guaranteed basic income is the correct long-term approach, but so long as we have a minimum wage it’s important to be realistic about what a given economy will support. A statewide and federal minimum wage should be calibrated toward the low end, and then individual municipalities should be let increase it as needed; or, perhaps even better, the minimum wage could be indexed directly to cost of living in a given county.

    • Autonomous Rex says:

      A lot of people seem to be standing on the sidelines, neither minimum wage employees nor employers, who viscerally want it to be lower. It’s a kind of partisan sadism that goes along with wishing the worst for political opponents, but it’s always presented as empathy for the least-skilled worker.
      This empathy ends when you talk about job training for the least-skilled.
      We could get somewhere if they would just say “we want cheap labor for our corporations.” Instead, millions of everyday conversations about the issue bog down in the libertarian’s gleefully deployed “concern” about potential disemployment effects.
      These are regular citizens who curiously identify with billionaires and billionaire needs.

      • Jon Gunnarsson says:

        The question is whether higher minimum wages actually help low-skilled people. Accusing libertarians of intellectual dishonesty and faux concern (and without any argument or evidence) does nothing to resolve that question.

      • houseboatonstyx says:

        > These are regular citizens who curiously identify with billionaires and billionaire needs.

        That came out pretty explicitly in 2000. On the issue of whether the 3% with the highest income should have their taxes raised, someone did a poll and found (figures from memory here) that 23% of respondents thought they were already in the 3%, and 19% expected to get there well before retirement age.

      • RCF says:

        Everyone is a minimum wage employer. It’s almost certain that you buy something for which minimum wage labor is an input.

        • Dude Man says:

          I wasn’t aware that buying a whopper at Burger King made me an employer.

          • Marc Whipple says:

            “The man who eats meat is at the same moral level as the butcher.”

            I’m not a big believer in boycotts, as it’s basically impossible to target them effectively in an economy as interconnected as ours. But the logic is reasonable.

          • TheNybbler says:

            Yes, the man who eats butchered meat is as morally responsible for the cutting apart of animal carcasses as the butcher. But that’s because cutting apart animal carcasses is necessary to get that steak or whatever to the plate; eating butchered meat implies being OK with butchering. Responsibility for the parts of the butcher’s business practice that aren’t necessary to cutting apart the carcasses doesn’t necessarily flow through to the customer.

            I see three major possible results from raising the minimum wage

            1) Increased unemployment: employers make do with fewer employees, e.g. cutting services, using automation, getting existing employees to work harder, or replacing employees with fewer who are better at getting more done in a given time. Also, newly non-viable businesses closing.

            2) Wage compression: Jobs which are now $15/hr still pay $15/hr. Jobs which are now $8/hr also pay $15/hr.

            3) Inflation

            Here 2 is the desired effect. It seems to me the immediate effects are likely to be a mixture of 1 and 2, but 3 will be the inevitable result. The effects of the higher labor costs will flow through the system, we’ll get wage and price inflation across the board, and within a few years we’ll be hearing demands for a $30 minimum wage.

          • onyomi says:

            I don’t even see why it’s ethically desirable for jobs which are now minimum wage to rise to $15 while jobs which are now $15 remain at $15. I know people who have worked 10+ years to reach the point where they provide $15+ of value to employers. Why should every teenager be artificially lifted to that level? To do so means you basically have some jobs where you *really* earn your salary and others where you earn it plus a bonus, which comes at the expense of business expansion and the unemployed (and I don’t think I know anyone who thinks people with low-paying jobs are more deserving of pity than the unemployed, so…)

            Eventually, no one will do the *real* $15/hour jobs for just $15, since one can earn that at McDonalds. McDonalds will gradually replace most of their employees with machines, the other businesses will compensate for the fact that no one will work for $15/hour anymore through a combination of raising the salary of a few employees, laying off the less valuable employees, automating, outsourcing, downsizing, merging, etc. (and that’s assuming they manage to stay in business, which they won’t all do). And yes, of course, prices of consumer goods will go up. In other words, the effect, again, is a smaller number of better jobs when what the economy and, in fact, justice both want is a larger number of crummier jobs.

            Increased unemployment and sluggish growth will increase the cry to expand government welfare programs and further reduce the incentive for people currently on disability, unemployment, etc. to get off it. Of course, having more people beholden to the government is great for the government. But not for everyone else. We already see the more advanced stages of this in Spain and Greece: an ever expanding underclass of the unemployable relying on government benefits provided by a shrinking economy which eventually can’t handle it.

            Increasing the minimum wage unambiguously takes us further in this direction of the expanding unemployable underclass. Which is why it’s ironic that progressives are always on about equality, equality, equality.

          • James Picone says:

            So the Australian minimum wage is $15.96 AUD/hour, or $11.67 USD/hour by current exchange rates.

            When should I expect that spiral of inflation and depressed economic activity?

    • RCF says:

      It seems to me that your post is doing two things: first, it ignores the issue at hand, which is the feasibility of a $15 minimum wage, and instead pretends that the issue is its desirability, as if those scoffing at it were expressing the sentiment that it is no great hardship to live on the current minimum wage; second, it does so in a rather oblique way, rather than explicitly stating your thesis.

      $15/hour at 2,000 hr/year comes out to $30,000/year. Per capita GDP is about $50,000/year. Is it feasible to have the minimum wage be 60% of per capita economic output?

  59. Who says you *can’t* have half a medical degree? You could have a progressive series of certifications qualifying people for a certain level of medical work. Actuaries basically have this sort of thing going on now. They spend their careers studying and taking tests earning themselves higher earning potentials with each test. I think basically all licenses and certificates can be gradualized in this manner.

    • Creutzer says:

      I think the idea is that medical work itself is not neatly classifiable into “levels”, at least below a certain (pretty high) threshold. With the consequence that someone with half a medical degree is just pretty useless because there are no tasks for them.

      • Murphy says:

        Well, that’s the structure that’s been created, certainly. Whether a different structure could exist is another question entirely. The medical profession is already pretty strongly stratified; it’s just that many of the tiers have no straightforward route to move up from them.

        Care support workers, EMTs, nurses, med students, junior doctors.

        Plenty of supervised grunt work is found for med students, it’s just that the current system treats it as the unpaid apprentice tier so they don’t need to actually hire people with pay.

        In specialist areas, nurses have more knowledge than junior doctors. In a neuro hospital I’m familiar with, the official workflow, if a neuro-trained nurse considers something serious to be wrong, is for them to bypass the junior doctor on the floor and escalate directly to the specialist consultant (because there were too many deaths due to junior doctors often being quite ignorant of specialist areas and failing to recognize problems the specialist nurses recognized).

        If you were re-designing the system from scratch it would be logical for there to be a clear career path for specialist nurses to keep training and climb into junior specialist doctor roles but instead it’s designed on a historical guilds model so that anyone wanting to make such a move would have to drop out and restart from the bottom on the doctor-guild path.

  60. Bisitor says:

    In law it is the customers who cooperate with the tier system.

    It is a privilege issue with tacit support from large corporate customers.

  61. BBA says:

    Some more minimum wage anecdata.

    Labor unions ought to act as a counterbalance to this sort of behavior by employers, but in the private sector in America they’ve become negligible, so instead it falls on the government to collectively bargain on behalf of the citizenry. Sure this has a “democratic deficit” problem, but so do unions.

    • Adam says:

      That’s interesting. The California Department of Corrections and Rehabilitation had a $73,000 minimum wage ten years ago.

      • BBA says:

        In the public sector, the union can effectively be on both sides of the bargaining table, as with California prisons. This is connected to why unions are still relevant there but not in the private sector.

        (Also, “can be” != “always is”, as any teacher in Chicago can tell you.)

        • Adam says:

          I’m sure that’s part of it, but I actually went through most of their application process before joining the army instead because it took so damn long, and I think a lot of it was just the locations of the prisons. Aside from Chino and San Quentin, they’re in the absolute middle of the desert at least a hundred miles from anything resembling civilization. They had to do something to entice people to do that, on top of the general shitty conditions just being a prison guard in the first place. And you didn’t get to choose where they sent you.

  62. Anonymous says:

    Scott, I’m noticing a pattern in your writing of personifying government as some kind of benevolent ‘we’. As though, at the moment, government is being driven by bad people with stupid ideas, but if ‘we’ just get involved then ‘we’ can make it stop doing bad things and start doing good things.

    I’ve seen this mentality in other people too. I think it’s worth seriously considering that it matters a great deal exactly where you cut off an arm of government. You cannot simply say that the government should stop doing bad things and only do good things; that, to take this mentality to the extreme, the government should micromanage everything, that we should have a planned economy, but one where the government doesn’t make mistakes and only makes good decisions. In many, many cases, cutting off the arm lower down, at the finger level, removing only the bad fingers and keeping the good fingers, is completely unviable. The bad fingers are bad for a reason, you have no way to identify and remove them. But if you cut off at the elbow, then the private arm that will grow in its place will have far fewer bad fingers, because the conditions that create private arms are not favorable to bad fingers in the way that the government arm is.

    I’m sure this is all sounding awfully tribal, and as we all know, tribalism is bad, so you might be inclined to look for reasons why what I’m saying is sort of true but also sort of not true, and therefore retain the satisfying idea that all tribes are wrong to roughly the same extent, and the truth is somewhere in the middle. Which is why I think it’s also worth seriously considering that while reversed stupidity is not intelligence, averaged stupidity is also not intelligence. It’s absolutely possible for the truth to lie, not in the middle of all the ideological groups, but fairly near to one of them, with the ideologues of that group having come to roughly the right conclusions for entirely the wrong reasons.

  63. Aaron says:

    The duality of academia is due to the regulatory environment which supports it. It’s an artifact of that environment and without it, the university system as we know would be unlikely to survive.

    It’s not hard to imagine a system in which degree requirements are not legislated. Students could take whatever courses they wanted and then pass a certification test by some professional organization. Teachers who were actually good at teaching would be in high demand, as would be their video/online lectures. The bad teachers not so much.

    All the mandatory fluff would be gone and the costs would be vastly lower (no more monopoly pricing). Publish or perish would be dead. Elite universities would likely survive for the wealthy due to their prestige.

    Online learning has changed the nature of education; the fact that universities (and all schools, for that matter) continue to operate more or less as before is indicative of the fact that they are artificial.

    • John Schilling says:

      Outside of medicine and law, where are degree requirements actually legislated?

      • Douglas Knight says:

        Most people in America work subject to licensing requirements. Most of those requirements include education, not just testing (e.g., barbers and real estate agents), although not a lot of it and usually not through the college system.

        College degrees:
        The huge population of school teachers requires a bachelor’s degree. They don’t have to major in education, but they do have to do education-specific things; it’s probably a minor’s worth of course work plus time spent teaching. High school teachers probably have to minor in their subject. Some states require master’s degrees (in anything?). Pay is usually determined by degree, so most teachers get a master’s and a lot get a PhD.

        Engineers require a bachelor’s degree in engineering plus an apprenticeship. Architects require a master’s in architecture, with a bachelor’s in an arbitrary field as a prereq (although a bachelor’s in architecture shortens the master’s by a year). Actuaries and CPAs mostly face written tests studied for out of school, but a prereq for the tests is 4-5 years of study in an arbitrary subject. (I think CPAs have to minor in accounting.)

        (I don’t think that this supports Aaron’s claim.)

        • John Schilling says:

          Licensing in general is a big issue; I was asking specifically about statutory requirements for a college/university degree.

          Schoolteachers were the big one I had forgotten about there; thanks for the reminder. CPAs as well, which I should have remembered on account of my stepsister is one. But engineers, no. Most engineering jobs don’t require a license, and by the time anyone is going to offer you one of the minority that do, you’ll almost certainly have enough work experience (if you’ve been careful to document it) to sit for the exam even without a degree. I think the same is true for architects, at least in some states.

          Generally speaking, I don’t think lack of a college degree bars one from a large fraction of middle-class jobs by statute. Employer preference or policy is another matter; for the moment, a college degree is about the easiest not-a-complete-idiot-or-slacker filter that will reliably pass Griggs, and there are always resumes in the stack that have a relevant degree.

          • Douglas Knight says:

            The fact that a few states do not require college for engineers and architects does not change the fact that most do.

          • Agronomous says:


            > Schoolteachers were the big one I had forgotten about there; thanks for the reminder.

            This is the kind of thing that makes me read the comments here at SSC, when I don’t bother with those at other sites. More of this, everybody.

            (I don’t mean to single out John Schilling, except to say that he’s exceptionally typical of the level of discourse here.)

      • Aaron says:

        State law dictates how many credit hours are required for a degree (I don’t know if this is true for all states or just most of them). And there is also the accreditation process which is required by the federal government. Though the accreditation organizations are usually private, they enjoy a quasi-governmental and privileged role.

        My point was that this complex web of regulation has the effect, if not the purpose, of maintaining the current university system as is. This allows prices to rise due to lack of alternatives, and puts little to no pressure on universities to change.

        Were this structure not in place, I believe the higher education system would have changed dramatically, just as other industries have due to new technology. The fact that it has not is at least indicative that the status quo is protected. Travel agents didn’t have regulatory protection and are essentially gone. Real estate agents and taxi drivers do, and are still, for now, around.

        College degrees also have a strong cultural presence and this is a key factor as well. I suspect though, that if the university system was allowed to change, that attitudes towards degrees would change along with it.

    • Dude Man says:

      > Online learning has changed the nature of education

      Has it, though? The completion rates for most MOOCs are abysmal, the certification tests haven’t appeared, and I’m still not entirely sure what benefit an online class has over a correspondence course, which have existed for a while now. Maybe these things will change as online learning advances, but there’s a difference between “might change the nature of education” and “have changed the nature of education.”

      It’s also worth pointing out that the most prominent alternative to traditional education that began in the last ten years (coding bootcamps) still relies on in-person teachers.

      EDIT: changed “will” to “have”

      • Linch says:

        “I’m still not entirely sure what benefit an online class has over a correspondence course, which have existed for a while now”

        Well, one of the above is free.

      • Aaron says:

        That’s a fair point. How about “could change the nature of education”? Access to high quality online training material such as lectures and computer-aided learning programs doesn’t mean live, human tutors and teachers are no longer needed.

        Given the freedom to choose an education program I could see a variety of interesting and novel ways emerging. Perhaps you might watch lectures by great lecturers from around the world, and then participate in tutored student study groups in person? Or you could learn Russian from a teacher located in Russia via a multi-person Skype session. All kinds of things would be possible.

  64. Steve Sailer says:

    Richard Epstein argues that tenure is a way to grant professors some property rights in the university, much like making partner gives lawyers some property rights.

  65. Barry says:

    “What about tenure? We can imagine an alternate universe where academia is populated with various PhDs on equal footing. Since there would be a glut, their salaries would be very low to start, but low salaries would mean easy employment, and colleges would find a lot of room for them to do one-on-one tutoring, or low-level research, or something like that.”

    This I disagree with. Note that adjuncts seem to be treated miserably except when a particular administration or department wants to be nice.

    I haven’t heard of a gradation there, except of bad and worse.

  66. JohnMcG says:

    I’ve had similar thoughts along these lines on the “women and tech” discussions.

    The general prevailing theory is that there are fewer women in tech because male engineers are jerks who exclude women, subject them to a series of aggressions and harassment, both obvious and micro, and that the solution is for male engineers to shape up. I’m sure there’s some truth to that.

    But then, women have been able to break through in fields like law and medicine, where the established male incumbents are not exactly known for their sensitivity. It seems unlikely to me that male programmers and engineers are bigger jerks than male doctors and lawyers.

    And I think part of the answer is that doctors and lawyers have more social prestige than engineers and programmers, and therefore people are willing to put up with more crap to become doctors and lawyers than programmers and engineers. And engineers’ lower social status means that charges of insensitivity stick. The woman who quits medical school because of a creepy classmate will be told to toughen up; the one who quits a tech internship will find allies.

    • JohnMcG says:

      Another consequence is that the non-dual professions are a constant grind, more so than the dual ones. There is never a point where someone in a non-dual profession has “made it.” There is upward mobility, but also downward mobility.

      This probably has good consequences in terms of output, but maybe not so good consequences in terms of humanity for those involved.

    • Saint_Fiasco says:

      The Last Psychiatrist has an interesting theory that says women are one generation late in the fight for high status jobs.

      > And it’s quite apparent that power is a generation or so ahead of you, so in 1990 a 40 year old who grew up around successful lawyers then says to his 5 year old, “daughter, you should become a lawyer!” and she probably at one point collaborates to decry the lack of female role models, and then by the time she graduates law school she discovers she’s a dime a dozen, power has been withdrawn, one step ahead

      He names some other examples like college and politics.

      Women today are struggling to be recognized as competent technology workers, because that’s where the power and money are now. Maybe twenty-five years from now programming knowledge will seem trivial and not some kind of arcane wizardry. Then men will move on to some other field and women will get to be programmers one generation too late.
