Griggs v. Duke Power Co. seems to come up a lot here as a scapegoat. This is the Supreme Court case that said companies can’t use anything like an IQ test to screen job applicants unless they can prove in court they’re not being racist. Since this is hard to prove, most people play it safe and avoid these tests.
So (opponents of the case figure) this is the reason we’ve gotten so bogged down in credentialocracy. Along with whatever particular skill they’re going for, employers want generally smart people. They’re not allowed to test for that directly, but someone who can finish a college degree is probably pretty smart. So employers demand a college degree as a minimum requirement. And we end up with this farce of someone going $50,000 into debt to study Art History for four years so they can get a job in marketing.
This is the story, but I don’t think it’s true.
The Griggs decision explicitly places the same restrictions on college degrees as it does on IQ tests. Obviously nobody cares about this, and it turns out the Griggs restrictions aren’t that strict and most employers manage to ask about college degrees just fine. This shifts the argument to whether the complicated case law around Griggs makes employers figure that college degrees will probably be accepted in practice, but IQ tests probably won’t be.
But I don’t think that’s the main cause either. Other countries don’t have their own version of Griggs v. Duke. I don’t know too much about their labor markets, but I think many of them have the same problem. This UK website starts by saying “the fight for a well-paid job without a degree is a tough one, but there are still a handful of roles out there in which you can earn serious money without a degree-level qualification,” which is the same over-optimistic language you might hear in a US article making the same point. I also remember that when I went to teach English in Japan – a job which required no teaching credential and only the English-speaking ability typical of any Anglophone child – they still insisted I have a Bachelor’s in something before letting me apply.
Another point against: I think it’s Griggs v. Duke-compliant to ask people their SAT scores. See this Wall Street Journal piece, and TIME’s “Your SAT Scores Will Come Back To Haunt You.” Contrary to these articles’ predictions, I don’t think asking for SATs in job interviews ever really caught on, but that’s not the point. The point is that it seems to be legal, and a known thing which companies could do if they wanted. Given that the SAT more or less approximates an IQ test, if there were some pent-up demand for IQ-testing job candidates the SAT would be a perfectly good alternative. Given that only the handful of companies in the articles above ask for SAT scores, I don’t think employers are really that interested in IQ.
This leads to a third objection: there are some fields where standardized test scores are universally available, and they end up as credentialocratic as everywhere else. In medicine, for example, all applicants take the MCAT, a pre-med focused standardized test. But medical schools in the US still require a college degree plus a bunch of universally-recognized-as-irrelevant requirements (all pre-meds have to pass a Calculus II class, even though Calculus II never comes up in medicine). All medical students are required to take several USMLEs and report the results to the residencies they apply to, but which medical school you went to still matters a lot if you want a residency in a competitive specialty. In fact, residencies are infamous for turning down candidates from foreign schools with genius-level USMLE scores in favor of US-trained candidates who barely managed a pass, for unclear reasons. Lawyers have the LSAT (and possibly bar exam scores?) and are also pretty famous for being judged on what school they went to.
All of this makes me think that, as nice as it would be to attribute everything irrational about credentialocracy to one bad Supreme Court decision, there’s probably something wrong on a deeper level.
I think it’s some combination of prestige associated with higher education (from the days it meant much more than it does now), selecting for people who are conscientious (and it’s a good bet that someone who managed to stay in education so long would be), and wanting to have the best candidates (high IQ and college education is probably better than just high IQ in the employer’s mind).
I figure that in the long run, a university education will be a net negative in a hiring situation, given how little is required to get one.
People always say college screens for “conscientiousness,” but I wonder how much of that is just rule-following.
I mean, if you tell an 18-year-old, “You have to spend four years getting a piece of paper that will make you more employable, and it doesn’t really matter how much you learn,” well, there are two reactions. Some people will sigh and jump through the hoops; other people will say that’s ridiculous and suggest you do something anatomically challenging to yourself.
You can call the first group more “conscientious” or you can call them (us?) rule-followers who aren’t willing to challenge the status quo. Either way, I can understand why many employers would rather hire from a group of people who have proven a willingness to jump through arbitrary hoops.
Yeah. This comes at it from a different angle, but I was wondering how much the credentialing process filters for a certain cultural compatibility by proxy: selecting from a group which all went through an approximately similar process or environment. Sometimes the hardest part of hiring is finding someone who has that nebulous quality of “being a good fit” at the company. If I went to the same college as the guy hiring, that tends to raise my “fitness” in their eyes.
I agree with this. I’d say it’s fit + conscientiousness.
As long as you pass some minimum competence threshold, personality fit is more important. Most of the time, the person hiring you would rather hire someone that is pleasant to be around and mediocre at the job than someone who is difficult to relate to but exceedingly good at the job. This is especially true at larger organizations.
This also goes to the point of conscientiousness. If a job requires a minimum IQ of 110, we might not value a 150 IQ much more than 115. But an incremental indication of fit and conscientiousness goes a long way.
All that said, companies are willing to hire large numbers of H-1Bs, which would seem to contradict the cultural-fit argument when meaningful money can be saved. I suppose that’s a decision made higher up, by managers who can see the money being saved in hiring but have no reliable way to evaluate how competent the new hires are, and who aren’t directly exposed to any culture clashes.
So, if you want to end the college bubble, put money on the line. Require companies to pay a tax (set it at something like 10-20% of starting salary) if they ask for job applicants’ level of education, or if they directly recruit on campuses (actually, go ahead and double the tax on that one). This solves the problems with simply banning the practice of disclosing level of education. They can ask, but it will cost them. They have to decide if it’s worth it.
My impression is that software companies often bring in a large number of H1Bs from India at the same time, which probably gives you a locally-cohesive culture.
Something occurred to me that may support the “good fit” factor playing a role. In what may be the most amusing gap between job requirements and credentials, many of the most expensive escorts are college educated and advertise the fact. At first blush you’d think a college education has no bearing on being a successful whore, but if you think about it, it makes sense in a “good fit” way.
The men who can afford high-class escorts are themselves college educated, and at that price bracket they really are paying for companionship more than just sex. The degree thus provides some common social ground, while also signalling a degree of intelligence and class, which are also important considerations. In short, a college-educated punter has good reason to think a college-educated escort will be a better fit. In fact, I predict that if escorts didn’t have to protect their privacy for social and legal reasons, those with prestigious alma maters would advertise them as well.
It seems to me then that credentialism is indeed to some extent due to cultural factors.
Remember all the viral articles about the porn star who was going to Duke?
I predict that those things don’t go viral if she’s going to Middle Tennessee State.
Going to college doesn’t require conscientiousness, but graduating does. Or at least that’s the way it was; now it’s almost the other way around, as colleges are less likely to fail students.
But the theory is that the mindset and abilities that allow someone to write thesis papers and study for and pass exams while living alone will translate into a professional who can learn on the job and complete demanding tasks, and this is the best proxy absent other more relevant experience.
Which is why Scott was admitted to Medical School despite having a largely unrelated degree in Philosophy; the graduate school assumed that completing any rigorous course of study would demonstrate aptitude (along with the exams).
Well, not exactly. As he tells it, he went to medical school in Ireland where college isn’t a prerequisite for med school, so any college degree at all is a bonus.
What Randy M said. Conscientiousness helps with pretty much everything, and doing well in college will naturally happen more often to people who are highly conscientious.
By and large, those are exactly the same personality trait. An organized radical is often an oxymoron.
I feel this is a huge loss to many organizations. Both types of people would help a company excel, yet so many areas employ people who have all studied the same broken curriculum. Out-of-the-box thinkers are necessary to overcome obstacles.
I always want to change the status quo but maybe that just means I should lead and hire a bunch of conscientious followers.
It surely screens for many different things, including conscientiousness, rule-following, IQ, and enjoyment of learning.
That last one is probably quite significant. I’m probably of average conscientiousness, but I enjoy learning so much that most studying/school work wasn’t too unpleasant for me.
Of course, these aren’t independent. Things are more fun if they come naturally to you, and it’s easier to stick it out if you’re seeing results. Scott’s old post “Parable of the Talents” seems relevant here.
How about “goal driven”? Not sure if that is “a thing” in the literature but it seems like one of the more important ones.
College says “Here is a long term goal which has some requirements. We’ll help you figure out how to get there, but ultimately you are responsible for picking your own path”. Many jobs that require a college degree have elements very much like this.
According to Wikipedia:
“Conscientiousness is the personality trait of being careful, or vigilant. Conscientiousness implies a desire to do a task well, and to take obligations to others seriously. Conscientious people tend to be efficient and organized as opposed to easy-going and disorderly. They exhibit a tendency to show self-discipline, act dutifully, and aim for achievement; they display planned rather than spontaneous behavior; and they are generally dependable.”
It seems to me that rule following is a major part of conscientiousness.
The standard of English teachers in Asia is already abysmally low; requiring a college degree is a *good* thing. Otherwise every slobbering buffoon from flyover territory would be heading there for a job, since they can’t get one at home now that the mill closed down.
I think you’re overestimating a) the willingness of the common folk to move, b) the prevalence of the knowledge that it (teaching English in East Asia) is even an option.
Just wanted to second your point b): my family of slobbering buffoons from flyover country, and most of the other denizens I have had the pleasure of talking to about it, had no idea that this was a thing.
Because the job is not available to slobbering buffoons. If it was, knowledge would eventually spread.
Japanese language schools requiring a bachelor’s is partly them projecting their own societal norms onto other cultures (where you go to school is everything over there); partly, I would guess, what they hope to be a form of background check to filter out some axe murderers; partly for marketing purposes (they can tell students their teachers are university grads); and lastly just a matter of because they can. It’s not like they could pay foreign non-grads much lower salaries; it’s already not much at all as it is. Most people go there for the experience of living abroad or because they have an anime/Asian chick fetish, not because it’s particularly profitable.
True about their own societal norms (all of East Asia has an even worse education fetish than the US, including a fetish for top US schools); also true that the Japanese are informed enough about America to know that, of the Americans they want (older than 21, middle classish or above, reasonably intelligent, mature, responsible…white, yes, they are racist), almost all of them have a college degree. Near enough that they can use it as a proxy and filter, without worrying about lack of applicants.
If it stopped being the case that all the sorts of Americans Japanese companies wanted to hire possessed college degrees, Japanese companies would stop using it as a proxy.
More generally, once it’s known throughout the world, as it currently is, that the smartest, most responsible Americans almost all have college degrees, then nothing stops anybody from using that as a proxy, even if it would make more sense on a society-wide level to just use some tests.
While I won’t let government policies entirely off the hook (it was mentioned in the other thread that the doctrine of “disparate impact,” for example, has made it harder and harder to use testing as a serious prerequisite for a civil service job), I think a big part of it is cargo-cult thinking plus a signalling treadmill. The rich and powerful were the first to send their children to college en masse. Others took this to mean not that the rich and powerful can afford to send their children to finishing school plus networking camp, but that a college education was the key to becoming rich and powerful.
And then, of course, once it became known that a college education was your ticket to a bright future, the government had to subsidize it, so that now it’s impossible to afford without a subsidy.
As I understand it, the English schools themselves don’t really care; the Taiwanese, Korean, Chinese, and Japanese governments require a BA to get a work visa to teach. Also, ‘BA in anything’ English teachers are not restricted to Asia; I believe there are opportunities in South America, the Middle East (do you capitalize Middle East?), Europe, etc., where I doubt the particulars of Asian culture have much influence.
That’s a misconception. Out here in flyover territory, very few of us buffoons actually slobber.
And those who used to have well-paid blue-collar manufacturing jobs are an aging cohort today, tied down with family commitments, unlikely to have passports and the spare cash for a $1,000-plus one-way trip to Japan.
In any case, adjusted for cost of living, is teaching English in Osaka really a better deal than working fast food or Walmart in a Detroit suburb?
My husband taught English in Korea and saved a bunch of money doing it, while also donating to charity generously and spending as much as he liked on his own interests.
I believe the compensation is reasonable, and importantly they often provide housing. I think the ‘lived thriftily and paid off my student loans teaching abroad’ story is fairly common on the websites dedicated to this sort of thing.
> In any case, adjusted for cost of living, is teaching English in Osaka really a better deal than working fast food or Walmart in a Detroit suburb?
No, not particularly. It’s kind of a Peace Corps-like thing to do after college when you don’t really know what you want to do yet.
There are many markets other than Japan which do not require a college degree for English teaching, yet as far as I am aware they still manage to operate and are not filled with slobbering buffoons.
Given the rather negative view I have of campus politics, if I was an employer I wouldn’t be looking to hire humanities graduates.
It’s a lawsuit waiting to happen.
Sure, but that’s a) limited largely to the USA (are there even any other countries with substantial victimhood leftism and a trigger-happy lawsuit system?), and b) something that only started to be visible in the mainstream recently (employers may or may not have had time to reassess the situation).
I don’t have a ton of proof, but I think a lot of companies won’t ask for SAT scores out of fear over Griggs compliance, even if it *would* be permissible. Most of the time, companies are trying to avoid lawsuits they could win: it’s not just about avoiding lawbreaking, it’s about avoiding things that look enough like lawbreaking, when you squint, that people are tempted into expensive lawsuits.
I imagine this is a significant factor–the firms described as using SATs in the Time article are large consulting, finance, or software companies that can presumably afford the expense of gathering data to correlate test scores with a plausible metric of job performance and clearly have the legal firepower to defend themselves (therefore deterring spurious lawsuits). A smaller firm recruiting for positions that are less obviously IQ-related will be more heavily burdened by the need to establish that they’re meeting the Griggs test, and will probably have HR folks who are more inclined to cover their ass by just avoiding the risk in the first place.
Edit: the EEOC’s guidelines for employment tests, in addition to being 17,000 words long, indicate that Griggs compliance isn’t inherent to the SAT or any other test unless you can show that there’s a valid relationship to the specific type of position you’re hiring for. So you can’t simply say “hey, the guys at Prestigious Consultants Inc. use it, we’re good”; you have to show that the relationship holds for the job in question. Furthermore, there’s a requirement that the employer make an effort to find less-impactful alternatives to a test with disparate impact.
If you’re a manager trying to introduce test scores as part of your selection process, your HR person is likely to take one look at the mess of requirements and rattle off ten reasons why you can’t do it. An organization where hiring top people is vital can probably push through this, but most won’t bother.
If this were true, you would expect that common advice to job applicants to be that if you have a high SAT score disclose it in your job interview. (I once disclosed my GRE math score in a job interview when the topic of my math aptitude arose.)
It *is* common advice to state your degree field/major, which correlates decently well with SAT scores, and people often will list “cum laude” and the like. Also, as a hiring manager, I have seen a few people put down things like AP exam scores and ACT scores when I was hiring for an out-of-high-school data analyst position. Not a majority, but it does seem to happen in practice, at least.
Strangely, I don’t think I’ve seen an SAT score on the hundreds of resumes I’ve reviewed, even though I’m west coast and the SAT is far more common than the ACT.
If I saw an SAT score listed by anyone other than a recent college grad, it would raise a red flag to me. Why are you telling me about your accomplishments from High School? Why aren’t you telling me about the relevant work experience and projects you’ve done since then?
Putting your SAT score on a resume also risks getting you taken for a huge nerd (similar to listing Toastmasters/spelling-bee-type accomplishments). I have a perfect 800 GMAT score (never took the SAT) and have put mention of it on and taken it off my resume multiple times over the years, and I’m still uncertain whether it’s a net benefit: I include it when I interview with Asian/Middle Eastern firms and generally take it off for Western ones.
I never put my SAT score on my resume, but I did put down my National Merit scholarship (won basically on the strength of my SAT score) for a couple years after college. I don’t know how much good it did; no one I ever interviewed with asked me about it, and I removed it once my actual job experience was extensive enough that my resume didn’t need that kind of padding.
I’ve been tempted to include “sine laude” just to see if anyone’s paying attention, though apparently the proper term is “sans mention”.
I was advised to put my SAT score on my resume when in college, and a lot of people did.
I am primarily familiar with the legal field, so I’ll speak to the use of the LSAT rather than the SAT, focusing on the top end of the legal field (“biglaw”).
It is unimaginable that major firms could actually get away with sorting candidates by LSAT scores. The numbers are striking. In 2010, there were 7,789 applicants with a 165+ LSAT and a 3.5+ GPA — 63 of them were African American. The mean LSAT score among African American test-takers was more than a full standard deviation below that of Whites or Asians.
LSAC data (search “Table 4a”)
LSAC amicus brief (search “in the 165+”)
Law firms are already getting criticized heavily for not having more “diversity,” even where there are far more minority attorneys than one would predict based on LSAT scores (and then they get criticized when the numbers drop off as you go up the ranks). Law firms already have to practice affirmative action to get even close to an “acceptable” number of minority hires; they don’t really have an incentive to redo the work that law schools performed in figuring out which similarly (sub)-qualified minority applicants had better soft factors.
Perhaps this isn’t as much of an issue in other areas, but I would be surprised if so, given the way the discussion is often framed as needing to “catch up” in terms of diversity efforts to where mainline corporate America is.
How do minority lawyers do in the courtroom? I’m normally suspicious of claims about the value of diverse perspectives, but this strikes me as a situation where they might be justified.
That is a question that I honestly don’t know the answer to. Judging actual attorney performance is very difficult, and you’d have to control for a ton of factors. Perhaps someone has done so, but I’m unaware of it.
For what it’s worth, minority lawyers make partner at major law firms well below their numbers in the field but also far above the numbers predicted by their LSAT scores.
One possible explanation could simply be inertia. Even though this has been known for a long time, most employers don’t act on it.
E.g. this meta-analysis of how different tests predict performance: http://mavweb.mnsu.edu/howard/Schmidt%20and%20Hunter%201998%20Validity%20and%20Utility%20Psychological%20Bulletin.pdf
The top performers are:
1. Work sample tests (.54 correlation)
2. Cognitive tests (.51)
3. Structured interviews (.51)
The bottom performers are:
– Job experience (.18)
– Years of education (.10)
– Interests (.10)
– Age (.01)
So while it seems that a combination of an IQ test, a work-sample task, and a Big Five personality test could yield good results, most people look at resumes and do unstructured interviews. Because… well, because people are slow to adapt.
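To make those validity numbers a little more concrete, here is a minimal sketch of the usual interpretation: squaring a predictor’s validity coefficient gives a rough estimate of the share of job-performance variance it explains on its own. (This is a simplification; the meta-analysis applies corrections for range restriction and measurement error that this ignores.)

```python
# Validity coefficients from the Schmidt & Hunter list above.
# A coefficient r roughly implies r^2 of performance variance explained.
predictors = {
    "work_sample": 0.54,
    "cognitive_test": 0.51,
    "structured_interview": 0.51,
    "job_experience": 0.18,
    "years_of_education": 0.10,
}

for name, r in sorted(predictors.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} r={r:.2f}  variance explained={r**2:.0%}")
```

The gap is stark: a work-sample test explains roughly 29% of performance variance, while years of education explains about 1%.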
Yet some people clearly do. I learned this stuff from the book on how Google recruits, for example. Perhaps only the highest-performing companies select correctly.
There’s an expression I remember from the tech sector around the 2000s: “No one ever got fired for buying a Microsoft product”.
Which is to say: it may be demonstrably inferior, but it is broadly standard, and therefore easy to justify. This isn’t so much inertia as it is the natural sum of human incentives. If you’re an HR guy, you don’t get credit for great hires but you DO get a talking-to for bad ones. The people giving you a talking-to are probably business managers, not HR experts. They were hired based on their degrees and experience, and they are not totally up to speed on the latest studies examining the strength of correlation between evaluation metrics and eventual performance.
If you hire a dud, and you have to justify yourself to management, do you really want to be saying “well, according to this paper I read, IQ matters more than degrees, so I did something no other company our size is doing”?
And I think that’s the key. Take a look at the companies who are hiring people without degrees. They are mostly smallish startups who are generally inclined to take risks, or tech-sector companies with a culture that doesn’t reflexively punish risk-taking. Alternatively, you have people promoted through the ranks, somewhat bypassing degree requirements by demonstrating they can do the job.
To tie it in with the quote at the beginning, nowadays you can totally get fired for buying a Microsoft product if it’s not properly suited to the task, because it’s understood that the benefits of well selected software often outweigh the “risk” of going off-brand. Moreover, the knowledge that there are alternatives out there has percolated above the rank and file into upper management. Give it time.
That’s a weird cultural shift. The expression is much, much older than the 2000s, and goes “Nobody ever got fired for buying IBM”.
It’s still current, in the IBM form, today.
IBM is rapidly approaching the point where “buying IBM” doesn’t make any sense. You’d instead say “hiring IBM” like you’d say “hiring McKinsey” or “hiring Wipro”.
Well, to Nate’s original point: it may be that the corporate cultures capable of making this shift have a competitive advantage over those that don’t.
Or perhaps I’m just seeing entrenchment and selection bias at work. If you want to break into an established industry, you have to beat out the current players. The obvious risk-free moves are already mapped out, so you take risks (like hiring on IQ and using open-source software). If the risks pay off, you get to be part of the entrenched industry! This may or may not mean you continue taking risks. I.e., in 15 years we’ll know whether Tesla has a corporate culture that allows them to effectively take on risk, or whether they took a bunch of risks to get started and will stop once they are part of the market.
Or to put it another way, the companies REALLY willing to take risks and try new practices are companies who are screwed if they don’t, because the established players in most industries have such a large advantage that it’s not worth gambling.
Curiously enough, as a result of a months-long IT crisis, the Swedish PM today fired two ministers for what amounted to buying IBM. “Buying IBM” here meant outsourcing sensitive police and military data servers to the “IBM cloud” (i.e. random IBM engineers in Eastern Europe) without even bothering to check whether the Czech IBM engineers who could look at top-secret Swedish government files might need security clearances.
Some expected that the whole government might collapse because of this; the actual director responsible was fired several months before (and received an amazingly small fine that reeks of either corruption or the government trying to sweep the whole issue under the rug).
That study’s been updated with another 20 years of data: https://home.ubalt.edu/tmitch/645/articles/2016-100%20Yrs%20Working%20Paper%20for%20Research%20Gate%2010-17.pdf
See pg. 65, Table 1 for all updated correlation coefficients, but to excerpt:
GMA tests .65
Integrity tests .46, +20% validity when combined with GMA test (V)
Structured interview .58, +18% V
Unstructured interview, .58, +13% V
Phone interview .46, +9% V
Interests .31, +10% V
Conscientiousness .22, +8% V
Agreeableness .08, +0% V
Peer ratings .49, +1% V
Job knowledge tests .48, +0% V
Work sample tests .33, +0% V
Job Experience .16, +5% V
Age .00, +0% V
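One way to read the “+X% V” column above is as the proportional gain in validity when a predictor is combined with a GMA test (baseline validity .65). That interpretation is my assumption rather than the paper’s stated formula, but a quick sanity check shows it reproduces the headline combined figures:

```python
# Sanity check of one reading of the "+X% V" column: combined validity
# equals the GMA validity scaled up by the listed percentage increment.
# (An interpretation of the table, not a formula quoted from the paper.)
gma = 0.65  # GMA test validity from the table

def combined(increment_pct):
    """Combined validity of GMA plus a predictor with the given +X% V."""
    return gma * (1 + increment_pct / 100)

print(round(combined(20), 2))  # GMA + integrity test: 0.78
print(round(combined(18), 2))  # GMA + structured interview: 0.77
```

Under this reading, the much-quoted .78 for GMA plus an integrity test is just .65 scaled up by 20%.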
Thanks for that! Some interesting changes.
By the way, the table on page 65 looks to be all paired numbers: thing plus GMA (General Mental Ability, which I think is code for IQ/SAT/cognitive tests).
The level of correlation is up from the last version, though, and is quite shockingly high: GMA plus an integrity test reaching a validity of .78 (strictly that’s the correlation, which would correspond to about 61% of performance variance).
Yeah, GMA + integrity test is crazy high, and seems less labor-intensive than GMA + structured interview (especially if you’re doing a panel interview). I’d screen candidates with the first two tests, and then run the top set of candidates through a structured interview for the sake of efficiency. One could vary how many candidates to interview based on time, resources, and the importance of the position.
One can use other measures to minimize bias, like the ones Daniel Kahneman describes developing for the Israeli military in Thinking, Fast and Slow.
There are other things you can do too; a chapter in The Best Place to Work covers other effective ways of minimizing bias in the selection process, backed up with research.
It seems clear-cut to me… dunno why people don’t bother doing it.
Our hiring needs (which are probably not unique) don’t quite fit that process. We don’t have a bunch of people coming in at once for one (or a fixed number of) spots to fill; we have a few people coming in per week for a very large number of spots we would like to fill. Our process is actually pretty close: a candidate is evaluated on ~6 metrics, grading is done independently by different people on each metric without talking to the others, and the grading is as specific as possible, but instead of picking the best N candidates out of M total, we take all candidates whose individual scores and total score exceed some minimums.
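The selection rule described above (everyone above the bar gets hired, rather than top-N) can be sketched in a few lines; the metric count, grading scale, and cutoffs below are made-up illustrations, not our real numbers:

```python
# Hypothetical "hire everyone above the bar" rule: a candidate gets
# independent grades on several metrics and is hired only if every
# individual grade AND the total clear a minimum. Cutoffs are invented
# for illustration.
PER_METRIC_MIN = 3   # each metric graded, say, 1-5
TOTAL_MIN = 24       # minimum sum across six metrics

def clears_bar(scores):
    """Return True if all per-metric grades and the total clear the bar."""
    return all(s >= PER_METRIC_MIN for s in scores) and sum(scores) >= TOTAL_MIN

print(clears_bar([4, 4, 4, 4, 4, 4]))  # True: every grade >= 3, total 24
print(clears_bar([5, 5, 5, 5, 5, 2]))  # False: one metric below the bar
```

The nice property of this rule is that candidates never compete against whoever happened to apply the same week, only against fixed standards.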
Useful info. One grain of salt for reading these reports, though — no company is going to hire random people just to see how well they do so they can report the data. So there is a minimum filter you have to pass even to be included in the study. Our company once did a study and found that having a mixed performance on interviews correlated with strong job performance — not because doing poorly on your interviews makes you a good candidate, but because if we hired a candidate *despite* their doing poorly on some interviews it meant that there was something really extraordinary about them; most people with mixed performance didn’t get hired at all.
Much of interest in this update, thanks. I’m especially surprised by the change to structured vs. unstructured interviews, which are now identical in their predictive validity (when not combined with GMA). Correct me if I’m wrong, but doesn’t that negate accepted wisdom, as stated forcefully e.g. by Kahneman?
(“A vast amount of research offers a promise: you are much more likely to find the best candidate if you use this procedure than if you do what people normally do in such situations, which is to go into the interview unprepared and to make choices by an overall intuitive judgment such as ‘I looked into his eyes and liked what I saw.’”)
I think the fact that the SAT is explicitly an admission test plays a role too. If someone has a really good SAT score but chose not to go to college, that would raise real questions about why they made that choice. Maybe it shouldn’t, in the abstract, but given that our culture does place such a value on credentials, an employer might be tempted to wonder why the candidate wasn’t willing to play the credentialing game the way so many others are. And that’s doubly true when the candidate has an SAT score high enough to get them a free education at a good school.
Kinda like facial tattoos/piercings. Nothing inherently or morally wrong with them, just a question of aesthetics. Except it is broadly understood that even a “tasteful” facial tattoo is an obstacle to employment, so any candidate who has one has to justify why they chose to buck convention and common sense in favor of personal aesthetics.
See also: why I have to wear more than a bathrobe in public.
Yes. The higher a student’s IQ, the easier it is to graduate from college. So given that you didn’t go to college, disclosing that you have a high IQ sends a negative signal as to your diligence.
A rebuttable one–if you dropped out of college to do a startup, it doesn’t look so bad. If you dropped out of college to concentrate more fully on your pot smoking and video game interests, yeah, it looks pretty bad.
So couldn’t people who drop out of college to smoke pot and play video games claim, when they eventually look for a job, that they were trying to start a company?
A college degree signals you’re willing to do something expensive, sometimes completely unnecessary, in order to just get along in the system. You’re willing to bend to arbitrarily set standards. It basically signals that you are part of the middle to upper class, and not part of the backward, uneducated rebels.
That’s mostly why people do it in established fields – to prove you are part of the ingroup and not the outgroup. Fields that aren’t yet established, like software engineering, mostly reward expertise and are the least likely to require degrees, because a degree isn’t yet recognized as a corollary of that expertise.
In Software Engineering the signaling is not yet established because there is not a dominant ingroup (yet). Ingroups are still fighting each other but you can be assured there will be a costly secondary signaling behavior eventually.
good point. also computer nerds have historically tended to be part of the outgroup socially – think of your programmer stereotypes – bad social skills, bad body hygiene habits, survives on pizza, weird interests, dislikes authority/rules, etc. etc.
I can’t speak for other fields, but my own CS degree, while expensive, was absolutely necessary. I wouldn’t be nearly as effective without it. I think it is a mistake to lump all college degrees into the same bucket; in addition, you have to always keep in mind the fact that most people are not genius-level polymaths who can learn to do anything they want to in a week. Some of us are regular Joes who do require schooling.
Yes, it quite irritates me to constantly be told that I got nothing from college education, when my one regret about college is that OO wasn’t part of the curriculum at the time. I still think much more procedurally and less object oriented.
But the grounding I got in solid programming concepts has served me very well in my career.
Yes, it’s good. I learned a lot as well. But I can’t imagine a doctor saying, well, I don’t have a degree but I learned a lot online, and then I did a residency. But you do see people without CS degrees just learning online and being great. And it’s not like it isn’t possible to learn everything needed to gain entry to both fields without a degree…
Programmers don’t get sued for malpractice if e.g. it turns out they wrote Heartbleed into OpenSSL.
Software is probably the most failure-tolerant area of STEM. And perhaps justifiably so; it is in most cases easier to patch a buggy app than a crashed plane or a dead patient. Most software applications are non-critical consumer goods with no safety impact. And I have to believe that this factors strongly into the software industry’s unusual willingness to hire people on the basis of fizzbuzz, a few open-source builds, and talking a good talk. So I would not expect this practice to be nearly as successful, nor generally adopted, in other STEM fields. Or in medicine, accountancy, and the like.
That doesn’t mean that a college education is the only possible path to success or even entry-level employment, but I have yet to see anyone propose an alternative for engineering that would be generally suitable.
First year of the CS degree, for me, was clutch. Remaining 3 years were not useful.
You only find out your score on the bar exam if you fail. If you pass, you are just told you passed.
At least in New York, you get your MBE scores even if you pass. Never had anyone ask about them though.
Yeah. Don’t know if this varies by state but that’s true in Virginia as well.
Prior to 2013, Washington State used to publish exemplar essays for each question from the previous exams, so if yours made it, you could at least conclude that you were among the top entries for that question. So you found out if you failed or (sometimes) if you did really well, but not in between.
I seem to recall reading things about how in theory this was to prevent using the scores for ranking instead of treating the test as a pass/fail. (As someone whose practice area is not covered at all on the bar exam I can’t say I am unhappy with this approach; I’m not sure what overpreparing for it to max out my score would have gotten me.)
The parent comment is false as to many states, including Texas (which puts your score right smack on the top of the letter).
In Oregon, I think they mail you your score, but your passage is published (and lots of people freely assume that if you don’t list bar admission on your resume w/in six months of graduating law school, it’s because you failed on your first go-round).
Not true in the jurisdiction where I passed the bar. My scores were reported to me, but not to anyone else.
I can speak to business schools and the GMAT, which offer some mixed evidence on the question.
The GMAT is like the SAT on steroids, and it approximates IQ closely enough that it’s accepted by Mensa. In terms of explaining variance in applications to b-schools, the GMAT counts for probably a third to one half, along with undergrad GPA and work experience.
For companies hiring out of business schools, a number of them don’t even ask for the GMAT but some rely on it almost exclusively, especially top tier management consulting companies. I think that in my school McKinsey extended interview invitations to almost everyone with a GMAT in the top percentile (>=750) and almost no one outside that range.
I think companies are being quite rational. At the top tier of consulting / I-banking there’s probably a difference between 140 IQ and 110, but for a lot of other non-STEM companies, there really isn’t. To be good at marketing or corporate finance (or my own job) probably 110 IQ is enough and things like work ethic and how well you get along with people/authority start mattering more. For this reason, companies look at who can survive a full term in college instead of SAT/GMAT/IQ.
Every HR guy I’ve ever dealt with complains about filtering candidates, because if they are less than totally prescriptive about job requirements they get flooded with thousands of resumes. Worse, nowadays people know that the “minimum requirements” (3 years retail experience for an entry level retail job) are often there to cut down the less confident applicants, so they end up flooded anyway.
You’re right, IQ 110 is probably enough to work most jobs (almost by definition). But saying “Applicants have to take this pre-screening test and score above 130” would do a good job cutting down the applicant pool. The only fear is that you’d end up with under-employed smart people who might leave you, but that really only happens if you let them get bored, and honestly that’s on you as a manager if you let that happen.
The best point I’ve heard on the prevalence of educational signaling dovetails right with this point that IQ tests are not sufficient for employers. The first big factor is conscientiousness. Intelligence and conscientiousness are even somewhat negatively correlated (http://www.sciencedirect.com/science/article/pii/S0191886903004380), so a measure of only one of those is not really useful in finding the other. But education is trying to separate out people with both those traits. A highly intelligent person who gets bored with college and quits is not a good deal for most employers, and it’s harder to fake years of showing up and succeeding in college. Perhaps even more subtle is the cultural knowledge that one acquires, which integrates a person into the broad culture of educated people. This reduces misunderstandings and tensions on teams, leading to smoother-functioning organizations.
Educational signaling is still bad for society as a whole given the rather expensive gate keeping to the middle class it does, cheaper signaling tools would be nice, but IQ is not the end-all-be-all of employment needs so Griggs’ contribution to this problem is probably pretty limited at most.
Scott, thoughts on this study, particularly pg. 18?
This paper inexplicably fails to mention Scott’s main point – that the Griggs ruling also forbade using education credentials.
It doesn’t. It forbids using methods with disparate impact that are not demonstrably related to the job. Nobody cares about asking for education credentials so effectively it has no impact on that. Lots of people hate IQ with the fiery passion of 1,000 suns and believe it is racist, so you will get sued for that.
Even assuming other countries don’t have similar bans on IQ testing, could their focus on credentials be the result of the rise of multinational corporations, many of whom are based in the US, and who are more likely to have blanket hiring policies covering all regions rather than tailored policies? Or maybe more generally it is just a reflection of US business practices, which are widely imitated.
Also, you mention the SATs as a proxy for IQ. But what if companies are not primarily interested in testing for IQ? Maybe they want to give more job-specific tests but don’t, because they are afraid (correctly or not) of running afoul of the law.
I don’t know how to address your comments on med school though.
FWIW, back in my day, Israel had explicit IQ testing at several levels of schooling; employers were allowed to look at the results, IIRC.
The WSJ/Time link pair seems misformatted?
As an individual who has previously promoted Griggs as a causal explanation of the credential-ocracy, I have to say you make a very good case for it not being causal. I think I have to admit I was wrong about that.
However, I think it is still fair to say that while Griggs might not be causal, the same forces/ideas/theories that inspired the Griggs decision – e.g. disparate impact legal theory – are themselves significantly to blame for the credential-ocracy. I would of course agree that these things are not the only reason for the increasing value of credentials (especially in regards to the examples you have previously talked about in medicine), but it is hard for me to see lawsuit avoidance not being at least one of these reasons. In fact, this is the precise reason that was given why the test results were tossed in the Ricci case.
Regarding the SAT, while most companies might not currently ask for an SAT/ACT score on their application, I would submit that this is likely because they don’t have to: by asking for a college degree and comparing colleges, companies can get an approximate score range and know right away that the prospect is among the x% of the country that takes the placement test. Why risk a lawsuit when you can get just about the same amount of useful information without?
Most people can probably agree that currently, market forces probably do not perfectly align towards putting the entire supply of the most competent individuals into the jobs in which they could do the most good. I am not sure if this can be improved and by how much.
However, before we spend a whole lot of effort trying to optimize that, I would suggest that we stop intentionally making it worse.
Supporting Scott’s point, IBM actually has an “Information Processing Aptitude Test” they give to all applicants. They ask a bunch of hard questions based on deduction and logical reasoning, mostly using numerical series progressions and word problems. There’s not a lot of information on this test, but here are some Quora questions so you get an idea:
Obviously then if some companies want it bad enough, they can approximate some sort of IQ type test anyway.
The McKinsey Problem Solving Test (which I think they’re calling something else now) is similar, kind of like a baby GMAT, I think. I hadn’t taken the GMAT; it reminded me of the quantitative GRE with a whole lot of business terminology added in.
I applied for a job at Fannie Mae in the early 2000s and to get to an interview you had to take a test. From my memory it was kind of a test of logic and math focused on programming. If you were hired you were moved into a training program. I am guessing the test was pretty highly correlated with an IQ test.
I remember that test!
IBM was my first job. I was being hired from my co-op with them, but they gave me the test anyway. I heard I did well on it but not the exact score.
The requirement of a college degree can be a proxy for IQ, but I think it more has the following functions:
1) hazing – all the other people in the organization got a college degree, so they want potential applicants to have one as well
2) showing that you’re willing to jump through hoops and put up with bullshit. If you can’t navigate an unwieldy bureaucracy, you can’t make it through college
3) Because of unprecedented mobility and communication technology, an employer has a large sea of potential applicants. One needs to pare down the list somehow. And very few people will complain if a generic credential like a college degree is used.
I’ve met a fair number of unintelligent people that managed to get college degrees. There’s certainly a floor higher than the general literate population, but it’s not that high.
Yeah, count me as very skeptical of the claim that “someone might very well be a great employee, they just can’t figure out how to write a term paper!” This may be an issue for literal Rain Man-style autistic savants, but man, grade inflation in college is a real thing. Writing term papers is not hard. Navigating college bureaucracy is annoying, but it isn’t hard (there are people who will hold your hand through every process if you just ask) – and the “navigating bureaucracy” skill IS necessary/required for a career at any reasonably large organization.
I have no confidence that someone who literally cannot figure out how to get through a mid-tier state institution with Cs and a diploma would be an effective corporate employee at anything other than the most specific and repetitive task with virtually zero room for advancement.
In the Third World, or at least my local corner of the third world, credentialism is seen as a necessary evil to prevent nepotism and corruption.
Civil service jobs used to hire based on political affiliation, now they are based on a more formal system where a college degree gives you some points, every year of experience in a relevant job gives you more points and so on.
The reason we use college degrees and not something sensible like IQ is that you can more easily cheat an IQ test or forge it entirely, while a college degree is harder to fake. Even if you produce a perfect facsimile of the literal piece of paper that represents a college degree, the country is small enough that someone who actually went to the college you said you went to will be suspicious that he doesn’t remember sharing an Intro to Roman Law class with you, or something like that.
Additionally, college is very cheap (the opportunity cost of not working is more painful than tuition) so that downside does not apply.
Those conditions are somewhat similar to the US 50 years ago. Most elite professions were very insular so important people would know each other (usually at college) and college wasn’t prohibitively expensive. Maybe the problem is just that those adaptations don’t work like they used to.
What matters isn’t the fine print of the decision, but how likely someone is to get sued. Once employers started asking for college degrees but didn’t get sued for it, then other employers knew that asking for college degrees was safe and the popularity of doing so snowballed.
Other countries don’t have the ethnic balance of the US.
Anything is Griggs-compliant if you can prove it’s a necessary qualification for the job. The problem is that proving it is difficult, expensive, and risky. Some companies are big enough to have in-house legal departments and to be able to absorb the risk. Using SAT tests isn’t made infinitely expensive by Griggs, just expensive enough to distort the market; there will be the occasional company who uses them even in the distorted market.
It affects the point. There are reasons why it hasn’t become popular, and risk of discrimination lawsuits is one.
>Other countries don’t have the ethnic balance of the US.
And this makes credentialism a good idea … how, exactly?
My understanding of the law overall, as communicated to me by my company’s labor lawyers, is that it’s illegal to use any hard decision criterion that has a discriminatory effect on a protected group (women, minorities, older people), unless you can prove that (a) the criterion is correlated with job performance, and (b) you’ve selected the least-discriminatory criterion among the set of equally-predictive criteria. So it’s not that you can’t be racist, it’s just that you have to prove that no matter how you cut it, the best selection criterion (in terms of job performance) is racist. I was a bit surprised to learn this: the law could say that you can’t be racist, you have to settle for using the best selection criterion (in terms of job performance) that isn’t racist at all. But the lawyers were very clear: it doesn’t say this. It’s more (dare I say) meritocratic.
This doesn’t just apply to IQ — it applies to everything. If I recall correctly, it would even apply to college degrees, if that really were a hard criterion some company was using. One reason this is hard to enforce is that many companies don’t use hard decision criteria. They collect a bunch of info and make a soft “holistic decision.” Such companies are still on the hook if their bottom-line numbers are discriminatory, e.g. if their “holistic decisions” just happen never to hire women, even though plenty of women apply. It’s very hard for a company to defend themselves in court in such a situation. But it’s a different kind of court case when there’s a hard criterion: if the company did a bunch of work ahead of time to prove that their criterion satisfies (a) and (b) above, then they’re actually pretty safe. If not, then the plaintiff (often the Equal Employment Opportunity Commission, a government entity) can do their own statistical homework, and if the criterion is discriminatory and doesn’t satisfy both (a) and (b), then the company is in trouble.
There are industrial-organizational psychology consulting firms whose whole business is working with companies to create hard decision criteria that satisfy (a) and (b) above. My understanding was that IQ was perfectly in-bounds for this, you just had to do the work to prove it. And often your hiring pool was roughly from the same stratum of IQ anyway, so the residual racial factor of IQ was stronger than its predictive value for job performance. So it might be more trouble than it’s worth, but definitely not illegal outright.
Caveat: I don’t remember specifically, but companies might be able to use some basic, very low-pass filters — which may include years of related experience — without being subject to the same scrutiny. Maybe that covers degrees too, I’m not sure.
I see in the comments here a lot of people acting as though going to college were some terrible chore that signals the willingness of the degree-holder to suffer arbitrary amounts of inconvenience.
But isn’t college, you know, actually a lot of fun?
Mine was! And I didn’t even drink at the time!
I mean, there are parties, there are members of your preferred gender(s), there is way less parental supervision than you’re used to while still having your parents probably help out financially, there are sports, there is a lot more free time than you’d get while having a job (at least for most students), and I think that in fact lots, maybe most, students enjoy most of their classes.
Obviously, some college students do not enjoy college. But I think most do. It seems way more of a signifier that you have enough money to afford it than that you’re willing to subject yourself to horrible drudgery.
Oh, yeah, college was a heck of a lot of fun. Also, very expensive. I think I had ours paid off within a decade, but the price has only grown in the fifteen years since I graduated.
Does it make sense to take a pretirement when poor? Does it make sense to so strictly enforce doing so society-wide? Delay family formation nationwide in order to make the jobs of HR reps easier? I think not, despite, as said, enjoying it a lot.
Sure, I’m not suggesting that it makes a ton of sense (though there is something to be said for having fun when you’re a kid and not in your 60’s). But I’m talking about statements like this:
(Several other people in this thread and the previous two have suggested similar things.)
I mean, that doesn’t really convey the calculus involved. It’s not a chore (for most people). It’s fun! And that means that a college degree doesn’t signal much willingness to “sigh and jump through the hoops,” unless the hoops are “four years of mostly-paid-for time of high leisure, lots of time around a peer group of people eager to experiment with much less restricted sex and drugs than they are used to, and, again, classes that I think most of their attendees mostly enjoy.”
This does not seem true. And it is only going to get worse: as a greater fraction goes to college, more attendees will either not graduate (but still have to pay for it), take 5–6 years to finish a four-year degree, get less aid, or fail to budget as though they were on borrowed time; and with more degrees issued, their value as a distinction will go down, meaning the pay-off will decrease.
If college was free, then it would merely be inefficient, perhaps an inefficiency society wants to indulge as a gift to the youth. But it really isn’t, and like medicine, asking the government to throw large amounts of money at people to make it cheaper seems to have had the opposite effect.
The hoops are graduating from a rigorous course of study only loosely connected to your future career, a scholastic feat that requires a level of concentration and intellectual prowess that few are able to attain. But that would be bad for business.
I didn’t see anything in a quick pass through that Wikipedia article which suggested that 50%+ of the cost of college is borne by students, though it was long and possibly I missed something.
I feel also as though you are imagining that I am arguing “college is an efficient use of resources,” or “making everyone go to college is a great idea,” or “the government should pay for more of college.” I am arguing none of those things. I’m saying that asking for a college degree does not get you a hide-bound rule-follower who is willing to do extremely un-fun things.
True, and I addressed this more in an edit. Basically, the rule-following isn’t in attending college, but in attending to college. Go to class on time. Learn how to cite sources in APA style. Do so in a ten-page paper about an unfamiliar subject in three months. Etc. Being able to do so even in the midst of the distractions you point out is all the more daunting, in fact.
Okay, this is a literal meaning of “mostly-paid-for” that I am not prepared to refute; instead I’ll amend that to say that just because a cost is borne equally or more by a third party does not mean it isn’t ruinously expensive or a net harm (for many). I see now we’re probably in agreement on this, but I initially got a different impression because your argument appealed to the positive experience college is for most, putting the drawbacks aside.
I don’t think that a typical college student has a great deal of difficulty achieving graduation on academic grounds. I mean, some of this is kind of “everything selects for everything.” Yes, if you graduated from college, that does probably mean that:
a. You aren’t the kind of extremely hard-core slacker who has extreme difficulty doing literally anything besides playing video-games and smoking pot.
b. You aren’t so antisocial that you constantly get into fist-fights.
c. You aren’t in the bottom 5% by general intelligence.
And a long list of other traits besides. But these are cheap traits to test for, I don’t know that college selecting for them creates a pressure for “college for everyone” that we see today.
That’s partly because of admissions offices doing their job and making sure the criterion of “typical college student” still leaves some children behind, and partly grade inflation and dropping of standards. If college becomes as ubiquitous as high school, it will cease to be useful as a selection criterion and something else will have to arise to take its place in that regard.
Your category a does describe one of my good friends from college, minus the pot. Nice guy, even kind of bright, couldn’t bring himself to class on time. He might still be attending somewhere.
Some of us went for STEM degrees.
I have a Bachelor of Arts in Computer Science.
And in any case, the bulk of the credentialism doesn’t require an engineering degree.
Seems like we have similar degrees but apparently very different experiences. I’m honestly curious here. I’d really appreciate answers to at least a few of these questions (although there are a lot and some of them may be personal).
Was your GPA in the low, average, high, or very high/honors range? Did you generally try to take many challenging courses you thought would be most interesting, or “balance” hard courses with ones known to be easy? Did you have to (or choose to) work during college? Would you say your university was “pretty reputable”, “medium-level/not-very reputable”, or “unranked”? At graduation, did CS classes make up more or less than 45% of your credits?
Probably even more important: would you describe yourself as probably average intelligence, probably above-average, or probably very above average? Would you say your time management and study skills were mediocre, average, or great?
Finally, during college did you have any particular plans for post-graduation/dreams that potentially would depend on your college performance?
I graduated from Williams College in the late 90’s.
I had a GPA in the like 3.4-3.6 range I think? Not 100% sure. It was enough to constitute “cum laude” but not “summa cum laude.” I did not have the best grades in my department or my class.
I took pretty much all of the available CS classes at the time, which constituted usually about 50% of my classload (counting math requirements as CS).
In so far as you can reduce such things to a single scalar, I’m very intelligent. My study skills were mediocre at best. I intended to work as a software developer post-graduation, and had no plans to do grad school.
But more so, this is not based on just me. It’s also my observations of my schoolmates, and my ongoing contact with my high school friends through college, and my sister’s college experience, and my discussions of college with my colleagues in my work life. At which point we’re talking about a much broader subset of people. People who went to big state universities and tiny liberal arts colleges, people who went into technology and people who got “soft” degrees and brokered them into white-collar jobs not particularly related to their area of study. I mean, it’s not scientific, and it’s definitely not an unbiased study, but I was not a uniquely and strangely happy student.
No, no it is not. It is very very much not fun. Not suffering-filled, unless you are unlucky enough to have anxiety or depression at the same time. But definitely not fun either.
(context: Acquired 4-year engineering degree that most people take in 5 years, in 3 years instead)
College is not a party, it is a loop of “wake up, go to classes, go to more classes, eat food, do homework, do more homework, do even more homework, have a bit of free time at the end for aimless computer browsing, sleep, repeat” for four months, and then you repeat the loop. Having a full-time job is *enormously* more relaxing than college, and offers far more social opportunities and time for partners.
In my experience, the people who enjoyed college in the way you are describing were almost always the people who were kind of bad at it, or who dropped out in a year or two. For the people who actually made it through with a STEM degree in a reasonable amount of time, I heard far more stories of stress and suffering than stories of parties.
Sleep, Studying, Socializing. Pick two.
Yeah, see, I think that you’re objectively wrong for the very solid majority of students (like, probably between 70-90%).
This isn’t to dismiss your experience with college — it is your own. But if you imagine that you can generalize your experience, then I think that you’ll rapidly come to extremely incorrect conclusions about what happens if, for example, you make a college degree mandatory for your employees.
yeah, but you got an actually relevant degree which should still retain its relevance regardless of whether or not credentialism is in play
however, people who get degrees only as a credential and then go on to a job that doesn’t use the knowledge they acquired mostly have fun
problem is that they pay a lot for it, or their parents do, or their loans do, and then also everyone has to go because everyone else is going
I hated college. It was even dumber than high school. The people were worse, the classes were less challenging and less interesting, and there was constant judgement.
I have found my working days to be the most pleasurable. A large part of this is due to NOT having to deal with people my own age, as well as people my own age maturing somewhat.
Like, my Wife has young siblings. They are total morons. One missed filing his tax return for years because he just didn’t get around to doing it. Another shoves her dirty dishes into cabinets so no one can see them… her kitchen doesn’t have flour, or corn starch, or sugar, or any other staple, but by God it has a bunch of silly little hipster knick-knacks.
My impression, based on my memory, my Wife’s memories of her college students, and the various stories I still hear, is that the vast majority of college students have this gross level of irresponsibility.
So, no, I really, really disliked college. I didn’t even pay for mine and I still wish I didn’t have to go.
Additional note: I may not be average, I had rather severe depression in college. I am confident some of that was augmented by how much college sucked, though.
Some aspects of college were very fun. The best social life I’ve ever had was there, I made a lot of friends and seeing them often was easy, the median intellectual level was high and people were often fun to talk to, etc. But in other respects, it was terrible. Lectures, homework, and studying added up to a significantly higher workload than I have at work now, it was more stressful, and a lot of it was a waste of effort from a human capital perspective. I didn’t have much money. I was freer from my parents than when I lived with them (which was a significant improvement), but they were still breathing down my neck. I probably have more free time now than I had then – and I definitely have more stress-free free time.
And then there’s the price tag.
Caveat: went to college to study computer science in The Netherlands.
It was a substantial improvement to my life, but mainly by being a substantially less abusive environment than high school.
I am an introvert for whom (young people) parties are not too pleasant.
I went from ~50% women in high school to a single digit percentage in college. Students at my university would go to another (non-STEM) college city or invite students from other colleges over to meet women.
Dutch universities often don’t have a campus and Dutch financing makes staying at home and commuting more cost-efficient. Since I grew up near a STEM university, this was an easy choice.
So no. My parents were never particularly strict anyway (and/or I wasn’t very rebellious).
I did the same sport at college as I did before college (and at the same non-college place). If anything, my sports situation got worse as I aged into a team of other students, who turned out to be fairly flaky.
True, I played a lot of computer games.
In the first years, 50% was math, which was OK, but not particularly enjoyable. It really was a lot like work, with some nice & interesting stuff and a lot of red tape & drudgery as well.
It’s kind of fun how my college experience is the polar opposite of both groups. For starters, I studied math, so nobody can tell me that I studied something easy (though admittedly it was applied math, so we’re the Hofstadters of the science world).
In any case, I had a lot of fun in college. But not because of socializing; college was easily my worst time socially speaking. I didn’t really get along with anybody and spent almost the entire time effectively as a shut-in who only saw other people to work on homework.
No, I liked it because I was genuinely interested in the topics of my classes, and because I still had a reasonable amount of free time after classes that, for the first time in my life, I could use any way I wanted to. So logically, I spent almost all the time I didn’t spend on math on video games, TV series, and reading.
In my opinion, college allowed me to get paid for something I would normally do in my free time.
I also worked part-time as a postman, and this solidified my opinion: being a postman is a chore. Admittedly one of the less bad ones, but I usually spent the entire time thinking about other things and was mostly bored. As a postman, my life revolves around other things; it’s just a way of getting money.
On the other hand, if I do math, I actually think about math; if I come home, I keep thinking mostly about math, etc., unless I get tired, in which case I play video games to ‘recharge’. But I naturally start thinking about it again afterwards, even without getting paid.
For my girlfriend it’s the same, though with psychology instead of math.
Imo, if for so many people the classes in college are a chore, or if they think the time was great because of sex and parties and not because they actually liked the classes, then at least for me that calls into question the idea of college currently embedded in our culture, maybe even the entire ‘college should be for everyone’ sentiment.
That’s the popular stereotype, and the experience of the privileged few. Do you also believe most baby boomers were free-loving drugged out hippies in the 60s?
Is there data somewhere on this?
There’s always been the division between the better-off, ‘go to college because that’s what you do before you get a job in Dad’s firm, have fun and slack off’ types and the ‘go to college because you need to get a job so you work like a beaver and grind away’ types. The Hooray Henrys versus the scholarship boys. The Knights of the Campanile versus “I did four years in Trinity and I had no idea this society existed until I read some alumnus talking about it in a newspaper article in the business section”.
I’m currently in university and have little work experience so I can’t say I’ve been able to definitely make the comparison, but I’d be happy to stay in university a lot longer were it not for the prohibitive cost. I wouldn’t say university has been fun, but I expect work to be even less fun.
Agreed. Even disregarding parties and other things that introverts might not like, you end up meeting enough people that you’ll probably find some you get along with. In my experience, in order to not enjoy university you have to both not enjoy the social side and either be mismatched with the academic level (finding the work too hard, or less commonly too easy) or be doing the wrong course.
Infamous counter-example: Tai’s Method, also known as the Trapezoidal rule. I know this is medical research, which is not the same as clinical practice, but nevertheless I would still contend that basic Calculus is quite useful for anyone with the potential to do research, especially given that the article has well over 300 citations and many appear to be genuine, rather than critical.
That looks more like Calculus I to me than Calculus II, although I suppose that’s a nitpick.
In my three semesters of high school calculus, the first (“Calculus A”; the classes weren’t numbered) was differential calculus and the second (“Calculus B”) was integral calculus, so this is about as close to Calculus II as it was possible to be in my experience.
Oh, I see what’s going on. My high school didn’t break things down by semesters; you took classes by the year, and there was only one Calculus class, which covered differentials and integrals. When I got to college, there were Calculus I and II sequences, corresponding to single-variable and multivariable calculus respectively.
I foolishly assumed this was typical.
Calculus C dealt with infinite series.
Were you in the US? I strongly tend toward the assumption that my high school classes were called A, B, and C to match the labeling of the AP calculus tests (AB and BC). Then again, the same school now offers a pretty different math curriculum in terms of labeling.
In my college, single variable calculus was math 19A and 19B, and multivariable calculus was math 23A and 23B. I didn’t take the single variable classes, but 23A was again differential calculus and 23B was again integral calculus.
(Unrelated tangent: I can see why integrals are covered after derivatives from a logic perspective, but the ordering nevertheless seems kind of perverse to me in that differential calculus is significantly more difficult than integral calculus is.)
In the States, yes.
At the university level, the Calc I and II sequences were broken down into individual courses — I think two each — each with its own catalog number. Calc I and II were the titles of the sequences, not the courses.
Wait, hold up: how is differential calculus harder than integral calculus? The procedure for integration is basically differentiating backwards, which requires more cognitive effort than just differentiating alone. In fact, the actual operation of differentiation is little harder than doing algebra. There are rules, you follow the rules, you’re done. Basic integration is still like that, only you run the solution backwards, which makes it more difficult. Then you hit more complex functions and it’s like trying to be divinely inspired with the knowledge of what the function’s higher form is. If there’s no inspiration, there’s no solution; it’s immensely frustrating. The notion that integration would be easier, let alone significantly easier, is very odd to me.
This reminds me of that woman who insisted to me that division was easier than multiplication. Which was weird for the exact same reason: division is just multiplication run backwards with guesswork. It seems plainly harder. She also said that she did not understand the point of long division until she had to divide polynomials, since to her it was obvious how many times any given number went into another. Maybe there’s some level of math aptitude that makes it easier to do things backwards than forwards?
Differential calculus made sense to me. Integral calculus always felt like I was holding on to only one part of the elephant/platypus hybrid while blindfolded.
Differentiation is easier than integration, but if Michael Watts just studied integration in “integral calculus” and differential equations in “differential calculus” then his comment makes sense.
> Given that the SAT more or less approximates an IQ test, if there were some pent-up demand for IQ-testing job candidates the SAT would be a perfectly good alternative. Given that only the handful of companies in the articles above ask for SAT scores, I don’t think employers are really that interested in IQ.
I thought I read here that there was a change made to the SAT format in the early 90s that had the effect of uncorrelating it from IQ? Am I remembering that wrong?
It’s still strongly correlated with IQ, although it has a lower ceiling and so can’t distinguish among people with sufficiently high IQs.
The test itself was basically the same, but it was renormed, so the same SAT score represents a lower IQ score after the change than before.
The difference is not as dramatic as you might think from comments here: for a given SAT score, the difference works out to about four or five IQ points across most of the scale, if I remember right.
Are you sure? I Googled it a bit, and e.g. Mensa stopped accepting SAT scores after 1994:
Is the renorming you’re thinking of in 1994?
Yes, I’m thinking of the 1994 renorming. Don’t know what problem Mensa has with it, but I have a hard time taking them at their word when they say it doesn’t correlate, especially when it looks like they dropped all the commonly taken aptitude tests about the same time.
I know this is horribly anecdotal, but I’ve had both a professionally done IQ test and a post-1994 SAT test, and the IQ predicted by the latter according to the conversion table I saw is within a couple points of my score on the former.
Hm, okay interesting. I only used Mensa as a source because it was the first source I could readily find that addressed the separation I was thinking of.
Strangely, as the Wikipedia article correctly notes, employers may also be barred from discarding tests that have a disparate impact (https://en.wikipedia.org/wiki/Ricci_v._DeStefano).
The US employment legal regime is a mess – governed by a conflicting tangle of federal and state law which can never quite reconcile the tension between employment-at-will and anti-discrimination. Combine that with the caprice of EEOC and other administrators and the morass that is the civil court system, and you get extremely risk averse HR people.
Scott may be right, however, that it doesn’t make a lot of sense to blame it on the use of college degrees versus tests. They’re literally back-to-back examples in the EEOC compliance manual. On the other hand, requiring a college degree is easy, while there have been several high-profile cases where employers took a lot of effort to comply with the law on tests but lost anyway. Ford spent a couple of decades dealing with litigation after introducing, in 1991, a test even the EEOC admitted was valid, and ended up paying out over $8 million. For a lot of companies, that’s a risk they can’t afford to take.
Anyone have a feeling for what proportion of companies check to see whether someone who claims to have graduated from a university actually graduated and that it was a real university?
MIT’s dean of admissions, of all people, falsely claimed a few degrees on her resume and didn’t get found out until nearly 30 years after she was first hired and a full decade after she became dean.
I only got documentation on the background check from one of the companies I’ve worked with, but the others might have done it and not told me. Kinda doubt it, though.
Most large companies use a background check firm that will at least verify claimed jobs and credentials.
The case of medicine is somewhat unique because of the AMA. Useless pre-requisites are a means of reducing competition.
I really wish you would consider the possibility that there isn’t anything “wrong” and that trying to screen job candidates for jobs is actually a hard problem.
If you think this isn’t true, consider not describing what job needs to be done and just asking for people to give you their SAT scores the next time you have a job you want done.
But the current difficulty of screening for jobs, or whatever the root cause is, is creating a situation where college costs way too much and everyone also has to go. (It may not be the cause of college costs, but those costs exist and exacerbate the problem of everyone having to go.) And yeah, not everyone has to go, but you get what I’m saying – a ton of people go to college and spend a ton of money, all for basically no reason, often at the state’s expense and even their own, due to financial aid and student loans (Google says student loan debt is at $1.4 trillion as of right now). I feel it is reasonable to believe there is a better solution to the problem than this. Maybe I’m wrong about that.
Sure, I think the argument that we should try to find more solutions to the problems being solved has merit. It seems unlikely that the classic liberal arts college education is the best-suited solution for all of the 70% of HS graduates who go on to college today. College is probably the hammer constantly in search of nails in the US. My recollection is that Europe had a much more developed trade school route, and I believe it still operates in this manner, at least to some extent.
It might be nice if we could bring back something that looked like apprenticeship as well, if it could be non-exploitative. But of course the modern business environment is too competitive to do very much in the way of training on the job, certainly not for skills that take years to learn.
But that’s a really different statement than the one Scott seems to be making.
Sounds almost as stupid as not describing what job needs to be done and just asking for people to give you their college degree.
Anything’s dumb if you use it in isolation; the question is why some criteria are being factored into the vague holistic decision and not others.
Yes, I agree.
And absolutely no one does that.
Which is why crying “credentialocracy” and darkly intoning that “something is wrong on a deeper level” is missing the forest for the trees, encouraging a paranoid mindset that is unwarranted. Fitting people to jobs and jobs to people is hard, and more than that, it’s competitive. We know quite a bit about how people behave in a competitive environment, and it doesn’t lend itself to solutions like “take this IQ test” or “take this general assessment of job ability” and now everyone knows everything needed to properly assign you your job.
I don’t think Scott is calling for exclusively using IQ tests in job interviews. I think his point is more that it’s suspicious that IQ testing isn’t particularly common, given that it’s highly predictive and theoretically legal.
You can also use an IQ test or some other test in conjunction with other common filters. There’s nothing saying you HAVE to rely on an IQ test alone.
I have seen what amounts to IQ tests as part of online job applications. And the fact that they were a little sneaky about it makes me think businesses are still afraid of Griggs or related law on some level.
It was sort of like:
If a train leaves Scranton at 11:45 pm…?
In “Targeting Meritocracy” Scott also seemed to be rejecting IQ tests (“Remember that IQ correlates with chess talent at a modest r = 0.24, and chess champion Garry Kasparov has only a medium-high IQ of 135”).
The thing about chess is it’s incredibly easy to compare “candidates”. That’s what Scott is missing, an understanding that, even granting that he has identified a problem, he hasn’t even had a glimmer of grappling with why the problem exists.
At some point the shoe is going to be on the other foot (actually, it very likely has been on the other foot as he hired a lawyer at some point) and Scott will have to decide which candidate out of a pool of candidates he prefers. And then he might start getting down to the brass tacks.
Sure, screening candidates for jobs is a hard problem. Addition of more and more costly screens that don’t work all that well doesn’t seem to be a good solution.
If the ability to do the job is mostly dependent on cognitive abilities and it’s not a single job but a whole series of them that I expect anyone I hire to have to learn on the job, _that should work_, or so sayeth the research.
I will posit that for almost any job where the requirement is “has any 4 year degree” (as opposed to “has an engineering degree, because this is an engineering job” or “has a nursing degree, because the job is nurse”), a 4 year degree should not actually be required, and probably has little direct bearing on the effectiveness of any employee of reasonable general competence.
Can anyone think of counterexamples? I really can’t – almost anything for which any degree will do seems like the sort of thing where a few months of on the job training are all that’s really needed, and after that training (which the college grads will also need) the college grad and non grad are indistinguishable except in the particulars of their break room stories.
Certainly, I can’t think of any counterexamples where “any 4 year degree” is more qualifying than “high school + 1 year in related role”.
I made a similar point in the other thread. A college degree, any college degree, is a bizarre job requirement.
It would have been a bizarre requirement 50 years ago. Thanks to the decline of K-12 education it is not anymore.
Basically, an “any college degree will do” requirement is proof that you are literate.
I’m a college professor. Why isn’t there a system where for $1,000 I spend three hours testing a person to see if she is smart enough, in my opinion, for white collar work? I could certainly tell if this person was literate and knew basic math within three hours. Plus, under this system, I could occasionally get re-certified by some trusted and prestigious organization.
I’m not a professor, but, at one point, I worked as a phone screener. My job was to test people on the phone to see if they had any programming skills whatsoever; if they passed, then their recruiter — my employer — would forward them on to the next phase of interviews with the client company. FWIW, my job paid well, but was absolutely exhausting.
OMG! And you survived? Phone screens are horrible — I would quit and become a hermit if I had to watch any more professional programmers who can’t program their way out of a wet paper bag. They make me do maybe 3 a month and that’s almost more than I can handle.
You are a brave one.
That would cost companies $1,000 per applicant, which is presumably substantially more than (pseudo) checking for a college degree.
I think the idea is that the person takes the test with James once, gets his certification in writing, and then that serves in place of the college degree when they’re applying to jobs.
So it would be a cost of $1,000 to each individual person, which is far cheaper than college at five figures per person.
It’s like someone said upthread: it’s a filtering mechanism. People didn’t all need degrees to get a reasonable job because you could get a job working on the assembly line in the big Box Factory in town.
Now those kinds of jobs are gone, so everyone is applying for anything that isn’t “working at a fast-food restaurant”. If you have ten job openings and three hundred applications, you need some way to cut down those three hundred to the top “okay these are the hundred we will actually look at to make a selection for short-listing for interview” and “Preferred: degree” in the job spec on the advertisement is one way of doing it.
I am not even sure it’s a terrible filter. I see some research saying the degree is uncorrelated to job performance.
However, in my anecdotal evidence, the kind of people coming out of 4-year schools are really head-and-shoulders above non-degreed employees, and even 2-year or off-brand degreed employees.
My last job was running most smoothly when we had plenty of degreed temps filling white collar roles. They barely did anything, because they were lazy temps, but they were still more productive and knowledgeable than the non-degreed employees.
Then the economy stopped sucking and the company couldn’t hire degreed temps for $20/hour (which seems expensive until you realize the staff employees were clearing $60k+ with fringe benefits that probably added another 12-13k to their salary).
Were I to run my own company, I would try to money-ball by hiring people exclusively from MMORPG guilds. Preferably entire guilds. Were I to manage in a company, I would probably insist on interviewing 4-year candidates, just out of my own experience.
Yeah, so long as the economy sucks (or is a “buyer’s market” for employers), it’s going to be hard to avoid the filter mechanisms because those allow one to winnow the oversized pool.
That is, so long as jobs have plenty of applicants with degrees from schools which demand high SAT scores, there’s no need for them to ask for SAT scores because they can winnow even better by simply knowing your school.
I think the no. 1 key for fixing all this is eliminating subsidized college loans. Probably not going to happen, politically speaking, but so long as the government makes it super easy to borrow the money to play the costly signal game, it’s going to be hard to get people to stop playing it en masse (though I think some cracks are already beginning to show).
College graduates are definitely better candidates.
The question is if grinding everyone through four (and increasingly more) years of college is a useful way to filter candidates.
In this scenario they are probably offering too much money. Offer less and they won’t be so overwhelmed.
It might be a different story if you actually had a reliable mechanism for ranking them, but since it sounds like the employer in this hypo doesn’t — just offer less.
Then you just end up with the shitty candidates as the best ones drop out.
The goal is the opposite: find the best candidates in the pool of people who are willing to work for $X.
In this scenario they are probably offering too much money. Offer less and they won’t be so overwhelmed.
Lemme just pick myself up off the floor here. “Paying too much money”. Oh yeah, that is definitely the problem, right!
First, the scenario as described applied most stringently during the 80s recession in Ireland, where the jobs on offer were very scarce and most certainly did not offer too much money. People were so desperate for work that yes, hundreds of applications were received for any kind of job, so employers had a surfeit of choice. (This got me caught in the double whammy: get a qualification* because we’ve all been warned if we want a job we need a qualification; there are no jobs going in that field; apply for any job going; get told ‘yeah, but you’re over-qualified, if we hire you, as soon as you get a better offer you’ll leave’ – well, there are no better offers going, why do you think I’m applying for a production line job???)
During the 90s, the various governments and the IDA (the semi-state body charged with attracting companies to set up in Ireland) pitched a campaign about our highly-educated young workforce – with the corollary “and you can pay them way less than you would have to pay college graduates in the USA, so please take advantage of our generous corporate tax regime and set up business here!” Indeed, even today the emphasis is still on “our young, educated workforce with a lot of college graduates”, and further education is seen as the way to get a job in the modern economy, where low- or unskilled jobs are shrinking fast and the unqualified will be left on the scrapheap.
Now, during the boom times of roughly the end of the 90s up to 2008, as the economy soared, employment rates soared with it, and this time employers had the opposite problem – a lack of labour. So wages went up, and they would employ any warm body that walked in the door.
Come the crash in 2008 and we’re back to the Good Old Days of no jobs, emigration to the UK/USA/Canada/Australia, and back to “100s applying for single vacancy advertised”.
Things are improving slowly, but we’re by no means back to the boom times again. There’s a salaries guide put out by a recruitment firm here – you guys can tell me if you think the IT/engineering etc. wages are comparable to what you’d expect (remember to convert euros to dollars). In March of this year, the embargo on public sector recruitment was lifted and there were jobs in the local authority advertised (the very local authority where I’d been short-term working, in fact). I had applied for this before I got my current job, but I decided to go along to the testing day anyway (I’d never done this particular test, so it was out of curiosity).
Now, I don’t know if $26,773 per year for clerical/administrative work is “too much money” by American terms (looking at this site it’s about or just slightly more than the average rate); the jobs were on the Grade III entry level (promotions have been held down for years as well as embargoes on recruitment, so there are a lot of women in their 30s and even 40s still stuck on Grade III level or even Acting Grade IV which is Grade III pay but Grade IV work and responsibilities). New hires start on the first step of the pay scale, which is the $26,773 figure mentioned above.
I asked the guy invigilating the test about how many applicants they had; they were doing two days testing in my town, then another day down in the city. He estimated they’d be testing around 300 applicants (and these are just the applicants called to do the test after sending in the initial application form). I asked him how many jobs were going. About ten.
Yeah, it’s the sky-high wages that are to blame, sure enough! Agreed, it’s an attractive position, but a lot of the people applying were like me – on short-term contracts, nobody hiring permanently, most of the big employers in the town shut down over the past twenty years, and so you apply for everything that’s advertised.
*Don’t have any kind of a degree, this was vocational training after leaving secondary school
It’s true you’ll on average get a worse candidate, but only slightly, and you’ll save a significant amount of money. For example, if you have 300 applications when offering $20,000/year and would get an additional 50 applications by offering $25,000/year, you’ve spent an additional $5,000 a year for only about a 14% chance (50/350) of ending up hiring one of the better candidates attracted by the higher salary.
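The arithmetic in that example can be checked with a short sketch. All figures are the hypothetical ones from the comment (300 applicants at the lower salary, 50 more at the higher one), not real hiring data:

```python
# Hypothetical numbers from the comment above, not real hiring data:
# 300 applicants at $20,000/year; 50 more would apply at $25,000/year.
base_applicants = 300
extra_applicants = 50
extra_cost_per_year = 5000  # the $5,000/year salary difference

total = base_applicants + extra_applicants

# If the eventual hire is effectively a random draw from the pool
# (i.e. the employer can't rank candidates well), the chance that the
# hire is one of the marginal applicants the raise attracted is just
# their share of the pool:
p_marginal_hire = extra_applicants / total

print(f"share of pool from the raise: {p_marginal_hire:.1%}")  # 14.3%
print(f"extra cost per year: ${extra_cost_per_year}")
```

The key modelling assumption is the “random draw” step: if the employer had a reliable ranking mechanism, the better marginal candidates would be picked far more often than their 1-in-7 share suggests.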
You seem to be coming at the question of how much a company should pay for a given position from an entirely different set of premises than I am. To me it is a market issue, not a question of what a particular position is somehow inherently worth.
Again, assuming that you don’t have a good way of sorting and are essentially picking an applicant at random.
If you do have a good method of sorting then you should use that instead of arbitrarily throwing out a bunch of applications because you got too many.
I don’t think that’s a realistic model of how the labor market works – you need to account for “efficiency wages.”
To me it is a market issue, not a question of what a particular position is somehow inherently worth.
Brad, what I was getting at is that Ireland is not America and there are simply not the same number of “just go out and get a job” opportunities, or indeed outside of say, Dublin, the “get a job with a big tech/finance company”.
So over here, it’s not “oh, I see Big Co is offering a job at $5 grand more than I’m earning here, I think I’ll apply”; it’s “there’s a job going and I need a job, okay, I’ll apply”. So screening on things like “have you a degree” cuts down the pile. And I am willing to bet US companies do the same, if they’re sufficiently large/prestigious that everyone wants to work for them. (I see, for instance, the impression that Google is hiring everyone constantly with no limits on how many programmers they need – plainly that can’t be so, there must be some limit of “okay, we want/need to hire X number this year” – but the impression given is “just sign up with a coding boot camp and Google will take you on”.)
As I said, Ireland is not back at the good days of the Celtic Tiger when people did have expectations of “I will get a rise this year and if I don’t, I’m applying to this other firm”, so it’s not wages being offered are higher than the market rates that is driving large numbers of applications for jobs, it’s scarcity of jobs in the first place.
If you do have a good method of sorting then you should use that instead of arbitrarily throwing out a bunch of applications because you got too many.
I agree with your point here, but isn’t this the whole point of this entire discussion in the first place? Companies don’t seem to have a good system for sorting applications and do rely on “do they have a degree?” as a criterion to throw out excess applications before considering the ones left.
My post above got a little messed up somehow. The last two paragraphs should have been below the @Aapje paragraph and don’t really follow what I was saying to Deiseach. Sorry about that.
In any event, to respond: my point is that if you have so many people desperate to take a job – any job – that they flood your office with a ton of resumes and you don’t have any good way of filtering them*, then that implies that you could probably pay less and get about the same quality of employee. Maybe you think it would be morally wrong and taking advantage. That’s not what I was or want to debate – only whether it is true or false.
notpeerreviewed references some literature that posits reasons why paying above-market wages might make sense, inter alia: to avoid shirking, reduce turnover, and attract productive employees. The first two I have no quibble with, at least not in this conversation, but the third I do. If an employer has little or no ability to distinguish among the hiring pool, then I think this effect will be correspondingly small.
* I am assuming here that an ‘any college degree’ requirement isn’t a good way of filtering. Some above argue it is. That’s a separate issue.
What people who hire pretty consistently complain about (at least in IT) is that they get large numbers of applicants, but fairly few of those have the minimum quality they ask for. For low-level jobs this tends to involve things like ‘shows up on time’, and for things like programmer jobs, ‘can actually program something elementary.’
This makes sense, since the low-quality people mostly keep not getting hired, while the quality people tend to get picked out far more often. So the applicant pool would logically be much worse than average.
You seem to model the labor market as a commodity market, where you can simply lower the price you want to pay and then supply will lower as well. However, the less transparent the quality of the offerings are and/or the higher the cost of determining the value of the offering, the less sense it makes to try to pay less, as that simply will greatly increase your acquisition costs.
This is especially true since false negatives and false positives play a large role.
For example, imagine a pool of 1000 workers with 100 of sufficient quality. Your initial pick selects 5 of sufficient quality (5 are false negatives) and 5 poor workers (false positives). Then you spend $$ to test each of these candidates, which also suffers from false negatives and false positives, so you may end up thinking that 5 workers are good enough: 4 good workers and 1 bad worker. So then if we assume that the final pick is random as to quality, you have a 1 in 5 chance to make a bad pick (which has high costs).
Now imagine you are willing to pay less and get a pool of 100 workers with 4 of sufficient quality, because quality workers will disproportionately choose not to apply. Because the pool is worse, false positives will form a larger part of your initial pick. So your initial pick selects 2 of sufficient quality and 8 poor workers. Then you spend $$ to test each of these candidates, which also suffers from false negatives and false positives, so you may end up thinking that 4 workers are good enough: 2 good workers and 2 bad workers. So then if we again assume that the final pick is random as to quality, you have a 1 in 2 chance to make a bad pick (which has high costs).
The higher the costs of the testing and the higher the costs of making a bad pick and the higher the chance of false positives and negatives, the more sense it makes to get a better initial pool by offering more salary (and the more sense it makes to do cheap initial filtering that is fairly effective, like asking for a degree).
I looked at the financial professional jobs, because that is what I am most familiar with. Those salaries looked very competitive with US salaries. It doesn't look like Ireland is suffering too much from low salaries. Although, having looked at many way-off-base salary guides in the US, I don't know how much one can really trust such guides.
I found one amusing thing in the guide. One of the positions they listed was a SOX accountant. SOX refers to the US law Sarbanes-Oxley. The US really does penetrate deeply into foreign markets.
You seem to be assuming that the best candidates will apply even with a low opening bid. This is not the case, so by lowering your bid you’ll be left with only the dregs to choose from. Even if you’re poor at sorting, one gem among many at least gives you a chance to find it.
Sarbanes-Oxley applies to foreign companies listed on a US exchange as well as foreign subsidiaries of US companies. Given that Ireland has spent decades attracting US companies with low tax rates, the demand for people with SOX expertise seems reasonable.
For the legal case at least, they do use a test. What law school you went to is basically isomorphic to your LSAT score. Law schools admit a very narrow range of LSAT scores, with the score having ~90% of explanatory power for admissions decisions. Candidates typically attend the best school that admits them. Many law schools don’t even bother giving grades (Yale). Hiring choices are based on internships at the end of someone’s first year in law school (even though they have 2 years to go) and hiring for summer internships is done before 2nd semester grades are known. Basically, the whole system is a thinly disguised filtration by LSAT scores, which are thinly disguised IQ tests.
I read one of those books about law school admissions and it appears that your explanation is completely contrary to the facts.
Law schools admit based first on the prestige of your undergrad school and secondly on the basis of your undergraduate GPA.
They then throw out any applicants without normal glowing recommendations. Being able to find a few professors that liked you is a basic skill in the profession, apparently.
And only last is LSAT score considered to be important. It’s much lower weight, at least in prestigious law schools, than the other three major parts of an application.
I don’t know anything about law school admissions, but the top schools tend to get the people with the top LSATs. I assume that’s because they pay attention to the LSATs.
This article lists the top law schools by median LSAT and also shows the US News ranking of the law school. They seem to track very closely.
Anyone know more?
What you describe is almost the exact opposite of how things work. It’s so deeply wrong and backwards that I’m kind of boggled how you could wind up encountering a source so misguided. Also boggled about how it could be written, published, recommended, etc. I just feel like at some point in the filtering process something should have stopped that outcome. I wonder if the book was just some self-help guru literally making stuff up, publishing at a minor publishing house, and selling it to suckers who don’t know actual lawyers / law students. What was the title of the book you read?
My source is knowing hundreds (or at least high 100s) of people who have applied and been admitted to top law schools (some I know professionally, some I went to HS with, some college with, some I met at other schools through clubs / extracurricular, some I’m related to, some friends of friends).
Edit: just considered the possibility that we are talking past each other or are looking at different parts of the distribution. The people I know and the experiences I draw from are all people going to top 10 (realistically top 7) law schools, all in the top 3% of LSAT score distributions. It’s possible that the process at one end of the spectrum looks very different than it does at the center of the distribution… in which case we could both be correct, because we are talking about different things.
I hereby dock myself 2 internet points for this comment. It’s not the worst, but the tone and manner of the pre-edit version are suboptimal. I would have been more friendly and polite in real life.
This doesn’t match my experience. Most schools do give grades — Yale is a tiny school and an exception to a lot of rules. The critical summer internship is after the 2L year. Selection for it is done in the fall of the 2L year. First year grades are available and more or less determinative in getting first round interviews.
What is true is that the school attended has a huge impact on how far into the class those interviews will go.
This is misunderstanding the way legal cases work. Yes, the headline case says that diploma requirements and IQ tests are both suspect, but as a practical matter diploma requirements don’t really get the same scrutiny as IQ tests. This case has been applied tens of thousands of times in the last 50 years, and the general sense of the legal community is that your company can get away with diploma requirements in many cases, but that IQ tests are going to be a problem. So when an HR lawyer says “you can’t do IQ tests because of Griggs,” you aren’t correct to reply “well, Griggs is suspicious of diplomas too.” Or at least you aren’t usefully correct.
Griggs offers a test which weighs statistical discrimination against demonstrated relation to job functions. It plays out on a case by case basis. The sense of the legal field as played out by the thousands of cases over 50 years is that the diploma alone won’t get you in trouble. So I’m not saying that it definitely IS the case that Griggs is contributing to over-credentialism, but I definitely am saying that you can’t wave it away by suggesting that Griggs doesn’t like diploma requirements either. It is a matter of intensity.
Here is my take on one large facet of credentialism.
Principal/Agent Problem and coordination between departments. With the rise of Human Resources as its own department, there are certain thresholds that you won’t be able to get past without clearing it through HR.
HR departments like three things: the ability to trashcan a bunch of applicants, the ability to cover their ass if a hire goes bad, and the ability to avoid litigation.
Credentialism immediately lets them screen out a bunch of applicants. So when justifying their time and effort to upper management they get to say things like “we looked through 300 applicants before showing the hiring team the top 10 possibilities”.
HR also wants to be able to cover its ass if a hire goes bad. One of the key ways that hires go bad is when they don’t know enough to even be worth training. Let’s say that for some reason 15% of people who seem suitable for the job without a credential will wash out for those reasons and that 20% of people who seem suitable for the job with a credential will wash out that way. (Note that I intentionally have made the credentialed people more likely to fail at the job–perhaps it is too boring for them). If they succeed, HR will get very little credit. If they fail, HR has a much higher chance of taking the blame. Humans are highly loss-averse and eager to avoid blame. If a credential can help HR avoid blame even 1/3 of the time when someone gets hired despite not knowing enough to be worth training, that will easily erase the 5% difference in successful hires between the credentialed and non-credentialed so far as HR taking blame goes.
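Under the numbers in the paragraph above, the blame arithmetic works out as follows. This is only an illustrative check: the washout rates and the 1/3 deflection figure are the comment's own hypotheticals, not data.

```python
# Expected blame to HR per hire, using the hypothetical rates from the comment.

washout_without_credential = 0.15  # washout rate for non-credentialed hires
washout_with_credential = 0.20     # washout rate for credentialed hires
blame_deflected = 1 / 3            # share of credentialed failures where HR escapes blame

# HR eats the blame for every non-credentialed failure...
blame_without = washout_without_credential
# ...but only 2/3 of credentialed failures stick to HR.
blame_with = washout_with_credential * (1 - blame_deflected)

# Despite the higher washout rate, the credential leaves HR with less expected blame.
print(blame_with < blame_without)  # True  (~0.133 vs 0.15)
```

So even though credentialed hires fail more often in this toy model, the credential is still the blame-minimizing choice for HR.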
Credentialism also plays into litigation avoidance. If your hire does something horribly dangerous and someone gets hurt, the lack of an available credential can mark against you in court (a self-reinforcing dynamic because as the credential becomes more prevalent the lack of it will hurt more). More subtly credentialism can insulate you against getting sued for your illegal racist or sexist programs because in many fields minorities and women are much less likely to have credentials.
So note that even if credentialed people are actively worse, there may be systemic reasons why companies with HR departments tend toward credentialism.
I suspect that lots of companies initially see that HR is screening out very good candidates, so with weak HR departments they make hires anyway. But eventually something bad happens and loss aversion kicks in even if on balance you have gained much more by avoiding the credential than you lost in the bad incident. HR says “I told you so” and most upper management people aren’t going to see the gains.
I think that it matters a lot that degrees are not just IQ tests, they are evidence that you learned certain skills/had certain experiences.
Also, you can’t tell how a lawyer did on the bar exam. Scores are only reported (even to the examinee) if they fail.
That wasn’t my experience in Michigan (1982). I passed AND got my scores, on both the state exam and the Multistate.
That used to be how it was on the Professional Engineer exam, but nowadays they only give you information on how you did if you fail–and that info is pretty scant.
A senior engineer (who took it when they used to tell you how you did question-by-question) said that it used to be that people who just barely missed would find the weaker questions and quibble with NCEES, and this ended up tying up a lot of time and litigation. I imagine with actual lawyers this process was even more painful.
So they went to a system where you can’t know what particular questions you got wrong, but they dribble out just enough information to maybe guide your study for the retest.
I think this just goes back to Risk-Aversion. We’ve become so good at finding holes and plugging them. Any bad hire is a potential cost to the company. But it’s sort of like plugging your snorkel because water sometimes gets in.
I think the problem is that you’re really plugging someone else’s snorkel, i.e., everyone else has to go to college. But at present, you’re not really missing out on that many great candidates by requiring a college degree – instead you’re just screwing the candidates.
Epistemic status: exploratory
There is a difference, as I understand it, and that is that employers are the ones administering those IQ tests, but employers are not the ones issuing university degrees.
This may be important for determining who is responsible in the event of a legal violation.
If an employer issues IQ tests and they are later deemed to be racist / illegal, the employer is liable for this.
If an employer has a college requirement, and these requirements are later deemed to be racist / illegal, there is wiggle room for the employer to say “hey, it’s not me, it’s those colleges with their racist admissions process”
Yes, and the colleges have their own affirmative action shenanigans to justify their inequality of outcome. Because the college has successfully managed to explain the inequality of outcome, and/or is seen as working sufficiently to end it, businesses that depend on degrees can just kick the can down the road (‘why blame us for disparate impact due to demanding college degrees when the actual problem is that colleges are not handing out these degrees proportionally? Take it up with them’).
Adding to the SAT point, I’m pretty sure GMAT scores are still widely used in blue chip fields like investment banking and consulting. Although I believe this has a lot to do with the grade non-disclosure policy at many business schools, I’m fairly sure the GMAT was used a bunch. Places still recommend taking the GMAT in lieu of other exams if you want to be in those fields.
One more point of interest. There is one major company I know of that uses as part of its hiring process a test actually marketed as a test of cognitive ability: the NFL.
Interesting that this is one of the few high paying industries that can’t, in the eyes of the left, be accused of discriminating against non-Asian minorities.
You underestimate the left. It mostly doesn’t because the thinkpiece crowd aren’t exactly the ones tailgating. But if they wanted to, they could. You could break it down by position, or better yet you could use head injury and CTE stuff without the positional breakdown. I bet the whiteness of kickers and punters is enough to show a racial gap in those sorts of things.
Or you take the easy route and look at coaches and managers. That criticism definitely gets leveled, but again it’s a different crowd than the usual thinkpiecers.
I think a lot of people arguing for hiring based solely off IQ tests are coming from a self-serving position – they’re good at taking IQ tests, and bad at interviewing/cover letters/all the other intangible, subjective stuff that goes into the hiring process these days. I say this because I’m like that too, but I’m acutely aware that there are tons of jobs I’d be terrible at.
Also, IQ tests don’t directly measure g. They measure something that’s correlated with g, but also with how good you are at taking tests and (to a decreasing extent, I hope) how “prestigious” your cultural upbringing was.
Are (many) people here actually arguing for that? I think what’s being argued isn’t that IQ tests are a great way to hire people, but that non-STEM college degrees are an extremely expensive (to society as a whole) way of hiring people. If what those degrees screen for is IQ and conscientiousness, why not just test IQ and conscientiousness in some less expensive way?
I’d submit that most people arguing against college are starting from the assumption that college is bad, full stop, and working backwards from there. Hell, Scott thinks, or at least argues from the position, that education and school in general is bad.
In the US, the best hiring criteria is the work-sample test. Outside the US, it’s a work-sample test with an IQ test as secondary data.
@BBA. It is true that my arguments are self serving, but that doesn’t mean they are incorrect. I am abysmal at interviews, but pretty good at tests. Well, my resume and cover letters are usually pretty good for getting interviews, but once I show up in person I always seem to bomb. But the main evidence I use for determining that tests are under-used are from when I was interviewing other people. Once I created my own test, it filtered out a lot of people that would not have done the job well, despite whatever experience or education they had. One time I disregarded my test for a candidate with lots of great experience, and it was a grievous error. It took her several years to understand aspects of the job that my other hires understood immediately. My test was pretty simple math for an accounts payable position. There are lots of people at that level that don’t have an intuitive feel for math, and those folks didn’t belong in the position I was hiring for.
It seems to me that most hiring seems to fit the model of finding the group of people that fit the arbitrary qualifications of education and experience set up for the job, and then one interviews the group to find the person with the best “fit.” I think the “fit” usually just means the most likable person. (I have not been able to project myself as likable.) Even for those interviewers that look for less trivial traits than likability, the interview will still select highly for social skills.
In my opinion, social skills are weighted too highly for most jobs. It needs to have some weight, but for jobs other than say sales, HR, or management, other skills are at least as important. The skills of most jobs will vary widely for people with similar levels of education and experience. Testing directly for those skills, whether or not the test is highly g loaded, would end up selecting better employees. That was certainly my experience when I was hiring accounts payable clerks.
Keep in mind that anyone below the executive level is likely not compensated based on overall company performance.
There’s a huge incentive to have an irrationally high preference for social skills over things that directly relate to job performance – mainly that you’re going to be working with this person every day and you’d probably rather them be nice and bad at their job than a jerk and good at it – or at least, your incentives are structured for you to prefer this.
Edit: And I’ll add that even at the fairly elite white collar level of management consulting and investment banking, MBAs are basically taught that the “airport test” (basically, are you someone I’d be okay with being stuck at the airport with during a long delay) is one of, if not THE most important factors in whether or not you get a job offer. The partners who make the hiring decisions have a pretty clear and direct financial stake in finding the “best person” and STILL often just defer to “which of these guys do I like best”
I think the cost of requiring a college degree is so low to the corporation that they simply don’t care. Because the cost is so high to employees in a lot of cases, the costs should eventually rise for corporations such that they’d actually attempt to find a solution. But right now, a combination of culture and government subsidy have made it so that pretty much everyone who’s anyone goes to college, meaning that the cost to the employee is basically baked-in, allowing corporations to continue to capitalize.
I think you guys need to start differentiating liberal arts degrees (which may indeed be worthless) from STEM degrees (which teach you useful skills). For example, the probability that an applicant for a programming position knows what things like “O(n)” and “hashtable” are increases dramatically when the applicant also possesses a Bachelor’s in CS. Some people I’ve interviewed have argued that, in the modern world, the knowledge of such arcane concepts is superfluous; those people are objectively wrong.
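For readers outside the field, here is a minimal illustration of the two concepts named above (the variable names are mine, chosen for the example):

```python
# "O(n)" vs. hash-table lookup in miniature.

names = ["alice", "bob", "carol"]

# O(n): a list membership test scans elements one by one in the worst case.
found_in_list = "carol" in names

# O(1) amortized: Python's set is a hash table, so membership is a hash lookup.
names_set = set(names)
found_in_set = "carol" in names_set

print(found_in_list, found_in_set)  # True True
```

Knowing why the second lookup stays fast as the collection grows is roughly the baseline the comment is describing.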
This forum might be a good test. Maybe the STEM majors can tell us if they thought their undergrad education was useful.
I know a few architects and engineers IRL, and they have the exact same complaint as everyone else: their degrees taught them a lot of useless stuff that they never need to apply, and didn’t mention any of the things that they would be doing every day.
As I said above, my education was incredibly useful. However, on this forum, I’m the exception — since I’m a pretty average guy with a below-140 IQ. People like us cannot simply read a book and pick up any discipline we want to, so college is useful for us.
I got great value out of my C.S. degree.
When the small start-up I was in needed new talent, we looked at recent graduates of the C.S. program at the local university. We also were a bunch of graduates of local C.S. programs.
So, from both ends, yes, we found it helpful.
My C.S. degree was extremely useful.
To quote my Analysis of Algorithms professor: “To become a great programmer, you have to program for ten years. Or, you can program for two years, and take an algorithms course.”
I’ve never found reason to disagree with him. (And that wasn’t the only useful course in the curriculum.)
My physics degree was useful. It taught me a mindset for approaching problems that has been very useful even though I left the field. My second degree in CS was mostly useless. I don’t feel either the coursework or even my CS master’s thesis process was particularly helpful in any way.
To be perfectly honest I think most degrees (not only technical degrees) try to convert you to a few methods of thinking about the world. Individual students and programs can fail at this task. But it seems to me that it’s more important to teach how you can design memory structures for specific purposes over the specifics of Red-Black Trees (to take a simple example). I honestly don’t think CS is very good at this yet, as a relatively new discipline.
My civil engineering degree was pretty useful. I mean, I could complain that there is a lot of stuff that I do that wasn’t covered, but what was covered was a lot of “background knowledge” that I may not use directly, but it informs the things I do.
For example, I may not use the classes I took on steel and concrete design daily, but what I learned about the mechanics of those materials and the structure of the building codes is damned important.
Going further afield, I will never do a design of a canal, drainage control structure, or other water-carrying channel. However, the fact that I took an open channel flow class means that when I talk to the hydraulic engineers about changes to the spillway proposed by the design team we’re both on, I can have that discussion without sounding like a retard. Similarly, the general mechanics of materials class will help the discussions with the mechanical engineers about hoists. General knowledge can help in a lot of surprising ways, and if you think it doesn’t, you need to take a careful look at how you appear to other disciplines. You might be coming off worse than you think.
I think they probably could have shaved a year or two off the degree without any loss: a lot of useless electives teaching things that were already obsolete. Stuff like functional programming is neat, but I’ve never seen it come up with any relevance in day-to-day operations in any of my jobs. The biggest advantage I got from college was placement in internships, so in addition to my diploma I had a year of real-life coding under my belt when I started the job hunt.
I can see how some people might have needed that, though: maybe a 2 year degree that handles all the basic stuff I actually use, then a second degree that takes another 2 years and gives experience in stuff like functional programming, systems programming and crypto? Obviously there’s a danger that credential creep just ends up requiring the latter, but we haven’t seen tech jobs creep up to requiring a Master’s, so there’s a potential that this ends up being a meaningful distinction.
Bachelor’s in CS, and one in math. Both in the early 1990s.
Both were useful, in subtle ways. Among the highlights was a combination of computability theory, algorithm design, operating system architecture, hardware architecture, and engineering physics, which gave you a detailed picture of how a computer works from the semiconductor level all the way up to top-level abstractions, from which I kept a somewhat less detailed picture that gives me an intuitive sense that that funny black box under your desk is in fact doing exactly what it’s told*, and the possible ways in which it isn’t doing what you hope.
The math side gave me a lot of content to write programs about. (Sadly, this was largely pre-Wolfram Alpha.) A lot of the math was proofs. If I had not chosen to enter the workforce quickly, I would very likely instead have pursued a doctorate in computer science, with program correctness as my field.
*barring a short circuit or chip failure or something like that
“Didn’t mention any of the things they would be doing every day” is so obviously an exaggeration as to be useless. And if we discount the hyperbole, so what? No field worthy of a college education is so narrow that one can define a set of knowledge that is necessary and sufficient for every job in that field. Either you learn a lot of stuff you’ll not use in the particular job you wind up with, or you don’t learn things that you will need, or more likely some of both. Or, possibly, you’re studying at Exxon Entry-Level Petroleum Engineer U, and Exxon owns your career and you’re a fool for signing up for that deal.
My undergraduate aerospace engineering degree was definitely useful, in spite of both learning things I would never use and not learning things I would eventually need. It is plausible that I could have taught myself as well as UT Austin did, but it is unlikely I could have done so faster, better, or cheaper – if they didn’t know exactly which skills and knowledge I would need in my actual jobs, I knew even less. And they were better suited than any arrangement I could privately arrange for providing useful feedback on my progress, lab facilities for hands-on experience, suitable teams for design exercises, libraries full of information that was not then and probably is not now on the internet, and smart knowledgeable people I could talk to about the things I couldn’t learn just by reading the textbook.
Extracting a piece of this:
I think this is an admission that the knowledge instilled by a college education is not necessary to perform many jobs we think of as “knowledge work.” People are understandably frustrated that they are shelling out six-figures for knowledge (and room and board) that not only was not necessary, but can’t even be used. The opportunity cost is higher because people are sacrificing 4 years of wages to study something, and then have to pay interest on the debt as well (or forego investment returns).
That’s a huge hit to material well-being.
I’m glad you feel your education was a good investment of your time and money.
Quite the opposite. It is an assertion that what we think of as “knowledge work” does not depend on a narrow body of knowledge that could plausibly be learned by just reading textbooks / the internet, but rather requires an ill-defined core of knowledge, plus the skills to find the relevant knowledge when there’s nobody telling you “it’s in this book here”, plus the skills to proceed and do useful work when the necessary knowledge isn’t to be found in any book or website, plus some measure of practical experience. The guy who has just read the recommended texts and can pass a test on the facts in those texts is useless.
In my case, both my degrees presented a great deal of knowledge, and a small subset is directly applicable to my current job, a ring around that which is sort of indirectly applicable, a second ring around that which is not applicable but I’ll get to momentarily, and the rest, which was somewhat interesting but useless dross. (I guarantee I won’t use my parageography study for anything beyond personal hobbies.)
The thing about the useful nugget and first and second rings is that any of that might have been directly useful to something in my profession, but I didn’t know exactly what I would be working on for the next 30-40 years, so it makes sense to have all of that, just in case. (Plus, there’s stuff in the second ring that makes the core and first rings make more intuitive sense, and therefore easier to keep in my brain.)
I suspect the same goes for a lot of college grads. Basically, any STEM course that stresses theory and only briefly focuses on tech flavor of the year is probably doing it right. I would be in lousy shape if UT’s CS curriculum had devoted years of coursework to the intricacies of SunOS, rather than OS fundamentals with a brief aside on a handful of SunOS features so that I could actually test a few programs here and there.
I see a lot here that’s not specific to a collegiate institution. The person who read the recommended texts and can pass a test can be trained to research, take initiative, and work in teams, all OTJ, and more cheaply than sitting in a classroom for 4 years. Those are, in fact, specifically the kinds of things companies complain collegiate workers do not have.
This is also a complaint that many grads had about their college experience(beyond just the content of the classwork). It wasn’t anything like the work world, because 90%+ of the classwork was this:
Perhaps your education taught you this:
Mine certainly didn’t, and my impression is that most of my college graduate friends had similar experiences.
This is why they seem upset at their college experiences. It did not impart all the useful soft skills that you gained in your experience.
The specific competitive advantage of a collegiate institution is in the ill-defined knowledge piece, not these other functions, so these students all basically graduated with barely more ability than the person who successfully passed the test.
So, yes, they probably were useless, for specifically the reasons you said, but they all went to college. So college didn’t bestow them with those skills.
MolBio/CS double major, MolBio PhD, current biochemist.
I found my bio training super useful, but in a way that I suspect is a bit domain-specific to biology as opposed to other disciplines with more robust theoretical grounding. Biology, as a body of knowledge, often feels like a huge pile of uncorrelated, historically contingent factoids, which you’re obliged to be at least aware of in order to be conversant on anything interesting. In no order: the 4 nucleotides/20 amino acids and their basic properties, the lac operon, TCA cycle, cell cycle, nitrogen cycle, photosynthesis, the major oncogenes, basic mammalian physiology, principles of genetics, thumbnail sketches of the major model organisms (E. coli, yeast, fly, worm, mouse), and so on and on and on.
Any individual biologist probably couldn’t go more than a few inches deep on most of those – I doubt 1 in 10 could draw the full TCA cycle cold – but if they came up in conversation you shouldn’t get a blank stare. I couldn’t give you more than a cartoon version for a few of those, but that cartoon is enough to know where they fit in the pile and where to go for details should the need arise. And even that level of familiarity took years of plowing through textbooks and (critically) interacting with biologists to get a sense of what they think about.
In principle, someone brighter than me might have bootstrapped themselves up to that level of understanding with nothing but a library card and SciHub, but my undergraduate also involved a ton of lab work (optional, but anyone serious about the science track did the same). That aspect of the training seems basically irreplaceable, since the practice of biology is also a big mishmash of finicky techniques (PCR, Western blotting, sequence analysis, cell culture, etc.) and a huge amount of your success comes down to metis, or even muscle memory. I think this is why there are so few bio prodigies compared to more mathy fields: no matter how searingly brilliant you might be, there’s simply no way around this long grinding gauntlet (besides going into theory).
PhD was a more advanced version of the same, and really, sort of like an abbreviated form of the medieval apprenticeship system that somehow survived into the modern world.
The CS program was pretty fun, but regretfully I ended up going in a direction where it really hasn’t contributed much to my career besides a few Perl scripts, and my abilities there have decayed almost back to baseline.
I hold a Bachelor’s and Master’s in CS and work in the field, and have found my degrees useful.
Depends on your definition of “useful”.
Did any of my degrees help me get a job? No.
Did they help me do the job(s) I did get? Yes, especially maths and physics. The only math I took in college and haven’t had a use for is tensor calculus and I suspect that is more because I didn’t quite understand it well enough to apply it properly than any inherent fault of tensors.
I realise that my experience is the exact opposite of what is being described as a problem here, but I’m OK with being weird.
My liberal arts degree training turned out to be incredibly useful to my eventual job of being a liberal arts professor.
But of the things I learned as an undergrad liberal arts major, the only ones I found to have any practical professional value outside academia were foreign languages (though having a greater appreciation of Russian fiction and cinema makes my life a little nicer in other ways).
That reminds me of the old joke we used to have in the math department: a degree in math tended to have only two applications, and one of them was teaching math.
My Bachelor’s degree in Chem and Math looked promising until I got my Master’s in Chem, where I basically spent three years coding stuff, which seemingly blacklisted me from any employment that required a Chem degree. A typical interview back then went for me like this:
“So, tell us what you did during your Master’s studies.”
“I analyzed data blah blah C++ blah blah ROOT blah blah distributed cluster calculations blah blah”
“And did you do any bench work?”
“Well… I TA’d Gen Chem labs, O Chem labs, P Chem labs… you name it, I TA’d it.”
“Ok, we’ll get in touch with you. Goodbye!”
So in the end, I said “screw this” and went into IT, where the interview went like this:
“So, your resume here says you did something in C++, is that right?”
“What does this code do?” *shows me a piece of paper with code*
“Calculates a factorial.”
“What is the name of the technique I used here?”
“Oh, I guess you’ll do just fine. Welcome to the company.”
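(The comment doesn’t show the actual snippet, but a minimal sketch of the kind of code that question fishes for might look like this; the technique the interviewer presumably wanted named is recursion.)

```cpp
#include <cstdint>

// Hypothetical reconstruction of the interview snippet: compute n!
// The "technique" being tested for here is recursion.
std::uint64_t factorial(std::uint64_t n) {
    if (n <= 1) return 1;          // base case: 0! = 1! = 1
    return n * factorial(n - 1);   // recursive case: n! = n * (n-1)!
}
```

(An iterative loop would of course work just as well; the point of the interview question was recognizing the pattern, not the arithmetic.)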
Griggs vs. Duke Power may not apply outside the US except via cultural influence (which please do not underestimate), but other nations have their own legal and cultural limits on what measures can and cannot be used to screen job applicants. Often contradictory ones; see e.g. this article offering guidance to US multinationals hiring abroad. It doesn’t specifically mention the UK, but France’s civil code has a provision that looks a lot like Griggs, and EU data privacy laws place practical obstacles in the way of a “test them all, let the computer sort them out” strategy.
I think that, at least in Western culture(*), there is an instinctive hostility to the idea that one’s future prospects can be constrained by a written examination, and that Griggs is merely the US manifestation of that. We do have the SAT, yes, but that was grandfathered in from an era when most people didn’t expect to go to college at all and those that did expected that grades, interviews, and family connections would be weighed alongside tests. If there were no SAT today, I am skeptical that courts would allow it to be created tomorrow.
* Asian cultures, in my limited experience, are much more comfortable with this. I’m not sure I take that as a good thing.
Agree with this.
SAT is culturally tolerated because it’s a legacy thing strongly tied to college admissions which do not meet with the same level of natural skepticism and hostility as corporate HR departments do.
I am a mid-level engineering manager for a very large aerospace company. Their rationale for requiring degrees is clear and I suspect it is shared by many companies. They prefer to hire all of the skilled employees as “exempt”, meaning not subject to fair labor standards laws and not eligible for overtime. The state and federal labor overseers require that the company have well-defined rules for distinguishing exempt from non-exempt and the company uses a degree as one of the primary criteria. The HR folks will absolutely not allow deviations from this policy because it would jeopardize the entire company job category structure. I can cite examples and details if anyone is interested but this is a really clear policy across every place I have worked.
Interesting, thanks for sharing. This could definitely be the case for a lot of different white-collar companies that are basically all exempt positions. I cannot think of any friends in corporate jobs that are not exempt.
I haven’t had time to read all the comments below, so I hope I’m not redundant here.
Yes, it seems to be true that non-US companies rely on education instead of testing at least as much as the US, which seems to imply that Griggs is not the reason US companies use education instead of testing as the initial filter for job interviews. But there are some other issues that counterbalance this:
1) There are cultural reasons for over-valuing college education, which seem to apply to all developed countries. Griggs exists as part of the culture that says tests are biased and one-dimensional, so you need to use other things instead. As you say, Griggs flagged college degrees almost as much as test-taking as a potential source of bias, and yet Griggs isn’t enforced that way. The way Griggs is enforced is based on the culture — subsequent federal agencies and judges emphasized the testing part, not the education part. So Griggs isn’t the only factor pushing education over testing, but it is part of the emphasis in the US.
2) I think that countries outside the US have their own reasons for using education more than testing. For one thing, other developed countries are all smaller than the US, so people in the same industry are more likely to know each other, and with fewer applicants less filtering is needed. Also I believe that other countries filter their students much more by testing in the schools themselves, so which school you went to is a much better proxy for IQ than it is in the US. At least in Britain, it is my understanding that you have to pass a whole series of standardized tests in your teens to make it into an elite college. In the US, one must take the SAT or ACT, and many schools don’t even emphasize the scores very much. I’d love to hear from the Europeans in the group as to whether I am correct in my analysis in this point #2.
3) Even if US companies would tend to emphasize education over testing regardless of Griggs, it is very difficult for companies to start using testing as an experiment as long as Griggs is there. (Technically, I believe that Griggs was essentially overturned by cases in the ’80s, but then it was mostly codified in the 1991 Civil Rights Act, so it is as if it still existed.) As in so many things, it seems that Google is fighting back against this rule, but I don’t know how many others will take this risk. I personally think that testing is way under-used for applicants, based on my very good experience using a test I created for many hires I made in the late ’80s / early ’90s (luckily my firm was too small to have lawyers around to forbid me from doing it). Griggs makes it very difficult to change this.
I was not impressed by the Time article. Its only source was a Wall Street Journal article. The WSJ article was semi-paywalled, showing only a rather grayed-out version, but what I could see didn’t look like it had very good sources itself. I think this is just click-seeking journalism. I do think that asking for the SAT score, and making hiring decisions based on it, could well get companies in trouble. If it isn’t a problem, that would be because people think of an SAT score as an addendum to schooling, and don’t realize that it is very similar to an IQ score. But I wouldn’t count on that if I were hiring.
This would only be an important factor if people regularly apply to jobs in cities other than the one they currently live in, and I don’t think many people do that. Also many companies filter out applicants that live in other cities, as demonstrated by how many job ads contain the words “local candidates only.”
And in the case of Europe in particular, the Schengen Area greatly decreases the importance of national borders to labor markets.
With this in mind, a better comparison is to look at the size of the major cities, and the US’s major cities are not especially large compared to those in other countries.
In state schools in Britain (or maybe just England, or maybe just in my area of England; I’m just speaking from personal experience) you do standardized tests called GCSEs or O-levels in the last year of secondary school (ages 15-16) and a second set of standardized tests called A-levels in the second year of the institution that we call “college” (ages 17-18). “College” is a two-year institution which comes in between “school” and “university” (it’s probably best for Americans to think of it as the British version of “high school”, the difference being it’s a two-year institution rather than a four-year one). Admission to college is largely determined by GCSE results and admission to university is largely determined by A-level results. There are no hard criteria, everything is decided on a case-by-case basis between the applicant and the college/university, but generally it’s my impression that the standardized test results are far more important than anything else (including which school/college you went to; plenty of people get into Oxford or Cambridge from obscure state schools in deprived areas).
I don’t think GCSEs and A-levels are quite like the American SAT however. They are subject-specific, not general tests. They don’t necessarily involve doing an exam; they might involve coursework instead, wholly or partially. They’re standardized in the sense that there’s a National Curriculum so the requirements to obtain the qualification in a given subject in a given year will be the same across the country. They’re nothing like what I imagine an IQ test to be like; nothing in the British education system is at all like what I imagine an IQ test to be like. But I haven’t taken an American SAT, or an IQ test, so don’t know too much about what they’re like. (Also the SAT name is confusing for British people, because here “SATs” refers to various sets of earlier standardized tests which are taken before GCSEs [these aren’t important in later life, they’re just used as a means of assessment within the school system].)
Like the OWLs and NEWTs.
Thank you, house. This is pretty much how I thought it worked in Britain. It sounds like the tests aren’t so much IQ tests as knowledge tests. But school-subject knowledge tests normally measure intelligence plus diligence, so this might be even better than pure IQ tests for employers, since they want both characteristics. So I think I was right that which university a candidate attended is a better proxy for professional suitability for British employers than it is for similar employers in the US. Thus, absent Griggs, one would expect US employers to rely on education less than in Britain. I THINK the rest of Europe is more like Britain than the US in these matters, so I don’t think the fact that employers in Europe use education as a proxy as much as the US is good evidence that Griggs doesn’t matter.
I think this is a Harry Potter reference? I am embarrassed that much of my understanding of British education comes from Harry Potter, which is why I asked for input from those more knowledgeable than I. I don’t know how much JK Rowling made up and how much it really matches British schools.
Well, OWLS and O-Levels, NEWTS and A-Levels, are at least roughly comparable.
The only thing I took in the course of a British education which I think was something like an IQ test was the one to get into grammar school.
(For the uninitiated, grammar schools are academically selective schools covering from about age 11. It used to be the case that everyone took a test and went either to the grammar school or the secondary modern, but grammar schools have largely been abolished so everyone goes to the single local comprehensive school unless they go private. Mine was one of a few grammar schools that survived due to local resistance or just as random lacunae.)
Incidentally, this test is a major source of scepticism for me about claims that results on IQ tests aren’t improved much by tutoring. People who I had outscored consistently on tests across all subjects in primary school, but who had more tutoring than I did, did better than me on the test, while my own performance on practice tests rose sharply over the few weeks’ coaching I did.
If it’s true that relying on standardized tests is no more illegal than college degrees, then I wonder if a startup in a non-tech field could gain a competitive advantage by requiring an IQ or SAT score on applications, and ignoring degrees and (maybe) work history. Sort of how some companies take advantage of the gender wage gap by hiring women somewhere in between the prevailing female and male rates.
Presumably established companies won’t do this sort of thing because no HR rep would take responsibility for it. But if you’re a founder, you don’t have to care about that; there’s no one to fire you except you. And your hiring friction costs might drop substantially.
“some companies take advantage of the gender wage gap by hiring women somewhere in between the prevailing female and male rates”
They don’t and they can’t, because most of the gap is provably not different pay for equal work, but different pay for different work. I suspect that much/all of the remainder is due to the same or due to actual higher costs of hiring women (they have a higher tendency than men to get pregnant and then take pregnancy leave, for example).
Suppose women are being discriminated against in hiring in the software industry, so that they’re underpaid or underemployed relative to men.
Then by preferentially hiring women as programmers, you can get better programmers for less money than by gender-blind hiring. You will pay less for the same talent.
One reason I’m massively skeptical of claims that women are being systematically discriminated against in software is that if it’s true, then a whole bunch of extremely competitive companies that live or die on their programmers’ productivity are all leaving money on the table. Further, they’re doing so despite explicit programs to bring in more women and almost everyone mouthing the “women are great” slogans.
 I’m ignoring effects based on organizational culture–if there’s some reason why a software house full of men works well but a software house of equally-talented women collapses into constant infighting or something, then it won’t work.
A better reason to be skeptical of claims that women are being systematically discriminated against in software is that hiring rates don’t seem to be lower than application rates which don’t seem to be lower than education rates.
The only way you can expect equal hiring by gender when far fewer women apply is if you have large-scale discrimination against men.
Assuming it’s legal, if you break away from the pack by hiring based just on SAT instead of on college degree, you will not only attract smart-but-no-degree people but also a lot of witches (“witches” in the SSC sense, such as https://slatestarcodex.com/2015/07/22/freedom-on-the-centralized-web/ or https://slatestarcodex.com/2017/05/01/neutral-vs-conservative-the-eternal-struggle/).
It’s not so much that the Griggs decision was the cause of credentialocracy as that it makes replacing credentialocracy with meritocracy more challenging. It makes me think that some of the places that do have access to scores, like the medical examples cited, actually don’t care as much about merit as they do about what fancy school their residents attended, so that they can lay claim to the false glow of those credentials. Since it is difficult to measure the performance of residents, having them sound intelligent and carry well-known credentials may be enough to make people feel that theirs is a high-status place.
The difficulties with using test scores legally are well known in the HR field. Even with something as simple as a strength test, there is the possibility of asking people to lift more than the job requires and thus being an illegal test. https://www.shrm.org/ResourcesAndTools/hr-topics/talent-acquisition/Pages/Validate-Employment-Tests-Avoid-Lawsuits.aspx The validation process often requires having performance measures for the jobs themselves, which can run into their own discrimination law problems.
The interference of the EEOC and DOL in the hiring process works against making meritocracy easy, and they are largely getting in their own way. Given a test, they should be ensuring that the best-scoring workers get the job; instead they expend a lot of effort explaining why lower scorers should get it.
I don’t know how it plays out with the court decision, but there’s an important distinction between:
a. Hiring the best person for the job.
b. Being able to show that your process for doing (a) was fair and reasonable and you didn’t bias it to hire people you liked, relatives, friends, or people whose race or religion or gender you find more suitable.
Relying on explicit credentials meets goal (b). So does relying on widely-used tests like the SAT or GRE. But many other criteria you might use in predicting performance, some of which may be quite useful, won’t be easy to justify to a skeptical person demanding to know why you don’t hire very many blacks, or why you hired that guy who turned out to be a complete loser.
You can think of college degree credentialism as a way the Blue Tribe fights the Red Tribe.
This article is relevant. (Summary: Business prof says businesses are just irrationally demanding in general about the qualities they look for in applicants, even when it hurts the bottom line.)
Just based off the summary:
That would mean these irrational businesses could be outcompeted by more rational ones.
Unless that’s happening right now – and I’m not sure how we’d know that – I’m quite sceptical.
Perhaps the system is a local maximum despite its dysfunctionality. The jobs get overspecified, and no one gets hired. The positions remain empty, and the companies continue to function, slightly leaner and meaner. Presumably if lack of employees was actually hurting, someone with a cluestick would bypass the system and get someone hired.
This would be my guess too.
I work in consulting and the general POV at my firm is something like “every non start-up has way too many people and would be better off eliminating like 10% of its workforce”
I had an internship in corporate America once and was utterly amazed at how many people I met who could not really describe to me what they actually did. The number of people who sat around all day and didn’t seem to do much of anything was staggering.
Hyper-specific job descriptions are intentionally designed to try and catch the needle in the haystack. The ordinal preference ranking may very well be something like:
1. Get the guy who matches exactly what you’re looking for perfectly (probably doesn’t exist, but worth taking a shot)
2. Let the position go empty
3. Hire the closest thing you can find to that guy
And if the company is particularly desperate, 2 and 3 may switch places. This is why you will often be told when job hunting that it’s worth applying for jobs even if you only meet 50% of their “minimum requirements.” Because there may very well be literally zero applicants who meet 100% of them, and you may have a better chance than you think.
My friends who have taught in Korea say that any job interview is like an IQ test and especially English proficiency test.
Con – highly stressful parenting and tests.
Pro – less nepotism and fewer connection-based jobs.
I stand by what I said here on this subject last year (here and here). The perception (right or wrong) in industry that the use of IQ in hiring is a legal minefield is very real. People are not scared of being sued for requiring college degrees, because everyone does it. And “everyone else is breaking the law in this other way, so why can’t we break it in this one?” doesn’t sound to me like a winning argument in court.
I think there is a lot of money on the table for someone to improve on the current dismal state of information available to hiring managers. There is other low-hanging fruit besides intelligence testing, such as the status quo where most companies (at least officially) refuse to provide any kind of references for former employees besides confirming duration of employment, also due to fear of lawsuits. Maybe you could even come up with an adult version of the marshmallow test that doesn’t take four years.
But you will be sued, and your name will be dragged through the mud, and your potential customers will rightly be afraid of that. It’s an opportunity a little like the one Uber went after, for someone with more courage than me.
In general, I think that people underestimate this sort of consideration. Another example: why isn’t there more wage or job growth for low-skilled people? Forget about robots and immigrants. Entrepreneurs are a limiting factor in job creation, and a potential entrepreneur considers many possible opportunities. Suppose you are looking at two possible companies with similar risk and expected value. One will hire a relatively small number of highly paid engineers to write software. The other will hire a huge number of low-skilled laborers. In the former case, everyone likes you – your employees, the public. In the latter case, pretty much everyone will resent and hate you, and you will face constant interference from regulators and lawsuits. It’s a pretty easy guess which business you will prefer to start. And so it’s a pretty easy guess where we will see the most economic growth.
Has anyone ever sued over the college degree requirement? AFAIK they haven’t, but it’s a big world and I don’t know the literature.
Reported federal cases citing Griggs with the word bachelor’s in them. Probably some false positives, but I checked and there are certainly some on point.
I’d also posit that although we know the SAT is basically an IQ test, I’m not so sure the average person does. So you’re less likely to catch a crapload of bad publicity for asking for SAT scores than you are for mandating an IQ test. Even if it’s functionally equivalent, the perception of it is very different, which is why everyone gets away with it.
My employer puts a lot of weight into SAT and GMAT scores for recruiting purposes. But officially, it is “optional” for job candidates to provide theirs. Of course, when I was applying, the advice I got from just about everyone was “unless your score is really bad, you probably should provide it.” My score was really good, and I ended up getting a job I never thought I could ever qualify for. Now that I’m here, I get to occasionally review resumes, and I must say, I don’t think a resume without a score on it has ever made it through the system far enough to reach my eyes.
Eh, it’s not an IQ test. It’s a measure of a certain kind of knowledge. No matter how smart you are, you aren’t scoring high if you haven’t been exposed to the things on which you are being tested.
Given that everyone taking the test has been sufficiently exposed to the concepts, much of the variance is going to be explained by IQ, sure. But that variance is still a measure of what you know.
The SAT-IQ paper you link to doesn’t seem to make much sense, at least not the part about the post-1994 revised test. Their conversion equation maxes out around 125 IQ, and has a negative coefficient for the SAT verbal score, suggesting that people with higher general intelligence do more poorly on the SAT verbal. Assuming that the latter problem might come from an accidental sign reversal in their equation, it would still max out at 129. That doesn’t seem particularly useful if there are a lot of smart people applying for your job.
I think one goal of requiring a college degree is hiring people who are about 22 rather than 18 years old (in the U.S.). As someone who works at a university, I can attest that people do mature during this period, and lots of people I’ve spoken to about hiring agree. Whether that maturation has anything to do with college is highly debatable! (It’s also debatable whether this preference for older young people is wholly good — it probably partially weeds out the more precocious and eccentric.) But in any case, a degree requirement is an indicator of “not an 18 year old,” and I would guess that a blunt “must be 22” requirement in a job posting would be illegal.
Quickly skimming, I don’t see this addressed in earlier comments, though I could be wrong.
I’m pretty sure it would be perfectly legal to have a minimum age for any job. In the US age discrimination is only illegal when the victim is over 40, and only when younger workers are preferred to older workers (not the reverse).
And to state the obvious, college degrees are required or preferred by many employers even when hiring people much older than 22.
About the first part: that’s interesting! Though technically legal, many organizations (e.g. the University I work for) would never allow a job posting that sets an age criterion. Of course, hiring through a university is a comically complex process.
About the second: yes, I thought of that; I was considering only ‘entry level’ jobs for young people.
Re: credentialism, I think it’s important to distinguish between three theories:
(1) Employers irrationally overweight educational credentials in hiring decisions, while alternative means of assessing candidates’ ability exist.
(2) Employers rationally weight educational credentials as a noisy signal of unobservable ability, and thus students rationally respond by acquiring as many credentials as they can. Thus we’re stuck in a bad equilibrium, and should coordinate some way to get out of it.
(3) Trade organizations require credentials as a means of restricting entry into their profession and keeping salaries high, at the expense of customers and potential entrants.
I think that (1) is fairly implausible as a large-scale phenomenon, although I can believe it might exist to some degree in large bureaucratic organizations that are shielded from competition (e.g. government).
(2) seems quite plausible to me, but very difficult to change. And it’s also possible that credentials are serving a valuable informational function here.
(3) seems correct when applied to certain industries (e.g. medicine), but not in general.
Scott seems to be arguing for (1), or at least that is how I read him. Many others seem to be arguing this as well.
Summing up the position, uncharitably, it sounds like “Employers should just test job applicants for their merit and be done with it.”
Yes, that is very uncharitable. Most of the people arguing for more testing (including me) haven’t been real clear on the subject, so I can see why you could take that position. I would prefer the initial filter for applicants to be a test than to be education in most cases, if there could be only one filter. But even now, most companies have more than one filter for initial applicants; I think it would be a good idea to emphasize testing more than it is now. Not necessarily an IQ test, although for initial filtering one might need an outside test that already existed so the company itself didn’t administer it. Maybe an employment firm could administer a generic test for many potential employers. I think employment firms already do this to a limited degree — but it is a bit risky under Griggs.
I very much doubt that anyone here suggest hiring folks based solely on a test. When I hired a bunch of people 25-30 years ago, the test I used was the most important part of the selection process, but I did a traditional interview also. Written tests don’t do a good job of measuring social skills, but the usual interview does pretty well there. I contend that with testing being pretty risky, the social skills component of most jobs is weighted higher than it should be. Skills are determined too much based on schooling and experience, both of which are pretty crude measures.
I think you missed my point.
How do you test for merit? What common test for merit would apply across almost every profession and be hard to game?
What do you mean by testing for merit? You can definitely test for knowledge in a particular subject. I have also seen in various places on SSC comments how employers test IT applicants for simple programming methods. The test I gave tested for simple math methodology, and another test I created (that unfortunately was nixed by HR), tested to see if applicants could analyze large quantities of data on an Excel spreadsheet. I think this sort of test to see if applicants understand certain basic concepts is pretty easy, and would likely be a lot more effective than looking at education and experience.
If you mean testing for creativity, that is very difficult. But of course education and experience don’t help you there either.
Edit: I see you seem to be asking for a test that could be broadly used for lots of professions. But why? If you are creating tests, make one specifically for the position at hand. Maybe if you want a more general filter done before the employer gets involved, it has to be more general. For that an IQ test might be best, or a certification suggested by someone else where some professor makes sure the applicant has general verbal and math skills.
Because it’s easier to make one test than 10,000?
@Matt M Perhaps I’m missing some sarcasm. How is it harder to make a test for the position you are intimately familiar with than to encode a universal standard for general ability? It’s not ‘make 1 test vs make 10,000’, it’s let 10,000 people make their own tests vs make 1 test that does all 10,000 jobs simultaneously that those 10,000 people are individually best positioned to do for themselves.
I think (2), with the caveat that it’s not an equilibrium, it’s a positive feedback loop. The students acquire more credentials, which increases the noise, which results in employers requiring yet _another_ level of credentials. This does seem to have leveled off and even backed off in tech (for a while there was a trend of requiring MSCS degrees), so there’s factors working against it at least in some cases. From the outside, it appears nursing is in such a spiral.
I think that the credential spiral occurs because of (3), in industries where it is relevant, such as academia, law, medicine (doctors), and possibly now even nursing.
Industries such as software development or “tech” in general which are not strongly unionized, if at all, reach an equilibrium by requiring credentials which are hard to “fake” or “buy”. Usually it is a college degree, but for software development in particular, a good GitHub profile is maybe even better (or maybe software companies face a shortage of potential employees and had to lower their standards even increasing the risk of hiring poor candidates).
I think most white-collar jobs require a college degree and care a lot about pedigree, even if they’re not part of a “closed shop” like medicine or law. Tech is really the outlier here, which may reflect the relative ease of credibly signaling ability.
In the context of Griggs, here’s something I don’t understand at all: the Wonderlic test.
The Wonderlic test is an APA-approved IQ test. The NFL’s scouting combine grades all its new prospects on it; opinions vary on how useful it is, but everybody takes it and some teams value it highly, especially for quarterbacks. QB is an especially interesting position in this context because it’s 80% white, unlike most other positions (the NFL is two-thirds black overall). The traditional justification for this is that QB demands a lot of soft skills like tactical judgement, precision, and leadership. Those are harder to measure than stuff like running speed or pass completion rate, and are often — though not always! — held to correlate with your Wonderlic score (some representative discussion here).
To summarize, then: Wonderlic is an IQ test, used by one of the most famous employers in America, to justify hiring white folks in a field that’s otherwise dominated by blacks. It is not the sort of thing that should exist in a world where IQ tests for interviews are forbidden or even frowned upon, especially on disparate impact grounds, and yet here we are.
Football is a sport, and sports are special, and we close our eyes and hum real loud whenever we would otherwise have to acknowledge that they are businesses (eww!) that make money (sayitisntso!). Football is an American Pastime, and whatever Football does must be good so long as it is fun and entertaining. It would be far from unprecedented for the courts to exempt a sports league from laws aimed at ensuring that mere businesses treat everyone fairly.
And I think some of that humming is to ignore the enormous disparate impact of teams hiring that is so disproportionately Black. Since there is no evidence that one race is physically more adept than another, it has to be racism that causes this? Let’s just keep humming.
I don’t think the Wonderlic is “used to justify drafting white QBs over black ones.”
QB has sorted by race since long before the NFL. It’s predominantly white in college as well, and I would assume in high school too (although I know less about this, so maybe not). Leaked scores have revealed plenty of high-profile QBs who had terrible Wonderlic scores and were still drafted quite highly.
Employers want three things:
1) Domain-specific competency.
2) Conscientiousness / determination / grit, to deal with persistent problems.
3) Social skills adequate to working in the domain-specific groups.
Domain competency is doable with some sort of testing regime, but college is the best proxy for demonstrating grit/determination, and you can argue that elite colleges are likely to be a better proxy for demonstrating the social skills necessary to keep up with hard-chargers in certain professions.
I think the trick to cracking this is to realize that, while full domain competency is always required, the grit and social aspects associated with various jobs varies wildly. Everybody needs enough grit to show up to work every day and do the job, but being a robot repair person requires a lot less grit than doing professional engineering or science. Similarly, all employees need to be able to jump over the “don’t creep out your co-workers” bar, but the same robot repair person can be a lot more socially inept than somebody who needs to negotiate contracts.
My impression is that the current battery of personality tests is wholly inadequate for culling the people who aren’t going to cut it when tasks are difficult and long-term, or who simply can’t keep up with the needed levels of office politics. So the credential obsession will continue until something can measure this other stuff at least as well.
I don’t see any reason why you can’t train and test for grit or social skills, but it’s a very different “lab” environment than you’d need than for, e.g., learning Cisco or Microsoft IT skills (which are already quite successful in gauging competence). It’d be interesting to see if some of the MOOC/lab hybrids could take something like this on.
It’s an interesting theory and I don’t disagree in general.
I would just note that for all the talk about how grit and social skills are required to succeed in college, the best proxy anyone has been able to find for predicting long-term success in college remains… you guessed it… SAT scores!
I suppose that grit, determination, conscientiousness, etc. can’t be accurately measured with some quick IQ-like test. Some usually low-functioning people may be able to muster enough determination to function at a higher level for the test day, or maybe even for a few weeks or months, but then they’ll burn out and return to their baseline low-functioning level. Quick psychometric tests can’t detect this.
The only thing that you can measure with IQ-like tests that predicts academic and professional performance is indeed IQ, or its close proxies such as the SAT or GRE.
It’s hard to imagine a test for grit and social skills that didn’t involve some real-world project lasting weeks or months. That’s gonna make the odds of somebody out-performing their actual skill level very low.
College degrees test that for a few years.
If a few months were sufficient, why aren’t shorter programs more popular?
Some years ago, I hired for a coordination / management type job someone who didn’t have a degree, but who had led a large successful WoW guild for several years. His application circulated around the recruiters’ desks for a few months as a joke, until I overheard the joke and asked for the application to be forwarded to me.
I decided to play the joke out more, and had recruiting schedule the interview.
I hired him.
Since then, I’ve hired several more people without college degrees who have instead demonstrated grit and organization and logistics skills by running large MMORPG guilds, and one who produced raves in the 90s.
Being a GM is such a thankless job.
I’d be inclined NOT to hire such a person… not because I think they’re wasting their life or are a huge nerd or whatever… but because I’d almost worry such a person possesses too much “grit” and is willing to waste too much time and effort on things that offer virtually zero reward.
Does this mean my significant experience in playing and running roleplaying games is actually useful? Hot diggity!
Maybe because nobody’s spent much time trying to optimize the grit/social skills part of the equation?
Colleges have strong incentives to keep students in school as long as possible: it maximizes their revenue and profits, and the conventional wisdom is that at least a 4-year degree is enough to provide a good proxy. We have lots of existence proofs that what’s happening in short programs isn’t adequate to act as a proxy for grit/social skills.
But there’s not much going on in between these two extremes. For example, could you generate a grit test that produced reliable results in 4 weeks? The military has a pretty good test (basic training) that lasts 10 weeks. Could that be a model?
Can you teach social skills to adults? If not, can you test for them in a small number of weeks?
There’s a fairly serious bootstrapping problem here, in that developing the programs and tests is likely to be expensive and less than optimal to begin with, and the entire credentialling industry will go gunning for you. I’m not sure that that’s surmountable, but it seems as if it’s quite the gold mine for anybody who actually succeeds.
Don’t high school grades and coursework show grit? Sure, today it isn’t too hard to get a high school diploma of some sort or another. But if you have an A average and took a bunch of AP courses you probably were pretty diligent.
Yes, but the education and credentialing market is competitive, there are lots of different programs which produce various kinds of certifications. Students have incentives not to waste time and money, employers have incentives to hire young candidates and not to exclude potentially good candidates just because they couldn’t afford overpriced certifications. This counterbalances colleges’ incentives to make degrees longer and more expensive.
If I understand correctly, this gives access only to low-rank roles; technical roles (engineer, physician, etc.) require degrees, and officer roles require either officer school or years of experience in lower ranks.
Does this mean my significant experience in playing and running roleplaying games is actually useful? Hot diggity!
Did you have to run a session every week, without fail (on pain of individuals leaving for other campaigns)?
Did that session cover all of the content available? If not, did you run additional sessions to cover all of the content of interest, in that week? Without fail?
Did you allocate rewards from each session in a way that was understood by all to be fair? Did it factor in absences by individuals? Failure to prepare outside of session? Failure to perform during session?
Did your sessions involve 25 people, possibly including subs, all possibly in remote locations, including other nations and timezones?
Did you construct a record keeping system visible and satisfactory to all members?
Did you have a system for resolving irregularities or disputes in any of the above requirements?
Did you keep this system running for at least one year, preferably two or more, and delivering rewards to a level that objectively kept pace with content produced by an outside party?
…Standing in the Shadows is not the only person I’ve heard of who took “ran an MMO raiding guild” seriously as a credential for management. Having run various parts of a raid guild myself, I can understand why.
Nothing that hardcore, but then again WoW raiding is serious business. I’m a pen and paper man.
Nope. It’s cartelized. Stray from the basic architecture, and you won’t get an accreditation. No accreditation, no jobs for your students.
I’m not suggesting that we use basic training for professional roles; I’m suggesting that we use something like it as a proxy for grit that doesn’t take 2 or 4 years. The idea is to provide a more specific measurement of all three points of the domain competence / grit / social skills triad.
Ideally, those three independent axes would give you enough information to determine suitability for any job. You need a specific test (and score) for domain competence, but you might be able to get by with a single test for grit (requiring varying minimum scores based on the type of job) and social skills (again with varying minima).
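The “three independent axes with per-job minima” idea above could be sketched as a simple lookup-and-threshold screen. Everything here is invented for illustration: the axis names, job profiles, scales, and cutoffs are my own assumptions, not anything proposed in the thread.

```python
# Hypothetical three-axis screen: each job requires minimum scores on
# domain competence, grit, and social skills (all on an invented 0-100 scale).
JOB_MINIMA = {
    # job: (domain_min, grit_min, social_min)
    "robot_repair":          (70, 40, 30),
    "professional_engineer": (80, 70, 50),
    "contract_negotiator":   (60, 60, 85),
}

def qualifies(job, domain, grit, social):
    """True only if the candidate clears every per-axis minimum for the job."""
    d_min, g_min, s_min = JOB_MINIMA[job]
    return domain >= d_min and grit >= g_min and social >= s_min

# A socially average candidate clears the repair role but not negotiation.
print(qualifies("robot_repair", domain=75, grit=45, social=35))         # True
print(qualifies("contract_negotiator", domain=75, grit=65, social=60))  # False
```

The design point is the one the comment makes: domain competence needs a job-specific test, but grit and social skills could each be a single shared score reused across jobs with different minima.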
In college, yes. In the workplace, though?
If your argument is that IQ swamps any grit and social skills demonstrated in college, and therefore college is a poor proxy for measuring those properties, I’ll buy that. But it’s still likely a better proxy than anything else, and likely a better proxy than just the SAT.
What I’m proposing is a systematic way of developing the grit and social components, so that there’s a much better proxy for their measurement. Then, assuming that competency-based education has the domain-specific part of the equation covered, you have a fairly complete suite of assessment tools that is systematically better than a college credential.
The recruiters at my employer have told us that when it comes to MBA hires, GMAT score is the best predictor of long-term success with our firm. I can’t say if that’s universal or not, but that’s what we’ve been told.
Of course, you can also take the “efficient markets” approach (as many others have suggested) and assume that college degrees are required for a reason, etc. If a degree from an elite university is the biggest predictor of workplace success, and SAT scores predict university success, therefore….
An efficient market will select the best alternative available, not necessarily an optimal alternative. I’m prepared to believe that a degree and/or a good GMAT score is currently the best alternative available. But that seems a lot like an opportunity to invent something better.
I’d be a lot more indifferent to this if there weren’t a lot of human misery and a not inconsiderable amount of social danger wrapped up in the status quo. The outcome is heavily class-biased as is. That’s no way to have a healthy society.
The last time I saw a discussion of degrees and jobs (probably the 90s, probably sf fandom), it seemed like practically no one had a job which was related to their degree. Have things changed?
What is more probable: that there is a widespread systemic market inefficiency in all developed countries that only you and a few others have noticed, or that you have failed to understand why using college degrees to screen potential employees is valuable?
College evaluates people’s performance on various tasks, some not very interesting or challenging, over several years. Therefore it probably measures something that no simple psychometric test can measure, but that is valuable to employers.
There are high-IQ people who can ace any standardized test, but they are disorganized, emotionally unstable, tend to procrastinate, have drifting interests, have difficulties focusing on tasks they don’t find intellectually challenging, and so on. These people don’t make good employees. These people also struggle in school, and some fraction of them will fail to get a college degree. If you are an employer, screening by college degree is an effective way to avoid hiring them.
Has there been any research on whether American employers would like to use IQ tests, but are stopped by fear of increased risk of lawsuits?
I’ve been thinking about this ever since a friend linked me to research showing that tests of general mental ability predict job performance better than anything else. When I considered suggesting that we use them in our candidate screening process, the main objection I thought of was that candidates would not want to take such a test, and that people might object to other members of the team having access to their IQ (effectively) (since anyone involved in making the hiring decision would see the GMA test result). IQ feels more personal than e.g. where you went to college.
It seems to me like a lot of the commenters assume that a liberal arts education has no useful/transferable skills it teaches for the job market. But I can think of a lot of ways in which liberal education can teach useful, transferable skills.
To mention some of them:
Philosophy: I only had two courses in Philosophy in high school (in Spain it is compulsory even if you chose the scientific specialization), but I did get to read excerpts of philosophical texts. Now, is knowing the ideas on politics and ethics of different philosophers directly useful? No, but being able to read and understand original texts of great thinkers such as Aristotle, Kant, Nietzsche, Sartre, etc., is a useful skill, correlated with IQ but not entirely based on it (my ability to understand those texts improved with schooling), because their texts are both difficult and not modern (and very boring for most people). Being able to read complex and boring texts shows IQ and grit, so that is in part signaling. But the ability to read complex texts does improve with practice (I do think I have become a better reader by being exposed to complex texts in my field), so it is also somewhat trainable. This training in reading boring and complex texts can be useful for a variety of jobs: for example, a lot of civil servants need to read boring memos, laws, complaints, etc., and make reasoned decisions based on them (without needing a law degree for that).
Arts: Now, in a lot of countries, this is separated from humanities studies. In Spain, universities usually teach painters a broad range of skills: perspective, different techniques (oil, watercolor, pencil, acrylic, etc.). They also offer a lot of electives, which can be specializations (computer graphics, sculpture, engraving, etc.). Are these skills necessary? Well, I do think that a graphic designer who draws computer games should understand perspective, composition, and color. So this degree does teach useful skills.
History: If you study history, you learn to read historical documents and find links between, say, art during the inter-war period and the economy during that period. If you learn to see these kinds of associations, you could work in political think tanks, predicting election results, or as a journalist.
Sociology: Here, you learn to interview people and to extract measurable, quantitative information from those interviews. This is actually hard: many of the statistics recorded in Spain, at least, have around 10% of people answering don’t know/don’t care to questions about different subjects. So convincing people to answer, and learning to formulate questions that are easy enough that enough people bother to answer yet still informative enough to extract data from, is not that easy. These skills can be used for market research.
Do universities do a good job of teaching these skills to the aforementioned majors? I am not sure, but if they don’t, then it’s a problem of universities failing to teach those skills, not of employers requiring degrees with no direct skill application for the job.
Is University the best way to teach these skills? I haven’t heard of a better way to teach reading at a high level, writing and summarizing it correctly, and learning to translate what people say into stuff that can be made into a product.
If it becomes known that running a guild in WoW or the equivalent is a good job credential, what effects might this have on the games?
Best I can tell (as a long-time WoW player), none. The “you can list being GM on your resume and people will see it as a valuable skill” meme has been around for several years, and I’ve noticed no major changes in the game.
I’d also suggest that as far as “ways you can bolster your resume” go, it’s incredibly inefficient. As has been pointed out, it will get you laughed at by about 90% of hiring managers. It also takes a whole freaking lot of time and commitment. Time and commitment you could easily spend on volunteering for a non-profit or taking a course or obtaining a certification relevant to your professional field. Things that won’t get you laughed at by anybody.
Also, most GMs I’ve had have been blue-collar guys, the likes of which probably don’t even have much in terms of written resumes. A whole lot of retired/unemployed/disabled people are among the game’s most active players as well.
This is stupid and ahistorical. Even if Griggs doesn’t actually say this, this is how it has been interpreted by everyone since it was issued.
Just imagine the outrage that would occur if some well-known employer actually decided to hire based on SAT tests or on any measure that correlates with actual intelligence.
The repeal of Griggs would send a powerful signal going much beyond its practical significance.
The current model, based on empty credentials, benefits two groups: members of some powerful Democrat constituencies and not-too-bright children from middle-class families who get recycled through prestigious-sounding colleges and, by scrupulously sticking to the conventional wisdom and jargon taught there, get an unfair advantage over slightly brighter children from other backgrounds.
Since they and their relatives probably comprise a majority of the voters, there is no hope for improvement anytime soon.
I wonder to what extent this might be the result of people simply not knowing the extent to which an approximation of an applicant’s IQ predicts performance. At least in the US, most people I know (i.e., educated liberals) tend to believe that IQ is either unmeasurable, domain-specific, or of minor importance to predicting performance.
Combine discounting of the concept of IQ with the fact that a lot of employers are picking from a pool of applicants who have already passed through several IQ filters, such that the actual variance in IQ scores among qualified applicants is small enough to reduce the utility of making IQ comparisons. For example, becoming a doctor at all requires passing through several filters that select for intelligence to a non-trivial degree: getting into college (SATs), which bachelor’s degree was chosen and whether it was completed (both of which correlate with IQ), getting into a medical school (requires MCATs; letters of recommendation; GPA; awards; competence in subjects that indicate IQ to some degree, like Calculus II), completing the medical degree, etc. Passing through each filter is partly dependent on IQ, but since people need to pass through so many filters before getting an MD, it’s unlikely that applicants with MDs will have much variability in their IQs and, therefore, it’s unlikely that knowing IQ will give you any additional information. I could see how this kind of situation would actually lead people to think that IQ isn’t important.
You are right, it isn’t Griggs. There are plenty of IQ proxies that people can use with no problem.
Another point is that Duke Power instituted the IQ test as of the effective date of the 1964 Civil Rights Act. The Supreme Court could not reasonably have ruled any other way without risking a massive circumvention of the Act in the South.
Late to the party here.
Is Griggs the only reason for the lack of intelligence testing? Probably not. However, Griggs (and its kin) are an immediate barrier. The problem is that there are two kinds of discrimination which put a company at liability: explicit and implicit. Everyone understands explicit discrimination, that is, saying “No X”. Implicit discrimination is easier to trip over: to run afoul of it, your hiring process just has to result in a workforce that is not balanced in regards to a protected class. If your workforce is unbalanced, the only defense to an EEOC claim is that your screening is only for bona fide job requirements, or things that are must-haves to accomplish the job. This is what trips up IQ testing. IQ testing is so granular that it is effectively impossible to defend any given cut-off point. How are you going to prove that 100 IQ can do the job but 99 can’t? If 99 can, what about 98? 97? It would take a lot of expensive testing for every single position to create a defensible standard. Most companies don’t have enough cookie-cutter slots to justify such an effort.
The other part of the problem is the test itself. If the IQ test they use is shown to be biased in output, then the company is now exposed to an implicit discrimination claim. The companies using SAT scores are opening themselves up to the ‘bona fide job requirement’ risk *if* there is a hard cut-off on the scores they will accept. The fuzzier the decision system, the more legal coverage it provides for the company.
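For concreteness, the EEOC’s Uniform Guidelines operationalize “unbalanced” with a four-fifths rule: if any group’s selection rate falls below 80% of the highest group’s rate, the procedure is presumptively suspect. A minimal sketch with invented applicant counts (the function names and numbers are mine):

```python
# Four-fifths (80%) rule of thumb from the EEOC Uniform Guidelines:
# flag a selection procedure if any group's selection rate is below
# 4/5 of the highest group's rate. Counts below are invented.
def selection_rate(hired, applied):
    return hired / applied

def four_fifths_ok(rates):
    """True if every group's rate is at least 4/5 of the highest rate."""
    best = max(rates.values())
    return all(rate >= 0.8 * best for rate in rates.values())

rates = {
    "group_a": selection_rate(hired=30, applied=100),  # 0.30
    "group_b": selection_rate(hired=18, applied=100),  # 0.18
}
print(four_fifths_ok(rates))  # False: 0.18 < 0.8 * 0.30
```

This is only a screening heuristic, not the full legal analysis, but it shows why a hard test-score cutoff is risky: it mechanically produces group-level selection rates that are easy to audit against exactly this kind of ratio.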
A lot of degrees can’t be defended as bona fide requirements because most everything you learn at college you could learn elsewhere, so why do college degrees get a pass? One reason is because educational certification is already used as a government imposed barrier to certain fields (See: if the government does it, it is okay). You have to have a medical degree to be a doctor, law degree for a lawyer, etc. So you can point to that precedent. The other thing is that you generally won’t see the more legally adverse companies explicitly say that only a college degree will be accepted. They will instead add the language ‘or equivalent experience’. That is their out, though good luck getting your ‘equivalent experience’ through their automated application system.
If it wasn’t for Griggs and its kin, would IQ testing be widespread? Maybe, maybe not. But as long as intelligence testing is de facto illegal, it is a moot question. Is Griggs et al. solely responsible for the rise of credentialism? Probably not, but it most likely helps.
It is a matter of considerable hubris to come to the conclusion that nigh everyone in a field is simply getting the law wrong.