[epistemic status: My bias is against the current college system doing much good. I have tried not to be bogged down by this bias, but take it into account when reading my interpretations below.]
[EDIT: An earlier version of this post claimed that one paper had shown a u-shaped relationship between time spent in college and critical thinking. A commenter pointed out this was true only of a subset in two-year colleges, but not of four-year colleges or college in general – which shows the expected linear relationship. I am sorry for the error, and correcting it somewhat increases my confidence in college building critical thinking.]
Over Thanksgiving, I was discussing tulip subsidies with the pro-Bernie-Sanders faction of my family, and my uncle claimed that we needed college because “it teaches you how to think critically”.
The evidence sort of supports him, but with the usual caveats and uncertainties.
First of all, what the heck is critical thinking? Luckily, we have a very objective scientific answer: critical thinking is the ability to score highly on the Watson-Glaser Critical Thinking Appraisal. Now that we’ve answered that, we can move on to the question at hand.
Most studies on this issue are terrible because they lack control groups. That is, they measure students when they enter college, measure them again when they leave college, and find that their critical thinking ability has improved. But this could be for any number of reasons. Maybe older people generally have better critical thinking than younger people. Maybe life experience builds critical thinking. Maybe college had nothing to do with any of it. The best meta-analysis of such studies, McMillan 1987, finds exactly this, and concludes:
Overall these studies suggest that seniors, in the main, are probably better at critical thinking than freshmen. However, since the most compelling data were gathered through weak pretest-posttest or longitudinal designs, it is difficult to separate out the effect of college from the maturational effects that occur despite college.
I like the phrase “maturation that occurs despite college”, although I don’t think they meant it in that way.
But in any case we need a better study design to conclude anything from this. There are two studies with moderately good designs, both by a guy named Pascarella. The first compares 30 college students to 17 matched non-college students and follows them up for one year. They find that while both students and non-students gain critical thinking skills over the course of the year, the college students gain 17% more, corresponding to an effect size of 0.44.
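For anyone wondering what an “effect size of 0.44” means mechanically: it is presumably a standardized mean difference (Cohen’s d), the difference in mean gains divided by the pooled standard deviation. A minimal sketch – the 30 and 17 group sizes match the study, but the means and SDs below are invented purely to show the arithmetic:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Hypothetical gain scores: college group vs. matched non-college group.
# With both SDs at 1.0, a mean difference of 0.44 gives d = 0.44.
d = cohens_d(m1=5.00, s1=1.0, n1=30, m2=4.56, s2=1.0, n2=17)
print(round(d, 2))  # → 0.44
```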
The second, larger study compares students doing college full-time to students doing college part-time, under the theory that if college is causing the effect, then a little college should cause a small effect, but lots of college should cause a big effect. They find this in the four-year college sample, and a garbled u-shaped mess in the two-year college sample. At least the four-year sample, which is what most people are interested in, looks good.
On the other hand, some other studies find less impressive effect sizes. Arum and Roksa recently wrote a book on this kind of thing, Academically Adrift, and they find that two years of college (start of freshman to end of sophomore) only increases critical thinking by 0.18 SD. This is weird because it’s twice as much college as the past few studies but less than half the effect size. Also, it doesn’t seem to be controlled, so this is the sum of actual-college-effect-size and confounder-effect-size. According to one review:
College entrance to end of sophomore (ie half of college) improves critical thinking by 0.18 SD. This might have been greater in the past: “Pascarella and Terenzini estimated that seniors had an 0.5 SD advantage over freshmen in the 1990s. In contrast, during the 1980s students developed their skills at twice the rate: seniors had an advantage over freshmen of one standard deviation.”
Note that we’re comparing unlike with unlike. Four years of college need not produce an effect twice as great as two years of college, any more than a space heater that increases the temperature of a room 10 degrees after being left on for one hour will increase the temperature 87600 degrees after being left on for a year. Indeed, some studies suggest that most of the gains happen in freshman year. But even so, there’s a very clear downward trend here. As usual, we have no good way of knowing if that’s caused by gradually-improving studies, gradually-improving college critical thinking training, or gradually-deteriorating colleges.
One more question: do we know the specific aspects of the college experience that cause critical-thinking gains?
Specific explicitly-advertised “critical thinking classes” don’t. Being a college that prides itself on a specific “critical thinking focus” doesn’t. This sort of thing seems very well replicated, although a few individual studies that don’t really interact with the rest of the literature report at least temporary positive results. Classes that you might think would teach critical thinking, like logic, or philosophy, or statistics, don’t seem to do especially well (here’s a specific in-depth discussion of philosophy). The only positive result anyone’s been able to find is that “liberal arts” (here viewed as a broad category including science and mathematics) seems to do better than occupational skills, and “education, social work, communications, and business” seem to do worse.
In terms of broader factors, one study finds that quantifiable college experiences explain “between 7 and 17%” of the variability in first-year critical thinking gains (other sources say classes explain 2.5% and extracurriculars 2.9%). Studying a lot seems to help. So does reading unassigned books. Aside from that, the biggest finding is kind of concerning:
Students who characterized their relationships with other students as “competitive, uninvolved…alienated” were more likely to show gains in critical thinking than were students who portrayed their peer relations as “friendly, supportive, or a sense of belonging.” The data in this study do not permit confident explanation of this relation, but one might speculate that a sense of participation in a friendly, supportive peer environment may require a partial suspension of one’s critical thinking skills.
…wow. I was going to say something like “students busy spending time with their friends have less time to learn stuff”, but your cynical awful explanation works too.
So what’s the big picture?
Well, we know that people will gain critical thinking skills during the four years from age 18 to age 22. We have a small study that finds college helps a little with this process and a larger study that shows dose-dependent effects of college. We have some hard-to-compare effect sizes ranging from 0.18/2-years to 0.44/year. That’s modest but appreciable, and it’s probably at least somewhat real.
But those of you who went to my talk last week hopefully know what my next question will be: how long does this last?
For example, we know that parents’ personalities have all sorts of interesting effects on their children while those children are living with their parents. We also know that as soon as children leave their parents, those effects go down to near zero. What people tried to interpret as some deep fact about development was actually just a reflection of the environment that those children were in. Likewise, preschool makes children do much better in kindergarten, but by third grade the preschool-educated kids are doing the same as or worse than the others. It’s not fundamentally altering their developmental trajectory, it’s just creating a short-term effect.
Every one of the studies I’m citing here was done in freshman or sophomore year (one study cited another done in senior year, but I couldn’t find it directly). No one has ever looked at students who have been out of college a year – let alone out of college thirty years – to see if the effect continues. I would bet that it doesn’t.
Until somebody checks, enjoy your opportunity to tell people that the evidence backs college building critical thinking skills.
The graph reminds me of Alexander Pope’s “Essay on Criticism”, warning
“A little learning is a dangerous thing;
Drink deep, or taste not the Pierian spring.”
It reminds me of how hard it is to “adjust for confounders” in highly multivariate contexts with complicated causal relationships.
Incidentally, the wiki page on confounding is pretty good; it actually explains what confounding is and how to adjust for it (in theory, if you actually know the causal structure of the phenomenon). This is much more subtle than many people realize, and the modern understanding explained on the wiki page is only a few decades old.
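To make that concrete, here’s a toy simulation (my own sketch, not from the wiki page): a confounder Z drives both X and Y, X has no effect on Y at all, and yet the naive regression slope of Y on X comes out around 0.5. Residualizing both variables on Z – which is only valid here because we built the simulation and so know the causal structure – brings the slope back down to roughly zero:

```python
import random

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def residualize(ys, zs):
    """Remove the part of ys linearly explained by zs."""
    b = slope(zs, ys)
    mz, my = sum(zs) / len(zs), sum(ys) / len(ys)
    return [y - (my + b * (z - mz)) for y, z in zip(ys, zs)]

random.seed(0)
n = 50_000
z = [random.gauss(0, 1) for _ in range(n)]       # confounder
x = [zi + random.gauss(0, 1) for zi in z]        # "exposure": driven by z
y = [zi + random.gauss(0, 1) for zi in z]        # outcome: driven by z, NOT by x

naive = slope(x, y)                              # ≈ 0.5, purely spurious
adjusted = slope(residualize(x, z), residualize(y, z))  # ≈ 0 after adjustment
print(f"naive={naive:.2f} adjusted={adjusted:.2f}")
```

The adjusted slope is the partial regression coefficient of Y on X controlling for Z; the point is that the adjustment only works because the simulation’s causal structure is known, which is exactly the subtlety the wiki page is getting at.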
The graph is my mistake and I apologize for it. Please see correction above.
Don’t depressed people demonstrate better critical thinking skills? That might explain why the alienated students are the better critical thinkers.
EDIT: I might be conflating IQ with critical thinking ability, and I’m honestly not sure where the differences are.
I have vague memories of reading that. On the other hand, I also have vague memories of a study showing that depression lowers IQ over time.
Anecdotal evidence: I have noticed a marked decrease in my ability to process complex ideas during my worst depressive phases. I also am subject to self-critical delusions about inherent worth, causal stories about how I’m a terrible person, etc.
I ALSO am more likely to be impatient with rose-colored views of the world. Cynicism seems to be a signal for intelligence, and the world certainly isn’t rosy or fair. The universe doesn’t like you. It doesn’t hate you. It doesn’t anything you. And depression amplifies that reality. I guess someone looking through those lenses could look like they’re thinking critically.
For what it’s worth, I also recall a study finding that depressed people tend to be more accurate in predicting future disasters than “normal” people.
The thing I’ve heard about is called “depressive realism” and can be summarized as “non-depressed people are unrealistically optimistic about some stuff while (mildly) depressed people aren’t.”
The theory is kinda controversial and the evidence is mixed. Personally I think there’s a grain of truth to this but it also seems likely that the loss of some of people’s normal biases is offset by a gain in new depression-related biases. Like on wikipedia they say “Although depressed individuals make accurate judgments about having no control in situations where they in fact have no control, this appraisal also carries over to situations where they do have control, suggesting that the depressed perspective is not more accurate overall”.
Well, there’s depressive realism; I haven’t seen any studies specifically relating depression and critical thinking, but one would assume that better predictive ability is a consequence of critical thinking.
Yes, this is what I was thinking of. It’s not exactly “critical thinking” skills, but it sure seems to be in the same general area.
If being “uninvolved…alienated” with other students* increases your critical thinking skills, then a lot of mental illnesses and disabilities should correlate positively with critical thinking, or at least the alienation should dampen the negative effects of those illnesses. I have actually heard from autistic people who at least implied that not being involved in social interactions leads to them behaving in a more rational manner. As someone who has had multiple psychologists consider me autistic, though no actual diagnosis came of it, the thought of how stupid people are in many ways occurs to me a lot. And autism does coincide with depression.
*Potentially also just other people.
I would think that there would be a big difference in what students are studying. math and hard sciences should improve critical thinking while social sciences might very well depress it. also, I wonder if past studies of critical thinking, say 30 years ago, showed more improvement? today’s colleges seem to be pushing groupthink, not critical think.
I’m always slightly confused at the disdain for social sciences here. All of the research cited in this essay is social science research, as is most of the research discussed in this blog.
I think a lot of the perception that colleges are less difficult now than they were stems from the fact that more people are attending now, which brings down the mean ability of college students even if comparing like-to-like wouldn’t find this effect. (enrollment increase citation)
Finally, I do agree that math would improve critical thinking skills as defined by the Watson-Glaser test, but probably not in the normal sense of being able to integrate research into a coherent whole, articulate and evaluate arguments, and apply theory to real-world situations. If anything, it would seem like social sciences, economics, and some liberal arts (history or politics for example) fit that concept best.
“I’m always slightly confused at the disdain for social sciences here. All of the research cited in this essay is social science research, as is most of the research discussed in this blog.”
I think there’s a perception that social science consists entirely of gender studies, the political parts of sociology, and the political parts of social psychology.
Don’t forget Literary Criticism, which AFAIK (and I could be wrong) is explicitly built upon post-modernism, which is the opposite of critical thinking.
Literary Criticism is the quintessential Humanity! (along with History).
I hope people aren’t confusing the humanities with the social sciences, but I could see it happening
I don’t think many people would include Literary Criticism in the category of “Social Sciences.” (Of course, I wouldn’t put 90% of what I understand to be gender studies into the Social Science bucket either.)
I thought Lit Crit was English?
Post-Modernism is basically thinking critically about the definition of Thinking Critically. It turns out if you go meta on your group epistemology, the conversation self-destructs.
Literary criticism is a field whose existence precedes postmodernism. Certain kinds of literary criticism can be called postmodern. There is a specific kind of postmodern literary criticism, called deconstructionism, which is based on the work of Derrida. Deconstructionism is often taken, and I think wrongly, as the paradigm of postmodern thought; it is also held in contempt by a lot of people.
Postmodernism is not uniform and has a lot of interesting thought under its umbrella. Postmodernism, defined broadly, is basically a rebellion against the notion of absolute metaphysical truth. Even thinkers like Quine, who most people would take to be hard and scientific (even scientistic) could arguably be called postmodern, in that they resist foundationalism (indeed, Quine in many ways anticipates Kuhn, a more solidly postmodern thinker). Quine would probably disavow the label, though – then again, so did Foucault.
Lightman, the problem with your definition of postmodernism is that the modernists already rebelled against the notion of absolute metaphysical truth. So that can’t be the distinguishing characteristic of postmodernism. Though it is true that the pattern of exaggerating, or outright lying about, the views of one’s predecessors in order to make one’s own views seem more novel and more of an obvious improvement is a very long tradition in scholarly circles, and perhaps is a substantial factor here.
because while good work gets done in the social sciences, a great deal of what gets done in the social sciences is not science, but scientism used to justify pre-existing preferences. To make matters worse, the scientism usually gets more attention than the science.
Do you have a good way to know what portion is not science? And if not, wouldn’t the fact that “scientism” gets more attention lead you to overestimate its frequency in the actual literature due to availability bias?
For students, it’s not that important what gets more attention by the public, but what is actually present in the lit, so your point wouldn’t apply to their education, which was the thing originally criticized by @rose.
>Do you have a good way to know what portion is not science?
No, which is why I abstained from stating any opinion about relative quantities.
>And if not, wouldn’t the fact that “scientism” gets more attention lead you to overestimate its frequency in the actual literature due to availability bias?
>For students, it’s not that important what gets more attention by the public, but what is actually present in the lit, so your point wouldn’t apply to their education, which was the thing originally criticized by @rose.
only if you assume that the scientism-ists don’t get to do any teaching, and they definitely do.
Sorry! That definitely came off as combative. I think what I am still taking issue with is your use of “great deal of what gets done”, which does imply some estimate of proportion. Furthermore, the fact that this is used as evidence for social science education being particularly bad means that you think not only that some “scientism-ists” teach, but that there are enough to significantly decrease the quality of the teaching as a whole.
I agree in effect direction, but the reaction to social sciences here seems to imply knowledge that the effect size is very large as well which is where my skepticism lies.
“Do you have a good way to know what portion is not science? ”
No, so it must be presumed non-science until proven otherwise. Basic science, that.
Well, the research discussed on this blog is social science research. And you might have noticed that most of it is worse than nothing. Let’s recall the most important thing we learned about the issue under discussion here:
Most studies on this issue are terrible because they lack control groups. That is, they measure students when they enter college, measure them again when they leave college, and find that their critical thinking ability has improved.
(And then they conclude that college is entirely and solely responsible for any change in that critical thinking ability.)
I don’t think it’s hard to explain the disdain for social sciences. I’m with cassander. The problem isn’t that studying social science questions is a bad idea; the problem is that nobody’s doing it, and instead we have a priest caste pretending their knowledge is real.
The problem is that people who end up studying sociology or psychology are usually the people who “hate maths” and who are also pretty bad at it. Hence you end up in a situation where most of the sociology and related fields research is done by people who do not understand statistics and who see it at best as a necessary evil. The people who do study and understand math are usually not that interested in those sociological questions, so they do not help either (insofar as they cooperate with other fields it is usually natural sciences, very rarely social sciences).
One solution might be to pay the math people to check the math and statistics in sociological journals (the usual reviewers do not understand it either) and discard the articles which are bad, leaving only those that use statistics well for the reviewers to go through. But there does not seem to be an incentive for anyone to do that. The journals are only read by sociologists who do not care that much about the maths being correct, and it would mean extra costs for the journal to set something like this up.
Interest and skills in math and interest in sociological questions are not mutually exclusive. There is no reason to assume you cannot be curious about both, or good at both (beyond opportunity cost).
Are you picking on sociology for a particular reason? The one PhD sociologist that I know well seems very statistically savvy and she says it’s common for her colleagues to end up leveraging their stats skills to go into data science.
@Chalid: In fact, I would expect sociology to be second best (economics would be my bet no. 1) among social sciences in terms of understanding of maths and statistics. So let’s forget about sociology in particular and substitute it with social sciences in general. This is because I have seen quite some evidence of poor quality of statistics among social scientists in general (both anecdotal evidence and actual data, i.e. meta-studies checking the quality of their statistical work), although I don’t think they differentiated between the fields, or at least I cannot recall that.
@andy: It is not mutually exclusive. But as it happens, in reality those who study social sciences are on average less interested/capable in maths. That is, you might get a few good social scientists who do their statistics well (in fact I would expect there definitely to be some), but they will be hard to find among so much, well, more or less noise. So one would want to reward these scientists, or punish those who do not care or cannot do statistics well, but that would require the reviewers at the respective journals to care about statistics and understand it, which obviously is not usually the case given what gets published.
Also, this is not a problem entirely exclusive to social science. But the quality goes more or less maths+physics+informatics (the applied varieties; theoretical maths or physics do not do statistics on data, obviously) > natural sciences + engineering > social sciences. Of the social sciences I would expect economics and sociology to be at the top, psychology at the bottom.
I might also have somewhat unrealistically high standards of what is good, given that I do probability theory and do not have to bother with data collection and such things. It might turn out difficult for non-skill related reasons (money, time); I have zero experience with doing that.
Failing to include a control group is not a math error. It is not because sociologists hate math. It is because they hate science.
It might turn out difficult for non-skill related reasons (money, time)
This. My (brief) experience in Psychological field work was that, with tremendous effort, an N of 50 might be obtained, of which at least 5 would have to be thrown out. The statistical flim-flammery is to a certain extent a work around for the fact that a representative sample to make generalizations with isn’t feasibly procured, but papers need to be written anyway.
In my experience people in natural sciences and engineering tend not to be that great at statistics. If you’re doing experiments it’s almost always better to just gather more data rather than learn a bunch of statistical techniques that your reviewers won’t believe anyway.
Fields where you can’t run your own experiments, or where they are very expensive/difficult, are where you get real statistical expertise.
Douglas Knight: I would say it is a math error. If you actually understand statistics you know that this cannot work. I doubt that they “hate science” (whatever that means), but probably they do not realize the importance of having a control group and that is because they often see statistics as a magical box where you put some numbers related to your hypothesis and it tells you whether you are right or wrong.
But I don’t think it makes sense to try to turn social scientists into theoretical statisticians. I just think there should be more division of labour. The social scientists should know their theories and make hypotheses, but leave the statistical work to hired statisticians. Every university department has its server administrator, a secretary, and other people who do the stuff the faculty either cannot do or would waste their time doing. I think that professional econometricians and statisticians would be a sensible addition to all social science departments, letting the social scientists do what they understand well and allowing those who actually understand statistics to do that.
If, say, I wanted to do some numerics or programming other than a few basic for loops and if statements (which is usually enough for my purposes), I would ask someone from the numerics department to help me; I do not try to do that stuff on my own, because I know that I know very little about the subject and would either do it wrong or spend a lot of time struggling to do it right, time which I could spend more efficiently. Of course, I should still probably get a basic knowledge of programming, as a social scientist should have a basic grasp of statistics, but I still let the professionals help me with that.
No, Tibor, the problem is that they are not trying to distinguish hypotheses, but trying to measure parameters under the assumption that their hypothesis is correct. If their hypothesis were correct, it would be a perfectly fine way to measure, not a statistical error.
Social science professors do not need expert consultants. They routinely exhibit lack of understanding of Stats 101 level of experimental design, probably because there is strong selective pressure on them not to understand it.
The studies Scott cites that I’ve read are all very very aware of the problem of not having a suitable control group. Most of them don’t have this as their only result, but rather as one of a group of metrics measured. I think you are too quick to assume malice or incompetence where many alternative hypotheses are available. Social science isn’t all bad, much of it is simply really, really difficult to set up.
Also, what do you mean by “strong selective pressure on them not to understand it?” Given that many of the most famous (in the field, not in popular press) social science professors are those who develop new, powerful, experimental designs or statistical methods, this seems like an odd claim. If anything, I would support @Chalid’s claim that social scientists (here I mostly mean psychology & econ–I don’t know about others) tend to have a better grasp on statistical methods as their research depends more on precise statistics and reading the literature requires staying current on new and complex methods.
PSJ, really, did Scott cite a study without a control group? His text claims that he didn’t. He cites McMillan, which is a meta-analysis of idiotic studies. That is not the same thing!
Including non-students as a control is not “really, really difficult to set up.” People do studies of non-students all the time.
On the other hand, some other studies find less impressive effect sizes. Arum and Roksa recently wrote a book on this kind of thing, Academically Adrift, and they find that two years of college (start of freshman to end of sophomore) only increases critical thinking by 0.18 SD. This is weird because it’s twice as much college as the past few studies but less than half the effect size. Also, it doesn’t seem to be controlled, so this is the sum of actual-college-effect-size and confounder-effect-size.
Somehow I’m skeptical that’s what you were talking about in your previous comment, the ones that (1) you have read and (2) are aware of their lack of control groups. Probably because they don’t exist.
But…both Academically Adrift and Pascarella’s conceptual replication of it don’t have control groups and cite that as a major flaw…how does that not fit the criteria above?
If you read my other comments below that I made long before this one, I talk about both of these papers. Separately I say it does seem that Pascarella (1994) and Pascarella (2005) are the only studies with good non-college controls. So you know that 1) I have read them and 2) I know they don’t have controls.
I haven’t read the whole book, but I don’t see where it talks about lack of a control group at all, let alone as a major flaw. Feel free to quote a passage.
I have no idea what you mean by “Pascarella’s conceptual replication.” Pascarella came first and did have control groups.
http://www.tandfonline.com/doi/full/10.1080/00091383.2011.568898#.Vl4wAN-rSCQ – a replication which says: “Simply put, one cannot validly use an average gain score during college as an accurate estimate of the value-added effect of college.”
Although you are correct that Arum and Roksa don’t see the lack of non-college controls as a major flaw (at least in the review in which I searched for quotes), they defend this by saying: “Instead of estimating value-added models, we aim to understand the relationship between specific students’ characteristics and experiences and their growth on the CLA over time, and we use a longitudinal design to do so.” Apologies for the error. But since they don’t make claims about the value added by college, but rather about institutional, demographic, and other background differences in results, this seems like a correct defense, demonstrating solid awareness of the actual claims that may be made from their research. (source)
I apologize for this error as well as for misreferring to Arum & Roksa’s corpus simply as Academically Adrift. I have not read the book, only the papers the book is based on.
And this is the part of social science that can’t run RCTs because it’s impossible or illegal! To say that they hate science because measurement with correlational studies and natural experiments is extremely difficult again seems like ascribing malice and incompetence where better explanations are available. The phenomenon you are seeing is that doing and precisely interpreting this part of social science is difficult, not that the researchers are incompetent.
Research I choose to complain about is a highly selected subset of all research. I see a lot of studies where I say “Oh, that seems right” and don’t write a lengthy blog post about how angry I am about how wrong it is.
> I’m always slightly confused at the disdain for social sciences here. All of the research cited in this essay is social science research, as is most of the research discussed in this blog.
I agree that there’s a lot of unfounded disdain for the social sciences, but I don’t think that’s a very good counterexample. The research cited in this essay is pretty useless. At a high level, I don’t have any better understanding of the phenomenon of critical thinking for having read this essay. How would we expect things to behave across time, across culture, under changes in any of hundreds of variables (geography, major, ethnicity, SES, peer group)? In order to do anything useful, you need an understanding that can answer questions like this. Anyone can perform some measurements and call it a methodology, but a “research program” that is just measuring and not modeling is not really producing anything worthy of the name “science”.
Hostility toward anything that isn’t STEM is bread and butter for the grey tribe.
Which would be bad enough on its own, but when compounded with an obsession over, and propensity for wild speculation about, the very subjects they disdain it’s insufferable.
This is something I’ve seen before – the idea that hostility towards the social sciences/humanities is a STEM thing – but I am as humanities as they come and I have serious issues with a lot of what comes out of the social sciences.
Too much of what I saw in university under the social sciences umbrella was like the humanities, but with less humility, worse fact-checking, and more jargon, plus a tendency to do “sciencey” things in contexts where the scientific method couldn’t really be applied.
I’m with you there – I too am on the humanities side, and yet while word-processing someone’s dissertation for their Masters in Education (Guidance and Counselling), I was eyerolling and exclaiming out loud “This is a heap of tripe!” at the competing and mutually contradictory theories from the social and psychological sciences while typing, footnoting and endnoting, citing, referencing, bibliographing and figuring out how the hell to use the Harvard Referencing System/Reference Style which I’d never even heard of, much less used, before.
I did gussy-up some very pretty graphs out of the data, though, for the candidate’s requisite “middle section with graphs and charts and annotations to prove I did Real Science really scientific research” 🙂
A lot of what I read in the social sciences sort of seemed to be like science with bits missing. Like “here are the facts, here are possible explanations, obviously the possible explanation that does the job best is the one that fits into our theory, didn’t we do a great job?”
Sometimes they’d save time and only present the explanation that fit with the theory. And it wasn’t guaranteed that the facts would be correct, either. And there’s “case study” used as a synonym for “anecdote”.
A lot of what I’ve seen just sort of coalesces into a cargo cult of jargon, occasional charts, and ad hoc hypotheses that ends up resembling theology more than science.
The Grey Tribe feels under attack. Some nasty anti-STEM agitprop even includes the threat of being mugged:
You should have posted this in the last thread! It would have been fun to go to their meeting.
That sounds like San Francisco, all right. I think my favorite part is how the article can’t believe that tech workers actually feel threatened by threatening posters. Because privilege, or something.
That said, I don’t think this shows that the Gray Tribe is seriously threatened in a broader context. This town’s just crazy.
Nornagest, you live in SF too? Do you know of a safe way to meet Gray Tribe people around here?
I don’t live in the City at the moment, but I’ve spent a lot of time there. The best way I know to meet Gray Tribe people is to go to events or venues that cater directly to them. To name a couple, Noisebridge is the oldest hackerspace in the city, and it used to host a lot of events; The Interval in Fort Mason is run by the Long Now guys and makes some of the best cocktails I’ve had. There’s a bunch of others depending on your interests.
If by safe you mean physically safe, I think your chances of getting assaulted by a “queer direct action collective” are fairly low.
Thanks. By safe I meant “won’t jeopardize my livelihood by exposing me as a non-blue-triber”. You saw what happened to Brendan Eich.
If you don’t have Brendan Eich’s high profile, then the only way you’ll find yourself on the sharp end of the Two Minutes Hate without going out of your way to invite it is extremely bad luck.
There are about two hundred and fifty million people to Brendan Eich’s right in the States. Even in the greater Bay Area there are a few million. If you don’t worry about being struck by lightning, you shouldn’t worry about this.
>I’m always slightly confused at the disdain for social sciences here.
Last social science course I took, the standard for true was “can you provide a justification?” So, for instance, I could say “I tossed my tribe’s divination sticks, and now creationism is true.” Also, I was in the minority in insisting that intelligent design was pseudoscience*. And I wasn’t in the Bible Belt so much as Harvard.
*More precisely, the class dealt with religion, and one presentation gave fairness-biased coverage to evolution and ID, as if either explanation were plausible. I was literally the only one in the class who objected to this, although I’m unsure what portion of the class believed ID was or wasn’t pseudoscience.
“Finally, I do agree that math would improve critical thinking skills as defined by the Watson-Glaser test, but probably not in the normal sense of being able to integrate research into a coherent whole, articulate and evaluate arguments, and apply theory to real-world situations. If anything, it would seem like social sciences, economics, and some liberal arts (history or politics for example) fit that concept best.”
Mostly because when you are educated in another science, it becomes painfully obvious that the way they integrate research into a coherent whole often ends up being tangential, a sweeping exaggeration, or even the opposite of what the original studies found.
They are not applying theory to real-world situations; they are using a little theory as a starting point for ideology-building and extrapolating theoretical results way beyond what the original theory proves.
The “we took 30 students at a single college and measured them before and after” study being used as a starting point for sweeping conclusions about the effects of college in general is all too common. So is the “study found a 4.3% difference between male and female preferences, therefore women in general x and men in general y” pattern – and as much from anti-feminist supposedly-scientific thinkers as from radfems.
This has more to do with the fact that only grand general conclusions lead to healthy paychecks, while funding for the research only covers about an afternoon’s worth of actual subject contact. I’ve spent almost three months on one project where establishing what we were measuring, how those measurement metrics compared to previous work, and finding a venue that would allow us to collect data took almost nine weeks; analyzing the data took about one more; and we spent less than two weeks (I think four business days total) actually collecting any data!
It’s Soviet-style science: you’re told to come up with a million potatoes, so you /say/ you have a million potatoes, but what you really have is one hundred potatoes in a box and ten thousand empty boxes labeled “potatoes”.
Yeah, they tried to find that, but it didn’t work.
I’m suspicious of all of this research, so I don’t think that proves there’s not a difference, but it certainly didn’t show up.
Pascarella’s replication of Academically Adrift, “How Robust Are the Findings of Academically Adrift?”, closely replicates its findings and – along with Pascarella (2005) and the Arum/Roksa work – gives evidence that the modern score changes are smaller than those in the late 80s. Hours spent doing schoolwork also decreased over that time period.
Although I guess @rose wouldn’t accept this finding anyway, what with it being social science.
Did hours spent doing schoolwork really decrease? All the comparisons I have seen took only one selected kind of work (usually pages read) over that time period and ignored everything else. They would ignore time spent writing, time spent doing exercises, projects for CS classes, etc.
The other point is that it is possible that high schools got better – especially the best high schools, which produce elite students, are probably much better than they used to be back then. One would expect less college impact on a student who graduated from a better high school.
Anecdotally, reading Feynman’s memoirs did not give me the impression of a constantly overworked student.
It is possible that Richard Feynman did not need to put as much effort into his studies as the average student. And that when he did it didn’t feel so much like work to him. So I’m not sure we can infer much from the fact that his memoirs don’t give the impression of overwork.
@g This is what I found about Feynman in high school: “Although his grades in mathematics and science were outstanding, he had performed much less well in other subjects”. The other thing I found was about his college notes being full of grammatical errors and misspellings.
Yes, it is very likely that Feynman needed less work to achieve the same grade in math. There is no reason to think he would have performed as well or as easily in the humanities and other non-math-or-physics classes required to be well rounded.
Incidentally, the “students are studying less now than before” comparisons I have seen tend to focus on work generated more by the humanities – the amount of pages to be read. I have not seen a comparison that would focus on math or physics difficulty.
“I would think that there would be a big difference in what students are studying. math and hard sciences should improve critical thinking while social sciences might very well depress it.”
That seems completely counterintuitive to me – it’s the hard science subjects that tend to present ready-made solutions (with the exception of some areas of maths and physics).
On the other hand, the hard sciences teach you a rather important piece of the critical thinking puzzle: check your theory against the real world or GTFO.
In principle true; in practice it’s much easier for a 15-year-old to read some books, do some interviews, and reality-check Plato than to build a supercollider and reality-check the Higgs boson.
They vary. Engineering is the most just-look-it-up, which seems to lead to some problems (http://rationalwiki.org/wiki/Engineers_and_woo).
Critical thinking is roughly equivalent to considering the world as hostile or unreliable. It is developed by encountering (and noticing) disagreement and contradictory information.
What disagreement or unreliable information do you encounter in math or hard sciences?
I did an undergraduate computer science degree with a lot of mathematics and physics courses, and I do not recall much in the way of misinformation in the course material. There was a first-year stats course that was somewhat relevant, and a fourth-year computer science essay-writing course that was also relevant, but the bulk of the course material focused on problem solving with trustworthy information rather than critical thinking.
I would expect social sciences and humanities courses to develop critical thinking to a greater degree.
As someone with a physics degree, my personal experience was that while critical thinking was often displayed in coursework and by professors, the only classwork that actually demanded critical thinking was the pure lab class that had us obtaining actual experimental results and drawing conclusions (including attacking our own results to identify aspects we didn’t control properly and potential sources of bias and error, and using this to estimate about how reliable the results were).
Mathematical thinking has the advantage of having a strict objective test of success, but tends to be more specialized: in both proofs and applied mathematics, the bulk of the effort goes into showing that a set of premises entails a specific conclusion, i.e. deduction with little or no inductive logic. I think you need messier subject matter together with an objective measure of success to demand the full range of critical thinking tools.
But I’m not sure that ‘Watson-Glaser’ captures most peoples’ idea of ‘critical reasoning skills’. WG seems more a test of aptitude for logical inference – I would imagine closely correlating with IQ and I don’t think many people are claiming that college increases IQ.
I don’t personally like the fuzzy concept ‘critical thinking’ (no doubt it includes a lot of politically correct indoctrination); I would just say that a liberal arts education probably enhances (somewhat) well-defined skill sets such as: synthesizing relevant information and writing a lucid, grammatical, well-structured summary of it; succinct, well-organized public speaking; comparing and contrasting bodies of knowledge and theory (e.g. a ‘libertarian’ versus a ‘socialist’ view), which is NOT the same kind of thing Watson-Glaser is testing; basic research skills; evaluating the pros and cons of different ‘approaches’ to social problems and texts; etc. Many of these skills translate to work environments.
Of course there is also, perhaps more importantly, the addition of basic knowledge about the world: e.g. what is ‘free trade’? What is ISIS? What is a cost-benefit analysis?
At a higher level in the best students there are also intangibles that might be gained – a kind of critical imagination which can see meta connections between diverse bodies of thought and knowledge (linking medieval history to current civil wars for example) – and I’m pretty sure Watson-Glaser doesn’t test that!
> no doubt it includes a lot of politically correct indoctrination
When I first read this, I was a little baffled, since critical thinking and political correctness seemed to be entirely unrelated (or even, to a degree, opposed). I’m aware that critical social theory theoretically supports critical thinking, but political correctness always seemed to be a misapplication of the fundamentally good idea of thinking critically about the status quo. But then I decided to actually look up attempts to formally define “critical thinking”, and one of the ones included on Wikipedia includes:
“in critical social theory, it is the commitment to the social and political practice of participatory democracy;”
That’s obviously just using the term in a different sense: i.e. thinking in the manner of the critical school.
Sure, but if it’s one of the accepted definitions, then any instance of the term’s usage has some probability of using that definition. That’s what I was expressing surprise at, since it was new to me. Most of the other definitions (or at least the ones I was familiar with) are all essentially expressing different ways to reach: “challenge your assumptions and be aware of status quo bias”.
Most of the time that I come across “critical thinking,” it cashes out as “ability to criticize,” and comes without any sense of what it means to survive criticism, or how to pick topics to criticize effectively. (At best, it’s a bland “criticize everything!”).
The predictable result is that people become more entrenched in whatever views they already hold, because of selective demands for rigor. They’ll share a meme they agree with without a second thought, but now have many more tools for dismissing memes they disagree with.
The distinction the Critical School makes is between “traditional” thinking and “critical” thinking, critical in the sense of criticizing tradition. Founded by Germans in the 50’s (i.e. people who had probably met Germany’s last king) the Critical School grouped participatory democracy on the “non-traditional” side of the divide.
Do you mean the Frankfurt School? That was founded in the 20s, which better fits meeting Wilhelm. But I rather doubt that they did.
*I* have always assumed that critical thinking was carefully examining a subject with an attempt to remove your own biases, or at least acknowledge your biases and attempt to consider other perspectives and other points of view.
No one told me that was what it was, I just thought about it. Critically.
I don’t know that Universities ever were good at teaching critical thinking, but I can say that when I started at the first college I went to, there were teachers who encouraged thought and analysis (oddly the first one that comes to mind was a Lesbian English Instructor. At least she said she was playing for the other team when I hit on her later 🙂 ) There were also Professors there who made it pretty clear that there was the right way, and the wrong way to look at a problem and the best way to figure out which was which was to look at what they thought and what sort of grade you wanted.
“*I* have always assumed that critical thinking was carefully examining a subject with an attempt to remove your own biases, or at least acknowledge your biases and attempt to consider other perspectives and other points of view. ”
That is exactly the line with which multiculturalism and PC are touted.
I don’t know where to put this comment, since it could go lots of places in this comment section, but: after spending ten years in the University system getting multiple degrees in Engineering, one of the only professors who actually challenged the clarity of my thinking was an English professor in a Women’s Literature class. I remember being absolutely floored that she had correctly identified my lazy thinking in a paper, because I was so accustomed to being able to just gloss over details and trust that nobody would read my writing closely enough to notice the underlying issues. It was perhaps the only time that I actually felt grateful for having points deducted.
According to the Deloitte website, which will administer online critical thinking tests to applicants at a certain stage in the job application process:
I actually took Deloitte’s and KPMG’s online tests on behalf of an ex-girlfriend this fall.
It’s just a shorter dumbed down version of the GRE or SAT, with the difference that on one of them you couldn’t navigate backwards to recheck questions you already answered. My guess is that it’s primarily a weed-out tool to keep the group interview sessions small, and that the important evaluations mostly take place in the one-on-one interviews.
I am not sure why you think weed-out tools do not reflect important evaluations.
Perhaps alternatively, students who gain critical thinking skills faster than normal are more likely to want to stay in college? I only skimmed the first Pascarella study, but unless I see a study in which people properly form large random sample groups and deny a college education to half of them, I’m going to suspect alternate causation.
> Perhaps alternatively, students who gain critical thinking skills faster than normal are
> more likely to want to stay in college?
Depends on who’s paying the bills, the state of the economy and their major.
If Mom and Dad (or Uncle Sam) is floating the cash, the economy is poor and they REALLY like philosophy classes, then yeah.
If they are paying their own way, couldn’t get into an EE degree but know a little perl and this startup in The Valley is hiring…
If all you know is just a little perl, you’re not good enough to get into EE, and you have just one friend in that startup in The Valley who can hire you, then leaving the university is not really proof of superior critical thinking.
It might get you more money short-term, but once the boom ends you are someone who was not good enough to get into EE and did a bunch of perl scripts in a stream of failed startups (the overwhelming majority of startups fail).
It is different if you are confident in your skills overall, so you think any other startup or bigger company will hire you after this one fails, or if you trust your business skills and entrepreneurial sense enough to run your own. In which case you are likely to know more than just a little perl (either technically or by having that knack for business and organization), and I fully agree that for those people college is likely to be a waste of time.
I don’t think there were too many dropouts from the study population.
Well, I suppose it is quite difficult to find a large cohort of college aged individuals amenable to taking tests periodically and sympathetic to advancing academic knowledge.
The major study has n=1860. The first was almost certainly a pilot study or proof-of-concept. It was a conference paper, not a journal one.
Remember that you need a large matched control group for it to be meaningful.
Scott – you have Facebook etc. share buttons. Posting your posts includes a preview which is just the first few lines, which in this case is “epistemic status…”
It would probably improve your reach to friends of readers if you included a two line abstract at the start of each post.
Or, instead of modifying the post, do this.
I’m curious as to what you think of the Israeli/European model of college (generally three-year degrees involving almost exclusively subject-specific courses; to name an extreme example, I’ve never taken a college course that wasn’t in math/CS), as opposed to the American model. (I prefer the Israeli model, but I’m incredibly biased, both because I took it and because I personally hate doing humanities courses.)
Also, does anyone have a link to somewhere you can take the WG test online? This is the first I’ve heard of it and I’m kinda curious.
Not Scott, but personally I would have loved this. I went to school in Canada and hated just about every second of the humanities courses I had to take, despite doing pretty well in them. (Actually, I did like philosophy. I guess that’s humanities.) In the required English courses, I actually had to re-read a number of books I’d already read in high school. It was largely a waste of time and money.
Also went to school in Canada. Also hated that I was forced to take 2 humanities classes in my first year alone. Hated the unnecessary math courses more (was a CS major), but I was super pissed anyway.
Speaking of Israel, it would be interesting to see if the critical thinking skills of Israelis doing army service improve between the ages of 18-20. (In the US it would be very difficult to find a group of intelligent, motivated 18-20 year olds who are not in college!)
(In the US it would be very difficult to find a group of intelligent, motivated 18-20 year olds who are not in college!)
Your bias is showing. There are many such young people to be found in the branches of the US military.
There is something of a stereotype that people (at least in the US) go into the military when they don’t have other options.
I would be very curious to see what kind of W-G scores 18-20-year-olds in the military (US or otherwise) might get compared to people who went to college. If they show a similar increase, it’s probably time to reevaluate the idea that there’s something about college specifically that improves critical thinking.
I don’t think it’s exactly that universal of a stereotype in America – I myself have found that only among Blue tribe people who don’t value military service to begin with.
A better experiment design might test college students, military members, and persons in jail. That way we could more closely identify the degree of change over time, vs the effect of the environment.
How much similarity do you think there is between the “enter military after HS” cohort and the “enter college after HS” cohort? How much difference?
I would expect to find substantial differences between the cohorts.
At the same time, I’d expect 4 years in the military to have a much more profound impact on somebody’s personality/skill set/ what have you than 4 years at the average U.
You also have to be careful because the military strongly selects by asvab. If you do your study at, say, Fort Gordon, your random 18-20 y/o research subjects are almost certainly going to be in the signal corps. If you do it at Ft Benning, the infantry. There are significantly different selection criteria, and development plans.
I second keranih’s suggestion, and would be tremendously amused if the results ended up being that jail is better than the military is better than college for developing critical thinking skills.
(Which it easily could be just because the lower down you start on a scale, the easier it is to improve.)
Speaking of biases showing … who is being served? We aren’t talking about a waiter job where you bring people their food, we are talking about a job that involves killing people and destroying things. Service is exactly the wrong word.
I, for one, greatly value the service of having my enemies killed and their stuff destroyed. Hearing the lamentations of their women is optional.
@anonymous: It sounds like you think political legitimacy is a lie, which is a far more cynical and nihilistic viewpoint than the one you’re criticizing.
If there is such a thing as legitimacy, then there is such a thing as the legitimate use of force.
I never could understand those who manage to decry the military as stupid, brutal, and wicked while turning around and encouraging our best and brightest not to join. (I’m thinking, for example, of the “die-ins” students would conduct to block the sidewalks approaching my university’s ROTC building.)
I don’t think it’s controversial to point out that, however you feel about state violence, the military isn’t going away. With that as a given, why wouldn’t you want the very finest minds and hearts running it? When you make the military attractive only to the stupid and the wicked, you can’t be surprised when your military is stupid and wicked.
Service is exactly the wrong word.
Am I really going to have to quote Kipling at you?
Obligatory disclaimer of bias: my late father was a sergeant in the Irish army, so while I don’t have any of the military-worship some flagwavers exhibit, I also don’t think people in the armed forces are all dumb brutes chomping at the bit for the chance to shoot furriners.
My father served with the Irish UN deployment in the Congo (he liked the native people, wasn’t so thrilled about the Belgian settlers) and Cyprus (where he had discussions about the similarity of Irish and Cypriot traditional folk music with locals) and on the border in Ireland when things were heating up during the Troubles.
If I’m not a racist, if I have any thing in me that values honour, human dignity, honesty, duty and truth, it’s down to my ignorant, uneducated, poor (as in ‘rural labouring class’ poor) father who had to leave school at the age of twelve and go working to help support his family, who joined the army out of no other opportunities, and who probably picked up at least a taste of PTSD from his service both overseas and in his domestic military career (though there was no recognition of such things in his day, or for a long time afterwards; even now, I’m not too sure how much care the Irish Defence Forces take of personnel when it comes to mental health) and what he taught me and how he talked and behaved (always praising the native Congolese people and not extrapolating from ‘forces shooting at us’ to ‘all of them are killers’, talking about the Greeks and the Turks and the Cypriots as humans ‘just like us’ and not ‘the bad guys’).
And yes, I consider he did render service, to his country and to the global community as part of his UN service (which he never talked about in detail and which he didn’t seem to be much impressed by – that’s all part of what I suspect was the PTSD).
As I said, I’m not a fan of the rah-rah, flag-waving, our glorious military cheerleading (partly because Ireland doesn’t have a glorious military; we get all our kit second-, third-, and fourth-hand from other nations once they’ve finished with it; also, there is a strong tradition of neutrality in this country, though our recent governments have been doing their damnedest since at least the 90s to stealth-undermine this and get us tied up with EU and NATO military response forces).
I don’t know if you’ve ever been in a foxhole with bullets coming right at you, people trying to kill you, and fellow-soldiers getting killed on either side (that was my father in the Congo). But unless and until you have, please refrain from snippy little comments about “(W)ho is being served? We aren’t talking about a waiter job where you bring people their food, we are talking about a job that involves killing people and destroying things. Service is exactly the wrong word” in regard to the poor bloody infantry and other branches.
It’s humorous to me how all the same nonsense that’s derided when it is done by college students on the left comes out in spades when it comes to the military.
“You weren’t there, man, you don’t know” – doesn’t that sound an awful lot like “you have male privilege, you don’t understand the lived experiences of women”?
Speaking of biases showing … who is being served? We aren’t talking about a waiter job where you bring people their food, we are talking about a job that involves killing people and destroying things. Service is exactly the wrong word.
First off, I find it quite ironic that a textbook example of the sort of anti-military bias we are talking about starts off with a snide, “speaking of biases showing.” Be careful about projection, okay?
That said, there are several perfectly good responses to this. First, “military service” is simply the term that is used to describe being in the military. It’s theoretically possible that sometime, centuries ago, some bias was present that led to the development of this term. Currently, however, it’s not an indication of individual bias so much as it is simply an example of using the English language. Are you trying to argue that speaking English is biased against speakers of all the other languages out there, or something?
Second, military service is a profession which, more than any other I can think of, is characterized by obedience to orders. If obedience to orders cannot be called “service,” I don’t know what can. So your question has a very simple, direct answer: They are serving the government of whichever country they belong to. Now, you may wish to argue that most governments do not actually serve their people, but that’s an entirely separate argument. To say that individual members of the military are doing anything other than “service” requires a ridiculous definition of the word.
Third, it seems fairly obvious that your statement is meant to imply that you feel military action can serve no positive good. Right now I’m torn between a snide, “all of history would beg to differ,” and an even more snide “keep telling yourself that when someone with a stronger military bashes your head in or enslaves you,” but either of those arguments would require far more time and effort than I am willing to put in right now to make at anything higher than a meme-level, and have valid counter-arguments which I similarly don’t have time to bother rebutting, so for now I think we’re just going to have to agree to disagree on that point.
Two arguments may be structurally similar and yet unequally valid. For example, in this instance, nearly all men have plenty of experience interacting with women and observing a variety of male-female interactions daily, so they can legitimately be said to have a basis for evaluating feminist claims about a typical woman’s lived experience. (Inasmuch as their experience differs from the typical, so might any given woman’s.) In contrast, large portions of society have next to no secondhand experience with the military or its members’ daily lives and jobs.
This is not to say there is no knowledge to judge on, just that the claims are not symmetrical.
Also, many people criticizing colleges are coming out of colleges. Likewise, many criticizing the military (at a fundamental level) are coming out of … colleges.
How much similarity do you think there is between the “enter military after HS” cohort and the “enter college after HS” cohort? How much difference?
I would expect to find substantial differences between the cohorts.
This is purely anecdotal, of course, but in my experience, having gone to college after high school for a year, and then joined the military instead, I didn’t see much difference between the majority of the people who made either choice.
There may be some major differences between the people who excel in a military environment and the people who excel in an academic environment, and there also may be some major differences if we start taking the most prestigious colleges and universities into consideration, but the set of “all people who go to college after high school?” I don’t think they’re really very different at all.
In a lot of places, “you will go to college after high school” is such a foregone conclusion that treating all the people who do so as a homogenous group, and expecting to glean very much useful information out of it, is a foolish endeavor. And in places where, either culturally or economically, going to college isn’t such a foregone conclusion, I’ve found that both going to college and joining the military require a similar level of motivation to disrupt your status quo.
Are you talking enlisted or officers?
And then are you talking about enlisted who take advantage of GI bill to pay for college after enlistment, college students who pay for college using ROTC, service academy cadets, or normal college students who go to OCS afterwards?
Presumably the OP is talking about enlisted, since you’re not going to find any officers in the 18-20 age range, and even if one existed, comparing them to a college freshman would be a pointless endeavor, since they will already have a degree, and will have already been college freshmen.
Comparing them to servicemembers who used the GI bill after getting out would be similarly fruitless, since they also won’t be in the 18-20 age range. Comparing them to enlisted servicemembers who are using tuition assistance to take college courses while still on active duty might complicate matters, but as I said in a different comment below, that won’t be an issue until they’ve been in for at least two years.
The active duty US military is 85% male. US colleges are 55% female. That is a monumental difference. Even if we were to restrict the military cohort to the same age range as the college cohort, I think we would be extremely unlikely to reach parity.
I’m fairly certain we would find other large differences in the cohorts.
Well, I did say that observation was purely anecdotal, didn’t I?
I never intended to imply there would be no differences. As you point out, gender distribution is definitely one, and there will doubtlessly be plenty of others. However, from what I’ve seen, “level of intelligence and motivation,” which is the subject we were discussing here, is not one of them.
Either way, whatever differences there may be, I think the two populations are similar enough to be worth comparing. Will there be confounding elements that need to be worked out, such as gender disparity? Of course, but there’s confounders that need to be worked out in any study. I’d still say it would be worth doing.
Your bias is showing. There are many such [intelligent, motivated 18-20 year old] people to be found in the branches of the US military.
There are many intelligent, motivated 22-24 year old people in the branches of the US military. At 18-20, a significant fraction of them are still in school, some at the service academies but most in the ROTC programs of ordinary colleges and universities. Not nearly all of them, but enough to complicate any plan to design a study around the intelligent, motivated 18-20 year old military population.
This may be true for officers, but among the enlisted personnel I don’t think you would have too much difficulty finding intelligent, motivated 18-20 year olds who are not also taking college courses, and enlisted personnel are the ones that are normally subject to the unfortunate stereotype described above.
Now, this might be confounded by however long the study lasts. If it covers a full four year period, it will be harder to find highly motivated and intelligent enlisted guys who don’t want to take advantage of tuition assistance before their first enlistment is over. But if you’re limiting the study to the first two years, like the ones described in Scott’s post (although those might also run afoul of the issues Scott describes above) I don’t think you’d have any difficulty whatsoever finding intelligent, motivated enlisted personnel, who also aren’t in college, to participate.
The complication is that “officers” and “enlisted” are not arbitrary categorizations. There is almost certainly a correlation between a would-be soldier’s intelligence and motivation, and their decision to become an officer rather than an enlisted man. Certainly at the low end where the “not too bright, kind of lazy, but marginally acceptable” bar for the enlisted ranks is in a different place than for the officers.
If you know what that correlation is, you could account for it in a hypothetical comparison of young enlisted soldiers vs. college students, but I don’t think we can put more than vague handwavy numbers on it at this point.
There is almost certainly a correlation between a would-be soldier’s intelligence and motivation, and their decision to become an officer rather than an enlisted man.
Less merit than money, I think, at the 18-20 year-old level. I am told that in a properly functioning military, the brighter ones would be identified fairly early on and encouraged to transition to the officer ranks.
This is true, but it’s also irrelevant, since as you said above, anyone in the 18-20 age range who is planning to become an officer will not be an officer yet, and will not be subject to this study.
There’s really nothing to confound this study. Any study of 18-20 year old servicemembers will, by definition, only include enlisted personnel, because people in ROTC programs and the service academies are not yet part of the US military, even if they are planning on being so eventually.
Additionally, you may have a point about a correlation between a decision to go officer vs. enlisted and the subject’s intelligence and motivation, but I don’t think it would be quite so clear a correlation as you are assuming. Having your degree paid for followed by at least 4 years of service, and 4 years of service followed by your degree being paid for, are both similarly rational decisions, and depending on your economic and social standing at the time, there are valid reasons to make either choice.
It’s true that officers get paid more, but enlisted personnel also have a chance to start saving and investing far earlier than a broke college kid in an ROTC program, and the sooner you get started on that the better. Plus, depending on their MOS, enlisted personnel may also have the advantage of learning a trade that can get them a well-paying job as soon as they get out. Additionally, if they are able to effectively utilize tuition assistance while they are still in, the money from the GI bill can be put toward graduate programs and industry certifications.
Being an officer may be a cushier job with more benefits, of course, but that also fails to take into consideration people who specifically want to be in a combat-related MOS, and those people would definitely be better served going enlisted. Such people definitely exist, and while you might not be able to understand their motivations, they shouldn’t be discounted as unintelligent or unmotivated. (Though, to be fair, they also forego the “learning a useful trade” advantage I described above, unless they’re planning to go into law enforcement, security, or mercenary work.)
That isn’t to say that going officer isn’t likely to correlate to higher intelligence and motivation, mind you, just that I wouldn’t say the correlation is especially clear. Either way, it’s safe to argue that there are a significant number of highly intelligent and motivated 18-20 year old enlisted personnel out there. Sure, we might only be able to give handwavy numbers at this point, but isn’t “we can only give handwavy numbers at this point” sort of the whole reason to do studies like the one being proposed in the first place?
Edit: Keranih kinda’ beat me to the punch here while I was typing up my reply, but yeah, transitioning to officer from enlisted is always an option, too. There are plenty of reasons to stay on the enlisted side, however, such as simply enjoying what you do, preferring to work more directly with your troops, not enjoying the more political aspects of advancement in the officer ranks, and simply coming to see the enlisted side as your tribe, just to name a few. I mostly glossed over that, though, since the study being proposed would only cover people in their first enlistment, so discussing the relative merits of going officer vs. being a SNCO are kind of beside the point.
Bloodthirsty isn’t terribly difficult to understand. It’s certainly difficult to respect, but that’s something else entirely.
I think you guys are also forgetting about special programs. Medics, Divers, Aircrew, etc… are all pulled from the enlisted ranks and a nugget who scores high on their ASVAB or otherwise shows promise in Basic will probably get invited to try out.
Anecdotally, a good number of kids join up with the specific intent of getting into such a program, so that could also be a confounder.
Bloodthirsty isn’t terribly difficult to understand. It’s certainly difficult to respect, but that’s something else entirely.
Thank you for demonstrating your unparalleled ability to understand the point of view of people you don’t agree with, here. It’s quite refreshing!
Yeah, that’s part of what I was going for when I talked about people going enlisted to learn a useful trade.
With a few notable exceptions, such as pilots, every specialized job in the military is on the enlisted side. Other than whatever they got their degree in, of course, the only transferable skill most officers gain from their time in is management and leadership experience, (and considering some of the officers I’ve worked with, even that is questionable :op ) and you can get that on the enlisted side as well, albeit with a somewhat less big-picture perspective.
When I was considering joining the military after college, I’d almost certainly have tried to become an infantry officer. You’d get actual leadership experience after 4 years, whereas for other jobs, you’re pretty much some higher-ranking officer’s secretary.
The most closely matched of them are likely at the Air Force Academy or the Naval Academy or West Point.
I’m not sure if this is true. OP said “in college,” not “extremely selective universities.” It is remarkably tough to get into the service academies, especially as compared to, say, Hopkinsville Community College or Mississippi Valley State University.
Moreover, such a broad definition of “college” as the place where motivated and intelligent young people are only to be found is, well, too broad to be of much use. West Point selects for “service to something greater than oneself” just as Liberty University selects for “Christianity” just as San Diego State selects for “beer consumption.”
I’m not sure Thanatos counts as something greater than oneself.
If your comment can be replaced with the word “boo” and lose no content, that is a worthless comment.
Soldiers kill people, plumbers put their hands in toilets, garbagemen pick up trash. These are not surprising facts to anyone over the age of seven. But unlike you, most of us generally don’t treat people who do those unpleasant jobs as Dalits. Some of us are glad that there are people willing to do dirty work for the community and think that their sacrifices deserve recognition.
If you want more people to share your anti-soldier stance, you should at least provide a logical reason why soldiering is an unnecessary and evil profession beyond “eww killing.” Otherwise you’re just shitposting.
Where is the sacrifice? Plumbers clean toilets because they are paid to do so. Garbagemen pick up garbage because they are paid to do so. Not because they are heroes who want to serve the greater good. Same thing with those who kill for money.
Should we have a national holiday celebrating the sacrifices of fishermen who risk life and limb so that we can have lobster bisque?
That would be a good point if we actually paid soldiers half as well as plumbers. As it stands now, the pitiful amounts of money soldiers make is not a sufficient inducement for men to join up: the sense of duty and desire for respect is a much bigger selling point. Maybe you would be satisfied with paying a premium to contract everything out to PMCs instead of having occasional parades?
Also, I would be thrilled to have national holidays for fishermen, loggers, miners and the like. We have a lot of dangerous, poorly paid, highly skilled blue-collar jobs in this country which are absolutely vital to its continued dominance. Yearly acknowledgement would be nice.
That depends on whether a PMC-only system would be more or less likely to produce murderous adventures abroad for no apparent reason. I’d lean towards less.
We in America have one national holiday for military servicemen. We also have one for labor, surely including those who stock lobster bisque.
I don’t know about the comparison to plumbers, but I don’t think soldiers are particularly badly paid in the U.S. at present. For a detailed, although possibly biased, comparison see:
“Should we have a national holiday celebrating the sacrifices of fisherman who risk life and limb so that we can have lobster bisque?”
I don’t know where you’re from, but in the US the Labor Day federal holiday on the first Monday of September does indeed celebrate, among other things, the contributions of people who work difficult, dangerous non-military jobs.
ETA: Randy got the scoop.
There will likely come a time in your life when you will have to call upon the police, the fire department, a medic, or some relation who is simply better equipped than you to pull your metaphorical ass out of the fire.
Long story short, pacifism is a privilege of the protected.
The pay really isn’t that bad. 😉
In Europe (or at least in the European country I am most familiar with) you get your general knowledge and “breadth” in high school. University is for going in depth into one subject. The students that come out of high school there (especially the kind of high school you go to if you want to go to university) are on par with older American students who have taken their general education classes in college.
One important difference between that European country and the US is that high school students there don’t choose their own classes. They can choose between a few different types of schools, and maybe a handful of plans offered by the school, but then the classes are fixed throughout their high school career. And what is acceptable in a class plan is determined by the state, not by the individual school.
In contrast, the American system seems to allow high school students a lot of liberty in choosing their classes, with the result that lots of them waste lots of time on useless garbage. Then, when they go to college, they are forced to take the classes they didn’t take before.
The American system seems obviously bad to me, not just in practice (wasted time, delays), but also in theory: they give students freedom of choice when they are too young to make good use of it, then they take it away when they are older and more mature. I must admit that I don’t know all the details, though.
I’m a Kiwi and we have a combination of choose your own course at high school, to a growing degree over time (I had to take English until I was nearly 17, my second-to-last-year of high school) while university is focused compared to what I’ve heard of American unis.
Reaching far back in my memory, when I attended high school you didn’t really get to pick very many of your classes. You might have been able to choose what year you took what classes to one >ahem< degree or another, but you had to have a certain number of hours of math, English, etc. That may have changed (BTW, I went to a large high school – there were 850 in my graduating class).
By allowing students (and presumably their parents) some flexibility, schools in less populated areas (like western states) could have larger high schools that served a larger area, but also were able to offer a broader slate of vocational training as well as college prep classes.
I get the sense that the European system is very “tracked” and hierarchical whereas the American system is more egalitarian – where you go to school is mostly a question of geography, at least in the public system.
This has benefits – it would suck to get labeled “not the sort for college” in 8th grade. But then, a lot of kids obviously aren’t college material, and it would have been nice to not have to sit with those goobers in required history classes in 12th grade.
I’m curious for more impressions from people who’ve gone to school in Europe – partly because I have a nagging sense that the people in the US demanding “free college for everyone, like in country X” have a poor grasp of what “college” and “everyone” means in country X.
“it would have been nice to not have to sit with those goobers in required history classes in 12th grade.”
In HS, at least around here, the number of AP and honors courses that college bound kids take would seem to suggest that, if you are sitting with “goobers” now, you probably wouldn’t be able to go on to college were you in Europe.
So, what are the odds that in a European system you would have been tracked out of college in 8th grade, had the awful experience, and then forced into vocational school?
1) not all schools offer AP everything, or make it feasible to take AP everything
2) in the particular case I was referring to, it was a state required course on US government that the available AP history course did not satisfy
I actually think more and better vocational schools would be great – a lot of students without “college bound” academic skills still have tons of impressive and potentially lucrative mechanical / electronic / crafty aptitude. Those students were just as poorly served by the current one-size-fits-all system as the future collegians.
Not all European systems are tracked. From readings on Wikipedia, it seems that the Netherlands and Belgium have the most classically tracked system while the Nordic countries are closer to the North American model. Most other countries are in-between. I will add that what all European schools have in common is a very serious series of standardized tests required to graduate.
You have to be very careful when designing a tracked education system. When the United Kingdom attempted to implement mass secondary education along continental lines, you got the very unsatisfactory Eleven+ exam, which caused problems from the start. If you were one of the lucky kids that got into a grammar school, then you were guaranteed a great education. The majority of kids that got into a secondary modern got a horrible education, even in the vocational fields for which they were supposed to be trained. Most of the resources were vested in the grammar schools. The third type of school created by the Butler Act, the Technical Institutes, never functioned as the STEM academies that they were supposed to be and were even more grossly underfunded than the Secondary Moderns.
There are benefits to both the egalitarian model and the tracking model of education, but there are also a lot of perverse incentives in both models. The perverse incentive in the tracking model is to underfund the education of those that didn’t get into the higher tracks, even in vocational fields. If you’re going to assume that somebody should learn to be a stylist or a mechanic for a living, then you should at least train them to be excellent in those fields.
The standardized testing is one of the things that bothers me. On the one hand, every time the U.S. or a state tries to implement or add real consequences to a standardized test, there is much wailing and gnashing of teeth about everything from the horrors of “teaching to the test” to disparate impact to basically “I should still get paid no matter how dumb my students are”. On the other, often the same people continually complain about how far behind in education the U.S. is compared to other countries, many of whom have much more serious standardized testing and / or tracking regimes that would cause riots if they were imported here.
I disagree pretty strongly. If someone is not the sort for college, that means they are the sort for something else. A future mechanic spending their time around other future mechanics, and taking classes suited to their interests, is probably happier than a future mechanic spending their time around future analysts and taking classes suited to their interests.
I think the hypothesis has to be that there are going to be labeling errors in any tracking system, and the younger the child is when she is labeled, the errors become more frequent. It certainly *would* suck for little Alan Turing to be erroneously labeled “not the sort for college”.
@FJ: Instead of mislabeling, say, 5% of children, we should mislabel 66% of children? Why didn’t I think of that?
A serious proposal should consider both benefits and costs. The modern American education system actually does take the latter approach–they pretend that every child is suitable for college, that no child should be left behind, and so on. Similarly, the values of people not interested in higher learning are totally ignored, in a way that I cannot give examples of without getting unbearably snarky.
Most ‘mislabeling’ will happen on the boundary – random chance means that someone with an IQ of 110 might show up at 105 or 115 on the test. And if the cutoff between two tracks is at IQ 110, they might find themselves in the lower track half the time, and the upper track half the time, simply due to chance. But they’re a marginal student, for whom the lower track and upper track are probably roughly equally inappropriate! Someone whose IQ is actually 130 is very unlikely to find themselves in the <110 band due to labeling errors.
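The "errors cluster at the boundary" point is easy to make concrete with a toy calculation. Note the 5-point measurement-error SD and the 110 cutoff are my illustrative assumptions, not figures from any study:

```python
# Toy model: observed test score = true IQ + normal noise with SD 5,
# tracking cutoff at 110. (Both numbers are hypothetical.)
from math import erf, sqrt

def normal_cdf(x: float, mean: float, sd: float) -> float:
    """P(X < x) for a normal distribution with the given mean and SD."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

ERROR_SD = 5.0   # assumed test noise
CUTOFF = 110.0   # assumed tracking cutoff

# A true-110 student lands below the cutoff about half the time...
p_marginal = normal_cdf(CUTOFF, mean=110.0, sd=ERROR_SD)
# ...while a true-130 student sits 4 noise-SDs above it and essentially never does.
p_bright = normal_cdf(CUTOFF, mean=130.0, sd=ERROR_SD)

print(f"P(scores below cutoff | true IQ 110) = {p_marginal:.2f}")   # 0.50
print(f"P(scores below cutoff | true IQ 130) = {p_bright:.5f}")     # ~0.00003
```

Under these assumptions the marginal student is a coin flip while the genuinely bright student is mislabeled in roughly 3 cases per 100,000, which is the asymmetry the comment is pointing at.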
@Vaniver: if you can accurately estimate adult IQ at a very young age, then I agree with you. Is that actually possible? To the best of my knowledge, it’s possible to predict whether a particular child will have a severely subnormal IQ as an adult, but fine-grained predictions for normal children are not accurate. Someone whose IQ will eventually be 130 has a high chance of being mistakenly labeled as a <110 individual if testing is conducted at a young age. But I am not an expert in this field, so I welcome evidence demonstrating that you can make reliable predictions on adult IQ for normal individuals at a young age.
FJ, we can look up the answer to your question. I’m a little confused about what they did, but I think Deary et al found a correlation of 0.66 between IQ at 11 and 80. If this is a bivariate gaussian (scatterplot), then a score of 130 at age 80 predicts a score of 120 at age 11 and a 12% chance of less than 110.
To put that in context it is between the correlation of identical twins raised apart (.76) and fraternal twins raised together (.55).
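The regression-to-the-mean arithmetic behind those figures can be checked directly under the textbook bivariate-normal model. The mean-100/SD-15 scaling is my assumption; the exact tail probability is sensitive to that scaling and to any reliability correction of the correlation, which may account for differences from the 12% figure quoted above:

```python
# Conditional distribution of the age-11 score given the adult score,
# assuming a bivariate normal with mean 100 and SD 15 at both ages
# (an assumption) and the r = 0.66 correlation quoted above.
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

MEAN, SD, R = 100.0, 15.0, 0.66

adult_iq = 130.0
# Best guess for the age-11 score of someone who tests 130 as an adult:
cond_mean = MEAN + R * (adult_iq - MEAN)     # 119.8, the "120 at age 11" above
# Spread of age-11 scores around that best guess:
cond_sd = SD * sqrt(1.0 - R**2)              # about 11.3

p_below_110 = normal_cdf((110.0 - cond_mean) / cond_sd)
print(f"expected age-11 score: {cond_mean:.1f}")
print(f"P(age-11 score < 110): {p_below_110:.2f}")
```

With these particular assumptions the tail probability comes out nearer one in five than one in eight, which illustrates how much the answer moves with the modeling choices.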
@Douglas Knight: that’s very interesting, although it seems to cover the entire range of school-attending children. As Figure 3 shows, a substantial number of those kids were substantially below normal IQ. I absolutely concede that, if you have a child who tests out as mentally retarded, it is very unlikely that she will grow up to be a genius.
But even leaving aside that quibble, I suppose those numbers turn it into a judgment call. Is the social benefit of putting little kids onto permanent educational and career tracks based on IQ worth the risk that about one out of eight geniuses would accidentally get branded a dullard? I could see reasonable people taking either side of that argument.
It used to be much more tracked decades ago. There were different types of middle school too, which limited access to different types of high school, some of which in turn did not grant access to university. But then they unified middle school, made it so that you can access university from any high school, and made it feasible to transfer between tracks (though you have to catch up on what you’re missing, AFAIK).
Basically, now nobody tells you “you can’t go to this school”. You can go wherever you want, but if you choose a school that’s too hard for you, you’re going to have a bad time, and you’re probably going to transfer out.
One disadvantage of the European “focused” model shows up if, halfway through, you figure out that you don’t actually like the subject you’re studying. If you want to switch to another degree, you have to basically start your studies all over again. I get the impression that in the American system, switching majors is relatively simple because there’s such a large number of shared courses between all majors that most of your work doesn’t go to waste.
(Though of course if you’re in a country that charges no tuition, the cost of starting entirely over isn’t as bad as it would be in the US, though you still lose years of your life.)
How is that an advantage?
You can’t say that anything about either system is good until you say what the point is. If the point is to see how much the student perseveres, it’s a disadvantage.
You could achieve this advantage by taking the European system and truncating degrees to a single year. Is this a good idea? Indeed, an American degree is 2 years of being well-rounded, 1 year of electives, and 1 year of depth. (2 years of depth and no electives for an engineering degree). Why not go further to 4 years of being well-rounded or 4 years of electives? or 1?
(Europe probably does make it too hard to switch between closely related fields. Perhaps chemistry and biology should be the same for the first year. Perhaps everyone who needs calculus should take it together. Or perhaps not.)
I graduated from high school in 1998, and as of then there was not much choice (except in English – we did have an unusual elective system for junior/senior English that I hear has since been abolished). You had to take a certain number of years of each of the four major subjects, and any choice beyond that was more theoretical than real – there was certainly an expected path that any college-bound student would take. You did get to choose which foreign language you took (or none, but only a non-college-bound student would have dared to do that – remember that our parents were still watching over our shoulders and supervising our choices!). The rise of standardized testing has likely made the expected course even more rigid.
While I think a lot of requirements are bullshit, I really do think that a liberal arts education overall is better than just doing one subject.
My first degree was in electrical engineering, and while on the one hand it was very restrictive in terms of the number of papers we had to do in the engineering school, it was quite varied in terms of ties outside the university.
I found this example: https://www.assessmentday.co.uk/CriticalThinkingTest-Questions.pdf
You would have to score your answers by hand with their answer sheet.
Wait, do you have to take humanities courses during a STEM degree in all US colleges? I thought this was only true in “liberal arts colleges”, and that this was why they are so named, more or less. If you’re doing a CS degree at Caltech, Stanford or MIT, do you still *have to* take humanities courses? How many?
(My Israeli CS degree included two mandatory physics courses, so wasn’t only math/CS, but nothing non-STEM was required).
Yes, you do. For example, at MIT you have to take 8 Humanities, Arts, and Social Science courses. At Caltech you have to take 12 (since they’re on the trimester system). (You have to take ~9 at my small liberal arts college, counting language).
There are a few that don’t REQUIRE them, but I doubt there’s that many people at Brown who get out without taking any.
Do foreign languages count as “humanities” for these purposes?
Sometimes the requirement is just “humanities” and sometimes they are more specific. For an example of the latter, a school might require: a freshman writing seminar, two social science courses, two math courses, two physical science courses, two literature courses, and a sequence of two foreign language courses plus a total of thirty-four courses overall and meeting the requirements to major in at least one department. Which for something like physics might be: four specific math courses, six specific physics courses and four other physics courses picked from among a group of classes at the 300 and 400 level.
In general I think the distribution requirements are a good idea, but unfortunately many of the courses used to satisfy them turn out to be not very good. That in turn leads some, often those with self professed strong critical reasoning skills, to dismiss entire fields of inquiry. In part they turn out to be poor because they try to serve two masters — both those considering majoring in a subject and those looking to fulfill distribution requirements. It is hard to teach well to both.
Horse/Water/Drink. The idea that a university degree should be more than just a job-skills certification is sound. Students can be and are required to put in the time that would be required to acquire a broad knowledge base and, yes, critical thinking skills, in an environment conducive to such.
But it’s practically impossible to prevent people from slacking off. If you could get every professor in the “breadth” classes to insist on and grade for rigor, the students would have to follow. And if all the students insisted on rigor, the professors would have to provide it. Reality is, the students who want an easy A will match with the professors who want a light teaching load that doesn’t interfere with their research (or maybe the 1-2 courses they really care about teaching).
I recently read an article, which I unfortunately can’t find right now, proposing that universities centralize distribution requirement courses rather than leaving it up to the various departments to teach them. Build a school within a school that doesn’t have any research responsibilities or advanced courses but is totally dedicated to teaching the 100 level courses that most professors hate.
Of course to make it work, you’d probably want to pay the instructors more than the starvation wages adjuncts get. That in turn would cut into funds available for administrative bloat, but we can dream can’t we?
Do profs really hate teaching 100 level classes? I know Freddie DeBoer argued that was only true for a few research heavy classes. And as far as I can tell, the teachers doing 100 level classes are the ones who enjoy doing them.
It probably varies from school to school and department to department, but in my fairly narrow experience there isn’t a long list of volunteers to teach pre-med mechanics or intro to statistics. There ends up being a rotation where everyone pitches in every other year or so (or they just get adjuncts / grad students to do it).
Up to a point.
ABET (the accrediting board for engineering schools) requires that engineering programs require a certain amount of humanities or social sciences courses and places restrictions on what you can and what you must do. English 1A is required. Up to one basic foreign language class is allowed, but literature classes in a foreign language count. There is (or was when I was in school) a depth requirement – must take two courses from the same department, at least one of which is upper-division, and a breadth requirement – outside of English 1A and foreign language courses, you can’t take all your H/SS courses from the same department. There’s a requirement for two upper-division courses in the total.
Generally yes, but how those requirements are structured varies quite a bit between schools. Mine required students to take classes matching a bunch of different requirements; I forget all of them, but they were along the lines of “qualitative”, “arts/culture”, “ethnicity/race”, and so on, for a total of eight or ten categories. These didn’t map closely to the traditional arts/humanities/sciences breakdown, and the labels didn’t always make sense, but it was nearly impossible for example to get the “ethnicity/race” requirement sorted without taking a course in African-American Studies or Native American Studies or some such thing.
I’m not a big fan. It smacks of some administrator ticking off boxes on some political mandate; if you’re going to get a well-rounded education this way, you’re getting it despite, not because of. (I ended up putting a lot of my requirements into linguistics and anthropology, which was a good choice — but I also took some poli-sci classes, which wasn’t. I wish I’d taken more stats and economics courses, but it was hard enough fitting all my requirements in around my major.)
Actually I’d say most engineers probably need more (or at least more focused) humanities courses, not fewer – it’s the stuff we’re (generally) less good at, and if the analyses that cross my desk are any indication, the average engineer has never been subjected to a serious writing skills course (or at least it didn’t stick).
So much of what engineers do is communication, and it’s frustrating that technical communications skills are treated like an afterthought, shucked off to the lowest ranking adjunct or Asst. Prof. with nothing better to do.
Anyway, at Michigan while I was there, engineers had to take a number of English / lit / writing courses and a bunch of pure math / physical science. You also had a few “non technical” electives. Engineers did avoid the foreign language and “race and ethnicity” course requirements that students in the main liberal arts college part of the University had to take.
I ended up filling those requirements out with a math minor, Classical history, and Russian history.
If I had a magic wand I would prevent publication of n=47 studies.
I had social sciences teachers that said you can dismiss studies with n<30, so by that rule this would make it.
You can show a robust effect with small N if the effect is really dramatic; if you feed an experimental drug to two people and they both drop dead, you don’t need to run a thousand-man RCT to know you probably don’t want to be taking it.
The trouble is that in social science research, dramatic effects are rare and controlling for outside influence is hard.
Yeah, a hard minimum on sample size is silly: if I stub one toe in the dark, I don’t have to stub the other nine to be sure the coffee table is there.
However, the stubbed toe is silent as to whether the coffee table has been replaced with the footlocker, and is only of limited use in making sure that you’re in the living room and not the downstairs guest room.
One of my economics professors said that you needed at least 20 degrees of freedom to hold your head high. Of course the labour guy just laughed at him.
The smaller your sample size, the more intensive a study you can do, all else being equal. It’s nice when they have very intense small studies, and somewhat less intense correlational large studies, and you can get the same answer from both.
Also, I’m betraying my ignorance here, but aren’t statistics supposed to help us get around this problem? p = 0.04 in a sample size of 30 means the same thing as p = 0.04 in a sample size of 30,000, right? It’s just harder to get.
Actually, it’s slightly more complicated than that, due to the ever-present vagaries of p-values: http://andrewgelman.com/2009/06/18/the_sample_size/
A study with low statistical power has a reduced chance of detecting a true effect, but it also reduces the likelihood that a statistically significant result reflects a true effect, and it can exaggerate the magnitude of the effect. In the most general sense, larger studies are likely to be better thought out, simply because of cost/time effects.
Sorry. I am sure you know all of this. I was just fleshing out my comment above about small n studies.
It was worth writing up for the benefit of a lot of non-Scott people, I think.
Aside from study design, low-n studies are more likely to produce wildly inaccurate results. The odds of “this study is wrong by chance” are set by the p-value, but that doesn’t tell you how wrong to expect it to be. As you say, small studies imply larger effects with lower confidence, and as a result imply larger errors if you get unlucky. Large studies with the same p imply weaker effects, with a decent chance of getting a false signal but little chance of being wildly inaccurate (if there were, say, a strong correlation in the other direction, you would have found it).
Beyond pure theory, I tend to worry that small samples show more reporting bias. That’s not necessarily about intentional dishonesty, but it’s cheaper to toss up a lot of small-n samples and wait for significance. Plus, if you’re not using a Bonferroni correction, you can do one large/medium sample on many things and be basically guaranteed to find some kind of significance.
Your statement about p-values is essentially true, but I think it’s also insufficient. Like any other simple confidence value, it’s masking a lot of other worthwhile information. It’s the usual criticism of simple Bayesian propositions: “True with p = 0.7” doesn’t address the shape of the probability curve, or the different ways in which the statement can be false (systematic error, effect nonexistent, effect too weak for test, etc).
The simplest insight is that you’re looking at a single proposition, rather than a distribution. Low p in a small sample suggests a strong peak in a broad distribution (you wouldn’t have gotten p < 0.05 unless the effect was potent, but if you are wrong you could easily be very wrong), while the same low p in a large sample suggests a small peak in a gentle distribution (you can pick up on a very small effect, perhaps too small to be real, but with millions of data points it’s quite unlikely that you’re wildly wrong).
Obviously there are other numbers (effect size) that help convey this information, but not all p-values are created equal. If I understand correctly, low-n should make us more open to the possibility that a well-designed study produced a radically incorrect result.
Andy McKenzie’s link is a great starting point here, but I’m definitely not up to a full understanding myself.
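The point being made above – that a low-n study which clears significance is likely to overstate its effect – can be seen in a quick toy simulation (all numbers here are invented for illustration, not from any of the studies under discussion): with a small true effect and a small per-group sample, the runs that happen to reach |t| > 2 report effects several times larger than the truth.

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.2   # hypothetical small true difference between group means
N_SMALL = 10        # per-group sample size
SIMS = 4000

def observed_effect(n):
    """Run one two-group experiment; return (mean difference, t statistic)."""
    treated = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(n)]
    control = [random.gauss(0.0, 1.0) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = ((statistics.variance(treated) + statistics.variance(control)) / n) ** 0.5
    return diff, diff / se

# Keep only the "significant" runs (|t| > 2, roughly p < 0.05)
significant = [d for d, t in (observed_effect(N_SMALL) for _ in range(SIMS))
               if abs(t) > 2]

print("true effect:", TRUE_EFFECT)
print("mean effect among significant small-n runs:",
      round(statistics.mean(significant), 2))
```

With these made-up numbers the significant runs report a mean effect around four times the true one, which is exactly the “low p in a small sample suggests a strong peak in a broad distribution” point.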
Multiple commenters on here are making the mistake of treating all study designs as the same. They’re not. The number of participants is 47, but the number of samples is 94. They’re taking two samples per participant, and then making a within-subject comparison for each participant. This is vastly different from the studies normally being criticized for low n that take one sample per participant and then make a between-subject comparison. Such between-subject comparisons introduce between-subject variability which is a gigantic confound, and requires much larger sample sizes to control for. The study is also making a between-subject comparison which is an issue, but the study is not invalid for that reason.
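The difference between the two designs is easy to see in a toy simulation (the effect size, noise level, and variability here are invented, not taken from the paper): when person-to-person variability is large, taking two samples per participant and differencing removes most of it, so the same n buys far more sensitivity.

```python
import random
import statistics

random.seed(1)

N = 47              # participants, as in the study under discussion
EFFECT = 0.3        # hypothetical within-subject gain
BETWEEN_SD = 1.0    # person-to-person variability (large)
NOISE_SD = 0.2      # measurement noise (small)

# Two measurements per participant: baseline and follow-up
baseline, followup = [], []
for _ in range(N):
    ability = random.gauss(0.0, BETWEEN_SD)   # stable trait of this person
    baseline.append(ability + random.gauss(0.0, NOISE_SD))
    followup.append(ability + EFFECT + random.gauss(0.0, NOISE_SD))

paired_diffs = [f - b for f, b in zip(followup, baseline)]

# Between-subject spread swamps the effect; paired differences do not
print("sd of follow-up scores:", round(statistics.stdev(followup), 2))       # roughly 1.0
print("sd of paired differences:", round(statistics.stdev(paired_diffs), 2)) # roughly 0.3
```

A between-subject comparison has to detect the 0.3 effect against a spread of about 1.0; the within-subject comparison detects it against a spread of about 0.3, which is why the paired design is not in the same category as the usual low-n study.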
Yeah the difficulty with “college improves critical thinking” is that it’s really hard to define “critical thinking.” You can measure how well people do on a specific test, but inevitably there will be objections that it doesn’t cover all the important things.
Though the thing about having to partially suspend critical thinking skills in order to truly belong to a social group rings depressingly true for me. Of course there’s such a thing as friends who can disagree and remain friends, but large social groups tend to be constructed around shared ideals and bonds are reaffirmed by agreement and validation.
What I want to know is, can we quantify the value of “critical thinking skills”? Is it worth subsidizing (not necessarily fully) three to four years of education to develop some modicum of critical thinking skills in the general populace? Especially given that [A] to some extent those skills develop naturally anyway, [B] lots of people, especially those who end up being influential decision makers in the future, go to college even without subsidies anyway.
There may also be a selection bias here. Maybe being somebody who is going to go to college will cause your critical thinking skills to rise whether you actually go or not.
They tried pretty hard to match cases to controls.
What if it’s more than “pretty” hard to match cases to controls and you just can’t learn anything about this particular topic with that methodology? I mean look at that u-shape! You read a lot of studies like this, so I’d be surprised if you often come away with the impression that controlling for things isn’t fraught. Surprised in that I’d want to better understand your model.
Read the paper itself–it is extremely hard to argue that the effect they find is caused by something besides
i. extra hours of classes or
ii. some variable that affects the decision to take more classes a lot but doesn’t affect current critical thinking, academic motivation, or a plethora of other variables
I’m sure that some of the effect is ii., but you are going to have a hard time arguing that i. doesn’t matter significantly.
Oh yeah, and also the “u-shape” thing is misrepresented. This feels like selective application of rigor.
There seems to be some selection bias in the Pascarella studies you mention. Namely all the college people they look at are people who applied and were accepted to college, and the non college people did not. Since colleges select students that have an aptitude for growth, the actual college effect may be smaller still.
Same goes for the U shaped curve. Maybe the low hour students are gaining critical thinking by also working full time; whereas the half time students are just bad or unmotivated students, yet are not in the position to need to work full time.
What the study needs to do is find 30 people who were accepted to college, and tell half that they can’t go.
They control for hours worked–the regression variable is insignificant (see table 1).
Would linear regression be able to explain a U shape?
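To the question above: a model with only a linear term cannot, but the same least-squares machinery with a squared regressor can. A minimal sketch with made-up U-shaped data (not the study’s actual numbers):

```python
import statistics

# Hypothetical U-shaped data: "score" lowest at mid-range enrollment hours
hours = list(range(0, 11))
score = [(h - 5) ** 2 for h in hours]   # perfect U, minimum at 5

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

# A straight line through a symmetric U is flat: the linear term sees nothing
print(slope(hours, score))              # 0.0

# Regressing on the squared (centered) term recovers the U exactly
centered_sq = [(h - 5) ** 2 for h in hours]
print(slope(centered_sq, score))        # 1.0
```

This is why papers in this area explicitly test for nonlinearity: a significant or insignificant linear coefficient by itself says nothing about whether a U shape is present.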
That seems a bit drastic. In some countries a significant number of students defer their acceptance for a year (and don’t have to reapply). Construct a study which compares accepting and deferring students matched for SAT, SES, Personality etc. at the least we could estimate the effect of one year of college.
STEM Master Race here. Students hate it when you try to make them learn critical thinking skills. They want to be able to play ‘find the formula’ instead. I have an entire lecture dedicated to trying to force them to think about the assumptions attached to an equation rather than the equation itself… it’s like pulling teeth every time.
If you reward people for playing “find the formula” every day and then change things up just one day, it’s unsurprising they won’t do well. They might simply not realize that you want them to do something different.
Or more likely they’re getting irritated because they don’t care about learning, they just want to get their degrees.
Or perhaps they are not interested in that particular ‘why.’
Not surprisingly, that’s not what I try to do. I never encourage “find the formula”; I just have one lecture that is particularly focused on the perils of playing “find the formula”.
I’ve had the opposite experience (in mostly social science/humanities classes). If you try to make students learn facts they’re like “This is boring, we can always just look up facts in a book”. If you just tell them “We’re learning critical thinking today, write an essay about whether you agree with this book’s argument or not,” then they like you.
But they’ll usually just do an essay about their feelings on the topic under discussion. To get them to give good reasons is hard, and it’s very hard to get them to understand that they have to rebut plausible reasons for views opposed to theirs (as they seem resistant to grasping that views opposed to theirs have plausible reasons to begin with).
I recently was in a discussion with a woman who thought a stereotype was an argument, and that she could substantiate it by saying she went to a certain school.
And another woman backed her by praising her statements as “careful and considered”, demanding I respect her “obvious” credentials, and concluded by abusing me as narcissistic and rude.
School is no guarantee that they will manage to learn critical thinking.
“This is boring, we can always just look up facts in a book”
And how do they know the book is reliable (there are plenty of hack jobs, axe-grinding hit-pieces – as in the ‘David Cameron and the pig’s head’ anecdote from a biography of David Cameron by a very disgruntled ex-Tory grandee – and sincere-but-demented tomes floating around out there), the facts are still current and have not been superseded by more recent research, or that they’re reading all that they need to read (Just One Book isn’t going to cut it). CHECK YOUR SOURCES, CHILDREN!
I’m sure the little bippies do love writing essays about ‘agree or disagree’; it’s easy to bullshit your way through a “This book presents a compelling argument for…/this book exhibits a shoddy understanding of…” without having to bother your little brain about actually engaging with the content or forming an opinion.
FACTS ARE FUN! FACTS ALLOW YOU TO PUT YOUR HOBNAIL BOOTS ON AND STOMP ALL OVER THE OPPOSITION’S LOUSY ARGUMENT THAT IS COMPLETELY IN ERROR! 🙂
…don’t get me started on STEM majors writing essays. I’ve assigned them. They’re not pretty.
One of my history professors told me that the engineers in his classes had a bimodal distribution – engineers were his best and worst students. Those who really liked taking the class actually put more work into the papers, etc., while those who were only there because it was required were putting in C- work just to get through the class.
Also, thinking about the assumptions is graded as severely as “find the formula” and often on the exact same principles — you had to put down what the teacher thought you should think.
This is not a happy-making situation.
I’ve seen the reverse. I was in a software engineering class with some people who had several years of Identify and Challenge Every Assumption behind them. That was fun.
“Now to start with, choose a username that is all lower case letters”.
“Why all lower case?”
It scares me when you have an engineer who can’t do anything other than find the formula. You can train a chimp to throw numbers into an equation and turn the handle; you’re paying a professional to understand what the hell is going on in the math. I’ve had several cases where you need to understand the limits of an equation – i.e., what assumptions went into making it.
Andrew Gelman’s candy weighing experiment is good for making students examine the assumptions behind a formula, by giving them a case in which the formula gives the wrong result: http://andrewgelman.com/2008/05/08/doing_the_candy/
I used it when teaching my students the continuous form of Bayes’ Theorem, and then had them figure out What Went Wrong when our estimate for the weight of the bag was way off.
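For anyone who hasn’t seen it, the machinery involved is the standard normal-normal conjugate update: the posterior mean is a precision-weighted average of the prior and the measurement. A minimal sketch with invented numbers (not the actual candy data from Gelman’s exercise):

```python
# Conjugate normal-normal update: prior belief about the bag's weight,
# one noisy measurement, precision-weighted posterior.
# All numbers are made up for illustration.

prior_mean, prior_sd = 500.0, 100.0   # grams: vague prior on the bag's weight
obs, obs_sd = 350.0, 50.0             # one scale reading and its assumed noise

prior_prec = 1.0 / prior_sd ** 2      # precision = 1 / variance
obs_prec = 1.0 / obs_sd ** 2

post_mean = (prior_prec * prior_mean + obs_prec * obs) / (prior_prec + obs_prec)
post_sd = (1.0 / (prior_prec + obs_prec)) ** 0.5

print(f"posterior mean: {post_mean:.0f} g, posterior sd: {post_sd:.0f} g")
```

The instructive failure mode is that if the assumed measurement noise (`obs_sd`) is badly wrong – as in the candy exercise – the formula confidently gives a badly wrong answer, which is exactly the “examine the assumptions behind the formula” lesson.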
“Students in college more-than-full-time do best, but students in college about half-time do worst.”
I thought full-time was 15 credit hours per semester? If that’s correct, full-time students do the worst, and half-time students do similarly well as students taking an extra course or two. (or am I reading the graph wrong?)
I would hypothesize that unmotivated students disproportionately fall into the group taking a regular schedule.
Usually 12 is full time. But this is the total number of hours over a full year, so 24 is full time. (“semester hours”)
There is no standard measure of credit hours; it varies from university to university.
Source: I am a college student with friends at multiple colleges. My friends range from an average of 12 hours per semester, to an average of 18.
This essay severely misrepresents the second Pascarella paper. There were two groups of students studied: one enrolled in four-year colleges (n=1860), and one in two-year colleges (n=216). Of course, you mention that one of the studies has n=37 (twice), but never mention this sample size.
You begin by mocking “The study claims that it did indeed find this but that there was some ‘nonlinearity’ in the results,” but the paper’s actual description of the nonlinearity (in the abstract, even) is “In the two-year, but not the four-year, sample the relationship between semester hours and critical thinking deviated significantly from linearity. Students attending a two-year college full-time still derived the largest critical thinking benefits. However, the lowest levels of critical thinking accrued to those enrolled between 7-20 semester hours. Students enrolled for 6 or less hours actually had somewhat higher end-of-first-year critical thinking.”, which doesn’t dismiss the problem at all, as you seem to imply.
The important part of that quote is “in the two-year, but not the four-year sample.” The graph you show is only from that 216-person sample. They discuss this effect in their conclusion, mentioning that small sample size is probably the culprit given the large number of variables being regressed on and the population-matching adjustments. But they also admit that this is a serious flaw and that it highlights the importance of checking for nonlinear effects in this kind of analysis. They also very honestly say that this is worth further investigation and they don’t have a good explanation. You know, like when you say: “These results don’t make much sense and probably shouldn’t be taken too seriously.”
However, the main result from the study is from the n=1860 four-year sample, which is actually the sample from which you get the .41 SD effect size you mentioned. They test for nonlinearity but find none. Furthermore, the effect appears general rather than conditional on any specific variable.
After this, you note a fatal flaw with these studies: “But those of you who went to my talk last week hopefully know what my next question will be: how long does this last?” Of course this is actually just what the last sentence of that same paper says: “Finally, this study is limited by the fact that it was only able to trace cognitive growth over the first year of college. We cannot be sure that the effects we observed would persist over subsequent years.”
So just the single primary source (from 1989) I bothered to check shows that you completely misinterpreted the result you use to show how flimsy this effect is and to boot, puts forth the exact same question for future research you use as your conclusion.
I’m sorry this sounds so frustrated, but people (including me) trust you a lot to give good analyses, and I almost never see people go back to the primary sources to double-check. And because “social science is bad” fits the cultural preconception here that most of it is bunk and can be dismissed off-hand in favor of personal bias and intuition (people here are already using this to critique the groupthink of modern colleges (most of the papers are pre-1990), politically correct indoctrination (pre-1990), and straight-up dismissal of any study with n<47), it seems doubly important to represent the research properly. Your conclusion is right-ish (there is evidence of a significant increase in some measures of critical thinking due to college, but far from definitive, and this research doesn’t address longevity past college; I may address your similarly un-nuanced claim about preschool in another post), but the presentation encourages straight dismissal and disdain of the literature. All of your criticisms are addressed in or also presented by that single paper (and they offer many more as well!).
Edit: Paper is from 1994, not 1989
Edit 2: Paper does not use Watson-Glaser, but rather CAAP, which was apparently developed by the ACT people
Edit 3: Wow, this sounds way more critical than I intended–never write angry.
Not to mention Scott spending a whole post harping on how awful it is for journalists to misrepresent studies by ignoring details or getting them wrong.
Scott did say he had a bias he was trying to overcome. Maybe he did not do that.
I appreciate you pointing out these things.
I’m surprised this comment is not getting any traction.
This comment seems to me the heart of the rationalist endeavor. If it is not the one generating interest, maybe I was wrong in some assumption of mine.
Or maybe people think you are right so they are not arguing with you. The comments that get the most replies tend to be CONTROVERSIAL.
I agree your comment is great.
This goes in the “list of disadvantages of not having upvotes”. Probably nothing more to it.
I’m sorry. I misunderstood the nonlinearity as applying to both samples. After looking it over again, I see you are right, and have corrected the post. I’ve also added an edit in bold saying that I made the mistake and asking people to re-read the piece. I’ve also tweeted a notification that I was wrong and people should read the correction so that people who have already read it will know to check back. And I’ve added it to my list of mistakes page above.
I did not say anywhere that Pascarella denied the nonlinearity, and in fact I specifically said that he mentioned it. If my original understanding had been correct, I think it would have been weird to publish the study as “shows that attending college increases critical thinking” even if you added a caveat at the end about linearity. Given that I now realize this was only for a subsample, I agree that his caveat was sufficient.
Your quote is not the same as my complaint. My complaint is not that “maybe there would be a lower effect in junior and senior year” (which is what I think Pascarella is saying in what you quote), but that there’s a big difference between “works while you are in college” and “has an effect that is maintained even outside of college, twenty years later”.
Nor am I deliberately citing older papers. Most of the good research on this is older. I cannot find any more modern research that has anything like an out-of-college control group (Academically Adrift doesn’t seem to based on the reviews I read, although I can’t access the whole book) and most of the modern stuff continues to refer to Pascarella and call it authoritative.
I do think you’re being unnecessarily mean about this; if you had just pointed out that I misread that one paper, I would have been happy to correct it. I don’t think it changes the conclusion – that there is evidence that college increases critical thinking skills while you’re in it, but little investigation as to whether the effect persists.
This may be a minor mistake/misunderstanding, but what makes it subjectively important is that I myself (and, judging by “I’m sorry this sounds so frustrated, but people (including me) trust you a lot to give good analyses”, PSJ as well) had assigned a very low (maybe unfairly superhumanly low) probability to “Scott Alexander makes a mistake”, so the subjective surprise is high.
I guess the rationalist way to go about this is that we tacitly correct our priors and do not get upset about this because that’s just the way things are. But this is a normative model. As a descriptive model of author/reader relationship it falls short.
Scott makes mistakes all the time. So does Scott Sumner. So does Steve. I only really fault the present Scott for it, though. The others’ can be explained by either malice or naivete; the present Scott’s mistakes can only be explained by ignorance.
I again sincerely apologize, I was far too critical and could have presented the information in a better way that wouldn’t lower the level of discourse. I know you are always careful to add corrections as soon as possible and as openly as possible and shouldn’t have editorialized.
As for the second point, yeah, I think I had a ToM disconnect. If the study was as you had interpreted, some sarcasm would have been warranted so it’s not as if that was an additional problem on top of the interpretation.
Although the quote does not necessarily imply the same thing that you mean, the sentiment is repeated again, and more unambiguously in reference to post-college changes, in Pascarella (2005) and the paper you link in the “of philosophy” link. I assume the issue is that doing that kind of longer study would be extremely expensive (as it can’t ride on the back of government-sponsored test assays given during college) and it becomes even harder to develop adequate controls. But everyone seems very excited for somebody to find a way to do it.
Yeah, it does seem that Pascarella (1994) and Pascarella (2005) are the only studies with good non-college controls, but they are both extremely large sample sizes and I’m not sure anyone has an idea on how to improve the design. Once again, they rely on these large, national studies of student achievement, so can’t really be replicated often (although it looks like we should be due for a new one soon!).
I agree that your conclusion was right, but I still think that the presentation (based on the comments) led people to get “there is little-to-no evidence that college increases critical thinking skills while you’re in it, and the effect probably doesn’t last.” And so I think I was criticizing you based on the response which isn’t really fair. As a social science student currently in college, I took the general criticism far too personally.
Exaggerating ideas is good for describing them and motivating people to understand them IMO. I don’t mind too terribly much that Scott overdid the sarcasm, and I also don’t mind that you overdid the calling out of his sarcasm.
Just dropping a quick note to say I appreciate the work here to both point out and correct the error. Credit to PSJ and Scott.
I’m commenting here on the entire discussion about the paper. I think it’s a great data point of how good science (here defined more broadly than just the scientific method) is at getting the low hanging fruit, and just how hard it is to get anything else. Scientific discoveries with a strong signal against background noise (or strong correlation, or high significance, or however your field defines the truth) require much less evidence to support than ones with low signal, so those are going to be discovered first. But continuing the same rate of discovery will require ever greater resources and coordination in order to identify signals of the same strength. Eventually the resources made available are simply insufficient to answer some important questions.
Figuring out what kids should learn in order to be successful adults, and what the best way to teach them is, are great things to figure out, but if there were strong signals pointing to those answers, those questions would have been answered already. In some (many?) fields clever technology and statistics can help find a signal, but as we find more of the obvious signals we should expect the rate of scientific discovery to decline.
The bulk of this seems to have been settled – accuracy, tone, and impact on Scott’s conclusions.
I just want to call attention to one thing you mention in the last paragraph: “I almost never see people go back to the primary sources to double check”. I realize that I’m guilty of this, and only check Scott’s primaries if they strongly surprise me, or interest me beyond the limits of his discussion. Scott’s record on analysis is exceptionally good (he’s in my top tier with Ben Goldacre and a few others for this sort of stuff), which tends to lower my interest in double-checking.
This, I suppose, is a learning moment. With most science journalism and analysis, I assign a double-digits probability to the possibility “This is so misleading/inaccurate that it’s utterly meaningless”. With Scott, I tend to round that possibility down to near zero unless it’s a highly emotional topic like social justice. In the worst case, that might mean that Scott’s typically good analysis is actually leading me to more incorrect conclusions than other media is. I doubt that the situation is that bad (mainstream science news seems to be up near 50% wrong, to the point where I’ve essentially stopped reading their coverage of non-fundamental science), but it’s still a worthwhile reminder that I’m probably overcorrecting quite a bit.
No real fault of Scott’s, of course. It’s not on him that I saw “exceedingly good analysis” and filed it under “essentially infallible analysis”. It’s just a good reminder for me and anyone who’s done similar that I ought to seek accuracy over received wisdom, and that I might even be in more danger when I find a good source.
“But in any case we need a better study design to conclude anything from this.”
I agree in principle, but in your “ample free time” you should read about mediation analysis as a way to suggest plausible hypotheses a better design might test.
Full disclosure: I work on that stuff sometimes, so obviously I think it’s super neat.
I am aware that this is highly speculative, but what I make of it is this. College is an opportunity to gain critical thinking skills. A small part of the effect is kind of unavoidable, due to interacting with other people in the structure, reading assigned books, etc. A bigger part of the effect comes from the way students utilize that environment, in the following called “attitude” for the sake of simplicity. Let us assume students have either a good attitude, seeking out opportunities to learn, or a bad attitude, doing what is expected of them and spending the rest of the time gaming, partying, whatever. Attitude would work as a linear multiplier of the critical-thinking-skill gain. So the skill gain would roughly equal “college effect” plus “college effect times attitude”; with attitude > college effect, and both > 0.
Datapoint 1: U-shaped function of hours enrolled to critical thinking skills. Were critical thinking a direct effect of college education, we would have to assume a constantly rising graph. I propose that students with a good attitude disproportionately individualize their curricula, thus being overrepresented on the sides of the graph. They might take only classes that they assess as especially helpful, have challenging jobs besides college, read unassigned books, generally optimizing their gains by minimizing time spent in classes (type a). On the other hand, they might try to make the most of college by taking as many classes as possible and studying extra hard; those are on the right side of the graph (type b). Both might feel uninvolved or alienated towards other students (Datapoint 2) because they are there to learn, not for campus life. Type a would have significant parts of their social circle outside of college, whereas type b would miss opportunities to socialize because of extra classes and studying sessions.
That is, of course, if the studies Alexander mentioned above are all correct in their general findings, which is in itself a not-too-probable assumption. Anything I missed? Other theories?
Edit: after reading PSJ’s comment it seems the u-shaped graph is merely a weird artifact, not sustained in the bigger samples. Which makes all of the above obsolete. I will still leave it here as it stands, for discussion.
How many people here think they got little from going to college besides the piece of paper that says they graduated with a degree?
Are you really saying that your teachers taught you nothing that you would not have learned?
My experience was quite different. I had some teachers who were awful, but by and large I got quite a bit from each class. As a counter-factual, my schooling was before “object oriented programming” became standard practice, and I frequently wish that OO had been taught as part of my degree. Not because I can’t use C# or java, but because my instincts are all procedural.
In the first months of my job, I learned more about programming than I had learned during my entire education up to that point. If I could go back in time to before I went to study, I’d instead look for a job, maybe taking up studying later, once I could get said job.
I think that if it can be learned easily in the first months of your job, then it might be a waste of time and *money* to learn it at college. Doing things for real is always a more effective way to learn than mock-doing them.
Pet peeve: moreover, if all the job requires can be learned in the first months of the job, the job should not require a college degree at all. It is ridiculous to demand that people pay for and finish college for coding/administration jobs that are perfectly possible without it.
College time is better spent on things that give you a base, so that those first months of learning are effective – e.g. basic programming and first projects (which are guaranteed to end in a mess for most students).
Whenever people start talking about being able to learn things “easily in the first months of your job” I think it skips over an important element.
Knowing that there’s something there to know.
A friend of mine dropped out of a chem degree and ended up working as an analyst. He has a good head on his shoulders and learned quite a bit of programming but he’d never studied it formally and had never studied algorithms, data-structures or theory of computation.
To him it was black magic that the programmer he worked with always seemed to be able to write things which worked far far faster than anything he could write.
I was able to sit down with him and cover a fair bit of the very basics of algorithms and datastructures so that he would know that there’s something to know.
I don’t know every algorithm or every datastructure but I did learn how to break down a problem into parts that could fit general cases and then search for the related algorithms.
He’s a very self-motivated person so once he knew that there was something to know he went away and learned more about it but there’s a lot of such cases and a great deal of college is simply covering the bare essentials so that you know that there’s something to know.
Interesting, I had the same thing with graph algorithms. I could not for the life of me solve a single exercise. So when they were in the mix, I always scored low on them. I thought I just sucked at them.
Then a friend explained to me two basic graph data structures and how to find a minimal path. It was just a very basic explanation, at most 20 minutes long, and graphs suddenly clicked for me. I ended up being good at them, because now I suddenly knew how to think about them and was able to “invent” algorithms and solutions to other exercises.
I think it is a similar effect to the one you described.
Something similar is probably going on with a lot of things: there are those first hints you need to get from somewhere. Once you have them, the rest follows naturally. If you do not have them, the whole thing is an impenetrable puzzle.
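For what it’s worth, the kind of twenty-minute explanation described above really does fit in a few lines. This is a minimal sketch, with a made-up graph; “minimal path” here means fewest edges, found by breadth-first search over an adjacency list:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search over an adjacency list.

    In an unweighted graph, BFS visits nodes in order of distance
    from the start, so the first time we reach the goal, the path
    we took is a minimal one.
    """
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # goal unreachable from start

# Adjacency list: each node maps to the list of its neighbors.
g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(shortest_path(g, "A", "E"))  # ['A', 'B', 'D', 'E']
```

Once you see that a graph is just “each node knows its neighbors” and that search is just “expand the frontier,” a lot of exercises stop being black magic.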
One of the things that occurs to me is that, even though an individual degree holder may use only a subset of his degree, the entire pool of degree holders is still likely to use the full set of knowledge.
An undergrad degree prepares one to do many things, but an individual will not do all of them.
> Whenever people start talking about being able to learn things “easily in the first months of your job” I think it skips over an important element.
> Knowing that there’s something there to know.
Err, that is also part of learning a thing. First you make a map of the domain, then you learn the domains.
Also, programming is a sufficiently complicated domain that it requires more than a mere month to master. Just like chemical engineering, dancing, or writing.
I think that it’s more “in the first few weeks/months on the job” suddenly it clicks: ah! that’s what that weird little bit we studied is used for!
And the big things you worried about for exam passing purposes aren’t really that big a deal, while obscure stuff can be a life-saver when you pull it out in the middle of a project 🙂
There is always going to be a big difference between going in to “Now I’m using this stuff in reality, for an actual purpose, and fitting it into the way Things Are Done here in The Real World” and how you studied the subject.
CS programs really aren’t an attempt at teaching programming (the occupation). They are more closely an intro into the academic field of CS (which is useful, if only tangentially, to students). It seems like CS is quite similar to law and perhaps medical school: the departments must do academic work to justify their existence and consequently divorce themselves from their useful bits. Graduates must undergo rigorous internships for a couple of years as a net drain on their employer. Of course, CS is different from the other two in that the field isn’t licensed and the material is most often self-taught.
College was a necessary experience for me in some ways, but I’m not sure how well it prepared me for a career, and I don’t feel like I was exposed to any radically new perspectives beyond what I’d already gotten from books and interacting with other people outside of the classroom setting. Yeah, I learned some stuff from teachers that I wouldn’t have otherwise, but I don’t feel like there was anything I learned that I theoretically could not have learned on my own. And a lot of it I’ve just totally forgotten at this point.
Granted, I had an artsy-fartsy major and my experience is probably very different than that of someone in STEM. But I went to college because I was told by everyone that it was absolutely essential to getting any job that wasn’t flipping burgers. And what I learned there hasn’t really been essential to any of the jobs I’ve held; I don’t even feel like the degree itself was essential. I sometimes wonder if going was the right choice and how my life might be different if I hadn’t.
College is probably necessary for a lot of people who are training for something specific, but the message that gets pushed on kids coming out of high school is that college is just what you do–regardless of whether you can afford it, regardless of whether you even know what you want to study yet–and if you don’t, you’re doomed to failure. And the result is all these people with crippling student loan debt and no job prospects, etc.
I don’t doubt that college does confer benefits, but benefits proportional to the cost? That I don’t know about.
And I don’t think the solution is to subsidize it for everyone but to explore more affordable options that achieve the same effects.
>How many people here think they got little from going to college besides the piece of paper that says they graduated with a degree?
Me, but I went to a taxpayer-paid small local college – commuting from my parents’ house, since I loathed the idea of dorm discomfort without mama hotel service and had no need for “freedom” and “experience”, i.e. fucking intense status competition – in a not particularly rich European country. Surely the kind of US universities where you get in debt for $200K would have been different. You get what you pay for.
And to be fair, it was not even their fault; the problem is that office jobs are under-demanding. You learn advanced statistics like correlation and regression, and then find out that no boss ever wants more complicated stats than the percentage change in sales from last year to this year, because the boss typically did not go to any kind of college: he just started working as a used car salesman at 17 and now at 57 owns a big chain of car dealerships. As a college-trained employee, you are often kind of overeducated compared to your boss’s bandwidth.
Why do you assume that there aren’t better jobs out there?
I think I got a lot from both colleges that I attended, both from professors I engaged with (I was paying out of my own pocket for most of it so they DAMN WELL were going to explain that stuff to me), and in interacting with and learning from fellow students.
If I had it to do all over again I’d do it in a heart beat. But I’d probably not get a degree in Fine Art.
Virginia Tech alum of Computer Science and professional programmer here, and I’ll give a qualified yes here. I would not have been able to jump into my current job without several years of working on hard problems and getting a firsthand appreciation for the common beginner programming errors.
My own feeling is that based on what I’ve seen of the industry, CS degrees are 90% signaling, because technical management either can’t or doesn’t want to know how to tell if the job applicant claiming technical skill is bullshitting them. So they outsource the decision to a reputable college, and assume that anyone who made it through four years of a rigorous program of study can probably figure things out on a technical level.
Now, there are lots of things I learned in college I wouldn’t have learned on my job. But they’re all things that are completely unrelated to my job, like operating system details or how to write in Lisp. Furthermore, actual development work is 99% writing something in the context of what other people have written and are writing; the model of “Go off by yourself and write a practice program, which you will then discard and never have to maintain ever again.” is very poor practice for programming as I’ve experienced it in the real world.
How many self-taught programmers have you worked with? How many job interviews have you conducted?
I ask because your answer suggests that you are a relatively recent grad.
If you don’t have the counterfactual example of what it looks like to interview or work with non-CS majors, your confidence in the 90% estimate should be low, I would think.
Younger people without degrees working as programmers are rare-ish nowadays, but I can think of a couple of people I know personally who skipped degrees. Both, though, are the kind of kid who’s been programming since the age of 6.
Programming is also one of those fields where it’s possible to sit down with a couple of senior programmers for a few hours and for them to get a very realistic idea of your skill level.
A degree gets you past the HR drones more than anything else to give you a chance to talk to the technical people.
I know a few decent programmers who did not get degrees in CS. One was halfway through when he dropped out, partly for money reasons and partly because he got a programming job offer.
Most of the non-degreed people I have interacted with are the “brute force” types, though. Some are the “buzzword of the day” types.
One of the things I really appreciate about working with more recent grads is that they tend to use newer paradigms in effective ways, and then I get to pick up those skills.
I’ve been programming for about a decade, and with maybe a handful of exceptions, all of my co-workers and non-professional co-programmers have had degrees. But what I’ve noticed is that the degree by itself doesn’t seem to predict much; I’ve had a fair number of programmers with degrees and not a lot of appreciable skill. What I haven’t seen is any programmers who have their own independent projects and are willing to talk through what they learned by coding them who aren’t, at the bare minimum, competent.
That being said, people are clever bastards, so I have no doubt that even now there’s a degree-mill equivalent program teaching people to clone GitHub projects by rote and mention a few interesting talking points in the code.
I’ve long thought that Maintenance 201 should be a core CS class. It would revolve around some large program that the students need to use and develop. Each subsequent class would be given some set of changes to make to the existing codebase bequeathed on them by the previous class. Perhaps even make students take this course twice, so they can feel the full weight of their previous transgressions.
After a few iterations, that codebase will drive men mad who attempt to read it, which is valuable preparation for the work world.
“euck! What does this comment even mean! ‘ph’nglui mglw’nafh Cthulhu R’lyeh wgah’nagl fhtagn’ and why is there an object called ‘blood cache’ here.”
> After a few iterations, that codebase will drive men mad who attempt to read it, which is valuable preparation for the work world.
Oh, yes. Even lower down the chain than the rarified world of coding, it would be invaluable experience: “That data you need for the report due by 8 a.m. tomorrow is in a file somewhere on one or the other of the drives to which you may or may not have access, in amongst ten years’ worth of work done four different ways by twelve different people across three different applications. Have fun searching!”
I am a programmer too, and I disagree that “Go off by yourself and write a practice program, which you will then discard and never have to maintain ever again” is a worthless exercise.
The first larger project students do is mostly an eternal mess, and so unmaintainable that many barely finish it. That is why you could not possibly allow them to have that experience on the job. The mess you generated and suffered through is what makes you understand why code cleanliness matters.
Also, I worked independently for longer stretches multiple times, and being able to finish a task independently, without everyday supervision, was an advantage on the job. Being able to do a bigger thing without constant supervision gained me trust and more interesting tasks than people who could be trusted only with bite-sized tasks or who needed constant attention from management.
Teaching maintenance doesn’t mean not teaching greenfield development.
Mmm. My own CS experience was that there was one professor widely regarded as an evil bastard, because his classes did follow the paradigm of “This bit of functionality is your first programming assignment, improving it is the next, refactoring both is the third, doing a major change that requires you to have really refactored well is the fourth, and so on.” For the students who had only done the throwaway project (and expected to be able to skip or blow off a section without it impacting their entire ability to continue in the class), it was one heck of a weeder course; nothing makes you appreciate unit tests like having to review code you wrote months ago against a tight deadline while dealing with another, tighter deadline.
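A minimal illustration of why that structure pays off: a small regression test written for assignment one keeps working (or loudly breaks) when a later assignment refactors the code. The function and tests below are hypothetical, just to show the shape of the safety net:

```python
import unittest

def word_count(text):
    """Count whitespace-separated words.

    The implementation may be refactored freely in later assignments,
    as long as the behavior pinned down by the tests is preserved.
    """
    return len(text.split())

class TestWordCount(unittest.TestCase):
    # These tests define the contract the next refactoring pass
    # must not break.
    def test_simple_sentence(self):
        self.assertEqual(word_count("the quick brown fox"), 4)

    def test_extra_whitespace(self):
        self.assertEqual(word_count("  spaced   out  "), 2)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

if __name__ == "__main__":
    # exit=False so the script keeps running after the tests report.
    unittest.main(exit=False)
```

Reviewing months-old code against a deadline is a lot less terrifying when a suite like this tells you immediately whether your change broke the old behavior.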
I would see this as a super cool level 2. Many CS students are programming for the first time in their lives when they enter college, around the time the first project happens. The “changing requirements under tight deadline” thing is like asking drivers to do stunts in driving school.
You have to teach them about coding itself, unit tests, and proper structure before you unleash on them a tough weeder course that assumes they already know these concepts.
Students do not come in knowing everything you knew when you started to work, especially if you are someone who started either coding or administration during high school. Most come in knowing much less.
I only use a little bit of the knowledge I gained from college (I majored in a hard science), but in the process of earning my degree I was forced to repeatedly practice a hard-for-me-to-describe skill I think of as “organized thinking”, which has proven invaluable in my professional career.
And the actual knowledge that I use is mostly stuff I learned at the very beginning, which I then repeatedly applied in almost every subsequent class. By the time I started needing it in real life, I already knew it backwards and forwards.
This fits nicely with my pet Theory of Education, which is that school is less about imparting a wide variety of knowledge and more about driving a very few things into your head so hard that you couldn’t forget them if you tried.
I think that probably significantly overlaps with “logical problem solving”, which is the most valuable skill I took out of my time.
The question isn’t “did you learn nothing and just get a piece of paper?” it is “did you learn notably more than you would have just being alive and interacting with people for 4 years?”.
I’m a college dropout, and I have had conversations with college degree holders where I used information I learned in school (my midwife was surprised I knew about non-shivering thermogenesis), and I have had similar chats where my experience in random jobs X, Y, and Z that I held after dropping out was relevant. It is the supposed gap between the two (that you gain much more from college than from typical non-college experience) that is in question.
Learned some stuff.
The real charm was spending four years at a place where people’s eyes did not glaze over whenever an interesting topic of conversation came up.
I have a CS degree. I learned a decent amount about engineering while I was earning it, but I probably learned more from working on MUDs; hell, that probably taught me more than my first real programming job. On the other hand, the CS curriculum did give me a really solid applied math background that I couldn’t have easily duplicated at the time; maybe through Coursera or one of its friends, now, but that wasn’t an option then. There are a lot of programming disciplines out there that don’t need that kind of background (or at least no more than one module on computational complexity), but my particular sub-specialty is not one of them.
All my breadth requirements classes, however, gave me nothing that I couldn’t have gotten (with less propaganda mixed in) from reading Wikipedia and arguing about what I found there on SSC. And that was something like a third of my classes at my school. Maybe I’m unusually interested in history and anthropology for a techie; probably am, actually. But I didn’t get that interest from college.
But would you have actually done all that work? The applied math and the non-degree work.
I would most likely have gone into a different sub-specialty if I didn’t have the applied math background. The non-degree work I think I’d have done the equivalent of, but not in the same way: I’ve written fewer formal essays on the humanities or social sciences since graduating than I wrote as an undergrad, but I’ve probably written a lot more in total. I’ve definitely read more — and not just Wikipedia, snark aside.
I suspect that much of your ability to extract value from Wikipedia comes from the fact that college gave you a solid intellectual foundation.
How does college give you a solid intellectual foundation?
By persuading you to read a lot of stuff and think and argue about it.
Which I was doing already, and would certainly have continued to do if I’d joined the Navy or found a job in industry instead of going to college. Are there subjects it’s hard to find good material on outside of a campus? Sure. Are there subjects where it’s useful to have the structure of grades and a syllabus? Again, sure. But I don’t think your average general education course gets much out of the latter, and it’s definitely not consistently the former.
You can do a lot of thinking and arguing and end up totally unmoored from reality. I’m sure you can come up with examples, perhaps even from this comment section 🙂
Not that you can’t do that after going through college of course. But I do have the impression that self-educated people are much more likely to end up with really weird views.
College persuades you to read a lot of stuff but it’s important that the “stuff” is, say, an econ textbook instead of Zero Hedge.
I suspect self-educated people are more likely to end up with weird views because self-education is not the norm, more than because self-education itself inclines toward weirdness.
I could also point to more than a few historical schools of thought that were academically respectable and yet had, as you put it, come completely unmoored from reality; I’m not trying to get all STEM Master Race here, my beef’s with the requirements structure rather than the humanities and social sciences in general, but it does happen. But anyway, if you’re going for “able to read Wikipedia productively”, I doubt weirdness matters much.
1.) Selection: there is a difference between pursuing what you personally find interesting and learning a set of things picked by someone with a good background in the field. The second case has you learning things that you personally find less interesting, but that are important to or representative of that field.
2.) I do not know many people who would actually solve tough physics exercises as part of their pleasure physics reading. There are plenty of people who read pop physics and think they are doing the equivalent of college physics.
3.) Assuming a good college/school, school forces you to do things you personally dislike and would not do on your own. Where you would normally stop learning, an incoming tough math exam gives you no choice but to battle it out and learn and understand the damn theory/exercises.
Yeah, coercion is part of the whole thing. And for all the tales about internal motivation, when you do not have it for something, external motivation works too. Less well, but better than no motivation at all. And you end up better for it in the long term.
Most people learn languages because they need to, not because they love memorizing words or drilling grammar. External motivation (exams) and deadlines help a lot in that case.
You seem to be arguing against something that I’m not arguing for.
My experience (UK university, social-science-plus-humanities degree) was that I acquired some combination of benefits, listed in approximate order of lifetime utility:
1. Piece of paper
2. Practice writing
3. Practice being publicly shown to have no idea what I was talking about
4. Practice being the dumbest person in the room
5. Practice defending ideas that it had never occurred to me to question
6. Oddly useful tidbits of knowledge
7. Totally useless tidbits of knowledge that I would later deploy in trivia contests to cover my bar tab
#5 seems the closest to what people mean by “critical thinking.” Would I have gotten it in the military or doing something else with my time? Hard to say. But I definitely did a lot of it, in scenarios which frequently led to #3. I assume everyone would agree that #3 is important for a snot-nosed punk like myself, so it was time well-spent.
My own experience was very uneven.
Thinking back, I can point to a handful of professors and classes that were worth the price of admission.
If I had had the option to skip the rest or take other subjects in their place I would have.
What ARE critical thinking skills? I’ve never actually been very clear on that. Are you supposed to be all like QUESTION EVERYTHING all day, every day?
“Critical Thinking Skills” are the educational shibboleth to distract from the fact that we no longer teach maths, language, history or science. See also: the Arts, Humanities, all “-studies” etc.
All of these things can be well and good, but not if the kids can’t read. And the kids can’t read, I’m here to tell you.
Modern educational theory is based on ignoring the process of learning, and jumping right to the secondary characteristics of very intelligent people, and trying to teach THAT to a lot of average and sub-average kids. Hence: Common Core Maths. They took a shortcut that people who are abnormally good at math sometimes use, and created a national curriculum based on trying to force kids to do something they are incapable of doing.
As an analogy, it would be like noticing that people who read the Lord of the Rings at age ten do really well in school, so you show the movie to the whole school to raise their educational level. It misses all the prep-work and preselection that goes into the original correlation.
Actually, I think this accounts with our over-emphasis on college more generally: ability to send young people to four years of finishing school to network and think about life is largely a *result* rather than a cause of wealth. But stopping people from trying to get high status by imitating what high status people do is probably like stopping the Earth from orbiting the sun…
I’ve certainly encountered numerous bits of advice for thinking or reasoning better which struck me particularly for being things I already did without anybody ever suggesting them to me. This comment reminded me that I’m kind of curious as to whether there is very much research into which techniques of thinking are things that it makes a difference to try to adopt, vs. those that are symptoms of being higher IQ (or whatever) but not really productively learnable (and perhaps not even helpful; the correlation between people doing things some particular way and being higher IQ is presumably in some cases a coincidence rather than evidence of that approach being superior).
Well, the nonlinearity and even the absence of a simple quadratic diminishing returns curve suggests that there are other factors at play.
Some random thoughts
-What are the differences in lifestyles between people spending 6 hours per week and those spending 12 or 24+ hours per week?
-Is there any relationship between “belonging to friend cliques” and time spent in college? I’m going to guess that there would be a double peak, both low and high
-Is there a link between belonging to friend cliques and critical thinking? Is there an echo chamber effect here?
-What about a link between the course and critical thinking?
And probably a lot more. Point is, we don’t actually know. And that we don’t know much about something as basic as college just makes me go :O
The study controls for some lifestyle variables, but not many. Other research does so much more rigorously. I address your concerns about non-linearity in a comment above. The questions about friend associations are also addressed in the literature, in part by Pascarella himself! (start here)
Always do a quick google scholar search before assuming we haven’t tried to study something 🙂
Couldn’t you just as easily conclude that people are more likely to hone critical thinking skills in the context of disagreement than in the context of agreement, so people who are more likely to disagree with their peer group or instructors would get more practice (examining their evidence, refining their logic, etc.)? Then it wouldn’t be a matter of “requiring partial suspension” of critical thinking so much as not being forced to engage in it nearly as often.
I would also expect, say, “aggressive, confrontational” grade school children to be more likely to show gains in pugilistic skills between kindergarten and sixth grade. Being “aggressive and confrontational” is neither necessary nor sufficient for learning a martial art, but a positive correlation would make sense, regardless.
And if colleges are developing an ideological monoculture, wouldn’t marginalized dissenters be getting significantly more opportunities for real training in critical thinking?
Don’t want to bore you, Scott, but if you could at least entertain the working hypothesis that the job of colleges is Progressive indoctrination (see the tell-tale “a better educated population makes a better functioning democracy”, i.e. Prog professors will teach the proles what values to vote for), you could see the idea of critical thinking as having a double meaning: one is the more objective meaning, and the other is the anti-traditional-values and similar, more indoctrination-oriented stuff.
Read Richard Rorty’s infamous “Universality and Truth” essay where he argues that as no universal truths exist, all education is necessarily indoctrination and the often quoted part: “The fundamentalist parents of our fundamentalist students think that the entire ‘American liberal establishment’ is engaged in a conspiracy,(…) The parents have a point. Their point is that we liberal teachers no more feel in a symmetrical communication situation when we talk with bigots than do kindergarten teachers talking with their students. (…) We are going to go right on trying to discredit you in the eyes of your children, trying to strip your fundamentalist religious community of dignity, trying to make your views seem silly rather than discussable.”
This is another potential interpretation of “critical thinking”.
Of course you could just say Rorty is just the opposite political extreme to me. True, but he was immensely influential, while I am just a random troll.
(BTW, let me clarify that I too find fundamentalist views silly; the problem is that the human mind cannot bear too much reality, so usually when you pour religion out of the cup, the end result is pouring political ideology, political salvationism, into the cup. What Rorty means is not simply emptying the students’ cups but filling them up with PC stuff like no homophobia.)
Rorty was an immensely influential troll. He had some interesting ideas amid the rhetoric, but almost everyone who is familiar with him thinks he was intentionally obnoxious to invite controversy and attention.
I think that’s a hypothesis that needs addressing (although my personal biases would lead me to tack “..and that’s a good thing!” on).
I always ask why it takes tens of thousands of dollars to teach critical thinking. If this is really the purpose of college, then it can be done in a far more cost effective manner than the university model. So the purpose of college basically comes down to “critical thinking”, signalling, and social experiences, which can all be done in more efficient ways. So why is college considered so important?
Because education in general is treated like some sort of magical panacea for all the world’s ills.
> So the purpose of college basically comes down to “critical thinking”, signalling, and social experiences, which can all be done in more efficient ways.
What are the more efficient, ready-made and packaged, socially accepted ways? The only other alternative I know of is the US Army (Navy/Marines, etc.).
Colleges exist and thrive for a reason: there is no alternative (and I suspect that, like every monopoly, academia will fight tooth and nail anything that is a threat).
Tyler’s Law says that “on every question, there is some literature”, but it often feels that “if the question is very important, the literature will often be poor.”
As a college professor I can at least attest to the attitudes of my colleagues. I don’t think anybody in my department (History) is actually trying to impart “critical thinking” to anyone. We make an honest effort to teach the substance and methodology of our field, and that’s it. “Critical thinking” is more the sell they give you on college tours and brochures from the admissions office.
I do think our more rigorous courses force students to a) read a lot, b) form arguments about their reading, and then c) deploy those arguments in the face of intellectual resistance. For incoming first-years these demands can be quite alien. I think that says something about the state of public education in the US. Anyway, it makes sense to me that any or all gains in terms of “critical thinking” would happen in the first year. My colleagues never tire of remarking on the relative (intellectual) uniformity of second- through fourth-year students; it is the first-years who stand out a mile. It also makes sense to me that students who are relatively alienated or feeling combative towards their fellows would be doing better on “critical thinking” metrics. These are the students who are getting a lot of practice disagreeing and making arguments. If it’s all kumbaya all the time you’re probably not flexing too many intellectual muscles.
“For incoming first-years these demands can be quite alien. I think that says something about the state of public education in the US. Anyway, it makes sense to me that any or all gains in terms of “critical thinking” would happen in the first year.”
I thought basically the same thing: first-year students regaining whatever critical thinking high school suppressed.
I don’t think they’re regaining a skill they already had; rather, I think they probably gain a higher standard for their own thinking and writing due to exposure to higher standard/more being demanded of them. Not to be elitist, but we had a gym coach teaching us history in high school. In college you will generally be taught history by someone with a PhD in history. That is a big difference.
In much of high school you can get away with basically writing summaries of books read for English class. In college that is generally not enough. But you will also be exposed to this demand in your freshman English class, meaning maybe 3 more years of it show diminishing returns?
Maybe what we really need is for more people with PhDs to teach high school? Like, I know it seems they might be wasted on them, and maybe they would be, but maybe you just start thinking critically when you are taught by people who uh… have to think critically (no offense, critical thinking high school teachers of whom I’m sure there are more than a few)?
From my own experience (anecdote warning), many years ago I saw that people who went to university (the UK setup is a little different from the USA’s) and only returned to their parents’ home between terms seemed to develop more maturity than those who went to university and lived at home, or returned every weekend. Is ‘maturity’ linked to critical thinking? I think so, but we need some studies…
So one confounding element I’d like to see investigated is the effect of ‘staying away from home’.
“and another study that shows a weird u-shaped effect”
Um, not really? I mean, if you plot that graph in your post with a proper y-axis, you get a “u”-shape with a negligible dip in the middle. The change in values is absolutely tiny.
This graph is yet another example of how anyone – scientists, journalists, whatever – can represent data as misleadingly as they want.
I don’t get this obsession with showing the origin on every axis, on every chart. It’s not misleading at all, just look at the damn Y axis. Maybe it’s misleading if you’re illiterate and can’t read the numbers. For the rest of us, it’s far clearer because the (admittedly small) variation from point to point can be seen more easily.
“it’s far clearer because the (admittedly small) variation from point to point can be seen more easily.”
That’s the POINT. The truth is that the difference is difficult to see. To deliberately make it easier to see is to intentionally mislead people.
I’ve noticed a lot of reasonably smart people get unnecessarily confused by misleading graphs. In order to maximize correct understanding for the largest audience, better to err on the side of standardization to an easily understood norm rather than customization for appearance’s sake.
But insofar as the u-shape is tiny, the real effect is also tiny. You can’t deny the u-shape’s significance while keeping the significance of the real effect!
Exactly. The correct conclusion is that both the u-shape and the real effect are small. I find it even more concerning that the real effect is exaggerated with such zoom-ins.
I fixed the graph! Here you go!
Is zero basis really appropriate?
Who would actually score zero on a critical thinking assessment?
Ok, good point. I do like to default to zero, but in this case the minimum score on the measure is 40, and the max is 80. I adjusted the graph to reflect that.
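The axis-bounds disagreement above comes down to simple arithmetic: the visual size of a dip is its magnitude divided by the span of the plotted y-axis. Here is a minimal sketch with hypothetical scores (none of these are the study’s actual numbers), using the 40–80 range of the instrument mentioned in the last comment:

```python
def visual_fraction(values, ymin, ymax):
    """Fraction of the plotted y-axis height that the data's spread occupies."""
    return (max(values) - min(values)) / (ymax - ymin)

# Hypothetical bucketed means on a 40-80 scale (made up for illustration).
scores = [56.5, 55.8, 56.6]

# Tightly cropped axis: the 0.8-point spread fills 40% of the plot height.
print(round(visual_fraction(scores, 55, 57), 3))   # -> 0.4

# Full 40-80 instrument range: the same spread fills only 2% of the height.
print(round(visual_fraction(scores, 40, 80), 3))   # -> 0.02
```

The same data looks dramatic on one axis and flat on the other, which is the whole dispute in this thread in two lines.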
Here’s another test for you.
After 4 years in the military many get out and go to college.
A lot fewer get out of college after 4 years and go in the military.
Thus college must be better for critical thinking!
(That was intended as humor. For the record, I’ve got around 10 years of service in uniform and have twice worked with military personnel outside of the US.)
Of course we go to college after the military! We’ve got that GI Bill to burn, and who doesn’t want to chase coeds and drink beer for a few years? Worst comes to worst, you can always go dark side, get commissioned afterward and go right back in!
I think my opinions of university are clouded by two things. Firstly, that a certain amount of knowledge/experience has to be good for society, please, please please. Now, I think we overemphasise how much university provides that, as opposed to learning practical skills. And overemphasise how reliably university courses are actually relevant and retained. It’d be better for education to be an ongoing thing. But I’m scared by the idea of reducing education.
Secondly, to the extent that university is just a big self-perpetuating selection criteria for middle/upper-class-ness, that’s a big problem, but I’m scared that in the short term rejecting it will entrench rather than break down the boundaries…
“sense of participation in a friendly, supportive peer environment may require a partial suspension of one’s critical thinking skills.”
It might not be just cynical. A friendly and supportive environment is often code for “it is inappropriate to contradict others or argue with them too much”. When people try too hard to be nice and friendly, they end up staying silent or talking around issues instead of risking heated arguments.
Of course, the alternative should not be an openly hostile environment, just one that is not so supportive that it discourages open or even heated disagreement.
Developing critical thinking skills is entirely useless even if done successfully. You’ll just produce people who get really good at making “isolated demands for rigor” (as described in a previous post on this blog). Even when you have good critical thinking skills, nobody’s going to get you to apply them to ideas which are socially esteemed within your peer group.
Suggestion for the U-shaped curve: Students who are very part-time are probably doing one night class while working. Students who are doing full-time or more are working full-time on school. Students who are taking 2-3 classes are probably screwups who can’t handle a full-time anything, and are more likely to be hard-partying folks not learning many useful things. (This is not by any means universal, but in my experience is more often true than not)
Sounds about right.
File this under selection effects trump everything part MCCXI.
The graph on college credits is interesting: 12–15 credits would be a liberal arts major, and anyone with labs would have 16 or more (maybe fewer than 15 if they took 4 courses – but that is the exception). I would suspect a strong selection bias.
As a philosophy instructor, I would like to offer some evidence to support the view that philosophy teaches critical thinking skills to majors.
Philosophy majors do better than just about anyone on post-graduate exams, including ones that are far outside the study of philosophy, such as GMATs. This suggests that they possess very general thinking skills.
The GREs are very similar to the SATs in terms of skills they measure. Philosophy majors don’t just do better than everyone here (except, in certain years, physics majors), they do so despite the fact that incoming philosophy students do fairly average on SATs. This would suggest that students are learning the skills necessary to do well on these tests while in college, rather than simply being the sorts of students who do well on them to begin with.
In terms of long-term effect, there is at least some evidence that philosophy majors maintain the skills they acquire in college. Philosophy majors don’t have good starting salaries, probably due in large part to a Marco Rubio-like prejudice against the practical value of philosophy. However, by mid-career, philosophy majors earn more than any other humanities major (and significantly more than welders, Mr. Rubio). This suggests that they have the skills necessary to do well on the job and earn promotions at higher rates than their coworkers. This would seem to show that they maintain a general ability to think through problems they face in a variety of circumstances.
None of this is conclusive, but I think it provides a pretty good initial case for the view that philosophy improves students’ thinking skills, and that it does so in a way that lasts.
>Philosophy majors don’t have good starting salaries
I even find it strange they can find jobs at all. Presumably with bigger firms that have a training plan, not the smaller ones that postpone hiring until the last minute and then need new hires to hit the ground running.
Why would they be worse off than all the other people whose degrees aren’t directly relevant to the job? An engineering degree followed by an engineering job is the exception.
Philosophy grads are smart and hard working, companies want to fill their next to entry level positions (Client representatives, sales teams, low level supervisors) with smart and hard working people. Seems like as good an explanation as any.
I would suggest that you remove references to politician’s names.
Your main point is interesting and doesn’t deserve to be subsumed by a political fight.
I think I’m past the editing window. I only mentioned him because he specifically said we needed fewer philosophy majors. Odd for a politician to take such a specific stance against a discipline.
I am not sure what you mean by “incoming philosophy students” whose SAT scores are “fairly average.” Are those students who report an intent to major in philosophy and/or populate the lower-level courses? Or are those the students who actually end up majoring in philosophy?
If you are talking about the difference in metrics between those students at the start of an undergraduate philosophy program and those at the end of a philosophy program, I’d fear selection bias. Student metrics like GPA seem to increase over the course of many college major programs, particularly those that are (perceived to be) challenging. This almost always turns out to be the effect of weaker students opting out.
Today’s mid-career philosophy majors got work and gained seniority in a different economy than today’s starting philosophy majors. I’m not sure the salary difference supports your conclusion. Also the relevance of major for salary/career prospects seems to depend enormously on the school. At my institution employment prospects are remarkably uniform across all humanities and social science majors. The Studies programs have slightly worse statistics, which again is probably selection bias. Also a few majors seem to take slight employment or salary hits but when you look closer it turns out they’re sending a lot of people to grad school. I wonder if Philosophy has that dynamic.
The difference is between students who state an intention to study philosophy and those who get a degree. It is possible that the group of students who graduate from philosophy programs is very different from the group that plans to go into them, which is one reason I’m not completely sure the data supports the conclusion. It would have to be the case that the variation is much greater than the variation in other fields in order to explain the degree of increase relative to other disciplines, though.
I’m uncertain how the changes in the economy will affect these results. We obviously can’t know the mid-career salaries of recent graduates. But I don’t see why the changes would affect the conclusion that the skills stick with people.
“It would have to be the case that the variation is much greater….”
According to your data, it is already the case that philosophy majors have the highest GRE scores, so some metrics are going to have to exceed the usual range.
Otherwise, apologies, I misunderstood your argument about wages. You want to read high salary as a proxy for skills where it might benefit your view, while reading low salary as a product of bias where it might undermine your view. I don’t think that approach can be convincing.
I don’t see why not. I’m assuming the following:
1- Mid-career salaries are a reflection of actual productivity.
2- Starting salaries are a reflection of anticipated productivity.
3- Actual productivity is a product of skills.
4- Anticipated productivity is a product of estimation of skills.
5- The greater the difference between actual productivity and anticipated productivity, the greater the evidence for bias in the estimation of anticipated productivity.
6-The greater the difference between mid-career salaries and starting salaries, the greater the difference between anticipated and actual productivity.
This would entail that the data on salaries of philosophy majors supports the view that starting salaries are a product of bias while mid-career salaries are a product of skill. It’s assumed that they will be no better or slightly worse than other humanities majors, but their actual productivity exceeds these expectations, leading to higher mid-career salaries than those from other majors. I’m not sure which of these claims you would want to reject.
What is so awful about Arum and Roska’s explanation of the relationship between student social engagement and critical thinking?
Awful in the sense of depressing – they’re saying that society rejects people who think critically.
Actual critical thinking is anti-social. Society is tribal signalling. The guy saying “are you sure about that recent story that totally validates all our biases?” is not signalling tribal loyalty.
From the meta-level, there is a balancing act to be done by a successful society. Enough critical thinking that you don’t start making insane errors, not enough to destroy the social bonds of society.
On the personal level, most of us, even the very rational and argumentative, know when to let something an older relative said slide. You have to balance being publicly right and not having a public.
Maybe the critical thinkers are rejecting society.
If we take the U curve as real it is fairly easy to come up with a hypothesis. How about the ratio of hours in class to hours on campus outside class? Students that take a lot of classes obviously spend more time in class vs time out of class. Students that take only 1 course a semester (especially as freshmen) are also (probably) not spending much time on campus. Most (all?) colleges have a minimum number of registered hours to live on campus, and 1 course wouldn’t qualify, so 1-course students are living off campus and probably working when not in class. Meanwhile students getting the “full college experience” are the ones taking the minimum number of courses to live on campus, and they are the ones scoring the worst on critical reasoning tests.
This certainly fits my perception of college life, free time is mostly wasted time.
Are there any studies about acquiring cognitive skills that don’t just treat non-college as a control? Like, not college vs not college but college vs automotive mechanics vs retail workers? I could swear I met a grad student who was into this question, but maybe she was the first….
It also seems like studies of this type would get obsolete rather quickly, unless we assume that the college experience in 1987 is the same as the college experience in 2015. (It seems like students in 2015 are explicitly demanding less critical thinking and more “safe spaces” to be free from opinions they don’t already hold, but this could just be selection bias from the media.)
If you consider ‘critical thinking’ to be ‘the ability to read, write, and understand postmodern/poststructuralist literary and/or film criticism’ (which is not a totally insane interpretation — I’ve seen intelligent people who, in good faith and with no clear ulterior motives, claim that this is the original meaning of the term), then I would make the argument that a university education is quite useful, if only because university-educated people have a better chance of having had previously encountered the words “rhizomatic”, “simulacra”, “verisimilitude”, and “detourn”. That said, there are good arguments for the position that that particular set of philosophical movements does not earn its specialized vocabulary in the same way that, say, the hard sciences often do and that the specialized vocabulary of Frankfurt School derived philosophy has an element of obscurantist shibboleth to it.
The tulip subsidies metaphor as it applies to Bernie Sanders doesn’t really work though. Free tuition at public colleges does not mean paying whatever ridiculous prices private schools decide to charge. In the tulip metaphor it would be the equivalent of having the government grow its own nice little patch of tulips, and give them away for free to anyone who wants to get married. Perhaps this would be a bit of a waste of resources compared to overhauling the whole tulip based social structure, but it’s not really all that expensive, and it conclusively solves the upward price spiral.
As long as you limit it to one tulip per citizen then such a system can work quite well.
Then how come you can’t pay for tuition at a public college with a summer job anymore?
You do realize that higher education, e.g. in Germany, works as described by “Quite Likely”. I’m not saying it’s perfect, but it can be done.
You should be somewhat familiar with the Irish system.
Irish citizens got to do a free degree.
If you had to repeat a year you paid approx 6000 Euro for the year.
If not repeating, summer work plus a very modest grant covered most of the expenses of most students. I came out of Uni with zero debt and I wasn’t unusual.
Just because the US tends to pick the worst of all possible combinations of free market and government regulation doesn’t mean that everyone does.
It didn’t seem to turn into a tulips-spiral.
Irish citizens got to do a free degree.
How many students, as a portion of the college-ready population? Compared to the USA? Compared to the UK?
If you had to repeat a year you paid approx 6000 Euro for the year.
This matches what I’ve seen of free colleges in other countries – college is free, but the first year classes are three to four times the size of the second year, with high failure rates.
There was still a need to purchase books, pay rent, etc., so there was still a socio-economic split between ‘could afford college’ and ‘could not afford college.’ With a higher emphasis on passing that first year, the students who did not have to take a second job to pay for room and board were more likely to succeed.
(This is not to say that state-funded uni doesn’t have some upsides. Just that ‘free college’ is not a universal good anywhere at this time.)
My course had the highest failure rate in the whole Uni and did indeed have a 50% failure rate in first year, but that was the exception. Most courses had far lower failure rates.
People from low income families could also get a grant towards various expenses. It wasn’t perfectly flat but you didn’t need to be particularly well off to go to uni. I worked a job all the way through as did most of my classmates.
I worked a job all the way through as did most of my classmates.
Just not the ones who dropped out, because they were among the more-than-50% who failed a required course.
(Unless I’m not using “course” the same way you are. For the USA, “course” is one set of classes generally taught by the same instructor on a particular area of study. “History of Latin America” or “Agricultural Economics and Political Policy” or “Poets of the English Romantic Period” or “Calculus II” or “Environmental Toxicology.” All of these might be taken over one semester for all the school work in that half of the school year. Is that what you meant by ‘course’?)
By course I mean the whole 4 year degree.
We’d call “Calculus II”, “network computing” or “Discrete math” a module, where a 4 year course might include between 20 and 40 modules.
As for the fraction of the population who complete 3rd level education:
The equivalent figure for the US appears to be 44%
There are a lot of people saying “you totally can in other countries, the US has a different problem”, but that doesn’t seem like a fair representation to me.
Most of the well done studies I’ve seen on US college tuitions conclude that government subsidies are a substantial cause of tuition increase, explaining perhaps half of the above-inflation price hikes.
The obvious differentiator between US colleges/tulip-subsides and European colleges/things-that-work is that the European model includes a significant amount of government influence in price regulation. It’s the same model that works with utilities and other natural monopolies – the government lets some people establish a monopoly and seek rents, but strictly limits the size of those rents.
If UK tuitions tripled overnight, the UK government wouldn’t keep subsidizing the same proportion of an education (and indeed, they would lash out quite aggressively at the people raising tuitions that far). It was a minor scandal when they moved from free to charging a few thousand pounds with need-based aid.
US tuitions, meanwhile, are set by private organizations with little intervention. When they go up, the government grumbles but still pays a roughly constant percentage of money owed to colleges. The remainder is covered by the upper (and upper-middle) class, along with a group of federally-defended loan sharks who know that none of the usual measures (i.e. bankruptcy) will keep them from getting their money.
In effect, the US system is a destructive tulip subsidy, while the European system is the directly government managed system Quite Likely described. It’s essentially a principal-agent problem: the US government committed to putting up money based on a number picked by someone they can’t control.
Just like with health care we have a uniquely bad system because we refuse to jump one way *or* the other.
The US uses a different system though.
In Ireland the state negotiates with the colleges and comes up with a rate that it will pay per student. This money never enters the students’ hands, and the Uni has strict limits on what excess fees it can charge the students. Places are allocated strictly according to a system of testing at the end of high school. John Moneybags scored 300 points while you got 310? You get the place in the course.
In the US the government offers loans to students, which they can then take to a Uni, and the university can charge whatever price it feels like. If there’s competition for places it can raise prices until students are maxing out their loans and family resources. It’s blindingly obvious that such a poorly planned form of subsidy is going to push prices up.
“The tulip subsidies metaphor as it applies to Bernie Sanders doesn’t really work though. Free tuition at public colleges does not mean paying whatever ridiculous prices private schools decide to charge. In the tulip metaphor it would be the equivalent of having the government grow its own nice little patch of tulips, and give them away for free to anyone who wants to get married”
If more tulips could simply be grown with minimal effort the price would never skyrocket. One cannot simply “produce” more college without affecting the cost/price of college. This is the best understood portion of economics: increases in demand will be met either by increases in supply or by increases in price.
You’re assuming a simple fair market rather than one where parties currently making money use their position to stifle competition.
This isn’t a theoretical thing. Lots of european countries run publicly funded uni systems.
And how are you going to pay the gardeners for your tulip garden?
What are you doing to do when the tulips from that garden are distinctly inferior because the grade of gardener is inferior?
What happens when people who buy their own tulips revolt at the subsidy?
Top 10 rankings contain 5 US universities; 4 of the others are UK universities, where the prices are strictly controlled by the state.
So there’s something wrong with your assumptions.
Cambridge, Oxford, UCL, Imperial. How are these soundly beating so many US universities when by your logic they should be producing inferior quality?
Well, here in the US we don’t have students rioting over student fees.
Traditional research universities and four-year colleges seem to be a strange sort of institution that isn’t readily created to meet either economic demand or government policy. I can’t find a sorted national list, but here in California the last state-founded research university was UC Irvine in 1965; the three CSU campuses founded since have a total enrollment of less than 3,000 and do not seem to have any great prestige.
Governments can build community colleges on demand, and the private sector builds for-profit colleges, but these don’t have the track record we are really looking for in the critical-thinking department, or for that matter in the job-skills department or the granting-valued-credentials department. Building good institutions has always been a black art; maybe the federal or various state governments still have the knack for it in this arena, but I’m skeptical of any plan that depends on large numbers of new research universities or four-year colleges being built ex nihilo, by anyone.
UC Merced was founded in 2005 and currently enrolls about 6000 students. But it’s not very prestigious compared to the other UC schools, at least not yet.
Not sure how I missed that one; thanks for catching it.
So, in the past fifty years, the population of Californians age 18-22 has increased by roughly two million. In the same period, the government of the state of California has managed to establish new four-year colleges and universities with a total enrollment of about ten thousand. The government “growing its own nice little patch of tulips” does not seem to be the answer here, unless someone knows the secret for increasing Sacramento’s performance in this area two hundredfold.
Embiggening the tulips might do a bit better, as the existing UC/CSU institutions increased their enrollment by 200,000 students over the same period. So if we’re willing to accept that only 10% of students will go to 4-year colleges, the state can maybe keep up with that. Or we can figure out how to make the state government ten times as effective at college expansion, without busting the budget or reducing quality by creating unmanageable behemoths enrolling half a million students.
Or we can maybe do something completely different.
Where do you get your population figures?
You can’t reproduce the labs that easily.
There’s archaeology, but there’s not a lot of jobs in that, and they’re closely tied to research; if you’re not doing grant-funded research, you make most of your money as an archaeologist by analyzing sites dug up during major construction projects (and signing off that continuing to dig isn’t going to be a scientific or cultural disaster).
I can’t think of any others. Architecture has some similar features but it’s not a perfect match, and it’s one of those jobs where what you actually do early in your career is totally different from what you trained to do in school.
Essay writing is necessary for almost any worthwhile degree, even STEM (if you define “essay” broadly), but essay grading is not doable for free online yet (the big MOOCs have tried but are not really anywhere close) and actual essay feedback is of course impossible.
I’m suspicious of “moderately good [study] designs”… Seems like “a little bit pregnant”. If there’s one thing I’ve learned from SSC, if a study isn’t bulletproof, the most you can confidently conclude from it is “something happened”!
In this case there’s a pretty obvious problem with the first Pascarella study, which is that subjects self-selected whether to be in the control or treatment group. Given that going to college is a Big Life Choice, it’s likely tied to tons of possible confounders: values, ambitions, family circumstances, like / dislike of school, etc. Yes, Pascarella tried to pair students along the dimensions that he could measure–but all that means is that his measurements don’t detect whatever difference that led to the students taking dramatically different life paths.
To be fair, I can’t imagine a study design that would address this issue, short of building a placebo university (Go PU!!). But I think the right response isn’t “well, we did our best, here’s the numbers”, I think the right response is “well, we don’t have the tools to study this problem quantitatively, let’s study it qualitatively”.
There is no such thing as a perfect study design, especially when you can’t ethically randomize people. The best you can do is try a bunch of different moderately good study designs, replicate them a few times, and declare victory if everything is pointing to the same place.
You can also just think about it logically and decide that it’s hard to see why choosing to go to college (after controlling for how smart you are, etc) would make you gain critical thinking skills more quickly. Or why that would be correlated with how much you study. Or why it would show a linear relationship with how many hours of college coursework you took.
There has to be a middle ground between gullibility and Cartesian skepticism.
Well, let’s say hypothetically that “motivation to learn” is a variable that leads to increase in critical thinking skills. I would expect it to correlate with deciding to attend college, as well as # of hours you study / amount of course-work you take. Maybe in a world without college, where instead people just got a 4-year vacation, those people would be the ones spending their discretionary time reading books or blogs or having long intellectual discussions in coffee-shops, and you’d see the exact same effect.
I agree there should be a middle ground between gullibility and skepticism, but I feel like it should be drawn somewhere other than here 🙂
Or rather — I feel like it’s being drawn based on the limits of our investigative tools, rather than based on reality. If you’re trying to get to the moon, and your spaceship can only travel 100,000 miles instead of the full 239,000, you don’t launch the spaceship, you go back to the drawing board.
I thought about this a little more, and realized the question “how credulous should we be when looking at research like this?” is bigger than I thought it was.
This is not value-neutral research. Colleges aren’t some fundamental law of physics; rather, they are a contemporary societal institution. Colleges are arbitrary in the sense that, as they exist today, they represent a tiny quadrant of the possibility space of ways we could raise our young to adulthood. This tiny quadrant is not interesting because it is especially illustrative of fundamental laws of human learning; it’s interesting because it exists here today and we all live with the consequences of it.
So the decision to study the performance of colleges isn’t an abstract scientific one; it carries political implications, because implicitly it is putting the question “is the status quo good?” on trial. Moreover, Pascarella’s decision to choose “critical thinking” as the thing to study about college implicitly contains value judgments about the purpose of the educational system.
This study is a referendum on a social institution. It’s an attempt to say “this is what colleges should do; are they doing it?” That’s why this research exists and why it gets headlines and why people talk about it.
So the standard for evaluating said research should be, does it meaningfully contribute to the discussion? Does it move us closer to a place where we can say “yes, colleges work”, or “no, colleges need reform.”
When I look at it in that light, I think my original critique of the study was too mild. I don’t think you can meaningfully take a social institution that bifurcates society into participants and non-participants, compare participants and non-participants along one narrow dimension, and say anything that meaningfully contributes to an evaluation of the worth of that institution. We need the full context: why people chose to participate in the institution, why others did not, how that affected the lives of those who did and those who didn’t across all dimensions, and the impact that having these distinct groups has on society.
I guess what I’m saying is that the analytic tool of “controlled study” is not a good tool for evaluating non-linear systems as complicated as college is. Understanding a relationship between two variables without understanding the entire context of the system that mediates that relationship is not helpful for holistically evaluating the health of the system.
Too much critical thinking may lead a good liberal arts college senior to ask themselves “Wait, I just spent $100,000 on WHAT?”, which should quickly dampen either college enrollment or critical thinking 🙂
Maybe we can judge gains in critical thinking by comparing how many humanities undergrads start out seeking PhDs, and how many drop out before they complete one. 🙂
We don’t know that. All we have is a regression coefficient and an R². We know it’s not a U-shape because they tested that, but it could be anything else. It could be monotone but not linear or it could be a crazy zig-zag. They didn’t produce a graph. They didn’t even give us the 6 numbers corresponding to the same 6 buckets as in the 2-year college case. In fact, they imply that they didn’t graph the 2-year case until after they had a regression coefficient saying that it was U-shaped. All we know is that the 4-year effect size is only 1.5x the 2-year effect size. That’s plenty small enough not to be linear, or even monotone.
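The U-shape check being discussed here is just a comparison between a linear fit and one with a squared term: a convex (U-shaped) relationship shows up as a positive coefficient on the squared term, and the quadratic model will fit better than the line. A minimal sketch with made-up bucket means (none of these numbers come from the study):

```python
import numpy as np

# Hypothetical credit-hour bucket midpoints and mean critical-thinking
# scores -- illustrative numbers only, not the study's actual data.
credits = np.array([3, 6, 9, 12, 15, 18], dtype=float)
scores = np.array([56.4, 56.0, 55.7, 55.8, 56.1, 56.6])

# Fit both a line and a parabola (polyfit returns highest-degree
# coefficient first, so quad[0] is the x^2 term).
lin = np.polyfit(credits, scores, 1)
quad = np.polyfit(credits, scores, 2)

def r_squared(coeffs, x, y):
    """Coefficient of determination for a polynomial fit."""
    resid = y - np.polyval(coeffs, x)
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

print(quad[0] > 0)                                            # True: convex, i.e. U-shaped
print(r_squared(quad, credits, scores) > r_squared(lin, credits, scores))  # True
```

As the comment notes, with only a regression coefficient and an R² reported, nothing like this can be reconstructed from the paper itself; the curve could be anything between the bucket means.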
I’m a tiny bit proud of Tumblr today, because this nonsense from 2012 (“Vatican in awe! Vatican in dismay!”) showed up again as a post by someone for whatever reason, got one uncritical comment, but was then taken apart by theology/history students (and let me clarify, these were not religiously-motivated; one person referred to “Jesus fandom”, “fanfic” and “Big Name Fans” so treating the early centuries of Christianity on the same level as fandom wars – which is not the worst way of looking at it – while still explaining the errors in the posted article).
That is, somebody who had knowledge of the topic and was not writing from a sectarian viewpoint was able to say not alone “this is dumb”, but why it was dumb. (I contributed my own little pebble to the pile re: history, but it was only following in the wake of the other). It was a real pleasure to see someone using their historical studies to educate and inform.
So I’m feeling much more optimistic about young people going to college nowadays 🙂 I don’t know exactly if that was an example of critical thinking, but it was a lovely example of using knowledge accurately, never mind if they believed/disbelieved.
If school taught critical thinking, or how to think generally, you would expect instruction in one subject to translate to better performance in other subjects.
There’s a fairly large literature on the transfer of learning in schools that finds this doesn’t happen. If you teach people Algebra, they’ll do better in Calculus, but they won’t typically do any better in language, art, social science, history, English, or other mostly-unrelated subjects.
See here for a survey, and ideas on how to improve the matter. First chapter available here (pdf), worth your time.
This seems like a complete death knell for the idea that school as-is teaches critical thinking.
(Sub-note: If it feels like this must be wrong, because you can remember a time you applied some principle of Calculus to a problem in history, consider that you are probably smarter than the average bear, and that this research is about the average bear.)
My understanding of critical thinking is somewhere along the lines of “Encouraging students to dissect and analyze a topic as opposed to taking it at face-value.” Which I think is valuable in both STEM and non-STEM.
(Apologies if the following is something that should be in another thread. I believe it’s related enough to this post, but if not, let me know.)
My preference is that schools teach a combination of accurate facts and critical thinking. I would rather not rely on only test scores, or only free-form essays, to judge quality of education. Does anyone who has school-age children (or remembers being a school-age child!), or just plain knows about the way schools in the U.S. are structured, know an effective way of making sure the school they’re sending their child to isn’t useless?
Scott is simply trying to pull up the ladder.
After having attained his degree, he decided it would be easier to oppress other people, than to compete with others.
A diploma for everyone!
Just for the record, would that be an example of critical thinking learned at college, or acquired autodidactically?
Please comment on this:
(Seems to demonstrate mind/brain distinction, since there certainly IS such a thing as a male or female MIND! (Female minds are cuter.))
They’re making the same error people make when they talk about IQ not being real. Scott had a great post about this problem relating it to comas here.
This is idiotic. They say:
“In total, the group identified 29 brain regions that generally seem to be different sizes in self-identified males and females.”
Then they say that since very few people have exactly gender-congruent traits in every one of the 29 regions, “there’s no such thing as a male or female brain”.
Imagine I said “there’s no such thing as people who act in masculine or feminine ways”. I prove this by making a list of twenty-nine male vs. female cultural roles. For example, “men wear suits and women wear dresses”, “men like to play football, women like to cook”, “men eat steak, women eat salads”, and so on.
Then I note that in my sample of a thousand people, nobody has every single male trait. Therefore, there’s no such thing as people who act in masculine ways, and the stereotype that men act in masculine ways and women in feminine ways is a myth.
On the other hand, if you were a little smarter than that, you could say that there are a lot of men who act in 20/29 masculine ways, and a lot of women who act in 20/29 feminine ways. Or you could say on average men act in 15 masculine vs. 5 feminine ways, and on average women act in 5 masculine vs. 15 feminine ways. On this analysis, masculinity and femininity seem to exist after all.
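A minimal simulation (with a made-up 70% per-trait congruence rate, purely to illustrate the arithmetic) shows how “almost nobody matches all 29 traits” coexists with clear average differences:

```python
import random

random.seed(0)
N_TRAITS = 29
P_CONGRUENT = 0.7    # assumed chance each trait matches one's own sex
N_PEOPLE = 10_000

# Each person's count of sex-congruent traits out of 29.
counts = [sum(random.random() < P_CONGRUENT for _ in range(N_TRAITS))
          for _ in range(N_PEOPLE)]

all_29 = sum(c == N_TRAITS for c in counts)
majority = sum(c >= 20 for c in counts)

print(f"fully congruent on all 29 traits: {all_29 / N_PEOPLE:.1%}")
print(f"congruent on 20+ of 29 traits:    {majority / N_PEOPLE:.1%}")
print(f"average congruent traits:         {sum(counts) / N_PEOPLE:.1f}")
```

With 29 independent traits at 70% each, the chance of matching all of them is 0.7^29, well under one in ten thousand, yet the typical person still matches about 20 of 29 — the 20/29 pattern described above.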
This is the classic example of an isolated demand for rigor – take your opponents’ position, define it as true only if it is true in a perfect crisp Platonic sense in which even “chairs exist” is not true, then when that’s obviously wrong declare that you have proven your opponents false.
Come on, people, you don’t need me to tell you these things.
But can we really be sure that this thing we sit on has the chair-nature?
Does it have the credentials?
Tip: Don’t ask this question of a Zen master. At least, not in person.
I still don’t understand how we are supposed to believe, at the same time, that:
A) there are no innate mental differences between men and women, gender is a social construct; and
B) transgender people are born with the mind of the gender they identify with, and their body is mismatched.
At least TERFs reject B, and trans-activists reject A. But the mainstream progressive position seems to require holding both A and B! How does that even work?
Transgender people are not the same thing as masculine women or feminine men. They are supposed to be people who “feel like being the other gender”.
Put another way, a woman who wants to join the Marines and be tough, or a woman who likes doing high-voltage engineering, is not transgender. A man who reads and writes romance novels or would like to teach small kids is not transgender. My Little Pony fans are not transgender. This is what A refers to.
On the other hand, a person born with a penis who plays shooters and hates romance can still be transgender and feel like a woman. This is what B refers to.
Being transgender is about identity, while the male/female brain thing is about ideas of the proper way to be a woman or a man.
The social construct part is about girls wearing both skirts and pants while boys wear only pants. It is unlikely that there is anything genetic about that particular preference; however, very small kids already have it and know who is supposed to wear what.
While of course everybody has contradictory beliefs about some things, and no doubt some progressives have the contradictory views you mention here, you have not exhausted the territory. I am something of a progressive myself; I don’t know if I’m mainstream, but I do, for example, support the Democratic party, which seems like an awfully mainstream leftist thing to do in the U.S. I’m certainly not a TERF, and neither am I a trans-activist. But rather than accepting both A and B, I reject both. Or at least I think they are both false, though in some circumstances a simple claim is better than a complex claim, and so when the truth is complex sometimes a simple falsehood is a better guide than trying to apply the complex truth.
“Critical” means “decisive”. Which implies decision making and judgement. So my idea of critical thinking entails “accurately judging the truth-value of a proposition”. Which is different from (for example) questioning the source’s ulterior motives, or recalling information from memory.
(Besides the definitional dispute,) one issue I have with “critical thinking” is that people think it’s a useful skill per se. A panacea to any and all questions. Like I said under the Reporter Degrees of Freedom post, “Are fusion reactors just around the corner? Just apply critical thinking!” 
No, that’s not quite how it works. I do think that critical thinking as an attitude can (once acquired) be generalized across several fields, like logic can. But the ability to productively apply critical thinking to a particular topic only comes after acquiring a basic understanding of the relevant field.
As an analogy, critical thinking : basic understanding :: Swiss Army knife : raw material. A Swiss Army knife can be put to lots of purposes. But it’s not very useful on its own. It’s a tool used to shape and manipulate a material such as wood.
Similarly, critical thinking needs a “model of the subject matter” to analyze in order to be useful. Otherwise, the only things to analyze are the extant contents of one’s mind, which are probably not a suitable medium for the desired object. Alternatively, if you try to cut something too “dense” like steel with nothing more than your pocket knife, you’re gonna have a bad time. (I picked a knife analogy deliberately, because “analysis” is semantically similar to “cutting”.)
But I think most people’s idea of critical thinking is “you can solve all the world’s problems with this one weird trick (without investing the time to grok the pertinent field)”. Since critical thinking is supposedly so compact and versatile, that’s what we expect kids to be taught in school. And on the off chance that “that one earnest nerd” takes the critical thinking lessons to heart, (s)he thinks they’re suddenly an expert in Quantum Tachyon Defibrillation. There’s a redditor born every minute.
 “You don’t have enough fissile material” said Tom critically.
For those of us teaching philosophy and similar classes, what are some ideas for how to encourage genuine critical thinking among students? (I adopt, as an operational definition of ‘genuine critical thinking,’ being like Scott.)
Here are a few of my ideas:
(1) Use argument mapping: http://dailynous.com/2015/01/12/mapping-philosophical-arguments/
(2) Teach students probability theory.
(3) Emphasize the evaluation of arguments, rather than positions. Some textbooks do this better than others. In applied ethics, Boonin and Oddie’s What’s Wrong? does an excellent job of focusing students’ attention on specific arguments related to hot-button topics.
(4) Discuss politically sensitive topics, and encourage students on *all* sides of issues, including non-politically correct sides, to speak openly. Consider engaging in the following dialogue with your students, borrowed from Jonathan Haidt (http://heterodoxacademy.org/2015/11/24/the-yale-problem-begins-in-high-school/) [“Centerville” is an elite private high school]:
Me: What kind of intellectual climate do you want here at Centerville? Would you rather have option A: a school where people with views you find offensive keep their mouths shut, or B: a school where everyone feels that they can speak up in class discussions?
Audience: All hands go up for B.
Me: OK, let’s see if you have that. When there is a class discussion about gender issues, do you feel free to speak up and say what you are thinking? Or do you feel that you are walking on eggshells and you must heavily censor yourself? Just the girls in the class, raise your hand if you feel you can speak up? [about 70% said they feel free, vs about 10% who said eggshells ]. Now just the boys? [about 80% said eggshells, nobody said they feel free].
Me: Now let’s try it for race. When a topic related to race comes up in class, do you feel free to speak up and say what you are thinking, or do you feel that you are walking on eggshells and you must heavily censor yourself? Just the non-white students? [the group was around 30% non-white, mostly South and East Asians, and some African Americans. A majority said they felt free to speak, although a large minority said eggshells] Now just the white students? [A large majority said eggshells]
Me: Now let’s try it for politics. How many of you would say you are on the right politically, or that you are conservative or Republican? [6 hands went up, out of 60 students]. Just you folks, when politically charged topics come up, can you speak freely? [Only one hand went up, but that student clarified that everyone gets mad at him when he speaks up, but he does it anyway. The other 5 said eggshells.] How many of you are on the left, liberal, or democrat? [Most hands go up] Can you speak freely, or is it eggshells? [Almost all said they can speak freely.]
Me: So let me get this straight. You were unanimous in saying that you want your school to be a place where people feel free to speak up, even if you strongly dislike their views. But you don’t have such a school. In fact, you have exactly the sort of “tolerance” that Herbert Marcuse advocated [which I had discussed in my lecture, and which you can read about here]. You have a school in which only people in the preferred groups get to speak, and everyone else is afraid. What are you going to do about this? Let’s talk.
(5) Have your students take the “ideological Turing test.” Assign them to write two short essays on some divisive topic, one arguing for their position, one arguing for the opponent’s position. Consider having other students read these and guess the authors’ true position, although this would put a lot of burden on the students unless your class is very small.
I have done (2) and (3) in my classes, and found them to be fairly successful. I very much want to try (1) and (5), and, with the current campus crazies going on, find myself feeling more of a need to engage in the kind of dialogue mentioned in (4), although I haven’t done so yet.
“I adopt, as an operational definition of ‘genuine critical thinking,’ being like Scott”
I appreciate the compliment, but can we try to tone down the level of cultishness around here?
How shall we fuck off, oh Lord?
… Now I don’t know whether I should follow this directive or not!
If it makes you feel better, I think you’re wrong about lots of stuff.
Also, writing ‘being like me’ would have been poor form.
Eeney-ooney wannah! Eeney-ooney wannah!
There is a subreddit solely dedicated to how terrible you are. Does this increase or decrease your perceived status as a cult leader?
If it’s the one I am thinking of, it’s A: about how the slatestarcodex subreddit is terrible, and B: just one goddamn guy posting everything.
You know, I was just thinking that it was probably no coincidence that your triumphal East Coast tour was followed by PSJ’s brutal smackdown in this thread. Adulation will make you careless! You should be wary.
It was a really, really easy mistake to make. The only figure they included in the entire paper was showing the one effect they found likely to be spurious. Even if their research was fine, their writing organization was horrendous. 🙁
If you like, we could hire a guy to follow you around whispering “Always remember you are but a man.”
Based on my experience of taking a lot of classes designed to teach practical critical thinking (i.e. pick apart the methodology of these five journal articles and design your own experiment), I’m comfortable saying that you cannot teach critical thinking in a classroom. The students who care already figured it out years ago, so you’re just torturing the rest of them who are only there for the degree.
My best advice is to take whatever it is that you’re supposed to do with philosophy in the real world and make them do that. Real problems are much less forgiving than any professor and troubleshooting a failed attempt provides much more feedback. Two months of working as an intern is worth two years studying as a student.
Real problems are much less forgiving than any professor and troubleshooting a failed attempt provides much more feedback. Two months of working as an intern is worth two years studying as a student.
Agreed, with the caveat that one has to ensure the feedback is titrated to the student’s expectations/understanding. Some students will be devastated that so many holes (i.e., more than two) are found in their solution; others will insist the problem itself was at fault for failing to conform to the perfection of their solution. (I expect that most instructors already know this, but I’ve had classes from ones who didn’t.)
Really? I’m still learning about experimental design. There are such a vast number of ways to design an experiment that figuring out which one is superior for each type of problem is very difficult. I certainly wasn’t very good at it when I finished my Bachelor’s, and I would say that is an important part of critical thinking. Many who seem otherwise quite capable know little about it.
Step 1, have them write a paper where they are supposed to use critical thinking.
Step 2, point out in detail what they did wrong.
Step 3, make them re-write the paper incorporating your criticism.
Step 4, do steps 1-3 over and over again until they form a model in their mind of what you’re looking for. Conveniently, this “model of what you’re looking for” is known as “critical thinking.”
I don’t mean to get too entirely meta, but, one of the main findings of “teaching research” is that teaching anything is made extraordinarily difficult by the general tendency for humans to think they have understood something, and then stop paying attention or exerting effort, way before they have actually correctly understood that thing. This is why the Direct Instruction method of pedagogy has been shown to be superior, IIRC. It’s a method based on showing not only examples of a concept, but examples of what that concept is not, emphasizing and forestalling the types of errors the student may be making in their impressions.
So, if you’re in a course that’s supposed to be teaching critical thinking, the brain of every student is probably saying, “Oh, I already know how to think critically. Obviously none of these other saps do. I’ll just go through the motions and listen to what the professor is saying, but really, what does he have to teach me?” In order to break through this kind of defense, you have to repeatedly nail the student on their actual failures to think critically.
I feel like it’s blindingly obvious that my 4-step process is the correct and only way to go about this, and of course the lack of funding and poor student/teacher ratio etc. etc. makes it impractical, which means, basically, actually teaching critical thinking is impractical.
Good thoughts. I know other philosophy instructors who have had success with the following sequence of assignments:
(1) Reconstruct philosophical argument X in half a page.
(2) Taking into account my critiques of your original reconstructed argument, rewrite it so that it’s valid, and then offer a criticism of the argument.
(3) Taking into account my new critiques of your reconstructed argument, which is still not valid, rewrite it again, and then offer a response to your criticism.
(4) Revise it one last time in response to one last set of critiques.
Argument mapping is very useful, I’ve found. In all of my philosophy classes, I teach argument mapping the first day and require the students to make an argument map for every reading. The feedback I have gotten has been mostly positive.
Thanks! Do you have any recommended argument mapping resources? I haven’t looked at much other than the links in the Daily Nous article.
Not really; you’d think it’d be more popular. Some basic logic texts start out with some type of mapping. Oddly enough, Schaum’s Outline of Logic is one of the best at doing this. But I mostly adopted it after following the work of Tim van Gelder in the early 00’s: http://timvangelder.com/
Show where their assumptions break.
I’m an amateur talking to a pro here, but I did something similar in a history class targeted at high-schoolers.
I kicked it off by polling them on the following questions:
1. Should a pastor simultaneously act as a government official, such as a governor or as an ambassador?
2. Should a pastor call out a politician by name for his policies?
3. Should the church be willing to use state force to combat heresies?
They answered (almost unanimously) as you would expect. Then I presented actual events where Ambrose was faced with those questions, and asked what they thought he should have done. They flipped almost immediately on the first two questions, and I got them to waver on the last one when I brought ISIS into the mix.
I can’t promise that they thought more critically afterward, though, or even remember it now.
I have a different question: let’s assume college *does* teach “critical thinking”. What’s so great about critical thinking? My seniors are definitely way better at writing a paper analyzing a novel than my freshmen, but do they have fewer cognitive biases? Are they more likely to be informed voters? Or make better life choices in general? I’m pretty skeptical. Maybe instead of college Bernie Sanders could just send everyone a free copy of the LessWrong Sequences?
Reminds me of this tumblr ask, which wondered whether reading LW and SSC could substitute for a university education.
His uncle wasn’t referring to the generic critical thinking of: take this text from any domain and evaluate its truth claims.
Instead he’s probably referring to the idea of ‘critical thinking’ as challenging the beliefs and values of the middle class. The Zinn-esque: “Oh, you were taught Thomas Jefferson was a great American, huh?”, “Oh you think the police are here for your protection?”
I’m pretty skeptical about how well LW or SSC work at debiasing their target audience of smart but questionably rational folks with solid science or philosophy backgrounds and ideological commitments to empiricism. Start showing them to random high school graduates and I don’t know what’ll happen, but I really doubt it’ll be in the direction of less bias.
Just showing them unedited would probably not be useful for a large segment of said population, but some sort of digest version or a set of thought exercises based on them might help inculcate the basic ideas, which are really not that complicated, but which most people haven’t really thought of. Or maybe make them read “Harry Potter and the Methods of Rationality”? I haven’t looked at that myself yet, so I don’t know how effective it might be.
I guess my point is that students are definitely learning to do something in college, but I’m not entirely sure there is a long-term transfer effect on the quality of their thinking for most of them.
If there’s one sort of depressing fact I learned in the realm of exercise, which I think also applies (to a lesser extent) to thought, it’s that there’s less transferability than you think: you can’t become a great pitcher by throwing a lead ball. This will train your muscles to throw a lead ball, not make them super good at throwing a regular ball. Similarly, I feel like if the goal is to produce citizens who are more informed voters, or who make better life decisions, or who are not taken in by spurious news stories, then you might as well train people in those specific skills, rather than hoping that the ability to analyze Moby Dick will somehow transfer to those.
On the other hand, there are base level skills like pure muscular strength, which do seem to have some use: pitchers do lift weights because, while it doesn’t increase their pitching skill, it does increase the gross level strength of the muscles involved. I’m not sure which mental “muscles” might be equivalent–some claim that doing logic puzzles, crosswords, studying foreign languages and otherwise challenging the brain can help, though this seems to be more in terms of preventing age-related decline.
I guess it gets back to my general sense that college *is* accomplishing some good things, but that those good things are so completely tied up in a mess of largely unrelated, needlessly expensive, and, in a few cases, actually harmful things, that I very much suspect we could get the pros of college for much less time and money if we better understood the goals.
Methods probably works better as advertising for LW/SSC/MIRI than it does as a teaching tool; its characters pull off a lot of neat stunts, but not in a way that’s easy to duplicate (and which might embarrass you if you try). And it doesn’t give you enough of the background to figure it out for yourself, where it’s even possible outside Fictionland.
Much like all the other fiction-as-philosophy that I’ve ever read, come to think of it. Stranger in a Strange Land, Ender’s Game, The Game, anything Ayn Rand ever wrote…
Yeah, the main thing HPMOR actually teaches you is that if you have a time machine you can pull off damn near anything, to which most people would reply, “well, duh.”
It’s fairly useful in making people think, “Hmm, maybe some of the other stuff this guy has written is worth checking out,” though. MAYBE it’s enough to help make some people think, “Hmm, maybe rationality should be more of a consciously applied process than something I expect to be able to do instinctively,” but in general I’d imagine that someone who hasn’t already grasped that, at least to some extent, would drop the book after the first couple chapters.
Not sure if anyone has commented about this (300 is a lot to read through), but what about autodidacticism? That is, someone who is primarily self-motivated and an active, self-directed learner regardless of circumstances. Even people in schools practice this naturally, at times without knowing it.
How could we tell if the critical thinking increase is due to the actual curriculum of the school system, or to the subjects’ inherent, varying levels of autodidactic behavior, or to any number of other variables? Perhaps it is augmented or triggered by a guiding figure or event? Where are you most likely to find an autodidact, and is it a significant enough variable to try to account for?
Basically, what is better, autodidacticism (with a mentor or not) or intense, involved schooling (catch, train, release)?
Obligatory rant on topic—> I am skeptical of universities. To me, if you want an education you go to the library, or to the field, to where it’s happening; if you want something, you go for it. If you want to meet people (and possibly gain a mountain of debt) you go to a university. I would hire someone self taught and motivated over someone waving their diploma around like a flag any day.
This gives me an idea: after my oldest graduates high school, send him to live in a college town, in a house with college students. He’ll meet people, he’ll learn stuff (because there’s no way to stop him), and I wouldn’t be surprised if he sneaks into a few lectures. With no homework or grades to worry about, he’ll be able to spend an entire week on a single subject if he wants.
Best of all, it’ll only cost me room and board ($9,000, less if I pick a town with lower rent) vs. actually enrolling him ($33,000, assuming a fairly “generous” 50% financial aid package).
Maybe college is like working out: People show increased muscle/fitness while consistently going to the gym, but this doesn’t necessarily translate to being stronger for the rest of your life once you stop.
I have this image in my head of employers demanding that employees go to “critical thinking classes” once a week. It’s both hilarious and terrifying.
For all we know students who can develop advanced critical thinking skills are able to perform successfully enough to graduate from college and students who fail to develop them do not. It doesn’t necessarily mean that college imparted those skills. It might just be a way of testing the development of those skills. Obviously there’s some kind of correlation.
One possible purpose of college education is to enable the elites to communicate with each other using “dog whistles” that will not be understood by the common people. Along similar lines, the purpose of graduate education is to enable people in each field to communicate without being understood by outsiders.
In other words, college really does teach cryptical thinking.
I strongly disagree with describing a within-subject study design as terrible, but another study with a small separate control group as moderately good. I think a straight within-subject design is perfectly fine. It’s a replicable effect even if the interpretation of what that effect represents is in doubt. In contrast, a between-subject design of less than 50 people is generally not a replicable effect. The control group aspect may improve the quality of the study very slightly, but only slightly. I recognize that Pascarella is comparing the within-subject differentiation between-subjects, and not simply comparing across subjects, and that improves things as well.
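The replicability point can be put in rough numbers with a back-of-the-envelope power calculation (normal approximation; the 0.3 SD gain and 0.8 pre/post correlation are assumed for illustration, not taken from any of the studies discussed):

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def paired_power(gain_sd, r, n, z_alpha=1.96):
    """Power of a paired (within-subject) pre/post test, two-sided alpha=.05."""
    sd_diff = math.sqrt(2 * (1 - r))  # SD of the pre-post difference, in score-SD units
    return phi(gain_sd / sd_diff * math.sqrt(n) - z_alpha)

def two_group_power(gain_sd, n_per_group, z_alpha=1.96):
    """Power of a between-subject comparison with n subjects per group."""
    return phi(gain_sd * math.sqrt(n_per_group / 2) - z_alpha)

# Assumed numbers: a 0.3 SD gain, pre/post correlation 0.8, 50 subjects.
print(f"within-subject power:  {paired_power(0.3, 0.8, 50):.2f}")
print(f"between-subject power: {two_group_power(0.3, 50):.2f}")
```

Under these assumptions the paired design detects the gain most of the time, while the 50-person between-subject comparison usually misses it — which is the sense in which a small separate control group adds only a little.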
It seems obvious to me that if you want to teach young people critical thinking, in the sense of “critique sources’s truth claims” rather than “make your mind a computer that only runs Critical Theory software”, you need to make all students learn philosophy. Get rid of the minor system and make students of all majors take a set of courses on logic and the canonical philosophers.
I think it’s also worthwhile to test the hypothesis of whether extending that to a Classical/”Great Books” curriculum would be more beneficial than philosophy alone.
Scott specifically mentions that the hypothesis that philosophy courses teach critical thinking has been raised, tested, and found wanting.
I read the linked article, and either Huber is wrong and Ortiz right or the philosophy courses in the data set aren’t teaching the right material. Because it would be extraordinary if having to read Plato, Aristotle, Lucretius, Epictetus, Augustine, Aquinas, Bacon, Descartes, Spinoza, Hobbes, Locke, Hume, Kant, Hegel, Mill, etc. and write papers defending positions in ontology, epistemology, and ethics correctly using modern logic (eg. Quine) wouldn’t improve one’s domain-general ability to analyze truth claims. Unless the problem is that modern formal logic is too specialized a skill and we should still be teaching out of the Organon.
I’d be completely unsurprised if numerous courses in moving vague verbal formulas around thoroughly failed to improve domain-general truth-weaving abilities.
Come on guys, we know this.
If you think Huber is misrepresenting Álvarez Ortiz, you should check for yourself and make a more specific claim.
I think the pull quote from Huber might be incomplete about what Ortiz claims to have found, and that the OP mis-characterizes what Ortiz found somewhat.
The pull quote in Daily Nous says that Ortiz’s study found a very small effect for just taking a philosophy class — so small that we maybe shouldn’t think there’s any real effect there. But Ortiz’s study also shows that students who take a critical thinking class with lots of argument mapping (abbreviated “LAMP” in the paper) show serious gains in critical thinking over taking another non-philosophy, non-critical-thinking class: ~0.1 SD improvement (baseline) vs. a ~0.7-0.8 SD improvement for those taking a critical thinking class with lots of argument mapping.
Huber & Kuncel’s complaint that Ortiz’s n may be too small is still reasonable, as is their point that Ortiz sometimes assumes linearity of CT improvement. But those complaints are not evidence for the OP’s contention that “Specific explicitly-advertised “critical thinking classes” don’t” improve critical thinking performance on tests.
But those complaints are not evidence for the OP’s contention that “Specific explicitly-advertised “critical thinking classes” don’t” improve critical thinking performance on tests.
For what it’s worth, my anecdotal impression is that most Critical Thinking classes that are explicitly labeled as such are not classes using lots of argument mapping. Although it’s gaining popularity, argument mapping does not yet seem to be that well-known among philosophy teachers. When I taught a Critical Thinking class several years ago I looked at several textbooks, and I don’t recall any of them having argument mapping (a technique I was myself unaware of at that time). The textbooks were also uniformly very bad, including the one I ended up using. The class I taught did not go well, and I think others who have taught Critical Thinking have had similar experiences. There are a variety of reasons for that, but I think two of the biggest ones are: (1) Critical Thinking teachers (including me when I taught it) tend to focus on informal fallacies, and informal fallacies are basically worthless to learn. (2) Students don’t care about learning how to X when they can’t see the point of X-ing. They will learn more about critical thinking when actively learning about something else interesting, e.g., ethics, with concepts like validity, soundness, circular reasoning and so on brought in after students realize that they help for evaluating arguments that they are independently interested in.
I think I could teach a decent Critical Thinking class now, were I to teach it, but it would not look much different from an introduction to philosophy or applied ethics class that I would teach now.
In addition to what GFA said, one philosophy class isn’t nearly enough time to gain the thinking skills you get from studying the philosophers. Argument mapping/LAMP looks like a good start, but then you have to actually engage with the most influential arguments about the great issues. My working hypothesis is that it would take the equivalent of a minor, if the class texts are complete texts by the philosophers.
So my specific criticism of Huber is that “Ortiz’s n is too small”, while technically correct, is being misapplied to write off philosophy as a whole discipline. We need a larger n, but more importantly, we need to see the results of the equivalent of a minor with LAMP + studying the philosophers directly rather than textbooks.
What kind of philosophy?
This is like saying that science does not increase critical thinking. What science? What does that mean? Chemistry? Political science? Psychology? Biology?
If this experiment is to have any validity, it can’t just test umbrella terms.
You can’t intuit the whole from a single part. The common example: a sculptor is asked to sculpt a lion but is only given a single paw as reference. Never having seen a lion, will the sculptor produce an accurate work? Of course not.
I agree with Le Maistre Chat, and would add that there are many bad teachers of many different things. From my experience, many philosophy classes are more like light satires of what philosophy actually is. That being said, there are obviously some bad philosophers in the world today, and some bad philosophy; just like in everything else, you will find people who just aren’t good (mainly the specialists).
This experiment has far too many unaccounted-for variables, etc.
Also, slightly unrelated,
Why are there so many Neil deGrasse Tysons in the world? Some people are just bent on dismissing thousands of years of human thought from some of the most brilliant minds as useless, and on heaping ignominy on those who study it. I think it’s almost a fad to think philosophy is pointless and science is infallible.
“I agree with Le Maistre Chat, and would add that there are many bad teachers of many different things. From my experience, many philosophy classes are more like light satires of what philosophy actually is. That being said, there are obviously some bad philosophers in the world today, and some bad philosophy; just like in everything else, you will find people who just aren’t good (mainly the specialists).”
My thought is that in a good philosophy class, the main teacher is the dead philosopher you’re studying, and the professor is there to catch your bad arguments. It takes a certain humility, that your job as a teacher is making youth get the most out of minds greater than yours, rather than teaching being a distraction from your real job of publishing papers in hopes of being hailed as the next Russell or Quine.
“This experiment has far too many unaccounted-for variables, etc.”
Yes; the data collected by Ortiz and criticized by Huber are scraps of what we need. There’s a reason we have the aphorism “garbage in, garbage out.”
“Also, slightly unrelated,
Why are there so many Neil deGrasse Tysons in the world? Some people are just bent on dismissing thousands of years of human thought from some of the most brilliant minds as useless, and on heaping ignominy on those who study it. I think it’s almost a fad to think philosophy is pointless and science is infallible.”
One factor is that from before the Scientific Revolution to Einstein’s generation or so, you had to get a liberal arts education (in its pre-politicized sense) before being allowed to specialize in a natural science. This probably has something to do with why Einstein could have an intelligent discussion of epistemology with Niels Bohr, or have a friendship with Goedel where they argued about whether Spinoza or Leibniz had defined God right.
Great observations, Maistre, a refreshing perspective on the scholastic aspect of philosophy.
Some schools still recommend or require philosophy classes for many students, I think, though they are normally business ethics or a quick touch on professional ethics.
I’m not sure if such a lackadaisical approach does more harm or good. Maybe that’s an experimental question that another study like this one can, or has already, looked at.
In the linked article it really looks like we just don’t have good evidence either way: it’s not that philosophy has been found not to do better than other courses, it’s that the data we have on it just aren’t good enough to tell either way.
I would also be very surprised if philosophy done well did not improve critical thinking skills. I know, I know: no true Scotsman, self-serving bias, etc. But still, so many of the skills emphasized in philosophy seem obviously relevant here.
Scott writes: “Specific explicitly-advertised “critical thinking classes” don’t. … Classes that you might think would teach critical thinking, like logic, or philosophy, or statistics, don’t seem to do especially well.”
Yes, there are many classes entitled “Critical Thinking” or “Logic,” or taught in philosophy or statistics departments, that do not improve critical thinking. But in the philosophy discussion that was linked in support of this claim, there is a meta-analysis that suggests the above-quoted sentences are misleading:
C. M. A. Ortiz (2007). “Does philosophy improve critical thinking skills?” The University of Melbourne, Victoria, Australia.
From the abstract:
“The meta-analysis results indicate that students do improve while studying philosophy, and apparently more so than general university students, though we cannot be very confident that this difference is not just the result of random variation. More importantly, studying philosophy is less effective than studying critical thinking, regardless of whether one is being taught in a philosophy department or in some other department. Finally, studying philosophy is much less effective than studying critical thinking using techniques known to be particularly effective such as LAMP.”
Excuse me, but screw “critical thinking”. You don’t go to school for that. You go to school for domain knowledge.
Which paper is your quote about suspension of critical thinking from? I can’t seem to find it with a simple search.
Assuming no testing bias, etc., the historical downward trend in “skill development” seems like it could be explained by a larger randomized sample size.
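One statistical reason a measured effect can shrink as samples grow: small studies have large standard errors, so only small studies that happen to draw inflated estimates look impressive, while bigger samples pin the estimate down. A rough sketch of how the standard error of a difference between two group means falls with sample size (the equal-SD, equal-group-size formula; the numbers are purely illustrative):

```python
import math

def se_of_mean_diff(sd, n_per_group):
    """Standard error of the difference between two group means,
    assuming equal SDs and equal group sizes: sd * sqrt(2/n)."""
    return sd * math.sqrt(2.0 / n_per_group)

sd = 1.0  # work in SD units, so the SE is directly comparable to effect sizes
for n in (20, 80, 320):
    print(n, round(se_of_mean_diff(sd, n), 3))
# 20 0.316
# 80 0.158
# 320 0.079
```

With 20 students per group the noise (~0.32 SD) swamps a modest true effect; quadrupling the sample halves the noise each time, so later, larger studies would naturally report smaller, more honest effects.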
Another possible explanation is that college is teaching less. Anecdotally, there is a soft push toward grad school, and I’ve heard it said that “a bachelor’s degree is the new high school diploma and a master’s is the new bachelor’s degree.”
It could be the case that four years of college in the ’80s IS worth more than four years of college in the 2000s.