Originally Posted by Wren
It wasn't an argument that art isn't good to have, but you have to put a roof over your children's heads and food on the table first.

And too many go into the arts because you can get a degree in it, and then you can't get a job in it. If fewer went into them, the truly dedicated still would, and then there would probably be more scholarship money for the truly talented.

What this whole argument was about was weeding out the ones who really don't need a college education, since they're a drain on the country and should be redirected to vocational training. Hence my solution, which I think works. If you want a liberal arts degree and then end up trying to get a job selling at Costco, you pay the tuition. If you get a degree in computer science and keep a job here instead of it going to someone brought in from India or China, then your tuition gets repaid.

Anyone know how many people get a degree in English literature and then do nothing with it afterwards: don't teach, don't write, don't edit? My pure guess is at least 80%. I bet it's in the single digits for engineering.

I think that this is a very fair-- and true-- observation.

The bottom line is that what higher education means has been subverted significantly over the past 30 years. While I think that interesting, niche areas of study are a fine thing, I also tend to think that most of the time they are better left for graduate study, not undergraduate degree programs. I'm old school that way. STEM is about the only place where that sensibility has been staunchly retained-- and I think it may be no small coincidence that a degree in "physics" still has the value it always had, whereas one in "early-American Queer studies" lacks the punch (and employability) that "Sociology" once had. At the undergraduate level, one simply isn't (yet) prepared to do that kind of focal study without first building a wider foundation UNDER it.

There's a reason why my undergraduate degree was more general than my PhD, you know? The one thing supported the other, and while yes, my PhD is in a fairly arcane and not-terribly-generally-useful thing, its purpose was far more about demonstrating my ability to APPLY what I'd learned as an undergraduate and take it to the limits of what current technology and my own cognitive abilities could bear. It says something about my potential as a person, not necessarily that the subject area IS all I do or can do.

Anyway. Tangent, that.

I think that without rolling back the clock on what we MEAN when we say "higher education" we are going nowhere with higher ed in this country. You simply cannot allow 17-19yo children, as a cohort group, to pick and choose what they are "interested" in knowing. The problem is that they don't KNOW what they don't know. Catering to them as though they are mostly autodidactic is foolish in the extreme, and yet the "student as consumer" model has done just that.

Well, caveat emptor-- we should have thought about it when faculty were sounding the warning in the early 90's about this nonsense. Who knew that narrow, "self-determined" courses of study devised by students wouldn't turn out to be very good, er-- "education" in larger terms?

Well, any PARENT should have known it. There's a reason why we don't allow second graders to "determine" what their curriculum needs to contain. Because the majority of them aren't capable of knowing, much less implementing it for themselves, that's why.

General education cores at universities have existed for a reason. We ignore that history at our peril-- and we HAVE been ignoring it.

Engineers need to learn communication skills (whether they wish to or not) and social workers need to understand enough physics to appreciate policy challenges and be educated voters. Neither group is especially good about recognizing that need at the time.


Schrödinger's cat walks into a bar. And doesn't.