Books and papers have already been written on this predictable outcome in higher education.
An astonishing number of students start college in America without finishing it: Roughly 40 percent of college enrollees don’t earn a degree within six years of starting one.
The good news is that in recent decades things have gotten a bit less bad. By one calculation, at four-year state schools that didn’t make the top 50 public universities in U.S. News & World Report’s rankings, the graduation rate within six years rose from about 40 percent for students starting in the early 1990s to about 50 percent for students starting in the late 2000s. (The phenomenon was not limited to non-elite schools.)
When Jeff Denning, an economist at Brigham Young University, started looking closely at the data on college-completion rates, he was a bit perplexed by what, exactly, was driving this uptick. He and some of his BYU colleagues noticed that a range of indicators from those two decades pointed in the direction of lower, not higher, graduation rates: More historically underrepresented groups of students (who tend to have lower graduation rates) were enrolling, students appeared to be studying less and spending more time working outside of school, and student-to-faculty ratios weren’t decreasing. “We started thinking, What could possibly explain this increase?” Denning told me. “Because we were stuck with not being able to explain anything.”