Is it possible to create a standard definition of a credit hour?

What is a meter? Most people would be satisfied with an answer of “about three feet.”
Official definitions, however, turn out to depend on some pretty fanciful, abstract, and far-flung benchmarks: one ten-millionth of the distance between the Equator and the North Pole; or, more strangely, the distance between two lines on a metal bar made of 90 percent platinum, located somewhere in Paris.

The more we look into any standard of measurement, from the length of a second to the value of a dollar, the more we see that it rests on an arbitrary consensus in which everything is defined in terms of something else. And yet that doesn't mean we can do without such standards. To operate in the world, we have to trust that we all mean more or less the same thing when we refer to them.

In the world of higher education, the equivalent of the meter is the credit hour. John Meyer, a venerable sociologist of higher education at Stanford University, was the first to point out to me just how strange a convention this is: “The idea is that your value in the mind of a rational god is a standardized thing in this world.”

Because in fact, there's nothing standard about the content of a credit hour, in terms of how the time is actually spent. As an undergrad I whiled away pleasant classroom hours discussing Emily Dickinson, while my friend, an engineering major, spent grueling sleepless nights grinding out problem sets. Yet we both earned equivalent credit toward our degrees, give or take a few distribution requirements.
