
The Credit Hour Conundrum

Is it possible to create a standard definition of a credit hour?

What is a meter? Most people would be satisfied with an answer of “about three feet.”
Official definitions, however, have turned out to depend on some pretty fanciful, abstract, and far-flung benchmarks: one ten-millionth of the distance from the Equator to the North Pole; or, more strangely, the distance between two lines on a metal bar of 90 percent platinum and 10 percent iridium, kept somewhere near Paris.

The more we look into any standard of measurement, from the length of a second to the value of a dollar, the more we see that it rests on an arbitrary consensus in which everything is defined in terms of something else. And yet that doesn’t mean we can do without such standards. To operate in the world, we have to trust that we all mean more or less the same thing when we invoke them.

In the world of higher education, the equivalent of the meter is the credit hour. John Meyer, a venerable sociologist of higher education at Stanford University, was the first to point out to me just how strange a convention this is: “The idea is that your value in the mind of a rational god is a standardized thing in this world.”

In fact, there’s nothing standard about the content of a credit hour in terms of how it’s actually spent. As an undergrad I whiled away pleasant classroom hours discussing Emily Dickinson, while my friend, an engineering major, spent grueling sleepless nights grinding out problem sets; yet we both earned equivalent credits toward our degrees, give or take a few distribution requirements.

In the United States system, you can earn undergraduate credits by writing basic five-paragraph essays, practicing ballet, interning for a political campaign, or building a web site—and legitimately so: These can all be valuable learning experiences. Conservatives love to make fun of undergraduate seminars where the coursework involves watching Lady Gaga videos or porn or Mexican telenovelas, to say nothing of the lazy professors—we’ve all had them—who replace lecture time with movie time. And yet somehow, if you play your cards right, all those credit hours, however you spent them, add up to a degree that can be your most important passport to a better job and a better life.

This type of pleasant chaos is now coming under greater scrutiny. In June, the House Education and Labor Committee held a hearing to try to better define a “credit hour.” And while it may seem esoteric, this is a federal matter because federal financial aid, both grants and loans, is such an important factor in funding undergraduates. And members of Congress are especially concerned that certain for-profit colleges, which soak up more than their share of federal student aid, may be inflating the definition of a “credit hour” in order to keep customers—I mean, students—happy.

But the idea of creating a standard definition of a credit hour—the proposed definition is “one hour of classroom or direct faculty instruction and a minimum of two hours of out-of-class student work each week for approximately 15 weeks for one semester or trimester hour of credit”—quickly collapses into absurdity when you think about the diversity of experiences that are accommodated under the “credit hour” blanket, not to mention the possibilities of innovative uses of technology.
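
To see what that definition adds up to in hours, here is a back-of-the-envelope sketch: the per-week figures come straight from the proposed definition, while the three-credit course is just an illustrative assumption.

    # Proposed definition: per credit, 1 hour of instruction plus at
    # least 2 hours of out-of-class work per week, for about 15 weeks.
    def total_hours(credits, weeks=15, instruction=1, homework=2):
        return credits * (instruction + homework) * weeks

    # A typical three-credit course implies 3 * (1 + 2) * 15 hours.
    print(total_hours(3))  # 135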

For example, the Open Learning Initiative at Carnegie Mellon created a statistics course that blended online practice with a specially designed tutoring program and in-person instruction. The course met twice a week for eight weeks, versus four times a week for 15 weeks for a conventional course. The computer-assisted students learned more and retained the information just as well as the students in the conventional program, but they did it in roughly one-quarter of the total classroom time. Should they get one-quarter of the credit hours?
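
As a quick check on that ratio—a minimal sketch, assuming all class sessions run the same length, with the session counts taken from the paragraph above:

    # Classroom sessions: blended OLI course vs. a conventional course
    blended = 2 * 8        # twice a week for eight weeks
    conventional = 4 * 15  # four times a week for 15 weeks

    print(f"{blended} vs. {conventional} sessions: {blended / conventional:.0%}")
    # 16 vs. 60 sessions: 27%, i.e. roughly one-quarter of the classroom time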

These difficulties don’t mean, however, that we should throw up our hands at the possibility of ever coming to a better consensus on the definition of a credit hour. (Besides cracking down on credit inflation, another very good reason to do this is to improve the transferability of credits, since half of students start out at community colleges and at least 60 percent transfer at some point in their college careers.)

One possible answer is to promote more visibility into exactly what goes on inside classrooms—something that sites like Academic Earth, the Open Courseware Consortium, and Einztein, which show lecturers at top universities at work, can do. Another is to promote publishing students’ work to the open web, as UMW Blogs does, and greater discussion and collaboration among students at different universities, as can happen on study-network sites like StudyBlue.

It may not be possible to measure the content of a credit hour more accurately, but we can certainly be more precise about it.

Anya Kamenetz is a staff writer for Fast Company and author of "Generation Debt." Her latest book is "DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education."
