I vividly remember my undergraduate degree. Semester after semester, I took 20+ credits, balancing private lessons with ensemble work, education courses, and my general studies. On top of that, I worked as an RA and held a few odd jobs during my time in school to make ends meet. The accumulated workload often buried me so deeply that I finished each semester feeling like I had made it through the race without anything of quality to show for the work.
One of the biggest reasons many of us are overstretched during college is that higher education values the amount of time we are physically in a classroom rather than our ability to demonstrate competency. In Cracking the Credit Hour, Amy Laitinen takes an objective look at the state of higher education. It's a must-read, and here are a few takeaways from the report:
- The credit hour—the unit of measurement by which colleges award credit to students—was developed in 1906 by Andrew Carnegie. Its original purpose was to establish a unit by which college teachers could qualify for a pension. This was the birth of the 12-hour teaching load.
- The establishment of the credit hour set the stage for the following statement: “[C]ollege degrees are largely awarded based on time served, rather than learning achieved.” Carnegie, in his pursuit to help teachers, inadvertently led the field to adopt a unit of learning based on time rather than learning outcomes, even though “[he] stated explicitly that in the counting [of the unit for purposes of designating a pension for professors] the fundamental criterion was the amount of time spent on a subject, not the results attained.”
- The pursuit of standardization turned the credit unit into the credit hour, creating a one-size-fits-all mentality across academia. Today, one credit hour equals one hour of time in a classroom. Unfortunately, this has little (if anything) to do with competency in a particular subject.
- Students who complete a four-year degree aren’t necessarily more competent when they finish. “The 2011 study Academically Adrift found that 45 percent of students completing the first two years of college and 36 percent completing four years of college showed no statistically significant improvement over time on a test of critical thinking, complex reasoning, and communication skills.”
- Today, the credit hour determines full-time status for students. This has major implications for the amount of tuition universities are able to charge and the amount of college loans students are able to obtain.
Cracking the Credit Hour has left me with the following questions for our students:
- Will higher education stop assigning credits based on time served? Can you imagine a competency-based system in which institutions could assess credit for prior learning, making space for work in other areas? Think about the additional time students would have to practice their art in this scenario.
- Could detaching the credit unit from time served give students access to classes that will help them thrive in our economy? Our students are overstretched by traditional requirements. While these are important aspects of a well-rounded education, why should students sit through classes covering material they already know and understand? Work considered “core” may be redundant for some students, and it keeps them from taking courses that could be integral to their success after college.
- Can we pay for and value learning instead of time? I’m really intrigued by what this might look like for a music major. Certain aspects of our discipline—like applied instruction—are rooted in hours spent learning an instrument. However, at a time when college tuition is sky-high, any opportunity for students to prove they already know and understand the material might create more value in an already expensive education.
How does this resonate with you? I’d love to hear from you in the comment section below.