Over the past year I have become increasingly vocal about my critiques of the education system in the United States. While the public, K-12 school system deserves a pi-length essay detailing its shortcomings, I want to focus on what is referred to today as 'higher learning,' or the university model. Although it would be easy to call into question the funding sources of public universities, the ever-expanding number of government employees on the payroll, and so on, I want to put those arguments aside and focus on only one: the current university paradigm is an inefficient method of learning!
College seems to be incredibly inefficient with regard to money, time, and retained knowledge. The average total in-state tuition for a four-year undergraduate degree at a public university in the United States in 2016 is $36,556. That's tuition alone! It doesn't factor in materials or books "needed," or the possibility that it takes an individual longer than four years to complete their chosen degree (only 36% of students graduate in four years!). The average out-of-state total for the same degree in 2016 is $91,832! (A rough per-year breakdown follows the list below.) Price alone, however, tells us nothing. The real question is: is that too high? YES! When considering the vast amount of free information available today (to name a few):
Khan Academy
Free Code Camp
100s of MOOCs
MITx
Wikipedia
Mises.org Library
Google Scholar
Econtalk.org
Steemit
And perhaps the greatest university on Earth: YOUTUBE
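To put the totals quoted above in per-year terms (this is just simple division of the quoted four-year figures; actual annual sticker prices vary by school and year):

$$\$36{,}556 \div 4 \approx \$9{,}139 \text{ per year (in-state)}, \qquad \$91{,}832 \div 4 \approx \$22{,}958 \text{ per year (out-of-state)}$$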
I don't see how one can justify the enormous cost of attending college. I don't see the value. What about STEM majors, one might ask? I haven't heard a compelling argument that apprenticeship-style programs or positions wouldn't be more effective. Imagine, rather than paying thousands of dollars to acquire an undergraduate degree and then thousands more for the graduate-level qualifications many of these fields require, getting paid to be an understudy!
A doctor's career path is roughly as follows: four years of undergraduate study, four years of med school, four years of residency. Is there not a more efficient method than eight years of academia followed by what is essentially a four-year apprenticeship? What is the opportunity cost of attending university? What else could students spend all that money on? Perhaps starting a business, traveling the world, or simply hanging out and reading as they discover their sense of self?
Then there's the matter of time. "The price of anything is the amount of life you exchange for it," wrote Henry David Thoreau; I agree. For a standard undergraduate degree, it is generally expected that one can graduate in four years. Four years is a long time! I have little doubt that the vast majority of individuals could learn everything they have retained from their college years in far less than four years via YouTube and the other free sources I mentioned above.
What makes this more troubling is that only 36% of students even graduate in four years! As of 2013, the average undergraduate degree takes six years to complete! I assumed that number was skewed upward, so I investigated the median: the median time to graduation at public universities in 2008 was 55 months! I did not learn enough in college to justify the four years I spent there. Most individuals spend even more time in academia than I did, and I can't reconcile it.
Now let's take a quick look at knowledge retention. The argument behind college is a noble one: a better-educated humanity produces better outcomes. I will accept this hypothesis. The question then becomes: is the university system producing well-educated people?
In my conversations with friends and family, most readily admit they retained little to nothing of what they were taught in a college classroom. Indeed, most of what individuals do retain comes from their activities outside the confines of a classroom. Retention is highest for the things individuals seek out to learn for themselves, out of their own curiosity, rather than the things learned solely for testing purposes. I have yet to find evidence that classroom-based knowledge is well retained. The most impactful learning experiences are hands-on: to do and to see! Without the stresses and scheduling difficulties college causes, imagine the wonderful ways people could use that critical time of growth actually doing things of value instead of being tucked away in a classroom.
I myself have an undergraduate degree in Finance from the University of Texas at Austin. I enjoyed my time in college a great deal and grew tremendously! But my growth came primarily from activities that had nothing to do with the classroom and nearly everything to do with being in a new city, having the ability to completely organize my day for the first time, meeting new, introspective individuals, and having amazing, thought-provoking conversations. In the four years post-college (what I like to call my freedom degree), I have learned even more from exploring ideas and the world first-hand! The things I learned during my time attending college weren't from my classes; what I learned happened in the space between classes.
I recognize the idea of college is a noble one. But noble ideas are just that and nothing more; one must investigate whether the end goal is actually being achieved. I do not believe college is creating well-educated and well-rounded individuals. And even if college is achieving this noble goal, it is doing so in such an inefficient manner as to make its achievements irrelevant. Don't be fooled into thinking college has a monopoly on education. I believe...it's time to break the chain. Be braver than I and recognize the illusion.
To Bernie I say: education is already free, you jackass! ;)
When people see educational innovations like the ones you describe in your blog post, someone almost inevitably asks, "What about accreditation?"
And that's the problem, maybe the main problem, with our current education system.
Accreditation is the effort to measure and control quality from a central perspective. According to the US Department of Education, "The goal of accreditation is to ensure that education provided by institutions of higher education meets acceptable levels of quality." It's a great idea in theory. Accreditation is supposed to make it easy to gauge the value or effectiveness or transferability of an educational experience. It sends a message to students and those who pay for it that the student has met a certain level of skill development or understanding.
The problem is that it's a false measure. First, the stamp of approval of an accrediting body is no guarantee of quality education. Second, education and the context in which it is used take place locally, not globally (although education clearly has global effects), and what counts as mastery and understanding varies depending on context.
Third, accreditation becomes a distraction, both for the educational institution and for the student. Instead of making student learning and achievement the highest priority, the accreditation process shifts the focus to meeting organizational needs and requirements. The accreditation process takes what should be an individualized effort, or perhaps a community effort with an individualized focus, and turns it into an effort to mass-produce a consistent product of predetermined quality.
The student seeking an accredited education doesn't ask, "What will it take to master this skill or subject?" and "What does mastery of this skill or subject mean in my life and my community?" Instead, the student asks, "What does it take to get an A?" (or B or C or whatever the student thinks is required) and "What will an A from this institution mean for my future employability?"
Similarly, instead of focusing on students, the administrator asks, "How many students have to fail for us to retain our accreditation?" and "What subjects will the accrediting body require us to offer?" and "What kinds of accreditation will keep us fiscally solvent?" and so on.
When we encounter a great educational idea or product, "What about accreditation?" is the wrong question to ask. The first questions should be something like these: "Does it work?" "Does it help students (or this particular student) learn and grow and develop into good human beings and effective members of communities?" and "Does it strengthen the local community?"
The first step in healing our educational systems will be to focus first on students and on providing opportunities and resources for their learning, then on their families and communities, and on discovering how to bring all of those stakeholders into harmony…kind of like what Khan Academy has begun to do.
Very true. My friends and I started a website called The Free Academy, which you can check out at freeacad.com. Its purpose is to get rid of the formal lines between student and teacher. Everyone can be both in any given situation. And of course, it's about learning outside of state accreditation.
"About
The Free Academy is a group of students/teachers with the great privilege to study life and liberty outside the confines of State accreditation.
Membership is free and open to all and begins as soon as you start participating. The goal here is informal communal learning and personal growth. We strive to teach each other as we teach ourselves. Learn whatever you want and share whatever you find interesting. This site is our virtual playground. Register a username, ask questions or start discussions in the comments and when you’re ready, make your first post!"
The accreditation process reminds me a lot of the healthcare industry in the US and the limits the medical board places on the number of doctors, which keep the salaries of those already practicing artificially inflated.
Try putting YouTube on your resume under education...see where that gets ya :)
Haha, it is very true that formal education still opens certain doors. I understand why an individual would choose to attend college from an immediate-access perspective.
But the real question is: are the doors being opened by college a signaling effect or an increase in human capital? That is, do employers seek to hire college graduates primarily because a degree signals they have certain character traits the employer wants (greater persistence, time management skills, the ability to follow rules and play the game, so to speak), OR do they seek graduates because they believe college really increased their intelligence, thereby making them better potential employees?
My belief is that college provides much more of a signaling effect than an increase in human capital. If you agree, the problem with college is that it acts as a mere hurdle (a government-subsidized one at that) one must jump over to meet the prerequisites of the game. The issue metastasizes when college degrees become so prolific that they are worth no more than a high school diploma, and the hurdle one must now jump over is even higher, sucking up more resources while not increasing the productivity of the individual.
Basically, if college isn't actually increasing the human capital of an individual, it is my opinion that it should be incumbent on employers to find their own signals instead of piggy-backing on the very expensive, government-subsidized signal of a university degree.
But someone has to be brave and break the chain, because as long as there is an ample pool of graduates available in the labor market, businesses will rationally continue to use this signal and filter out all those who didn't jump the hurdle, even if they are more productive and intelligent.
I think people who are self-educated tend to become entrepreneurs rather than employees. That has been my experience, at least. Really enjoyed your post, by the way. :)
That has been true in my experience too. I think that is because, right now, relying on self-education is viewed as a risk, and entrepreneurs tend to be risk-seeking individuals. Or, even more likely: I don't know what I'm talking about, haha :)
Thank you, btw!
Bernie doesn't give a crap about other people; he just wants all the money saved for himself and the bitch he works for now.