by Kenneth Wee and Karla Erickson
“The paradox of education is precisely this – that as one begins to become conscious one begins to examine the society in which he is being educated.” – James Baldwin
In 2011, a Pew Research Center report found that nearly half of the American public considered teaching students job-relevant skills and knowledge to be the defining purpose of a four-year college education. While nearly three quarters (74%) of college graduates walk away from their alma mater feeling that they have grown intellectually and personally, only 39% considered that growth the central aim of their education. As more and more recent graduates regret not gaining work experience while in school, overwhelmingly major in STEM fields and business, and identify “getting a better job” as a main reason for attending college, the strings that tie education to employment have only tightened.
Ironically enough, the ancient philosopher Plato might offer us the best take on neoliberal education yet: “education is teaching our children to desire the right things.”
Curiously, it was only after World War II, when breakthroughs in manufacturing technology and expanding global markets brought demand for skilled labor, that American universities and colleges started explicitly seeing education through the lens of occupation. Larger universities like the University of California at Berkeley – whose president Clark Kerr claimed that colleges existed to “produce socially and economically useful knowledge” – set a precedent for smaller colleges by driving industrial research and equipping students with employable skills. As the post-war economy grew and college became the new highway to the American Dream, enrollments across higher education institutions tripled between 1950 and 1970.