As anyone who has scanned recent U.S. education headlines knows, the humanities face a crisis of legitimation amidst a tech-driven economy in which the mantra of ‘job preparedness’ seems to have trumped the traditional academic notion of humanist scholarly inquiry. Faced with the task of defending the relevance of their field, academics have justifiably cited the critical thinking skills gained through a humanities education.
All too often, however, these same academics proceed to undermine this eminently legitimate point by claiming that a university education should bear no relation to vocational concerns. Whenever anyone parrots this shaky line of reasoning, I find myself pondering the following question: in what sense has the American university ever stood entirely apart from concerns about employability?
Granted, the venerable liberal arts college, a model that first arose in America in the 17th century, has long been portrayed as a forum of education standing almost entirely apart from market pressures. Yet, while liberal arts colleges like Harvard and the College of William & Mary were founded with the intent of providing select young men with a moral education that would cultivate a sense of civic responsibility, both institutions have since morphed into research universities featuring numerous professional programs.
While small liberal arts colleges continue to exist in America, only a handful have been able to thrive without making some form of compromise with market exigencies. And considering that early American liberal arts colleges had to prepare young men for future positions as educators, clergymen, and civic leaders, one could conceivably argue that these institutions were always partially attuned to vocational imperatives. Tellingly, a young George Washington received his surveyor’s license from the College of William & Mary in 1749.
It was in the wake of the U.S. Civil War (1861–1865) that American research universities emerged and began offering a variety of professional programs geared towards training a new generation of men and women for employment within a rapidly modernizing American economy. This inaugural phase of the American research university was followed by a second major expansion during the post-WWII decades, roughly 1945 to 1975, a period that has since come to be known as the ‘golden age’ of American higher education. During these years, universities experienced unprecedented growth, and the humanities thrived as a field of academic inquiry.
Yet, while this second expansionary phase was ostensibly justified in meritocratic terms, it also intersected with vocational imperatives geared towards training individuals for a new knowledge economy. Here, the humanities were of pivotal importance. Aside from providing an academic foundation for future educators, the field placed strong emphasis on textual analysis, which was considered valuable training for the knowledge workers needed to fill roles in the nation’s then-burgeoning corporate and governmental bureaucracies.
In today’s wired global economy, however, the bloated hierarchies that defined those corporate and governmental organizations have been reconfigured and streamlined, rendering legions of clerical positions obsolete. Amidst such competition, critical reading and writing skills are clearly more important than ever before. Yet, the shift to networked digital media has markedly changed the medium through which information is produced and disseminated.
While well-honed critical reading and writing skills were once sufficient to get humanities graduates hired for entry-level communication positions, the shift to digital communication means that corporate and governmental organizations now increasingly expect applicants to be familiar with multimedia software, social media platforms, and digital analytics. Given that communication skills have historically been the stock-in-trade of humanities graduates, it is imperative that contemporary humanities students be exposed to these new digital skill sets if they hope to secure employment in communication-related fields upon graduation.
In this regard, the solution to the current crisis of legitimation affecting the humanities might very well reside in the growing interdisciplinary field of digital humanities, which should not be confused with online education. For the record, I have no vested interest in this field beyond casual theoretical conjecture. Nonetheless, I fail to see why my own relative digital ineptitude should lead me to embrace a ‘hold the fort’ mentality about the current state of the humanities. Just as the cultural turn of the 1970s enriched humanist inquiry by moving it away from the staid and supposedly apolitical methodology of New Critical analysis, the humanities of today might be revitalized via a digital turn.
By providing students with at least the option of learning various digital techniques, humanities departments could offer undergraduates both an instantly marketable skill set and an enriched scholarly experience. How might this work in practice? A student writing an essay about James Baldwin, for example, could learn how to develop a sophisticated multimedia document combining textual analysis with data mining techniques and video interviews with Baldwin scholars. A first step in that direction might look like the sketch below.
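To give a concrete sense of what such an exercise might involve, here is a minimal, hypothetical sketch in Python of the sort of word-frequency analysis an introductory digital humanities course might assign. The quoted sentence is Baldwin’s often-cited line from ‘As Much Truth As One Can Bear’; the stopword list is an illustrative placeholder rather than part of any actual curriculum.

```python
# A hypothetical first exercise in computational text analysis:
# counting the most frequent substantive words in a passage.
import re
from collections import Counter

passage = (
    "Not everything that is faced can be changed, "
    "but nothing can be changed until it is faced."
)

# Words too common to be analytically interesting in this toy example.
STOPWORDS = {"that", "is", "can", "be", "but", "until", "it", "not"}

words = re.findall(r"[a-z']+", passage.lower())
frequencies = Counter(w for w in words if w not in STOPWORDS)

for word, count in frequencies.most_common(5):
    print(f"{word}: {count}")
```

Nothing here demands deep programming expertise; the point is that a few lines of code can open a quantitative window onto a text, one that a student could then pair with traditional close reading across an author’s entire corpus.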
Critics of the digital humanities will undoubtedly argue that neoliberal-minded university administrators will simply use the field to transform the humanities into corporate training grounds. Yet no self-professed digital humanist I have ever encountered has expressed any interest in providing students with narrow vocational training. As an educational field, the digital humanities would integrate traditional humanist pedagogy with technological learning, giving students the option of taking specialized digital humanities courses offered within or across humanities departments. Thus, traditional courses – like African American History and Shakespeare – could be offered alongside courses focusing on theoretical and applied digital humanities techniques.
In short, this integration of humanist learning and digital technology would not simply be a matter of training students for corporate employment. After all, even entry-level positions in social justice organizations and NGOs now require that prospective employees possess hard technological skills that allow them to hit the ground running. Just as adjuncts and teaching assistants should have the right to expect fair compensation for their labor, graduates of humanities programs should have the right to be educationally equipped to secure gainful employment after spending three to four years of their lives and tens of thousands of dollars pursuing a degree. If humanities scholars continue to parrot the questionable line that a university education should bear no relation to concerns about employability, they will do so to the increasing detriment of their field as a whole.