Harvard Crimson
Let Them Eat Code
We never thought we would see the day when adults were bemoaning the foresight and responsibility of the American youth. But the national anxiety over the decline of the humanities major smacks of exactly that sentiment. Young adults, some argue—under pressure from their parents and an unforgiving job market—feel they must pursue practical paths. For others, the fact that fewer students are choosing to study things like English and art history is evidence that our society’s cultural fabric is fraying.
But we’re not especially sorry to see the English majors go.
Economies change, as does the demand for certain types of expertise and skill. Increased mechanization and digitization necessitate an increased number of engineers and programmers. Humanities apologists should be able to appreciate this—if Thought Catalog and Instagram are any indication, they’re fans of the internet, too. It’s true that fewer humanities majors will mean fewer credentialed literary theorists and hermeneutic circles. But the complement—an increased number of students pursuing degrees in science, technology, engineering, and math—will mean a greater probability of breakthroughs in research. We refuse to rue a development that has advances in things like medicine, technological efficiency, and environmental sustainability as its natural consequence.
What is more, the decline of the humanities major need not give us reason to anticipate the decline of the humanities: Academics do not have unique access to the instructions for being human. Whether they study history and literature, applied math, or organismic and evolutionary biology, people will continue to seek truth in philosophy, solace in music, and company in the pages of books.
In fact, we suspect that humanities professors’ effective surrender of any claim to objectivity—that is, their admission that they cannot provide authoritative understandings of texts on the grounds that no such understandings exist—was the first nail in the humanities’ coffin. Why spend four years listening to lecturers warn you that you can never really know anything? Or worrying that failing to dissect a text or event along the lines of race, class, or gender will result in an accusation of moral and intellectual irresponsibility (or worse, a bad grade)? The disciplines that once could claim to open the mind and free the spirit now seem to endorse a specific, sometimes discouraging, type of thinking.
We are skeptical that those who study math, science, and engineering chose their fields because they were enamored of the idea of living problem set to problem set. Indeed, if students are choosing “rigorous” concentrations not out of a love for objectivity but out of a fear of professional failure, then that is lamentable. (This is especially true for Harvard students—not because we should feel entitled to successful careers, but because many of us are so eager to please that we’ve never been less than perfectly vigilant about anything in our lives.) Still, impracticality is not a virtue in and of itself. Practical value and intellectual merit are not mutually exclusive, and one does not necessarily dwell where the other is absent. Learning to program, or research, or calculate can be enjoyable as well as useful.
Just as wheat and corn survived the decline of the American farmer, our culture, our values, and our yearning to understand our acquaintances and ourselves will survive the decline of the humanities major. To those who are upset with the trend, we say: Let them eat code.