From following England’s Premier League in recent years, I’ve picked up a phrase that we don’t tend to use over here: to play on the front foot. In soccer, it describes a team that is committed to being aggressive in attack. Rather than retreating into a low block, absorbing pressure, and looking for chances to counter, a team on the front foot is proactively seeking to take control of the game and score goals.
Today, let me translate that lingo to another season that ramps up in August — the American academic year.
Being on the front foot is not the posture that we historians and other humanities professors are accustomed to taking. For years now, we’ve been forced to defend: our programs, our positions, our very reason for existence. And it would be easy to fall back into a defensive shape entering this fall, when we’re not only facing the familiar pressures — students preferring courses of study that lead more directly into promising careers, politicians attacking us for doing indoctrination instead of education — but also the accelerating development of artificial intelligences that can seem to do the very work that we expect from our students.
I’m sick of defending.1 Let me suggest three reasons why, viewed most optimistically, the dawn of the age of generative AI creates the perfect conditions for the humanities to start playing on the front foot.
(Mostly, I’m writing to encourage my fellow humanities professors, but I also hope that at least one non-academic group is listening in. If you happen to be in a position to make hires for your company, agency, church, or other organization, I’ll have a request of you at the end of this post.)
1. AI will inspire new attention to what it means to be human. Since people like me are so accustomed to seeing AI as an existential threat, let’s start with an existential opportunity for the humanities.
Right now we’re still sorting out how best to harness, regulate, and unleash technological potential that seems to grow exponentially. But I have no doubt that we’ll soon realize that the list of functions a generative AI can perform in place of humans is not nearly as long as some claim. Despite the best efforts of the most brilliant engineers and the worst intentions of the most aggressive entrepreneurs, we’ll soon come to a new appreciation of all that humans can be — and all that their technologies will never become.
As ever, that realization will both inspire and horrify us, dismay and delight us. Human complexity will spur human ingenuity.
But a different kind of ingenuity: not the inventiveness of the tool maker, but the inquisitiveness and insight of the humanist. As the limitations and pitfalls of artificial intelligence become clearer and clearer, we’re going to need human intelligences trained to study humanity itself. That means scholars in the humanities — history, philosophy, literature, languages, and the like — but also artists, and scientists of a behavioral and social bent.
2. AI is prompting new creativity in teaching and learning. If I’m right, we should expect a resurgence in scholarship, as AI prompts historians, philosophers, linguists, and other humanists to ask new questions — or, more likely, very old questions asked within new contexts. But hopefully that burgeoning desire to look afresh at humanity itself will spur new interest among young humans as they go to college.
When they get to our courses and programs, college students may find fields of study that have been reinvigorated by new approaches to teaching and learning. That revival is already underway, at least if New York Times opinion writer Jessica Grose is right. Earlier this month, she shared some of what she heard back from humanities professors when she asked them how the fear that AI was letting students “cheat their way through college” had led them to reimagine their courses, building on a rethinking of older methods of instruction and assessment prompted by the COVID lockdown. “Through a combination of oral examinations, one-on-one discussions, community engagement and in-class projects,” reported Grose, “the professors I spoke with are revitalizing the experience of humanities for 21st-century students.”
Not to say that everything will change all at once. This fall the advent of AI will send me back to in-class, writing-intensive exams in my upper-level Modern Europe course, but I’m not going to let the existence of ChatGPT stop me from assigning a research project that culminates in a traditional paper. In my summer course on the First World War, I made only slight tweaks to how I structured, evaluated, and described essay writing. Yet despite it being an online course where students had every opportunity to hand off their work to a bot, I’ve rarely been so happy with the quality and originality of the essays I read.
Still, the availability of such AI tools does make me want to keep experimenting with historical simulations, unessays, exit interviews, and other strategies that take different paths to the same outcomes that have always been central to the humanities.
3. AI makes study of the humanities both transformational and transactional. However differently we teach our fields, such study will continue to transform students in familiar ways. None of the pressures we’ve faced in my career — economic, political, cultural, demographic, or technological — have made me any less confident that studying the humanities can make humans more curious, thoughtful, imaginative, empathetic, passionate, humble, adaptable, persuasive, and comfortable with complexity — that of the world and that within themselves.
But what if AI is reshaping the economy such that those traits become not just hallmarks of transformative education but the goals of transactional students looking to connect their education more directly to their careers? For it’s very likely that what AIs can and can’t do will make the learning outcomes of the humanities into the very attributes employers most need in their employees.
“The Rise of AI Will Make Liberal Arts Degrees Popular Again,” claimed one article last month. The author wasn’t a humanities professor like me, but a writer for the business magazine Inc. “With AI taking over more routine business and tech tasks,” Jessica Stillman reported, “experts say the value of a liberal arts degree is set to rise.” Even as computer science majors discover that their own discipline has made their labor redundant,2 Stillman found corporate leaders like Standard Chartered CEO Bill Winters valuing an undergraduate major in international relations more than an MBA from Wharton. When the “technical skills are being provided by the machine, or by very competent people in other parts of the world who have really nailed the technical skills at a relatively low cost,” said Winters, “I’m going to go back to curiosity and empathy.” Stillman found even AI enthusiasts describing “the ability to deal with the messiness and unpredictability of people” as “the most AI-proof skill,” while self-professed “tech futurist” Lindsey McInerney predicted that “the skills acquired in the pursuit of the humanities are not only going to be the most indispensable, but some of the most highly sought-after.”
With all that in mind, let me suggest two charges to two different types of reader:
If you’re a humanities professor like me, enter 2025-26 on the front foot, with confidence and renewed purpose. Invest in the students you have, instead of resenting those who chose other fields. Whatever energy you need to expend in creating disincentives to student cheating, it’s far more important that you excite your students to learn and encourage them to feel capable of such work. When you run into transactionally minded students, make the case that I’ve made for the professional benefits of the humanities — and trust that they’re still experiencing transformation even as they acquire marketable skills and valuable traits.
And if you are in a position to hire new employees for an organization, your version of the “front foot” metaphor should be to get out ahead of what the Inc. article had to say about the impact of AI on the workforce. McInerney claims that the traits honed by the humanities — the “creativity, inspiration, imagination, and dreaming” that make us “the most human that we can be” — are “going to start popping up on every job description.” So be the first in your industry to rewrite those descriptions accordingly, not the last.
Better yet, give humanities-loving students a clearer pathway from college to career before they even hit the job market. Whether you work in the corporate, nonprofit, or public sector, I’d love to partner with readers in the Twin Cities to create an initiative that offers Bethel history, philosophy, and political science majors a chance to connect their burgeoning skill sets to real-world experiences. Email me to talk about how you can set aside a certain number of summer internships for such students, take part in classes like our Applied Humanities capstone seminar as an industry expert, and then give our graduates first crack at applying for entry-level jobs in your organization.
Waiting to see what AI can do is old news. Investing in humans who understand humans is the future.
1. Perhaps not surprisingly, I support a Premier League club whose motto is “To dare is to do.”
2. Not to say that CS is an unworthy field of study! But I do wonder if news about rising unemployment rates in that once-trendy discipline will push those students to pair their studies of programming with more humane fields that help foster the creativity, ideation, empathy, storytelling, and ethical discernment that are essential to innovation and leadership in the tech sector.