At the recent Achieving the Dream conference in Portland, Ore., facilitators challenged a dominant narrative surrounding artificial intelligence: that faster and easier learning is inherently better.
Throughout the event, which was held March 2-5, educators returned again and again to a deeper concern: not whether students can use AI, but whether they can think critically, judge, and decide in a world increasingly shaped by it. And part of developing students’ thinking, presenters said, is interrogating who is invited in and who is left out by the growing emphasis on AI.
In her opening keynote, Dr. Sarah Elizabeth Lewis, the John L. Loeb Associate Professor of the Humanities and associate professor of African and African American Studies at Harvard University, connected today’s debates about AI, workforce readiness, and the humanities to a much older struggle over representation and belonging. Lewis recounted the story of her grandfather, Shadrach Emmanuel Lee, who in 1926 questioned why Black Americans were absent from his history textbooks. When told they had done nothing worthy of inclusion, he persisted and was expelled from school.
Denied formal credentials, Lee turned to a career in art and music, becoming a painter and jazz bassist and creating the images he had been told did not belong. Lewis traced this family history as foundational to her work, framing visual and cultural literacy as forms of civic literacy tied to the ways societies decide who is recognized, remembered, and valued in public life. The question her grandfather asked nearly a century ago, she suggested, has not disappeared. It has simply migrated into new systems, now embedded in institutions, technologies, and increasingly, AI-driven decision-making.
One session reframed “AI readiness” not as tool mastery, but as human readiness. As AI automates routine writing, coding, and analysis, what becomes valuable is the ability to ask better questions, recognize what is missing, evaluate bias, and decide what matters. One model assignment illustrated this shift: students use AI to generate an initial draft on a local issue, then critique it by identifying missing perspectives and contextual blind spots before conducting interviews in their communities and revising the work. AI becomes a starting point, not a substitute, and students remain responsible for meaning-making.
Humanities as civic infrastructure
One faculty panel featured Dr. Jason Michael Leggett of Kingsborough Community College and Dr. Donna Hunt of Lorain County Community College. Together, they rejected the notion that the humanities are abstract enrichment separate from workforce preparation. Instead, they positioned humanities education as part of a civic infrastructure that is essential to equity and democratic participation.
Leggett, who teaches constitutional law and directs the Kingsborough Center for Civic Engagement, challenged the idea of the classroom as a “safe space” insulated from injustice, noting that students arrive already shaped by structural inequities. He argued that the role of the humanities is not to avoid those realities but to create structured learning environments where students can name injustice, connect personal experience to broader systems, and develop the language to advocate for themselves without reducing education to unproductive trauma.
Hunt approached the question from a workforce perspective. A poet and program coordinator for the Mandel Foundation Humanities to Career initiative at LCCC, she described creative practice as a discipline of uncertainty, failure, revision, and trying again. She argued that these are precisely the capacities students need in both work and life. The problem, she noted, is not that humanities students lack workforce skills, but that institutions often fail to help students name and translate those skills in ways employers recognize. When students gain that language, she suggested, they gain agency.
Molly Phelps, director of the Humanities to Career Program at Bunker Hill Community College, spoke in another session about how students receive earned skills badges for their LinkedIn profiles. The badges not only tell employers about students’ soft skills, she said, but also give students the confidence to talk with employers about those skills. One of the big takeaways from the work with humanities students, Phelps said, is that “all degrees are professional.”
Lewis connected these classroom practices to a broader civic concern, arguing that when education treats creativity and interior life as irrelevant to public work, societies lose their capacity for truth telling and repair. In moments of political polarization and cultural fracture, she framed the humanities and arts as essential tools for reckoning with history, power, and democracy.
A fragmented landscape
Outside the formal sessions, that tension between possibility and uncertainty was evident in off-the-record conversations with campus leaders navigating very different stages of AI adoption. Some institutions had already established AI departments or task forces. Others had no formal policy governing AI use in the classroom. In several statewide systems, individual colleges were serving as informal beta sites, testing AI tools and pedagogical approaches on behalf of the broader system. The result is a fragmented landscape marked by experimentation, uneven capacity, and, in many cases, hesitation.
That unevenness is not unique to AI. Like earlier disruptive innovations, AI is forcing institutions to confront questions of governance, pedagogy, equity, and trust well before clear answers exist. The challenge community college leaders acknowledged privately is not whether AI should be embraced, but how to integrate it as a tool rather than allow it to become an impediment to learning.
Employer expectations reinforce the urgency to get this right. In a follow-up exchange, Dr. Mohamed Ghonimy, assistant professor of information technology at North Central State College, said advisory committee feedback reshaped his program’s approach by prompting the faculty “to place a stronger emphasis on soft skills. As a result, we developed intentional assignments and assessment tools designed to help students build and demonstrate those skills throughout their two years,” he said.
He also argued that humanities expertise is essential to AI’s real-world functioning, even when it’s not visible on the surface. “In order to create an AI system that can generate, review or edit documents, this system must be trained by English Major professionals,” he continued. “Any AI system must be trained by the people who are experts in the field of the AI system.”
On student readiness to detect bias, he was blunt about the learning curve. “I do not think we are there yet … we are working on it,” he said, noting that outputs are shaped by prompts and that “we as faculty are learning right alongside our students.”
For community colleges, the most diverse sector of higher education, the stakes are especially high. The risk is not simply that students will misuse AI. The deeper danger is that, without humanities-centered pedagogy, students may never develop the interpretive, ethical, and civic capacities needed to navigate automated systems that increasingly shape work and public life.
Nearly a century after Shadrach Emmanuel Lee asked why he was missing from history, educators are now asking AI a parallel question: whose knowledge counts, who is represented, and who decides?
At Achieving the Dream 2026, the answer was clear. In an age of artificial intelligence, the humanities do not slow progress. They make it humane.