The gulf between those working to integrate AI into their teaching and those swearing off its use entirely is growing wider by the month. It’s not just about comfort with technology; it’s about pedagogical identity, ethics, trust, and the role of higher education in a rapidly changing world, reports Faculty Focus (Aug. 13, 2025).
Some faculty are experimenting with AI-graded orals. Others are defaulting to analog tools like in-class handwritten exams. Still others are choosing not to address AI at all—perhaps hoping it will fade.
AI may or may not upend higher education, but in the meantime, it’s prompting urgent questions: What are we assessing? What do we value? How do we prepare students not just to perform, but to think, reflect, and adapt in a world where generative tools are the norm?
Faculty skepticism toward AI isn’t unfounded: concerns include data privacy, the environmental toll of electricity-hungry data centers, the murky provenance of “scraped” training datasets, and fears that students will lose their own creativity and voice to biased, machine-generated prose. It’s easy to reduce the AI debate in education to one issue: cheating. And yes, generative AI makes it easier than ever to outsource writing, coding, or even lab reports.
But pretending this technology doesn’t exist is no answer either. AI isn’t just a technological shift; it’s a mirror reflecting what we value in education, labor, and society at large. In today’s classroom, silence or neutrality sends a message.
So the most important place to start is also the simplest: your syllabus. Be specific about when, how, and why students are or are not allowed to use generative tools. If AI is restricted for certain assignments, explain the rationale. If it’s allowed, clarify what constitutes appropriate use—and what crosses the line into misrepresentation. Our goal is to model critical thinking, and when we articulate our stance on AI, we teach students how to approach emerging technologies with intention rather than fear or opportunism. That makes the syllabus a pedagogical opportunity: it invites students to see learning as more than task completion—and faculty as more than enforcers of boundaries.
Our students don’t need us to have all the answers. They need us to model how to live with the questions. They need to see that thoughtful, ethical, human learning is still possible, especially in a world full of algorithms.
