AI is no longer on the horizon — it’s here, already shaping how students learn, complete assignments and develop skills. Whether we choose to actively integrate AI into teaching and learning or simply acknowledge its presence, one thing is clear: our approach to education must be AI-aware.

Faculty are well-positioned to lead this shift. With deep expertise in their disciplines and a clear understanding of the skills students need, faculty are best equipped to determine how AI fits — or doesn’t fit — within their courses. AI’s role in a chemistry lab looks very different from its role in a history seminar, and no single institutional policy can account for these nuances.
Rather than applying one-size-fits-all AI policies, institutions have an opportunity to support faculty in exploring AI, assessing its impact and determining its place within their disciplines. This isn’t about adding more to faculty workloads — it’s about providing the time, space and resources to help them lead the conversation on their own terms.
This article is part of a monthly column provided by the Instructional Technology Council, an affiliated council of the American Association of Community Colleges.
The reality of AI in education
According to the 2024 Digital Education Council Global AI Student Survey, 86% of students use AI in their studies, with 54% using it weekly and nearly one in four using it daily.
Meanwhile, faculty adoption lags. The 2025 Digital Education Council Global AI Faculty Survey found that while 61% of faculty have used AI in teaching, 88% of them do so only minimally. Many remain unsure how to integrate AI effectively.
At the same time, a Cengage Group study shows graduates feel unprepared for an AI-driven workforce, with more than half saying their programs didn’t teach them AI skills. Employers agree — 73% already use AI at work and want new hires who can, too.
A faculty-led, department-driven approach
AI’s role in education cannot be dictated by a one-size-fits-all policy — it must be shaped by faculty content experts who understand the skills, assessments and industry expectations within their disciplines. Faculty are best positioned to determine how AI can enhance learning, where it risks replacing essential skills, or when it simply needs acknowledgment within their field.
At the same time, AI’s rapid evolution makes it challenging for individual instructors to assess and integrate effectively on their own. A faculty-driven, departmental approach allows instructors to collaborate, share insights and shape AI’s role in ways that align with their disciplines and student learning goals.
Navigating AI together
A structured faculty development model can guide educators through four key steps to thoughtful AI integration:
- Explore AI hands-on. Before faculty can guide students in AI use, they need firsthand experience with the tools themselves. Experimenting with AI in a low-stakes, discipline-specific setting allows instructors to see what AI can and cannot do in their field. Hands-on exploration helps demystify AI, reducing misconceptions and uncertainty. Faculty can start by testing AI for lesson planning, content creation or feedback generation, then discuss their findings with colleagues. Through collaborative experimentation, faculty can build confidence and determine where AI might enhance, rather than disrupt, learning.
- Evaluate AI’s role in learning. Not all AI use is beneficial. Faculty must analyze whether AI acts as a skill accelerator, skill replacer or skill distorter in their courses. For example, in a world languages course, does AI-powered translation support language learning, or does it replace the critical thinking involved in sentence construction? Does AI-assisted design inspire creativity, or does it diminish foundational skills? Faculty working together in departments can map AI’s impact on core learning objectives, ensuring it strengthens rather than weakens competency-building.
- Redesign assignments and policies. Once faculty understand AI’s role, they must adapt course materials to ensure AI supports student learning rather than replacing essential skills. This could mean modifying assignments to encourage responsible AI use — such as requiring students to explain AI-generated insights or compare AI output to their own work. Clear AI policies should define when, where and how AI can be used in coursework. By proactively shaping assignments and assessments, faculty can promote academic integrity while leveraging AI’s strengths.
- Foster ongoing faculty collaboration. AI is evolving too quickly for faculty to develop one-and-done solutions. Departments must create structured spaces for ongoing discussion, where faculty can share strategies, refine policies and adjust their teaching approaches as AI advances. This could take the form of faculty learning communities, interdisciplinary workshops or department-wide AI review sessions.
Leading AI integration
Effective AI integration depends on faculty’s content expertise: only they can determine how AI fits into their specific courses, disciplines and professional standards.
Institutions that fail to invest in structured faculty development risk:
- Applying one-size-fits-all AI policies that ignore the nuances of different disciplines.
- Leaving students unprepared for AI-driven workplaces.
- Allowing AI to shape learning passively, rather than guiding its use intentionally.
Higher education is at a crossroads: will AI be thoughtfully integrated into teaching and learning, or will students continue to navigate it on their own? The answer depends on whether faculty — those who shape student learning every day — are given the tools, time and autonomy to lead the way.
* * *

Kate Grovergrys, MA, is a full-time faculty member at Madison College in Wisconsin. She designs professional development on topics such as inclusive teaching practices and artificial intelligence.
Tina Rettler-Pagel, Ed.D., is also a full-time faculty member at Madison College. She spends most of her time on projects and initiatives focused on digital learning, but also supports faculty in exploring and planning for the pedagogical opportunities of generative AI.
Grovergrys and Rettler-Pagel are members of the ITC Affinity Group.