Fostering student AI fluency for career readiness


“You go ’round and around it
You go over and under
I go through…”

Those lines from the song “I Go Through” by O.A.R. feel uncomfortably familiar in higher education right now.

Artificial intelligence (AI) did not arrive slowly or politely. It arrived embedded in workplace tools, search engines, writing platforms, data analysis software, and design environments. And it arrived faster than most academic systems could comfortably absorb.

In response, some folks in higher education are trying to go around AI. Some faculty hope it fades. Others focus on prevention and detection. Many wait for perfect policies or definitive answers that have yet to appear. But as the lyrics remind us, meaningful progress rarely comes from avoidance. If AI is here to stay, and it is a part of how we will work and learn, the only viable path forward is to go through it.

For community colleges especially, this moment is not optional. It is foundational for career readiness.

This article is part of a biweekly series provided by the Instructional Technology Council, an affiliated council of the American Association of Community Colleges.

AI fluency and career readiness are now inseparable

When faculty hear the phrase “career readiness,” they often think of professionalism, communication, ethics, teamwork, adaptability and problem-solving, according to a Cengage Group report. National career readiness frameworks reinforce this view: the National Association of Colleges and Employers identifies competencies such as critical thinking, communication, technology use, equity and inclusion, and professionalism as essential across industries.

AI fluency fits squarely inside this landscape.

Employer data consistently shows that AI is already embedded in day-to-day work. Large majorities of employers report using generative AI for research, writing, data analysis, brainstorming and product development. Yet at the same time, and perhaps alarmingly, many colleges still discourage or restrict AI use broadly, creating a growing mismatch between academic practice and workplace reality.

Students notice this gap. Survey data shows that many students do not feel adequately prepared to use AI effectively or responsibly, yet they expect institutions to provide more guidance, training and transparency around AI use.

This is not a call to abandon rigor or standards. It is a call to align learning with the environments students are actually entering.

AI literacies are career literacies

The Dimensions of AI Literacies project, reimagined by Opened Culture, offers a useful framework for understanding what students need to be ready for their future careers. Rather than reducing AI readiness to tool proficiency, the framework emphasizes both action and judgment.

The dimensions can be understood as two complementary domains:

  • Four skill-based literacies, focused on what students do with AI in real workflows
  • Four mindset-based literacies, focused on how students think about, evaluate, and take responsibility for AI use

Together, they reflect what employers are asking for: not unquestioned or automatic adoption, but thoughtful collaboration between human expertise and AI systems.

AI skill-based literacies – How we use AI:

  • Cultural AI literacies – Recognizing the connections between people, AI-informed resources and tools, and points of engagement within AI tools and AI-enabled environments.
  • Creative AI literacies – Engaging in ideation and generative actions using AI, focusing on how AI can add value and introduce new possibilities within specific contexts.
  • Constructive AI literacies – Utilizing AI tools to build, remix and generate new content, applying AI capabilities.
  • Communicative AI literacies – Leveraging AI technologies to convey ideas effectively, recognizing the sociocultural practices and nuances that AI interprets and influences in different settings.

AI mindset-based literacies – How we think about AI:

  • Confident AI literacies – Developing the ability to solve problems and manage learning within AI-driven environments by understanding and harnessing their unique features and potentials.
  • Cognitive AI literacies – Expanding intellectual capabilities by engaging with AI-enabled processes and environments.
  • Critical AI literacies – Examining the power dynamics and ethical considerations inherent in AI practices, reflecting on the broader societal impacts of AI-driven decisions and actions.
  • Civic AI literacies – Employing AI knowledge and skills to contribute positively to society, using AI to foster community empowerment, engagement and societal progress.

When mapped to career readiness competencies, the alignment is striking. Evaluating AI output reinforces critical thinking. Transparent AI use strengthens communication and professionalism. Choosing appropriate tools supports technological agility. Reflecting on AI’s limits and impacts builds ethical reasoning and civic awareness.

AI fluency, in other words, is not separate from career readiness. It is part of it.

Helping students go through AI: Practical strategies for any discipline

Faculty do not need to become AI experts or redesign entire courses to support this work. Small, intentional instructional choices can have a meaningful impact. The strategies below are discipline-agnostic and grounded in both employer expectations and classroom realities.

  1. Be transparent about your own AI use. One of the clearest messages from students is that silence around AI creates confusion and anxiety. When faculty avoid the topic, students are left to guess what is allowed, expected, or risky.

Faculty can model responsible practice by:

  • Naming when and how they use AI in their own work.
  • Explaining why AI is appropriate for a task, or why it is not.
  • Sharing how they verify accuracy, address bias or revise outputs.

Transparency reframes AI as a professional tool that requires judgment, not a shortcut to be hidden.

  2. Elevate what is uniquely human in your course. As AI becomes more capable, teaching and learning must center what is uniquely human. This requires identifying the human capacities most central to your course and intentionally allowing those capacities to shape learning activities, assignments and assessments. Courses that emphasize interpretation, decision-making, creativity, empathy, contextual reasoning and ethical judgment prepare students for durable roles in an AI-enabled workforce.

Faculty can:

  • Design assignments that require explanation, justification or reflection.
  • Ask students to defend decisions rather than produce answers alone.
  • Emphasize learning processes, not just final products.

This approach aligns directly with employer priorities and reinforces the mindset-based dimensions of AI literacies.

  3. Create structured opportunities to evaluate AI output. Students need guided practice in engaging with AI as a tool for augmentation rather than automation. Instead of accepting AI output at face value, learning activities can position AI as a collaborator in an iterative process that requires human judgment, questioning and refinement. Short, low-stakes activities can help students build this habit across disciplines.

Examples include:

  • Verifying AI-generated solutions in technical or quantitative fields
  • Critiquing tone, clarity, accuracy or assumptions in AI-written text
  • Identifying bias, missing perspectives or potential safety concerns in AI recommendations
  • Revising or improving AI output based on human feedback and disciplinary standards

These activities strengthen critical AI literacies, reinforce the role of human oversight, and mirror real workplace expectations where AI supports, rather than replaces, professional judgment.

  4. Embed AI as one step in existing assignments. Rather than adding new assignments, faculty can embed AI into work students already complete. Embedding AI in this way models effective use, clarifies expectations for students, and aligns with workplace AI practices.

For example:

  • Use AI for brainstorming or outlining, followed by human refinement.
  • Generate examples or datasets with AI that students must explain, evaluate or critique.
  • Require students to document their AI workflow alongside final submissions.

This approach keeps rigor intact while preparing students for careers in which AI is embedded within everyday work and professional judgment remains essential.

  5. Dialogue with students about AI, rather than dictating rules. Conversation about AI is essential to developing student AI fluency. Students are already using AI. The more important question is whether they are learning how to use it well in academic and professional contexts.

Dialogue can include:

  • Discussing where AI supports learning and where it may limit skill development
  • Exploring ethical gray areas and tradeoffs rather than treating AI use as purely right or wrong
  • Inviting students to reflect on how AI may shape their future work and professional roles

These conversations support career readiness by helping students build agency, responsibility and professional judgment in AI-rich environments.

  6. Teach transparent AI use as a professional skill. In the workplace, employees are increasingly expected to disclose when AI assisted their work. Students benefit from practicing this norm early, using clear and shared language for describing how AI was used.

Faculty can:

  • Provide short AI-use disclosure templates or structured reflection prompts.
  • Use tools such as the AI Assessment Scale (Furze) to clarify expectations for appropriate AI use across assignments.
  • Include transparency and justification of AI use as part of grading criteria.
  • Normalize ethical attribution and explanation as professional skills.

This approach builds confidence, reduces ambiguity around acceptable AI use, and aligns academic practice with professional expectations.

  7. Work collaboratively at the program and department level. AI fluency cannot live in a single course. When expectations vary widely across programs, students receive mixed signals.

Departments and programs can:

  • Discuss where AI use supports learning and where it undermines genuine skill development.
  • Align expectations across courses when possible.
  • Identify which AI literacies are most critical at different stages of a program.

This faculty-led, department-driven approach supports coherence and equity, especially for students navigating multiple instructors and modalities.

  8. Use reflection to strengthen human-AI collaboration. Finally, reflection helps students internalize what they are learning about themselves as thinkers and professionals.

Effective prompts include:

  • When did AI accelerate your work?
  • Where did it mislead you?
  • How did your judgment improve the outcome?
  • How might you use AI differently next time?

Reflection transforms AI from a tool into a learning partner.

Going through, not around

Perhaps the O.A.R. lyrics about going through, not around, remind us that disruption often brings possibility and clarity alongside discomfort.

The workforce is not asking graduates to avoid AI, fear it or embrace it uncritically. It is asking them to use judgment, communicate clearly, evaluate responsibly, and collaborate thoughtfully.

Those are human skills.

Community colleges are uniquely positioned to lead this work. By embedding AI fluency across disciplines, faculty can ensure that students graduate not just with credentials, but with the confidence and competence to navigate an evolving world of work.

We do not go around AI; we go through it.

And we help our students do the same.

* * *

Tina Rettler-Pagel, Ed.D., is a full-time faculty member at Madison College. She spends most of her time on projects and initiatives focused on digital learning, but also supports faculty in exploring and planning for the pedagogical opportunities of generative AI.

Kate Grovergrys, MA, is a full-time faculty member at Madison College in Wisconsin. She develops professional development on topics such as inclusive teaching practices and artificial intelligence.

Grovergrys and Rettler-Pagel are members of the ITC Affinity Group. In addition, they are both participating in a research project for Madison College’s Institute for Equity and Transformational Change focused on leveraging AI for inclusive teaching and learning.
