The AI elephant in the classroom: Why we need transparent AI practices


Won’t it be strange when the day comes that you’re scrolling through your favorite news app like Flipboard, Pocket or Apple News, only to find that the single headline that actually speaks to you is penned by AI? Could today be that day?

As generative AI gradually becomes part of our daily lives, it presents educators with a pressing challenge: how can we harness the transformative potential of tools like ChatGPT, Bing or Google Bard while ensuring that students continue to engage in meaningful cognitive work? Further, how do we help educators expand their authentic assessment models rather than reverting to invasive proctoring and blue books?

Editor’s note: The Instructional Technology Council continues its series of articles focusing on the anticipated impact of technology and distance learning over the next decade.

In the 1980s, countless schools built computer labs in response to the emergence of word processors. Today, amidst the daily groundswell of news about generative AI tools, or “knowledge processors” as the author of this paper likes to call them, schools don’t yet know how to respond. Educators commonly report being in the dark about whether their students use AI. Perhaps one of the most important responses schools can make is simply to find out — by promoting “Transparent AI Practices” (TAP) among our students. We know that metacognitive, reflective practices work for our students. Why not return to them — and incorporate AI and tech use into that process?

Six transparent AI practices

As considerable research has shown, metacognitive reflection helps students take ownership of their learning and develop a growth mindset (Brookfield, 1995). As AI becomes integral to students’ learning experiences, educators have a unique opportunity to re-prioritize reflective student practices while also addressing the need to learn more about student AI usage. With this in mind, here are six “GROWTH” strategies that might support AI-conscious teaching and learning:

  • Guide students in AI exploration and reflection. Encourage students to openly discuss their use of AI tools during the learning process, identifying specific ways they utilized AI to support their inquiry, creation or revision.
  • Reflect iteratively on your process as a means to model TAP. Disclose your own AI tool usage and explain how it informs your teaching strategies to better promote transparency. Encourage iterative reflection among students — especially amidst project-based learning activities. 
  • Open class reflection toward ethical considerations. Encourage discussions about the role of AI in education and society, exploring the potential benefits and drawbacks of AI integration or AI “guard rails.”
  • Work collaboratively with students. Partner with students to find effective ways to use AI tools responsibly and ethically to address cognitive questions connected with your course(s).
  • Teach metacognitive reflection strategies. Help students build on strengths, set individualized goals, celebrate accomplishments and promote self-regulated learning through metacognitive reflection exercises. Take a process-oriented assessment approach that asks students to share working portfolio folder links that document all drafts connected with a project.
  • Honor each student’s “Funds of Knowledge.” Recognize and value their authentic perspectives and original ideas, emphasizing human creativity and critical thinking that emerge separately or alongside the use of AI in the learning process.

How can we implement more specific GROWTH practices? My co-writer — who says as he edits my text that “co-writer isn’t quite the right word” — likes to promote “self-annotations” at the close of assignments. For project-based learning, he also refers to these process notes as “iterative reflections” (building on the discourse of iterative design). As we enter this era of AI, such a practice, if infused with reflections on AI, might allow students not only to reflect on their process but also to examine how they used AI or other technologies throughout that process.

Schools and teachers cannot wait for edtech to figure out how it will integrate AI or gather student usage data. Now is the time to advance clear strategies that scaffold both student growth and responsible AI use. How else might we ensure that the AI elephant in the classroom becomes a welcomed friend?

TAP Note: This essay was written with the assistance of ChatGPT (GPT-4), demonstrating how AI can be used in the creative and reflective process. Through dozens of queries, this author took a prompt-limited approach to revision, which he believes took more time than writing the paper without AI — but he enjoyed the process all the same. The citations in this essay were generated by me, but he used a separate AI tool, Google Bard, to fact-check all sources and content.

My co-author, a human educator, laughed out loud when reading the start of this sentence that I offered to him. For one, he’s never been called a human educator — and wondered, then, whether I think of AI as an artificial educator. He also does not see me as a co-author, because the ideas were his, and he chose to include his name alone in the byline given the question of what authorship implies. As he sees it, because he is ultimately responsible for the cognitive work and review of this piece — and because AI cannot be responsible — this seems the best path, though he’s not sure. In this light, he also celebrates how research journals are likewise asking authors to disclose the nature of their AI use.

Finally, as the AI behind the GROWTH acronym, I take pride in having offered this playful framework for human and AI collaboration in education.

About the Author

Reed Dickson
Reed Dickson is the program manager for faculty development at Pima Community College (Arizona), where he teaches courses for faculty in online pedagogy, instructional design and educational technology. Connect with him on Twitter: @ReedDicksonUX