Teaching for Integrity in the Age of AI: From Compliance to Culture

Inspired by Chapter 2 of The Opposite of Cheating: Teaching for Integrity in the Age of AI by Tricia Bertram Gallant and David Rettinger

Academic integrity is not a checklist or compliance form. It is a living culture shaped by what we model, how we design, and the conversations we hold with our students. Bertram Gallant and Rettinger remind us that integrity is cultivated through transparency, design, and dialogue, not surveillance or punishment. The real challenge now is how to teach integrity in an age where AI is everywhere.

The U.S. Department of Education’s 2023 report, “Artificial Intelligence and the Future of Teaching and Learning”, encourages educators to treat AI as a design opportunity for advancing human-centered learning, not a threat to academic honesty. Recent data highlight the urgency of this work. According to the Higher Education Policy Institute’s 2025 Student AI Survey, 92% of undergraduates report using generative AI tools, up from 66% the year before. A Guardian report found that AI-related misconduct cases have tripled since 2023. The takeaway is clear: integrity education has to evolve alongside AI literacy.

2025 Snapshot: AI & Academic Integrity

Use and Attitudes

  • Over 85% of undergraduates use GenAI tools (Inside Higher Ed, HEPI 2025)

  • 61% of students want clear, course-level AI policies

  • 33% of students are concerned about being accused of plagiarism or cheating (Campus Technology)

  • 45% believe using AI for editing is “acceptable academic support”

Faculty Trends

  • A significant gap exists between student and faculty adoption: only 61% of faculty report using AI in teaching, and of those, a large majority (88%) do so minimally (ASEE AI Training 2025, led by Drs. Aditya Johri and Andrew Katz).

  • 82% of instructors use GenAI for feedback or rubric design (EDUCAUSE 2025 AI Landscape Study)

  • Detection tools now use watermarking and metadata tracing, but false positives remain a major concern (arXiv 2025)

Model Integrity

Students notice how we work. They learn from the way we check our sources, document decisions, and acknowledge mistakes. Modeling integrity starts with transparency.

As the EDUCAUSE 2025 AI Landscape Study notes, many universities are investing in training that helps faculty engage AI responsibly. Modeling integrity now means showing how to use AI intentionally, not avoid it.

This aligns with findings from Gu and Yan’s 2025 meta-analysis, which showed that students benefit most when teachers scaffold AI use and talk openly about it. When instructors frame AI as a learning partner, not a shortcut, students develop stronger judgment and accountability.

Make Integrity Explicit

Integrity should show up as often in our discussions as it does in our policies. When we talk about it before projects, during collaborations, and after challenges, students begin to see ethics as part of the learning process.

Bertram Gallant and Rettinger emphasize that ethical behavior thrives when it’s designed into the experience. Singer-Freeman, Verbeke, and Barre (2025) found that students across all academic levels want clear guidance on what’s acceptable AI use. If we make expectations explicit, we replace anxiety with understanding.

A recent MDPI review on Generative AI and Academic Ethics reinforces this point, noting that while GenAI can enhance engagement and efficiency, it also increases risks to originality and ethical reasoning.

Use Clear, Simple Language (The Social Institute)

Students need to understand AI policies in order to follow them. That means avoiding jargon and overly technical language.

Instead of: “AI assistance must align with established academic integrity principles.”

Say: “You may use AI for brainstorming ideas, but not for writing entire sections of code or essays.”

Establish Consistent Rules Across Departments or Schools

One of the biggest sources of confusion is inconsistent enforcement of AI rules. Departments or schools can develop universal AI guidelines that apply to all instructors, rather than allowing individual educators to set conflicting rules. Over half of students (58%) report that their school or program has a policy, but a substantial number (28%) say it varies, with some courses or professors having a policy and some not (Forbes 2025). Consider creating an instructor handbook outlining departmental or school-wide AI best practices to make sure they are consistently communicated to students.

Frame Integrity Positively

Instead of framing integrity around rules, frame it around growth. Students respond better when they see ethical choices as part of their professional development.

A Packback editorial on academic integrity in 2025 argues that punitive detection systems often erode trust and discourage learning. When faculty shift from surveillance to conversation, integrity becomes something students take ownership of, not something they fear.

Clarify Expectations

Ambiguity creates rationalization. In the age of AI, clarity is an act of fairness.

The National Centre for AI’s 2025 student study found that first-year students, in particular, feel confused about when and how AI use is acceptable. Faculty can address this by defining boundaries early and discussing examples. Transparency about tools, citations, and documentation helps students learn discernment.

Research from arXiv’s 2025 watermarking study adds that while detection tools are improving, they still make errors. Open conversations about what these systems can and cannot do build trust and understanding. Institutions like MIT and Duke University (22-minute mark) provide sample policy language for faculty to adapt. These statements define what “appropriate help” means and require students to cite AI contributions when used. Clarity transforms anxiety into accountability.

Normalize Conversations About Ethics

Ethics belongs in everyday learning. Conversations about bias, authorship, and data use should happen alongside technical instruction.

A 2025 study on synthetic media ethics found that students value open discussions about deepfakes and misinformation but often lack the frameworks to evaluate them. Integrating these discussions into our teaching helps students connect ethics to both academic and professional practice.

Use the Syllabus as a Moral Document

The syllabus sets the tone for integrity. Transparent grading policies, clear AI statements, and flexible revision options communicate fairness and care.

Universities are redesigning their syllabi and assessments to support “authentic learning” instead of reactive policing. The University of Melbourne’s Assured Learning model and the UCL Education AI Initiative are leading examples, focusing on oral exams, reflective portfolios, and transparent assessment design.

Respond to Misconduct Constructively

When integrity violations occur, they can become moments for growth. Reflection, accountability, and dialogue teach more than punishment ever could.

The Packback 2025 Integrity Report encourages “growth-oriented remediation,” noting that many flagged cases stem from confusion, not intention. At Indiana University, we can uphold policy while still approaching each case as a learning opportunity.

Building a Culture of Integrity

Integrity thrives when it’s shared across the institution. Faculty, staff, and students each play a role.

The University of New South Wales’ 2025 partnership with OpenAI illustrates this shift: giving staff controlled access to ChatGPT within a responsible use framework. When universities model integrity through their own practices, students learn that ethics is not a barrier to innovation—it’s the framework that sustains it.

Final Thought

Teaching for integrity in the age of AI is about creating conditions where honesty becomes the natural choice. When we model transparency, design for trust, and engage in open dialogue, we teach more than content—we teach character.

As Amanda McKenzie, Director of Academic Integrity at the University of Waterloo, Canada, shares, “Integrity is not the opposite of cheating. It’s the presence of purpose.” When that purpose runs through our teaching, policies, and partnerships, we do more than protect academic standards. We prepare students to lead with integrity in a world increasingly shaped by AI.