Beyond the Hype: A Practical GenAI Resource Guide for Faculty in Technical Disciplines

As faculty who teach technical disciplines, you are in a unique position. You aren’t just figuring out how to use Generative AI; you are teaching the students who will build, deploy, and critically evaluate these tools for years to come.

The challenge is twofold:

  • How can you leverage AI to improve your own teaching (e.g., create coding examples, debug assignments, or design better projects)?

  • How can you effectively integrate AI into your curriculum as a core competency (e.g., teach prompt engineering, model limitations, and AI ethics)?

The internet is flooded with AI resources, and it’s impossible to sift through them all. This post is a practical, curated guide to help you find the most useful resources for your courses without the noise.

Start with IU: Key Local Resources

Before diving into the wider web, start with the excellent resources available directly from IU. These provide the foundational context and policies for our community.

Generative AI 101 Faculty Resources
Description: An overview of the GenAI 101 Course available to all at IU. Also includes a syllabus insert that can be used to promote the course to students.

Kelley School of Business “AI Playbook”
Description: A “living guide” developed by the Kelley School for faculty on the use of generative AI in teaching, grading, and research. It outlines shared values and emphasizes that faculty expertise remains central.

When to use: When you want faculty-facing guidance on when and how to use generative AI in assessments, course design, and feedback workflows.

A Quick Starting Point: Three Actionable Resources

If you want to branch out, here are three high-value resources to review in 10 minutes or less.

  1. For Your Curriculum: Teach CS with AI: Resource Hub for Computer Science Educators

    • What it is: A hub specifically for integrating AI into CS courses. It includes lesson plans, project ideas, and pedagogical strategies for teaching AI in computing.

    • When to use: When you’re not just using AI, but actively teaching AI concepts, ethics, or applications within a CS or Informatics course.

  2. For Your Pedagogy: Harvard University’s “Teaching with Gen-AI” resources

    • What it is: High-level guidance from Harvard on course design, with excellent case studies and strategies for handling risks like hallucinations and superficial reasoning.

    • When to use: Use this before the semester starts. It’s perfect for designing your syllabus, setting AI policies, and building responsible use guidelines into your course from day one.

  3. For Your Students (and You): AI for Education: “Effective Prompting for Educators”

    • What it is: A focused guide on how to write better prompts. It includes frameworks (like the “5 S Framework”) that are perfect for teaching students a structured approach to “prompt engineering.”

    • When to use: When you want to move students beyond simple “ask-and-receive” and teach them how to partner with AI to get better, more reliable, and more complex results.
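
To make that concrete, here is a minimal sketch of what moving from “ask-and-receive” to structured prompting can look like. The example is my own, and the labeled fields are generic scaffolding rather than the actual “5 S Framework” categories from the guide.

bare_prompt = "Explain recursion."

structured_prompt = """
Role: You are a teaching assistant for an introductory Python course.
Task: Explain recursion to a student who is comfortable with loops but has never seen recursion.
Constraints: Use one worked example (factorial) and stay under 200 words.
Output format: A short explanation followed by a five-line code snippet.
"""

# The structured version gives the model a role, an audience, constraints,
# and a target format; the bare version leaves all of that to chance.
print(bare_prompt)
print(structured_prompt)

Either string can be pasted into whatever chat tool your students use; the point is the scaffold, not the code.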

The Deep Dive: A Curated Resource Library

For those with more time, here is a more comprehensive list organized by task.

1. How to Use AI in Your Classroom (Pedagogy & Assignments)

2. Helping Students (and You) Get Better at Prompting

  • AI for Education: Prompt Library

    • Description: A comprehensive, searchable collection of ready-to-use prompts and templates specifically for educators.

    • When to use: When you need quick, plug-and-play prompt templates for lesson plans, student tasks, or administrative work.

  • More Useful Things — Prompt Repository for Educators

    • Description: A repository of prompts for instructor aids and student exercises, curated by researchers Ethan and Lilach Mollick.

    • When to use: When you want tested, inspiring prompt sets, especially for idea generation or in-class activities.

  • Anthropic Prompt Library 

    • Description: A public library of optimized prompts for business, creative, and general tasks from Anthropic, the maker of Claude.

    • When to use: When you want to show students (or yourself) “what good prompting looks like” from an industry leader.

3. How to Teach AI in Your CS/InF Courses (Curriculum & Literacy)

  • Teach CS with AI: Resource Hub for Computer Science Educators

    • Description: A hub dedicated to integrating AI topics, tools, and teaching strategies in CS courses.

    • When to use: Use when teaching a CS course and you want to integrate AI content (topics, labs, projects) directly.

  • metaLAB at Harvard: The AI Pedagogy Project / AI Guide

    • Description: A curated site with assignments and projects to integrate AI in pedagogical practice, focused on critical thinking.

    • When to use: When you are designing a module on AI literacy, critical AI thinking, or assessing students’ interaction with AI tools.

  • Ideeas Lab: Teaching & AI resources

    • Description: A resource hub with teaching materials and tools, particularly aimed at engineering and technical fields.

    • When to use: When you want resources specifically tailored for engineering domains that integrate AI in assignments.

  • AI for Education: “Generative AI Critical Analysis Activities”

    • Description: Classroom activities to help students critically examine AI outputs, ethics, and limitations.

    • When to use: When you want to design modules around AI ethics or have students evaluate AI rather than simply use it.

4. Taking it Further: Building Your Own AI Tools

5. Professional Development & Staying Current

  • IBM Skills Build for Educators: College Educators resources

    • Description: A professional development site offering modules and training materials to build AI fluency and integrate digital skills into teaching.

    • When to use: When you want a structured PD path for yourself or want to build a course around AI literacy and workforce readiness.

  • University of Maine: LearnWithAI initiative

    • Description: A practical, “how-to” oriented site for faculty on integrating AI into courses.

    • When to use: Use when you want a site focused on faculty development and practical course integration.

  • Future-Cymbal Notion Page: Shared collection of AI-Teaching Resources

    • Description: A collaboratively curated Notion page of ideas, links, and frameworks on AI in education; it is less a formal guide and more an open aggregation of resources.

    • When to use: Use when you want to browse a broad, ever-updating set of ideas rather than a polished handbook.

  • AI Resources – Lance Eaton

    • Description: A wide-ranging collection of resources for educators on generative AI in the classroom, including sample syllabus statements, institutional policy templates, teaching ideas, and faculty development materials.

    • When to use: When you are designing or revising your course syllabus and need clear language about how you will (or won’t) allow AI tools in student work.

  • Newsletters for Staying Current:

    • The Rundown – Daily newsletter summarizing AI news across research, policy, and industry.

    • The Neuron – Broad coverage of emerging AI trends and commentary, often with education-adjacent insights.

    • The Batch – Weekly deep dives into AI research, tools, and development—ideal for those following the tech side.

    • The Algorithmic Bridge | Alberto Romero – Thoughtful essays analyzing AI’s social, ethical, and educational impact.

    • Everyday AI Newsletter – Daily newsletter (and accompanying podcast) aimed at making AI accessible to “everyday people” whether educators, professionals, or non-tech specialists.

Conclusion: Start Small, Start Now

You don’t need to redesign your entire curriculum overnight. The best approach is to start small.

Pick one thing to try this month. It could be using a prompt library to help you write a coding assignment, adapting a syllabus policy, or introducing one critical analysis activity in a senior seminar. By experimenting now, you’ll be better prepared to lead your students in this new, AI-driven landscape.

Did I miss a great resource? Leave a comment and let me know!

Building a Framework for Academia-Industry Partnerships and AI Teaching and Learning Podcasts

In March, I shared an overview of the “Practitioner to Professor (P2P)” survey that the CRA-Education / CRA-Industry working group analyzed. They recently released a report titled “Breadth of Practices in Academia-Industry Relationships,” which explores a range of engagement models, from research partnerships and personnel exchanges to master agreements and regional innovation ecosystems.

Key Findings and Observations

The report organizes its findings from the workshop into three categories: observations, barriers, and common solutions:

  • Observations: A major theme was the critical need to embed ethical training into AI and computing curricula through both standalone courses and integrated assignments. It was noted that while academia is best suited to drive curriculum development, input from industry is essential to ensure the content remains relevant to real-world applications.

  • Barriers: Key barriers to successful collaboration were identified, including cultural differences and misconceptions between academic and industry partners. For instance, industry’s focus on near-term goals can clash with academia’s long-term vision. A significant practical barrier is the prohibitive cost of cloud and GPU hardware, which limits students’ experience with cloud and AI development tools.

  • Common Solutions: Effective solutions include the fluid movement of personnel between organizations through internships, co-ops, sabbaticals, and dual appointments. Streamlined master agreements at the institutional level also help facilitate research collaborations by reducing administrative friction.

Strategies for Research Collaboration

The report outlines a multi-level approach to enhancing research partnerships:

  • Individuals: Faculty and industry researchers can initiate relationships through internal seed grants, sabbaticals in industry, dual appointments, and by serving on industry advisory boards.

  • Departments: Departmental leaders can foster collaboration by strategically matching faculty expertise with industry needs, offering administrative support, and building a strong departmental brand with local industry.

  • University Leadership: Senior leaders can address systemic barriers by creating a unified, institution-wide strategy, developing flexible funding models, and implementing master agreements to streamline partnerships.

  • Regional Ecosystems: The report emphasizes the importance of universities partnering with local industries and startups to build thriving regional innovation ecosystems, which can drive economic development and secure government support.

Education and Workforce Development 

With the rise of generative AI, the report highlights an urgent need for universities and industry to partner on education.

  • Curriculum Adaptation: Computing curricula need to be updated to include foundational concepts in DevOps and scalable systems, which are often not part of the core curriculum. While AI literacy is essential, the report suggests a balance, with 80% of instruction remaining focused on core computer science skills. Ethical reasoning should be integrated throughout the curriculum, not just in a single course.

  • Workforce Programs: To meet industry demands for job-ready graduates, the report advocates for university-industry partnerships in co-op programs, internships, and capstone projects. It also points to the need for universities to offer flexible programs like certificates and online courses to help upskill and reskill the existing workforce.

Recommendations

The report concludes with five main recommendations for universities, industry, and government:

  1. Enhance research impact by combining academia’s long-term vision with real-world problems from industry. This can be achieved by embedding faculty in industry and industry researchers in universities.

  2. Leverage the convening power of universities to build partnerships that benefit the wider community, using mechanisms like industrial advisory boards and research institutes.

  3. Accelerate workforce development by aligning university programs with regional innovation ecosystems and having industry invest in talent through fellowships and internships.

  4. Deliver industry-relevant curricula grounded in core computing principles, and collaborate with industry experts to co-design courses in high-demand areas like AI and cloud computing.

  5. Establish new incentives and metrics to recognize and reward faculty for their contributions to industry partnerships in promotion and tenure evaluations.

AI Teaching and Learning Podcasts: What If College Teaching Was Redesigned With AI In Mind?

https://learningcurve.fm/episodes/what-if-college-teaching-was-redesigned-with-ai-in-mind

A former university president is trying to reimagine college teaching with AI in mind, and this year he released an unusual video that provides a kind of artist’s sketch of what that could look like. For this episode, I talk through the video with that leader, Paul LeBlanc, and get some reaction to the model from longtime teaching expert Maha Bali, a professor of practice at the Center for Learning and Teaching at the American University in Cairo.

The Opposite of Cheating Podcast

https://open.spotify.com/show/5fhrnwUIWgFqZYBJWGIYml

Produced by the authors of the book of the same name, this podcast shares the real-life experiences, thoughts, and talents of educators and professionals who are working to teach for integrity in the age of AI. The series features engaging conversations with brilliant innovators, teachers, leaders, and practitioners who are both resisting and integrating GenAI into their lives. The central value undergirding everything is, of course, integrity!

Teaching in Higher Ed podcast, “Cultivating Critical AI Literacies with Maha Bali”.

https://teachinginhighered.com/podcast/cultivating-critical-ai-literacies/

In the episode, host Bonni Stachowiak and guest Maha Bali, a Professor of Practice at the American University in Cairo, explore the complexities of integrating artificial intelligence into higher education.

Bali advocates for a critical pedagogical approach, rooted in the work of Paulo Freire, urging educators to actively experiment with AI to understand its limitations and biases. The discussion highlights significant issues of cultural and implicit bias within AI systems. Bali provides concrete examples, such as AI generating historically inaccurate information about Egyptian culture, misrepresenting cultural symbols, and defaulting to stereotypes when prompted for examples of terrorism.

The Actual Intelligence podcast

The host speaks with Dr. Robert Neibuhr from ASU about his recent article in Inside Higher Ed, “A.I. and Higher Ed: An Impending Collapse.” Full podcast: https://podcasts.apple.com/us/podcast/is-higher-ed-to-collapse-from-a-i/id1274615583?i=1000725770519

With Bill Gates having just said that A.I. will replace most teachers within ten years, it seems essential that professional educators attune to the growing presence of A.I. in education, particularly its negative gravitational forces.

“A Map Makes You Smarter. GPS Does Not.”: A Story About AI, Work, and What Comes Next with Jose Antonio Bowen

Jose Antonio Bowen is introduced as a Renaissance thinker with a jazz soul. His background includes leadership roles at Stanford, Georgetown, and SMU, as well as serving as president of Goucher College. He is also a jazz musician who has played with legends, a composer with a Pulitzer-nominated symphony, and the author of “Teaching Naked” (30% off with code TNT30 at Wiley), “Teaching Change,” and “Teaching with AI” (30% off either of the latter two with code HTWN at JH).

He provided a workshop for us on AI Assignments and Assessments, where he mentioned:

“A map makes you smarter. GPS does not.”

It was such a small, quiet moment, but it cracked open something bigger. Because this wasn’t just about directions. It was about how we’re all starting to think less, remember less, and—if we’re not careful—become less, all thanks to the technology we depend on.

The Decline of Entry-Level Everything

Dr. Bowen shared that Shell, a global energy giant, had laid off nearly 38% of a particular workforce group. Internships? Vanishing. Entry-level jobs? Replaced.

Replaced by what?

Artificial Intelligence

Tasks that used to belong to interns or fresh graduates—writing reports, creating slide decks, analyzing data—are now handled by machines that don’t take lunch breaks or need supervision.

And that’s where the real twist came in: the people who still have jobs? They’re not the ones who can do the task better than AI. They’re the ones who can think better than AI. Who can improve, refine, and oversee what AI produces.

If AI is writing the first draft, the humans left in the room better know how to write the final one—with nuance, clarity, and insight.

Offloading Our Minds, One Task at a Time

Back to that GPS quote. Dr. Bowen called it “cognitive offloading”—how we gradually stop using certain mental muscles because tech is doing the lifting.

We used to memorize phone numbers, navigate with paper maps, even mentally calculate tips at restaurants. Now? We ask Siri.

The scary part isn’t that we’re forgetting how to do these things. It’s what happens when we offload creativity, problem-solving, and thinking itself.

Because if AI can be creative—can write poems, code apps, design marketing plans—what do we do? What’s left for us?

Creativity, Reimagined

But here’s where things got interesting. Dr. Bowen isn’t anti-AI. In fact, he practically gushed about it.

He showed how AI can be used to spark creativity, not stifle it.

He explained how students could upload a 700-page textbook and have the AI turn it into a podcast. A nine-minute podcast. With baseball analogies, if that’s what helps them learn.

He talked about using AI to create personalized assignments: instead of a generic math problem about trains, give a politics student a question about voter turnout rates. Suddenly, they care. Suddenly, they’re engaged.
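
Here is a hypothetical Python sketch of how that kind of personalization could be automated; the template text, names, and interests below are my own illustrations, not examples from Bowen’s workshop.

# Generate one personalized prompt per student by filling in their interest.
ASSIGNMENT_TEMPLATE = (
    "Write a word problem that requires a two-sample t-test, set in the "
    "context of {interest}. Use realistic numbers and ask the student to "
    "interpret the p-value in plain language."
)

students = [
    {"name": "Ada", "interest": "voter turnout in midterm elections"},
    {"name": "Sam", "interest": "minor-league baseball attendance"},
]

for student in students:
    prompt = ASSIGNMENT_TEMPLATE.format(interest=student["interest"])
    # Each prompt would then be sent to whichever chat model your campus supports.
    print(f"Prompt for {student['name']}:\n{prompt}\n")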

Because AI isn’t replacing the teacher—it’s becoming the chalk, the blackboard, the entire toolset that a smart educator can use to make learning come alive.

Prompt Like a Pro

Here’s another nugget that stuck with me: prompting isn’t coding. It’s storytelling.

Don’t just ask the AI to “fix your proposal.” Ask it to “transform your proposal into something your provost will love.”

Use emotion. Use intent. Give context. AI, it turns out, responds best when it knows what you’re really trying to say.
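
As a small illustration of that advice (my wording, not Bowen’s exact example), compare a bare request with one that carries audience, intent, and context:

before = "Fix my proposal."

after = (
    "You are reviewing a one-page proposal for a new data-science elective. "
    "The reader is our provost, who cares most about enrollment and cost. "
    "Rewrite the proposal so the value to students is obvious in the first "
    "paragraph, and flag any claim that still needs supporting numbers."
)
# The second prompt tells the model who will read the result and what
# "better" means, which is exactly the context a bare command omits.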

The 70% Problem

Still, AI isn’t perfect. Dr. Bowen introduced what he called the “70% problem.”

AI can do a lot of things—but only up to a C-level standard. That’s fine for a rough draft. It’s dangerous for a final product.

If students rely on AI to do the work, and they can’t take it past that 70% mark, then what happens when employers expect more?

The solution? Raise the bar.

What used to be acceptable for a B or C should now earn an F—unless the student can make the AI’s work better, smarter, more human.

From Tools to Teaching Assistants

The future of education, he argued, is not about banning AI—it’s about designing with it.

He showed how:

  • Teaching assistants could use AI notebooks filled with chemistry texts to answer student questions on the fly.

  • AI can test business plans, simulate presidential decisions, or offer critiques from the perspective of a political opponent.

  • Students can train AI to “be” Einstein and ask it about thermodynamics at their own pace, in their own language.
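
A minimal sketch of that last idea, the persona tutor, might look like the following. The role/content message structure is the common chat-API convention; the commented-out client call and model name are placeholders, not any specific vendor’s API.

persona_system_prompt = (
    "You are role-playing Albert Einstein as a patient physics tutor. "
    "Answer questions about thermodynamics at an introductory level, "
    "check the student's understanding after each answer, and adjust your "
    "pace and language to the student's replies."
)

messages = [
    {"role": "system", "content": persona_system_prompt},
    {"role": "user", "content": "Why can't heat flow from a cold object to a hot one on its own?"},
]

# response = client.chat(model="<campus-approved-model>", messages=messages)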

AI isn’t replacing teachers—it’s becoming part of the classroom, like textbooks once were.

The Arms Race

Of course, there’s a darker side. AI can cheat. It can take online courses for students, fake typing patterns, even simulate human error.

Dr. Bowen called it an “arms race” between those building smarter AI and those trying to prevent it from being misused.

But even in this, he saw hope.

If educators embrace AI—not as an enemy but as a creative partner—they can design assignments AI can’t complete alone. They can build simulations, storytelling challenges, and editing tasks that require a human mind.

Because at the end of the day, that’s what this moment demands: humans who think more deeply, ask better questions, and create things worth remembering.

Final Words

The session ended with a simple truth:

“AI raises the floor. You must raise the ceiling.”

Whether you’re a student, a teacher, a manager, or a job-seeker, AI is now the baseline.

It will write the first draft, sketch the first idea, solve the first problem.

But it’s still up to us to bring the brilliance.

AI can produce work at a “C” level, which is problematic if students can only perform at that level. Instructors need to raise their standards and expectations. Assignments that would have been considered a “C” should now be evaluated as an “F” if they only meet the level of quality that AI can produce.

Implications

Students need to surpass AI capabilities to be competitive in the job market, especially in fields like coding and writing.

And maybe—just maybe—it’s time we all learned to read the map again.

Midterm Feedback via Google Illuminate | GenAI Essentials | Upcoming Opportunities

It’s a great time to think about receiving mid-semester feedback on your course. I have written about this topic in the past on my blog; however, this time I used a new resource that might be helpful for multiple purposes:

Google Illuminate is an AI tool that creates podcast-style audio summaries of research papers. It’s designed to make complex academic information more accessible and engaging.

How it works

  1. Input a research paper, PDF, or URL into Illuminate*, or search for a topic.

  2. Illuminate generates an audio summary that explains the key points of the document.

  3. You can ask questions about the document, and Illuminate will provide a text output.

  4. You can convert the text output into a podcast answer.

*I do NOT recommend inputting copyrighted material into this tool. However, it is optimized for computer science topics and supports research papers hosted on arXiv.org.

I used some of my previous blog posts to create a podcast that covers several aspects of creating, developing, and analyzing mid-semester feedback, using the following prompt:

Create a relaxed and spontaneous conversation with a laid-back and curious host and a lively, fun, and relatable guest. They’ll dive into the topic in a free-flowing, casual style that feels like you’re eavesdropping on a chat between friends.

I used 4 resources from my blog:

[Screenshot of what is now Google Notebook, showing a prompt for picking an audio style for a podcast generated from my blog posts.]

A 4-minute conversational podcast on providing mid-semester feedback was generated, along with a text transcript; it gives a decent overview of the topic.

[Screenshot of the podcast interface.]

You might use this tool in your class to generate overviews of dense topics that students can listen to and/or read.

Resource to share with students:

GenAI Essentials: Practical AI Skills Across Disciplines (Student-Facing)

https://expand.iu.edu/browse/learningtech/courses/genai-essential-skills

 

This course, developed by the Learning Technologies division of University Information Technology Services (UITS), covers:

  • Prompt Engineering – Crafting precise prompts to generate accurate and useful AI outputs.

  • Evaluating AI-Generated Content – Assessing reliability, credibility, and biases of AI-produced information.

  • Ethics and Limitations of GenAI – Understanding responsible AI use, ethical considerations, and potential risks.

  • Information Literacy in a GenAI Age – Applying verification strategies and library resources to fact-check AI-generated sources.

  • Studying and Learning with GenAI – Using AI tools for note-taking, summarization, and personalized learning support

Resource for you

The faculty-facing version of the course (https://iu.instructure.com/enroll/M7FE9E) covers the same topics, but also includes assignment templates and rubrics that you can incorporate into your own course:

[Screenshot from the Canvas Gen AI Essentials at IU course.]