Excited to share that I've joined the Generative AI Faculty Fellows Program!
This opportunity comes at the perfect time. Rather than acquiring more knowledge about AI, I'm now focused on applying what I know to transform teaching and learning in meaningful ways.
The fellowship brings together colleagues committed to ethical, pedagogically sound AI integration. I'm particularly energized by the collaborative aspect—working alongside peers who understand both the transformative potential and the responsibility that comes with these tools.
My focus? Building systems that enhance student learning while maintaining critical awareness of AI's broader implications. This is about action, not just information.
For those who've followed my recent shift from "just one more course" to actually building: this is what application looks like. And I'm here for it!
Looking forward to sharing what we create together.
#HigherEd #FacultyDevelopment #TeachingAndLearning #GenAI #IU #FacultyFellows #GenAIFacultyFellows #AIInEducation #TeachingInnovation
The Growth Edge: When Learning Gets in the Way of Doing
For too long, I've been a perpetual student, always reaching for the next course, the next certification, the next piece of knowledge. The joy of learning is undeniable, but there's a quiet tension that comes with it: the gap between knowing and doing. Last week, that tension reached a breaking point, and I had a profound realization: it's time to stop learning and start building.
The Catalyst: A Familiar Impulse
It began with Mindvalley's new AI Clone Building program. My finger hovered over the purchase button, a familiar impulse to acquire more knowledge. I've taken Mindvalley courses before, and they've been genuinely valuable. Their approach to teaching AI through a personal development lens, rather than dry technical tutorials, helped me get started with custom GPTs. I'm not here to criticize their offerings.
But this time, something shifted. As I reviewed the program details, a moment of clarity struck me: I already know how to do most of this. And the parts I don't know? I can figure them out.
The Realization: What I Already Know
In fact, Mindvalley reinforced one of the most valuable lessons I still use today: audience reframing. It's not just about building a custom GPT or chatbot, but about teaching that GPT to understand who you're talking to and why it matters. For example, creating workshop materials for skeptical engineering faculty requires a different approach than writing for excited undergraduate researchers. Mindvalley calls this "context injection," and it's brilliant. This skill transfers to everything I do, from faculty development to coaching and curriculum design. It's fundamental to good teaching and coaching, not just an AI skill.
This realization extended to other areas of my AI journey:
I've already built custom GPTs. They are straightforward once you understand the basic structure: upload content, set instructions, test, and refine. The technical barrier is not as high as it feels.
I already understand context injection. I apply it in my faculty development work, coaching conversations, and curriculum design without needing another course.
I have access to excellent alternative resources. Domestika and Udemy offer technical AI courses for a fraction of the cost. YouTube provides detailed tutorials. My institution offers valuable tools and training. The knowledge is available; I just needed to recognize I can use it.
What truly resonated was this: I wasn't looking for information I didn't have. I was looking for permission to trust what I already know. And that realization? That's the good kind of uncomfortable.
The Shift: From Learning to Building
This shift from perpetual learning to active building is particularly significant now. I recently completed the Gallup Global Strengths Coach course and will soon take the certification exam. While I've informally coached for years, this formal training is opening new doors, allowing me to offer my coaching skills more intentionally. My connectedness strength has never been more relevant; I genuinely love supporting people in finding and reaching their goals.
Suddenly, AI tools are no longer abstract concepts. They are directly connected to the work I'm actively building. The coaching certification provided the urgency and clarity to move from simply acquiring knowledge to applying it to create tangible value.
What I'm Actually Building Now
My focus has sharpened, and I'm now actively engaged in:
Custom GPTs for coaching contexts. I'm developing assistants trained on my coaching voice to help with client session prep, follow-up materials, and workshop design. These tools will augment, not replace, human interaction, handling administrative tasks so I can focus on actual coaching.
Audience-adaptive content generation. I'm systematically applying context injection principles. A single workshop outline can now be adapted into five different versions, tailored for early-career faculty, mid-career researchers, or department chairs.
Streamlined faculty support workflows. I'm automating repetitive tasks like scheduling, resource compilation, and initial feedback drafts. This frees up more time for meaningful conversations that genuinely advance people's goals.
The technical skills are either already present or quickly learnable. What I'm building now is the confidence to trust those skills and act on them.
A New Framework for Learning
Looking back, I've consistently underestimated my ability to figure things out independently. My Top 5 CliftonStrengths (Learner, Ideation, Developer, Achiever, and Connectedness) suggest I'm built for self-directed exploration. Yet there's been a gap between intellectual understanding and emotional trust.
My Learner strength's growth edge is this: once I achieve competence, I often move to the next learning project instead of fully applying what I've learned. The accumulation of knowledge becomes satisfying in itself. With AI tools, I learned to build custom GPTs and understood the principles, but I didn't fully implement the systems I could use in my coaching and faculty development work. I kept learning more about AI instead of applying what I already knew.
This moment, recognizing I don't need another course but need to build the things I've been planning, is about working with this growth edge. The knowing is done. Now, I'm practicing the doing.
Questions to Ask Yourself
If you're considering more AI training, here are questions worth asking:
What specific skill are you missing? If you can name it precisely, you can likely find a targeted resource for less money than a comprehensive program.
Are you looking for information or confidence? Information is widely available. Confidence comes from actually building things and seeing them work.
What have you already learned that you're not giving yourself credit for? Seriously, make a list. You might be surprised.
What would you build if you already felt competent? Start building that. Competence follows action, not the other way around.
Resources If You're Building Your Own AI Systems
Since I enjoy helping people learn, here are resources that have genuinely helped me:
For Learning Custom GPT Creation:
OpenAI's official documentation (free, comprehensive)
Domestika's "Build Custom AI Assistants" courses ($10–40; aimed at architects, but with broadly transferable skills)
YouTube tutorials by AI Jason and Matt Wolfe (free, current)
For Content Transformation:
NotebookLM (free from Google, remarkably effective)
Claude Projects (what I'm using for this conversation)
ChatGPT Projects (with Plus subscription, $20/month)
Key Principle from Mindvalley Worth Keeping:
Context injection. Always tell your AI: Who is the audience? What's the background? What outcome do you want? What tone is appropriate? This framework transforms generic AI outputs into genuinely useful content.
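In practice, this framework amounts to prepending the same four answers to every request. Here is a minimal sketch of what that looks like as a reusable template; the function and field names are my own illustration, not part of any specific tool:

```python
# Sketch of the "context injection" framework as a reusable prompt template.
# The function name and example values are illustrative, not from Mindvalley
# or any particular AI product.

def build_prompt(task: str, audience: str, background: str, outcome: str, tone: str) -> str:
    """Wrap a task in the four context-injection questions before sending it to an AI tool."""
    return (
        f"Audience: {audience}\n"
        f"Background: {background}\n"
        f"Desired outcome: {outcome}\n"
        f"Tone: {tone}\n\n"
        f"Task: {task}"
    )

prompt = build_prompt(
    task="Draft a one-page workshop outline on AI-resilient assessment.",
    audience="Skeptical engineering faculty",
    background="Experienced instructors with limited hands-on time using generative AI",
    outcome="Three concrete assessment redesigns they could pilot this term",
    tone="Practical and evidence-based, no hype",
)
print(prompt)
```

Swapping only the `audience` line is how one workshop outline becomes five tailored versions.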
Conclusion: The Real Lesson
It's Saturday evening. I have custom GPTs and chatbots I've built before. I have new coaching skills I'm developing. I have a clear sense of what I want to create next.
What shifted this week wasn't learning new information. It was recognizing that I need to stop learning and start building. This is a hard shift for someone whose top strength is Learner. The pull to take "just one more course" is real. The satisfaction of acquiring new knowledge is immediate and tangible.
But building something meaningful from what I've learned requires a different kind of commitment. It means sitting with the discomfort of imperfect execution. It means choosing application over acquisition.
Mindvalley's programs provided foundations I'm genuinely grateful for. The audience reframing skill alone was worth the investment. However, the next phase of my AI learning won't come from another structured course. It will come from actually implementing the systems I've been planning, trusting my Learner, Ideation, and Developer strengths enough to create tools that serve my coaching and faculty development work.
This isn't a rejection of structured learning. It's recognizing when I've learned enough structure to create my own. More importantly, it's recognizing when continued learning becomes a way to avoid the harder work of actually doing. And honestly? That realization feels like growth.
Sometimes the most valuable learning doesn't come from the course you take. It comes from the moment you realize you don't need it anymore. I'm not saying I'll never take another Mindvalley course. I'm saying I've reached a point where I can choose to learn with them strategically, rather than reaching for them automatically because I don't trust my own competence. That's the difference. Now I just need to actually build the things I've been planning. The knowing is done. It's time for the doing.
If you want to talk through your own AI learning journey, coaching development, or just need someone to remind you that you probably know more than you think you do, I'm always up for that conversation. You can find me on LinkedIn or through my website.
A Note on This Post
Yes, I used Claude to help write this. Not because I couldn't write it myself (I’ve written blog posts for years), but because one of the things I've learned about AI is that it's excellent for helping you articulate thoughts you're still forming. The ideas are mine. The realization is mine. The voice is mine. Claude just helped me organize the rambling version into something readable. That's what good AI use looks like: augmenting your work, not replacing it.
The Ethical Tension I'm Still Sitting With
In all honesty, I haven't resolved my feelings about AI's broader impact. I work at Indiana University's Luddy School of Informatics, Computing, and Engineering. AI is central to several things we do. I help faculty integrate it into the curriculum, design AI-resilient assessments, and think through pedagogical implications.
But I'm also deeply aware of environmental costs, data privacy concerns, and the ways AI systems can perpetuate harm against marginalized communities. Faculty in my own school (people who build these systems) have a wide range of perspectives on appropriate AI use.
I don't think that tension is something to resolve by taking another course. I think it's the ongoing work of using powerful tools thoughtfully. I'm going to keep using AI, and I'm going to keep questioning how I use it. Both things can be true.
Week 3: Making Technical Coursework Matter to Students
The Week Three Reality Check
Let me be direct. Week three is when you lose them.
The novelty has worn off. Students are juggling five courses, jobs, relationships, and various other responsibilities. Deadlines are colliding. And at least one student in your room is thinking, “When am I ever going to use Big-O notation in real life?”
The research is clear on what happens next. When students cannot explain how coursework connects to professional practice, they are more likely to disengage, change majors, or do just enough to get by (Margolis & Fisher, 2002; Meyer & Marx, 2014). Expectancy-value theory explains why. Even confident students check out when they do not see task value (Wigfield & Eccles, 2000).
This is not a motivation problem. It is a visibility problem.
Your students already care about the tech world. They follow industry news, track hiring trends, argue about AI tools, and know which companies are growing or cutting teams. Our job is not to convince them the field matters. Our job is to make the connection between the course and the field explicit.
Below are three high-impact strategies you can implement without rebuilding your course.
Strategy 1: Cite Industry Certifications in Your Assignments
Why this works
Industry certifications represent consensus about what practitioners are expected to know. They are public, specific, and regularly updated. Most importantly, students recognize them as legitimate signals of professional value.
The 15-minute transformation
Take an existing technical assignment and add two things:
The certification exam domain it aligns with
A realistic professional scenario
Before
“Implement a binary search tree with insert, delete, and search operations.”
After
“The AWS Certified Developer exam, Domain 2.3, requires understanding data structure selection for application optimization. You are a junior developer evaluating data structures for a real-time leaderboard that processes 50,000 updates per minute. Implement a binary search tree, then write a one-page technical memo recommending whether this structure fits the use case. Cite time complexity and compare it to alternatives.”
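For instructors who want a starting point to hand students, a minimal version of the structure in question might look like this (class and method names are my own illustration, and `delete` is omitted for brevity):

```python
# Minimal binary search tree sketch for the assignment above.
# insert/search are O(log n) on a balanced tree but degrade to O(n)
# when keys arrive in sorted order -- exactly the trade-off the
# technical memo should weigh against alternatives.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

class BST:
    def __init__(self):
        self.root = None

    def insert(self, key):
        def _insert(node, key):
            if node is None:
                return Node(key)
            if key < node.key:
                node.left = _insert(node.left, key)
            elif key > node.key:
                node.right = _insert(node.right, key)
            return node  # duplicate keys are ignored
        self.root = _insert(self.root, key)

    def search(self, key):
        node = self.root
        while node is not None and node.key != key:
            node = node.left if key < node.key else node.right
        return node is not None
```

Whether this structure survives 50,000 updates per minute is precisely the question the memo should answer with complexity arguments, not intuition.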
Where to find certification standards
AWS Certifications: aws.amazon.com/certification
Google Cloud: https://cloud.google.com/learn/certification
CompTIA: comptia.org/certifications
Cisco: https://www.cisco.com/site/us/en/learn/training-certifications/certifications/index.html
NCEES: ncees.org
ISC²: isc2.org
PMI: pmi.org
Research foundation
Situated learning research shows that authentic professional contexts increase motivation and support knowledge transfer (Brown, Collins, & Duguid, 1989; Lave & Wenger, 1991). Guzdial and Tew (2006) found 10–15 percent improvements in retention when students could immediately see professional relevance.
Strategy 2: Let Students Reverse-Engineer Job Descriptions
Implementation: a 40-minute in-class activity
Preparation (10 minutes before class)
Pull 4–6 job postings from LinkedIn or Indeed. Mix experience levels from internships to mid-career. Choose companies students recognize. Most importantly, select postings that genuinely align with your course content.
In-class sequence
Part 1: Analysis (15 minutes)
Students work in groups of three or four. Each group analyzes two job postings and categorizes requirements into technical skills, tools, soft skills, project types, and education requirements.
Part 2: Pattern recognition (10 minutes)
Compile findings as a class. Students are often surprised to see communication and documentation listed in 70–80 percent of technical roles. Prompt discussion with a simple question: “What gaps exist between these requirements and your current skill set?”
Part 3: Course mapping (15 minutes)
Close the loop explicitly.
“This week’s database design project addresses the ‘design scalable data schemas’ requirement that appeared in every data engineering position we analyzed. Your deliverable includes both technical implementation and documentation, which aligns with the communication skills emphasized in most of these postings.”
Make it stick
Reference these connections throughout the semester:
In assignments: “Builds the API design skills listed in 14 of 18 backend developer roles”
In lectures: “This is why documentation matters. You identified it in nearly every posting”
On rubrics: “Professional communication, 20 percent. This aligns with the ‘translate technical concepts’ skill from technical lead roles”
Research foundation
Goal-setting theory shows that specific, meaningful goals improve motivation and performance (Locke & Latham, 2002; Morisano et al., 2010). Studies of computing job descriptions consistently show that employers value professional skills alongside technical competence (Radermacher et al., 2014).
Strategy 3: Teach With This Week’s Tech News
Why current events work
Students already follow tech news. When coursework connects to headlines, relevance becomes immediate rather than hypothetical.
A practical system
1. Set up a news radar (one time, 15 minutes)
Create Google Alerts for your language or field plus terms like vulnerability, breach, or breakthrough. Subscribe to Hacker News, relevant subreddits, and one or two industry newsletters.
2. Build flexibility into your syllabus
Designate one or two responsive assignments that can pivot based on current events. Example language: “Assignment 4 topic will be determined based on current industry developments, announced in Week 5.”
3. Move quickly
Current events have a two-week relevance window.
Examples
Cybersecurity
Following a major breach, students audit a sample application for vulnerabilities, produce a report using the OWASP Top 10 framework, and write an executive summary connecting findings to public reporting.
Cloud computing
After a major provider announcement, students design a cloud architecture for a realistic startup scenario, justify service choices, and compare costs across providers.
Data ethics or informatics
Students analyze a newly released AI model for bias, transparency, and deployment risks using ACM and IEEE ethical frameworks.
Engineering courses
Students analyze a recent infrastructure failure using course methods, examine code compliance, and propose inspection or mitigation strategies.
Research foundation
Authentic assessment and real-world relevance increase engagement and retention (Gulikers et al., 2004; Schell & Janicki, 2013).
Bonus Strategy: Strategic Microlearning for Prerequisites
The problem
Student preparation varies widely. You cannot spend week three reviewing basics without losing half the room.
The solution
Self-paced microlearning modules, three to five minutes each, focused on a single concept.
Why this works
Cognitive load theory and the spacing effect both support short, focused, just-in-time learning (Cowan, 2001; Sweller, 2011; Cepeda et al., 2006).
Example
Before advanced networking topics, provide optional modules on binary conversion, hexadecimal notation, or IP address structure. Each module includes a focused objective, a worked example, a few practice problems, and self-check solutions.
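The worked example inside such a module can be genuinely tiny. A sketch of what the binary/hexadecimal module might center on, using Python's built-in conversions (the specific values are illustrative):

```python
# Worked example for a 3-minute module: converting between decimal,
# binary, and hexadecimal -- the prerequisite skill for reading
# IP addresses and network masks.

value = 172  # first octet of the 172.16.0.0/12 private range

print(bin(value))  # 0b10101100
print(hex(value))  # 0xac

# And back again: int() accepts an explicit base.
assert int("10101100", 2) == 172
assert int("ac", 16) == 172

# Self-check exercise: convert each octet of an IPv4 address to binary.
octets = [int(part) for part in "192.168.1.10".split(".")]
print([format(o, "08b") for o in octets])
```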
Your Implementation Checklist
Set up Google Alerts
Identify two or three relevant certifications
Review 8–10 job postings
Build one or two responsive assignment slots into the syllabus
Create a job description analysis worksheet
Revise one major assignment to include explicit industry connections
Run the job description activity
Add real-world relevance statements to upcoming assignments
Create microlearning modules for common prerequisite gaps if needed
Assess breaking news within 48 hours for assignment potential
Reference job requirements in lectures and rubrics
Track which connections resonate and refine next term
The Bottom Line
Students in computing and engineering are not unmotivated. They often simply cannot see how coursework connects to what they care about: real work, current technology, and professional competence. The evidence is consistent. When students see personal and professional relevance, they persist longer, invest more effort, and perform better (Hulleman & Harackiewicz, 2009; Hulleman et al., 2010). You do not need to rebuild your course. You need to make visible the connections that experienced faculty already see.
Start small. Revise one assignment. Run one activity. Set up one alert.
Your students are already engaged with the tech world. Help them see how your course prepares them to enter it.
Leveling Up with Kindness: The Quiet Infrastructure of Joyful Teaching
Joy in teaching is often framed as enthusiasm, energy, or charisma.
But the longer I work alongside faculty and students, the clearer it becomes: joy is less about performance and more about infrastructure.
Joyful teaching does not magically appear because we love our discipline or design clever assignments. It emerges when the environment is humane enough for people to think, struggle, and grow without fear.
That is where kindness enters—not as softness, but as structure.
Recently, I came across Justin Mecham’s Level Up with Kindness framework, which organizes kindness into three layers: Foundational, Relational, and Cultural. Reading it alongside my own reflections on joyful teaching, I realized something important:
Joy is sustained by kindness that is practiced consistently, not occasionally.
Below is how I see these layers playing out in real classrooms, teaching teams, and academic cultures.
Foundational Kindness: The Conditions for Learning
Foundational kindness is not inspirational. It is practical.
And without it, joy cannot take root.
This includes things like:
Showing up on time
Respecting boundaries
Offering help without being asked
Acknowledging effort, not just outcomes
These behaviors may seem small, but they signal something essential: you matter here.
In my post on joyful teaching, I wrote about how students need to feel safe enough to engage deeply. Foundational kindness is what creates that safety. When expectations are clear, time is respected, and help is normalized, students are freed from guessing games. Cognitive energy shifts from self-protection to learning.
This is especially critical in computing, engineering, and data-heavy courses where students can often feel behind before they begin.
Joy does not come from removing rigor.
It comes from removing unnecessary friction.
Relational Kindness: Trust as a Teaching Practice
Relational kindness is where most people think kindness lives. But it is also where it is most often misunderstood.
This layer includes:
Listening without immediately fixing
Offering honest, specific praise
Supporting people during periods of stress
Celebrating others’ wins publicly
What stands out to me here is how intentional these practices are. None of them are automatic. They require attention, patience, and restraint.
In joyful teaching, relational kindness shows up when:
We let students explain their thinking before correcting it
We acknowledge effort even when the result falls short
We protect students’ dignity when they struggle publicly
It also shows up among colleagues when we stop treating burnout as a personal failure and start treating it as a design problem.
Relational kindness builds trust, and trust is what allows students and faculty alike to take intellectual risks. Without trust, everything feels performative. With it, learning becomes collaborative.
Cultural Kindness: The Signals That Shape Behavior
Cultural kindness is the hardest layer because it is collective, not individual.
This includes:
Welcoming new voices, especially quieter ones
Modeling emotional regulation under pressure
Leading with humility
Protecting team boundaries
Creating space for honest, hard conversations
Cultural kindness answers the unspoken question: What actually happens here when things get hard?
In joyful teaching cultures:
Mistakes are treated as data, not defects
Feedback flows in more than one direction
Saying “no” is seen as professionalism, not lack of commitment
This layer matters because students and junior faculty are always watching. They learn what is valued not from mission statements, but from reactions.
Joy cannot survive in cultures where people feel disposable.
Kindness Is Not Extra. It Is the Work.
One of the quiet myths in academia is that kindness is something you add after rigor, productivity, and excellence are addressed.
In reality, kindness is what makes those things possible over time.
Joyful teaching is not about being endlessly positive. It is about building systems where people can bring their full cognitive and emotional capacity to the work without burning out.
Kindness, practiced at all three levels, is not a personality trait.
It is a design choice.
And like any good design, it requires intention, iteration, and care.
A Closing Reflection
If joy feels elusive right now, it may not be because you are doing too little.
It may be because the system around you is asking for too much without enough kindness built in.
The good news is this: kindness scales.
And when it does, joy often follows.
The Flipped Tech Class: From Theory to (Computer) Terminal
In my last message, I emphasized that the first week is less about content and more about establishing a technical foundation. We looked at how visual technology roadmaps and active first-day engagement help students master the course's "operating system" so they can focus on the discipline itself. We also explored using LMS analytics for early support—recognizing that students who don't engage in Week 1 are at a higher risk of struggling later (Macfadyen & Dawson, 2010).
Now that students are comfortable navigating the tools, Week 2 presents the perfect opportunity to establish active learning norms through the Flipped Classroom model.
As the second week of the semester begins, the "honeymoon phase" often meets the reality of technical hurdles. In tech education, the flipped classroom isn't just a trend; it's a necessity. We are training students for an industry where "reading the documentation" (pre-class work) is the prerequisite for "shipping code" (in-class application).
The Pedagogical Shift
The flipped model moves passive content delivery (lectures) online, reserving the classroom for high-intensity problem-solving. The evidence is compelling: a meta-analysis of 225 studies found that active learning improved exam scores by about 6%, and that students in traditional lecture courses were 1.5 times more likely to fail (Freeman et al., 2014). For computing specifically, flipping improves both conceptual mental models and actual syntax proficiency (Hao, 2016).
Phase 1: Designing Effective Pre-Class Content
The success of your Tuesday lab depends entirely on what your students did Monday night. According to Mayer’s (2014) Cognitive Theory of Multimedia Learning, your digital materials should follow two key principles:
The Segmentation Principle: Keep videos under 9 minutes, and ideally closer to 6. Attention drops off a cliff after the 6-minute mark (Guo et al., 2014).
The Worked Example Principle: Students learn better by studying an expert's process than by being thrown into a problem without a map (Sweller, 2006).
One of the biggest misconceptions about the flipped classroom is that you must spend your weekend in a recording booth. In fact, curating high-quality existing content is often more effective than creating it from scratch.
Leverage the Global Tech Community
Don't reinvent the wheel. If a world-class engineer has already explained Binary Search Trees with high-end animations, use their expertise. Your value is not in being a "broadcaster," but in being the editor-in-chief who selects the most accurate and clear resources for your specific objectives.
Sharing Ad-Free YouTube Videos
Nothing kills student engagement like a 30-second unskippable ad for a VPN right in the middle of a technical explanation. To provide a "clean" viewing experience:
Use YouTube's "Embedded" Link: When you embed a video directly into your LMS (Canvas, Blackboard, Moodle), it often strips away the sidebar distractions and "up next" recommendations.
Modify the URL: Switch the domain to youtube-nocookie.com (e.g., youtube-nocookie.com/embed/VIDEO_ID) to enhance privacy and reduce tracking-based ads.
Import the Link Through Kaltura: Routing a YouTube link through Kaltura (or a similar tool your school may sponsor) usually removes the ads as well.
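If you share many videos each week, a small helper can rewrite standard watch links into the privacy-enhanced embed form. A sketch, assuming the common `watch?v=...` URL shape:

```python
from urllib.parse import urlparse, parse_qs

def to_nocookie_embed(url: str) -> str:
    """Rewrite a standard YouTube watch URL into the
    privacy-enhanced youtube-nocookie.com embed form."""
    query = parse_qs(urlparse(url).query)
    video_id = query["v"][0]  # assumes a watch?v=... link, not youtu.be or /shorts/
    return f"https://www.youtube-nocookie.com/embed/{video_id}"

print(to_nocookie_embed("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))
# https://www.youtube-nocookie.com/embed/dQw4w9WgXcQ
```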
Adding Interactivity
To turn a passive YouTube video into an active learning experience, use a tool like PlayPosit (recently integrated into WeVideo Interactivity) or Camtasia. This allows you to "wrap" a video in a layer of assessment.
How it works:
The Video Overlay: You take a YouTube video on Memory Management and set a "bulb" (interaction point).
The Interaction: At the 4:00 mark, the video automatically pauses. A sidebar opens with a coding snippet.
The Assessment: The student must identify the memory leak in the snippet before the "Play" button is re-enabled.
The Data Benefit: These tools sync directly with your LMS Gradebook. Before you even walk into the classroom, you can see a dashboard showing exactly which students watched the video and which specific question (e.g., "The Difference between Heap vs Stack") stumped the majority of the class.
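Under the hood, that dashboard view is a simple aggregation over quiz responses. A toy sketch of the idea, with an invented data shape standing in for what an LMS quiz export might contain:

```python
from collections import Counter

# Each record: (student, question, answered_correctly) -- an invented
# shape for illustration, not an actual Kaltura or PlayPosit export.
responses = [
    ("ana", "heap_vs_stack", False),
    ("ben", "heap_vs_stack", False),
    ("cam", "heap_vs_stack", True),
    ("ana", "find_the_leak", True),
    ("ben", "find_the_leak", True),
    ("cam", "find_the_leak", False),
]

misses = Counter(q for _, q, correct in responses if not correct)
attempts = Counter(q for _, q, _ in responses)

# Flag questions most of the class missed -- candidates for a
# just-in-time micro-lecture before the group activity.
stumpers = [q for q in attempts if misses[q] / attempts[q] > 0.5]
print(stumpers)  # ['heap_vs_stack']
```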
Example: Curated Module for Web Dev
Video A (YouTube): "How the Internet Works in 5 Minutes" (e.g., from Code.org or Kurzgesagt).
Video B (WeVideo Layer): A 10-minute deep dive on DNS Lookups with 3 embedded check-for-understanding questions.
The "Prep" Result: Students arrive knowing the difference between an IP address and a URL, allowing you to spend the entire class building a local server environment.
Phase 2: Maximizing In-Class Application
With the "what" out of the way, class time focuses on the "how." Here are three realistic structures for faculty:
Think-Pair-Share Code Reviews (20 min): Present a "dirty" Python function. Give them 3 minutes to find the bugs, 7 minutes to refactor with a partner, and 10 minutes to debate the "cleanest" solution. This leverages elaborative interrogation; explaining why a fix works deepens the memory (Dunlosky et al., 2013).
Mini-Project Sprints (40 min): Assign a "Ticket." Example: "Our API is slow. We have 500k records. Fix it." Teams must diagnose, prototype (index vs. cache), and document.
The Immersive Edge: Use VR or 3D web visualizations to show abstract concepts. Seeing a 3D visualization of a B-Tree traversal or a Network Topology can help students build spatial mental models that 2D slides cannot provide (Merchant et al., 2014).
The Achilles Heel: "They Didn't Watch the Video"
Strategy: Build Accountability via "Interpolated" Testing
The most common reason flipped classrooms fail is students arriving unprepared. Brame (2016) suggests that the medium of video is not inherently effective unless instructors promote active learning through specific interventions. If you don't address this, your active learning session will stall.
1. Use the "Testing Effect" to Drive Preparation
Instead of a simple "watch and hope" approach, package your videos with interactive questions. Research shows that students who answer questions interpolated (inserted) between short video segments perform significantly better on subsequent tests than those who simply watch the content.
Reduce Mind Wandering: Quizzing during or immediately after a video reduces reported instances of student "mind wandering" and keeps them focused on the technical concepts
Correct Overconfidence: Students often perceive video as "easier" than text and overestimate their mastery. Low-stakes quizzes provide immediate feedback, forcing students to accurately self-assess their understanding of a codebase or schema
Lower Exam Anxiety: Students who engage with interpolated questions report feeling less anxiety about final assessments because they have been "practicing" the recall of the material all along
2. Technical Implementation: Kaltura and PlayPosit
To implement this without adding to your grading load, use your campus tools to automate the process:
Kaltura Quizzes: Since we use Kaltura, you can import any YouTube video and add a native Interactive Video Quiz. The video will automatically pause at a key technical decision point, requiring the student to answer before proceeding.
PlayPosit (WeVideo Interactivity): Similarly, PlayPosit allows you to embed questions that sync directly with your LMS.
Guiding Questions: If you don't want to use a formal quiz, provide "guiding questions" for students to consider while watching. This focuses their attention on the most important elements of the video and improves performance on later tests.
3. Summary of Best Practices for Tech Videos
Based on the principles of Cognitive Load and Engagement, keep the guidelines above in mind when selecting or creating your Week 2 materials.
Additional Strategic Solutions for Accountability:
The Low-Stakes Quiz: Make a 5-question quiz due 2 hours before class. This increases viewing rates from 60% to nearly 90% (Brame, 2016).
The "Entrance Ticket": Students must submit one specific technical question about the video before they can join the day's group activity.
The First 5 Minutes: Run a live poll (TopHat/Mentimeter). If 80% get the diagnostic question right, move to the project. If not, do a 5-minute "Just-in-Time" micro-lecture.
The Flipped Tech Class Contract
To set the tone for the rest of the semester, here is a template you can post on your LMS. It shifts the "blame" from the instructor to the professional standards of the tech industry.
Our Learning Agreement
In this course, we use a Flipped Learning Model. This mirrors the industry reality of "Continuous Learning": a Senior Developer is expected to read the documentation (Pre-Class) before attending the Sprint Planning (In-Class).
Your Responsibility: Engage with the short video modules and complete the "Entrance Quiz."
My Responsibility: Ensure class time is never a repeat of the video. I will provide "Just-in-Time" coaching as you tackle complex builds.
The Reality Check: If you come prepared, you will build a portfolio of work during class hours. If you come unprepared, you will spend class time catching up, missing out on the expert feedback that improves your grade.
Conclusion: Looking Ahead to Week 3
As we move into Week 3, be prepared for a shift in student energy. The "novelty" of the semester often fades here, leading to a Motivation Crisis. In next week’s blog, we will explore Week 3: Connecting the Dots, where we pivot from how we learn to why it matters.
This strategy applies to a wide array of disciplines, not just tech:
Mining Professional Standards: In tech, we cite AWS or Cisco certifications. In Nursing or Education, this means aligning assignments with NCLEX or edTPA standards to show real-world relevance.
Reverse-Engineering Career Paths: We will analyze job descriptions from LinkedIn or Indeed. This helps English or History students see that "close reading" or "primary source analysis" are the "evidence-based decision-making" skills employers value.
Leveraging Current Events: We’ll discuss "pivoting" your syllabus to address real-world events—like using a recent data breach to teach Cybersecurity or a recent Supreme Court ruling to analyze Constitutional Law.
Microlearning Modules: We’ll look at providing focused, 3–5 minute "refresher" units on prerequisites. This allows Social Science students to review basic statistics or Language students to review verb conjugations without slowing down the rest of the class.
By situating assignments in these authentic contexts, we answer the fundamental question every student is asking: "Why should I care about this?"