Pedagogical Tips for the Start of the Semester

The first weeks of the semester are a unique window to shape not only what students will learn, but how they will learn. In STEM courses, where concepts can be abstract, skill levels vary wildly, and technologies evolve quickly, intentional, evidence-based practices can help you set students up for long-term success.

Below are a few strategies with examples and tools you can implement immediately.

Design an Inclusive, Transparent Syllabus

Evidence base: Transparent teaching research (Winkelmes et al., 2016) shows that when students understand the purpose, tasks, and criteria for success, they perform better.

Implementation tips:

  • Purpose statements: For every major assignment, include a short note on why it matters and how it connects to industry or future coursework.
    Example: “This database schema project builds skills in relational modeling, which are directly relevant to backend software engineering interviews.”

  • Clear expectations: Break down grading policies, late work policies, and collaboration guidelines into plain language, avoiding overly technical or legalistic phrasing.

  • Accessibility & flexibility: Link to tutoring labs, office hours, online learning resources, and note-taking tools. Indicate whether assignments can be resubmitted after feedback.

  • Quick reference: Create a one-page “Quick Reference” sheet covering key policies (late work, collaboration, grading).

  • Norm-setting: Add a “Community Norms” section that covers respectful code reviews, how to ask questions in class, and expectations for group work. In large classes, also set expectations for respectful online discussion and effective use of the Q&A forum (e.g., checking whether a question has already been asked), and provide group-work guidelines where applicable (e.g., conflict resolution strategies).

Establish Psychological Safety Early

Evidence base: Google’s Project Aristotle (2015) and Edmondson’s (1999) work on team learning show that psychological safety, where students feel safe to take intellectual risks, is essential for high performance.

Implementation tips:

  • Low-stakes start: In week one, run short, open-ended coding challenges that allow multiple solutions. Make it clear that mistakes are part of the process.

  • Anonymous polls: Start with anonymous polls about programming experience to acknowledge the diversity of backgrounds in the room.

  • Instructor vulnerability: Share a personal example of a bug or failed project you learned from. This normalizes challenges in programming. In a large lecture, you can briefly mention common misconceptions students often have with a new concept, and how to navigate them.

  • Model Constructive Feedback: When providing feedback on early assignments (even low-stakes ones), focus on growth and learning. When addressing common errors in a large class, frame it as an opportunity for collective learning rather than pointing out individual mistakes.

  • Multiple communication channels: Set up a Q&A platform (e.g., InScribe) where students can post questions anonymously.
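For instance, a week-one, low-stakes challenge might look like the sketch below. The prompt and reference solution are illustrative only, not from any particular course; any exercise with several defensible approaches (a plain loop, a dictionary, a library helper) works equally well:

```python
# Week-one warm-up (illustrative): find the most frequent word in a string.
# Multiple valid solutions exist, which makes this a good prompt for
# comparing approaches rather than hunting for one "right" answer.
from collections import Counter

def most_common_word(text: str) -> str:
    """Return the most frequent word in `text` (ties broken arbitrarily)."""
    words = text.lower().split()
    return Counter(words).most_common(1)[0][0]
```

Asking students to share and compare their versions turns the exercise into a discussion of trade-offs rather than a test.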

Use Early Analytics for Intervention

Evidence base: Research on active learning and engagement (Freeman et al., 2014) shows that structured engagement improves student performance in STEM; monitoring engagement early in the term allows for timely support.

Implementation tips:

  • Student Engagement Roster (SER): https://ser.indiana.edu/faculty/index.html. During the first week of class, consider explaining the SER to your students and telling them how you will be using it. If students are registered for your class and miss the first class, report them as non-attending in SER; this allows outreach that can help clarify their situation. Here’s sample text you could put into your syllabus:
    This semester I will be using IU’s Student Engagement Roster to provide feedback on your performance in this course. Periodically throughout the semester, I will be entering information on factors such as your class attendance, participation, and success with coursework, among other things. This information will provide feedback on how you are doing in the course and offer you suggestions on how you might be able to improve your performance. You will be able to access this information by going to One.IU.edu and searching for the Student Engagement Roster (Faculty) tile.

  • Use Canvas Analytics:

    1. Identify struggling students. “Submissions” allows you to view whether students submit assignments on time, late, or not at all.

    2. See grades at a glance. “Grades” uses a box-and-whisker plot to show the distribution of grades in the course.

    3. See individual student data. “Student Analytics” shows page views, participations, assignments, and current score for every student in the course.

  • Track early submissions: Note which students complete the first assignments or attend early labs.

  • Personal outreach: Email or meet with students who are slipping to connect them with tutoring, peer mentors, or study groups.

  • Positive nudges: Celebrate early wins (e.g., “I noticed you submitted the optional challenge problem. Great initiative!”).

  • Proactive Outreach (with TA Support): If you identify students who are struggling, send personalized emails offering support and directing them to available resources (e.g., tutoring, office hours with TAs). Consider delegating some of this outreach to TAs in large courses.

  • Announcements Highlighting Resources: Regularly remind the entire class about available support resources, study strategies, and upcoming deadlines through announcements.
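As a sketch of what this outreach triage can look like in practice, suppose you export early submission data to a simple list of records (the field names below are hypothetical, not the actual Canvas schema) and flag students with repeated missing or late work:

```python
# Flag students for personal outreach based on early submissions.
# The record format is a hypothetical export, not Canvas's actual data model.
def flag_for_outreach(records, max_missed=1):
    """records: dicts with 'student' (str), 'submitted' (bool), 'late' (bool).
    Returns sorted names of students with more than `max_missed` problems."""
    counts = {}
    for r in records:
        if not r["submitted"] or r["late"]:
            counts[r["student"]] = counts.get(r["student"], 0) + 1
    return sorted(name for name, n in counts.items() if n > max_missed)
```

A list like this can then drive the personalized emails or TA outreach described above, rather than relying on memory of who seemed disengaged.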

Key Implementation Strategies for Success

  • Start Small and Build: Don’t attempt to implement all strategies simultaneously. Choose 2-3 that align with your teaching style and course structure, then gradually incorporate additional elements.

  • Leverage Your Teaching Team: In large courses, TAs are essential partners. Invest time in training them on consistent feedback practices, student support strategies, and early intervention protocols.

  • Iterate Based on Data: Use student feedback, performance analytics, and your own observations to refine your approach throughout the semester. What works in one context may need adjustment in another.

  • Maintain Connection at Scale: Even in large courses, students need to feel seen and supported. Use technology strategically to maintain personal connection while managing the practical demands of scale.

Conclusion

By implementing these research-backed strategies, faculty can create learning environments where diverse students thrive, engagement remains high, and learning outcomes improve significantly.

The investment in implementing these practices pays dividends not only in student success but also in teaching satisfaction and course sustainability. As you prepare for the new semester, consider which strategies best align with your course goals and student population, then take the first step toward transforming your large enrollment course into a dynamic, supportive learning community.

Remember: even small changes, consistently applied, can create significant improvements in student learning and engagement. Start where you are, use what you have, and do what you can to create the best possible learning experience for your students.

References

  1. Winkelmes, M. A., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2), 31–36. Association of American Colleges and Universities.

  2. Edmondson, A. C. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999

  3. Google Inc. (2015). Project Aristotle: Understanding team effectiveness. Retrieved from https://rework.withgoogle.com/intl/en/guides/understanding-team-effectiveness

  4. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

Evidence-Based Research Supporting the Use of PlayPosit in Classes

PlayPosit (https://app.teaching.iu.edu/tools/playposit) is an interactive teaching tool used to make interactive videos, also known as bulbs. The application can be integrated into Canvas using your own videos, or videos extracted from other sources such as YouTube or TED Talks, converting them into interactive, topic- and student-focused mini-lessons by adding questions at appropriate time points in the video. Educators may also prepare video clips from longer recorded lectures and encourage student engagement by providing context or additional information through text and images on slides, inserting questions to check for understanding, including discussion and reflection questions, giving pre-recorded feedback as they see fit, and even incorporating polling. PlayPosit videos pause at intervals chosen by the instructor to give students an opportunity to respond.

PlayPosit settings allow for multiple playback options, including letting students rewind, fast-forward, or retake the activities. The instructor may also incorporate instant feedback after each question, so students not only see whether they answered correctly but also understand the rationale. This feedback enables instructors to modify instructional activities midstream in light of their effectiveness, impact, and value. Because formative evaluations are designed to guide the teaching process, and are not used as outcome indicators, they are generally individualized evaluations that are under the control of the instructor and target specific instructional topics, issues, or concerns.

van der Meij, H., & Böckmann, L. (2021). Effects of embedded questions in recorded lectures. Journal of Computing in Higher Education, 33(1), 235–254. https://doi.org/10.1007/s12528-020-09263-x

  • Researchers sought to examine the effectiveness of providing “open-ended embedded questions” in recorded video lessons to better prepare students for upcoming in-class topics.

  • Student surveys, user logs, and knowledge tests showed that learners “engaged significantly more with the embedded questions lecture” and showed “significantly higher” average scores on those topics. Researchers concluded that embedded questions “can increase the effectiveness of online video-recorded lectures.”

Lewandowski, H.J., Pollard, B., West, C.G. (2020). Using custom interactive video prelab activities in a large introductory lab course. 2019 Physics Education Research Conference Proceedings. https://doi.org/10.1119/perc.2019.pr.Lewandowski

  • As part of a broader redesign of a large introductory physics course at the University of Colorado Boulder, researchers developed pre-lab videos with embedded PlayPosit questions to help students better prepare for in-person lab activities.

  • Researchers found that 90% of students completed the pre-lab video modules, with the “vast majority of students spending a nontrivial amount of time engaging with each question.” 80% of students felt the activities prepared them well for in-person lab tasks.

Sherifi, D., Jia, Y., Hunt, T. J., & Ndanga, M. (2023). Evaluation of a PlayPosit guided group project’s impact on student engagement in an undergraduate course. Discover Education, 2(1), 32. https://doi.org/10.1007/s44217-023-00057-8

  • Students appreciated that PlayPosit activities were enjoyable and different from other learning resources. Eleven students expressed that PlayPosits were “enjoyable”, “short”, “quick”, “not overbearing”, “not drawn out or boring”, “made learning fun”, “a unique way of learning”, and “kept students interested”.

  • PlayPosit increased the teaching presence of the professor by virtue of having multiple touch points with the material focused on relevant tasks.

  • PlayPosit contributed to better interaction of the students with the course content, and as per their comments, was helpful and beneficial, as well as interesting and attractive. Furthermore, students were more attentive to the other course videos and recordings.

Karpicke, J.D. (2012). Retrieval-based learning: Active retrieval promotes meaningful learning. Current Directions in Psychological Science, 21(3), 157–163. https://doi.org/10.1177/0963721412443552

  • Researchers compared the effects of different study strategies on student learning across three groups in the same course: one studying concepts by rereading materials with no recall activities, one mostly rereading with some recall activities, and one reading just once with more recall activities.

  • While surveyed students believed rereading would be the most effective study strategy, those who simply reread the same materials with no recall activities performed poorest on assessments. Those who practiced just one retrieval activity “doubled long-term retention,” and those who practiced multiple retrieval activities showed even larger gains.

For more information on how to integrate PlayPosit into your course, please visit https://app.teaching.iu.edu/tools/playposit

Getting Started with Gradescope

Gradescope is an online grading platform that streamlines the grading process for assignments, quizzes, and exams. It offers features such as AI-assisted grading, rubric-based assessment, and detailed analytics, making it a valuable tool for both instructors and students. Some of those features include:

  • Customized Rubrics: Instructors can create customized rubrics within Gradescope that outline specific grading criteria and expectations for assignments. By tailoring rubrics to align with learning objectives and student skill levels, instructors can provide more personalized feedback that addresses individual strengths and areas for improvement.

  • Individualized Feedback: Gradescope allows instructors to provide individualized feedback on student submissions. Instructors can leave comments directly on student work, offering personalized guidance, suggestions, and encouragement to support each student's learning journey.

  • Assignment Variations: Gradescope allows instructors to create multiple variations of certain assignment types, each with its own set of questions or parameters. This feature enables instructors to provide students with personalized assignments based on factors such as skill level, learning style, or individual interests.

  • Flexible Grading Options: Gradescope offers flexibility in grading, allowing instructors to choose between manual grading, automated grading, or a combination of both. This flexibility enables instructors to adapt grading methods to suit the needs of different assignments, courses, and student populations.

  • Grade Adjustments: Instructors can easily adjust grades within Gradescope based on individual circumstances or extenuating factors. Whether providing accommodations for students with disabilities, considering exceptional circumstances, or recognizing exceptional effort, Gradescope allows instructors to personalize grading decisions while maintaining consistency and fairness.

  • Learning Analytics: Gradescope's analytics dashboard offers insights into student performance trends, allowing instructors to identify patterns, common misconceptions, and areas for improvement. This data-driven approach enables instructors to personalize grading strategies and instructional interventions to address specific learning needs.

  • Student Engagement Tracking: Gradescope allows instructors to track student engagement with assignments and assessments, including submission times and completion rates. By monitoring student activity, instructors can identify students who may need additional support or encouragement and tailor grading strategies accordingly.
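The rubric mechanics above can be modeled in a few lines. The sketch below is purely illustrative of how per-criterion deductions combine into a score; it is not a representation of Gradescope’s internals or API:

```python
# Illustrative rubric scoring: each criterion carries a point deduction,
# and a grader marks which criteria apply (not Gradescope's actual API).
def score_submission(total_points, rubric, applied):
    """rubric: dict of criterion name -> points deducted.
    applied: criterion names the grader selected for this submission."""
    deduction = sum(rubric[name] for name in applied)
    return max(0, total_points - deduction)
```

For example, with a 10-point assignment and a rubric of {"missing edge case": 2, "off-by-one error": 3}, applying only the second criterion yields a score of 7. Defining criteria once and reusing them across submissions is what keeps rubric-based grading consistent at scale.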


Trying Something New In Your Course

Here are a few ideas to consider when trying something new, adapted from Tips for Teachers.

Can I improve something I am already doing? 

Before making wholesale changes to your teaching, reflect on practices you already use and look for ways to improve them. This should take less time and effort and give you a platform of success upon which to make further changes in the future. For example:

Instead of several worked examples that you have to whiz through, choose one or two that you have thought carefully about. Spend time going through them. Consider modeling them in silence first, and then using carefully considered self-explanation prompts or questions to give students a better opportunity to understand the process.

How will I know if the idea works? 

How are you going to know if the idea has been a success or not? The more objective the measure, the better. For example:

If you are looking to boost your participation ratio by using tools like Top Hat or PlayPosit, track the number of times you see responses from all students.

What will I have to stop doing? 

This is the question that gets asked the least, and yet it is so important. Trying something new may mean you have to stop doing something else. This plays out in two ways: a new idea in the classroom may mean you have less lesson time to do something else. Is that a sacrifice worth making? Planning a new idea may mean you have less planning time to work on something else. Is that, too, a sacrifice worth making?