Reflective Course Revision: Small Changes for Big Impacts

Definition(s)

Course revision: McGahan (2018) defines the course revision process as follows: “The course revision process is used to improve the quality of the learner experience in the course (Adelstein & Barbour, 2017; Kreie & Bussmann, 2015). A course revision involves a process of evaluating the different parts of a course to determine what is effective, educationally relevant, easily understood, and what is not (Twigg, 2005).”

Reflective teaching: Yale’s Poorvu Center for Teaching and Learning (2025) describes reflective teaching as follows: “When instructors engage in reflective teaching, they are dedicating time to evaluate their own teaching practice, examine their curricular choices, consider student feedback, and make revisions to improve student belonging and learning. This process requires information gathering, data interpretation, and planning for the future. Reflective teaching involves examining one’s underlying beliefs about teaching and learning and one’s alignment with actual classroom practice before, during and after a course is taught.”

Purpose

The purpose of this TAL Tips sheet is to lay out, concretely, the steps instructors can take to revise their courses using reflective practice, whether the need relates to the student experience, student learning, or both. We understand this as a stepwise process, grounded in a desire to improve the student learning experience or to better meet learning outcomes. It can happen between semesters or mid-semester, and it is intended to address a need identified through reflection and/or feedback gathered during the term. While there is a strong foundation in the educational development literature on initial course design and wholesale course revision, these models sometimes fall short when we are crunched for time, only want to make small changes, or need to revise a course while it is running.

In the next section (Approach) we introduce a worksheet you can use to make revisions to your course and lay out ways in which it can be used. The example case section includes a reflection by UD faculty member Stephanie Raible, who worked with a CTAL staff member to revise her course, a collaboration that ultimately led to the creation of this worksheet and resource. Although Stephanie worked with CTAL, just one unit in UD’s teaching and learning support ecosystem, many of these revisions still had to be worked through via an iterative consultation process. While we encourage you to reach out if you need help, the worksheet is also intended to function as a standalone resource you can use on your own to lay out your thinking when making course revisions.

Approach

This process relies on effective use of the reflective course revision worksheet. Please click the button below to make your own copy of the worksheet before proceeding.

The worksheet is divided into three sections:

  1. Identifying the issue
  2. Working through alternatives
  3. Implementing revisions

Identifying the issue

In the first section, five prompts (A-E) ask you to name an issue and then take you through a stepwise process to pinpoint where an opportunity for change might exist. This section, which intentionally comes before exploring alternative options, mirrors in many ways the steps we take when conducting teaching consultations with faculty. Take, for example, an issue in class (A), like students performing poorly on exams over the last two semesters compared to the previous five years you have taught the course. Non-reflective course revisions (skipping prompts B-E) could include: making exam questions “easier,” offering extra credit to boost scores, and/or holding an extra study session for students. This approach is not “wrong,” as the instructor in this hypothetical scenario would likely see an increase in performance as measured by exam scores. Generally speaking, however, it does not address the root issue with student performance and likely means that a broader student learning outcome (SLO) is not being met, or that students are struggling more generally with this element (exams).

Taking our hypothetical example further and completing prompts B-E, we dig a little deeper and realize that students are struggling specifically with questions where mathematical equations are present (B). We confirm that this is the case by reviewing student feedback, where some students note that the exams feel “hard” and that they wish they were allowed some in-class notes to use during the exam (C). This is only one piece of evidence, and it somewhat corroborates our thinking that we need to do something different with the exam, but what? We then ask a trusted colleague in our department whether they are experiencing similar issues in their courses. One colleague notes that they have noticed students tend to really struggle with algebraic equations. We decide that this could certainly be the root cause (D) of the issue. We then decide that basic algebra, along with regular lectures and exercises linking empirical instances to the underlying algebraic theory, should be incorporated into the course via some changes to lectures, in-class activities, and perhaps some graded problem sets or worksheets. Thus the elements (E) we are considering revising are not the exams themselves, extra study sessions, or extra credit, but elements of the course that need to be thought through more thoroughly before changes are made.

Working through alternatives

This section, divided into three subsections (design, assessments, closing the loop), asks you both to respond to each prompt and to identify where you might need help in answering it. You will likely not need to fill out the far-right column for every question, but if you do, you should use this information to answer question C in the third section (Implementing Revisions).

The three design questions are meant to help you lay out your alternative more explicitly, identify its strengths, and think through the potential disadvantages of pursuing this line of revision, also referred to in the literature as pain points or bottlenecks. Once you go down a particular path by incorporating a revision into a course, it might simply not work as well as you had anticipated. Faculty who are early career, or for whom student ratings of instruction are expected to be at a particular level, may need to be cautious about implementing changes that students perceive negatively.

The four assessment questions primarily deal with alignment. Every time you add something to a course, like an entire class session and problem set devoted to algebra, you will need to consider how this could skew or disrupt the existing alignment between the current and revised versions of the course. One important consideration here is that by having students complete work that is attached to points, you are signaling not only that the activity is important for meeting the desired objectives of the course, but also that their time and effort should be matched to that point value. The example case below provides some more background on how these kinds of changes work in practice.

In closing the loop, you need to decide what kinds of evidence will demonstrate to you, and to anyone else who reads about your teaching (like a departmental promotion or review committee), that your intervention accomplished the goals you set out at the beginning of this process. Going back to our example, a rise in mean exam scores will likely not be sufficient evidence, but a disaggregated increase in mean scores on exam questions where students must use algebra to find a solution might be. Is a separate reflection paper or exam wrapper appropriate? Outside of these direct and indirect sources of assessment data, what kinds of evidence will lead you, personally, to believe students are benefiting from the intervention?

Implementing Revisions

Three questions ask you to think about how these revisions can be implemented in the short term (this or next semester), mid-term (two semesters from now), and long term (over the next five years); how you will communicate these proposed changes; and what additional support you might need. Key considerations here include:

  • Do you teach a standalone offering of the course, or do you need to communicate proposed changes with other instructors who teach the same course semester over semester? Do you need departmental or curriculum committee approval to make changes? Is your course part of a formal or informal sequence of courses, such that you may want to communicate with the instructors who teach these students before or after you?
  • Do you want to make a small change now with the hopes of a larger revision in the future? 

If you don’t know which resources you need to implement changes, CTAL can help either via a consultation or by connecting you with another unit in UD’s teaching support network. 

Application

Instructional Contexts:

This worksheet is designed to apply to all instructional contexts.

Disciplinary Contexts:

This worksheet is designed to apply to all disciplinary contexts.

Things to consider:

  • Time and resources are limited in all instructional contexts. You might not be able to implement the “best” version of a course revision, one that looks and feels closer to a full course redesign. While improved student learning is ultimately the goal of these interventions, smaller changes can still have big impacts.
  • Course revision is likely not a new concept to anyone who has taught even one class a few times. However, instructors tend to overlook two key elements of this practice:
    • Documenting Reflection: Documenting changes to your teaching, and evidence that those changes are impacting learning, is something you should try to do at least once a semester. This evidence is critical for teaching dossiers and makes it much easier to recall what you did one, two, or four years ago when preparing those documents.
    • Assessment/Measurement of Student Learning: Collecting data over time and documenting it in your records is relevant both to preparing your dossier and to actually learning whether what you did worked. It is important to remember that interventions might not have immediate impacts (e.g., they may not immediately translate to higher test scores). Where possible, collect information in any form that can help you assess whether the intervention you have made, or future interventions you would like to make, are working.

Examples

Example Case: ENTR 420/620 Social Entrepreneurship

This example relates to the course ENTR 420/620 Social Entrepreneurship, a cross-listed course without prerequisites, available to undergraduate and graduate students from any college. From Fall 2020 through Spring 2022, the course received positive student feedback in its full-semester, asynchronous online format, and only small adjustments were made from semester to semester.

Throughout the pandemic, Stephanie observed that student assignments reflected superficial reviews of the assigned readings. In Spring 2022, the end-of-semester student feedback confirmed this: some students reported that their groupmates were not reading, or were only skimming, the readings. Students who were doing the readings reported feeling that they were carrying the weight of their group members, which raised additional concerns about how to build more accountability for completing the readings into the course design.

Starting in Fall 2022, to encourage students to prepare for and engage with the readings and corresponding assignments, Stephanie switched from assigning the physical textbook to the electronic Perusall version, which lets students annotate the text and see each other’s comments. The switch helped structure in more accountability by assigning a direct point value to engagement with the readings, and it could add further connection among asynchronous students.

That semester, ENTR 420/620 was taught in two modalities: asynchronous online and in person. To create consistency between the sections, the only difference between the syllabi was that the in-person section had a once-a-week, three-hour class meeting, while the asynchronous section had weekly video and media content with a corresponding discussion board. Despite the content and assignments being similar between the two sections, students reported two very different experiences in the class. The in-person students reported having a great experience, noting that the electronic Perusall readings helped them feel well prepared for class discussion, and with everyone coming in prepared, Stephanie observed that the discussions were dynamic and well informed. In contrast, the asynchronous online students reported feeling overwhelmed by too much content and too many assignments. They felt the Perusall default grading settings set too high a bar for engagement, and between the readings and other assignments, they noted that the course was more intensive than they had accounted for. Even after several of the online section’s assignments were made optional mid-semester instead of required, it was “too little, too late” to change the students’ experience of the course.

While some colleagues recommended assigning fewer readings, Stephanie knew the readings could be a valuable course element, as evidenced by the in-person class’s reactions. Moreover, the switch to Perusall did help with reading accountability, so she decided to keep the readings and the Perusall format intact for the next semester.

After she worked with CTAL, several changes were made for the course’s next offering in the Spring 2023 semester. The following table highlights three main concerns identified in the Fall 2022 course feedback: students were overwhelmed, felt the readings exceeded the time they had allotted for them, and did not like having two things due on Fridays. By identifying the central issues, she and CTAL were able to brainstorm potential solutions to tackle each one.

Issue during Fall 2022: Students were overwhelmed.
Solutions for Spring 2023:
  • Reduce the number of non-Perusall assignments
  • Make discussion board assignments into brief essay submissions
  • Change the Perusall rubric from its default setting to a custom setting with more manageable expectations
Results for Spring 2023:
  • Students reported feeling the course had a workload that aligned with their expectations.
  • By not having to return to the discussion board to make comments, the shift to brief essays helped simplify the expectations of students in an average week.
  • Students no longer noted feeling the rubric was a problem.

Issue during Fall 2022: Students were spending more time on the Perusall readings than they had planned.
Solution for Spring 2023: Increase the point value of Perusall assignments.
Result for Spring 2023: By assigning a higher point value to the weekly readings, students felt their time and effort on the readings were being adequately acknowledged.

Issue during Fall 2022: Students were managing two weekly requirements, both with Friday deadlines.
Solution for Spring 2023: Move Perusall assignments to Wednesdays and other assignments to Fridays.
Result for Spring 2023: Having two different deadlines kept students’ course pacing in a good place. Further, it helped students use their readings as a source for the Friday assignments, which was built into those assignments’ expectations and rubrics.

As shown in the table, Stephanie and CTAL co-created a course design that supported the course learning outcomes and the course readings. By helping students prioritize the readings each week, the revised design allowed them to allot more time to delve into the content in a meaningful way (e.g., writing annotations, engaging with classmates on the content). With a higher point value assigned to the readings, students seemed to treat the readings as a significant portion of the class content.

During the semester, Stephanie closely monitored the students in the asynchronous online section to see whether the changes were being well received. After reviewing both the informal midterm feedback and the end-of-semester course evaluations, she found that the course seemed more balanced to students again, with many students even reporting that the readings were one of their favorite course elements. Since then, only minor updates and adjustments have been needed, as the interventions made for the Spring 2023 semester have continued to have a positive impact on the students’ course experience while maintaining the instructor’s expectations for how they should engage with the content and each other.

Stephanie Raible is an associate professor of social innovation and entrepreneurship at the Alfred Lerner College of Business and Economics’ Department of Business Administration and Horn Entrepreneurship at the University of Delaware. She also serves as UD’s faculty director of the Social Entrepreneurship Initiative, which provides students with curricular and co-curricular opportunities to explore social entrepreneurship.

References

McGahan, S. J. (2018). Reflective course review and revision: An overview of a process to improve course pedagogy and structure. Journal of Educators Online, 15(3).

Yale Poorvu Center for Teaching and Learning. (2025). Reflective teaching. https://poorvucenter.yale.edu/ReflectiveTeaching

Revised by: 

Revised on: