As the Fall 2025 semester comes to an end, I would like to take some time to reflect on my LDT502 course. Throughout this course, my teammates (Asifa and Anthony) and I were responsible for a specific design case. Each week, we would focus on a different element of the instructional design process, guided by the ADDIE model. As our final project, we were responsible for creating a mock-up of a training module. Please see my thoughts on our final project below. Also, feel free to view our mock-up before reading my thoughts! Our mock-up can be found here: LDT502 Instructional Materials Mock-Up.
*Please note: This evaluation contains hypothetical results, which assume the training has been put into production.
When we started our mock-up, we began with our learning objectives in mind. Our terminal and instructional objectives were clearly stated on the front page of our instructional module, and they were written so that learners could easily measure their own success in the training. Our terminal goal was: You will be able to create, edit, export, upload, and embed a 5–8 minute module introduction video using DaVinci Resolve, YouTube, and Canvas, following university branding, accessibility, and quality standards to increase student engagement. Ultimately, we wanted to prepare our learners to create an effective introductory video for a module in their course to boost student engagement.
All three of our instructional objectives served as stepping stones toward this terminal goal. Our first instructional objective was to teach our learners how to organize their media files. Our second objective was to have them demonstrate how to edit their raw video footage into a cohesive final product. Our third objective focused on equipping our learners with the skills to export, upload, and embed their videos in their own courses. These objectives clearly align with the identified needs of our learners: some of our faculty had little to no experience with video editing tools, while others had more advanced levels of knowledge.
This module served as an entry point for learners regardless of their prior knowledge or background. We strove to build clear connections between our activities and assessments: each page of instructional text, and the assessment for each section, closely aligns with its specified instructional goal. We took care to provide only essential information, decreasing cognitive load and keeping the focus on learning retention. The result was a practical training module designed to teach learners how to create their own introductory video using the software DaVinci Resolve.
When evaluating the success of our training, we utilized the Kirkpatrick model, starting with Level 1: Reaction. Per the University of San Diego (2025), this level “gauges how participants responded to the training.” To gauge our learners' responses, we required feedback in their final assessment, asking: Was this training helpful? Were the steps in this training easy to follow? We also offered our learners the opportunity to ask additional questions. Initial results showed that learners found our content approachable and engaging. Over 90% of our learners stated the module layout was easy to follow, and many mentioned that the written instructions were clear when used alongside the screenshots we provided. However, to make the module even more accessible, my team and I could have created step-by-step video tutorials within each lesson for our learners to reference. This would have given learners an additional way to interact with the content and brought another aspect of Universal Design for Learning (UDL) into our design process. If given the opportunity, I would focus on designing instructional videos to include in the next iteration of this training.
Next, we evaluated the success of our training by diving into Level 2: Learning. This level focuses on the knowledge our learners gain and the skills they develop through our training (Calhoun et al., 2021). At this level, we must determine whether our training has resolved the identified problem. We provided assessments at the end of each of the three lessons, followed by a final assessment centered on the artifact the learners would create (their introductory video). After parsing the data, we found that our learners were able to create their own videos with 95% accuracy and embed them into their own Canvas courses!
We are still gathering data for both Kirkpatrick Level 3: Behavior and Level 4: Results. While most of our faculty members have been able to include their introductory videos in an upcoming course, we will continue to monitor their remaining courses through the 2026–2027 school year. This will give us insight into whether their behavior has changed or whether they revert to not posting introductory videos at all. Our terminal goal of increasing student engagement is also being continuously evaluated. Once student engagement has been measured throughout the Spring 2026 semester, we will be able to accurately assess our training against Kirkpatrick Level 4: Results.
Based on my evaluation of our instructional materials, our instruction did an excellent job of solving the identified problem of our design case. We equipped our learners with the skills to create and edit their own videos, skills they can continue to use throughout their time at the University of Skaro to increase student engagement. Our current assessment data shows our target learning goals have been met!
As I reflect upon this project, the analysis phase of the ADDIE model resonated with me the most. Before this course, I did not truly understand the amount of work that goes into identifying the problem, determining learners' needs, and understanding their prior experiences. From our readings, I learned that this is an often overlooked but essential part of the design process: the success of a training program relies heavily on the time and effort spent in the analysis phase. The stages I found most challenging were the design and development phases; however, our instructor provided helpful feedback to incorporate into our task analysis and detailed design documents. This experience truly opened my eyes to the iterative nature of the design process. Working in a team greatly influenced the positive outcome of our project. We designed a practical training module tailored to our learners' needs and exceeded our stakeholders' expectations. I am excited to continue providing high-quality learning experiences in the future!
References
Calhoun, C., Sahay, S., & Wilson, M. (2021). Instructional design evaluation. Design for learning: Principles, processes, and praxis. https://edtechbooks.org/id/instructional_design_evaluation
University of San Diego. (2025). The Kirkpatrick training evaluation model [+ benefits & FAQs]. University of San Diego Online. https://onlinedegrees.sandiego.edu/kirkpatrick-training-evaluation-model/