
EDLD 5318 - Usability & Reflection
Usability testing provided valuable insight into how students and colleagues navigated the course, interacted with the tools, and understood the expectations. Through informal testing, ScreenPal recordings, surveys, and student attempts at module tasks, I identified both strengths and areas for refinement. Key takeaways included the importance of early tech onboarding, simplified access to recording tools, and a consistent layout to support diverse learners. The feedback affirmed that the course is engaging and well structured, while also highlighting opportunities to streamline instructions and reduce cognitive load. This reflection process strengthened the overall design and reinforced my commitment to iterative improvement grounded in real user experience.
1. Who conducted the usability testing, and were they ideal testers?
For my usability testing, I asked two colleagues and twenty students to navigate the Start Here module and one activity from Module 1 of my Lemonade Stand to CEO Canvas course. My colleagues were my campus principal and our digital learning coach, both of whom have highly relevant experience with instructional design, technology integration, and online learning platforms. My student testers were sixth graders, the target age group for the course, and they provided authentic feedback from the perspective of learners who would realistically take this course during the summer program I am designing.
Although my testers were valuable, the situation was not ideal. Students attempted their recordings over the Thanksgiving break and quickly discovered that ScreenPal, the tool I had requested, was blocked by our district due to student data collection concerns. Many also struggled with the time limits in WeVideo, the approved alternative. In addition, several adults could not complete the survey because of their workload immediately after the break. In a future iteration, I would streamline the recording process by testing district-approved tools in advance, scheduling recording time on campus, and recruiting additional adult testers with experience on online platforms beyond Canvas to provide broader usability insights.
2. How did your LMS/platform affect the testing and results?
I intentionally chose Canvas over Google Classroom because it offers stronger structure, better integration with third-party tools, adjustable module settings, and a more professional layout that resembles platforms such as Blackboard, which I used as an undergraduate. Canvas is also the system my district uses, so students are highly familiar with its interface.
However, using a free personal Canvas account significantly limited my testing. Many tools I normally use (Khan Academy, DeltaMath, and certain district-licensed integrations) were not available outside the district’s Canvas environment. I also could not pre-create accounts or fully configure assignments before enrolling students, which meant students could not experience several feedback-driven components of the course as intended. Additionally, testers from outside my district could not access the course without extra permissions, which made it impossible for my Lamar classmates to participate.
These constraints provided valuable insight into the infrastructure considerations needed if I offer this course publicly or as a summer program. In the future, I would build the course within the district’s Canvas instance or a paid Canvas environment to ensure full tool integration and accurate usability testing.
3. What lessons did you learn from the feedback?
Several testers, both adults and students, noted that some sections were too wordy. I had suspected this during development, and their feedback confirmed the need for clearer, more concise text. Students also pointed out areas where instructions needed simplification or visual cues to help them navigate independently.
Another key lesson involved access to recording tools. I learned that tool instructions must account for district-level restrictions and that students need structured, supported time to record rather than being left to work independently over breaks.
The student feedback also reminded me of their familiarity with short-form, visually engaging platforms such as TikTok and Instagram. Their expectations for pacing, clarity, and visual hierarchy differ significantly from those of adults, which helped me rethink how I will refine my multimedia and text components.
4. What changes did you make to address usability issues?
Based on tester input, I revised overly long passages, reorganized text into shorter chunks, added clearer icons and subheadings, and simplified navigation instructions. I also updated the Start Here module to include an orientation video rather than relying solely on written steps.
In addition, I adjusted the Module 1 activity to reduce cognitive load and to ensure students understood the purpose of each step before beginning the challenge.
These revisions improved the usability of the course and created a more intuitive learning experience aligned with both student developmental needs and best practices for online learning design.
5. How did this process improve your course and the learner experience?
Usability testing helped me see the course from the perspectives of both novice online learners and expert adult educators. The process highlighted the importance of pacing, clarity, multimodal access, and consistent visual design. It also pushed me to refine the alignment between outcomes, activities, and assessments by ensuring students could navigate and complete activities without unnecessary barriers.
6. How will you address infrastructure and support needs for learners?
Going forward, I will:
- Use only district-approved tools for student recordings and task completion.
- Build the course inside the correct Canvas environment to ensure integration with Khan Academy, DeltaMath, and auto-graded activities.
- Create parent and student onboarding guides.
- Provide sample videos demonstrating how to complete key activities.
- Ensure testing occurs during structured time so students are not troubleshooting alone.
These steps will ensure a smoother learner experience, accurate data tracking, and more authentic engagement with course content.