
Unit 3: Video Case Study

Next Up…
That’s all for this unit. When you’re ready, you may begin Unit 4.


Video Transcript
Hi, I’m Dr. Anthony Chow, and welcome to the Unit 3 Case Study.

We’re now midway through the development cycle, and the first round of feedback was extremely helpful, which led to a number of refinements and changes in how we did things.

We now wanted to see what a larger group of people thought of our MOOC, so we released it to two of my graduate courses and collected their input and feedback through an online survey. Let’s take a look.

Overall the ratings were very positive. We asked participants to react to a set of statements on a scale of 1 to 7, with 1 meaning “strongly disagree” and 7 meaning “strongly agree.” We tried to ask about all of the major elements of the MOOC, starting with the log-in process.

As you can see, all participants felt that it was extremely easy. In fact, all of the statements were rated highly, with the lowest being my instruction, which still scored a 6.4 out of 7.

Let’s look into each statement and share some of the participant comments. All participants felt that the log-in e-mail they received and the process for logging in were easy.

One of our biggest concerns was making sure that we oriented the user to what the MOOC had available. The feedback was very positive, and we added small introductory videos for each unit to give users a chance to see what each contains.

The participants liked the video delivery, which involves using prepared scripts and a teleprompter so that I can look directly into the camera while reading my script. While you lose some of the spontaneity, it is very difficult to relax and think when you’re being filmed with bright lights shining on you like a spotlight.

Participants also liked the different backgrounds and on-location shots. The prepared scripts also serve as our video transcripts, which we found some users prefer, as reading can be faster than listening and watching.

Participants liked scores being associated with the review sessions. We added time and accuracy to generate a score for each session, which many people call gamification, as it adds a competitive component against both your previous attempts and other people.

Another participant liked our use of actual examples rather than just text-based questions and answers.

Someone else did not like some of the grammar I used in the questions and answers. Good advice.

The hands-on aspect was also well received. It helped accentuate the application of the content we were trying to deliver. One participant did want to see more options for the hands-on comparison. Someone else mentioned it was especially helpful to do hands-on work within each section.

The overall satisfaction rating for the MOOC was 6.5 out of 7, which we are very happy with. Overall, participants seemed happy with the sequencing and the different ways of engaging with the material. One participant did not like the color combination of text in some of our video slides. Another mentioned they appreciated the comic relief of Spiro, which they felt enhanced the learning process.

Overall, the participants liked the MOOC. They found it efficient and effective, with a design that was easy to use and engaging.

Thank you for joining me for this case study. As you can see, overall satisfaction has been positive so far across more than 30 users, so we do have decent preliminary validity suggesting we’re on the right track. I must note that there is some bias, though, because they are my students. At the same time, they would have no trouble telling me about their issues because the survey is also anonymous.

The Unit 4 Case Study will detail the design and development pedagogy behind the decisions we made. Thank you for joining me.