Feedback using Video Screencasts

Origin

School of Civil, Aerospace and Mechanical Engineering

Tools used

  • Mediasite Mosaic
  • Turnitin
  • Graphics tablets and iPad
  • External microphone.

Background

Students weren't happy with the (written) feedback they received; sometimes they misunderstood the points their tutors were making, or simply didn't look at the feedback at all. Markers feared that students looked at the mark and then ignored the constructive comments, missing a learning opportunity.

A compulsory 10-credit Finite Element Analysis (FEA) unit in the third year (UK FHEQ level 6) of a BEng/MEng Mechanical Engineering degree was used to investigate whether feedback could be improved. The unit averaged around 165 students a year during the four years of the study.

Creating traditional feedback for the 165 submissions took a total of around 30 hours (roughly 11 minutes per submission). Written comments were dragged and dropped from a pool of reusable comments (QuickMarks) in Turnitin. Lecturers could see that many students hadn't even looked at the feedback.

Objectives

  • Give feedback that students consider helpful rather than confrontational.
  • Ensure the effort of writing feedback is worthwhile: make sure that it's viewed.
  • Improve survey results on how satisfied students are with the feedback they receive.

What was done

Students were given a choice between receiving feedback via:

  • The existing Turnitin system of drag-and-drop QuickMarks and written comments
  • Short voice recordings
  • Screencast and voice recordings.

Lecturers recorded video screencast (VSC) feedback using Mediasite Mosaic, annotating with a graphics tablet (later replaced by an iPad).

VSC is an alternative to written feedback in which a tutor makes an audio-narrated video recording of their screen while they mark. The verbal component can fit a lot of information into a short time. Visuals and tone of voice add further information and make a human connection between tutor and student. Screen-drawing tools enable even richer feedback.

A decision was made not to spend time editing or striving for perfection: recordings were paused when breaks were needed, and tutors simply carried on through any mistakes or background noise.

The new feedback methods were evaluated using two anonymised surveys, starting in 2015/16, the year before VSC feedback was introduced.

1. The Engineering Faculty’s unit survey system was used to ask students seven questions about the unit, two of which were about feedback. (This kind of survey is standard faculty practice and did not add any additional workload to what students already expected.)

  • Opportunities to obtain feedback on progress?
  • Feedback prompt and useful?

2. An extra survey was created to get more detailed responses to the new feedback methods. Students answered using scale ratings, yes/no answers and freeform text. Some questions changed between years of the unit, but the following were used each time:

  • Preference for written or VSC feedback (where 1 is written and 5 is video)
  • Do you think the feedback received would be useful for other work outside of the FEA unit? (Y/N)
  • Any other comments? (freeform text).

What worked well

As shown in figure 1, over 60% of students wanted to receive screencast and voice recordings, around 30% wanted the existing Turnitin feedback, and under 10% wanted voice-only recordings.

Figure 1. Polled student response of preferred feedback mechanism

Creating the videos took about the same time as creating the written feedback (around 30 hours for an average of 165 students each year). Importantly, markers reported less fatigue creating VSC feedback than writing traditional feedback.

Following the trial of VSC feedback, survey response rates were 30% higher than normal.

Lecturers could show student work next to real-world examples, and found it easier to explain issues such as low-resolution images.

Students actually watched the feedback videos, and this could be tracked.

Responses to the survey questions "Opportunities to obtain feedback on progress?" and "Feedback prompt and useful?" rose from around 3.5 in 2015/16 to nearly 5 out of 5 by 2018/19, with the increase beginning when VSC was introduced (figure 2).

Figure 2. Student score of (a) “opportunities to obtain feedback on progress”, (b) “feedback was prompt and useful”

Lessons learnt

  • It's always tempting to make improvements to a recorded video, but the temptation should be resisted, or creating feedback will take forever!
  • Audio quality must be good: some background noise is not a problem, but the voice itself has to be recorded clearly, which requires a suitable microphone (lapel, headset or tabletop) positioned a few inches from your mouth.
  • VSC feedback might not be practical, depending on the size of the cohort; don't commit to it unless you know you have the time and skills to deliver video feedback on schedule and to a good enough standard.