Quality Enhancement for E-Learning Courses: The Role of Student Feedback



administrator who was in charge of collecting the responses, collating them,
and making them available to the rest of the team. Three elements
distinguished this strategy from previous ones: the questionnaire was
centrally administered (previously each tutor sent out the questionnaires for
his or her module); it was sent out at a clearly defined time; and the
processes for collecting and reporting the results were clear to both students
and staff.

In Course B an online survey had been used for several years with an explicit
procedure for monitoring and reporting, yet despite this well-established
procedure the course received very low response rates and poor-quality
responses. While the course leader collated and reported on the results, the
rest of the course team were unaware that these procedures were taking place.

In Course C, no module evaluation took place. The file with the survey
questions was available in the VLE, but students were not aware of it and
staff never asked them to complete it. The lack of coordination between the
course leader and the development team that had created the survey meant that
the procedure was never applied.

In Course D, after several trials, an approach using an online survey was
agreed upon. The strategy was shaped by the high number of students expected
to complete the survey and the consequent need for an automated system for
analysis and reporting. Students confirmed that they were more willing to
respond now that the survey was available online.
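
The paper does not describe the system Course D adopted. Purely as an
illustration, a minimal sketch of automated survey tallying in Python might
look like the following, where the file name module_survey.csv and the
assumption of one column per survey question are hypothetical:

    import csv
    from collections import Counter, defaultdict

    def summarise_responses(path):
        """Tally the answers given to each survey question and report counts."""
        tallies = defaultdict(Counter)
        total = 0
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):   # one row per student response
                total += 1
                for question, answer in row.items():
                    if answer:              # skip unanswered items
                        tallies[question][answer] += 1
        print(f"Responses received: {total}")
        for question, counts in tallies.items():
            answered = sum(counts.values())
            print(f"{question} (n={answered}): {dict(counts)}")

    # Hypothetical export of the online survey responses.
    summarise_responses("module_survey.csv")

Automating the tallying in this way addresses the scale problem the Course D
team identified, since reporting no longer depends on manual collation.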

These case studies showed that the implementation of module evaluations often
presented problems and that, as a consequence, the evaluations were ineffective
for enhancement purposes. The main reasons for this were:

- the low number of responses

- the absence of clear and effective strategies for collecting and processing
the results.

The main challenge for staff on these e-learning courses was to obtain enough
relevant feedback for the collected data to be useful for quality assurance and
enhancement. Low response rates led course teams to discard the results as
invalid, regardless of their content. These low rates were largely a consequence
of students being at a distance, which gave course teams less control over the
process. Strategies to overcome this included attempts to improve the
questionnaires, such as making questions more meaningful in order to motivate
students to respond, and changes to the way the questionnaires were
administered (e.g. moving surveys online).

…we do it online but as you are seeing from comments that’s an area of
weakness. We don’t have enough evaluation, we want more.
(Tutor)

The absence of clear and effective strategies for collecting and processing the
results sometimes meant that the responses were left untouched or only
superficially analysed, so that they lost their potential to inform the
evaluation and eventual improvement of the course. In Courses B and D the teams
were primarily focused on the appropriate administration of the questionnaires
and on obtaining more responses, rather than on planning how the results would
be analysed and subsequently used, and who would be responsible for this
process.


