Does it make a difference? Replacing text with audio feedback

Dave King, School of Sociology and Social Policy, University of Liverpool. Tel: 0151 794 2992

Centre for Lifelong Learning, University of Liverpool. Tel: 0151 794 1162

Nick Bunyan, Centre for Lifelong Learning, University of Liverpool. Tel: 0151 794 1163
Abstract
There is a growing body of evidence indicating that the potential learning benefits of providing students with feedback, however well crafted, are often not realised, with many students not valuing or understanding the feedback provided. Moreover, providing feedback is a time-consuming activity for many tutors to undertake, and is often perceived as wasted effort.
Within this context the paper examines the potential of audio feedback as an alternative to traditional text-based approaches. We draw on interviews with staff and students on three Social Science modules, together with an analysis of the feedback itself, to explore the value of this approach. The study finds that providing feedback using audio files leads to improvements in both quantity and (it is argued) quality. However, anticipated savings in staff time were not realised, and possible solutions to this issue are explored.
Keywords: student feedback; audio files; Social Science assessment
Introduction
The importance of feedback in developing students’ learning has been highlighted by a number of commentators (Orsmond, Merry & Reiling, 2005; Gibbs & Simpson, 2005; Nicol & Macfarlane-Dick, 2006). Moreover, tutor feedback practice has been identified as a measure of teaching quality (Quality Assurance Agency, 2006) and, according to the National Student Survey (Higher Education Funding Council for England, 2007), a contributing factor to student satisfaction. In 2007, 81% of students taking part in this survey agreed that they were satisfied with their course, but they were less satisfied in the area of feedback. On the survey’s three measures of student satisfaction with feedback, 54% of students were satisfied that they had received detailed comments on their work, 59% were satisfied that feedback helped clarify things they did not understand, and 54% were satisfied that feedback on their work had been prompt.
When done well, feedback can motivate students and inform them how well they have done and how to improve (Brown, 2001). However, there is a growing body of evidence indicating that the potential learning benefits of providing students with feedback are often not realised (Chanock, 2000; Higgins, Hartley & Skelton, 2002; Duncan, 2007; Hounsell, McCune, Hounsell & Litjens, 2008). Themes which emerge from this research include wasted staff effort and students not processing the feedback they receive. In addition, providing feedback is often a time-consuming activity, and any new feedback system needs to ensure that it does not increase the workload burden on academic staff.
This study explores the value of replacing text with audio feedback, and the consequences of this for students, for tutors and for the nature of the feedback itself. Similar approaches have been used with some degree of success in other disciplines and institutions (Cryer & Kaikumba, 1987; Kirschner, van den Brink & Meester, 1991; Merry & Orsmond, 2007; Rotherham, 2007), and we considered that this approach might prove highly effective in Social Science disciplines that require students to undertake extended forms of writing such as essays and dissertations. We also believed that this method might prove to be a more efficient use of staff time and a more effective way of engaging students in the feedback they receive. The following observation from Rust gives an indication of the potential attractiveness of an audio approach:
While
reducing the time you spend, this may actually increase rather than reduce the
amount of feedback given. Students frequently say that they get far more from
taped comments, including the tone of one’s voice, than they do from written
comments, and they do not have to cope with some of our illegible writing
(2001, p. 22).
Similar claims have been made by other commentators (see, for example, Johanson, 1999; Merchant & McGregor, 2006; Rotherham, 2007). In fact, a small literature on giving audio feedback has existed for nearly twenty years, although the early attempts used audiotapes. In one study with 12 students on a graduate course in photochemistry at the Open University of the Netherlands, Kirschner et al. (1991), for example, found that, “The amount of time spent by instructors supplying the feedback differed minimally whilst the amount communicated to the students with audio feedback was significantly greater than the amount communicated with written feedback” (p. 185).
More recently, Merry and Orsmond used MP3 files to give feedback to a volunteer sample of 15 biology students. All of the students viewed this method of feedback positively, for three main reasons:
a) it was easier to understand, because handwriting is often illegible;
b) it had more depth, because possible strategies for solving problems were included rather than just statements of what the problems were;
c) it seemed ‘more genuine’, indicating that speech is received in a more personal way than writing (2007, p. 101).
From the tutors’ point of view, they found that “providing audio feedback did not save them time”, but they added that “it might do so with more practice” (p. 102). However, Ice, Curtis, Phillips & Wells (2007) claimed that giving audio feedback was able to “reduce the time required to provide feedback by approximately 75%” and also that “this reduction in time was coupled with a 255% increase in the quantity of feedback provided” (p. 19).
Within the context of such claims, this paper reports on the first part of a small project investigating the value of using audio files to give students feedback on assignments in a Social Science department at the University of Liverpool. Currently, on most modules that have some form of coursework as part or all of their assessment, staff in the department use a standard feedback sheet pre-printed with headings on aspects such as referencing, structure, reading and so on. Staff can also write comments on the assignment itself, but students will only get this back if they hand in two copies (as one has to be kept for archive purposes). Students can also request a meeting with the marker to receive further feedback, but this does not happen often and is usually when the student is unhappy with the mark.
One issue with the current method of feedback is that students cannot always read the handwritten comments. Some members of staff have had notoriously bad handwriting in the past, and the problem is not confined to a few. As one member of staff put it:
“When we get to the 50th or 60th script, my handwriting has rapidly deteriorated… I fully understand students find some things difficult to read. I don’t think it’s my problem: it’s across a number of my colleagues”.
The feedback form has limited space to write comments, which often means that the comments tend to focus on areas of weakness, with little room to elaborate on points. So there is a concern about whether students actually understand them. Moreover, it is not possible to vary the order of comments, so their impact cannot be controlled. If feedback is only concerned with giving a brief justification for the mark, then these are not major problems. But if feedback is to improve the performance of the student in subsequent assignments, then this system has clear limitations. All these issues arise in a context in which staff time available for marking is limited and under pressure from many other directions. Using audio files is an attractive option if it means giving better feedback to students without spending more time, or even saving time.
Trying it out
The
project focussed on three modules: a large first year compulsory module and two
optional modules, one from year 2 and one from year 3. Four members of staff
were involved in marking the assignments which comprised part or all of the
summative assessment for the modules. The members of staff responsible for the
modules asked for 10 student volunteers to receive feedback on their assignments
by means of an audio file. Table 1 gives the details of each module with the
number of students, the number of volunteers in the sample and the nature of the
assignment.
Table 1. The modules and the assignments

Module | Students | Volunteer sample | Task
Year 1 compulsory | 208 | 8 | 2,000-word essay
Year 2 option | 29 | 7 | 2,500-word essay
Year 3 option | 37 | 10 | 4,000-word essay
In order
to introduce some variation into the tutor experience and to test the robustness
of different technical systems we asked the tutor on the second year module to
record the comments directly into a desktop computer using ‘Audacity’ audio
software (http://audacity.sourceforge.net/),
with the other three tutors using two different types of MP3 recorder/players.
The
tutors were asked to explicitly compare their audio experience with their
written experience on the same module. They were given a form on which to record
the time taken to mark the assignment, the mark given and any comments. As we
wanted tutors to explore the potential of audio feedback that was most
appropriate to their own marking context, we offered guidance on how to create
an audio file using the technology but did not prescribe how long this file
should be. We did however ask tutors to address the same areas as those set out
on the standard feedback sheet but not necessarily in the same order. The
intention was for the audio feedback to replace the free comments on both the
feedback form and on the assignment. The audio files were made available to the
students via the University's VLE (Blackboard) using the digital Drop Box tool.
Evaluation
In order to evaluate the usefulness of audio feedback we first sought the reactions of the staff and students involved. Three focus groups were held, one for each of the three samples of students. Discussion in these groups was facilitated by two Educational Developers using a semi-structured interview schedule. Some students who could not attend submitted written comments. A focus group was also held with the four members of staff involved. Appendix 1 provides details of the discussion prompts used in these events. The focus groups were transcribed and, to allow for comparisons with the standard comment sheet, each of the audio files was also transcribed and analysed to determine whether the quantity and quality of the feedback provided had changed.
Quantity of feedback
The range in time for each audio file and the corresponding word length are shown in columns two and three of Table 2. We can observe that marker A gave the shortest feedback, 1.43 minutes, which equated to 221 words. In comparison, marker D produced audio files in excess of 12 minutes and, on one occasion, a file of 21.26 minutes; this equated to 1,957 words of feedback on a 4,000-word essay. Whatever the total length of the feedback, however, it is interesting to observe that, allowing for variation in the pace of the speaking voice, one minute of audio feedback generated the equivalent of approximately 100 words.
Table 2. Audio file word equivalents

Marker | Audio feedback (minutes) | Audio feedback (words) | Standard feedback sheet (words)
A | 1.43-3.36 | 221-425 | 36-74
B | 5.27-8.48 | 592-923 | 69-144
C | 8.03-15.37 | 1,011-2,002 | 65-158
D | 12.09-21.26 | 1,086-1,957 | 83-225
For each tutor, a sample of the standard feedback sheets used for students not receiving audio feedback was also examined. Looking at columns three and four in Table 2, we can compare the range in quantity of feedback provided by each of the tutors when using the standard comment sheet and audio files. It can be seen that in all cases students received more feedback via the audio format. While this comparison does not include comments tutors may have written directly onto assignments, it does highlight the potential that audio presents for providing more detailed comments on students’ work.
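The approximate 100-words-per-minute rate can be checked against the Table 2 figures. The short sketch below is illustrative only; it assumes the durations are recorded in minutes.seconds form (e.g. 21.26 standing for 21 minutes 26 seconds), an assumption flagged in the comments rather than something stated in the original tables.

```python
# Illustrative sanity check of the ~100 words-per-minute rate reported around Table 2.
# Assumption: durations are in minutes.seconds form (e.g. 21.26 = 21 min 26 s).

def to_decimal_minutes(min_sec: float) -> float:
    """Convert a min.sec figure (e.g. 8.48 = 8 min 48 s) to decimal minutes."""
    minutes = int(min_sec)
    seconds = round((min_sec - minutes) * 100)
    return minutes + seconds / 60

# (duration, word count) pairs taken from the longest file of each marker in Table 2
samples = {"A": (3.36, 425), "B": (8.48, 923), "C": (15.37, 2002), "D": (21.26, 1957)}

for marker, (duration, words) in samples.items():
    rate = words / to_decimal_minutes(duration)
    print(f"Marker {marker}: {rate:.0f} words per minute")
```

On these figures the rates fall roughly between 90 and 130 words per minute, broadly consistent with the rule of thumb of about 100 words per spoken minute.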
Getting personal
Both staff and students commented on the personal nature of giving feedback via
audio files. Two of the tutors began every file with the student's name.
“I would
start them off by saying [student's name] whereas in written feedback I would
never use student’s name” (tutor)
Most of
the students were pleased with this personalisation of the feedback:
“I
actually found that by using the name was quite good because it felt more
personal and you are taking the time to read my essay. It just felt more
personal” (student year 1)
Although
some were not so sure:
“I found it really weird to hear my lecturer/tutor’s voice coming out from my computer” (student year 1)
Student: I didn’t want to hear what I had actually done wrong. Actually hearing, and my lecturer telling me what I had done wrong.
Interviewer: That’s worse than seeing it written down?
Student: Yeah, I think, personally.
Interestingly one member of staff commented that the personal nature of the
feedback also influenced their choice of words:
“I felt
this was a more personalised form of feedback. Because of this, I was less
likely to use words like ‘poor' or ‘weak’. I was thinking this person will be
listening to this…so I will say ‘this is quite good’ or ‘this needs some work’.
Not just the tone of voice but the actual words I was using”.
And both
staff and students were sensitive to the fact that if the essay was of a very
poor or a fail standard, then audio feedback might be awkward both to give and
to receive.
“I
wondered what it would be like to fail a student. It would present a number of
difficulties”.
Student
responses
On the
whole the students who took part viewed the exercise positively. This was mainly
because of the amount and depth of the feedback compared to their experiences of
written comments.
“We got
a lot more feedback. You can fit a lot into 2 minutes” (student year 1)
“It was
great to have a voice as it made it easier to comprehend the comments by setting
them in a little more context. Verbalising gives much more depth and I was
impressed with 10 minutes of feedback” (student year 3)
Some
students commented on other advantages of the medium itself. One student
suggested that as a computer file the feedback was more useful for future
reference:
“I
rarely look at the feedback sheets when I am writing an essay because it’s away
or in a folder somewhere. But when it’s on a computer it’s easily accessible and
I probably will [listen to it again]”
(student year 1)
Another said:
“It was
really good. I listened to it quite a few times to get the whole feedback on my
essay. When he punched out particular parts, I was able to look back on my essay
without flipping back and forth between cover sheets. I found it quite useful
because you could read the essay while the feedback is being played”.
However
some students commented that it was harder to link the comments to the relevant
section of the essay and that written comments were better in that respect. As
one student commented:
“Whilst listening to the audio file I found myself having to stop and pause it to follow the moderator and add the comments on the essay myself. It just seems a bit of an exhaustive method when it may be easier, for both the student and moderator, if the moderator’s comments are written on the essay instead”.
Staff responses
All of
the staff were concerned with how they sounded on the recording:
“The one
thing I was apprehensive about was ensuring that I did not come across as
awful”.
“I was
concerned with the quality of the voice”.
It is easy, of course, for tutors to check back on written comments, to remind themselves what they have already written and to revise anything as necessary. A major concern with the audio recordings was the lack of a facility to easily and quickly review what had been said, and the impossibility of editing the comments without re-recording them from scratch.
“It was
incredibly difficult: I would regularly get phone calls and knocks on the door
and be disrupted. It was hard to go back to find the exact point. I knew how to
pause the recording: I didn’t know how to go back over the last points and
resume from there on”.
“I found
myself recording 2-3 times, I didn’t get any of them right the first page. I
lost my train of thought”.
So although MP3 files have some advantages over older recording technologies, audio tapes are easier to manipulate (rewind, fast-forward, edit and so on) than digital audio files.
Whilst students can at present compare with each other the amount of written feedback they receive, audio files display their exact length. Staff were concerned that this might lead some students to focus on the length and to make unhelpful comparisons with other students and tutors. And in fact some of the students in the groups were well aware of the variability in the length of the feedback given.
Staff were also concerned that giving students feedback in an electronic format could make it easy for some students to disseminate it via email or to post the files on Facebook or YouTube. But the biggest concern of the staff members who took part was that producing audio feedback was time-consuming compared to providing written comments:
“There’s
so much effort and time. If we are going to put that much effort why not do an
individual essay tutorial that takes 15 minutes? That would be much quicker”.
Quality of feedback
When we compare the quality of the feedback given on the standard form with that
given in the audio file a number of observations can be made. Table 3 is a
sample of typical comments made by
the same tutor while using the different formats. While these have been selected
for illustrative purposes, it is argued that the audio format does lend itself
to a ‘richer’, more comprehensive form of comment. Whether this results in
better student understanding or adds to student confusion is an unanswered
question and one worthy of further investigation.
Table 3. Comparing quality of feedback

Standard feedback form:
“The essay does not include any real introduction or conclusion of any note. The main body would also benefit from better signposting.”

Audio file:
“Right [student’s name], I’ll start with your introduction. Your introduction sets a good context for the essay, it sets a background. What I don’t think it effectively does at all is signpost the essay in any way. What I would like to see you do is introduce in your introduction the main ideas that you will be talking about in your essay. Whether that means writing your introduction last of all, that may be a possibility in the future. What you haven’t really done is discuss the themes that you are going to discuss within the essay. Of course this question is fairly broad and can be interpreted in a number of ways, so in that case you should make clear your interpretation and how you will tackle this in the introduction.”

Standard feedback form:
“Some interesting points have been raised. However they do not come together to form a convincing overall argument.”

Audio file:
“Your next paragraph I really like. You used Richard Giulianotti, a sociologist on sport, and Michael Billig, a sociologist on national identity. You use their work to make sense of what you are going to say. I really like this combination. You also talk about how flags are used to develop national… I think we call it national identification. It’s an identification with the nation rather than national identity, which carries other meanings. I like this, it’s good. You also begin to develop this ‘us versus them’ dichotomy which helps to develop group identities and you begin to sow the seeds of how this could be developed in relation to sport. I think you take football as your main example, which is fine. This is good. What I am a little uncertain about is the use of Durkheim’s work, which seems a little tenuous to me. If you wanted to develop this, and I’m not saying you couldn’t, but it must be expressed very clearly and in the analysis extended and applied thoroughly to the context of sport. I don’t quite understand the point you are making at this moment in time.”
A second observation from analysing the transcripts of the audio files concerns the way in which the feedback comments convey the immediacy of the marker’s reaction while reading the work. The spontaneous, perhaps unguarded, nature of this reaction is captured by the following two comments:
“The next point: you look at reversal theory as outlined by Apter. I don’t know too much about this theory actually, and so I think the way you brought it in is to your credit and you have used it to explain hooliganism to a decent level. Whether this is in your own words or whether it is in Kerr’s words is a little ambiguous, I’m not sure…”
“Next, what I really like in your essay is the way you use Stanley Cohen’s work on moral panics to look at Daily Mirror and Sun headlines to explain hooliganism. I think this is really, really good. I think actually if you wanted to develop this idea, you could easily do this with a dissertation…”
In both cases the normal editing of the reaction that would occur in a written comment is absent. What seems to be happening here is a form of playback (Lunsford, 1997), whereby the reader is indicating to the writer how the writing is being experienced and the emotions that this induces. As we can see from the example above, the marker uses the word ‘really’ on three occasions in the one paragraph to convey the reaction. It is argued that on a standard feedback sheet this emotion would be edited out, resulting in a less authentic comment. It has been suggested (Nicol & Macfarlane-Dick, 2006) that authentic comments help the student understand the difference between his or her intentions and their effects.
A final
observation concerns the ways in which the feedback comments lend themselves to
capturing how tutors in the discipline (in this case Sociologists) think. For
example one comment read:
“You
also in the second part of the introduction talk about dictionary definitions of
hooliganism. I know you get this from Joseph McGuire’s work and Joseph McGuire
is a highly reputable researcher in this field. But I think actually dictionary
definitions aren’t too useful in this case”.
What we have here is a declaration of the tutor’s tacit knowledge (that Joseph McGuire is highly reputable). This making explicit of tacit knowledge is a feature that appears throughout the feedback and is perhaps a spin-off of the increased quantity. Moreover, we would argue that this helps in the creation of what has been labelled ‘guild knowledge’ (Sadler, 1989). Put another way, we would argue that the feedback being constructed conveys, in a subtle way, the meanings and discourses that characterise the discipline. The following comment illustrates this point:
“It did
seem to me that the question would have warranted a bit more concentration on
the structural perspectives that see society and the way in which society might
be said to criminalize individuals as important counters to the individualizing
perspectives that are more psychologically based, and that tend to dominate
offending programs”.
In this case the tutor is encouraging the student to think like a sociologist (rather than a psychologist).
Time spent on task: average time spent per script
We were aware that a key issue with this method of feedback would be how it
compared in terms of staff time with other methods. We therefore asked the
members of staff involved to keep a record of the amount of time they spent
giving feedback both to the students using the audio method, and also to a
similar number of students using the standard method. The results are shown in Table 4. We are not sure how accurately staff recorded the time spent, so we
cannot claim that these figures are any more than a rather crude indication of
the differences between each method. That said, they do seem to bear out the
assertions of the staff that they found the audio method to be time consuming.
Only one member of staff spent less time giving feedback using the audio method,
and this difference appears to be negligible. The other three members of staff
spent between 6 and 14 minutes more time giving audio feedback than they spent
giving written feedback.
Table 4. Mean time spent per script

Marker | Standard form (minutes) | Audio (minutes)
A | 23 | 20 (-3)
B | 39 | 53 (+14)
C | 20 | 32 (+12)
D | 54 | 60 (+6)
We cannot be sure that this difference is wholly or partly a consequence of the feedback method, as there are other variables, such as the time taken to read an essay, which could make a difference. But given what staff told us about the need to review or re-record their comments, the audio method seems the most likely cause of the increase in time spent. It would be useful to repeat this exercise and attempt to record the amount of time spent on the different parts of the assessment and feedback process: reading the assignment; formulating comments; writing or recording them; editing or re-recording; and so on. Without this information it is difficult to see where it might be possible to save time, although we make a few suggestions below.
Discussion
In making an overall assessment of the value of replacing written with audio feedback, it is perhaps helpful to benchmark the audio form of delivery against the three dimensions of the National Student Survey that we highlighted in our introduction.
With regard to the first of these, receiving comments on work, it is reasonable to conclude that feedback delivered in audio format does lend itself to generating a greater quantity of, and by implication more detailed, feedback. This was certainly the case with all the markers. Indeed, it may be that there is a quantity threshold beyond which any extra value to student learning is diminished.
In relation to helping students clarify things they did not understand, we can be less conclusive. However, our analysis has revealed a richer, more authentic kind of feedback being generated, which may contribute to a better understanding of the discipline. Moreover, the favourable comments that we received from students are an encouraging sign. Whether this is a result of the novelty of receiving their feedback in this way, or perhaps of something more fundamental, is a debatable point.
The final dimension, feedback on my work has been prompt, was not assessed in this study. However, what we found was that none of the tutors experienced any kind of time saving, in contrast to the study by Ice et al. (2007). Clearly staff will not be keen to use this method of feedback if it is more time-consuming than other methods: even an extra five minutes per assignment can mean another half a day’s work on a module with 36 students. Part of the problem is undoubtedly due to staff unfamiliarity with this method of giving feedback. For all the tutors this was a new experience. Quite deliberately, no staff development was provided prior to use beyond explaining how the technology worked: we wanted tutors to experiment through trial and error and find their own way of doing things. It might be that with practice the amount of time spent could be reduced. However, there are other factors that may also be important.
The lack of a means to review and edit the comments easily meant that staff spent time pre-preparing their comments, reviewing them and perhaps re-recording the whole file. We are looking at alternative methods of recording which may make the process of creating audio files easier for staff and remove the need to re-record a whole set of comments. One possibility (which we began to recognise during the project) would involve tutors recording smaller ‘bite-sized’ extracts of feedback and inserting these files directly into the documents that students had submitted. This can be done with Adobe Acrobat PDF documents and, while students would have to submit their assignments in this format, it would allow staff to link comments to particular parts of the assignment if necessary.
All of the assignments that we looked at in this project were part of the summative assessment for the modules concerned. The feedback is therefore concerned not only with suggesting ways in which the work could be improved but also with justifying the grade awarded. It may be that in the less formal circumstances of formative assessment, staff would feel less apprehensive about giving audio feedback and would be more relaxed about giving ‘off the cuff’ advice without reviewing and re-recording it (which would not necessarily make it less helpful).
Finally, the lack of any inherent limit on the length of the feedback in the audio files, together with the enthusiasm of the staff members, may have resulted in the provision of more feedback than would be the case under normal circumstances. The tutors had volunteered to take part; they were aware that they were only giving audio feedback to a small number of students and that their feedback would be scrutinised more closely than usual by their students and by colleagues.
So there may be a need to give guidance about the amount of time to spend on giving audio feedback if this method is to have wider application. It may help if tutors are reminded that five minutes will produce about 500 words of good-quality feedback, much more than they would be able to write in the same amount of time. And although the students were impressed with the amount of feedback they received, those receiving the longest feedback were not necessarily more pleased than those receiving the shortest.
Conclusions
This study set out with the intention of exploring the potential of using audio files as a way to give feedback on student assignments at the University of Liverpool. While the findings from the small sample are unlikely to be representative of all staff and students, we do consider the practice examined in this case (i.e. giving feedback on essays) to be typical of what goes on in most Social Science disciplines. With this in mind, we believe that audio feedback can be used successfully to meet student feedback expectations, through feedback which has the potential to be more personal, more in-depth and, we would argue, more engaging. We see this type of feedback as having particular relevance in a formative context. However, for these benefits to be achieved, we recommend that a number of conditions be met prior to rolling out this approach, be this at the University of Liverpool or elsewhere. Chief amongst these would be meeting the concerns of staff through the provision of easy-to-use technology and appropriate staff development support. Indeed, this is how we plan to proceed at the University of Liverpool, and efforts are already underway in the provision of training events for staff, the development of good practice guidelines and appraisals of alternative technologies.
Finally, this study, while developing our understanding in some areas, has not surprisingly raised new questions in others. One limitation of our investigation is that while the participating students (in the main) seemed to appreciate the feedback they received, we can only speculate at present as to whether their learning has been enhanced. Moreover, we have not explored the impact of audio feedback on different kinds of learners, or considered any accessibility issues when using this form of technology. Larger, more sophisticated research studies would be needed for these purposes, but these are areas we consider worthy of further investigation.
Acknowledgements
We are grateful for the support of a grant from the University of Liverpool’s Teaching Quality Enhancement Fund.
References
Brown, G. (2001). Assessment: A Guide for Lecturers. LTSN Generic Centre Assessment Series No. 3.

Chanock, K. (2000). Comments on essays: do students understand what tutors write? Teaching in Higher Education, 5(1), 95-105.

Cryer, P. & Kaikumba, N. (1987). Audio-cassette tape as a means of giving feedback on written work. Assessment and Evaluation in Higher Education, 12(2), 148-153.

Duncan, N. (2007). Feed-forward: improving students’ use of tutors’ comments. Assessment and Evaluation in Higher Education, 32(3), 271-283.

Gibbs, G. & Simpson, C. (2005). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1), 3-31.

Higher Education Funding Council for England (2007). National Student Survey. http://www.hefce.ac.uk/learning/nss/ [accessed 18 August 2008].

Higgins, R., Hartley, P. & Skelton, A. (2002). The conscientious consumer: reconsidering the role of assessment feedback in student learning. Studies in Higher Education, 27(1), 53-64.

Hounsell, D., McCune, V., Hounsell, J. & Litjens, J. (2008). The quality of guidance and feedback to students. Higher Education Research and Development, 27(1), 55-67.

Ice, P., Curtis, R., Phillips, P. & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students’ sense of community. Journal of Asynchronous Learning Networks, 11(2).

Johanson, R. (1999). Rethinking the red ink: audio-feedback in the ESL writing classroom. Texas Papers in Foreign Language Education, 4(1), 31-38.

Kirschner, P.A., van den Brink, H. & Meester, M. (1991). Audiotape feedback for essays in distance education. Innovative Higher Education, 15(2), 185-195.

Lunsford, R. (1997). When less is more: principles for responding in the disciplines. In M. Sorcinelli & P. Elbow (eds.), Writing to Learn: Strategies for Assigning and Responding to Writing Across the Discipline. San Francisco: Jossey-Bass.

Merchant, A.R. & McGregor, K.M. (2006). Improving the immediacy and quality of feedback for physics students. Australian Institute of Physics 17th National Congress.

Merry, S. & Orsmond, P. (2007). Students’ responses to academic feedback provided via MP3 audio files. Paper presented at the Science Learning and Teaching Conference 2007.

Nicol, D.J. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Orsmond, P., Merry, S. & Reiling, K. (2005). Biology students’ utilization of tutors’ formative feedback: a qualitative interview study. Assessment and Evaluation in Higher Education, 30(4), 369-386.

Quality Assurance Agency (2006). Code of Practice for the Assurance of Academic Quality and Standards in Higher Education. Section 6: Assessment of Students.

Rotherham, B. (2007). Using an MP3 recorder to give feedback on student assignments. Educational Developments, 8(2), 7-10.

Rust, C. (2001). A Briefing on the Assessment of Large Groups. LTSN Assessment Series No. 12.

Sadler, D. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144.
Appendix 1: Focus Group Discussion Topics

Student Focus Group
· Current attitudes towards and experiences of receiving feedback
· Reasons for participation in project
· Thoughts on receiving feedback via audio files
· Listening behaviour
· Aspects that might be done differently
· Overall comparisons with alternative feedback approaches

Tutor Focus Group
· Current feedback practice and experiences
· Reasons for participation in project
· Thoughts on providing feedback via audio files
· Recording behaviour
· Aspects that might be done differently
· Overall comparisons with alternative feedback approaches