THE EFFECTS OF RACIALLY INCLUSIVE ASSESSMENT ON THE RACE AWARD GAP AND ON STUDENTS’ LIVED EXPERIENCES OF ASSESSMENT
Race and Assessment in Higher Education
ISBN: 978-1-83549-743-2, eISBN: 978-1-83549-740-1
Publication date: 9 October 2024
Citation
Campbell, P.I. (2024), "THE EFFECTS OF RACIALLY INCLUSIVE ASSESSMENT ON THE RACE AWARD GAP AND ON STUDENTS’ LIVED EXPERIENCES OF ASSESSMENT", Race and Assessment in Higher Education, Emerald Publishing Limited, Leeds, pp. 73-100. https://doi.org/10.1108/978-1-83549-740-120241006
Publisher
Emerald Publishing Limited
Copyright © 2024 Paul Ian Campbell. Published under exclusive licence by Emerald Publishing Limited
The stories examined in the first half of this book sketched out how students of colour experience exclusion in assessment in three ways: through barriers in pre-assessment support, barriers in post-assessment support, and barriers more directly related to their racialised identities. I used these three emergent themes to form a tripartite framework from which to generate 20 recommendations for Racially Inclusive Assessment Guidance. These were then developed into the Racially Inclusive Practice in Assessment Guidance Intervention (henceforth RIPIAG) – see Table 1.
This chapter sketches out the efficacy of the RIPIAG, which was implemented in three HEPs – the University of Bourne, Meadow University and Wiseman University. However, this was not a straightforward process. Firstly, we chose to test only the recommendations for racially inclusive pre-assessment support. This was because many of the recommendations which addressed the more obvious racial barriers in assessment were also connected to issues of transparency, which were addressed in the pre-assessment support guidance. Secondly, feedback processes in most HEPs are tied to extremely rigid and prescriptive quality assurance regimes that can only be modified through lengthy administrative procedures. Put simply, feedback is operated and managed by quality processes that are unique to each institution and usually exist beyond the control of individual module convenors. This meant that we could not guarantee consistency in the application of the post-assessment recommendations and thus could not measure their efficacy for making feedback practice more racially inclusive.
The RIPIAG consists of four practical pedagogical resources: The Critical Assignment Schedule (CAS), The Critical Assignment Brief (CAB), The Modified Active Seminar Workshop (MASW) and The Active Group Marking Exercise (AGME) (see Table 1).
The CAS is a detailed timetable that sets out the key points in the assessment process for each assignment, from the start to submission. It also shows precisely when in the semester students should ideally have started and/or completed the various tasks that comprise the assessment.
The CAB is a 3-page document (maximum) that contains, at a minimum, the following information:
Submission Deadline
Grade Weighting of Assignment
Assignment Instructions
Assignment Questions
Tips and Essential Things to Include (when completing each assignment question)
Learning Outcomes
Referencing Instructions
What is Academic Misconduct?
Non/Late Submissions
The MASW consists of a series of (inter)active and group-based learning exercises that cover, at least, the following areas:
What do I need to get started?
Structuring the Assignment
Formulating an Introduction
Assignment Do's and Don'ts
Key Advice: What are the differences between stronger and weaker assignments?
Learning the difference between anecdotal, evidenced and critical arguments
The AGME is a group-based activity where students mark previous scripts. Using a combination of the assessment content covered in the MASW and the marking criteria, students have to come to a consensus about the grade score for each script. In each case, they provide a rationale for the awarded grade using the descriptors in the marking criteria and the lessons learnt in the seminar to justify the grade given. They also must suggest one thing that could be added to improve the assignment, with an example.
The RIPIAG was trialled between September 2021 and December 2022 in six modules across three partner HEPs, each of which had a student population in which at least 37% self-identified as belonging to a minority ethnic background. Only one core module per course could be selected (this was at the discretion of the partner HEP). The final sample consisted of 175 undergraduate students, of whom over 35% were domiciled students of colour.
My previous experiences of developing, embedding and evaluating race inclusion interventions in education have shown that their effectiveness as tools for change is negatively influenced by a general lack of standardisation in their application (see Campbell et al., 2022). To avoid this issue, module convenors were provided with training workshops and with templates for each of the four teaching resources, to ensure that the intervention was embedded into their modules more consistently.
What then was the RIPIAG's capacity to foster a reduction in the race award gap (RAG) in student outcomes? What was its capacity to improve the qualitative experiences of minority ethnic students in assessment? The remainder of this chapter discusses the results of these two tests.
An Effective Tool for Reducing the General Race Award Gap Between Students of Colour and White Peers
The quantitative data generated from the assessment scores of the 175 students on treated modules showed that the RIPIAG was remarkably effective at reducing the RAG in the sample (see Table 2).
| University and Module Code | RAG Score for Treated Module | Module RAG Average for Previous 2 Years | Course RAG at That Level | University RAG |
|---|---|---|---|---|
| University of Bourne M1 | 1.25% | 6.97% | 1.20% | 10.0% |
| University of Bourne M2 | 1.80% | 4.11% | 2.85% | 10.0% |
| University of Bourne M3 | 7.38% | 7.63% | −0.30% | 10.0% |
| Meadow University M1 | 4.70% | 30.25% | 23.20% | 22.0% |
| Meadow University M2 | 18.70% | 37.0% | 20.10% | 32.0% |
| Wiseman University M1 | 8.0% | 10.95% | 12.00% | 18.6% |
| Average score | 6.97% | 16.15% | 9.84% | 17.1% |
Specifically, Table 2 shows that the average RAG between students of colour and those who identified as White across all treated modules was 6.97%. The narrowest gap recorded was 1.25% and the widest 18.70%. In all cases, the RAG on modified modules was below the overall RAG reported at the respective HEP. In 83% of modified modules, the reported RAG was lower than the 8.8% national average.
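For transparency, the column averages and the national-average comparison reported above can be reproduced with a few lines of arithmetic (a minimal sketch in Python; the per-module figures are transcribed from Table 2, and the 8.8% national average is taken from the surrounding discussion):

```python
# Per-module RAG figures (percentages) transcribed from Table 2:
# (treated module, 2-year module average, course at level, university)
rows = [
    (1.25,  6.97,  1.20, 10.0),   # University of Bourne M1
    (1.80,  4.11,  2.85, 10.0),   # University of Bourne M2
    (7.38,  7.63, -0.30, 10.0),   # University of Bourne M3
    (4.70, 30.25, 23.20, 22.0),   # Meadow University M1
    (18.70, 37.0, 20.10, 32.0),   # Meadow University M2
    (8.00, 10.95, 12.00, 18.6),   # Wiseman University M1
]

def column_average(col):
    """Mean of one column, rounded to 2 d.p. as in Table 2."""
    return round(sum(r[col] for r in rows) / len(rows), 2)

averages = [column_average(c) for c in range(4)]
print(averages)  # → [6.97, 16.15, 9.84, 17.1]

# Treated modules below the 8.8% national average: 5 of 6 (~83%).
below_national = sum(1 for r in rows if r[0] < 8.8)
print(below_national)  # → 5
```

The same list can also confirm the ‘differences within difference’ claims: four of the six treated modules (66%) sit below their course-level RAG, and all six sit below their own 2-year module averages.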
To measure what we might describe as the ‘differences within difference’, we tested the impact of the modified modules on students' assessment performances against their performances on non-treated modules. We also measured the performance of students on the current, treated iteration of each module against that of students on non-treated iterations of the same module in previous years.
The results showed a similarly positive pattern of efficacy. For example, 66% of treated modules reported narrower RAGs when compared to the average RAG score recorded for all non-treated modules on that course and at that level. Also, all of the treated modules reported narrower gaps when compared to their aggregate performance for the previous 2 years.
Of course, the findings here do not account for important variations that we must ‘factor in’ when considering the impact and effectiveness of the intervention for reducing the RAG, such as a cohort with an unusually large cluster of stronger or weaker students within a particular minority-ethnic group in any given year. Nor do these data account for variations in the overall number of students of colour within any minority-ethnic group in a particular year. Annual variations in each year's cohort, such as these, make exact like-for-like comparisons between years impossible. They also slightly skew the quantitative findings reported here, and potentially those of future evaluations, and they remind us that it is unrealistic to assume that the intervention will lead to a seamlessly consistent and linear annual reduction in RAGs. Nonetheless, the triangulation and repetition of consistent patterns of RAG reduction reported in the performance data across all modified modules, from different courses, levels and partner HEPs, provide the basis for confidence in the intervention's potential to reduce the aggregate RAG.
While the overall patterns of reduction in RAGs in the sample are encouraging, it is important to also note that where disaggregated data for the performance of students from specific minority-ethnic groups were available, in almost all cases the RAG for Black-heritage students remained wider than those recorded by all other minority groups. Students who self-defined as ‘other’, which included British East Asian students, reported the lowest RAG and in some cases outperformed White peers. This group was followed by students who self-described as South Asian and then those who defined as ‘mixed’ (none of whom outperformed White peers). Interestingly, these patterns of effectiveness and limitation were corroborated by, and explained in, the qualitative accounts of the student participants.
The Qualitative Impact of the RIPIAG on the Everyday Lived Experiences of Students of Colour in HE Assessment
The accounts of students on treated modules, compared with those of students on non-treated modules, showed that the RIPIAG was almost universal in its ability to enhance Black, South Asian and White students' experiences of the assessment process. Interestingly, each resource appeared to serve a different function in facilitating positive change in students' overall learning experiences when it came to assessment.
The Impact of the Critical Assignment Schedule on Students' Lived Experiences of Assessment
The CAS resource appeared to help students learn the assessment journey in its entirety. It helped them map when in the semester they should ideally start thinking about their assignment question, when to settle on the question, when to have a first draft complete and so on. This training was effective at helping students guard against their own (frequently inaccurate) commonsense schedules for completing their assignments. For example, students remarked that when they first saw that they ‘only’ had a 1,500- or 2,000-word essay to complete for their assignment, they rationally thought that it would only take them a few days to complete. Consequently, they believed they would only need to start preparing and working on their assignment a few days before its due date – instead of long before that.
More specifically, the CAS was also especially effective for helping students from all backgrounds develop a better understanding of when to begin working on their assignments. The testimonies indicate how this clarity was particularly novel and transformative for students for whom university and, in turn, assessment at the undergraduate level were new or alien. These were often FIF students, who were less likely to have access to the kinds of kin and social networks that provide the essential ‘insider’ knowledge that makes it considerably easier to successfully navigate academic life. Without access to what we might describe as a bank of knowledge that is typically hidden from working-class and FIF students, they are often left to rely on their own ‘commonsense’ solutions to these problems, which often run contrary to good assessment practice orthodoxy. The following accounts detail some of the ways in which the CAS was especially effective at teaching students how to mitigate these common but often costly miscalculations.
Like, when we first start, obviously we don’t know what’s … to be expected, essentially. So I think it gives us a good idea of how early we should be thinking about assignments. I’m not gonna lie, I have done some assignments, like, last minute. (Kara, Black Student, University of Bourne)
So, I think it does help in terms of giving us guidance on where we should be starting … ‘Cause we wouldn’t have known how much time to put into them … without that! … Obviously in the handbook it tells you how many words you have to do. And it’s like… all right, 1,500, that’s fine. Like, we can do that in two days, no worries. (Tuni, Black Student, University of Bourne)
The Critical Assignment Brief and Changes in Students' Lived Experiences of Assessment
The CAB was remarkably effective at enhancing all students' ability to make sense of their assignment questions and, specifically, what each question wanted them to address. It did this by enhancing their ability to successfully deconstruct assignment questions that were often verbose. Students remarked that this changed their overall perception of assignment questions: instructions that were often unclear, and in turn daunting, became a set of succinct instructions that were more straightforward and manageable.
[On other modules] you don’t really know what they’re asking for… [But the CAB] helps ‘cos then you have a clear idea. I at least have, like, a path… So it does give us that. It creates a little less panic, if you will. And it does help you build your assignment – [It] gives you a starting place… (Fatima, South Asian Student, Meadow University)
So, I think it’s helpful that it gives you the ‘write down’ of how the essay needs to be laid out … For me, it helps reduce the stress. Because you can break it down into smaller sections … Rather than thinking, I’ve got a 2,500-word essay that I need to do. So, okay, well I can concentrate on this section and then break it down that way. So I think that helps reduce stress as well. (Francis, White Student, Wiseman University)
I think those [CABs] were quite good because it gave me a sense of what you need to actually talk about. Um, so that was helpful when planning what you were going to say and linking it to the questions that were there. I think it was really good because it just gives you, like, a bit of a prompt. So, you’re not, like, completely clueless about what, where to start. (Mandeep, South Asian Student, University of Bourne)
I looked at [the CAB] and I was like, am I hitting this point? Am I getting that one? Like, it made it a lot easier. Whereas, for example, I’m working on an essay [on another module] now, which is due in a few days. [All we have been given is] just a question [and no CAB]. And … I’m planning it … And I’m making points. And I’m [asking myself]: ‘is this [what she’s writing] really relevant?’ (Amy, White Student, University of Bourne)
[Without the CAB] It’s harder… Yeah, it’s like we, kind of, play a guessing game with everything else. Like, with the [assignment for another module] … I had no idea what I was doing for that one … (Viv’, Black Student, University of Bourne)
The stories from students from all three raced groups and all three universities above triangulate to illustrate how remarkably effective the CAB was for providing students with a blueprint for deconstructing their assignment questions and for making visible the minimum knowledge/content requirements that were expected to be covered in the assignment task. This function of the CAB also had a particularly positive impact on raising students' confidence in their ability to succeed and, importantly, on reducing the general feelings of anxiety and stress that assessment usually brought on (Boustani, 2023).
The stories of assessment detailed in Chapters 2, 3, 4 and 5 illustrated how students of colour particularly found the language used in assignment questions to be verbose and confusing (and this was connected to wider social factors and not the result of any inherent inability). This made knowing what specific knowledge or skills the task wanted them to demonstrate difficult. To some, this made assignments and essay questions high jeopardy. The participants who took part in the evaluation echoed similar barriers to comprehension and resultant anxieties.
The inclusion of the exposition within the CAB helped them to deconstruct the instruction and, in turn, significantly reduced the jeopardy that typically accompanied the assignment questions reported by students of colour. Additionally, students remarked that having a resource which clearly outlined what each question was specifically asking was effective in reducing the time they spent on trying to ‘figure out’ what the question wanted them to demonstrate. In doing so, it enabled students to maximise the time and energy spent on showcasing the required knowledge or skill. This also meant that they were less likely to be faced with the prospect of having to choose between answering a question that they understood over a question on a topic that they were particularly interested in, passionate about, or knowledgeable on.
[W]ith a broader question, you, kind of, need an, an assignment brief to guide you 'cos anyone can go on a different tangent. And then you don't know which one’s right and which one’s wrong! (Jenni, Black Student, University of Bourne)
Yeah, I found [the CAB] helpful. Like, even before I started [my assignment], it gave me an idea of what each question entailed. So, I could choose a question… and have more idea what question to do. [Rather] than if I had just been [given] the question [without any exposition] … It’s not like the plan’s done for you … You’re able to pick out the one [question] you want … ’Cause, if it wasn’t broken-down like that … I don’t think I’d be that confident in doing it. (Alicia, South Asian Student, Meadow University)
[The CAB had] tips on how to approach a question. That helped a lot ‘cause it’s like … okay: ‘This is what the question’s asking you to do.’ And I think that helped ‘cause the whole guidance thing was more like, okay: ‘So this is roughly how you should approach the question.’ And, you know, this is where you should be going with it. So, yeah, that helped a lot! (Dee, Black Student, University of Bourne)
Importantly, the CAB was effective for reducing students' dependency on lecturing staff to complete their work. Consequently, students relied less on direct input from staff for reassurance about whether or not they were ‘on the right track’ for success, and their ability to function as independent learners was enhanced (which was corroborated by the staff testimonies examined in Chapter 7). This also meant that students (and especially those from raced backgrounds) no longer had to reach out to lecturers whom they were often not comfortable seeking out. Nor did they have to endure the psychological drama of overcoming the sense of vulnerability that came with exposing any perceived lack of understanding of the task, or the negative judgements that might be made about them.
Yes, so you’ve still got something to refer to back to. You’ve got the structure in front of you. Even if I don’t go back to the lecturer, I could look at that. (Mo’, South Asian Student, Wiseman University)
[The CAB] kind of shows you what someone expects from that assignment. But for the other modules that we had to do an essay with, I think it was just harder because it was just the questions. And even the questions were just very hard to understand what they meant. And there wasn't really any other advice that it gave us after that. (Mandeep, South Asian Student, University of Bourne)
I feel like the assignment brief was very useful. The fact that the question was there, but then it also broke down the question for you, made it easier for you to do your introduction. Because you knew what you had to talk about, and then style your essay … And then, at the bottom [of the CAB], it would have an extra point, which is the stronger essays would do ‘blah blah blah’. I thought that part, as well, was very useful because it, kind of, allows you to try and push yourself to see if you can reach what those stronger essays would do. (Jason, Black Student, University of Bourne)
Concerns centred on the negative effects of too much support are often raised by educators, who proffer that assignment support reduces students' ability to be innovative or to demonstrate creativity and individual excellence in their assessed work. The following student accounts, however, point to an opposite outcome brought about by the intervention.
I think with the creativ[ity] thing it’s like [the CAB] does both. Because it’s like if you have less guidance, its obviously [leaves you] open to more avenues [to explore] … At the same time, you’re also stressing about, is this the correct avenue I should be going down? (Valerie, Black Student, University of Bourne)
[H]aving some sort of guide when you’re writing an essay is so important! It just helps you. It just helps you guide your thinking. It’s not supposed to stop you from adding anything else. Like, as long as it connects and is valid … Then it’s fine. (Simone, Black Student, Wiseman University)
Not to name names or point fingers, but in a certain other module we had to write an essay. It was incredibly vague [question] on what the essay should even be about. That was traumatising to say the least! Because … If it’s not specific, in order to guide your thinking, then you could end up writing an essay that maybe is not even related and then perhaps you get a bad grade, because [the answer] was not supposed to be on that point. (Lexi, White Student, University of Bourne)
I think [the CAB] definitely helped guide my thinking. Seeing especially where it said to address the limitations. I already thought to do that, but seeing it written down, like, confirmed it for me and helped me to stay on that track and confirm what I was going to do. And it just helped me have more confidence when I was writing because I knew that it was on the focus of what [the lecturer] expected from us and wanted. (Jo, White Student, University of Bourne)
Including the ‘wrong thing’ or going ‘off topic’ was one of the most commonly rehearsed reasons for the participants' reluctance to be expressive in their assignments on other modules. The CAB therefore enhanced students' confidence to be even more creative within the confines of the assignment, instead of stifling it. The CAB, together with the other RIPIAG exercises and resources discussed throughout, was successful in making the parameters of the assignment more transparent for all students and, in doing so, helped to make clear the pedagogical conditions and boundaries of the task. One effect of this was that students felt more confident and reassured about what to include, and what not to, when approaching their assignments. Put simply, the CAB provided a platform for students to be even more creative because it removed the fear of going off task – and ultimately failing their assignments.
The Effect of the Modified Active Seminar Workshops on Students' Everyday Experiences of Assessment
Overall, the testimonies illustrated the effectiveness of the MASW for facilitating a deeper and more accurate understanding of the assessment process for students of colour and for all students more widely. This was achieved through a combination of social and dialogic learning.
The MASW consisted of a series of ‘active learning’ activities ‘which provide module specific and ‘hands on’ assessment support, which made clear what it is that makes work successful and how this relates to the marking criteria’ (Campbell, 2022b, p. 8). The accounts below demonstrate how the active learning activities within the modified seminars directly transformed students' experiences of assessment from something which was ‘individual’ to one which was much more social and dialogic.
Not everybody understands marking criteria exactly the same… I think engaging in group work helped a lot to reflect off each other. (Deli, Black Student, Wiseman University)
It helped to see multiple perspectives – but also [to see] multiple ways of doing the assignment. (Delorus, Black Student, University of Bourne)
You, like, have your own ideas, but then when you can speak with others [students], it just develops them [their ideas and comprehension] more. And with certain modules, you don’t really feel like you can do that. (Stacy, White Student, University of Bourne)
I feel that when you’re just doing your assignment, you’re just in a bubble [on their own] and you don’t realise it. (Niki, South Asian Student, University of Bourne)
The testimonies highlight that the active learning exercises required students to explain (aspects of) the assessment to each other, challenge each other's responses and then either modify or defend their views to reach a group consensus. This approach to learning aligns with what Alexander (1996) described as ‘dialogic’ pedagogy: the idea that effective learning is achieved through a process of ‘meaningful talk’ (Alexander, 1996), similar to that described above. A ‘deeper’ knowledge/understanding is reached through justification, challenge/defence, modification and then re-comprehension. Importantly, meaningful dialogue can only take place if both learners are of relatively equal status (if any two people can be equal). Put another way, meaningful talk cannot take place between lecturer and student, because the power imbalance removes the student's ability to engage in it: they will typically accept the lecturer's assertion as valid. This is what usually happens, for example, when assessment learning takes place through a more transmissive and passive mode of delivery, such as a lecture. However, when the dialogue takes place between peers in a group exercise, students are more inclined to engage in meaningful talk where, using evidence, they discuss, challenge, justify, modify or confirm their understanding.
This approach is routinely employed in compulsory education and in HEPs to varying degrees, especially in relation to the kinds of comprehension building exercises employed within seminars. In contrast to their general taught educative experiences in HE, assessments were things which students largely did on their own and in silos (unless it was a group assignment).
It is unsurprising given this background that the students here found the siloed nature of learning, and of doing assessments, to be alien, stressful and often unhelpful. Conversely, they reported considerably higher levels of comfort, comprehension and confidence when learning assignment literacy through the more social and dialogic approach taken within the modified seminars.
The MASW also enhanced students' ability to make sense of marking criteria and learning objectives and turn them into meaningful instructions. For example, the Black, South Asian and White students surveyed in Chapters 2, 3 and 4 and here all reported that they often found the terminology used in their module's Learning Objectives and in the marking criteria to be opaque, abstract and in some cases incomprehensible.
They all recognised that terms such as ‘critical argument’, ‘logical structure’ and ‘anecdotal evidence’ were all important – and frequently rehearsed – ‘things’ that needed to be demonstrated or avoided. However, in practice these terms meant very little to them when completing their own portfolios, coursework or exams. The testimonies below demonstrate some of the ways in which the exercises within the modified seminars were transformative in helping students to translate and, in turn, ‘see’ and learn what this terminology meant and looked like when it came to completing their own assessments.
There was one bit where [the seminar] actually - as silly as it sounds - explained what critical analysis was. So I’ve had formatives before where they’ve [other lecturers have] been like you need to be more critical. But she actually gave an example [and exercises to learn it]. And as silly as that sounds, [now] that [I have seen what critical analysis is, it] makes so much more sense. (Simone, Black Student, Wiseman University)
Not everybody can interpret that document [marking criteria] in the same way… Therefore [the seminar] gives us different options and different ways to understand what we need to do to receive a First. (Nisha, Black Student, Meadow University)
There’s so [much] jargon in the mark schemes … and with[in the] Learning Objectives… [Other Lecturers will] say follow the Learning Objectives… And sometimes I look at them, [and] I’m like, I still don’t know exactly what that means! So, yeah, kind of going through it [in the seminar helps] … (Becki, White Student, Meadow University)
So, now we know [how to] write and meet the Learning Objectives that are given to get the high[er] grades. Whereas, if we didn’t have that, and we just got given the assignment to do, I don’t think, … because I’ve got dyslexia personally, so I don’t think I would have understood how to structure each paragraph and get the higher marks [without the seminar activities]. I probably would have 40 or 50% max, if I didn’t get this! [the help in the seminar]. (Jack, White Student, Wiseman University)
The seminars also enhanced students' ability to break the assignment down from a large and daunting undertaking into a set of smaller and more manageable sub-activities. Students from all three racial backgrounds and across all partner HEPs reported that they often found the prospect of completing assignments (on other modules) to be ‘overwhelming’.
Students asserted that the new seminars were especially helpful for modifying their attitudes towards assessment. The active learning exercises within the MASW were particularly effective at enhancing students' ability to identify and separate out – or ‘breakdown’ – the ‘total’ sum of the assignment task (essay, presentation, report and so on) into a series of smaller and, in turn, less daunting actions (e.g. ‘Structuring the Assignment’, ‘Formulating an Introduction’, ‘Learning the difference between anecdotal arguments, evidenced arguments and critical arguments’, and so on). Furthermore, the MASW provided active and group-based exercises for each section, which taught students how to complete each self-contained component of the assignment. It also (actively) taught them what constituted a stronger or weaker sub-section and why.
This deconstructed and more scaffolded approach to completing assignments acted as a ‘baseline’ ‘checklist’. It was, for some, an essential ‘kit’ for ‘surviving’ the assessment process. In almost all cases, it was seen as a core contributor for success, and for making them feel at ease and confident when doing their assessments in modified modules.
The [seminar] broke-it-down essentially. Like, okay, this is what you need to tackle. These are the questions that you needed to ask yourself. That’s helped. I think that’s one of the reasons why we have the grades we have … [W]e’ve done okay [in our assessment] because of those templates and that guidance. I think it’s helped a lot. (Trevone, Black Student, University of Bourne)
[The seminar] really breaking-it-down. I liked the group work as well. It was really helpful. (Sumaya, South Asian Bangladeshi-heritage Student, University of Bourne)
The workshops and seminars that we’ve had around assessments have been really helpful. [B]ecause it gave us like a baseline on what to do. Seeing, like, a structure … it really helped for me to form my own essay. Like the [others have already] said, about [being] thrown in at the deep end [on other modules]. It didn’t feel like that [with the seminar]. It felt like we had that support and it was very helpful. (Stacy, White Student, University of Bourne)
I think it was good having the seminar on, specifically, the essay we had because it was almost like a checklist when you're going back and referring to it. Making sure I was on the right lines just made me feel more confident about my actual essay when I was writing, where another module didn't have that support. And I was, kind of … second-guessing what I was writing. I wasn't too sure if was on the right track! (Sam, White Student, University of Bourne)
I liked the: ‘how to structure your essays’ [part of the workshop]. Like breaking-it-down on what is a good paragraph, [and] what is not. I liked that. (Delorus, Black Student, University of Bourne)
‘Cause I’m more of a visual learner, I’ve got more of a picture in my head of how to lay it out [and] what to include… What to put in the main body. How to link everything back… Like, through the example paragraphs. This is an example of an anecdotal [argument]… (Belle, White Student, Meadow University)
The Impact of the Active Group Marking Exercise on Students' Experiences of Assessment
Testimonies indicated that the AGME was highly effective at developing Black, South Asian and White students' comprehension of assessment: it moved them from being able to complete the smaller, compartmentalised aspects of the assignments (learnt in the modified seminar workshops – see above) to understanding how these self-contained and dislocated aspects joined together to form a coherent narrative in a full assignment.
[The exercise] puts it in perspective. Like oh! That’s a 70 or that’s a 60. And then you could think: ‘Oh there’s the references and that’s something that I can use in my own words'. And then put it in to get that higher mark … So yeah, I think it puts [the whole assignment] in[to] more perspective [and makes what it looks like] more clearer. What they’re looking for. (Hena, South Asian Student of the Islamic Faith, Wiseman University)
Sometimes when a tutor’s explaining something or a lecturer’s explaining something, it can get a bit muddled. I’m a visual learner, so listening is really hard for me … But when I see it … It’s like that makes way more sense to me… I can apply what you said. (Natalie, White Student, Meadow University)
The exercises also appeared to enhance students' ability to understand the module-specific expectations of the assignment and to mitigate against things like inter-marker variables – the different ways of ‘doing’ the whole or aspects of the assignment, which often vary according to the preferences of individual lecturers (such as differences between markers in what is expected of an ‘introduction’ or a ‘conclusion’). This issue was a cause of high anxiety across all student groups.
Being able to look at an example answer, especially with your first assignment. Or even going forward, because it sets the basis of where you should be at, when you’re going further. You’re able to add more information to that because you [see] what’s expected of you. (Sam, White Student, University of Bourne)
I do find example essays and how it’s done well. I find those good, because I find, like, different lecturers expect different things. (Belle, White Student, Meadow University)
Students remarked that the AGME was also particularly useful for helping them to see, learn and know what assignments at different levels looked like. It was also helpful for improving their ability to see and discuss the different ways in which they might approach the task, and the different ways in which assignments were structured.
Students who were unsure of how to approach the task proffered that the exercise helped them to organise their thoughts, structure their ideas and formulate how they might approach the assignment activity. The comments below illustrate some of the ways in which the exercises appeared to be particularly useful when students were confronted with modes of assessment that were completely new, or when the expectations of what constituted a higher-grade response had changed from one year to the next.
I think for me, it definitely helped to structure [my work] and I found [it helped me to see] how it should be laid out. (Dal’, South Asian Student, Wiseman University)
One thing I always struggled with personally is actually writing my ideas on a piece of paper. Like, the style of writing. And when I did it [the marking exercise], I thought okay, this is the difference! This is how they portray their ideas. This is how you’re meant to structure it. (Asifa, South Asian Student of the Islamic Faith, Meadow University)
It’s just [seeing] the ideas that you [can] implement into your work … does help you. Even for me to start my essay ‘cause I never know how to start my essay… Like, I never even know how to end my essay. It’s always the introduction, the conclusion for me [that’s the hardest]. The middle bit I’m fine with. (Sasha, South Asian Student of the Islamic Faith, Meadow University)
The AGME also had a direct impact on improving students' perceptions of their own efficacy to complete their assignment and, importantly, on their confidence in being able to produce higher-level responses.
The overwhelming majority of participants remarked that the marking-to-the-grading-criteria component of the marking exercise challenged their instinctive ideas of what constituted good assessment practice and excellence at the undergraduate level – ideas that were often at odds with best assessment practice as set out within the marking criteria and learnt in the seminar activity.
Some students admitted to having to fight against their original ideas of what assessment excellence looked like when marking previous assignments for the first time (and even after taking part in the active seminar). Their instinct was to score exemplar work much lower, and more harshly, than the module convenor did. In turn, the grading exercises helped to underscore the lessons learnt in the MASW and facilitated a more accurate comprehension of what was required for assignments to score in each grade boundary.
The exercise thus served a triple function. It reinforced a more accurate comprehension of what constitutes higher-level work (learnt in the seminar). It helped students to see that the threshold for higher-standard work was often lower than their initial expectations, which served to reassure them of their own aptitude and ability to succeed. And, importantly, it reassured them that producing a higher-level response was not beyond them.
In turn, the activity had a direct impact on raising student confidence. This appeared to be a particular issue for students of colour and for White students who self-identified as working-class – students who have historically found HE an alien space that runs contrary to their own raced and classed habitus (see Campbell, 2022b).
It was very easy to know where we’d be scoring … [against] those assignments. (Del’, Black Student, Meadow University)
I would say that when we had those essays, it made me feel a lot better about the assignment. Because the ones that I thought were bad were actually quite good. And it made me feel a bit better thinking that, you know, if I was to do an essay to this standard, I wouldn't fail. And I just think it was like a, a big, like, reliever when I read them. I think it was very helpful to, like, understand where I could put myself in terms of those essays (scores). (Millie, White Student, Meadow University)
I just think it helps because I was a bit stressed at first, but seeing someone else’s work and realising it is actually manageable and it can be done, that helped. (Cassie, White Student, Wiseman University)
Lastly, the exercises were particularly effective at improving students' ability to guard against another widely held but inaccurate ‘commonsense’ assumption of what good assessment practice was. Many students of colour remarked that they were the first in their family to go to university and that, when in the Whitened university space, they tended to seek out and socialise with people from similar raced and classed backgrounds. Rehearsing the comments made by the Black and South Asian student participants in Chapters 3 and 4, participants from racialised backgrounds here also commented that they would only seek out their lecturers for assignment support as a very ‘last resort’. Indeed, no students who were Black reported being comfortable going to staff for assessment assistance at all. As such, students here reported that they instead typically drew on kin and friendship networks for answers to issues relating to their assessments – the course WhatsApp group was often a key resource here. In many cases, these networks consisted of people from similar social or raced backgrounds who were equally unfamiliar with how HE and assessment worked (see Campbell, 2020). This meant that, in most cases, the solutions offered to assignment problems were often anecdotal or commonsense.
One popular position originally proffered by students, for example, was that cramming in as much information as possible, learnt during the module, into a single assessment response – what they described as a ‘scatter gun’ or ‘waffly’ approach – was a formula for success. Ironically, of course, the reverse is often the case: that to score highly, students often have to demonstrate depth and not breadth of understanding. However, it is not difficult to see why ‘intuitively’, this approach might be thought to be one which demonstrates high(er) levels of engagement, knowledge and comprehension.
[The Lecturer] showed us one [essay] that was really extensive … And then one that was really short. But the shorter one got more marks! Because it still went into detail, but the [previous] one didn’t go into detail … So, you can kind of tell… Not ‘cause of how big it is, but because it needs to back up your point - Your explanation. So all of that, I feel like it just really helped. (Sasha, South Asian Student of the Islamic Faith, Meadow University)
I think, looking at the different assignments and looking at the different grades that each one got, you compare it to what your writing is like. So, if you read through it and it turns out to be 40%, you know that from the other higher examples, what you sort of need to [do to improve] … your writing, to achieve that high grade. (Francis, White Student, Wiseman University)
I would say that particular exercise when we had to mark the different essays didn’t necessarily tell me what I should do to get a first, but it told me what I shouldn’t do to not get a first, if that was the right way to say it? … It didn’t show me what I needed to do, it showed me what I didn’t need to do. (Stevie, Black Student, Meadow University)
Before, I used to waffle in my introductions. And I just found out that’s not what you need to do! (Sonya, South Asian Student of the Islamic Faith, Meadow University)
Perhaps above all else, these final examples illustrate how the AGME directly enhanced students' ability to mitigate against these kinds of miscalculations. It also provided a direct reference point for students to recognise the strengths and weaknesses in their own assignments and assessment practice.
Some Concluding Comments: The Contrasting Experiences of Assessment Between Racialised Students on Modified Modules and Students of Colour on Non-Modified Modules
The testimonies throughout clearly demonstrate the cross-racial impact of the intervention for improving the assessment experiences of students from all three racial backgrounds examined. Importantly, they provide an empirical account of how each resource contributed to a more transparent experience, which benefitted and enhanced all students' understandings of when to start, what to do, how to do it and what assessment success looks like. However, I want to finish by showing how the intervention also made a noteworthy and specific difference to the assessment experience of domiciled undergraduate students from global majority backgrounds, when contrasted with students of colour who were not on treated modules.
Students of colour who were on modified modules were noticeably keen to press that the intervention had made them more clear-eyed and confident about what assignments at different levels required and looked like.
I think we both kind of knew the ballpark of where we were gonna get all of our grades [when I submitted] … I think it was very much like, like I knew where my weaknesses were immediately, kind of thing. So I was like, okay, have I developed the point fully? Like, I think I could tell where I’d done well and I could tell where I hadn’t done as well… And it’s like, have I answered this the best way? Maybe I haven’t referenced enough or I haven’t developed this point fully, etc. So I think it was made very clear. I think that’s the reason why the grades weren’t a surprise … (Dionne, Black Student, University of Bourne)
I think that when the paper laid out what the grading system was [taught to us], it helped us a lot, too. 'Cause, we didn't even know how our assessments were going to be graded until we came to that seminar. (Ranjeet, South Asian Student, University of Bourne)
So, yeah, I like everything on there [the RIPIAG] to be fair. So it does help us in that sense. So, we don’t need to go to him [the lecturer] ‘cos he’s already provided it. (Asifa, South Asian Student of the Islamic Faith, Meadow University)
When you get a grade that isn’t what you think relates to your effort… [We now know] that it’s not because the lecturer doesn’t [like you]… You can actually start to see how and why you got that grade … (Dee, Black Student, University of Bourne)
The accounts of the students of colour in our sample clearly illustrate how the increase in the transparency of the assessment process brought about by the RIPIAG was transformative. It meant that Black and South Asian students better understood how they were being assessed and what was required to achieve desired grades. It also facilitated a stronger sense of trust between faculty and students of colour.
This was in stark contrast to the experience of Black and South Asian students on non-treated modules who, as we discussed in Chapters 3, 4 and 5, were often unaware of exactly how they were assessed. This situation led to these students of colour having to speculate when assessment outcomes did not match their effort, which often fostered feelings of racial foul play and distrust (see Campbell, 2022).
Lastly, the testimonies below remind us of the multiplicity of wider and acute anti-Black challenges and barriers in HE that impact negatively on Black students' chances of achieving outcome parity in assessment.
When it comes to academic writing, like I need a lot of reassurance, like I think it might be just me being anxious … Yeah, it [the RIPIAG] helps. But it’s not the only [thing I need] … I need more! If that makes sense? (Nisha, Black Student, Meadow University)
[The module convenor of the modified module] is the only one, or one of the only lecturers, that actually supports you. In terms of tries to explain things in different ways because he understands, or I can only assume that he understands, that not everybody understands academic language the same way … And he’s one of the only people that will do it! So if you wanna say dumb it down, yeah, he does dumb down the mark scheme for us… Whereas I've had [other] lecturers that you can tell from their background, that they don’t have that ability to, I'm not gonna say dumb down again, but … to make things transparent. Because that academic language is their normal language… (Del’, Black Student, Meadow University)
Importantly, these accounts also showcase some of the limitations of the intervention's ability to mitigate against the anti-Black inequities that exist outside of modules and that contribute to stymieing Black students from achieving grade-outcome parity with other raced students and White peers: a lack of Black role models in faculties and in leadership, for example, or student well-being services that have historically struggled to ‘reach’ Black students when they experience mental ill-health (Boustani, 2023). Black participants asserted that if the academe was serious about trying to eliminate RAGs, then it needed to address these wider anti-Black barriers too.
I think if those … [RIPIAG] resources and the help we get in this module was [course]-wide, it’d be really useful. But I think again, [the ability to do well in assessments is also] based on your attendance or based on… how [well] you know your [and get on with your] lecturer, as well! (Del’, Black Student, Meadow University)
I think yeah [the RIPIAG] increase[d] my confidence a bit … I think it’s good. It most definitely will help if it was all across [all our modules]. But I don’t think that’s the only thing that needs to be included! (Kerri, Black Student, Wiseman University)
[The intervention is helpful] 'cause, in a sense, you don’t have to worry about not understanding what you need to do [in the assessment] if that makes sense … It takes away one struggle! 'Cause now you just have to worry about understanding your course and translating that into a first… So, yeah, it just takes away that issue [but not all of them]. (Nisha, Black Student, Meadow University)