Impact by design: planning your research impact in 7Cs

Niall Sreenan (The Policy Institute, King's College London, London, United Kingdom)
Saba Hinrichs-Krapels (The Policy Institute, King's College London, London, United Kingdom)
Alexandra Pollitt (The Policy Institute, King's College London, London, United Kingdom)
Sarah Rawlings (The Policy Institute, King's College London, London, United Kingdom)
Jonathan Grant (The Policy Institute, King's College London, London, United Kingdom)
Benedict Wilkinson (The Policy Institute, King's College London, London, United Kingdom)
Ross Pow (Power of Numbers, Cambridge, United Kingdom)
Emma Kinloch (The Policy Institute, King's College London, London, United Kingdom)

Emerald Open Research

ISSN: 2631-3952

Article publication date: 9 December 2019

Issue publication date: 12 December 2023

Abstract

Although supporting and assessing the non-academic “impact” of research are not entirely new developments in higher education, academics and research institutions are under increasing pressure to produce work that has a measurable influence outside the academy. With a view to supporting the solution of complex societal issues with evidence and expertise, and against the background of increased emphasis on impact in the United Kingdom's 2021 Research Excellence Framework (REF2021) and a proliferation of impact guides and tools, this article offers a simple, easy-to-remember framework for designing impactful research. We call this framework “The 7Cs of Impact” – Context, Communities, Constituencies, Challenge, Channels, Communication and Capture.

Drawing on core elements of the Policy Institute at King's College London's Impact by Design training course and the authors' practical experience in supporting and delivering impact, this paper outlines how this framework can help address key aspects across the lifecycle of a research project and plan, from identifying the intended impact of research and writing it into grants and proposals, to engaging project stakeholders and assessing whether the project has had the desired impact.

While preparations for current and future REF submissions may benefit from using this framework, this paper sets out the “7Cs” with a more holistic view of impact in mind, seeking to aid researchers in identifying, capturing and communicating how research projects can and do contribute to improvements in society.

Citation

Sreenan, N., Hinrichs-Krapels, S., Pollitt, A., Rawlings, S., Grant, J., Wilkinson, B., Pow, R. and Kinloch, E. (2023), "Impact by design: planning your research impact in 7Cs", Emerald Open Research, Vol. 1 No. 3. https://doi.org/10.1108/EOR-03-2023-0007

Publisher

Emerald Publishing Limited

Copyright © 2019 Sreenan, N. et al.

License

This is an open access article distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Background

Overview of the 7Cs

In a world facing increasingly complex societal, economic, political and environmental challenges, academic research and research institutions have the potential to make important, innovative contributions to society (Grimm et al., 2013; Benneworth and Cunha, 2015). At the same time, the broader institutional and economic context in the United Kingdom (UK) demands that those contributions be identified, assessed and measured (Wilsdon et al., 2015). As a result, much debate on the value of research impact has emerged, while numerous tools and sometimes highly detailed guides promise to help researchers and institutions maximise and measure the impact of their work. It is against this background of societal and institutional demand, and of the proliferation of advice and discourse, that the authors of this paper seek to make an intervention. Here, we introduce our key set of principles and tools for researchers, the “7Cs of Impact”: Context, Communities, Constituencies, Challenge, Channels, Communication and Capture (Figure 1).

Drawing on the core aspects of training courses delivered online and in person at King’s College London, many of them by the authors of this paper, these seven key principles offer a simple but effective set of concepts and tools for researchers who seek to have a positive influence through their work, and help them operate effectively within the broader institutional context of research impact assessment.

This paper begins by describing in further detail the background against which it is set, as well as its objectives and drivers. It then describes each of the 7Cs in turn, outlining how each principle applies to a range of stages and aspects of a research project and its design. Throughout, examples of how these principles apply in practice are illustrated with a hypothetical research project, offering researchers a further aid in their project planning. The paper closes with some concluding remarks on the 7Cs framework, reflecting on the need for flexibility and adaptation of the model.

We also acknowledge that serendipity can play a significant part in the outcomes and impact of any research project. However, simply because some of the outcomes of a project lie outside our ability to plan does not mean we cannot still plan to maximise the chance of impact. Planning can and should allow for the role of chance: researchers should be able to seize opportunities as they arise, while also ensuring that engagement and dissemination activities are focussed on the goals of the project rather than scattered ineffectively. Through this framework we make the case that it is possible to design impact into a research project from the outset through a set of planned activities and engagements, rather than leaving impact as an afterthought or the product only of luck.

A brief history of impact

The idea that academic research can lead to societal, economic or cultural benefits beyond academia – what is now widely called simply “impact” – is not new. Despite universities' reputation as “ivory towers” (Shapin, 2012), the knowledge they and similar institutions produce has always exerted influence on the societies, cultures and economies in which they are embedded – from medieval sites of legal, medical and theological learning in Western Europe to the modern world, where universities seek to encourage research and innovation to meet critical economic and societal demands. The first studies to methodically examine the contribution research makes to society date back to the 1960s and 1970s, while the UK’s national Research Assessment Exercise, which measured research quality (initially purely in terms of academic contribution) in order to determine funding allocation, was inaugurated in 1986 (Marjanovic et al., 2009).

Nevertheless, efforts to assess, qualify, quantify and reward the impact of academic research on extra-academic spheres of societal activity have grown in significance in recent years (Williams and Grant, 2018). In the UK, this has been spearheaded by the Higher Education Funding Council for England (HEFCE), known since 2018 as Research England, which in the 2014 Research Excellence Framework (REF) required research organisations to submit “Impact Case Studies” in addition to traditional research “outputs” such as journal articles and research monographs (Higher Education Funding Council for England, 2011). ‘Impact’, for REF2014, was (and for REF2021 still is) defined as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’, and was included as a key criterion for evaluating the research output of higher education institutions in the UK (UK Research and Innovation, 2019).

A significant amount of analysis in the UK has debated the value of this development, ranging from criticisms of ideologically driven managerial surveillance (MacDonald, 2017) and critical analysis of the accuracy of so-called “Impact Case Studies” (Khazragui and Hudson, 2015) to defence of the importance of evidence-based forms of accountability for university research and its impact (Wilsdon, 2015). However, while disagreement about the ultimate value of the “impact agenda” seems endemic, the agenda looks set to continue undeterred. In Research England’s REF2021 exercise, assessments of research impact will account for 25% of the overall assessment that determines the quality-related (QR) research funding allotted to universities (Kerridge, 2018)1. This means that, in order to ensure continued funding for research, UK institutions and researchers now and in the future need a practical understanding of how impact works and how it can be achieved.

In addition to debate on the value and implications of this cultural shift towards impact, there are now numerous guides, services, tools and institutional arrangements designed to help researchers familiarise themselves with impact mechanisms and terminology, and to empower them to undertake impactful research. These include the creation of new professional services roles – “impact officers”, now established in universities across the UK, who support researchers in research design and in writing effective impact case studies; research impact networks for these officers and related administrators; written and online guides (Denicolo, 2014; Economic and Social Research Council, 2019; Reed, 2018; Tilley et al., 2018); as well as private companies, such as Impact Tracker, Researchfish, Kolola and ImpactStory (Reed, 2015; Vertigo Ventures, 2019), which provide digital tools and guidance to support researchers and funders in the capture and evaluation of impact. These are the contexts in which this paper introduces the “7Cs” principles for impact. Although these developments deserve debate and deep analysis in themselves, our aim here is to cut through some of the complexity of the discourse around impact and the related array of resources with a simple, but effective, set of principles that allows researchers to integrate impact into their work from the beginning of a project.

Impact beyond the REF

The fundamental reason we offer the framework in this paper is our shared conviction that research can and does benefit society beyond the immediate context of academia or education. This is borne out by a major study from 2015, conducted by researchers at the Policy Institute at King’s College London in collaboration with Digital Science, which undertook a quantitative and qualitative examination of all 6,679 non-redacted impact case studies submitted to REF2014. The study analysed the diverse nature of research impact and the disciplinary origins of impact, and identified its major beneficiaries in areas such as policy, health, culture and industry, and in specific geographical regions (King’s College London and Digital Science, 2019). The final report recognises the increasing importance of impact assessment in the UK for universities and funders, providing valuable insight into impact case studies as well as identifying some of the key problems with the REF’s assessment methodologies. At the same time, the report looks far beyond the REF, testifying to the transformative and ameliorative potential of research for society at large and revealing the current emphasis on impact as an opportunity to position research as a key driver of significant social, economic and cultural change.

The framework set out in this paper is intended to be simple and free to use, and comprehensible for researchers at all levels of experience. As the synthesis of the impact case studies (2019) demonstrates, designing and delivering research impact is complex, non-linear and unique to each research project. For that reason, while impact officers and longer guides offer excellent support for researchers, we recognise that it is important to empower (often time-poor) colleagues to imagine, articulate and plan for the impact of their work themselves. That combination of complexity, specificity and time pressure demands a simple – but not simplistic – approach. This is why the framework we outline has been articulated succinctly as the 7Cs: an easy-to-recall, concise but effective set of concepts and tools to allow researchers to embed impact within their work. We claim no significant originality for any of these principles, as many readers will recognise some or all of them from their own practice or from similar guides. What we offer here, however, is a description of how each of these principles works, individually and in concert, as well as in practice through a hypothetical example provided throughout.

Finally, while the Research Excellence Framework functions as an exogenous institutional and financial impetus to focus on impact, our conviction is that a focus on research impact – holistically understood – offers academic researchers the opportunity to tackle highly complex societal, economic, technological and other issues. One model for our ethos is that of “public sociology”, which seeks to close the gap between an engaged sociological ethos and sociological methods, by engaging with wider non-academic groups and publics (Burawoy, 2005). Without care, an analogous gap between the ethos of research impact and its methods can open up, if those with knowledge of how to benefit society through research fail to communicate their knowledge to their various publics. While the Impact by Design training course and related workshops on which this paper is based reside largely within the walls of our institution, this article represents an opportunity to make freely available the core principles of this training and our collective experience, hopefully to the benefit of many more communities.

Learning through practice: developing the 7Cs model

The framework offered here has been derived from practical experience in promoting impact through institutional capacity building; specifically, through the design and delivery of Impact by Design, an online and in-person training course open to all staff at the university, together with face-to-face workshops and impact clinics designed to supplement the online learning material. Designed by a number of this paper’s authors, the online Impact by Design training course has been running for just under two years (as of July 2019) and provides a series of short online video classes as well as practical exercises for researchers and professional services staff. It has attracted over 300 enrolments since beginning in 2017. Some of the principles outlined in this paper are based upon the experience of creating the course and its content, and on informal peer-to-peer feedback.

Consistent themes across all forms of feedback have included the importance of clarity and succinctness when discussing impact, and the value of a compartmental or modular approach to impact project planning. Breaking down the nebulous concept of “impact” into concrete practices is seen as useful, and communicating effectively for the purposes of impact was identified as a key practical lesson for participants – one that, as we shall see, also involves a number of the other Cs outlined here.

In combination with online training, the authors of this paper have provided researchers at King’s and elsewhere with workshops, clinics, one-to-one advice sessions and many other forms of pedagogy, skills training and development for impact2. In these sessions, the principles we have outlined are explored more deeply through dialogue. This paper reflects on the lessons we have learned from our work as “impact practitioners”, which we now seek to share with a still-wider community of researchers and research support professionals.

The 7Cs of Impact explained

The concept of “impact literacy” stresses the multiple skills required for researchers to make effective and timely research impact, the differential levels of literacy across the sector, and the broader set of institutional conditions that support researchers in creating impactful work (Bayley and Phipps, 2019). The 7Cs framework is aimed at the individual researcher (or research team) and seeks to support impact literacy by offering an applied framework, applicable across all levels of skill and all disciplines. The authors of this paper have found that rather than emphasising abstract skills or knowledge for creating impact, researchers can benefit from asking themselves a series of questions, which we have divided into seven categories or principles (as outlined in Figure 1). Each of these questions is intended to clarify key objectives of any research impact project as well as to crystallise the ways in which that project can be constructed.

These principles do not separate “objectives” and “design”. Rather than linger on theoretical distinctions, our framework is intended to be pragmatic, focusing instead on fundamental building blocks of research impact (such as the communities you wish to reach, how you communicate, and the context in which you operate). For each building block, design and objectives are intertwined and mutually beneficial: effective research design is more likely to achieve its objective, while clearer objectives lead to better design. While we encourage researchers to begin to think about and apply these principles from the very beginning or planning stage of a research project, they can be applied at later stages of a project too.

In the following section, we describe each “C” in further detail and incorporate them into a worked example of a hypothetical research project; in this case, research to help improve how children and young people with mental health problems can be identified and referred by their general practitioners (GPs) (Box 1).

Box 1.

Hypothetical research project

Improving referrals to specialist mental health services for children and young people

While tools and resources exist for identifying adult mental health problems, there is a dearth of resources to do this for children and young people. GPs are usually the first gateway to identifying mental health problems, but studies have shown that they lack the training, skills and tools they need to do this effectively. The aim of this research project is to provide tools to help GPs identify mental health problems in children and young people, and to then refer them to the appropriate specialist services. The tool consists of a set of ‘trigger questions’ to keep in mind if a child or young person shows any signs of requiring mental health support. The research will trial the tool prototype in fifteen GP practices based in London, Manchester and Birmingham.

Context

What are the wider environmental, political, social, technological, legal and/or economic contexts to which your research may be relevant?

The fundamental shift in thinking required when considering societal, as opposed to academic, impact is to place the research in defined external and societal contexts. This requires researchers to think beyond the immediate academic, institutional and disciplinary setting, and to examine how variables outside the academy will affect the research in question, as well as how that research could make a difference in certain contexts. While some researchers may already think this way about their research, it is nevertheless helpful to focus this analysis by using specific frameworks for identifying and examining wider contexts. One such framework is the “PESTLE” analysis, a widely used tool in strategic and corporate planning, which examines the following external factors or contexts: political, economic, sociological, technological, legal and environmental3. For this exercise we ask researchers to consider how each of these external factors relates to their research project. In some cases, all of these factors will affect the project question; in others, only some will be relevant.

Additionally, we encourage researchers to describe the relation of their research to these PESTLE factors in terms of strengths and weaknesses, combining the PESTLE framework with a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) in order to provide a detailed picture of how the contextual factors identified could affect the project (Figure 2). This combined framework describes how the external environment (PESTLE) creates both opportunities and threats for research impact, and helps articulate the strengths and weaknesses of the project in relation to these. Usually, “strengths” and “weaknesses” relate to internal factors, such as the project team, resources or limits to knowledge, while “opportunities” and “threats” are usually external factors (such as timing or competitors) to which the project can respond.

A worked example relating to our hypothetical research project (Box 1) is shown in Figure 3. The figure is annotated with both questions and comments to help the research team contextualise their project and think more broadly about the implications of their environment on their project and vice-versa.

Communities

Who are the communities and beneficiaries of your research?

Arguably the most important ‘C’ to consider is the community or set of communities we are trying to influence and impact through our research. This refers to the beneficiaries of the work, which can be individuals, groups of people, organisations or institutions. Ask yourself: who benefits, profits, is better off or has experienced a change as a result of adopting, interacting with, engaging with or using your research? In some cases, communities will be impacted by and benefit immediately from the research project, as is the case in projects where there are benefits to patients taking part in trials or to visitors to museums that use research to inform curation and exhibitions. In other cases, impacts on communities will manifest themselves at much later stages of a project, as is often the case with technology transfer or with less linear forms of cultural impact. It is possible too that a project can have both short- and long-term impact, with some influence becoming clear early on (in the case of co-created or action research), while the ultimate outcomes and benefits of that research may not become clear until later. This highlights the importance of understanding the broader “impact pathway” or longer-term vision for the project.

In all cases, the crucial lesson is for researchers to be as specific as possible when identifying the communities that can and will engage with or benefit from their research. If the project is likely to affect health care workers, what type of workers will they be and in what roles? Where will they be located? And in what areas of their work will they feel the benefits? The more specific the vision of the target community, the more refined the research and impact design can be, maximising the possibility that the researcher and the research will achieve their ultimate goal.

In our hypothetical example (Box 1), the long-term beneficiaries are the children and young people who ultimately need to access mental health services in the UK. Within the immediate bounds of the research project, however, it will initially be the GPs taking part in the study who benefit, through an enhanced ability to identify mental health problems – and potentially the children and young people in the fifteen sites who are then successfully referred as a result of these GPs using the tools.

Constituencies

Who has a (positive) interest in your project and can influence change?

Constituencies are defined as individuals or organisations that have an interest in the research and the potential to influence change; they can be conduits to effect change and make an impact. Generally, they have positive attitudes towards your research, or at least the potential to develop sympathy towards it, and can help you make a difference to the sector you are studying.

It is important to note that constituencies might be the same individuals or organisations as communities, but this is not always the case. An intervention to help teenagers access mental health services, as in our hypothetical example, will have those teenagers as the ‘community’, but the constituencies to effect change might be clinicians, practitioners or care workers, depending on the type of intervention or research being conducted. By contrast, a project that seeks to make a change in policy may involve civil servants as advisors to the project, inviting them to be constituencies for change, but they may also be among the ultimate beneficiaries of the research once findings are implemented as policy change.

Constituencies can be represented on an influence/interest matrix (Figure 4) – a model adapted from stakeholder theory (Mendelow, 1991). This is useful because it helps us prioritise our constituencies and develop plans to keep them engaged; there is simply not enough time to keep everyone involved and interested, as cultivating relationships takes time and effort. In this diagram, the most important people, groups or institutions are those in the bottom right corner (box D): those with significant interest in the work and the capacity to influence change are our key stakeholders. Those with a lot of influence but perhaps not much interest right now (box C) are also important to keep satisfied and engaged, and over time they may move from C to D. There may be others who are very interested but do not yet have the power to effect change (box B); these are kept informed of the work.

In our worked example, ‘the public’ may be in box A, since many people are not interested in better mental health referrals for young people (or at least it is not a priority for them) or do not have the influence to make a change. Patient advocacy groups, however, may have more influence and be placed in box D, alongside key champions of the work, such as an influential individual clinician or clinical group. Perhaps there is a member of parliament with an interest in mental health who is currently in box C (due to other conflicting priorities), and so we need to engage with them, inform them of our project and aim to move them to box D.

Challenge

What is the situation or challenge you will solve through your research?

In order to get key stakeholders or constituents to pay attention to the answers that your research produces, it is important for them to feel ownership of the question that you are addressing. To put it another way, it is much more likely that people will act on the results of your work if they feel it provides a solution to a challenge they consider important.

One effective way to develop this question is to use a Situation-Complication-Question process (Minto, 2009). The first step is to work with your constituent group to define the status quo or ‘Situation’. Working with our hypothetical example of mental health diagnosis for young people, an example Situation might be: “GPs are a key gateway for patients that require support for mental health”. Into this comes a ‘Complication’ – some form of challenge or problem: “GPs are a key gateway for patients that require mental health support, but they are not equipped to make referrals to mental health services for young people in particular”. At this stage, you are seeking to build consensus with your stakeholders, producing a Situation and Complication that they recognise and in which they are invested. If at this stage your constituents remain unconvinced, perhaps you need to return to your model and refine it or look for more evidence. This avoids creating a “Challenge” (Situation + Complication) in which they feel no sense of ownership; any subsequent answer to that challenge would, under those conditions, be immediately contested.

The final step is to find a suitable question: “GPs are a key gateway for patients that require mental health support, but they are not equipped to make referrals to mental health services for young people in particular, so the question is …?”. The trick at this stage is to recognise that no single question automatically follows, even from such an apparently simple set of conditions. There are dozens of potential questions: “How can we provide tools for GPs to effectively diagnose and refer young people?”, “What is the economic cost of creating a training programme for GPs?”, “How can we inform young people and parents about mental health?”, or even “Is it a good idea for GPs to be such an important gateway for referrals at all?”. Finding the right question depends on the perspectives and interests of your constituents. A healthcare policy maker will have different priorities, and different views on the best questions to ask, compared to a charity operating in the field, a social worker or a government finance minister. By engaging with your particular constituents, you can discover the question in which you all share an interest.

Channels

What channels will you use to reach your key constituencies?

Researchers have a wealth of channels available to promote their work and its key messages in order to make an impact. These range from free, public channels, such as blogging and social media, and traditional forms of media, such as print or broadcast journalism, to more targeted channels, such as policy reports or solution-focused workshops and round-table events. In general, no one channel is superior to the others, as each has its advantages and drawbacks.

Social media, for example, can be highly effective in reaching a wide audience and is entirely free to use. However, using social media effectively is a skill that requires nuance and commitment, without which your message can get lost in an already crowded field. Similarly, while policy reports that target specific influential stakeholders offer the opportunity to reach an important audience, they can be time-consuming and expensive to create and deliver. In all cases, researchers should think carefully about the advantages and disadvantages of each channel they use to promote their work and make impact, and should choose their channels strategically, based on the specific set of constituencies or stakeholders relevant to their project.

In our hypothetical example, our key constituencies are patient advocacy groups and influential clinicians. We might consider involving these groups in our research from an early stage by organising a series of workshops or round-table events. This could lead later to a report or briefing that could be used to reach and influence parliamentarians with an interest in mental health policy. Social media could be used to keep the public informed of the research – in the interests of engagement and transparency – but might not be the primary channel for making impact. The choice of channels should be linked to the constituencies identified for each project and a number of different channels can be used together to achieve our aims.

Communication

What style, tone and structure are needed to get your main message across? How do you ‘untrain’ the typical academic way of writing?

As academics or researchers, we are trained to write in a technical, specialised way that is often highly specific to our discipline or academic community. This influences not only our use of specialist terminology or conceptual language, but also how we structure an argument. Normally, we begin with an introduction, followed by a methodology or theory section, before providing some facts or evidence, analysis and a conclusion or set of recommendations. However, when communicating with communities and constituencies outside academia, it is crucial not only to use less technical language, but also to invert the usual structure of our arguments. This means beginning with our recommendations and conclusions, followed by the analysis and the facts or evidence (with the methods perhaps included as an annex). While for many this might seem counterintuitive, this style of communicating is commonplace in government reports, in papers by think tanks and consultancies, and in the media. Re-structuring an argument in this way ensures your audiences cannot miss the core conclusions of your work and will immediately understand their importance to the area you are trying to influence.

Another key element of communicating beyond academia is narrative, or storytelling. It is important to try to attract your audience’s attention: is there something about your research that is counterintuitive or unusual? Or is there an element of your work that resonates strongly with your core audiences? Secondly, in communicating your research you should seek to appeal to your audience’s desires or interests. Can your research help your audience save money, achieve a policy aim or solve certain problems? Can it help to clarify a complex or controversial idea for large audiences? Can it help to treat or cure illness or produce new technology? Only once you have grabbed your audience’s attention and piqued their interest should you set out the reason, the logic and the rationale for your ideas.

In the case of our hypothetical example, the mental health referral process might seem like a specialist subject for health policy experts and practitioners and difficult to communicate. However, some key statistics or facts about the challenges of GP referrals might help grab attention. For example, communications could lead with the consequences of inaccurate referrals on young people’s lives or on the economic cost to the NHS. This holds attention from the beginning and emphasises the value of the research project. The narrative could also focus on the potential of the research to improve this situation. Here we see the importance of understanding your audience: for a budget holder, communications could focus on the cost-saving potential of the research; for charities or advocates, emphasising the potential benefits to patients might be key.

These techniques are only a few of a whole host of rhetorical and communications tools you can use to maximise the impact of your work. One way of learning how your audience communicates and likes to be communicated with is by familiarising yourself with the publications, channels and conversations that happen within those groups, a process that can begin by engaging stakeholders early on in your project.

Capture

How will you demonstrate your impact?

“Capture” refers to having the evidence, indicators and/or measures that demonstrate that impact took place as a result of your research and, for many, it is one of the most challenging parts of planning and reporting on research impact. The types of indicators used will vary according to project and discipline, and it would be too prescriptive to try to describe the ‘right’ or even ‘appropriate’ ones to use. The analysis of the REF2014 impact case studies highlighted just how diverse the types of impact from UK research are, suggesting that standardising them would require far too many metrics. We can, however, give examples of the types of indicators that have been used by others reporting on impact. These can serve as inspiration for ways in which you can locate the data that will prove the impact of your research, providing solid, factual foundations to underpin your contribution stories.

The analysis of the REF2014 impact case studies provides a good source of examples of indicators. Researchers working in the life sciences and medicine (‘panel A’ in the REF2014 assessment process) described impacts such as improvements to patients’ lives, evidenced by qualitative narratives from clinical staff or from patients who experienced improved care delivery. They also reported that their research was cited in clinical guidelines, or that it was adopted in practice or policy. Researchers from the natural sciences and engineering (panel B) and the social sciences (panel C) reported impact through commercialisation activities, such as spinouts, patents or licences produced from the research, as well as adoption in policy and practice, as panel A had done. Many in the social sciences also described changing the nature of public discourse on a topic – something that is very difficult to measure or quantify, but which can be evidenced, for example, through analysis of media mentions of a topic. Engineering and IT-based researchers also described how they supported industry, for example by providing a new manufacturing design protocol or framework. For researchers in the arts and humanities (panel D), impact was reported through indicators such as endorsements from creative industry bodies such as BAFTA, or critical reviews. They also provided measures such as footfall, downloads and non-academic dissemination of work and, again, changes in perception or public opinion – for example, where visitors change their view on a topic after visiting a museum exhibition curated from the results of the research. These roughly fit into the categories identified in Table 1.

Whatever indicators are chosen, it is important that they are directly relevant and important to the project itself, rather than just focussing on what is easily counted. It is also important to note that indicators provide signals of impact; they do not provide a comprehensive assessment of the full range of impacts or of the many factors that contributed to them. The Metric Tide report on the role of metrics in research assessment notes that “carefully selected indicators can complement decision-making, but a ‘variable geometry’ of expert judgement, quantitative indicators and qualitative measures that respect research diversity will be required” (Wilsdon et al., 2015).

It is also critical to record your impact, and any experiences that may have led to impact, along the way. This may come in the form of blogposts you have authored, a significant mention of your work or profile, or, indeed, more traditional sources such as citations in guidelines or parliamentary reports. All of these are important and help to construct the narrative – to tell the story – of your impact. Some research funders already ask for such examples through their regular reporting platforms, such as Researchfish, or through annual reports, but our recommendation is to keep a simple diary that collects information on impact regularly – whether it is a collection of emails or files on a computer – so that these can easily be accessed when reporting on impact later on.

Bringing it all together: The 7Cs in action

Each of the 7Cs describes key tools and considerations for any researcher seeking to create impact through their work. In some projects, all seven will be key; in others, some aspects of the above principles will be more significant, and others may not apply at all (or may have very straightforward answers). It is important, therefore, to examine how these seven principles play out in practice, to understand how the application of these tools varies and must be adapted from project to project, and how they work in concert with one another as a set of interdependent good practices.

Increasingly, funders require researchers to demonstrate their plans for impact by including a statement on ‘pathways to impact’ in funding proposals. These can take different forms. Some researchers choose to list their beneficiaries (‘communities’ in the 7Cs), state their anticipated benefits, and plan the associated engagement and communication channels as part of their research project. Another method is to map how impact can occur by using each ‘C’ to chart the journey from research to impact (Figure 5). From experience, we have found that course participants often find it helpful to work from right to left: to start by imagining their work’s ultimate potential impact and then to work backwards from that point to identify a series of steps through which to achieve their goal. Mapping out who our work will ultimately benefit (communities) and the wider political, social and cultural environment in which we operate (context) allows us to envisage more clearly where we see our research going and who it might reach.

A word of caution is needed here too. While this framework can help establish an ultimate vision and context (and perhaps even inspiration) while writing a grant proposal, only certain elements of a project’s activities and their final impact can realistically be achieved and described in an impact pathway statement, particularly given the time and resources available within a given project’s timeline. The main focus of an impact pathways statement tends to be on articulating the impact and engagement activities – including the ‘constituencies’ that will act as agents of change, the ‘channels’ and ‘communications’ tools used to engage them, and the processes through which they will be engaged to explore the common research and impact ‘challenge’. Including some content on how the success of impact will be ‘captured’ can be useful to give reviewers confidence that there is a plan in place to record the impact that takes place.

The above is written with the caveat, and the authors’ strong belief, that impact is not a linear, predictable process, so the pathway from impact activities to ultimate impact suggested in Figure 5 is not intended to be a predictive or rigidly prescriptive model. We are aware that impact beyond academia can take an unexpected course and, as researchers, we should allow for the serendipity of impact. Illustrating the impact pathway in this way provides a foundation to focus on those impact and engagement activities (the main component of an impact plan) that will directly support the vision for the research project.

Concluding remarks

Creating research impact is a complex, often non-linear process that involves multiple, shifting variables and contexts, and its ultimate success depends on input and support from external communities, constituencies and decision makers over which we often have no final control. No two projects are the same and each requires a bespoke approach. With that in mind, we would be mistaken to offer this model as a prescription or universal tool; instead, it should be adapted, refined and tailored according to the needs of each project. Nor is it intended to be predictive in any way; rather, it is a way of focussing a project plan on activities that will help generate the events that can lead to impact.

However, experience has shown us that applying a simple, consistent set of basic principles – the 7Cs – can go some way to producing positive impact outcomes, even in varying contexts, projects and disciplines. While not every research project that applies these principles is guaranteed to make significant impact, in our experience, research projects with successful impact will have used these key principles effectively.

We encourage researchers interested in maximising the impact of their work to supplement our 7Cs approach (or any framework) with some further knowledge of what has worked for other projects. The REF 2014 Impact Case Studies database (REF 2014) is a valuable resource for this kind of knowledge, and can be searched by area of impact, discipline and a range of other filters. This allows you to find discipline-specific examples of projects that have successfully created impact and to begin to understand, in addition to the abstract principles outlined here, how the impact pathway functions in concrete reality.

We offer the 7Cs as a free, easy-to-use framework, designed to help researchers think in a focused and realistic way about creating impact and how it can be achieved. It is suitable not just for researchers undertaking their first “impactful” research project but also for those who wish to refine and re-examine their own methods.

And just as we encourage researchers to refine their approach to impact, we seek to refine our own models, methods and theories, by examining our principles against practical examples of what has worked – and what has not. In that light, the 7Cs are themselves open to development and improvement and the authors of this paper would warmly welcome feedback, as well as examples of how you have used this framework.

Data availability

No data are associated with this article.

Publisher’s note

This article was originally published on the Emerald Open Research platform hosted by F1000, under the “Quality Education for All” gateway.

The original DOI of the article was 10.35241/emeraldopenres.13323.1

Author roles

Sreenan N: Conceptualization, Writing - Original Draft Preparation, Writing - Review & Editing; Hinrichs-Krapels S: Conceptualization, Investigation, Writing - Original Draft Preparation, Writing - Review & Editing; Pollitt A: Conceptualization, Writing - Original Draft Preparation, Writing - Review & Editing; Rawlings S: Conceptualization, Writing - Original Draft Preparation, Writing - Review & Editing; Grant J: Conceptualization, Writing - Review & Editing; Wilkinson B: Conceptualization, Writing - Original Draft Preparation, Writing - Review & Editing; Pow R: Writing - Original Draft Preparation; Kinloch E: Conceptualization, Writing - Original Draft Preparation, Writing - Review & Editing

Grant information

The author(s) declared that no grants were involved in supporting this work.

Competing interests

No competing interests were disclosed.

Reviewer response for version 1

Barend van der Meulen, University of Twente, Enschede, Netherlands

Competing interests: No competing interests were disclosed.

This review was published on 07 January 2020.

This is an open access peer review report distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

recommendation: approve-with-reservations

This is more a conceptual article than a research article, based upon the course material developed and tested at several workshops in the UK. This is both a strength and a weakness, as it creates clear focus, but might limit the usefulness beyond the specific "impact context" as has been created in the UK over the past decade. This context though is well described and thus open to assessment for readers from other contexts.

Another strength of the article is that it is clear in its intent, that is to provide readers with a tool to design, develop or improve their impact strategy. Whether for all readers it is helpful that design and objective are interwoven, I am not sure. Even without moving into theoretical distinctions, one may easily assume that readers may have different objectives and that the ultimate design of an impact strategy will depend on these objectives. As the article provides only one example (on which more below), the blurring of objectives and design suggests that there is only one approach to improving impact strategies. My own expertise and experiences in research projects and workshops tell me that it makes a difference whether impact strategies are developed at eg. institutional or individual level, for accountability, acquisition or learning, or for (basic) research primarily aimed at understanding or (challenge driven) research aimed at change.

I also do appreciate the graphics and the clear tables and figures, which are helpful for the reader and reflect the origin of the 7C scheme as learning material.

The authors have made it difficult for themselves by trying to look for 7 C words. That might be attractive as a communicative tool. For a good understanding of the seven aspects, as one expects from a scientific article, it is less helpful. I found especially the explanation for communities and constituencies unclear. I could imagine that communities refer to a broader set of stakeholders, while constituencies refer to those directly involved. But the stakeholder matrix introduced brings in all communities as constituencies. Either the authors should merge the two under one heading (and be happy with 6C), or make the distinction between the two more clear.

In the explanation of both concepts, the emphasis is on the benefits. That is a pity, as the context analysis also allows one to identify actors that will face costs (in any form) from the research findings or their implementation. Neglecting these costs in the impact strategy may actually hamper full impact in later phases. There are a lot of examples these days in which research findings become part of politicized contexts, and simply ignoring the tensions between communities and ignoring differences in appreciation of research findings will not help researchers to increase their impact.

Acknowledging such differences in appreciation may also be helpful for the challenge section, which now reads a bit too much as "we need to convince the constituents". Actually we know that stakeholder engagement (as I tend to call it, though it starts with an S) requires openness to the experiences and perspectives of the stakeholder and willingness of the researchers to learn from stakeholders.

For the last three C's, communication, channels and capture, the authors struggle with the rather straightforward example project they have introduced. In the communication and channel sections, it is unclear why for the rather specific aim the example project has, researchers would develop communication styles and channels for the general public or parliament. The specific target groups of the example project are GPs and maybe client organisations.

In the last section on capture, the benefit of having an example project is lost, as it is suggested to cast the net widely and capture what you can. I would say, with such a specific intent to support GPs with a tool to improve their diagnosis of young people, one would look for specific evidence about the adoption of the tool by GPs. Evidence can then be more focused, be more telling and require less effort. Forms of evidence may include the attention paid to the research project in professional journals, conferences, websites and other media for GPs, appreciation of the tool by the pilot group of GPs and support and diffusion of experience through constituencies, and adoption of the tool into GP standards.

As a last issue, I would have preferred it if the example the authors had chosen had been a bit more challenging. The example within the text box is a research project with already a very specific aim which seems to be already polished by a 7C treatment. As learning material I could imagine that the authors would introduce a project that is more ambiguous about the impact, eg. in the form of a rather vague promise that results will be of relevance or help GPs.

Through using the 7Cs (or 6Cs) the reader could be shown how such vague project impact improves.

Is the topic of the opinion article discussed accurately in the context of the current literature?

Yes

Are arguments sufficiently supported by evidence from the published literature?

Partly

Are all factual statements correct and adequately supported by citations?

Yes

Are the conclusions drawn balanced and justified on the basis of the presented arguments?

Partly

Reviewer Expertise:

Higher education policy studies, science policy studies, impact of science, evaluation, research funding

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

Reviewer response for version 1

David Phipps, York University, Toronto, ON, Canada

Competing interests: I am collaborating with Emerald Publishing on developing impact tools that are predicated on a different impact framework. I disclosed this to Emerald Publishing before undertaking the review.

This review was published on 31 December 2019.

This is an open access peer review report distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

recommendation: approve-with-reservations

The article describes a framework for conceptualizing research impact along 7 elements (the 7 Cs): context, communities, constituencies, challenge, channels, communication and capture. The authors are from the Policy Institute at King’s and they have facilitated this framework through online and in-person training. Conceptually the 7Cs are easy to understand. This article presents a new framework among a fairly crowded impact community that is being dominated by the current UK REF 2021 exercise. While the background to the article is based on the need for impact in the REF, it also acknowledges that impact is more than REF and thus presents a tool for planning impact in research projects, not for capturing impact for REF. That is an important distinction.

My review is from the perspective of someone active in the development and provision of impact services working across many countries. My comments are intended to strengthen the good concepts underpinning the 7Cs to make the article more useful to those who hopefully will consider the 7Cs as a tool for their own impact planning.

  1. Many of the key elements of other frameworks are captured in the 7Cs so the 7Cs creates a new method of organizing many existing elements. The literature review bases the 7Cs in an impact context but there has been a lot of literature on previously published frameworks (for example: PARIHS; consolidated framework for implementation research; knowledge to action cycle; co-produced pathway to impact plus the tools from providers such as Fast Track Impact and the Research Impact Academy). I recommend additional literature review with a critical appraisal of frameworks that have preceded the 7Cs to differentiate the 7Cs from previously described frameworks.

  2. Of the 7Cs, there are tools provided for two of them: PESTLE/SWOT for context; and, influence/interest matrix for constituencies. The authors mention the on line and in person training sessions, so I suspect there are additional tools, hopefully for each of the 7Cs. I recommend that the authors present their training materials as a supplementary file and/or link the article to where these tools might exist on line – if they are freely accessible.

  3. The framework doesn’t have a temporal element. Figure 5 starts to align the 7Cs along a pathway to impact with the authors recommending working right to left, but for a grant application when activities happen is a crucial element. I encourage the authors to demonstrate how to develop a pathway to impact with a temporal element using the hypothetical case study as an example for a grant application.

  4. There is a missing C in my opinion: co-production. The 7Cs are predicated on models of knowledge transfer/translation with a focus on channels and communication. What channels will the researcher use and what style should the researcher use to communicate main messages? The literature is clear that stakeholders should be engaged early to identify impact goals that are meaningful to both academic and non-academic stakeholders – something the authors briefly nod to with the single sentence on page 8 of 12 just before “channels”. Beyond co-producing impact goals with stakeholders, undertaking the research and impact activities with stakeholders will produce research evidence that will be more readily taken up by stakeholders/end users and used to inform their policies, practices and services. Translation/transfer methods predicated on channels and communication are necessary, but not sufficient, to drive impact.

  5. Finally, I think the role of facilitation of these tools is a missing and key element. The authors have used these tools with over 300 participants over 2 years in their training programs so have much experience facilitating the 7Cs. I encourage the authors to reflect on the need for facilitation of these tools. Are the 7Cs useful (ie will a researcher use them with fidelity) as presented or do the authors recommend facilitation to instruct the use of the tools? I think the latter since the authors state that there is on line as well as in person instruction. I recommend a reflection on the difference between on line and in person facilitation to turn the 7Cs from a conceptual framework (and we really don’t need yet another conceptual framework!) into a framework that is not only conceptual but can be applied in practice.

Is the topic of the opinion article discussed accurately in the context of the current literature?

Partly

Are arguments sufficiently supported by evidence from the published literature?

Partly

Are all factual statements correct and adequately supported by citations?

Yes

Are the conclusions drawn balanced and justified on the basis of the presented arguments?

Yes

Reviewer Expertise:

research impact, knowledge mobilization

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.

Figures

Figure 1. The 7Cs for planning impact into research projects.

Figure 2. PESTLE combined with SWOT analysis to frame the context of research projects.

Figure 3. A worked example of PESTLE combined with SWOT analysis relating to our hypothetical research project.

Figure 4. Influence/interest matrix to classify constituencies.

Figure 5. Using the 7Cs to think about pathways to impact.

Sample of impact indicators in health research (Adam et al., 2018).

Impacts and their indicators:

Capacity-building: Leveraged funding, research tools and methods, use of facilities and resources, career trajectory of researchers

Advancing knowledge: Bibliometrics, engagements, esteem measures, collaborations and partnerships

Informing decision-making: Influence on policies, practices, products, processes and behaviours (both in health and the determinants of health)

Health: Medical and health interventions, health quality indicators, health status

Economic and social benefits: Intellectual property and licensing, spin-outs, economic returns, jobs, economic diversity and productivity

Social engagement: Public involvement, dissemination, engagement with relevant patient or commissioning groups, culture and creativity

Notes

1. It is important to note that quality-related (QR) funding is generally a minority stream of funding for UK universities; the rest comes from tuition fees and research grant income. The growing emphasis on, and weighting of, impact is nevertheless significant.

2. The co-authors of this article possess over two decades of experience working on theoretical and practical aspects of research impact, within and outside the academy, with non-governmental organisations, the private sector, policy makers and research funders, employing expertise in academic methodologies, communications, policy analysis and workshop facilitation, both in the UK and internationally. This experience includes co-founding the International School of Research Impact Assessment (ISRIA), working with the National Institute for Health Research in the United Kingdom, developing innovative forms of research impact in public policy through the development and delivery of multiple “Policy Labs”, supporting researchers to track and assess their own impact, working closely with research funders in the UK, and designing and delivering impact training for researchers online, through workshops, clinics and other forms of support.

3. The PEST, PESTEL or PESTLE analysis is a common tool used by marketers, businesses and other organisations to scan the external environmental factors affecting their operations. It is thought to originate in Francis J. Aguilar’s Scanning the Business Environment (1967) but has since been developed for use in many different contexts.

References

Adam, P., Ovseiko, P.V., Grant, J. et al. (2018), “ISRIA statement: ten-point guidelines for an effective process of research impact assessment”, Health Res Policy Syst, Vol. 16 No. 1, p. 8, doi: 10.1186/s12961-018-0281-5.

Bayley, J. and Phipps, D. (2019), “Extending the concept of research impact literacy: levels of literacy, institutional role and ethical considerations”, Emerald Open Res, Vol. 1, p. 14, doi: 10.12688/emeraldopenres.13140.1.

Benneworth, P. and Cunha, J. (2015), “Universities' contributions to social innovation: reflections in theory & practice”, Eur J Innov Manag, Vol. 18 No. 4, pp. 508-527, doi: 10.1108/EJIM-10-2013-0099.

Burawoy, M. (2005), “For public sociology”, Am Sociol Rev, Vol. 70 No. 1, pp. 4-28, doi: 10.1177/000312240507000102.

Denicolo, P. (2014), “Achieving impact in research”, in Success in Research, Sage, London.

Economic and Social Research Council (2019), “Impact toolkit”, available at: Reference Source.

Grimm, R., Fox, C., Baines, S. et al. (2013), “Social innovation, an answer to contemporary societal challenges? Locating the concept in theory and practice”, Innovation: The European Journal of Social Science Research, Vol. 26 No. 4, pp. 436-455, doi: 10.1080/13511610.2013.848163.

Higher Education Funding Council for England (2011), “REF2014: assessment framework and guidance on submissions”, available at: Reference Source.

Kerridge, S. (2018), “Hitting the QR sweet spot: will new REF2021 rules lead to a different kind of game-playing?”, blog, Impact of Social Sciences, available at: Reference Source.

Khazragui, H. and Hudson, J. (2015), “Measuring the benefits of university research: impact and the REF in the UK”, Res Eval, Vol. 24 No. 1, pp. 51-62, doi: 10.1093/reseval/rvu028.

King's College London and Digital Science (2019), “The nature, scale and beneficiaries of research impact”, Higher Education Funding Council for England, Bristol, available at: Reference Source.

MacDonald, R. (2017), “Impact, research and slaying zombies: the pressures and possibilities of the REF”, International Journal of Sociology and Social Policy, Vol. 37 No. 11/12, pp. 696-710, doi: 10.1108/IJSSP-04-2016-0047.

Marjanovic, S., Hanney, S. and Wooding, S. (2009), “A historical reflection on research evaluation studies, their recurrent themes and challenges”, The RAND Corporation, Santa Monica, CA, available at: Reference Source.

Mendelow, A.L. (1991), “Environmental scanning – the impact of the stakeholder concept”, International Conference on Information Systems 1981 Proceedings, accessed 4 September 2019, available at: Reference Source.

Minto, B. (2009), The Pyramid Principle: Logic in Writing and Thinking, Financial Times Prentice Hall, New York, NY, available at: Reference Source.

Reed, M. (2015), “Why invest in an online impact tracking tool? ResearchFish, ImpactStory and Kolola compared”, Fast Track Impact, Huntly.

Reed, M.S. (2018), The Research Impact Handbook, Fast Track Impact, Huntly.

Shapin, S. (2012), “The ivory tower: the history of a figure of speech and its cultural uses”, Br J Hist Sci, Vol. 45 No. 1, pp. 1-27, doi: 10.1017/S0007087412000118.

Tilley, H., Ball, L. and Cassidy, C. (2018), “Research Excellence Framework (REF) Impact Toolkit”, Overseas Development Institute, available at: Reference Source.

UK Research and Innovation (2019), “REF Impact”, accessed 4 September 2019, available at: Reference Source.

Vertigo Ventures (2019), “VV – Impact Tracker: plan, capture and report your impact”, accessed 4 September 2019, available at: Reference Source.

Williams, K. and Grant, J. (2018), “A comparative review of how the policy and procedures to assess research impact evolved in Australia and the UK”, Res Eval, Vol. 27 No. 2, pp. 93-105, doi: 10.1093/reseval/rvx042.

Wilsdon, J. (2015), “In defence of the Research Excellence Framework”, The Guardian, available at: Reference Source.

Wilsdon, J., Allen, L., Belfiore, E. et al. (2015), “The metric tide: report of the independent review of the role of metrics in research assessment and management”, doi: 10.13140/rg.2.1.4929.1363.

Corresponding author

Saba Hinrichs-Krapels can be contacted at: saba.hinrichs@kcl.ac.uk
