The purpose of this paper is to provide insights into publication practices from the perspective of academics working within four disciplinary communities: biosciences, astronomy/physics, education and history. The paper explores the ways in which these multiple overlapping communities intersect with the journal landscape and the implications for the adoption and use of new players in the scholarly communication system, particularly open-access mega-journals (OAMJs). OAMJs (e.g. PLOS ONE and Scientific Reports) are large, broad scope, open-access journals that base editorial decisions solely on the technical/scientific soundness of the article.
Focus groups with active researchers in these fields were held at five UK higher education institutions across England, Scotland and Wales, and were complemented by interviews with pro-vice-chancellors for research at each institution.
A strong finding to emerge from the data is the notion of researchers belonging to multiple overlapping communities, with some inherent tensions in meeting the requirements for these different audiences. Researcher perceptions of evaluation mechanisms were found to play a major role in attitudes towards OAMJs, and interviews with the pro-vice-chancellors for research indicate that there is a difference between researchers’ perceptions and the values embedded in institutional frameworks.
This is the first purely qualitative study relating to researcher perspectives on OAMJs. The findings of the paper will be of interest to publishers, policy-makers, research managers and academics.
Wakeling, S., Spezi, V., Fry, J., Creaser, C., Pinfield, S. and Willett, P. (2019), "Academic communities: The role of journals and open-access mega-journals in scholarly communication", Journal of Documentation, Vol. 75 No. 1, pp. 120-139. https://doi.org/10.1108/JD-05-2018-0067
Emerald Publishing Limited
Copyright © 2019, Simon Wakeling, Valerie Spezi, Jenny Fry, Claire Creaser, Stephen Pinfield and Peter Willett
Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode
Since the 1950s a complex journal ecosystem has evolved, particularly for scientific disciplines (Cope and Phillips, 2014). There has been a proliferation of specialist journal titles addressed to niche audiences and, at the same time, the emergence of highly prestigious broad-scope journals, such as Nature and Science. The volume of published articles grows steadily each year (Ware and Mabe, 2015, p. 29), and the journal article is the predominant type of output, with increasing importance even in those disciplines where the monograph has traditionally been the primary means of publication. Alongside this development, the worldwide higher education environment has become increasingly prestige- and metrics-driven (Wilsdon et al., 2015), particularly with regard to university research evaluation and performance-based research funding systems (the dominant approach in Europe) (Sivertsen, 2017) and national research evaluation frameworks, such as the Research Excellence Framework (REF) in the UK and Excellence in Research for Australia (ERA), both of which rely on peer evaluation in addition to performance indicators, such as journal article citation analysis, for specific disciplines (e.g. the physical and applied sciences). In 2006 the Public Library of Science launched PLOS ONE, a new type of journal now commonly called a mega-journal. As epitomised by PLOS ONE, mega-journals have four primary characteristics: broad scope (accepting articles across a range of disciplines), large output (publishing a high number of articles), an open-access publishing model (typically based on an Article Processing Charge (APC) paid by the author prior to publication), and an editorial policy that reviews submissions solely on the basis of their technical or scientific soundness (Björk, 2015). It is the last of these – soundness-only peer review – that has emerged as perhaps the defining characteristic of the OAMJ model (Wakeling, Spezi, Fry, Creaser, Pinfield and Willett, 2017).
As stated by PLOS, the motivations for the launch of PLOS ONE were primarily related to challenging prevailing practices in scholarly communication, particularly the importance placed on journal impact factor (JIF) and associated journal metrics (Patterson, 2009). By removing the requirement for work to be judged significant, the creators of PLOS ONE intended to facilitate the dissemination of work that might not be published elsewhere, while also providing a venue for interdisciplinary work.
The output of some mega-journals, two of which (PLOS ONE and Scientific Reports) are now the largest journals in the world, suggests that the model is popular with some authors, and uptake has been more prevalent in particular areas, such as the biosciences. While mega-journals are the fastest growing segment of the Open Access (OA) market (Ware and Mabe, 2015), the proportion of all scholarly output published in open-access mega-journals (OAMJs) remains small (just 2.6 per cent of Scopus-indexed articles published in 2016 appeared in a mega-journal). Thus far, related studies have focused on describing patterns of uptake and use of mega-journals, but have not provided explanations as to the underlying factors.
In this first qualitative study of researchers’ attitudes towards OAMJs we explore the mega-journal phenomenon from a community perspective. Based on focus groups with researchers in four disciplines at five institutions, and interviews with Pro-Vice Chancellors for Research (PVC-Rs), the purpose of this paper is to understand academic publication practices in a disciplinary context and to explain the role of OAMJs in that context. It focuses on the adoption of OAMJs in the biosciences, astronomy/physics, education and history, whilst taking into account institutional frameworks and drivers in the broader science system (e.g. government and funder policies).
2.1 Uptake of OAMJs
Mega-journals are a relatively recent phenomenon, and as such the literature on OAMJs is sparse but growing. In their review of research and comment on mega-journals in the formal and informal literature, Spezi et al. (2017) found evidence of polarised and at times animated debate on the merits or otherwise of the model. While advocates highlight their democratising potential, critics suggest that OAMJs are little more than “dumping grounds” for low quality work, and that the eschewing of significance as a criterion for acceptance leaves readers without a valuable filter (for an example of such debate, see the comments on Anderson, 2010).
A number of recent papers have used bibliometric techniques to map the emergence of OAMJs (Wakeling et al., 2016; Björk, 2015). These suggest that OAMJs have the largest uptake in the Medical and Life Sciences disciplines, and that even within journals with ostensibly broad subject scope, sub-disciplinary preferences are emerging. Work has also been done to understand the citation distribution for articles published in OAMJs, finding significant variation in citation rates for different OAMJs (Wakeling et al., 2016).
In the only qualitative study relating to OAMJs, publishers identified a number of incentives for researchers to publish in mega-journals, in particular journal brand and JIF, as well as the potentially rapid publication of their research (Wakeling, Spezi, Creaser, Fry, Pinfield and Willett, 2017). They also noted that mega-journals are attractive to authors of papers reporting less significant findings, or papers reporting research from an emerging field. Two surveys of mega-journal authors (Solomon, 2014; Sands, 2014) offer support for some of these proposed incentives. Interestingly, around half of respondents to these surveys reported that their manuscripts had previously been submitted to another journal. An important gap in the literature to date concerns the reasons why researchers choose not to submit their work to a mega-journal in the first place.
2.2 Researcher publication practices
The importance of the journal article to scholarly communication has long been established, and disciplinary differences in level of use and drivers of journal selection are increasingly emerging in the literature (Fry, Probets, Creaser, Greenwood, Spezi and White, 2009; Nicholas et al., 2010; Research Information Network (RIN), 2007; Tenopir et al., 2016). Disciplinary communities have their own culture and norms, which according to Trowler (2014) are conditioned by many factors, including new technologies, the marketization of knowledge, globalisation and the rise of the evaluative state. These forces condition interactions within and across disciplinary communities (Trowler, 2014) and how academics behave, providing an explanation as to why, at the level of scholarly communication, different disciplinary communities place different values on particular characteristics and aspects of the publication process (Harley et al., 2010). While some factors – particularly the prestige of the journal, and its audience – appear to be almost universally considered important, others have greater or lesser significance for different disciplines and sub-disciplines. The career-stage of authors has also been shown to influence journal choice. The publishing strategy of early career researchers (ECRs) is commonly understood to focus on publishing in “top” journals (i.e. titles indexed in Scopus and Web of Science, preferably with a high JIF), as they believe this is what will help further their reputation and career (Nicholas et al., 2017). ECRs are also driven by a need to develop a strong research profile quickly, and thus are particularly sensitive to the requirements of balancing journal prestige with other factors, such as likelihood of acceptance and speed of publication.
Of particular relevance to our understanding of author behaviour is the role of the JIF. Initially conceived of as a journal selection tool for librarians (Garfield, 1979), the JIF – a measure of the mean number of citations to articles recently published by a given journal – has become an integral and controversial part of the academic landscape. As well as influencing many researchers’ choice of journal (Cope and Phillips, 2014), there is evidence that the JIF of the publishing journal is used as a component of research evaluation for tenure and promotion in academia (Rijcke et al., 2016), as well as research funding (Hicks et al., 2015). So pervasive is the influence of the JIF that it has even been seen to influence scientists in their choice of research areas, with those more likely to lead to the type of findings considered to be attractive to high impact journals being prioritised (Müller and de Rijcke, 2017).
For researchers who are subject to national university research evaluation exercises (such as the REF in the UK), metrics, and in particular the JIF, take on additional importance. As Nicholas et al. (2014) put it, JIF and research evaluation are “intertwined in the minds of UK researchers” (p. 128). It is well acknowledged that research evaluation exercises such as the REF do shape researchers’ publishing behaviours (Housewright et al., 2013), and a growing body of work is exploring the wider impact on scholarship (see Rijcke et al., 2016). It is important to note here that in recent evaluation cycles the UK REF has explicitly eschewed the JIF as a performance indicator, and ERA has not used a Ranked Journal list since ERA 2010 (based on feedback from Research Evaluation Committees that they relied on their own expert knowledge of the quality of research outlets relevant to their discipline).
2.3 Disciplinary communities as discourse communities
Previous literature has shown that disciplinary norms play a key role in patterns of scholarly communication. There are various sociological theories describing disciplinary cultures and explaining the similarities and differences between them (Abbott, 2001; Becher and Trowler, 2001; Whitley, 2000). Whilst these theories have been shown to be useful in better understanding scholarly communication by highlighting cultural norms, they operate at the taxonomic level, whereas the genre analysis literature contextualises disciplines from a community perspective and draws attention to the role of specific genres, e.g. the journal article, within academic communities (Berkenkotter and Huckin, 1995).
The extent to which the use of the term “community” is appropriate to describe the inhabitants of academic disciplines has been debated in the literature given certain aspects of definitions of communities, such as a sense of belonging, shared goals and consensus. For example, Tönnies’ (1957) seminal definition of community (Gemeinschaft) that uses the terms “familiar”, “comfortable” and “exclusive” appears to run counter to what we know of academic communities, which can be fragmented, tribal and competitive (Becher and Trowler, 2001). Survival of the discipline/sub-discipline, or the growth of a new discipline/sub-discipline, however, might be a commonly held shared goal or unifying feature of disciplinary communities.
Seen from a cultural perspective, communities can be perceived as symbolic in that they have a shared set of symbols, constructs and norms which support routine discourse activities (Cohen, 1985). This perspective resonates with the notion of a discourse community, which in Swales’ (1990) definition has six defining characteristics:
a set of shared goals;
a forum for communication between members e.g. meetings, correspondence, e-mail etc.;
on-going “conversations” through active participation in providing information and feedback;
recognised genres for communication;
a specialised vocabulary or language; and
a critical mass of members, with an evolving membership – survival depends on a reasonable ratio between “experts” and “novices”.
Whilst Swales (1998) has criticised the term “disciplinary community” as inappropriately portraying disciplines as idyllic and consensus-oriented, he has applied the notion of discourse communities to academic settings. Previously, Harrison and Stephen (1995) had also argued for the usefulness of the notion in understanding academic discourse, not least because of the centrality of genres in scholarly communication, e.g. the research proposal, the scholarly article, or the peer reviewer’s report (Berkenkotter and Huckin, 1995), and the shared symbol systems used to construct them. Cohen (1985, p. 19) argues that this symbolic perspective of communities mitigates the need for a defining “consensus of sentiment”; rather, symbols are cognitive constructs that provide members of a community with the “means to make meaning”. These genres and symbols function on multiple levels; they are deployed to achieve individual and collective goals and they are used to demarcate the boundaries of the discipline (Harrison and Stephen, 1995; Becher and Trowler, 2001). In their seminal article on domain analysis in information science, Hjørland and Albrechtsen (1995) emphasised the analytic valence in conceiving of knowledge domains as discourse communities.
Berkenkotter and Huckin (1995, p. 11) refer to these genres as constituting an “academic conversation”, with varying levels of formality and participation, the competent engagement in which requires enculturation of new members. This knowledge is typically “picked up” in the local milieu of the culture, rather than being explicitly taught. The process of citation and referencing signals to reviewers, editors and readers the way in which a published output builds on what has gone before; it is a conversation conducted via the literature. That said, the extent to which an author is required to demonstrate how closely coupled an output is to an ongoing conversation varies from one academic community to another (Crane, 1972).
An embedded case-study approach was used, aiming to capture a detailed picture of both disciplinary and institutional perspectives on the role of journals and the OAMJ phenomenon. The case studies involved a series of 16 focus groups covering four disciplines at five institutions in England, Scotland and Wales. Focus group participants were active researchers from a range of career stages, as depicted in Table I. Given the potential influence of national research evaluation frameworks and other external policy drivers, such as funder policies, on researcher dissemination and publication behaviours (Fry, Oppenheim, Creaser, Johnson, Summers, White, Butters, Craven, Griffiths and Hartley, 2009), with institutions playing a pivotal role in operationalising these frameworks, it was considered important to gain an understanding of the institutional framework within which the discipline-based focus groups were located. For this purpose interviews were conducted with the pro-vice-chancellors for research (or their equivalent) in each of the institutions where a discipline focus group was held.
The focus group protocol started with an explanation of what an OAMJ is and covered factors influencing journal choice, strategies when an article is rejected, experience (if any) of publishing in an OAMJ, values relating to the peer-review system and views on soundness only peer review, the role of journals in filtering for significance for the discipline, and the extent to which the OAMJ approach to peer review has an influence on authors as readers. The interviews with PVC-Rs explored the extent to which they were aware of OAMJs and whether or not they had been discussed with regard to the wider institutional framework; for example, ways in which they were perceived in the context of the academic journal landscape for the disciplines in their institution and the broader context of the UK’s REF.
Using Becher and Trowler’s (2001) typology as a sampling framework, four disciplines were chosen that span the physical, life and social sciences, and the arts and humanities. Within this framework, disciplines were included on the basis of being “first-movers” in the adoption of OAMJs (biosciences); disciplines where OAMJs have been moderately adopted (as evidenced by the representation of these disciplines in mega-journals) and where some awareness of the phenomenon was expected (astronomy/physics); and disciplines where awareness of OAMJs, or indeed of OA more broadly, was less likely (education and history).
To identify institutions from which to recruit focus group participants, purposive sampling was used. The criteria included percentage income from research grants and FTE numbers of research active staff; five institutions were selected for the disciplinary focus groups. These were selected geographically to include a reasonable spread across England, Scotland and Wales. Some leeway was allowed in the sampling to make efficient use of resources by holding multiple focus groups at each of the five institutions, and a small number of initially identified institutions were unable to participate. There were 60 focus group participants in total spanning the four disciplines (Table II). Participants were recruited via school/department staff and PhD mailing lists – all those who expressed an interest in the focus groups were offered a place. Data collection took place from November 2016 to February 2017.
The fact that we had a low response rate from history is perhaps a finding in itself, and was not unexpected given the low levels of adoption of OA publishing in this discipline. The low response rate does, however, limit the extent to which the conclusions can be generalised to history as a disciplinary community.
Each focus group lasted an hour, was audio recorded and partially transcribed (this consisted of detailed notes recording the points made by participants, with full transcriptions of noteworthy statements). The data were analysed using a well-established thematic analysis approach (Braun and Clarke, 2006; Saldaña, 2013), resulting in the identification of broad themes, three of which are presented in this paper: choice of journal, awareness of OAMJs, and notions of community. The first two themes presented in the findings section (“choice of journal” and “awareness of OAMJs”) are semantic in nature, drawing on obvious meanings in the data, whilst the third is a latent theme. As a latent theme, “notions of community” explores the ideas, assumptions and conceptualisations underlying the obvious, surface-level meaning of the data (Braun and Clarke, 2006, p. 86). The third sub-section of the findings section is, therefore, more conceptual and discursive in nature than the preceding two sub-sections.
All transcripts were coded using the NVivo qualitative analysis software. Given the small number of transcripts, they were divided between two coders: each transcript was coded by one and “proof-coded” by the other in order to ensure a robust and reliable coding process. The “proof-coding” consisted of each researcher reviewing the other’s coding; discrepancies and issues in the coding results were compared, discussed and resolved, so that both coders fully agreed with the final coded transcripts. The proof-coding process renders an inter-coder agreement score irrelevant, since in practice the content of every code is agreed by both coders.
4.1 Choice of journal
Participants described a range of drivers influencing journal selection, and a range of journal characteristics that factored into eventual publication decisions. In considering these findings it is interesting to note that many participants, particularly in the non-science disciplines, stated that they had a relatively clear idea of a target journal early in the writing process, and that both the “angle” of the article and the choice of journal evolved as writing progressed.
Whilst the terms “prestige” and “reputation” were widely used in relation to journal titles, participants often found it hard to define exactly what they meant by these two terms. There was generally a historical, word-of-mouth element to notions of prestige, with certain journals seemingly firmly established at the top of a hierarchy within each discipline. There were discussions in several of the focus groups about the difference between the “community view” of prestige (or relative importance) and journal rankings based on the JIF.
Journal reputation was most commonly associated with the respect in which a journal was held within the disciplinary community, with focus group participants’ perceptions based on the quality of published articles, the perceived thoroughness of the review process, and the composition of the editorial board. It was also notable that several participants explicitly stated that a journal’s affiliation with a learned society had a strong positive influence on its reputation.
4.1.1 Target audience and readership
Many participants emphasised reaching a desired readership as an important factor in journal choice when submitting an article. For bioscientists, astronomers/physicists and historians this was presented as a choice (sometimes expressed as a tension) between a journal with a narrow sub-disciplinary focus (which was perceived to maximise the chances of reaching a target and engaged audience), or a broader scope journal, which would often bring greater prestige, a wider readership and increased visibility. It should be noted that education researchers described a different journal environment, with few if any generalist journals. In general, only higher quality articles reporting research likely to be viewed as significant by a wide range of researchers (at a discipline or cross-discipline level) were deemed suitable for broader scope journals. It was noted by participants too that narrower scope journals, since they are aimed at a specialist audience, allowed for greater use of technical vocabulary and more in-depth reporting of results.
Given the early stage in the writing process at which participants reported thinking about choice of journal for publication, it was clear that researchers assign great importance to the perceived scope of potential journals. The message of an article is then tailored to the anticipated audience during the drafting process. Key to this is researchers understanding the readership of different journals, and a number of participants emphasised the role of learned society journals as titles with a clearly defined audience.
It was also notable that participants were keen to emphasise that different dissemination and publication practices were required in order to communicate with practitioner or other non-academic audiences. The journal article was seen as an inappropriate medium for communication with practitioners, both in terms of accessibility and comprehension.
4.1.2 Article type and quality
An almost universally mentioned driver of choice of journal related to the type and quality of the article, and the extent to which it aligned with the requirements of potential journals. Many participants described “aiming high” by submitting manuscripts to the “highest impact” or “most prestigious” journal in which it might realistically be published. This was primarily with the goal of maximising the exposure and influence of their article, although some participants also mentioned that higher quality journals often produced more useful and detailed peer review reports (that could then be used to refine and improve the paper for submission to the next journal on the list of potential journals), a factor that emerged as a significant driver of submissions for many participants, particularly bioscientists.
Participants discussed quality both in terms of the prestige of a journal and the production/publication service offered by the publisher. The overall publication process, from submission to peer review to publication, clearly influences the opinions researchers form about journals in their discipline(s), and as such it was not surprising that their previous experience with a journal was often cited as a factor influencing decisions on journal selection.
While the perceived quality of the journal was important, many participants also emphasised the importance of selecting a journal that provided a good fit for the article. This included the particular subject speciality of the journal, its editorial policy such as paradigm orientation, e.g. quantitative/qualitative or theoretical/empirical, the extent to which supplementary data are supported and encouraged (or discouraged), how frequently the journal was cited in the article in question, and the speed, quality and efficiency of the peer review process.
Participants also discussed the perceived requirements of “top” journals, with bioscientists and astronomers/physicists in particular noting that the chance of acceptance by high JIF journals is closely linked to the novelty and “newsworthiness” of the manuscript. Several bioscientists and astronomers/physicists viewed this as a troublesome development, and expressed frustration that their articles had been rejected on these grounds by the editor without a chance to go to peer review.
4.1.3 Career and research quality assessment requirements
The most important factor influencing publishing practices for participants was the UK’s REF, with many acknowledging that the requirements of the REF had superseded even reaching a particular readership as a major factor influencing choice of journal. Participants across all disciplines referred to institutional pressure to produce REF-returnable papers, and although it was acknowledged that the REF criteria were somewhat opaque, a number of important consequences for publishing practices were described. There was a general consensus that quality rather than quantity should drive an individual’s publication strategies, primarily because of the necessity to produce 4* REF papers. This suggests a significant shift away from what has until recently been a predominantly “publish or perish” culture (which emphasised quantity over quality) in the scholarly communication system. Those participants who published articles of varying academic impact distinguished between “REF-able” and “non-REF-able” articles, with the latter most likely to be driven either by a desire to publish primary data from a publicly funded research project, or to target a particular community (e.g. practitioners).
Many participants noted that ECRs were particularly driven by the perceived requirements of the UK’s REF to publish in prestigious journals. As one participant put it, there are “particular buttons that they need to press for probation and promotion and the next stage and that’s critical” (Institution D, education). Most experienced researchers agreed that in certain circumstances the prestige of the journal had to be balanced with a need for speedy publication (e.g. for PhD students close to completion and applying for post-doctoral positions). Some participants suggested that as a researcher became more established, the pressure to publish in “top” journals reduced somewhat. This was disputed by others, who noted that maintaining a reputation in the field required publications in the most prestigious journals, which were also seen as important for successful grant proposals.
The demands of the REF influenced researchers differently depending on discipline. Although the REF process emphasises evaluation of research quality independent of journal title most bioscientists perceived that publishing in journals with a high JIF was essential, and many astronomers/physicists also felt that the JIF of the journal played an important role in research evaluation. In stark contrast, most historians and education researchers stated that the JIF played little or no role in their choice of journal, and indeed many were unaware of the metric and how it is calculated. It is interesting to note that all PVC-Rs interviewed were keen to emphasise that their institution’s policies and processes relating to promotion and recruitment, as well as REF returns, were based on assessing individual article quality, independent of journal prestige or JIF.
An important theme to emerge from discussions of journal choice concerned the tensions and trade-offs between different drivers and journal characteristics. Most commonly these related to notions of journal quality and prestige, which included, but were not limited to, the JIF. Researchers described a need to weigh the potential benefits of publishing in a high prestige journal against the time and effort required to navigate often highly demanding submission processes, an equation influenced by the low success rates of such submissions. An additional factor was the occasional need for a pragmatic approach, when circumstances (e.g. a PhD student preparing for a job application) necessitated speedy publication. Participants also spoke of a need to balance the reputational and career incentives of publishing in these journals with the optimum readership of an article; it was acknowledged that the journal with the highest JIF might not necessarily be the one most read by the desired audience.
4.2 Open-access mega-journals
4.2.1 Awareness of OAMJs
Both the term “mega-journal” and the concept itself were unfamiliar to most focus-group participants and the PVC-Rs interviewed, with the notable exception of life science researchers, of whom most were at least familiar with the model. As one participant put it: “the concept is well known but the term has no common usage” (Institution B, bioscientist). It was striking that several participants realised during the discussion that they had published articles in OAMJs, stating that they had not realised at the time of submission and publication that the journal in question was in any way different to a traditional journal. Many of those who were previously aware of mega-journals associated such journals primarily with a large publication volume. It was also notable that on several occasions, having heard the focus group moderator outline the OAMJ approach, participants seemed to equate OAMJs with predatory journals.
This view was not specific to OAMJs. In discussions about OA as a driver of journal choice, while many participants agreed that the accessibility of an article could be important, particularly in reaching non-academic readers, a number of historians and education researchers questioned the quality of OA journals. It was also noted that the cost of Gold OA publishing could be prohibitive; while university funding was available in some circumstances, this was by no means always the case. ECRs and those researchers doing non-funded research in particular were perceived to face significant barriers in this regard.
4.2.2 Perceptions of OAMJs
A number of negative perceptions of OAMJs emerged during discussions. The most commonly mentioned issue related to the visibility of research published in a high volume journal, and the likelihood of it reaching its intended audience. This concern was shared by large numbers of participants across all disciplines and institutions. Another common perception was that mega-journals were of lower quality than traditional journals. Some saw OAMJs as “data-dumping grounds” – venues for articles presenting data without substantial analysis – while others associated them with work of minimal significance or interest. A consequence of both of these negative views was that OAMJs were frequently perceived as a poor choice for ECRs in particular. Several participants, notably historians and education researchers, also objected on principle to the Gold OA aspect of the mega-journal model.
There were, however, also positive views of mega-journals. Some participants countered the view of others that OAMJs were lower quality with personal experiences of reading significant and interesting articles published in mega-journals. Several bioscientists also noted that OAMJs have helped popularise and facilitate OA publishing, and had been “pioneers” in terms of publishing supplementary and supporting data alongside research articles. Perhaps most significantly, several participants, notably bioscientists and physicists, expressed support for the mega-journal model for reasons similar to those often presented by mega-journal publishers: as a challenge to an increasingly impact- and metric-driven publishing system, and as a venue for the publication of papers that might otherwise not be published. With regard to the institutional perspective on OAMJs, one PVC-R stated that “if someone had published in a [mega-journal] that is neither a positive or a negative – what counts is the content of the paper and our judgement on the quality of that work” (Institution A), whereas another (Institution D) did concede that an article published in an OAMJ would be less easily recognisable as a 4* REF quality article.
Rather than being viewed as a significant innovation, OAMJs were typically viewed by participants as just another addition to the journal ecosystem, albeit with specific characteristics: “I don’t see mega-journals as being that different to be honest. They’re one end of a spectrum” (Institution C, bioscientist). Some suggested that the current mega-journal model was likely to prove unsustainable, and that therefore OAMJs would evolve either into titles more akin to large traditional journals or simply databases. This latter suggestion found most support among astronomers/physicists, who in large part could not understand the attraction of OAMJs while pre-print servers (and arXiv in particular) were available as a means of quickly disseminating work.
4.2.3 Views on soundness-only peer review
Participants were also asked for their views on the peer review process employed by mega-journals, and in particular the requirement that articles only be scientifically or technically sound, with no assessment of novelty, significance or interest. This characteristic of mega-journals is perhaps the most controversial, and has been at the heart of much of the criticism directed towards the model. While most researchers were unaware that mega-journals operated such a policy, the subject often generated prolonged and sometimes heated debate.
In considering the issue, several participants highlighted what they felt were flaws in the traditional model of peer review. Most commonly this related to the perceived “obsession” of more prestigious journals with “short term impact rather than long term importance” (Institution A, bioscientist). Words such as “sexy” and “trendy” were used to describe the type of work favoured by such journals, something that was felt to be detrimental to progress across all disciplines. More generally there was a sense that high impact journals, and the incentives to publish in them, were linked to a highly competitive research environment that could be counterproductive, not least in the incentive it provided for authors to overstate the importance of their findings.
Many bioscientists and astronomers/physicists therefore supported the idea of soundness-only peer review in principle. The approach was felt likely to help researchers publish lower impact work, and negative results, and there was a recognition that such work can be of great use. It was also suggested that the mega-journal approach limits the potential for small groups of editors to unduly influence the development of a field.
Most common, however, were negative views of the concept, particularly from historians and education researchers. Often this was because soundness-only peer review was associated with an increase in the number of published papers, and therefore perceived to lead to a form of information overload. Journals’ assessment of significance and interest was seen as a crucial filtering mechanism which was linked to discoverability, with participants suggesting that if everything sound is published, researchers will struggle to identify highly significant papers.
Some participants also felt that the OAMJ model removed an important element of feedback from the publication process – the understanding here being that peer reviewer reports and editorial intervention were likely to be more limited if the significance of the article was not a factor in the acceptance decision. As one researcher put it: “what you’ve missed out from what peer-reviewers do is their formative inputs – not just sifting what comes in, but rescuing or cultivating it” (Institution E, education). It should be noted that in many cases participants were basing such views on an assumption of what soundness-only peer review might look like, rather than personal experience. In contrast, several participants who had previously published in mega-journals noted that peer review reports were often indistinguishable in scope from those of more traditional journals.
4.3 Notions of community
A striking finding from the focus groups was the extent to which notions of community were embedded in many areas of discussion. The term was used regularly in a variety of contexts by researchers at all institutions and across all disciplines. Communities were found to exist and be important at different levels – within disciplines and sub-disciplines, and through geographic and institutional circumstances. Several participants argued that notions of community were guided and shaped by shared memory – for example, through experienced journal editors who have an overview of current and past trends covered by a specific journal or set of journals. There was also a strong sense of belonging, shared values and the community as a guardian of the discipline, protective of practices and tradition. Relationships fostered within communities were considered vital, with trust a key factor. Learned societies were also mentioned in this context in a number of the focus groups, and were clearly highly valued as a defined community with explicit membership.
4.3.1 Knowledge of the journal landscape
An implicit influence of community on publication practices came in the form of the extent to which researchers described the development of their understanding of the journal landscape. Many participants demonstrated a complex and in-depth understanding of the characteristics of journals in their fields. This understanding was found to develop in a number of ways: through mentoring and guidance by more senior colleagues in the department, conversations with peers at conferences and other gatherings, interactions with the journals as readers and past experience as authors, and “formal” lists and hierarchies of journals generated at a departmental or institutional level.
4.3.2 Readership and audience
The idea of community was found to relate closely to the potential and desired readership of articles. An interesting finding here was the notion of different reader groups forming “concentric circles” of communities. This was found to apply both geographically (ranging from local to regional to international communities) and within a disciplinary context (with communities encompassing small sub-disciplinary specialties and broader subject groups). Ultimately, for many participants their choice of journal was inextricably linked to reaching and engaging with a particular community.
For some participants, the audience of a journal was explicitly associated with a community. As one researcher explained: “The maths educational journals are important to me because that is the community I want to speak to” (Institution E, education). There were also suggestions that a journal is not just of the community, but plays a role in creating and guiding the community. This is evidenced in editorials for new journals and associated calls for papers that often state that the purpose of the journal is to meet the needs of a new community. One historian stated that journals “help guide how we think about those fields […] it creates a community as much as anything else” (Institution C, historian), while an education researcher highlighted the influence of journals on developing scholarship:
Journals represent a community of people working on issues, driven to address certain inequalities…if you take that out and treat it as a medium to put text into the world, I think you’re really losing something important about how disciplines evolve and how you get movements of scholars.
(Institution A, education)
It was also notable that participants were keen to emphasise that different dissemination and publication practices were required in order to communicate with practitioner or other non-academic audiences. Despite sometimes working in areas of direct relevance to these audiences, there was a clear sense that the participants did not consider themselves part of these practitioner/non-academic communities. This view was evidenced in the notion that the journal article was not an appropriate medium for communication with practitioners, both in terms of accessibility and comprehension: “we know that policy makers read very little research, what they do read they don’t understand” (Institution E, education).
The notion of a conversation within a community also appeared to play a key role in informing the perceived importance of journals and specific journal titles: “You want a community of practice and scholars that are having the same conversation” (Institution A, education). Whilst it was primarily education researchers who explicitly used the term “conversation”, for example in the choice about where to submit an article being influenced by its fit within a broader ongoing dialogue, researchers from other disciplines also expressed values aligned to the notion of an ongoing conversation within a particular community. Several participants mentioned that when considering where to publish, they will try to submit to a journal that has published some of the articles that their article cites, implying a form of conversation. In the bioscience and astronomy/physics focus groups, the value placed on peer-review comments and reader responses to articles posted to pre-print servers can also be seen to represent a form of community conversation. Indeed, the act of peer reviewing was described as a contribution to the community, with an expectation that the peer review process is supportive and “familial”.
As the literature demonstrates there are multiple dimensions by which communities can be understood and three in particular are especially relevant to our findings:
type or level of community (e.g. institutional vs disciplinary, academic vs practitioner);
function of the community (e.g. enculturation of new members and controlling the boundaries of the discipline); and
influence of the community (e.g. journal choice, choice of research area).
We argue that these three dimensions shape researchers’ perceptions and practices in relation to OAMJs.
5.1 Type or level of community
In describing their dissemination and publication practices the focus group participants highlighted the ways in which they were simultaneously members of multiple communities; namely, academic school, institution, discipline, science system and society. As noted in the literature (Harrison and Stephen, 1995; Becher and Trowler, 2001), genres of communication function on different levels. As a recognised genre using specialized language and symbol systems (Swales, 1990), the journal article plays multiple roles within and across these communities. This is particularly evident with regard to the conversational aspect of scholarly communication: the journal article represents the conversation, whilst the journal title functions as the facilitator of that conversation.
5.1.1 The journal article situated within communities
The role of the article is intricately linked to the multiple levels of community being addressed. This is because being an active member of each community requires achieving certain communication goals, such as the need for promotion within one’s institution or to contribute novel findings to the discipline. In reality, an article satisfies more than one goal, since authors are addressing multiple community audiences with any single article. So, it is not so much one goal or another driving the role of an article and thus journal choice, but several goals simultaneously. This explains why focus group participants experienced tensions in the choices they make about where to publish, since any given journal title might satisfy one goal to a greater degree than another.
Derived from the findings, Figure 1 below provides some examples of how the role of the journal article as a genre might vary according to the level of community being addressed, and the associated goal. While some general goals – for example increased personal recognition and reputation – are applicable to more than one level, the goals shown in the figure are unique to each community. For example, at the institution level, when promotion is the communication goal the article is a signifier of the quality of the underpinning research (e.g. original, significant and rigorous) and productivity of the researcher. At the discipline level, however, where the goal is to communicate a particular message, the article contributes to an intellectual conversation in the way in which an author locates the work within a particular body of knowledge (e.g. through citation choices and consequently the narrative constructed around the findings). At the science system level, where an article might be listed on an applicant’s publication list, the article signifies that the applicant is research active and knowledgeable within the field of research. In terms of the broadest community level, that of society, where an author wishes to influence policy and/or practice as an outcome of their research, the article represents a vehicle for instigating change.
5.1.2 Journal choice
In making their choice about which journal to publish in authors can find themselves having to prioritise goals – particularly those in sub-disciplines that have an influential practitioner audience (at the society level), such as was the case for a number of the education participants. This then presents a challenge for authors, particularly ECRs, who must balance the goal of tenure or promotion (institutional level) and improving their reputation within the field (disciplinary level) with that of influencing practitioners (society level) and thus be able to demonstrate social/economic impact (which can be seen as both an institutional and society level goal).
Journal characteristics facilitate communication goals in various ways, as demonstrated in Figure 2. Using specific examples, the figure shows how the communication goals and associated roles of the article can influence the factors driving a researcher’s choice of journal. It is notable that quality of peer review is a driver common to all levels of community in our examples, with the improvements to an article that can result from rich and expert feedback being seen by authors as relevant to all goals of communication. It should be noted, however, that in practice focus group participants reported experiencing wide variation in the quality of reviews they received (even within the same journal) and some participants described the very detailed and thorough reviews they had received for a mega-journal submission. In addition, at the discipline level of community the type of peer review was also considered important, particularly the extent to which the significance/importance and potential interest of the paper was assessed. The scope of the journal also helps provide the context and readership for conversations, as does the journal’s archive of published articles.
Speed of publication was also recognised as a factor potentially influencing journal choice at all levels of community, although its importance is dependent on particular circumstances (as indicated by the brackets in the figure). For example a researcher may need an article to be published quickly in order to apply for an academic position or promotion (institutional level) or grant (system level), to avoid being scooped (disciplinary level), or to influence a time-critical policy discussion (society).
Focus group participants talked about the role of journal metrics, and the JIF in particular, as an indicator of article quality at the institutional level, and both bioscientists and physicists emphasised the role that journal metrics can play in signalling an expert in the field when addressing the wider science system, such as applying for funding. The prestige or reputation of the journal was also seen to influence journal choice in the context of the institutional, disciplinary and science system communities, with the journal again acting as a proxy for article quality. However, notions of academic prestige were not seen to be directly relevant to communicating at the societal level. In contrast, whether a journal was OA or not was only seen as relevant if the intention was to reach a non-academic (society) community.
Our findings suggest that membership of these different communities can lead to different drivers influencing journal choice, be they actual or perceived drivers. Perhaps most significant is the perception amongst researchers in some disciplines that the institutional community requires publication in a high JIF journal, whereas the disciplinary community may be best served by publication in a more specialist or practitioner journal. This emerged as the most challenging tension for researchers, and corroborates the literature on journal rankings whereby perceived hierarchies at the sub-discipline level may be more informative than those at the broader level of the discipline (depending on the nature of discipline e.g. the extent to which it is multidisciplinary) (Herron and Hall, 2004; Menachemi et al., 2015) and are also likely to vary on a region by region basis (Taylor and Willett, 2017).
In terms of disciplinary community and journal choice, the role of learned society journals, which provide a formal and structured community, emerged as surprisingly important, providing in particular a familiar community that can be trusted. This chimes with McMillan and Chavis’ (1986) notion of “trust”, or members of a community knowing what they can expect from one another (McMillan, 1996). This point can be extended to encompass some participants’ perceptions of non-academic readers, for whom they felt the formal academic article to be an imperfect genre for dissemination. The implication here is that practitioners are not properly equipped to interpret the symbolic language that has evolved to facilitate discourse within an academic disciplinary community. This in turn might suggest that whilst integral to an individual’s standing within institutional and disciplinary communities, the journal article has less currency as a driver of social impact (Tucker and Lowe, 2014).
We also note that within institutional frameworks there has been an increased emphasis on quality of publication (although not necessarily explicitly connected with a JIF) rather than quantity. In the UK context it seems reasonable to conclude that this is directly related to the model of evaluation used in the REF process that assigns a quality rating to outputs (based on originality, significance and rigour) and which limits the number of outputs that can be submitted by each researcher (the maximum was four in 2014, and will be five in 2021). This was an understanding that came through clearly in the focus group data and indicates a significant shift away from what has until recently been a predominantly “publish or perish” culture (which emphasised quantity over quality) in the scholarly communication system.
5.1.3 Adoption of mega-journals
Relating this discussion to OAMJs, we note that mega-journals were generally perceived as existing outside of any of these communities. Their broad scope means they fail to offer a clearly defined community of readers, and their large size makes the notion of the journal itself somewhat fuzzy; few readers if any could claim to read an entire issue of PLOS ONE, in the way they might a much smaller and selective journal. The lack of filtering for significance also ensures that mega-journals not only have lower JIFs than more selective titles, but are perceived as less prestigious, and therefore by extension publication in such journals is not believed to enhance an author’s standing within any level of community. Our results also suggest an antipathy to the model founded on a belief that it removes a key element of community quality checking and therefore control of the disciplinary discourse.
5.2 Function of communities
One function of communities that clearly emerged from the focus groups was the extent to which publication practices were informed by a recognition of community norms. Clear evidence emerged of the enculturation described by Berkenkotter and Huckin (1995), relating particularly to an understanding of the journal landscape. This understanding was said by ECRs to be heavily influenced by the practices, perceptions and guidance of more senior colleagues. It is interesting to note that the notion of traditional practices being passed on to new generations of researchers within the community may serve to limit the opportunity for innovation in scholarly communication. New approaches to scholarly publishing such as mega-journals therefore face a significant barrier to adoption in the shape of these informal but embedded conventions.
The most commonly expressed negative views of OAMJs related to the loss of filtering/gatekeeping, which was seen as a vital way to preserve and protect shared values and facilitate conversation. This links strongly with the ongoing conversation element of Swales’ (1990) definition of a discourse community, whereby conversation is a mechanism for providing information and feedback. Of course, one of the characteristics of mega-journals is that they include functionality to support post-publication commentary, which in theory would support the conversational aspect, but studies have shown that there is very little uptake of article commentary (Adie, 2009; Neylon and Wu, 2009). Related to the loss of filtering/gatekeeping was the sheer size of OAMJs, which was seen as counter to the notion of exclusivity implicit in the valuing of a community. This loss of the conversational aspect could lead to diminished influence of individual researchers in a disciplinary community.
5.3 Influence or impact of communities
In their discussion of elements that contribute to a sense of community, McMillan and Chavis (1986) emphasise the importance of two-way “influence”, whereby a community must have influence over its members and likewise members must be able to influence the community. Our findings demonstrate several such forms of influence. The requirements for membership and progression within various levels of community appear to drive current publication practices – particularly in the context of the REF and the needs of ECRs. Communities can also be seen to influence the publishing environment within which they operate. By this we mean that the deeply embedded shared values of communities shape the form and influence of journals. To give just one example, earlier research (Wakeling et al., 2016) has identified a predominance of articles relating to certain sub-disciplines (specifically computational biology and ecology) in the output of the ostensibly much broader scope mega-journal PeerJ. In the context of our findings this could be interpreted as specific sub-disciplinary communities adopting the title, triggering uptake and use both in terms of publishing in the journal and citing articles from it.
Our findings suggest that the needs of academics – for access to filtered research outputs, and for publications that enhance community standing and that are deemed likely to be rated highly in evaluation exercises – sustain the current, effectively tiered, journal system. In addition, journals are perceived as influencers of the community. The importance placed on the journal’s role as a forum for conversation highlights the role editors and reviewers play as moderators of academic discourse. Journals also offer a means of validation, with author standing boosted by articles published in what are perceived to be prestigious journals. Journals are also seen to influence the direction of research in broader terms, be that through the emphasis placed by high prestige journals on impact and “newsworthiness”, or the creation of journals to act as focal points for new or emerging sub-disciplines.
One final interpretation of these results relates to the extent to which notions of community identity influence researcher perceptions of mega-journals. It has been argued that the identity of a disciplinary community is deeply bound to the symbols and genres of communication (Harrison and Stephen, 1995; Becher and Trowler, 2001), and consequently professional identities are often bound to a tightly defined set of journals (Anderson, 2016). The characteristics of mega-journals that were found most to trouble participants – their broad scope, and their removal of filtering for significance or interest – are those that directly challenge the means by which disciplinary communities control their boundaries and define themselves. In this respect mega-journals represent a threat to researchers who value the journal as a means of maintaining and developing their identity within a community.
The focus groups revealed that researchers consider themselves members of multiple communities – academic school, institution, discipline, science system and society – and that the journal article plays a role in facilitating conversation within each. This understanding reveals the extent to which journals play a central role within disciplinary communities and both shape and are shaped by those communities. However, each level of community places different values on various characteristics of the journal, meaning academics face an often challenging process of balancing these competing factors. Furthermore, factors influencing journal choice at the institutional level are likely to be shaped by either institutional or national research evaluation mechanisms, such as REF and ERA, which adds an additional layer of potential uncertainty for authors in making that choice. The themes to emerge from the focus groups also demonstrate that perceptions relating to national research evaluation mechanisms such as the REF and ERA can often diverge from the reality on the ground and persist across multiple evaluation cycles. This partly explains why participants considered the JIF to be important despite attempts to decouple the quality of the research from the status of the journal title in which it has been published in the REF evaluation criteria.
Awareness of mega-journals amongst focus group participants was low, and the mega-journal model was viewed with suspicion by many participants. We argue that these negative perceptions of the OAMJ model, with its broad subject scope, and lack of filtering for significance and interest, stem from a belief that it fails adequately to meet the needs of communities. This raises significant issues for mega-journal publishers, who clearly have yet to persuade many researchers that their approach adds significant value to the scholarly communications ecosystem. The implementation by some mega-journals (most notably PLOS ONE) of clearly defined disciplinary subsections suggests awareness on the part of publishers that researchers want publication venues which specifically serve their communities. Whether this approach will be enough to overcome resistance to the model remains to be seen.
While recent years have seen some evidence of attempted disruption to the status quo in scholarly communication our findings suggest that many researchers are still operating in a traditional publication paradigm, with prestige and high JIF journals at the top of a tiered hierarchy. Widespread acceptance of and engagement with mega-journals requires a fundamental shift in these community norms, something which may prove beyond the scope of any single publisher or journal type. Given the rise of alternative models of open and unfiltered dissemination (such as pre-print servers, and subject and institutional repositories), many of which appear already to have achieved some community acceptance, the challenge for mega-journal advocates is twofold; not only to facilitate the system-wide shift in values that is required for the mega-journal approach to find widespread acceptance, but also to adapt the model better to meet the specific needs of multiple levels of community.
Table I. Career stage of focus group participants (columns: Lecturer; Senior Lecturer; Reader; Professor; PhD students/post-docs; Other)
Table II. Focus group participants by discipline (institutions represented: A, B, C, D; A, D, E; A, D, E; A, C, E)
The source data used for the purposive sampling were the most recent Higher Education Statistics Agency (HESA) tables available at the time, covering 2013/2014 (HESA, 2015).
These findings corroborate studies investigating the relative importance of journals within certain disciplines that have found there to be perceived hierarchies of prestige (Kohl and Davis, 1985), which may or may not map onto quantitative journal metrics.
As with the terms “prestige” and “reputation” the phrase “top journals” was not precisely defined by participants. In some contexts it was used to refer to quantitative rankings based on citation metrics (such as the JIF) and in others “top” was in relation to subjective rankings based on disciplinary expertise. See Taylor and Willett (2017) for further explanation of ways in which disciplinary communities rank journals.
Predatory publishers exploit the Gold OA model by charging an Article Processing Charge without providing all the expected publishing services, such as peer review (Butler, 2013). They deploy deceptive methods such as falsely listing prominent academics as editorial board members, offering editorships for a price and obscuring APCs until after an article has been accepted.
The term “science system” is a concept from the sociology of science used to describe the macro environment in which disciplines operate, e.g. national research evaluation frameworks and funding opportunities. The term is taken to encompass a broader range of disciplines than just the physical and applied sciences.
Anecdotally, although ERA in Australia has not used a Ranked Journal list since 2010, it is interesting to note that the FAQs for ERA 2018 still include the question “Why is the Ranked Journal list no longer used?”
Abbott, A.D. (2001), Chaos of Disciplines, University of Chicago Press, Chicago, IL.
Adie, E. (2009), “Commenting on scientific articles (PLoS edition)”, Nascent, available at: http://blogs.nature.com/nascent/2009/02/commenting_on_scientific_artic.html (accessed 21 September 2017).
Anderson, K. (2010), “PLoS’ squandered opportunity – their problems with the path of least resistance [Web log post]”, Scholarly Kitchen Blog, available at: http://scholarlykitchen.sspnet.org/2010/04/27/plos-squandered-opportunity-the-problem-with-pursuing-the-path-of-least-resistance/ (accessed 27 January 2016).
Anderson, K. (2016), “The new(ish) kids on the block – touring the megajournals [Web log post]”, The Scholarly Kitchen, available at: https://scholarlykitchen.sspnet.org/2016/04/05/the-newish-kids-on-the-block-touring-the-megajournals/ (accessed 7 June 2016).
Becher, T. and Trowler, P. (2001), Academic Tribes and Territories: Intellectual Enquiry and the Culture of Disciplines, Open University Press, Buckingham.
Berkenkotter, C. and Huckin, T.N. (1995), Genre Knowledge in Disciplinary Communication: Cognition, Culture, Power, L. Erlbaum Associates, Abingdon, available at: https://books.google.co.uk/books/about/Genre_knowledge_in_disciplinary_communic.html?id=ggJtAAAAIAAJ&redir_esc=y (accessed 8 November 2017).
Björk, B.-C. (2015), “Have the ‘mega-journals’ reached the limits to growth?”, PeerJ, Vol. 3, p. e981, available at: https://peerj.com/articles/981 (accessed 26 May 2017).
Braun, V. and Clarke, V. (2006), “Using thematic analysis in psychology”, Qualitative Research in Psychology, Vol. 3 No. 2, pp. 77-101, available at: www.tandfonline.com/doi/abs/10.1191/1478088706qp063oa (accessed 7 November 2013).
Butler, D. (2013), “Investigating journals: the dark side of publishing”, Nature, Vol. 495 No. 7442, pp. 433-435, available at: www.ncbi.nlm.nih.gov/pubmed/23538810 (accessed 4 December 2017).
Cohen, A.P. (1985), The Symbolic Construction of Community, Ellis Horwood and Tavistock Publications, London, available at: http://lasisummerschool.com/wp-content/uploads/2016/12/Long-VErsion-of-Anthony-Cohen-PLease-only-read-selected-pages-in-Syllabus-1.pdf (accessed 26 June 2017).
Cope, B. and Phillips, A. (Eds) (2014), The Future of the Academic Journal, 2nd ed., Chandos Publishing, Oxford.
Crane, D. (1972), Invisible Colleges; Diffusion of Knowledge in Scientific Communities, University of Chicago Press, Chicago, IL, available at: https://books.google.co.uk/books/about/Invisible_Colleges_Diffusion_of_Knowledg.html?id=me9DOwAACAAJ&redir_esc=y (accessed 8 November 2017).
Fry, J., Probets, S., Creaser, C., Greenwood, H., Spezi, V. and White, S. (2009), “PEER behavioural research baseline: authors and users vis-à-vis journals and repositories”, available at: www.peerproject.eu/fileadmin/media/reports/Final_revision_-_behavioural_baseline_report_-_20_01_10.pdf (accessed 17 November 2017).
Fry, J., Oppenheim, C., Creaser, C., Johnson, W., Summers, M., White, S., Butters, G., Craven, J., Griffiths, J. and Hartley, D. (2009), “Communicating knowledge: how and why UK researchers publish and disseminate their findings”, Research Information Network (RIN), London, available at: www.rin.ac.uk/system/files/attachments/Communicating-knowledge-report.pdf (accessed 17 November 2017).
Garfield, E. (1979), Citation Indexing: Its Theory and Application in Science, Technology, and Humanities, Wiley, New York, NY.
Harley, D., Acord, S.K., Earl-Novell, S., Lawrence, S. and King, C.J. (2010), “Assessing the future landscape of scholarly communication: an exploration of faculty values and needs in seven disciplines”, Center for Studies in Higher Education, Berkeley, CA, available at: http://escholarship.org/uc/item/15x7385g#page-11 (accessed 12 September 2017).
Harrison, T. and Stephen, T.D. (1995), “The electronic journal as the heart of an online scholarly community”, Library Trends, Vol. 43 No. 4, pp. 592-608, available at: www.ideals.illinois.edu/bitstream/handle/2142/7985/librarytrendsv43i4g_opt.pdf?sequence=1 (accessed 8 November 2017).
Herron, T.L. and Hall, T.W. (2004), “Faculty perceptions of journals: quality and publishing feasibility”, Journal of Accounting Education, Vol. 22 No. 3, pp. 175-210, available at: www.sciencedirect.com/science/article/pii/S0748575104000375 (accessed 8 November 2017).
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S. and Rafols, I. (2015), “Bibliometrics: the Leiden Manifesto for research metrics”, Nature, Vol. 520 No. 7548, pp. 429-431, available at: www.nature.com/doifinder/10.1038/520429a (accessed 11 September 2017).
Hjørland, B. and Albrechtsen, H. (1995), “Toward a new horizon in information science: domain-analysis”, Journal of the American Society for Information Science, Vol. 46 No. 6, pp. 400-425.
Housewright, R., Schonfeld, R.C. and Wulfson, K. (2013), “UK survey of academics 2012”, available at: www.rluk.ac.uk/ (accessed 18 July 2017).
Kohl, D.F. and Davis, C.H. (1985), “Ratings of journals by ARL library directors and deans of library and information science schools”, College and Research Libraries, Vol. 46 No. 1, pp. 40-47, available at: www.ideals.illinois.edu/bitstream/handle/2142/40859/crl_46_01_40_opt.pdf?sequence=2 (accessed 8 November 2017).
McMillan, D.W. (1996), “Sense of community”, Journal of Community Psychology, Vol. 24 No. 4, pp. 315-325, available at: http://doi.wiley.com/10.1002/%28SICI%291520-6629%28199610%2924%3A4%3C315%3A%3AAID-JCOP2%3E3.0.CO%3B2-T (accessed 5 July 2017).
McMillan, D.W. and Chavis, D.M. (1986), “Sense of community: a definition and theory”, Journal of Community Psychology, Vol. 14 No. 1, pp. 6-23, available at: http://doi.wiley.com/10.1002/1520-6629%28198601%2914%3A1%3C6%3A%3AAID-JCOP2290140103%3E3.0.CO%3B2-I (accessed 8 November 2017).
Menachemi, N., Hogan, T.H. and DelliFraine, J.L. (2015), “Journal rankings by health management faculty members: are there differences by rank, leadership status, or area of expertise?”, Journal of Healthcare Management/American College of Healthcare Executives, Vol. 60 No. 1, pp. 17-28, available at: www.ncbi.nlm.nih.gov/pubmed/26529989 (accessed 8 November 2017).
Müller, R. and de Rijcke, S. (2017), “Exploring the epistemic impacts of academic performance indicators in the life sciences”, Research Evaluation, Vol. 26 No. 3, pp. 157-168, available at: https://academic.oup.com/rev/article-lookup/doi/10.1093/reseval/rvx023 (accessed 8 September 2017).
Neylon, C. and Wu, S. (2009), “Article-level metrics and the evolution of scientific impact”, PLoS Biology, Vol. 7 No. 11, p. e1000242, available at: http://dx.plos.org/10.1371/journal.pbio.1000242 (accessed 25 May 2017).
Nicholas, D., Williams, P., Rowlands, I. and Jamali, H.R. (2010), “Researchers’ e-journal use and information seeking behaviour”, Journal of Information Science, Vol. 36 No. 4, pp. 494-516, available at: http://journals.sagepub.com/doi/10.1177/0165551510371883 (accessed 11 September 2017).
Nicholas, D., Watkinson, A., Volentine, R., Allard, S., Levine, K., Tenopir, C. and Herman, E. (2014), “Trust and authority in scholarly communications in the light of the digital transition: setting the scene for a major study”, Learned Publishing, Vol. 27 No. 2, pp. 121-134, available at: http://doi.wiley.com/10.1087/20140206 (accessed 8 September 2017).
Nicholas, D., Watkinson, A., Boukacem-Zeghmouri, C., Rodríguez-Bravo, B., Xu, J., Abrizah, A., Świgoń, M. and Herman, E. (2017), “Early career researchers: scholarly behaviour and the prospect of change”, Learned Publishing, Vol. 30 No. 2, pp. 157-166, available at: http://doi.wiley.com/10.1002/leap.1098 (accessed 8 September 2017).
Patterson, M. (2009), “PLoS journals – measuring impact where it matters”, The Official PLOS Blog, available at: http://blogs.plos.org/plos/2009/07/plos-journals-measuring-impact-where-it-matters/ (accessed 20 April 2017).
Research Information Network (RIN) (2007), “Researchers’ use of academic libraries and their services”, Research Information Network (RIN), London.
Rijcke, S.D., Wouters, P.F., Rushforth, A.D., Franssen, T.P. and Hammarfelt, B. (2016), “Evaluation practices and effects of indicator use–a literature review”, Research Evaluation, Vol. 25 No. 2, pp. 161-169, available at: https://academic.oup.com/rev/article-lookup/doi/10.1093/reseval/rvv038 (accessed 8 September 2017).
Saldaña, J. (2013), The Coding Manual for Qualitative Researchers, SAGE Publications, London.
Sands, R. (2014), “Comparing the results from two surveys of BMJ Open authors [Web log post]”, BMJ Blogs, available at: http://blogs.bmj.com/bmjopen/2014/05/09/comparing-the-results-from-two-surveys-of-bmj-open-authors/ (accessed 12 December 2015).
Sivertsen, G. (2017), “Unique, but still best practice? The Research Excellence Framework (REF) from an international perspective”, Palgrave Communications, Vol. 3, p. e17078, available at: www.nature.com/articles/palcomms201778.pdf (accessed 12 January 2018).
Solomon, D.J. (2014), “A survey of authors publishing in four megajournals”, PeerJ, Vol. 2, p. e365, available at: https://peerj.com/articles/365 (accessed 3 December 2015).
Spezi, V., Wakeling, S., Pinfield, S., Creaser, C., Fry, J. and Willett, P. (2017), “Open-access mega-journals: the future of scholarly communication or academic dumping ground? A review”, Journal of Documentation, Vol. 73 No. 2, pp. 263-283, available at: www.emeraldinsight.com/doi/10.1108/JD-06-2016-0082 (accessed 24 March 2017).
Swales, J.M. (1990), Genre Analysis: English in Academic and Research Settings, Cambridge University Press, Cambridge, available at: https://books.google.co.uk/books/about/Genre_Analysis.html?id=shX_EV1r3-0C (accessed 8 November 2017).
Swales, J.M. (1998), Other Floors, Other Voices: A Textography of a Small University Building, Lawrence Erlbaum Associates, Mahwah, NJ.
Taylor, L. and Willett, P. (2017), “Comparison of US and UK rankings of LIS journals”, Aslib Journal of Information Management, Vol. 69 No. 3, pp. 354-367, available at: www.emeraldinsight.com/doi/10.1108/AJIM-08-2016-0136 (accessed 8 November 2017).
Tenopir, C. et al. (2016), “What motivates authors of scholarly articles? The importance of journal attributes and potential audience on publication choice”, Publications, Vol. 4 No. 3, p. 22, available at: www.mdpi.com/2304-6775/4/3/22/htm (accessed 22 August 2016).
Tönnies, F. (1957), Community and Society (Translated by C.P. Loomis), Dover Publications, Mineola, NY, available at: https://books.google.co.uk/books?id=sKcITieRERYC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false (accessed 8 November 2017).
Trowler, P. (2014), “Academic tribes and territories: the theoretical trajectory”, Österreichische Zeitschrift Für Geschichtswissenschaften, Vol. 25 No. 3, pp. 17-26.
Tucker, B.P. and Lowe, A.D. (2014), “Practitioners are from mars: academics are from Venus?”, Accounting, Auditing & Accountability Journal, Vol. 27 No. 3, pp. 394-425, available at: www.emeraldinsight.com/doi/10.1108/AAAJ-01-2012-00932 (accessed 17 November 2017).
Wakeling, S. et al. (2016), “Open-access mega-journals: a bibliometric profile”, PLOS ONE, Vol. 11 No. 11, p. e0165359, available at: http://dx.plos.org/10.1371/journal.pone.0165359 (accessed 26 May 2017).
Wakeling, S., Spezi, V., Creaser, C., Fry, J., Pinfield, S. and Willett, P. (2017), “Open-access megajournals: the publisher perspective (Part 2: Operational realities)”, Learned Publishing, Vol. 30 No. 4, pp. 313-322.
Wakeling, S., Spezi, V., Fry, J., Creaser, C., Pinfield, S. and Willett, P. (2017), “Open Access megajournals: the publisher perspective (Part 1: Motivations)”, Learned Publishing, Vol. 30 No. 4, pp. 301-311, available at: http://doi.wiley.com/10.1002/leap.1117 (accessed 18 October 2017).
Ware, M. and Mabe, M. (2015), The STM Report, International Association of Scientific, Technical and Medical Publishers, The Hague, available at: www.markwareconsulting.com/the-stm-report/ (accessed 12 June 2017).
Whitley, R. (2000), The Intellectual and Social Organization of the Sciences, Oxford University Press, Oxford.
Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S. and Johnson, B. (2015), The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management, Higher Education Funding Council for England, Bristol, available at: www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/ (accessed 8 November 2017).
The research was funded by a grant from the UK Arts and Humanities Research Council (AH/M010643/1). The authors also thank all focus group participants and interviewees for their contribution to the research.