The consortium site licence: is it a sustainable model?

Interlending & Document Supply

ISSN: 0264-1615

Article publication date: 1 March 2003


Citation

McGrath, M. (2003), "The consortium site licence: is it a sustainable model?", Interlending & Document Supply, Vol. 31 No. 1. https://doi.org/10.1108/ilds.2003.12231aac.001

Publisher: Emerald Group Publishing Limited

Copyright © 2003, MCB UP Limited



The consortium site licence: is it a sustainable model? A conference organised by the Ingenta Institute on 24 September 2002 at the Royal Society, London

Keywords: Consortia, Licences

The title causes some confusion. A site licence is an agreement between a publisher and a purchaser that allows access to the publisher's complete portfolio of journal titles. It is priced in such a way that many journals not previously available are included for a small proportion of their separate purchase price. The licence may be offered to a consortium of purchasers but can also be offered to a single organisation. As the conference made clear, a consortium also engages in many other activities.

Speakers addressed the issues raised by the research programme commissioned and financed by the Ingenta Institute. Among them were Don King, of the School of Information Sciences at the University of Pittsburgh, who gave an overview of the development of library consortia and their provision of electronic journal services; Hazel Woodward, University Librarian at Cranfield University, on the study of the impact of the consortium site licence on academic libraries; Bill Russell, Sales and Marketing Director of Emerald, who considered the impact of the consortium site licence on publishers, in particular Emerald; and David Nicholas, Department of Information Sciences, City University, London, on the results and analysis of a pilot quantitative analysis of Web log data. In the afternoon a number of speakers looked at the way forward: Carol Tenopir of the School of Information Sciences at the University of Tennessee; David Pullinger of the UK Office for Statistics; Peter Shepherd of COUNTER, which is developing an international code of practice for the measurement of the usage of online information resources; and Pieter Bolman, Director of STM Relations at Elsevier Science.

Don King began his overview with the statements that "Journals must continue to be the primary communication channel, at least in science" and that "Libraries must continue to play a major role"; neither was challenged during the day. He produced some interesting figures: libraries supplied 25 per cent of the material read by readers in the 1970s, a share that has now risen to 50 per cent. A key factor has been the decline of the personal subscription from 5.8 titles to 2.2, no doubt due to the increase in journal prices and the declining real incomes of researchers. (A trend with which this recently retired writer can identify, as he now has to purchase individual subscriptions to many titles!) Aggregated, this means 22 million lost subscriptions from six million scientists in the USA alone, and it forms part of the publishers' justification for price increases, as unit costs rise with declining subscriptions.

He also quoted an average of 100,000 readings per title per annum, but this is misleading on its own. Readings are highly skewed towards a few very popular journals – Nature, Scientific American, the BMJ, etc. – with an enormous tail of many thousands of journals with very low readership. "Most articles are read by the author and their mother" is an old and cynical saying, but it holds more than a grain of truth. Even allowing for the skewing, I find this figure unbelievably high. (Later on, David Pullinger suggested 900 readings per title in the 1990s.) Journal size has increased to an average of 150 articles a year, although the articles themselves are getting shorter – even researchers are suffering from reduced attention spans, or could it be that writing is getting better and tighter? (I am reminded of Balzac's apology to a friend: "I'm sorry that I didn't have time to write you a shorter letter".) The development of e-journals is underlined by the figure of 10,000 of the 17,000 STM journals being available electronically. The breadth of reading has also gone up, from one article from each of 13 titles in the 1970s to 18 titles in 1995 and 20 now; the primary reasons are the growth of online searching and the electronic availability of full text.

Library consortia in the USA grew in the 1960s, 1970s and 1980s but declined in the 1990s as funding dried up. Over 100 types of service are covered. Consortia definitely justify their expense: institutions find the cost of a typical service to be over twice as much when purchased individually. Use costs go down but infrastructure costs go up; as I have argued on a number of occasions, this is one of the "hidden" costs of moving from a conventional hard copy and ILL environment to an "e" environment. However, Don King argued that publishers also benefit: journal exposure goes up, making a journal more attractive to authors, and publishers reduce both administrative and technical costs.

The state of the market has changed dramatically since the 1960s when most STM sales were personal and most publishers (87 per cent) had three titles or less.

King queried whether the above state of affairs is sustainable and suggested that prices may well start to rise steeply again. An aspect of consortium philosophy of interest to ILL librarians is the commitment to keeping net borrowing in balance with net lending. He observed that concentrating lending and/or copying in the hands of a few efficient players was more cost-effective, as unit cost reductions become available. He opined that ILL volumes would be sustained by the demand for older issues, for which the cost per use is very high at $30-40. I queried this in question time, as the experience of BLDSC would suggest otherwise, at least in the shorter term. He acknowledged that any prediction about the future was speculative, so ILL librarians must continue to scratch their heads!

Bill Russell from Emerald stepped in at the last minute and gave a presentation on making the "big deal" partnership work for the user. His presentation summarised the publisher study undertaken by Key Perspectives Ltd for the Ingenta Institute, for which a total of 16 interviews were conducted with publishers including Elsevier, Emerald, AIP and OUP. Publishers want revenue stability and libraries want price stability, and the way to satisfy both was seen as the "big deal". "Big deals" give wider access to the publisher's portfolio, discounted price increases, increased journal visibility, increased usage and value for money; where analysis has taken place, significant increases in usage have been seen. The benefits of the Web in this context are the cheap distribution of electronic journals and the facilitation of communication between author, reader and publisher communities. The study found clear differences between publishers with the will and resources to invest and those without. The majority of publishers base "big deal" offers on the existing level of business plus an annual premium of between 5 and 15 per cent. Large to medium publishers rely on library consortia for 25-58 per cent of their revenue, and large and medium publishers normally guarantee perpetual access to material paid for. (I wonder if this is generally true?) It does appear that the gap between publisher expectations and the reality of finite budgets will begin to widen again, and many smaller publishers feel squeezed out by the "big boys". The consensus of the conference appeared to be that "big deals" would be around for a maximum of five years; they need to be modified in order to give access to a core collection and to other material on a transactional basis.
Some factors emerging for publishers were, inter alia: consortia are beginning to exploit usage figures to drive harder bargains; open access archives pose a possible threat in the longer term; small publishers are in a difficult position; long-term deals need to be negotiated in order to escape market turbulence; and publishers need to attract the best papers to prevent deselection, increase the usage of journals, build long-term relations with customers, explore non-academic markets, and offer more flexibility in pricing and portfolio management.

He then gave an Emerald gloss. Emerald has focussed on a consortia approach, with the first deal being completed in 1998; by 1999 it was clear that there was no alternative. Revenues plateaued and subscription levels fell in 1997-2000, with price increases continuing at a high level simply in order to maintain revenues – the traditional model was running out of steam. Emerald has now moved much closer to its customer base and concentrated on selling the whole portfolio on favourable terms. On moving to Ingenta in July 2001, Emerald benchmarked its monthly downloads against the total for all Ingenta downloads: these were 600,000 per month from 130 journals and 38,000 articles. A study by OhioLINK has since shown that Emerald generated more downloads per article than any other publisher. He also raised the thorny issue of journal usage statistics. These are now more easily available in an electronic journal environment, even if some confusion exists at present, and he suggested that STM publishers could well be the losers if they are exploited by librarians: usage statistics will undoubtedly show wide variations between journals and an overall fairly low download rate for articles, thus weakening the case for "big deal" models. As he pointed out, "The need for flexibility is here to stay". Librarians and users have welcomed the ability to offer a wider range of material, and publishers welcome the stable revenue streams. However, the key driver here is the reality of flat budgets, which may well prejudice the "big deal" model. All in all, a realistic and hard-headed assessment.

Hazel Woodward from Cranfield next introduced the report of the team whose task was "to identify the main operational and strategic issues relating to library consortia licensing of e-journals". All the consortia investigated were academic except for one corporate consortium. The advantages were that costs were down and the "big deal" increased access to more journals; ILL was down, at Cranfield by 5,000 requests per annum (but Cranfield is exceptional in purchasing a very high proportion of STM journals), thus saving time and money. However, some additional funding was needed. Each consortium had signed up an average of 12 e-publishers, giving access to 5,000 titles at a cost of €2 million per annum. So far there is little interest in renewals, as three to five-year deals are being made. More deals are expected, but fewer "big" deals. There was criticism of the inflexibility of publisher pricing models, which generally speaking offered a price of 85-90 per cent of the print subscription for "e" but with additional "e" fees. Many librarians want to move to "e" only, and it was thought that print versions, at least in STM, would disappear within three to four years. Not one interviewee believed that the big deal was sustainable – a point repeated again and again by other contributors during the day. The savings to pay for these deals were made by cancelling titles from smaller publishers, and the "no cancellation" policy of many publishers is causing huge problems. Realism is now kicking in as publishers begin to realise that universities do not want to purchase all their output. There was a feeling among interviewees that agents had missed the boat; they should be repositioning themselves to continue their traditional role in the new hybrid environment. Archiving is still a big unresolved issue, with resistance being expressed to the "moving walls" model; however, it was felt that publishers were doing their best.

Hazel expressed some reservations about the study: it had a short timescale, there were a limited number of interviewees and there was no input from the larger US consortia. Libraries subscribed to 6 per cent fewer journals in 1999 than in 1986 but spent 170 per cent more (ARL statistics). There is pressure for "deep" resource sharing, and Sir Brian Follett's second report is awaited with keen interest (now due for publication in October). Customisation of databases is moving up the agenda, no doubt under pressure from end users overwhelmed by the material available. The low use of "e" resources by undergraduates was confirmed again (see the JUSTEIS report), and the Outsell survey showed that Google is used so much because databases are "too complex".

Dave Nicholas from City University gave a fascinating analysis of Web log data on user behaviour in a "big deal" environment: "The digital 'fingerprints' of tens of thousands of users from around the world were examined via the journals published by Emerald." He praised the design of the Emerald database – there are lessons for other publishers here. In summary, 900 big deal subscribers average 118 journals each (from Emerald). In June 2002 there were 154,000 users, of which 54,000 were big deal users, 3,000 were non-deal users and 97,000 were non-subscribers (including triallists), generating 2,438,000 requests. Triallists were 8 per cent of users but accounted for 25 per cent of usage. Use was defined as a hit on a table of contents, an abstract or an article. One third of all use was from robots – search engines, I presume. A total of 100,000 articles were requested by non-subscribers, "vastly more than doubling every year", but many of these were from the triallists.

Pieter Bolman, Director of STM Relations at Elsevier, then spoke on the "issues to be addressed in moving towards more efficient models of provision, access and pricing". He did not think that the pricing model based on historical patterns would survive. Outsell has suggested that $1.6 billion could be saved on ILL/document delivery through big deal agreements. It is possible that we could return to the "pick and choose" model, dealing directly with publishers (I raised the fairly obvious point that librarians don't like talking to thousands of publishers, which was fairly curtly dismissed).

Carol Tenopir was critical of the "big deal" concept, there being more disadvantages than advantages for users and buyers, and she expected the pressure to opt out to grow. The findings on increases in reading as a result of "big deal" agreements were contradictory: OhioLINK suggested that there was more reading of non-subscribed titles than of subscribed titles (a further indicator was that 40 per cent of downloads at Ohio State University were from previously non-subscribed journals).

NERL showed the opposite! Phillip Davies of Cornell found ten times more demand for subscribed titles than for non-subscribed ones. (Perhaps this is a problem of measurement in the more complex hybrid environment rather than an objective contradiction? If Ohio is correct, then one wonders at the competence of acquisitions librarians!)

So the question arises: which is more cost-effective for peripheral titles, the big deal or document delivery? Carol provided some figures that should help answer this question: for an institutional subscription of US$100, pay-per-view is cost-effective up to a break-even point of 9.5 views of the title, rising to 56.5 views for a subscription of US$1,000; beyond that point the subscription wins. She was strongly of the view that the "all or nothing" deal will not survive. Different site licences will be needed for different types of libraries. There will be no one right way.
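Tenopir's break-even figures follow from a simple ratio: a subscription pays for itself once expected annual views exceed the subscription price divided by the per-view charge. A minimal sketch of that arithmetic follows; the $15 per-view charge is purely illustrative and not a figure from the talk (real document delivery fees vary by supplier).

```python
def breakeven_views(subscription_cost: float, per_view_cost: float) -> float:
    """Views per year at which an institutional subscription
    matches the equivalent pay-per-view spend."""
    return subscription_cost / per_view_cost

# Assumed per-view charge of $15, for illustration only.
print(round(breakeven_views(100.0, 15.0), 1))   # $100 title: subscribe above ~6.7 views
print(round(breakeven_views(1000.0, 15.0), 1))  # $1,000 title: subscribe above ~66.7 views
```

Working backwards, Tenopir's own break-even points (9.5 views at $100, 56.5 at $1,000) imply per-view costs of roughly $10.50 and $17.70 respectively, so her model evidently assumed charges that scale with the title rather than a single flat fee.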

There was widespread hostility to the current all-or-nothing "big deal" model on offer; no publisher in the audience or on the platform defended it. So the implications for document delivery are quite interesting. Demand at the British Library Document Supply Centre dropped by 8 per cent in the UK as a result of the big deals but has now stabilised. My own view, confirmed by this meeting, is that a new balance will be struck between document delivery and "e" deals. Librarians will want to pare away peripheral titles and go back to document delivery via integrated current awareness and electronic delivery, ideally directly to the end user. However, publishers will charge a price for this paring away, and where exactly the balance will be struck is anyone's guess. Whatever happens, there remains a healthy future for the pay-per-view/document delivery option. Apart from anything else, it is the only sure curb on publisher prices!

Mike McGrath, Editor, ILDS
