Performance Measurement and Metrics

ISSN: 1467-8047

Article publication date: 20 March 2009



Thornton, S. (2009), "Editorial", Performance Measurement and Metrics, Vol. 10 No. 1. https://doi.org/10.1108/pmm.2009.27910aaa.001



Emerald Group Publishing Limited

Copyright © 2009, Emerald Group Publishing Limited


Article Type: Editorial From: Performance Measurement and Metrics, Volume 10, Issue 1

It is amazing the strange things that you can learn just through being an editor of an international journal. For example, that “Merry Christmas” in Estonian is “Ilusad Jõulud”, or that “puthis” are ancient manuscript books of religious stories or fairy tales from Bengal. This issue of PMM brings together authors from the USA, Bangladesh, Thailand and The Netherlands.

Although the last issue was dedicated to LibQUAL+®, its growing importance within the performance measurement community justifies starting this issue with yet another paper on the subject.

I was fortunate enough to attend the presentation of Danuta Nitecki’s 1997 paper at Morpeth (Nitecki, 1997), which demonstrated that SERVQUAL could be adapted and made applicable to the library environment, and was quite taken by the idea. I found myself discussing the possibilities with Colleen Cook and Fred Heath at lunch, and came away enthused enough to pilot a modified electronic version of my own later that year. Where it all fell down for me was the length of time it took for my users to complete it. I had sent it to 200 scientists, with the warning that it would take time, and asked for volunteers to fill it in. Although we usually got response rates in excess of 50 per cent for our surveys, we managed to get just 10 per cent who struggled through to the bitter end. The results were sadly unusable. In four of the five dimensions, our users perceived our service level to be far in excess of what they expected – we were let down by a pretty decrepit old library – and it was suggested by my management that I had better keep the facts quiet, since too good a service would be taken as ripe for cutting back. Colleen and Fred did not face such issues, and went on to be instrumental in developing LibQUAL+, one of the most widely used tools available today.

Bruce Thompson, Colleen Cook and Martha Kyrillidou present the results of what might actually be a radical sea change in the application of LibQUAL+. Too complex a questionnaire drives away respondents, and may ensure that the respondents who do complete it are not fully representative. Too simple a questionnaire will not supply you with the results you need. LibQUAL+® Lite uses matrix sampling, which enables data to be collected on all the desired survey items without requiring every participant to answer every survey question. The results, with much greater completion rates, have been so encouraging that the authors argue for the adoption of LibQUAL+® Lite in the future, to the exclusion of LibQUAL+®.
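The core idea of matrix sampling can be sketched in a few lines. This is not the LibQUAL+® Lite implementation – the item counts and the round-robin assignment scheme below are my own illustrative assumptions – but it shows how every item can be fielded across a sample while each respondent sees only a short form:

```python
import random

def assign_items(items, per_respondent, n_respondents, seed=0):
    """Matrix sampling sketch: deal items round-robin from a shuffled
    order, so all items are covered across the sample even though each
    respondent answers only a subset of them."""
    rng = random.Random(seed)
    order = list(items)
    rng.shuffle(order)
    assignments, pos = [], 0
    for _ in range(n_respondents):
        # Take the next `per_respondent` items, wrapping around the list.
        block = [order[(pos + i) % len(order)] for i in range(per_respondent)]
        assignments.append(block)
        pos = (pos + per_respondent) % len(order)
    return assignments

# Hypothetical survey: 22 core items, each respondent answers only 8.
forms = assign_items(range(22), 8, 120)
```

Because the dealing wraps around the shuffled list, full item coverage is guaranteed after only a handful of respondents, and item-level statistics can then be pooled across the whole sample.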

Zabed Ahmed and Zahid Hossain Shoeb present the results of a modified SERVQUAL service quality survey of University of Dhaka’s library services. It is interesting that although they looked in detail at LibQUAL+ and SERVPERF, they felt that neither adequately met the local needs of a university in the developing world. I wonder whether there is a somewhat deeper truth in what they say. While the drive for quality improvements has been led by Western Europe, North America and the ANZAC countries, how much thought has been given to the fundamental nature of the tools that have been created and their applicability elsewhere in the world? Do we know enough about these library services, and the particular problems and issues that countries outside our “comfort zone” face, to provide a meaningful response? I know that I have never really thought about it until now – I have always tended to think of a library in the USA as the same as one in the UK and identical to one in Mombasa or Kyrgyzstan. Different in scale and/or resources, but fundamentally the same. But are they, or are there such fundamental differences that we really need to rethink some of our basic preconceptions?

One author who has looked at different approaches taken to information issues in different countries is Ricardo Gomez, who, along with Rucha Ambikar and Chris Coward, reports on ongoing research comparing public information access in 25 countries around the world in such diverse venues as libraries, cybercafés and telecentres. Their work pulls together research carried out in countries as culturally and geographically disparate as Colombia and Kazakhstan, Namibia and Nepal, Malaysia and Moldova. Their conclusions and initial findings are of international importance, and deserve a wide audience among the major international organisations.

Peep Miidla and Konrad Kikas have used data envelopment analysis (DEA), a non-parametric linear programming-based tool, to determine the relative efficiency of 20 central public libraries of Estonia. This is a tool much favoured by economists, but one sadly neglected by librarians. At a brief glance I almost dismissed their paper, as my gut feeling was that there are so many parameters in play that it would be impossible to draw meaningful results from the data sets that Peep and Konrad had to draw upon. However, the results changed my mind.
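For readers unfamiliar with the technique, the standard input-oriented CCR formulation of DEA reduces to solving one small linear programme per library. The sketch below is a generic textbook version – it is not necessarily the exact model specification Miidla and Kikas used, and the toy data are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency scores for n decision-making units.

    X: (n, m_in) array of inputs (e.g. staff, budget, stock).
    Y: (n, m_out) array of outputs (e.g. loans, visits).
    For each unit o, find non-negative weights u (outputs) and v (inputs)
    maximising u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all j.
    """
    n, m_in = X.shape
    m_out = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [u (m_out weights), v (m_in weights)].
        c = np.concatenate([-Y[o], np.zeros(m_in)])          # maximise u.y_o
        A_eq = np.concatenate([np.zeros(m_out), X[o]]).reshape(1, -1)
        b_eq = [1.0]                                          # normalise v.x_o = 1
        A_ub = np.hstack([Y, -X])                             # u.y_j - v.x_j <= 0
        b_ub = np.zeros(n)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=(0, None), method="highs")
        scores.append(-res.fun)                               # efficiency in (0, 1]
    return np.array(scores)
```

A unit scoring 1.0 lies on the efficient frontier; a unit scoring, say, 0.53 achieves only 53 per cent of the output its inputs would support at frontier performance, which is exactly the kind of figure discussed below.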

Here we have 20 major library services that have undergone a radical period of change, renewal and investment. Just how effective have those changes been, and how wisely have those investments been used? DEA gives you a baseline figure which – although not gospel, as I fear the accountants and economists might assume and act upon – highlights critical areas that decision makers need to investigate. Library No. 12 is a case in point. Using the DEA efficiency rating, this library has declined in performance every year, so that by year four it was down to just 53 per cent of the ideal. There could be many reasons for this, but in my experience, given increases in resources, either there is no real need for the library service (unlikely) or the local management is not good enough for the task in hand. As an indicator, and a tool for identifying areas of concern, DEA shows real promise. As a hard and fast measure of efficiency it needs to be treated with caution.

Finally, Henk Voorbij gives an overview of the development of academic library benchmarking in The Netherlands. He describes how they chose to go down certain paths and not others, and compares their presentation of results with those of several other major national schemes. This is a long-running project and it is interesting to see the changes that were made, as well as some of the mistakes. A comprehensive and illustrative overview.

Steve Thornton


Nitecki, D. (1997), “Assessment of service quality in academic libraries: focus on the applicability of the SERVQUAL”, Proceedings of the 2nd Northumbria International Conference on Performance Measurement in Libraries and Information Services, Morpeth, September, pp. 181-196.