Editorial

Performance Measurement and Metrics

ISSN: 1467-8047

Article publication date: 6 July 2010


Citation

Thornton, S. (2010), "Editorial", Performance Measurement and Metrics, Vol. 11 No. 2. https://doi.org/10.1108/pmm.2010.27911baa.001

Publisher: Emerald Group Publishing Limited

Copyright © 2010, Emerald Group Publishing Limited



Once again we are able to bring you papers from no fewer than four continents (South Africa, New Zealand, Bulgaria, Canada and the USA), together with a book review from the UK.

William Daniels, Colin Darch and Karin de Jager present us with a report on their implementation of a Research Commons at the University of Cape Town. It is explicitly intended to provide an enhanced facility where postgraduate students and “emerging researchers” can obtain advanced research support from experienced librarians who are ready to assist them at all times, or to refer them to subject or information technology specialists whenever required. Although they are not the first to do this, their critical appraisal of progress so far is illuminating and informative.

It compares favourably with my own solution to meeting the information needs of the researchers at the Defence Science and Technology Laboratory in the early part of this decade. My particular problem lay in recruiting from scratch an adequate team of information intermediaries; I could not get enough competent librarians, and resorted (very successfully) to post-doctoral researchers. What William and his colleagues do not appear to have introduced is a concept I was particularly proud of, the “anti-bistro”: a room without any phone or network connections where researchers could go and get down to serious study without any distractions.

Rowena Cullen and Brenda Chawner report on the development of institutional repositories in New Zealand, established to preserve and organise an institution’s research output, to make it more readily available, and to enhance the reputation of the institution as well as that of the individual researcher. They explore the factors affecting the adoption and success of institutional repositories from the perspectives of both the library managers who established them and the academic community who contribute to, as well as use, them. From the survey it is apparent that academics have been slow to embrace the concept, showing little interest in using repositories to increase the accessibility of their own work, or even to access the work of others – results that parallel worldwide experience. All in all, a very disappointing and disheartening set of results and conclusions for all those who have invested time, effort and spirit in creating such repositories.

Savina Kirilova has carried out an extensive survey of the web presence of 42 Bulgarian university libraries, looking at their content, structure and usability. These sites varied from the good to the indifferent, and as she says, “although a start has been made there is still much work to be done in order to achieve the stated purposes and the quality virtual representation of university libraries in Bulgaria”. A very competent and thorough survey and analysis.

The next five papers have been chosen by Jim Self, Steve Hiller and Martha Kyrillidou from those presented at the Library Assessment Conference held in 2008. This new North American biennial conference series runs in alternate years to the Northumbria Performance Measurement and Metrics conferences. Following the success of that conference, the next will be held this October in Baltimore (http://libraryassessment.org/schedule/index.shtml), and it looks like being a first-rate colloquium.

The first of these papers, by Sam Kalb, describes a consortial approach to library assessment using LibQUAL+® among Canadian academic libraries. With some 48,000 respondents, the questionnaire offers Canadian library researchers a unique opportunity to study academic service quality data at a level of granularity not possible from individual library results. Further surveys planned in 2010 will provide an additional set of valuable time-series data to help libraries assess the success of new cooperative initiatives and changes in client expectations and perceptions over time.

Zsuzsa Koltay and Kornelia Tancheva outline a fast-track process Cornell used to develop a user-focused vision and recommendations on how Cornell University Library should present itself and the information landscape to its users. Local interviews probed audience work habits and needs, which were then synthesised into composite personas segmented on the basis of “like” behaviour. These “imaginary friends” helped validate and supplement user studies done elsewhere and existing quantitative data from Cornell, thus influencing all the decisions and recommendations that the team produced.

I had been completely unaware of this “persona” technique, which seems to have been used almost exclusively in industrial settings, but they have found it a useful and relevant benchmark for the academic library environment; it also served as an assessment tool for the research practices and work habits of their audiences, and for the gaps that exist between their users’ needs and expectations and the state of CUL’s information universe. A very attractive and innovative tool to add to our armoury.

Impact measurement is something I have always been interested in – just how much impact does the work we do have on the lives of our customers and our organisations? Sure, we might think that a service is useful, but does it actually have any impact? The work I did on the subject looked at impact at a purely institutional level, but Terry Plum, Brinley Franklin, Martha Kyrillidou, Gary Roebuck and MaShana Davis, building on MINES for Libraries®, have attempted to develop a reliable and scalable survey tool to assess the utility, accessibility and impact of the usage of their networked resources and services. Using short, standardised web surveys placed at the point of use of networked electronic resources and services, through a network assessment infrastructure that uses contemporary mechanisms of authentication and access, they will gather data from a variety of modules, institutions and questions; analyse them according to the needs of the participating libraries and consortia; and return that analysis to the institution. A rather exciting proposal, if it reaches deep enough.

Agnes Tatarka, Kay Chapa, Xin Li and Jennifer Rutner describe and collate the various institutional assessment programmes at their libraries at the University of Chicago, Columbia University, the University of Texas Southwestern Medical Center, and Cornell University. These assessment programmes have evolved to reflect local needs and priorities, their libraries’ organisational structure, their institutions’ planning cycle, and the reality of limited resources. I think it significant that their programmes have all been tailored, and it is apparent that a generalised programme would not fit all needs. If such efforts are to be successful and sustainable, they must reflect the institution’s priorities, be based on achievable and realistic goals, and have the support of the library’s administration. Even where a local culture of assessment is encouraged, it cannot be dependent on a single person, and must have honest organisational encouragement rather than just lip service.

Finally, Elizabeth Yakel from the University of Michigan and Helen Tibbo from the University of North Carolina have taken a serious look at user-based evaluation of archives and special collections. This is largely virgin territory, and they have developed an Archival Metrics Toolkit, based on work that has been tested elsewhere, comprising five user-based evaluation instruments addressing researchers, archival websites, online finding aids, student researchers, and teaching support. I fully expect to see much more about this toolkit over the next few years.

I hope you enjoy these papers as much as I did.

Steve Thornton
