Innovative impact planning and assessment through Global Libraries

David Streatfield (Information Management Associates, Twickenham, United Kingdom)
Pablo Andrade Blanco (Department of the National System of Public Libraries and the BiblioRedes Program, Santiago, Chile)
Marcel Chiranov (MC Performance Management, Bucharest, Romania)
Ieva Dryžaite (Libraries for Innovation 2 Project, Vilnius, Lithuania)
Maciej Kochanowicz (Formerly Information Society Development Foundation, Warsaw, Poland)
Tetiana Liubyva (IREX, Kiev, Ukraine)
Yuliya Tkachuk (IREX, Kiev, Ukraine)

Performance Measurement and Metrics

ISSN: 1467-8047

Article publication date: 13 July 2015


Abstract

Purpose

The purpose of this paper is to describe a range of methods and tools that are innovative for public library performance measurement and impact assessment, developed by country teams as part of the Global Libraries (GL) initiative. Short reports are provided on: a return on investment study, a simplified data processing system for library managers and an online reporting system for public libraries in Ukraine; a study of the public image of Polish libraries in the print mass media; two approaches to sustainability of performance measurement and impact assessment in Romania, through tools to conduct pop-up surveys and the use of agricultural subsidies support data; assessments of changes in public library managers’ planning efforts in Poland and of their perceptions of libraries and their own role, using Modified Delphi forecasting, in Lithuania; and two ways of focussing on the world of public library users, by engaging non-profit organizations in library research in Poland and conducting impact studies in virtual environments in Chile.

Design/methodology/approach

A range of methods and tools and their uses are described.

Findings

No specific research findings are reported.

Research limitations/implications

All of these tools and methods have been (or are being) trialed in national public library contexts; some have been developed over several years.

Practical implications

Useful for people in other (non-GL) countries who may be contemplating public library evaluation at regional, national or local level or who are interested in performance measurement and impact evaluation.

Social implications

This paper is part of a GL effort to share what participants have learnt about impact planning and assessment in public libraries with the wider international libraries community.

Originality/value

The impact planning and assessment program of GL has been the largest sustained international public library evaluation program so far attempted. This paper reports on the more innovative evaluation activities undertaken at country level through this program.

Citation

Streatfield, D., Andrade Blanco, P., Chiranov, M., Dryžaite, I., Kochanowicz, M., Liubyva, T. and Tkachuk, Y. (2015), "Innovative impact planning and assessment through Global Libraries", Performance Measurement and Metrics, Vol. 16 No. 2, pp. 177-192. https://doi.org/10.1108/PMM-04-2015-0011

Publisher

Emerald Group Publishing Limited

Copyright © 2015, Authors. Published by Emerald Group Publishing Limited. This work is published under the Creative Commons Attribution (CC BY 3.0) Licence. Anyone may reproduce, distribute, translate and create derivative works of the article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licenses/by/3.0/legalcode .


1. Introduction

This paper describes 11 methods and tools which were developed or modified for use in public library evaluation in five of the countries involved in the Global Libraries (GL) initiative. Why are only five of the 16 countries preparing proposals or receiving country grants represented here? Whatever their skills and experience before joining their GL country teams, there is a common pattern to how the impact specialists go about the work. All of them have a huge task at the outset in learning about the GL approach to impact planning and assessment, their colleagues, their program and their public library system. They then have a heavy workload in organizing needs assessments, baseline surveys and follow-on studies. However, when they reach the stage of having institutionalized the team’s approach to impact assessment they start to look at other ways of assessing impact and streamlining performance measurement. All but one of the authors has been through this process (and this one regularly sees the same pattern emerging with new grantees). Here they describe some assessment methods and tools that have been developed and applied at country level; most of these are relatively new to public library evaluation, particularly at national level. They also offer some of the lessons from their experience in applying these approaches that may interest people concerned with library evaluation in other countries (Readers may like to read this paper in combination with Paley et al., 2015, which describes some of the innovative impact assessment tools developed by the GL team for use within, and potentially beyond, all GL countries).

The 11 short accounts are arranged below under three main headings: these explore data management and exploitation to achieve sustainable impact assessment (the GL preferred term for what is often referred to as impact evaluation); evaluation methods developed to meet particular challenges; and evaluation tools that have been adapted for use with public libraries.

2. Exploiting evaluation data for sustainability

2.1. Simplified data processing for library managers in Ukraine

Performance measurement and impact assessment both require good quality results and a significant issue in many countries is how to strengthen the public library evidence base. The Ukraine project team (known as Bibliomist) has addressed this issue in two ways.

They have designed, and are beginning to implement, a system to collect data on computer and internet use to help library managers better assess and plan technology-related services. The system uses the open source Data Giraffe software and is linked to the pop-up survey program implemented by Ukraine and Romania (Chiranov, 2011); the Romanian version is described elsewhere in this paper.

This system builds on the experience of the project team, who found that its usage tracking software helped in making management decisions, monitoring program activities and identifying which libraries needed assistance (such as detecting computers that had hardware or internet problems). That system has now been improved to simplify data processing for people who are not professionals in statistical data analysis, by providing a set of service variables for data disaggregation. The system’s tracking method enabled the team to meet the GL requirement for workstation use data on “the total number of hours workstations […] are actually used by library visitors, divided by the total […] hours during which the library service point is open to the public.”

Application of Data Giraffe was initiated by IREX (the NGO that manages the GL grant in Ukraine) in 2014. It is composed of two modules: the client software (installed on the computers in libraries) and the administrative interface, where the administrator generates reports from the database hosted on the server. The software is suitable for most currently used computers and for both broadband and 3G internet connections; it automatically collects accurate data on usage of every computer in the system. It is also a tool for receiving fast feedback directly from computer users, thanks to the incorporated pop-up questionnaire function.

This data processing system can be adapted to the territorial structure of different countries and it allows for computers to be selected using various criteria. For example, it is possible to choose computers by type of settlement/name of location/name of library or to use other criteria and receive aggregated usage statistics for these selected computers.

The system provides initial data processing, such as for workstation usage rates of selected computers and percentages of answers to survey questions, so that library managers can receive results automatically instead of having to make their own calculations. An important feature of the software is that the workstation usage rate is calculated using the actual hours each computer is accessible to the public, since opening hours and days vary. The system is easy to install and use, so it is good for administrators with limited technical and statistical skills. Once the system is set up, the administrator is able to create and edit questions and surveys and to analyze the results.
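To make the arithmetic concrete, the following is a minimal sketch in Python of the usage-rate calculation described above, with invented figures standing in for the Data Giraffe logs; it illustrates the metric, not the project’s software.

```python
# Minimal sketch of the GL workstation usage rate: hours a workstation is
# actually used by visitors divided by the hours it is open to the public.
# The figures below are hypothetical; real values come from usage logs.

usage_logs = [
    # (computer_id, hours_used_in_month, hours_open_in_month)
    ("PC-01", 96.5, 160),
    ("PC-02", 40.0, 160),
    ("PC-03", 120.0, 208),  # this branch opens longer hours
]

def usage_rate(hours_used: float, hours_open: float) -> float:
    """Share of open hours during which the workstation was in use."""
    return hours_used / hours_open if hours_open else 0.0

# Per-computer rates, using each machine's actual accessible hours.
for computer_id, used, open_hours in usage_logs:
    print(f"{computer_id}: {usage_rate(used, open_hours):.0%}")

# Aggregate rate for the selected computers (e.g. one settlement or library).
total_used = sum(used for _, used, _ in usage_logs)
total_open = sum(open_hours for _, _, open_hours in usage_logs)
print(f"Aggregate usage rate: {usage_rate(total_used, total_open):.0%}")
```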

Currently, representatives from eight main libraries in five of Ukraine’s 24 regions have been trained in using the software and are ready to launch the system. The main libraries in another five regions will receive training and start deploying the system by the end of 2015 (the Beyond Access program in the Philippines has also begun to use this system for its libraries).

2.2. An online reporting system (ORS) for Ukraine public libraries

The Ukrainian public library system consists of about 18,000 libraries that report to the Ministry of Culture. Their reporting mechanism was time-consuming and did not provide accurate or relevant data on the performance of libraries across the country. Once a year, each library sent a report to the library at the next step up in the hierarchy: village and town branches reported to the central library, which in turn reported to the regional library, which sent the data to the Ministry. The data were consolidated at each reporting level on the way up, so the Ministry of Culture received only aggregated numbers and there was no reliable way to verify the figures that traveled through this route. Processing the paper reports through to public release took approximately 12 months, so even that information was outdated. Moreover, the indicators included in the system reflected the traditional functions of libraries, capturing mostly data on library collections and book circulation.

From the beginning of the grant, the Bibliomist program launched its own electronic reporting system to monitor the performance of the modernized libraries. The system included a basic set of performance indicators covering public access, computer use, community events, training and workshops, and projects implemented in partnership. Participating libraries reported to Bibliomist on a monthly basis, on time, and with a 100 percent response rate. Bibliomist received data instantly; all the numbers were calculated automatically and verified through the logical relationships between the indicators incorporated in the system. After two years of testing its e-reporting, Bibliomist proposed to the Ministry a transition to online reporting, to provide recent, relevant and accurate information about libraries and to enable data-driven decisions. The pressure to have high-quality data is especially strong because of tightening budgets and the need to assess the performance of each library accurately before any decisions are made on closures, mergers or restructuring within the centralized library system.

After more than two years of negotiation, the ORS for the Ministry was introduced as a new approach to library statistics. In addition to a number of new indicators (IT training, webinars, outreach activities, computer use, high-tech equipment breakdown and training for librarians), the ORS embodies two principles of reporting that are innovative for the Ukrainian library system. The first is the principle of institutional responsibility: each library reports for itself by inputting the information about its own performance. Upper-level libraries are now free either to aggregate the data or to see the data on each branch separately; the aggregation is done automatically for any user of the system who has the appropriate rights. That relieves pressure on the central libraries at each level and saves the time taken in calculating sums and inputting data for all subordinate libraries. The data on each individual library can be accessed even from the very top level of the Ministry. The second principle is that library statistics are for use, not for collecting and storing. As information becomes available in real time, and simple technologies provide multiple opportunities for data analysis (comparing and contrasting, identifying outliers and deviations, tracing trends or producing graphs and charts), cultural organizations at each level can present data in a comprehensive and meaningful way to different audiences and use the data to shape library policies. For a library system as large as Ukraine’s, having manageable data and tools to study and track the performance of libraries is extremely important.
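As an illustration of the institutional-responsibility principle (each library reports for itself and any level can be aggregated on demand), here is a hedged sketch in Python with invented records; the real ORS indicators and structure will differ.

```python
from collections import defaultdict

# Hypothetical per-library monthly records: each library reports for itself,
# and any level of the hierarchy can be aggregated on demand.
reports = [
    {"region": "Lviv", "district": "Sambir", "library": "Branch 1",
     "visits": 410, "computer_sessions": 120, "events": 3},
    {"region": "Lviv", "district": "Sambir", "library": "Branch 2",
     "visits": 230, "computer_sessions": 60, "events": 1},
    {"region": "Odesa", "district": "Izmail", "library": "Central",
     "visits": 950, "computer_sessions": 310, "events": 7},
]

def aggregate(records, level):
    """Sum the numeric indicators for each value of the chosen level."""
    totals = defaultdict(lambda: defaultdict(int))
    for record in records:
        for key, value in record.items():
            if isinstance(value, int):
                totals[record[level]][key] += value
    return {group: dict(sums) for group, sums in totals.items()}

# Regional totals for the Ministry; district or library views work the same way.
print(aggregate(reports, "region"))
print(aggregate(reports, "district"))
```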

Bibliomist promotes the system among the library community and government officials to achieve higher impact and both constituencies participate in training on using the system. Librarians discuss with decision-makers the relevance of existing indicators, the best ways to present the work of libraries to communities, solutions to enhance performance, as well as possibilities for deeper analysis and efficient response to issues indicated by library statistics. Having all the data to hand, the Ministry is now able to react faster to local government requests to approve library closures and to justify decisions and provide solutions and recommendations based on accurate information.

Some lessons learned by the project team were: first, transition is a collaborative process that develops slowly and requires detailed negotiation and explanation of every innovation, even a minor one, to different partners, each with a different strategy. The regions and the center responded to the idea of e-reporting very differently: at the national level, library professionals were less enthusiastic, while people in the regions were eager for change to relieve the burden of collecting useless statistics. The arguments in favor of e-reporting had to be customized and adjusted to address the challenges seen by each kind of partner. In the end, Bibliomist used the most enthusiastic partners to persuade the hesitant ones. Second, because transition relies heavily on collaboration, it opens an opportunity for those involved to streamline communication and develop workable principles of cooperation. Regional libraries and departments of culture have a chance to discuss each other’s responsibilities for collecting and using library data. Third, when librarians and officials see how they can use statistics, they become interested in collecting more rather than fewer indicators. The ORS requires libraries to submit information that was not collected in earlier official reporting. However, the libraries agreed to refer to their internal tracking logs and other sources to provide accurate data for the new indicators, because these give them a chance to demonstrate their work better and present their institutions as more modern.

2.3. Sustainable assessment in Romania

We may sometimes get the impression that performance measurement and impact assessment are valuable in their own right. Perhaps because the Romania team was not embedded within the public library system of their country, they had a different perspective. Now that their country grant has concluded it will be interesting to see how effective their sustainability strategy proves to be.

When the Romanian team (known as Biblionet) planned its impact planning and assessment work, they focussed on sustainability from the outset. What type of tools and processes would be easy to use in a cost-effective way after the program closed? Would the librarians be willing to use them in the long term to prove the effectiveness of public library activities? The team always tried to collect data with a purpose – to be used in communication and policy change – so that program impact evidence was linked to the real world. They tried to avoid communicating complex statistics that were hard to understand and of little relevance to the stakeholders (see Chiranov, 2014). Two of their approaches to assembling usable impact evidence are outlined below.

2.4. Pop-up surveys in Romania

Pop-up user surveys are not new, but they are seldom used systematically in gathering impact evidence and, until recently, hardly at all in assessing the impact of public libraries. A full description of the pop-up survey approach that the team used can be found in Chiranov (2011). Basically, they asked: what would be a good tool for librarians in the long term, with little or no running costs? An online system appeared to be the natural choice and they started to use it in February 2010. Working with the tool for more than four years gave them time to refine it and to build up experience of using it in practice.

Basically, the tool consists of two pieces of software: one installed on a central server and the other on each of the donated computers (around 10,000 in Romania). In this way, the team can see where specific responses to the ever-changing questions come from. It is also possible to direct a particular question to a single commune in Romania (c. four computers) or to a group of localities, and to post each new set of questions in different libraries, allowing many questions to be asked without creating survey fatigue. The system is managed through a dashboard accessible on the internet, where it is possible to post questions, search for results and export data in Excel or SPSS formats for further processing.
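As a hypothetical sketch of the kind of tabulation the exported data supports (the dashboard and export formats are as described above; the records here are invented), responses carrying a locality field can be summarized like this:

```python
from collections import Counter

# Hypothetical export of pop-up survey responses (in practice the dashboard
# exports to Excel or SPSS; a simple in-memory list stands in for that here).
responses = [
    {"locality": "Cluj", "question": "Q12", "answer": "write a CV"},
    {"locality": "Cluj", "question": "Q12", "answer": "job search"},
    {"locality": "Iasi", "question": "Q12", "answer": "write a CV"},
    {"locality": "Iasi", "question": "Q12", "answer": "write a CV"},
]

def answer_shares(records, question):
    """Percentage of each answer given to one question."""
    counts = Counter(r["answer"] for r in records if r["question"] == question)
    total = sum(counts.values())
    if not total:
        return {}
    return {answer: 100 * n / total for answer, n in counts.items()}

print(answer_shares(responses, "Q12"))
# Restricting the records to one locality gives the per-commune picture
# described above, e.g. [r for r in responses if r["locality"] == "Cluj"].
```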

The system has definite strong and weak points. Among the positive points are cost efficiency (a traditional quantitative survey with a margin of error of ±3.5 percent would cost around US$35,000 and might take over eight weeks for survey design, terms of reference preparation, tendering, contracting, implementation and reporting), speed of operation (depending on the season, they obtained 3,000-5,000 answers in about two to three weeks) and flexibility. On the weak side, the evaluators do not know whether many people offer multiple answers to questions, or whether someone likes or dislikes the librarian and responds at that level, and it is not easy to quantify the margin of error. However, the team did some triangulation and found good comparability with other data sources. In general, even people who were critical of this method agreed that it is probably at least as useful as a questionnaire survey of library patrons.
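For readers unfamiliar with the ±3.5 percent figure quoted above, the conventional sample-size calculation for a simple random sample is sketched below; it shows what a traditional survey buys, and why the same error margin cannot be quoted for self-selected pop-up responses. The figures are standard statistical assumptions, not the team’s.

```python
import math

# Conventional sample size for estimating a proportion with 95 percent
# confidence and a +/-3.5 percent margin of error, assuming simple random
# sampling and the worst-case proportion p = 0.5. Pop-up responses are
# self-selected, so this calculation does not apply to them.
z = 1.96          # 95 percent confidence
margin = 0.035    # +/-3.5 percent
p = 0.5           # most conservative assumption

n = (z ** 2) * p * (1 - p) / margin ** 2
print(math.ceil(n))   # about 784 respondents
```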

The team used the pop-up survey data to:

  • Understand computer usage.

  • Gain an insight into patrons’ preferred information and services. In 2011, this helped the team to create a list of more than 70 topics and subtopics which were of interest (e.g. topic – labor market; subtopics – write a CV, prepare for an interview, conduct job search using the internet, find domestic jobs or find overseas jobs). When the Common Impact Measurement System was created by GL in 2013, it contained 49 mandatory metrics, which overlapped 95 percent with the Romania program topics and subtopics.

  • Prepare materials for communication and advocacy purposes. Probably the most important example was the campaign to write yearly letters to 1,400 county councilors in Romania, described in Al et al., 2015.

2.5. Using agricultural subsidies support data in Romania

The Romania team took an opportunity to help farmers fill in online applications for agricultural subsidies in the public library. This was a quadruple-win combination:

  • the farmers saved time and money by completing the applications in their home town or locally;

  • the libraries attracted new people, were better able to retain library users and offered a new service to the community;

  • the mayors helped farmers and coordinated a new community service; and

  • the agricultural agency managing the subsidies had fewer visitors to their offices, resulting in increased efficiency.

As reported earlier in Paper 3, in four years Romanian public libraries helped more than 116,000 farmers submit applications for farming subsidies amounting to $205 million. More than 230,000 working days were saved because farmers filled out the applications in their hometown or close to home (on average the saving was two days per person, but some farmers saved as many as five days). Assuming 200 working days in a year, this amounts to a staggering 1,150 years saved, as well as a saving in transportation costs of about $1.25 million. In 2012, agricultural officials trained more than 1,000 librarians and 150 city hall employees to understand and be able to help farmers to fill out subsidy application forms.

When presenting the data, the team did not emphasize how many farmers secured their subsidies following library assistance, because this would look too much like traditional library statistics showing the numbers of visits to the library, which have limited influence on public administrators. Instead, and with the help of the agricultural agency managing the subsidies, they estimated the average amount a farmer received and obtained the total number of farmers helped in a particular county. In their letters to councilors they then emphasized the role of county library managers and county coordinators in attracting “agricultural subsidies of about X million euros, for the farmers who applied through the public libraries in your county” (these figures tended to be under-reported but even so sometimes reached more than 2-5 million euros) (see also Streatfield et al., 2012).
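A hedged sketch of the estimate used in the letters follows, with invented figures standing in for the agency’s average subsidy and the per-county counts of farmers helped.

```python
# Hypothetical illustration of the estimate quoted in the letters to county
# councilors: farmers helped in a county multiplied by the average subsidy
# per farmer supplied by the agricultural agency. All numbers are invented.

farmers_helped_by_county = {"Arges": 1800, "Cluj": 950, "Iasi": 2300}
average_subsidy_eur = 1400  # average amount per farmer, per the agency

for county, farmers in farmers_helped_by_county.items():
    total_eur = farmers * average_subsidy_eur
    print(f"{county}: ~{total_eur / 1e6:.1f} million euros attracted "
          f"through public libraries")
```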

3. Meeting some evaluation challenges

3.1. Return on investment (RoI) in Ukraine

In the current difficult global economic environment it is no surprise that there is growing interest in being able to demonstrate to politicians and others the RoI that public libraries can provide. Building on the pioneering US work in applying the contingent valuation approach to public libraries, albeit at a local level (Holt et al., 1996), the Latvia GL team conducted an RoI study for the whole country (Paberza, 2010), again using contingent valuation (for reviews of the strengths and limitations of RoI studies of relevance to public libraries see Aabø, 2009; BOP Consulting, 2014; Ashley and Niblett, 2015).

Another attempt to engage with this thorny issue is described below. The rationale for undertaking the RoI study is outlined in Cottrill et al., 2015.

To calculate the RoI index the Ukraine team adapted the formula used by businesses as follows:

RoI = (B − I) / I × 100% (Equation 1)

B (benefits) is the value to patrons of using the library service; I (investments) is the cost incurred by the library to organize the service. The cost of a service is easy to calculate: sum up all library expenses related to a given service (staff salaries, premises maintenance, utilities, equipment depreciation, expendable materials, internet, etc.), being sure to assess all expenses in proportion to the time spent delivering the service. The benefit from a service is estimated by calculating the expenses patrons would have incurred if they had used the same service offered locally on a paid basis. This method of using alternative value requires the library to assess the availability of similar services and find out the fees that the providers charge. The amount equated to the benefit is the average cost of three similar services available in the area served by the library. If a similar service is not available in the area, the cost of travel to the nearest service provider is also included. For example, the patrons’ benefit from making Skype calls from the library is the money saved on phone call charges; the benefit from IT training in the library is the average price of IT courses offered by private companies in the relevant location. By “similar” we mean a comparable quality: for instance, a similar teacher-to-student ratio, agenda, material taught and curriculum covered.

The advantage of such an approach to calculating RoI is that libraries have all the data they need at hand: the library budget and the number of people who actually use the service. There are five different types of data that libraries need in order to calculate RoI: total library operating costs for the previous year; statistical data on services provided to library patrons; a fair market value of each service based on expenses patrons would have incurred if they had to pay for a comparable service; measurements for the total area of the library; and total library working hours and operating time during the year. Libraries only need to do the research on the available services; the method for calculating the index is straightforward. The RoI can be calculated for separate library services, for separate library departments or for the library as a whole.

The RoI index tells you if the service is economically efficient and what can be adjusted to raise the efficiency of service provision. If the RoI value is more than 0, it means that the library spends less on providing the service than the benefit to the users. RoI has negative value if expenses on a service exceed the benefits for patrons (i.e. the result in the formula above is less than 0) – the higher the RoI value the better. This method captures the financial benefits that patrons receive and provides a basis for comparison between libraries or library services. A city library in Ukraine calculated the RoI for its most popular services, including IT courses and photography workshops. Both of these calculations were positive: photography courses showed a high return (1,300 percent), where for every unit of currency spent by the library to provide this service the users received 13 units of benefit; IT courses also showed an acceptable return (257 percent RoI), with each unit providing 2.57 units of benefit to the community. RoI data can be a powerful tool to prove the relevance and importance of libraries, showing concrete financial benefits of modern library services to communities.
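A minimal sketch of the calculation for a single service is given below, assuming the reconstructed formula above and the alternative-value method; all figures are invented and the helper names are illustrative, not the team’s.

```python
# Sketch of the RoI index for a single library service, using the
# alternative-value method described above. All figures are invented.

def alternative_value(prices_of_similar_services, users, travel_cost=0.0):
    """Benefit: average price of up to three comparable local services
    (plus travel to the nearest provider if none is available locally),
    multiplied by the number of patrons who used the library service."""
    unit_value = sum(prices_of_similar_services) / len(prices_of_similar_services)
    return (unit_value + travel_cost) * users

def roi_percent(benefit, cost):
    """RoI = (B - I) / I x 100%; positive when benefit exceeds cost."""
    return (benefit - cost) / cost * 100

# Hypothetical IT course: share of library costs attributable to the service.
service_cost = 5200.0            # staff time, premises, utilities, internet...
benefit = alternative_value([60.0, 75.0, 90.0], users=180)
print(f"RoI: {roi_percent(benefit, service_cost):.0f}%")
```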

The RoI formula is a useful management tool because it shows what exactly goes into each service – the kinds of expenditure; the scope of the service; its share in the library operating time, staff and level of use. Calculating the RoI index for each service separately allows library management to compare the economic feasibility of various kinds of modern library services and to identify the factors that make certain services more or less profitable. With data on RoI for different services and for the library as a whole, managers and directors will better understand how to make economically grounded decisions on funding of library facilities and how to attract additional resources. By analyzing the cost efficiency of different services, one can see which are more cost efficient and probably most relevant to patrons, and show how services can be streamlined or enhanced by lowering costs. Ultimately, library management has reliable RoI data to make decisions on the number and assortment of services to offer.

If you are considering this approach, it is important to establish reliable procedures for data collection in order to determine the value of library services accurately. Data collection from libraries may require a great deal of supervision and time. The buy-in of librarians is crucial, because the method requires data that are sensitive (salaries, for example). It is important that directors of libraries work with other library staff when calculating RoI. Only library managers may have full access to a library’s financial data; to persuade them to release it, they need to understand how the data will be used and what benefits their library can receive from knowing the results. Getting out of the library and surveying the services available from competitors is an essential part of evaluating library services. A fair comparison of the quality of services offered by the library and those from other institutions, when determining the alternative value of services, makes the calculations more accurate and the RoI index more convincing.

Information on how Ukraine used the results of their RoI work is given in Cottrill et al., 2015.

3.2. The public image of Polish libraries in print media

Much of the GL impact assessment effort is focussed on gathering evidence from users of public access computers about the effects of enhanced information access on their lives. The performance of libraries in delivering services is important, as is user satisfaction with those services, but that satisfaction is shaped not only by the actual quality of the services offered but also by the public image of libraries. And what, in turn, shapes the public image of libraries? Among other influences, the media. The Polish team looked at media coverage which is likely to influence public perceptions.

The Polish team decided to analyze how the media depict libraries. As the basis of their analysis they used a sample of press articles devoted to libraries. They focussed on print media because of their greater accessibility (thanks to press monitoring services) and greater ease of analysis. The study was carried out by external researchers who specialize in qualitative analysis of media content.

The first analysis was performed in 2011: this compared the image of public libraries in 2008, before the first wave of the Polish program implementation, with the image in 2011 at the end of that phase. The researchers drew a sample of 250 articles (mostly from the local press) for each period and conducted the qualitative analysis. The study did not apply predefined categories but drew these out from the material that was analyzed. It is worth adding that the program implementation involved not only direct support to the libraries but also communication through a media and social campaign, including TV spots, so any shift in the library image between 2008 and 2011 could have been influenced by this factor.

And the shift occurred. In 2008, the media had presented libraries as a somewhat forgotten institution – underfinanced, passive and no longer relevant. Libraries were seen as a burden and as a problem to be solved, not as an opportunity. By 2011 libraries were seen as a place of modernization where technology and funds had begun to stream in – a place of change.

But one important element was still missing from this picture. The library was seen as a place of technological and infrastructural modernization, but in the media coverage there was little recognition that many people used libraries. Things had changed: the computers and internet had arrived (with much local media coverage), walls were being painted and funds assigned by local councils, but even so there was little evidence of new community life in libraries.

The researchers scrutinized another set of 250 articles for a second analysis in 2013, using the same method (Halawa and Wróbel, 2013). This confirmed the change that the library image had undergone between 2008 and 2011 (from need to modernization) rather than revealing new developments, although some new motifs did appear, the most important of which was seeing the library as “a place where kids can have a good time.”

Although the team expected that any change in image would not be as decisive as that which had happened between 2008 and 2011 (because they did not have the means to carry out another social campaign that could have affected the image of libraries), they decided to enrich the study with six case studies. Six of the most vibrant articles were selected and both the local librarians and the journalists involved were contacted by the researchers. These cases were treated as good practice examples that could give insights for other librarians on how to work with media representatives to better shape the library image at local level. On the basis of these six stories the team worked out concise guidelines for local librarians on how to work with the media, which later served as a basis for a series of webinars for librarians on this topic.

3.3. Assessment of planning efforts undertaken by Polish public library managers

Another vital area of change in the context of the GL initiative is of course change in public library managers’ actions and perceptions of their own role. The next two contributions focus on assessing librarians as planners and then on looking for changes in their perception of library management and library development.

The Polish team started their program with the premise that no activity goes without planning, and this did not just apply to their own work. During a series of workshops, librarians participating in the program were obliged to prepare three-year work plans: mini-strategies encompassing an analysis of their current situation (resources, needs, partners, etc.), the aims to be achieved and the tasks to be carried out to achieve those aims.

The team decided to study the librarians’ plans in detail to determine two things: first, how the planning process was carried out and how advanced the librarians were in their planning skills (most had not previously been asked to prepare plans); and second, what activities they planned and in what direction the spectrum of services offered by the libraries was evolving (one of the key objectives of the program was to expand library services beyond book-oriented activities).

An external researcher drew a sample of 100 plans (out of 419) for analysis (Stec, 2011) after the first round of the program was implemented. The analysis employed a mix of quantitative and qualitative methods, where two types of predefined categories were applied:

  1. Elements of plans: these covered the SWOT analysis and the typical matters mentioned in it; the local diagnosis and how it was conducted (between workshops, librarians carried out small needs assessment studies in their communities); vision, mission, aims and the typical issues identified; typical groups of customers; involvement of technology; prospective financing; and plans for monitoring and evaluation. All these topics were first described in quantitative terms and then in-depth qualitative analysis followed. This allowed the team to see how the library planning was carried out.

  2. New types of activities planned by the libraries: here categories were based on the types of specialist training offered to librarians (eight such courses were offered, from e-administration to working with disabled citizens), and again quantitative analyses were followed by qualitative in-depth study. This part of the research allowed the team to see whether the scope of services offered by the libraries had broadened in the direction of non-traditional services.

On the basis of the results of this study, the team decided to introduce changes during the second round of their program. The new batch of libraries was offered a template for preparing their plans, so the whole process became more structured, which the study had shown to be necessary.

Another survey was conducted in 2013 to see if the changes had produced results. In this second study, 50 new plans and 30 upgraded plans from the first group (these librarians were offered an opportunity to refine their plans) were analyzed. The most important finding was that, although the new template allowed weaker libraries to prepare better and more thought-out plans, it also limited the creativity of those who were more advanced by forcing them to fit into predefined rubrics. As a result, the plans became more similar overall and the differences between libraries grew smaller.

3.4. Involving non-profit organizations in library research in Poland

How can the worlds of strategic library development and professional impact assessment be connected to the local communities within which public librarians work? The Polish team chose an innovative approach to this question.

From the outset in 2008, the Polish program was directed toward small communities, especially in rural areas. The team knew that they would have to really understand the local realities that they would meet on the ground. They could have used professional researchers, but these often did not know the local context that was crucial for the team, so they would have had to devote additional time and resources to get acquainted with local conditions. They could have asked the librarians to perform this task, but the team was unsure about librarians’ engagement, their readiness to undertake such new activities or how well they knew the local context.

Instead, the team decided to employ a network of local non-profit organizations, which had both the knowledge of local context and willingness to engage in such a task. The team had access to a network of such organizations, which was set up several years earlier by the Polish-American Freedom Foundation, the Polish co-donor for the program. The fact that these organizations were connected in one network allowed for a smooth and methodologically uniform execution of the planned research.

Apart from knowledge of the local context and the organizational capacity, the team also needed a third element: sound research leadership of the planned study. Accordingly, they engaged a team of researchers from the University of Warsaw to oversee the whole process. The researchers designed the case studies (the list of activities to be carried out, the interviews to be conducted and information to be collected, a set of tools to carry out all the activities and a template for reporting), oversaw implementation by offering ongoing methodological support, and then prepared the research report (Rogaczewska, 2013).

The important findings included insights into local communities that the team could not have gained otherwise. The non-profit organizations knew the local context well enough to verify information obtained from librarians and other actors, and could assess the actual role and position of libraries in their communities, going beyond the verbal declarations of both the librarians and the other people interviewed (who often wanted to appear supportive of the library). In this way the study allowed the team to tailor support for libraries much more precisely to their actual needs.

4. Adapting evaluation tools to the public library context

4.1. Modified Delphi forecasting to assess the changing perceptions of library managers in Lithuania

The Lithuania “Libraries for Innovation 2” project is seeking to enhance the perceptions of library managers about their roles and about how they see public libraries developing in future. As part of the project assessment, which will focus on managerial behavior and attitudes, the project team decided to use an innovative (for public library assessment) survey method based on Delphi forecasting, which was originally designed to build up a consensus among experts on more or less likely future events.

The Modified Delphi approach seeks to obtain the perceptions of a target group – in this case library managers – about the likelihood and desirability of various changes happening in relation to public libraries. By re-administering the instrument near the end of the project, the project team hopes to identify changes in people’s perceptions that may have been influenced by the project. The method has already provided the team with a consensus view on some of the issues to be faced in public library development in Lithuania.

To develop and apply this survey instrument the team has so far:

  • Run two structured focus groups for library managers, to identify their hopes, expectations and concerns about public library development. They were asked about their vision for their library and its problems, where libraries are going and what the respondents wanted to achieve, what activities they planned to implement in the near future, what challenges they expected to encounter, what main changes to public libraries they expect in future and what they see as the main threats.

  • Translated the key hopes, expectations and concerns into 26 forecasting statements set in the year 2022, and provided scales against which respondents can record both their level of agreement or disagreement with each proposition and their attitude to it.

  • Invited the managers of all Lithuania’s public library services to respond to the survey instrument by checking points on Likert scales for each proposition.

The e-version of this instrument signals to respondents if their perceptions of any proposition are markedly different from those of the majority of replies, in which case they are asked to amplify their reply (giving the team a more nuanced picture of these respondents’ views). Finally, respondents were asked to identify which propositions would be the most important, assuming that they had happened. A further survey using the same propositions will be conducted near the end of the program to see whether the views of the library managers who took part in the program have changed in the light of experience (compared with those who did not participate). The team offers several observations on the method:

  • This approach combines elements of qualitative and quantitative evaluation but is essentially intended to provide an in-depth qualitative picture of change over time. This method may help evaluators looking at change in perceptions of a target group. Modified Delphi helps not just to collect data but also to engage with the target group to learn about their public library development vision and ways of thinking and to see whether these change over time.

  • It helps to reduce costs and time. For example, it would be possible to engage with library managers by designing and facilitating structured focus groups, but to do this on the scale required would take much more time and money in conducting the events in different locations, recording the results and analyzing the findings than using this method, through which much of the data collection and preliminary analysis can be done electronically.

  • If you want to capture perceptions of the whole target group in appropriate depth, and if this task is too big to be pursued through focus groups, Modified Delphi may provide a good solution.

There are several challenges with using the tool:

  • There may be significant changes affecting public libraries that emerge during the period between the first and second application of the instrument and that are not anticipated in the initial set of propositions. If so, additional propositions can be introduced in the second round, but there will be no basis for comparison with the previous round.

  • If the target group of respondents is not self-critical, there is a risk that responses may be too positive in the initial round. To help address this problem, the library managers were actively encouraged to be realistic rather than over-optimistic.

  • Respondents may not engage with the propositions or may treat the task flippantly. To overcome this, the overall approach is deliberately different to a normal survey, with propositions rather than questions (which the team hopes will be intrinsically more interesting); and the library managers were approached as experts on public libraries rather than as survey respondents. This point is reinforced by sharing the overall group response to propositions with each respondent when they have given their judgement.

  • At present this form of forecasting instrument is not supported by open source software (in particular, the key component which alerts respondents if they have offered a minority judgement and asks for amplification, is not a standard part of most survey tools), but efforts are being made to overcome this difficulty.

Although the Lithuania team used this instrument with library managers, the approach could also be applied to partner organizations, policy shapers and other decision-makers who can influence public library development. This instrument can be used to build up a consensus about future developments among experts (in this case library managers); it can also be used to study changes in perception over time – the Libraries for Innovation 2 team is doing both.

The team collected data from 56 library managers (out of 65) and 28 of these were then approached to seek amplification of “divergent” responses to various propositions; 23 people have so far done this. (The Lithuania team was not able to use the full e-version of the instrument). A report on this work is now available.
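The divergence alert in the e-version could be approximated as in the sketch below; the two-point threshold and the use of the group median are illustrative assumptions, not the team’s actual rule.

```python
from statistics import median

def is_divergent(new_rating, previous_ratings, threshold=2):
    """Flag a Likert rating that sits far from the group consensus so far.
    The threshold of two scale points is an illustrative assumption."""
    if not previous_ratings:
        return False
    return abs(new_rating - median(previous_ratings)) >= threshold

# Ratings already given for one 2022 proposition (1 = strongly disagree,
# 5 = strongly agree), followed by a new respondent's answer.
group_so_far = [4, 5, 4, 4, 3, 5]
print(is_divergent(2, group_so_far))  # True: ask this respondent to amplify
print(is_divergent(4, group_so_far))  # False
```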

4.2. Impact studies in virtual environments in Chile

The final contribution looks at the changing world within which public libraries must continue to change.

We are immersed in a world that is increasingly linked through virtual platforms: a world where young people share online videos with others around the world; where readers can download free books from various digital libraries; and where we communicate with our family and friends through Facebook, Skype or other applications. This is a world where we organize locally and call each other together using Twitter; in which we can share content and information every second; a world that is becoming a liquid world, with liquid relations, as Zygmunt Bauman (2011) said.

Libraries are part of this context – part of the way of relating with their communities through digital and virtual environments. But how do we address these contexts when making an impact assessment? What can we measure and evaluate in these contexts?

Four years ago at the National Public Library System, these questions led the team to design an evaluation that considered these environments. To do this, Chile’s BiblioRedes program used social network analysis (SNA) as part of the multi-method evaluation of their project Services and Contents for the Digital Inclusion of Local Communities in Chile. The main objective of this project was to improve their services to help reduce the digital divide for libraries and their users. As part of their work, they developed a Community of Local Contents online platform for users to share information through photographs, videos, articles and other content around a variety of topics such as architecture, archaeology, literature, music, traditional and visual arts. The evaluation used several methods including focus groups and interviews. SNA was also used to analyze the online platform’s network of users via Twitter.

SNA is a tool that can be used to analyze complex relationships between members of a social system (Marin and Wellman, 2011). The social structure is made up of entities such as individuals, groups and organizations, which are referred to as nodes. These nodes are connected to one another by links or ties. SNA can be used for a variety of purposes, such as to define network patterns (i.e. patterns of relationships) based on the type and frequency of interaction; to follow the path that information takes within a network; and to understand how to improve the effectiveness of a given network. Within library impact work, this method can be useful for determining how library patrons and other stakeholders use and share information, tools and resources, which can help libraries continue to meet the needs of people in the community.
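As a generic illustration of the technique (not the BiblioRedes analysis or its data), the following sketch uses the open source networkx library to compute a simple influence measure on an invented mention network.

```python
import networkx as nx  # widely used open source SNA library

# Toy interaction network: nodes are platform users, edges are Twitter
# mentions or shares. Names and links are invented for illustration.
G = nx.Graph()
G.add_edges_from([
    ("library_A", "museum_X"), ("library_A", "user_1"),
    ("library_A", "media_Y"), ("museum_X", "user_2"),
    ("media_Y", "user_1"), ("media_Y", "user_3"),
])

# Degree centrality: a simple proxy for how influential each node is
# within the network (more connected nodes score higher).
for node, score in sorted(nx.degree_centrality(G).items(),
                          key=lambda item: item[1], reverse=True):
    print(f"{node}: {score:.2f}")

# Connected components or community-detection functions would identify
# subnetworks like the three described in the Chilean study.
```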

Findings from the SNA allowed the team to characterize the online platform’s community of users, both in terms of who the members are and what degree of influence they have within the platform’s overall network (Figure 1) (Andrade Blanco, 2013). They found that the overall network of users consisted of three main subnetworks, whose members were associated with Chile’s public libraries (dark purple in the online version of this journal), cultural groups or institutions such as museums (medium blue/aqua) and other private and public entities in Chile (light green). The findings indicated that the most influential members within the Community of Local Contents included representatives from the media, the cultural arts ministry and Chile’s national digital library.

By using SNA, the impact team was able to identify the platform’s community of users. These findings informed the project evaluation by helping the team see how different types of users, such as institutions and media representatives, connect and share information within the online community. The Community of Local Contents provided users with a space to share information virtually. The team was also better able to understand how often registered users accessed the platform and what information they found most useful. Ultimately, this information can be used to help determine what kinds of resources and information are most useful, and how public libraries can continue to address user needs.

Today there are many tools for SNA: you can use statistical data from Facebook or Twitter, combine quantitative with qualitative methods such as virtual ethnography, and through such work understand the relationships between users, communities and institutions[1].

5. Conclusions

How well do the approaches to impact assessment described here meet the likely future challenges to the existence and development of public libraries?

Recent Pew Research Center (2015) research draws particular attention to evolving perceptions of the internet among people in emerging and developing countries, raising questions about the capacity of public libraries in these countries to redefine their role sufficiently rapidly to remain relevant. In order to be able to redefine their roles and adapt services, public libraries need manageable data showing evidence of the levels of performance and impact of their basic services as well as of new services. The Ukraine examples of enhanced data processing and reporting show a way forward here, providing that there is a strong enough professional infrastructure in place to support this work nationally. Where such support cannot be assumed, tools such as pop-up surveys can help, but a strategic approach to exploiting library development opportunities will also be needed, as exemplified by the evaluation of the new service for farmers in Romania.

GL concentrates its evaluation strongly on the impact of public access to computers and the internet on the lives of users, but this is not the only important evaluation dimension. Policy-makers are increasingly concerned about value for money in service provision: the Ukraine alternative value approach shows that libraries need not be restricted to hypothetically-based contingent valuation. The three Polish case studies highlight other evaluation concerns: about the image of public libraries; the adaptability of library managers as shown in their planning efforts (also explored from a different perspective in the Lithuania use of Modified Delphi forecasting); and formation of partnerships to secure effective library evaluation. All of these approaches should be readily adaptable for other national public library evaluation efforts focussed on RoI, media representation of public libraries, change produced by training, collaborative evaluation or similar issues.

Public library evaluation (and evaluation of other kinds of libraries) is not a distinct discipline. Most of the ideas used to evaluate libraries have been borrowed from other disciplines such as engineering (by management consultants reapplying the input, process, output and outcome model), educational evaluation or, more recently, from international development evaluation. Delphi forecasting was originated for military purposes by the Pentagon; the modified version used in Lithuania could be applied to any public library service or to national public library evaluation. Similarly, SNA as used in Chile is a widely recognized social research method, and one which should be valuable in providing guidance for redefining public library strategies and services if the requisite professional competence can be secured. The need here is for library leaders to develop dialogue with social science researchers to explore potentially useful ways of engaging in effective public library evaluation in the context of rapid and continuing change.


Figure 1. Community of Local Contents overall network

Note

Lexi Perreras and Brandi Gilbert contributed to this part of the paper.

Corresponding author

David Streatfield can be contacted at: streatfield@blueyonder.co.uk

References

Aabø, S. (2009), “Libraries and return on investment (ROI): a meta-analysis”, New Library World , Vol. 110 Nos 7/8, pp. 311-324, available at: http://dx.doi.org/10.1108/03074800910975142 (accessed September 22, 2015).

Al, U. , Andrade Blanco, P. , Chiranov, M. , Cruz Silva, L.M. , Devetakova, L.N. , Dewata, Y. , Dryžaite, I. , Farquharson, F. , Kochanowicz, M. , Liubyva, T. , López Naranjo, A. , Quynh Truc Phan , Ralebipi-Simela, R. , Soydal, I. , Streatfield, D.R. , Taolo, R. , Tâm Thi Thanh Trân , Tkachuk, Y. (2015), “Global libraries impact planning and assessment progress”, Performance Measurement and Metrics , Vol. 16 No. 2.

Andrade Blanco, P. (2013), “Evaluación de impacto contenidos para la inclusión digital en comunidades locales – Programa BiblioRedes”, paper presented at IFLA World Library and Information Congress, August, 17-23.

Ashley, B. and Niblett, V. (2015), “Researching the economic contribution of public libraries”, Evidence Based Library and Information Practice , Vol. 9 No. 4, pp. 86-91, available at: https://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/23379 (accessed September 22, 2015).

Bauman, Z. (2011), Culture in a Liquid Modern World , Polity Press, Cambridge.

BOP Consulting (2014), Evidence Review of the Economic Contribution of Libraries , Arts Council England, London.

Chiranov, M. (2011), “Applying pop-up survey software to incorporate users’ feedback into public library computing service management”, Performance Measurement and Metrics , Vol. 12 No. 1, pp. 50-65.

Chiranov, M. (2014), “Creating measurement addiction: a tool for better advocacy and improved management”, Performance Measurement and Metrics , Vol. 15 No. 3, pp. 99-111.

Cottrill, J. , Fernando Letelier, F. , Andrade Blanco, P. , García, H. , Chiranov, M. , Tkachuk, Y. , Liubyva, T. , Crocker, R. , Vanderwerff, M. , Čistovienė, G. , Krauls-Ward, I. , Stratilatovas, I. , Mount, D. , Kurutyte, A. and Triyono (2015), “From impact to advocacy: working together toward public library sustainability”, Performance Measurement and Metrics , Vol. 16 No. 2.

Halawa, M. and Wróbel, P. (2013), “Analiza wizerunku bibliotek w 2013, przemiany wizerunku bibliotek 2008-2013 oraz uwagi o komunikacji bibliotek z mediami”, available at: www.biblioteki.org/repository/PLIKI/DOKUMENTY/RAPORTY/Raport_z_badan_terenowych_przeprowadzonych_przez_LOG_w_58_gminach.pdf (accessed September 22, 2015).

Holt, G.E. , Elliott, D. and Dussold, C. (1996), “A framework for evaluating public investment in urban libraries”, Bottom Line , Vol. 9 No. 4, pp. 4-13.

Marin, A. and Wellman, B. (2011), “Social network analysis: an introduction”, in Scott, J. and Carrington, P.J. (Eds), The SAGE Handbook of Social Network Analysis , Sage, London, pp. 11-25.

Paberza, K. (2010), “Towards an assessment of public library value: statistics on the policymakers’ agenda”, Performance Measurement and Metrics , Vol. 11 No. 1, pp. 83-92.

Pew Research Center (2015), Internet Seen As Positive Influence on Education but Negative Influence on Morality in Emerging and Developing Nations , Pew Research Center, Washington, DC, available at: www.pewglobal.org/2015/03/19/ (accessed September 22, 2015).

Paley, J. , Cottrill, J. , Errecart, K. , White, A. , Schaden, C. , Schrag, T. , Douglas, R. , Tahmassebi, B. , Crocker, R. , Streatfield, D.R. (2015), “The evolution of Global Libraries’ performance measurement and impact assessment systems”, Performance Measurement and Metrics , Vol. 16 No. 2.

Rogaczewska, M. (2013), “Raport z badań terenowych przeprowadzonych przez Lokalne Organizacje Grantowe w 58 gminach”, available at: www.biblioteki.org/repository/PLIKI/DOKUMENTY/RAPORTY/Raport_z_badan_terenowych_przeprowadzonych_przez_LOG_w_58_gminach.pdf (accessed September 22, 2015).

Stec, M. (2011), “Analiza planów rozwoju bibliotek przygotowanych przez bibliotekarki po szkoleniach”, available at: www.biblioteki.org/repository/PLIKI/DOKUMENTY/RAPORTY/02_analiza_planow-rozwoju_bibliotek_raport_PRB.pdf (accessed September 22, 2015).

Streatfield, D.R. , Paberza, K. , Lipeikaite, U. , Chiranov, M. , Devetakova, L. and Sadunisvili, R. (2012), “Developing impact planning and assessment at national level: addressing some issues”, Performance Measurement and Metrics , Vol. 13 No. 1, pp. 58-65.

Further reading

Fredericks, K. and Carman, J. (2013), “Using social network analysis in evaluation”, available at: www.rwjf.org/content/dam/farm/reports/reports/2013/rwjf409808 (accessed September 22, 2015).

IREX (2014), “Data Giraffe software”, available at: www.irex.org/news/innovative-software-reveals-new-insight-program-impact (accessed September 22, 2015).

Acknowledgements


Funded by the Bill and Melinda Gates Foundation.
