Emerald Group Publishing Limited
New & Noteworthy
Article Type: New & Noteworthy. From: Library Hi Tech News, Volume 31, Issue 6
Urban Libraries Council launches Edge Initiative for public libraries
The Urban Libraries Council (ULC), founded in 1971, is a membership association of leading public library systems in the USA and Canada. While ULC libraries primarily represent urban and suburban settings, lessons from their work are widely adapted by libraries of all sizes, including those in rural settings.
The Edge Initiative was developed by a national coalition of leading library and local government organizations, funded by the Bill & Melinda Gates Foundation and led by the Urban Libraries Council. In January 2014, ULC launched Edge, a leadership and management tool that helps library leaders align the library's technology services with the needs of the community and communicate the library's value to community leaders. As of April 2014, more than 1,700 public libraries were using Edge to explore the strategic needs of their communities and align their public technology services with critical community priorities. Recognizing that communities thrive when people have opportunities to enrich and improve their lives through open access to information, communication and technology services, Edge enables public libraries to assess their current technology, identify areas of excellence and strengthen policies, practices and resources.
Edge provides a tool set that helps library staff with strategic planning and resource allocation while strengthening communications with local leaders. By using Edge, libraries are able to demonstrate how their work adds value and supports strong outcomes for the community. Rashad Young, City Manager of Alexandria, Virginia, said:
Our police and fire departments, our emergency communications department and our recreation department all participate in accreditation programs. Like these departments, libraries need to go through an evaluation process that takes national standards, benchmarks, outcomes and measures to define the service that they provide.
Being City Manager, it really does make a difference when our departments have used professional tools like Edge to say here is what the best practices say about where we should be and how we should be delivering services, here is how we compare against these benchmarks and here is what we need to do to meet or exceed these best practices.
In today's public libraries, patrons attend technology training and use public computers and Internet access to apply for jobs, complete college applications, advance skills through professional certification programs, get homework help, receive literacy training, access government, financial and health information and much more. "Edge is not just about providing high quality technology, it is also about knowing your community and the types of technology programs that your community needs", said Anne Masters, Library Director of Pioneer Library System in Oklahoma:
The Pioneer Public Library is conducting a system-wide community assessment survey in preparation for a strategic planning process. We will integrate our Edge results into our strategic plan and discuss it with our city government and community planning group.
"We're pleased to offer Edge to libraries across the USA – large and small, urban, suburban and rural – to help them continuously plan and work with key stakeholders", stated Susan Benton, President and CEO of the Urban Libraries Council. "By connecting public libraries with the strategic goals and outcomes of local leaders, Edge is making a difference in communities across the country".
To learn more about the Edge Initiative or to sign up to become an Edge library, visit: http://www.libraryedge.org
Unizin consortium to provide common suite of digital education tools, services
Four leading US research universities have partnered to form Unizin, a university-owned and -directed consortium that will provide member institutions with a suite of services for many forms of digital education, including residential and online instruction, course-content sharing and comprehensive learning analytics. Unizin aims to significantly improve the tools available to faculty and universities for delivering effective education.
The Unizin consortium, launched by Colorado State University, the University of Florida, Indiana University and the University of Michigan, will provide a common technological platform delivered over higher education's community-owned national research and education network operated by Internet2. This advanced environment will allow member universities to work locally and together to strengthen their traditional mission of education and research while using the most innovative digital technology available. The consortium has been launched to enable successful individual campus learning strategies to be easily expanded at scale and shared across all participating institutions. Brad Wheeler, Vice President for Information Technology and Chief Information Officer at Indiana University and co-chair of the Unizin Consortium, said:
Leading universities are investing in emerging technologies that can dramatically enhance the great value of both a residential, and a digital education. By coming together to create Unizin, IU and our partners are ensuring a cost efficient path for the best tools to serve students whether resident, online, or through education to our many alumni.
And just as a small group of universities came together nearly two decades ago to create Internet2 to serve our research mission, the founding consortium universities – with others to join soon – are creating Unizin to serve our educational mission by empowering our faculty with the best tools. Unizin is an extensible and scalable university-owned collaboration that is anchored in the deepest and best values of the academy to advance highly effective education.
For instructors, Unizin will provide powerful content storage and sharing services that give faculty greater control and more options over the use of their intellectual property and digital materials. Their courses can span residential, online, badge-based or massive open online course (MOOC) delivery models from a single software service. For example, a faculty member building a syllabus can search member institutions for content that may make the course more effective, or can publish a lesson plan for use by other instructors, with the creator credited as the originator.
Students will benefit by gaining access to course materials from some of the best minds in higher education in formats that best serve their individual needs – from MOOCs and flipped classrooms, where lectures are given online and class time is reserved for discussion and group work, to traditional in-person courses.
The tools and services eventually provided through Unizin also will allow partner institutions to collect and analyze large amounts of data on student performance within the policies of the member universities. These analytics will enable faculty researchers to gain valuable insight into the ways students best learn, thus shaping future approaches to teaching.
Unizin also will leverage Internet2's Trust & Identity Services and its widely subscribed national research and education InCommon Federation to provide each campus trusted access, authentication, identity management and security. Internet2's services already protect and federate online identities of over seven million people. In addition, each individual campus will address data access and privacy, per their specific campus' policies.
Discussions around the concept of Unizin began more than a year ago and resulted in a charter to enable content sharing, common software systems and greater scale in analytics in a member-directed consortium. The charter was recently signed by each of the founding partners, and each investing institution has committed $1 million for the Unizin effort over the next three years to develop and shape the shared, cloud-scale services. These combined investments will provide a more efficient path for scaled services in digital education than one-off investments by each institution. In addition to the four founding partner institutions, Unizin currently is in discussion with other prominent universities and welcomes others to join the consortium.
Unizin will operate as an unincorporated association at Internet2, a leading not-for-profit global technology organization with more than 450 member institutions across the higher education, government and business communities. The Unizin platform will be delivered over the Internet2 Network, the nation's fastest research and education network. A senior executive, to be named soon, will lead a small professional staff located in a city to be named later; this executive will be employed by Internet2 and report to a consortium board comprising representatives from the founding and select participating member universities and Internet2. James Hilton, Dean of Libraries and Vice Provost for Digital Education at the University of Michigan and Co-Chair of the Unizin board, said:
The intent of Unizin is to create a national community, akin to the original founding of Internet2 that grew from a few founders to over 250 institutional members today. It is of like-minded institutions who are willing to invest time and resources into creating a service grounded in openness and collaboration that will allow all members to leverage the tremendous power of today's digital technologies.
Unizin is a university-owned service organization in support of its members, and in that spirit, we look forward to welcoming additional members to the Unizin consortium.
As part of its launch, Unizin has selected Canvas by Instructure to provide a common learning management system for use by member institutions. Canvas, a service that is also available via the Internet2 NET+ initiative, is a cloud-based technology platform that provides a wide range of functions associated with university classroom administration, including assignments, grading, student–teacher communication, collaborative learning tools and more.
Unizin members will receive access to Canvas as part of the extended Unizin platform that will span from content to analytics services. The Unizin partners selected Canvas based on their collective experience using various learning management systems and due to the Canvas team's commitment to implementing IMS global open standards and to providing most of the system as open source software. These values and partnership align well with Unizin's commitment to both speed in execution and open standards that can help further universities' missions over time.
Canvas by Instructure: http://www.instructure.com/
ETD Lifecycle Management Tools available for public review
The Electronic Theses and Dissertations (ETD) Lifecycle Management project, funded by the Institute of Museum and Library Services, has launched a public review of the project's Lifecycle Management Tools. The Lifecycle Management Tools are a documented suite of both new and existing open-source tools for better managing ETDs. During the public review, Educopia is offering free one-on-one technical support and implementation consultations as needed through September 2014.
The Lifecycle Management Tools are a documented suite of tools to address four areas of ETD management:
1. Virus Checking: Submitted ETDs may contain viruses that could damage the entire collection if not screened in advance. Instructions are provided for using ClamAV.
2. File Format Identification: Knowledge of the formats used in an ETD can help a program determine whether an ETD submission (particularly supplemental files) adheres to program requirements and what software will be necessary to access the data in the future. Instructions are provided for using DROID, FITS, JHOVE2 and the Unix file command.
3. Preservation Metadata: Actions taken during curation should be recorded to track their success and failure. Instructions are provided for using the PREMIS Event Service.
4. ETD Submission: Submission systems can range from simple to complex but, at their most basic, should allow simple uploads and standardized metadata collection and should facilitate ongoing ETD workflows into an institutional repository. Instructions are provided for using the ETD Drop application.
The provided manual documents the purpose of each tool, recommends where to deploy them in a workflow and explains how to use the tools.
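To illustrate the file-format identification step, the core technique behind tools like DROID and the Unix file command is matching a file's leading "magic" bytes against a registry of known signatures. The following Python sketch shows the principle only; the signature table is a tiny illustrative subset, not the registries (such as PRONOM) that real tools consult:

```python
# Minimal magic-byte format identification, in the spirit of the Unix
# `file` command and DROID. The signature table here is a small,
# illustrative subset; production tools consult far larger registries.
MAGIC_SIGNATURES = [
    (b"%PDF-", "application/pdf"),
    (b"PK\x03\x04", "application/zip"),  # also the container for DOCX/EPUB
    (b"\x89PNG\r\n\x1a\n", "image/png"),
    (b"\xff\xd8\xff", "image/jpeg"),
]

def identify_format(data: bytes) -> str:
    """Return a MIME-type guess based on a file's leading bytes."""
    for signature, mime_type in MAGIC_SIGNATURES:
        if data.startswith(signature):
            return mime_type
    return "application/octet-stream"  # unknown; flag for manual review
```

An ETD ingest workflow might run a check like this over each supplemental file and flag anything outside the program's accepted formats (or anything unidentifiable) for manual review.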
Funded by the Institute of Museum and Library Services and led by the University of North Texas, in partnership with the Networked Digital Library of Theses & Dissertations and Educopia Institute, the ETD Lifecycle Management project is promoting best practices and improving the capacity of academic libraries to preserve ETDs for future researchers.
To receive the ETD Lifecycle Management Tools manual, sign up at: http://metaarchive.org/imls/index.php/Request_Tools
DiRT receives Mellon grant for partnerships with Commons In A Box, DHCommons
The University of California, Berkeley, and the Graduate Center of the City University of New York have announced that the Mellon Foundation has generously provided a $150,000 grant to fund a partnership among the DiRT (Digital Research Tools) Directory, Commons In A Box and the DHCommons project directory. The partnership will develop APIs to link these initiatives, providing new ways for scholars and students to connect with digital research tools.
For scholars who work with digital tools and methodologies, directories like DiRT are essential guides to the broad range of digital humanities tools available to meet various research and pedagogical needs. DiRT provides users with the prompt "I need a digital research tool to..." and offers a variety of options, ranging from "visualize data" and "make a dynamic map" to "manage bibliographic information" and "publish and share information". After choosing an activity, the scholar is presented with a list of tools that they can narrow down based on parameters like platform and cost. When selecting a tool, scholars often weigh factors including what other projects are using the tool and which communities of expertise can provide support. Connecting the DiRT directory to DHCommons and to the Commons In A Box platform – which powers an increasing number of scholarly community hubs, such as MLA Commons and NYC Digital Humanities – will make it easier for scholars to connect with others who are using DH tools. Project Director Quinn Dombrowski (UC Berkeley) said:
Developing publicly available APIs for DiRT and DHCommons will allow us not only to contextualize tools by showing the projects that use them, but also to open up the rich data sets stored in these directories for scholarly inquiry.
Enabling people to explore digital research tools within the Commons In A Box platform and to connect with communities of practice around those tools will lower the barrier to entry for scholars who are new to digital humanities.
"The DiRT Directory is a robust resource that provides important information about DH tools", Commons In A Box Director, Matthew K. Gold added. "We think that integrating information about DiRT tools into CBOX profile pages will make it easier for users of those tools to connect with and learn from one another".
As part of this initiative, the DiRT directory (formerly Bamboo DiRT) will also be redesigned and relaunched at dirtdirectory.org in July 2014. All tool entries will be updated to use TaDiRAH taxonomy terms.
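The browse-and-narrow interaction described above (pick an activity, then filter by parameters like platform and cost) can be sketched as a simple filter over tool records. The records and field names below are hypothetical examples for illustration, not DiRT's actual data model or API:

```python
# Illustrative sketch of directory filtering in the style of DiRT's
# "I need a digital research tool to..." prompt. The tool records and
# field names are hypothetical, not DiRT's real schema.
TOOLS = [
    {"name": "MapTool", "activity": "make a dynamic map",
     "platform": "web", "cost": "free"},
    {"name": "BibManager", "activity": "manage bibliographic information",
     "platform": "desktop", "cost": "free"},
    {"name": "VizSuite", "activity": "visualize data",
     "platform": "web", "cost": "paid"},
]

def find_tools(activity, **filters):
    """Return names of tools matching an activity, narrowed by optional
    filters such as platform or cost."""
    matches = [t for t in TOOLS if t["activity"] == activity]
    for field, value in filters.items():
        matches = [t for t in matches if t.get(field) == value]
    return [t["name"] for t in matches]
```

A public API over such records is essentially this filter exposed over HTTP, which is what makes it possible for platforms like Commons In A Box to embed tool listings in profile pages.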
For more information and updates on this initiative: http://dirt.projectbamboo.org/development
OCLC Research publishes framework for discussions about evolving scholarly record
The ways and means of scholarly inquiry are experiencing fundamental change, with consequences for scholarly communication and ultimately, the scholarly record. The boundaries of the scholarly record are both expanding and blurring, driven by changes in research practices as well as changing perceptions of the long-term value of certain forms of scholarly materials. Understanding the nature, scope and evolutionary trends of the scholarly record is an important concern in many quarters – for libraries, for publishers, for funders and, of course, for scholars themselves. Many issues are intrinsic to the scholarly record, such as preservation, citation, replicability, provenance and data curation.
To help organize and drive discussions about the evolving scholarly record, OCLC Research developed a framework that has been published in the report, The Evolving Scholarly Record. Written by Brian Lavoie, Eric Childress, Ricky Erway, Ixchel Faniel, Constance Malpas, Jennifer Schaffner and Titia van der Werf, the report provides a high-level view of the categories of material the scholarly record potentially encompasses as well as the key stakeholder roles associated with the creation, management and use of the scholarly record.
* The research process generates materials covering methods employed, evidence used and formative discussion.
* The research aftermath generates materials covering discussion, revision and reuse of scholarly outcomes.
* The scholarly record is evolving to have greater emphasis on collecting and curating context of scholarly inquiry.
* The scholarly record's stakeholder ecosystem encompasses four key roles: create, fix, collect and use.
* The stakeholder ecosystem supports thinking about how roles are reconfigured as the scholarly record evolves.
This conceptualization of the scholarly record and its stakeholder ecosystem can serve as a common point of reference in discussions within and across domains and help cultivate the shared understanding and collaborative relationships needed to identify, collect and make accessible the wide range of materials the scholarly record is evolving to include.
This work is an output of OCLC Research's Changes in Scholarly Communication activity, the goal of which is to help libraries find new ways to support their institutions' research mission, contribute to scholarly communications and align institutional collecting strategies with changes in the broader scholarly information landscape.
Read/download The Evolving Scholarly Record: http://oclc.org/research/publications/library/2014/oclcresearch-evolving-scholarly-record-2014-overview.html
HathiTrust Research Center announces alpha release of extracted features dataset
The HathiTrust Research Center (HTRC) has announced the alpha release of a new dataset, consisting of page-level features extracted from a quarter-million text volumes.
Features are data attributes defined in such a way that they can be identified by a computer and analyzed at scale. For the HTRC Feature Extraction alpha dataset, HTRC has already processed the underlying text, identifying headers and footers, rejoining hyphenated words and extracting page-level details such as:
* term-frequency counts, per section (head/body/footer), per page;
* occurrences of terms as different parts of speech;
* line counts and sentence counts; and
* character counts at the start or end of lines.
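The first of the features listed above, per-section term-frequency counts for a page, can be sketched in a few lines. The input structure (a mapping of section name to page text) is a simplifying assumption for illustration, not the dataset's actual file format:

```python
from collections import Counter

# Illustrative computation of per-section term-frequency counts for one
# page, in the spirit of the HTRC extracted-features data. The input
# structure is an assumption for illustration, not the dataset's schema.
def page_term_frequencies(page_sections):
    """Map each page section (header/body/footer) to a Counter of its terms."""
    return {
        section: Counter(text.lower().split())
        for section, text in page_sections.items()
    }

page = {
    "header": "CHAPTER ONE",
    "body": "the whale the sea the ship",
    "footer": "12",
}
counts = page_term_frequencies(page)
```

Counts like these support many bag-of-words analyses (topic modeling, stylometry, term trends over time) without exposing the running text, which is why such features also point toward non-consumptive research on in-copyright works.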
Because the dataset is currently an alpha release, HTRC is seeking feedback on how data like these can help you in your research and how HTRC can better serve the scholarly community.
Today's dataset is built on the HathiTrust's non-Google-digitized public domain volumes – that is, the original scanned representations of all the texts can be accessed through the HathiTrust. The dataset has features for 67,932,813 pages from 250,178 volumes, spanning nearly 600 years. The median date of the material is 1899, and the text is primarily English. While this alpha release originates from public domain data, this type of extracted feature dataset also provides a road map toward non-consumptive research on works not in the public domain because the features, though useful for scholarly research purposes, are not sufficient to reconstruct the text itself.
The HTRC is a collaborative research center launched jointly by Indiana University and the University of Illinois. In conjunction with the HathiTrust Digital Library, the HTRC team strives to meet the technical challenges that researchers face when dealing with massive amounts of digital text, by developing cutting-edge software tools and cyber infrastructure to enable advanced computational access to the growing digital record of human knowledge.
HTRC extracted features dataset: https://sandbox.htrc.illinois.edu/HTRC-UI-Portal2/Features
White House big data and privacy working group review and reports
Driven by the declining cost of data collection, storage and processing, fueled by new online and real-world sources of data, including sensors, cameras and geospatial technologies and analyzed using a suite of creative and powerful new methods, big data is fundamentally reshaping how Americans and people around the world live, work and communicate. It is enabling important discoveries and innovations in public safety, health care, medicine, education, energy use, agriculture and a host of other areas. But big data technologies also raise challenging questions about how best to protect privacy and other values in a world where data collection will be increasingly ubiquitous, multidimensional and permanent.
In January 2014, President Obama asked his Counselor, John Podesta, to lead a 90-day review of big data and privacy. The review was conceived as fundamentally a scoping exercise, designed to define for the President what is new about the technologies that define the big data landscape, to uncover where and how big data affects public policy and the laws and norms governing privacy, to ask how and whether big data creates new challenges for the principles animating the Consumer Privacy Bill of Rights embraced by the Administration in 2012 and to lay out an agenda for how government can maximize the benefits and minimize the risks of big data.
The working group – which included Commerce Secretary Pritzker, Energy Secretary Moniz, the President's Science Advisor John Holdren, the President's Economic Advisor Jeff Zients, and other Senior Administration Officials – sought public input and worked over 90 days with academic researchers and privacy advocates, regulators and the technology industry, advertisers and civil rights groups, the international community and the American public. This review was supported by a parallel effort by the President's Council of Advisors on Science and Technology to research the technological trends underpinning big data.
On May 1, Podesta and the big data working group presented their findings and recommendations to the President. The review did not set out to answer every question about big data nor was it intended to develop a comprehensive policy approach to big data. However, by evaluating the opportunities and challenges presented by big data, the working group was able to draw important conclusions and make concrete recommendations to the President for Administration attention and policy development.
The working group made six actionable policy recommendations in their report to the President:
1. Advance the Consumer Privacy Bill of Rights because consumers deserve clear, understandable, reasonable standards for how their personal information is used in the big data era.
2. Pass national data breach legislation that provides for a single national data breach standard, along the lines of the administration's 2011 Cybersecurity legislative proposal.
3. Extend privacy protections to non-US persons because privacy is a worldwide value that should be reflected in how the federal government handles personally identifiable information from non-US citizens.
4. Ensure data collected on students in school is used for educational purposes to drive better learning outcomes while protecting students against their data being shared or used inappropriately.
5. Expand technical expertise to stop discrimination because the federal government should build the technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes.
6. Amend the Electronic Communications Privacy Act to ensure the standard of protection for online, digital content is consistent with that afforded in the physical world – including by removing archaic distinctions between email left unread or over a certain age.
More about the big data review at the White House Web site: http://www.whitehouse.gov/issues/technology/big-data-review
Read/download the full report: http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf
HathiTrust Digital Library's fair-use win survives appeal
The "fair use" doctrine allows a digital library to create a full-text searchable database of copyrighted works and to make those works accessible, the second Circuit ruled today.
Beginning in 2008, major universities and libraries partnered with Google to digitize their volumes. The HathiTrust Digital Library enables more than 80 university and research libraries to store, secure and search their digital collections.
By preserving copies of books to ensure that copies exist after their copyrights expire, HathiTrust allows a member library to create a replacement copy of a work when the library owned an original copy that was lost, destroyed or stolen and a replacement cannot be obtained at a fair price. The library normally does not let users access digitized books in their entirety. Rather, it helps scholars and others find copyrighted books at libraries through a database that delivers titles and page numbers via keyword search. Blind or print-disabled users can get special access to the original works.
The Authors' Guild led a group of writers and advocacy groups in a lawsuit against HathiTrust and the partnering universities in 2011, alleging that their distribution of digital copies of millions of copyrighted works infringed its members' copyrights. The Electronic Frontier Foundation supported HathiTrust in an amicus brief.
Read the full report at Courthouse News Service: http://www.courthousenews.com/2014/06/10/68607.htm
Reaction from the Authors' Guild to the decision appeared on its blog:
We received yesterday's Second Circuit decision in Authors Guild v. HathiTrust with mixed feelings. The decision was not a total victory for either side. While the Court, over our objections, allowed HathiTrust to maintain its database of digitized books in light of the present security protections, the Court was clear that any breach of that security leaves HathiTrust at risk of future litigation. … The Authors Guild remains committed to the notion that the digital revolution cannot come at the cost of authors' rights to preserve writing as a livelihood.
NISO issues Altmetrics white paper draft for comment
The National Information Standards Organization (NISO) has released a draft white paper summarizing Phase I of its Alternative Assessment Metrics (Altmetrics) Project for public comment. The initiative was launched in July 2013, with a grant from the Alfred P. Sloan Foundation, to study, propose and develop community-based standards or recommended practices for alternative metrics. In Phase I of the project, three in-person meetings were held and 30 in-person interviews were conducted to collect input from all relevant stakeholders, including researchers, librarians, university administrators, scientific research funders and publishers. The draft white paper summarizes the findings from those meetings and interviews and identifies potential action items for further work in Phase II of the project.
"Citation reference counts and the Journal Impact Factor have historically been the main metric used to assess the quality and usefulness of scholarship", explains Martin Fenner, Technical Lead Article-Level Metrics for the Public Library of Science and consultant to NISO for the project:
While citations will remain an important component of research assessment, this metric alone does not effectively measure the expanded scope of forms of scholarly communication and newer methods of online reader behavior, network interactions with content, and social media. A movement around the use of alternative metrics, sometimes called "altmetrics", has grown to address the limitations of the traditional measures. With any new methodology, however, issues arise due to the lack of standards or best practices as stakeholders experiment with different approaches and use different definitions for similar concepts. NISO's Altmetrics project gathered together the variety of stakeholders in this arena to better understand the issues, obtain their input on what issues could best be addressed with standards or recommended practices, and prioritize the potential actions. This white paper organizes and summarizes the valuable feedback obtained from over 400 participants in the project and identifies a road forward for Phase II of the project.
Todd Carpenter, NISO Executive Director, states:
More than 250 ideas were generated by participants in the meetings and interviews. We were able to condense these to 25 action items in nine categories: definitions, research outputs, discovery, research evaluation, data quality and gaming, grouping and aggregation, context, stakeholders' perspectives, and adoption. The highest priority items focused on unique identifiers for scholarly works and for contributors, standards for usage statistics in the form of views and downloads, and building of infrastructure rather than detailed metrics analysis. We are now soliciting feedback on the draft white paper from the wider community prior to its completion. The white paper will then be used as the basis for Phase II: the development of one or more of the proposed standards and recommended practices.
The White Paper is open for public comment through July 18, 2014. It is available with a link to an online commenting form on the NISO Altmetrics Project Web page, along with the detailed output documents and recordings from each of the meetings and related information resources.
Read the white paper at: http://www.niso.org/topics/tl/altmetrics_initiative/
Altmetric launches new tool to track and report on broader impact of research
Alternative metrics provider Altmetric has announced the release of Altmetric for Institutions, a Web-based software application that enables higher education institutions to track and evaluate the online dissemination and impact of their authored research.
Altmetric collates mentions of scholarly articles across traditional and social media, blog posts, reference management tools and post-publication peer review sites. In addition to these established sources, the data will now also reflect if an article has been cited or mentioned in public-policy documents, offering a much needed insight into the real-life application of research.
Incorporating advanced search and filtering functionality, Altmetric for Institutions offers librarians, research administrators, communications officers and faculty themselves an easy and intuitive way of monitoring and reporting on the online attention surrounding individual articles.
Based on data from existing repository or CRIS systems, users are able to view original mentions and summary reports for published articles at the author, group, departmental and institutional level.
Created with input from leading academic establishments around the world, Altmetric for Institutions promises to deliver a much more complete and timely picture of the broader impact of published research than download or citation counts alone can offer.
Juergen Wastl, Research Strategy Officer at Cambridge University and development partner for the new platform, commented:
Altmetric for Institutions has so far proved a very useful tool for monitoring and collating information on the attention that our published research is receiving. It is invaluable in that we can analyse and report on data that would otherwise take weeks or months to collate - and in doing so can better support not only our departments and faculties but also our research networks and initiatives.
Euan Adie, Founder of Altmetric, added:
Altmetric has always placed a strong focus on the quality, breadth, and auditability of our data. This new platform offers institutions a reliable and timely insight into how their research is impacting today's society.
For trial access to Altmetric for Institutions, visit: http://www.altmetric.com/institutional-edition.php
Mosio offers new widget to embed library help in Facebook, websites, more
Librarians can now extend their reach beyond the library walls with Mosio's new OmniWidget™. A complimentary feature of Mosio's all-in-one library support tool, the new widget lets librarians embed library help in Facebook, e-resources from any provider, discovery services, learning management systems, library and departmental Web sites and other heavily trafficked areas, connecting with patrons via text, chat and email at their point of need.
"Mosio helps librarians increase their discoverability, accessibility and availability", said Noel Chandler, Mosio co-founder and CEO:
Patrons have millions of choices when it comes to finding information online, and that information may or may not be authoritative. By helping librarians embed themselves in multiple online locations, we ensure that patrons have real-time access to the support they need to make better information decisions and achieve greater outcomes.
Mosio's beta customers have already embedded the customizable OmniWidget in > 10,000 places including the ProQuest, OverDrive, EBSCO, OCLC and Ex Libris platforms. A number of libraries have indicated that more patrons reach out to them via the widget than through their current Contact Us pages. Linda Jones, Education Librarian, Dixie State University, said:
Since customizing Mosio's widget with our library branding and integrating it with our Web site and databases, we have seen more patron engagement and higher usage of our electronic resources. Working with Mosio means we can provide a higher level of service while collecting the data we need to prove ROI to key stakeholders.
Request a free trial at: http://www.mosio.com/mfltrial
Mosio for libraries home page: http://www.textalibrarian.com/
NISO and OAI publish American national standard on ResourceSync framework specification
The National Information Standards Organization (NISO) and the Open Archives Initiative (OAI) have announced the publication of the ResourceSync Framework Specification (ANSI/NISO Z39.99-2014) – a new American National Standard for the Web detailing various capabilities that a server can implement to allow third-party systems to remain synchronized with its evolving resources. The ResourceSync joint project, funded with support from the Alfred P. Sloan Foundation and Jisc, was initiated to develop a new open standard for the real-time synchronization of Web resources. Herbert Van de Sompel, Scientist, Los Alamos National Laboratory, OAI Executive, and Co-chair of the ResourceSync Working Group, explains:
Increasingly, large-scale digital collections are available from multiple hosting locations, are cached at multiple servers, and leveraged by several services. Since Web resources are continually changing, this proliferation of content yields the challenging problem of keeping services that leverage a server's evolving content synchronized in a timely and accurate manner. Our two-year collaborative effort resulted in a specification that can be used to meet this challenge for a wide variety of use cases. This was possible by devising a modular specification and by grounding it in protocols that are already widely adopted.
"The OAI Protocol for Metadata Harvesting (PMH) 2.0 specification can be used to effectively synchronize the metadata about resources", states Simeon Warner, Director, IT Application Development, Cornell University:
[…] but synchronizing the resources themselves was never specified. Although some resource synchronization methods exist, they are generally ad hoc, arranged by the individuals involved, and cannot be universally deployed. This new specification fills that void.
Michael L. Nelson, Associate Professor of Computer Science, Old Dominion University, explains:
The ResourceSync specification introduces a range of easy to implement capabilities that a server may support to enable remote systems to remain more tightly in step with its evolving resources.
It also describes how a server can advertise the capabilities it supports. Remote systems can inspect this information to determine how best to remain aligned with the evolving data. All capabilities are implemented on the basis of the document formats introduced by the Sitemap protocol. Capabilities can be combined to achieve varying levels of functionality and hence meet different local or community requirements.
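As a minimal sketch of the Sitemap-based document formats the specification builds on (the server URL, resource URLs and timestamps below are placeholders, not examples from the standard itself), a remote system can parse a ResourceSync document, read the capability it declares, and list the resources and their modification times:

```python
import xml.etree.ElementTree as ET

# Namespaces defined by the Sitemap protocol and ResourceSync.
SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
RS = "http://www.openarchives.org/rs/terms/"

# A minimal ResourceSync resource list (all URLs are placeholders).
DOC = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{sm}" xmlns:rs="{rs}">
  <rs:md capability="resourcelist" at="2014-06-01T00:00:00Z"/>
  <url>
    <loc>http://example.com/res1</loc>
    <lastmod>2014-05-30T09:00:00Z</lastmod>
  </url>
</urlset>""".format(sm=SM, rs=RS)

root = ET.fromstring(DOC)

# The rs:md element declares which capability this document implements,
# which tells the client how to interpret the rest of the document.
capability = root.find("{%s}md" % RS).get("capability")

# Each url entry names a resource and records when it last changed,
# so a client can decide what needs to be re-fetched.
resources = [
    (u.findtext("{%s}loc" % SM), u.findtext("{%s}lastmod" % SM))
    for u in root.findall("{%s}url" % SM)
]

print(capability)  # resourcelist
print(resources)
```

Because every capability document reuses the same Sitemap building blocks, the same small amount of parsing code can handle resource lists, change lists and the other document types the specification describes.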
Todd Carpenter, NISO Executive Director, states, "We expect this new standard will save a tremendous amount of time, effort, and resources by repository managers through the automation of the replication and updating process":
The end result will be to increase the general availability of content in Web repositories and alleviate the variety of problems created by outdated, inaccurate, and superseded content that exists on the Internet today.
The ResourceSync specification and video tutorials on using the standard are available on the NISO Web site at: http://www.niso.org/workrooms/resourcesync/
Koha version 3.16.0 released
The Koha community has announced the release of version 3.16.0 of the Koha library automation system.
Koha 3.16 is an enhancement and bugfix release. Changes of particular note include:
* There is now the ability to use different templates for notices based on how the notice is to be sent. For example, it is now possible to use different wording for email and SMS hold notifications.
* Users can now unselect active facets when refining search results.
* Search history can now be displayed and managed in the staff interface.
* Loan check-ins can now be backdated to an arbitrary date.
* Holds can now be individually suspended and resumed from the OPAC.
* Creating orders from staged files now offers much more flexibility.
* The public reports service can now accept report parameters.
* There is experimental support for Plack for development use.
* The index update script can now operate as a daemon, allowing the indexes to be updated within seconds of a catalog record getting updated.
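To illustrate the public reports service mentioned above, the sketch below builds a request URL for it. This is a hedged example: the server name and report ID are placeholders, and it assumes Koha's convention of passing each runtime parameter as a repeated `sql_params` query argument, in the order the report's SQL expects them.

```python
from urllib.parse import urlencode

# Placeholder Koha installation; a real site would use its own hostname.
KOHA_REPORT_SVC = "https://library.example.org/cgi-bin/koha/svc/report"

def report_url(report_id, params=()):
    """Build a URL for Koha's public report service.

    Each runtime parameter is sent as a repeated 'sql_params'
    query argument, matched positionally to the placeholders
    in the saved report's SQL.
    """
    query = [("id", report_id)] + [("sql_params", p) for p in params]
    return KOHA_REPORT_SVC + "?" + urlencode(query)

# A saved report with two runtime parameters, e.g. a branch code and a year:
url = report_url(42, ["MAIN", "2014"])
print(url)
```

Fetching the resulting URL returns the report's rows, so a library can drive dashboards or external pages from saved reports without exposing the staff interface.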
Koha can be downloaded from: http://download.koha-community.org/
More about v. 3.16.0, including release notes: http://koha-community.org/koha-3-16-1-released/
Innovative acquires VTLS
Innovative, a global leader in library technology, announced in June that it has acquired VTLS, a library automation solutions provider to customers in 44 countries. The combined companies will be led by Innovative CEO Kim Massana. VTLS's products include Virtua, VITAL, Chamo, MozGo and FasTrac.
"This is great news for libraries around the world", says Massana. "Innovative's continuing strategic investment in service to libraries worldwide will result in enhanced support for VTLS's established international customer base and diverse product suite".
Vinod Chachra, Founder and former VTLS President and CEO, will take the role of Vice President of Global Expansion, reporting to Massana and play a key role in ensuring a smooth transition of customers, technology and products globally. Other members of the VTLS development, support and sales teams will join the Innovative management group.
The company's current offices in Blacksburg (Virginia), Barcelona (Spain), and Selangor (Malaysia) will be retained and become Innovative centers of operations, along with Syracuse (New York), Dublin (Ireland) and Noida (India). Emeryville (California) will remain the company's corporate headquarters.
As part of the transition, VTLS flagship products will be rebranded, incorporating the company name into the product name including VTLS-Virtua, VTLS-VITAL and VTLS-Chamo Discovery. "We always appreciated the advanced technology and unique position that VTLS has enjoyed in the library community", states Massana. "It was important that we didn't lose the recognition and trust that VTLS has built over the years by continuing the VTLS brand within the product portfolio".
For over 30 years, VTLS has dedicated itself to delivering solutions that meet the unique needs of libraries, and I am proud of all that we have accomplished. I believe that the combined companies will bring new resources and offer greater opportunities to customers in solving the challenges of the rapidly changing library marketplace.
Innovative is committed to strengthening its support for all customers worldwide. Bringing our companies together will allow us to provide robust solutions and better support options to all our customers that will, in turn, allow them to better serve their users.
More information: http://www.iii.com/vtls
Jisc and ProQuest sign agreement for national license to essential digital content
A new agreement for a national license between Jisc and ProQuest will enable access for the UK higher education community to two major digital archives: Early European Books Collections 1-4 and The Vogue Archive.
Providing access to almost 25,000 rare and often unique books, Early European Books is a key resource for those with a strong research interest in the period from 1450 to 1700. Developed in collaboration with a range of major European libraries, it delivers a wide variety of primary sources from one of the most fascinating and influential periods in Western history. Very few libraries have access to such a large corpus of works as Early European Books offers, and researchers and students will now be able to view this material wherever and whenever they choose.
Updated monthly with the most recent edition of the magazine, The Vogue Archive gives researchers of fashion, photography, advertising and history access to the entire publication run of the US edition of Vogue magazine, back to its first issue in 1892. Fashion marketing students will be able to research the history of a brand identity by viewing every advertisement featured, while researchers in cultural and gender studies can explore themes such as body image, gender roles and social tastes from the late nineteenth century to the present day. The collection contains > 400,000 pages, reproduced as high-resolution, full-color images, along with rich indexing and metadata.
We champion the use of digital services and solutions in UK colleges and universities, and working with ProQuest to make these archives available to researchers will help us achieve this aim. Having both of these products available on the one ProQuest platform will enhance efficiency of access, which will benefit both the students and those teaching them.
Lorraine Estelle, Executive Director of Content and Discovery at Jisc and divisional CEO of Jisc Collections, said:
We are committed to making scholarship as accessible as possible across UK institutions and are thus very pleased to ensure all libraries can have access to Early European Books and the wealth of material it contains. We have long seen the potential of The Vogue Archive to support education across many disciplines – not only those with a focus on fashion – and look forward to it being a well-used resource in higher and further education.
Stephen Brooks, Senior Director for Literature and The Arts at ProQuest, said:
This agreement is great news for researchers and students across a range of disciplines, providing them with access to a vast collection of new, digitised content. We are incredibly passionate about meeting researcher needs around the world, and this partnership is a valuable and effective way for us to achieve this goal. We applaud Jisc for its recognition of the potential for these resources to impact research in the UK and will continue to work closely with Jisc and the higher education community to support its use of Early European Books and The Vogue Archive.