New & Noteworthy

Heidi Hanson (University of Maryland, College Park, Maryland, United States)
Zoe Stewart-Marshall (Kapolei, Hawaii, United States)

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 3 August 2015


Citation

Hanson, H. and Stewart-Marshall, Z. (2015), "New & Noteworthy", Library Hi Tech News, Vol. 32 No. 6. https://doi.org/10.1108/LHTN-06-2015-0044

Publisher: Emerald Group Publishing Limited



Article Type: New & Noteworthy From: Library Hi Tech News, Volume 32, Issue 6

Open Library of Humanities Launches in the UK with Jisc Collections

The Open Library of Humanities (OLH) has become the UK’s first collaborative publishing consortium to partner with Jisc Collections, which procures electronic content for all of the UK’s higher education research libraries. OLH’s transformative Library Partnership Subsidy (LPS) business model is now available to UK libraries, which can sign up to help shape non-profit open-access scholarly publishing in the humanities. This is a significant step towards a sustainable publishing model for scholarly journal articles, one in which libraries have a decisive role in shaping the future of open access in the humanities. As members of the LPS, UK libraries will take part in the governance of the OLH through membership on its Library Board.

The publishing model – the first of its type in the UK – is being offered by the OLH following negotiations with Jisc. Not-for-profit and scholar-led, the OLH is a gold open-access platform publishing journals across a wide range of humanities disciplines. The model works by sharing costs across the consortium of library members – who pay a fee to join – rather than charging authors and their institutions an upfront fee on publication (article processing charges).

Liam Earney, Director of Jisc Collections, said of the launch: “We are keen to engage with a variety of business models to help universities in making their research openly available and compliant with open access policies. The OLH has a transformative business model that is supportive and inclusive of all humanities scholars. It strengthens the relationship between publishers and libraries by actively involving the libraries in the governance of the OLH. This collectively-funded model offers a cost competitive, sustainable path to gold open access that our universities have told us they want.”

Dr Martin Paul Eve, Co-Director of the Open Library of Humanities, said: “We are thrilled and privileged to be working with Jisc in the UK to implement another route to achieve open access. The humanities disciplines still pose a substantially greater challenge than their scientific counterparts. With our unique financial model and innovative approach to transition, we hope to contribute to a solution.”

This launch follows the success of the OLH’s exclusive partnership with LYRASIS, which manages digital content for US libraries through a number of collaborative partnerships. Since that partnership launched in January 2015, more than 60 US libraries have signed up to the LPS model, including Duke University, the University of Pennsylvania and the GALILEO consortium, which includes over 2,000 institutions within the state of Georgia.

Open Library of Humanities: http://www.openlibhums.org/

More about OLH’s business and publishing model: http://www.openlibhums.org/about/the-olh-model/

UC Press and the CDL receive Mellon grant to support open-access scholarly publishing

The University of California Press (UC Press) and the California Digital Library (CDL) have received a grant of $750,000 from the Andrew W. Mellon Foundation to develop a Web-based, open-source content and workflow management system to support the publication of open-access (OA) monographs in the humanities and social sciences. When complete, this system will be made available to the community of academic publishers, especially university presses and library publishers.

UC Press is committed to developing a thriving and sustainable ecosystem for the humanities and social sciences and to preserving the monograph as a key vehicle for original scholarship. Last month, UC Press announced Luminos (www.luminosoa.org), a new OA program that brings universal access and advanced digital delivery to the monograph. Development of a new Mellon-funded content and workflow management system will support Luminos, and other OA initiatives, by stripping out complexity – and cost – and enabling sustainable models to flourish.

For CDL, development of a content management solution represents the opportunity to extend its current publishing services to better support monographic series publications in eScholarship (www.escholarship.org), UC’s institutional repository and OA publishing platform. Academic units on the UC campuses are frequently home to innovative, faculty-led book publishing programs with limited staffing. Providing a flexible and efficient workflow management system on the back end of eScholarship will enable these programs to focus resources on the important work of acquiring and raising the visibility of their publications.

“We want to drive innovation that shapes – rather than merely responds to – how scholarship can thrive in a global, deeply networked, public sphere,” says UC Press Director Alison Mudditt. “Digital infrastructure is essential for us to publish traditional and innovative forms of research cost-effectively and ensure maximum global reach. This is not a problem for UC Press alone, however, and by developing an open source solution we believe our system can benefit all of university and library publishing.”

CDL Executive Director Laine Farley comments, “I’m delighted we can work with our colleagues at UC Press to contribute to the new infrastructure needed to move both library and press publishing into a more efficient and forward looking position. This project is an ideal blending of our expertise to realize a common vision.”

The proposed system will increase efficiency and achieve cost reduction by allowing users to manage content and associated workflows from initial authoring through manuscript submission, peer review and production to final publication of files on the open web, whether via a publishing platform or an institutional repository. The system will streamline production so publishers can redirect resources back into the editorial process and disseminate important scholarship more widely.

Over the two-year grant period, the system will be designed and built to support the new open-access models being pursued by UC Press as well as CDL’s current publishing programs. Throughout the grant, UC Press and CDL will engage other university presses and library publishing units to ensure that the system meets the needs of a range of organizations. The partners have also built in a plan for long-term sustainability to ensure that the resource will continue to serve these communities and will realize its potential to re-invigorate monographic publishing within the humanities and social sciences.

Full press release at: http://www.cdlib.org/cdlinfo/2015/03/05/uc-press-and-the-california-digital-library-receive-750k-grant-from-the-andrew-w-mellon-foundation/

UKSG Transfer Code of Practice to be maintained by NISO

The National Information Standards Organization (NISO) and UKSG have announced that the Transfer Code of Practice will now be supported and maintained by NISO. The Code provides voluntary guidelines for publishers to follow when transferring journal titles between parties, to ensure that the journal content remains easily accessible to librarians and readers. NISO has republished Transfer version 3.0 as a NISO Recommended Practice (NISO RP-24-2015) and will move all supporting documentation to the NISO Web site. A NISO Standing Committee has been established to manage the ongoing support of the Transfer Code of Practice.

“The Transfer project was initiated by UKSG in 2006, and the first version of the Code was released in 2007 in response to issues identified by the scholarly communications community when journal titles change platform providers or owners,” explains Elizabeth Winter, Electronic Resources Coordinator, Georgia Institute of Technology Libraries, and Co-chair of the NISO Transfer Standing Committee. “Such transfers can negatively impact libraries, intermediaries (such as serials subscription agents, link resolver administrators, and vendors of large-scale discovery systems), and readers. Often the journal would seem to disappear and links from existing information systems to the content would break, even though the title was still being published. Publishers and platform providers have a vested interest in ensuring that their content is easily accessible. The Transfer Code provides them with the specifics of how they can make sure that all of their stakeholders can continue to make the content available with the least amount of disruption.”

“A very important achievement to date for the UKSG Transfer Working Group was the creation of the Enhanced Transfer Alerting Service (ETAS)”, states Alison Mitchell, Editorial Director, Nature Publishing Group, and Co-chair of the NISO Transfer Standing Committee.

This public, searchable database helps publishers communicate journal transfers and makes it easy for librarians and readers to be notified of journal transfers and to search previous journal transfer alerts. The ETAS is currently offered through collaboration among UKSG, JUSP (Journal Usage Statistics Portal), Jisc, and Cranfield University with JUSP and Mimas providing the hosting environment. The current hosting arrangements for the ETAS service will remain in place for the foreseeable future.

“I am delighted, both as UKSG Chair and as Jisc’s service manager for JUSP, that Transfer has been so successful,” said Ross MacIntyre, Senior Manager for Bibliographic, Research and Analytic Services, Mimas. “It has been a truly useful initiative to all parties. UKSG continues to invest in research and to incubate projects like Transfer that facilitate better connection between different parties in the knowledge community.”

“NISO is very pleased to take on responsibility for the Transfer Code of Practice,” asserts Todd Carpenter, NISO Executive Director. “We are very well placed to disseminate information about the Code, encourage publisher endorsement and implementation, and promote the best practices in the Code. Additionally, the Code fits nicely in our portfolio alongside related recommended practices such as PIE-J: The Presentation & Identification of E-Journals (NISO RP-16-2013), Online Supplemental Journal Article Materials (NISO RP-15-2013), Journal Article Versions (NISO RP-8-2008), and the forthcoming Protocol for Exchanging Serial Content (PESC).”

The Transfer Code of Practice is available on the NISO Web site from the Transfer Standing Committee’s webpage: http://www.niso.org/workrooms/transfer/

The ETAS alerting service continues to be available at: http://etas.jusp.mimas.ac.uk/

W3C working group publishes first draft of Data on the Web Best Practices

The World Wide Web Consortium (W3C) Data on the Web Best Practices Working Group has published a First Public Working Draft of Data on the Web Best Practices, intended to encourage and enable the continued expansion of the Web as a medium for the exchange of data.

Data should be discoverable and understandable by humans and machines. Where data are used in some way, whether by the originator of the data or by an external party, such usage should also be discoverable and the efforts of the data publisher recognized. In short, following these best practices will facilitate interaction between publishers and consumers.
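One concrete way a publisher can make data “discoverable and understandable by humans and machines” is to accompany it with machine-readable metadata. The sketch below is illustrative only, not taken from the W3C draft: it builds a dataset description using schema.org’s Dataset vocabulary serialized as JSON-LD, and the dataset name and URLs in it are invented.

```python
import json

# Hypothetical dataset description using schema.org's Dataset type.
# JSON-LD lets the same metadata be read by people and by crawlers/machines.
metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "City Air Quality Readings",  # invented example dataset
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://example.org/air-quality.csv",  # placeholder URL
    },
}

# Serialize for embedding in a web page or publishing alongside the data.
print(json.dumps(metadata, indent=2))
```

Embedding a blob like this in the page that hosts the data lets both a human reader and an automated harvester discover what the data is, who may use it and where to download it.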

The group has also published a Group Note, Data on the Web Best Practices Use Cases & Requirements, containing scenarios of how data are commonly published on the Web and how they are used. The document also provides a set of requirements, derived from these use cases, that will guide the development of the Best Practices document and of two new vocabularies: Quality and Granularity Description, and Data Usage Description.

Data on the Web Best Practices first draft: http://www.w3.org/TR/2015/WD-dwbp-20150224/

Use cases & requirements: http://www.w3.org/TR/2015/NOTE-dwbp-ucr-20150224/

Data on the Web Best Practices Working Group: http://www.w3.org/2013/dwbp/

COAR roadmap for repository interoperability published

The Confederation of Open Access Repositories (COAR) has announced the publication of the COAR Roadmap: Future Directions for Repository Interoperability.

Scholarly communication is undergoing fundamental changes, in particular with new requirements for open access to research outputs, new forms of peer-review and alternative methods for measuring impact. In parallel, technical developments, especially in communication and interface technologies, facilitate bi-directional data exchange across related applications and systems.

The success of repository services in the future will depend on the seamless alignment of the diverse stakeholders at the local, national and international levels. The roadmap identifies important trends and their associated action points for the repository community and will assist COAR in identifying priority areas for its interoperability efforts in the future.

This document is the culmination of over a year’s work to identify priority issues for repository interoperability. The preparation of the roadmap was spearheaded by Friedrich Summann from Bielefeld University in Germany, with support from a COAR Editorial Group and input from an international Expert Advisory Panel.

The roadmap is now available on the COAR Web site: http://www.coar-repositories.org/activities/repository-interoperability/

Jisc–ARMA institutional ORCID implementation and cost-benefit analysis report

In May 2014, Jisc and ARMA (Association of Research Managers and Administrators) commissioned eight higher education institution ORCID (Open Researcher and Contributor ID) pilot projects to support the broader use of ORCID unique researcher identifiers (ORCID iDs) in UK higher education. Information Power Ltd. and Research Consulting Ltd. were commissioned to prepare a report on the results of the eight pilot projects. The report aims to:

  • Inform how ORCID is implemented in UK higher education institutions (HEIs).

  • Enable institutional managers to build a business case for ORCID adoption in HEIs.

  • Encourage wider adoption of ORCID iDs.

Eight pilot institutions participated in the Jisc–ARMA ORCID Project:

  • Aston University;

  • Imperial College London;

  • Northumbria University;

  • University of Southampton;

  • Swansea University;

  • University of Kent;

  • University of Oxford; and

  • University of York.

The majority of institutions had project teams comprising representatives from the Library, the Research Office, IT Services and academic departments, and project management was kept “light touch”. Key to the success of the projects were early engagement with senior management, involvement of key stakeholders across the institution, and early consultation with Legal Services and Human Resources. The HEIs found it helpful to secure advice from their legal services departments at the outset of their projects, to ensure that any processing of personal data was lawful.

Perhaps surprisingly, technical problems were not the major challenge for most pilot institutions. A range of technical solutions for storing researchers’ ORCID iDs was used during the pilots. Four institutions used their current research information system (CRIS): two used Pure, one Symplectic and one Converis. Two other institutions developed in-house systems, one used Agresso Business World, and one used the student portal of SITS e:Vision. Of the eight pilot institutions, only one chose to bulk-create ORCID iDs for its researchers; the others opted for the “facilitate” approach to ORCID registration.
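Whichever storage route an institution takes, stored iDs can be sanity-checked on ingest: ORCID documents that the final character of an iD is an ISO 7064 MOD 11-2 checksum computed over the first 15 digits. A minimal validation sketch in Python follows; the function names are illustrative, not from the pilot report.

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the ISO 7064 MOD 11-2 check character for the
    first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate the format and checksum of a hyphenated ORCID iD,
    e.g. 0000-0002-1825-0097."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15].upper()

# 0000-0002-1825-0097 is ORCID's published example iD.
print(is_valid_orcid("0000-0002-1825-0097"))  # → True
```

A check like this catches transcription errors before a mistyped iD is written into a CRIS or repository record, where it would silently misattribute a researcher’s outputs.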

Most pilot institutions found it relatively easy to persuade senior management of the institutional benefits of ORCID, but many found it difficult to articulate the benefits to individual researchers. Several commented that staff saw it as “another level of bureaucracy”, and it was also noted that concurrent Open Access (OA), REF and ORCID activities can muddle the message, as they overlap. The majority felt that future developments and enhancements to their systems would enable them to articulate the benefits better and encourage much greater take-up. Effective communication was seen by every pilot institution as one of the most important elements of the projects. A wide range of advocacy and communication strategies was used, and the majority of institutions felt that their communication strategies had worked and their advocacy had been successful. Clear and effective messages (as short and precise as possible), a well-defined brand for ORCID, and the targeting of specific audiences and audience segments were identified as especially important.

Generally speaking, all the pilot projects were successful in integrating ORCID into institutional systems and processes, but the participants felt it was currently too early to see the benefits of this. Most reported increased awareness of ORCID and a better understanding of its benefits among researchers, but all felt that this would change as the ORCID system, and their own internal systems, developed and ORCID iDs became globally recognised by academic institutions, publishers and research funders. All pilot institutions stated that academic registration for ORCID iDs, along with advocacy and communication activities, will continue after the end of the project. They believe that the benefits of ORCID implementation will grow over time, saving administrative time for researchers and support staff by ensuring correct and accurate transfer of information between systems.

Specific future benefits mentioned by institutions included:

  • More automated author disambiguation.

  • Improved automated CVs for researchers.

  • Improved retrieval and transfer of author data by the authors (e.g. for seeking collaborations, grants and employment) and by organizations such as funders, institutions and publishers in transferring data between systems.

  • Real opportunities to support greater understanding through analysis and data mining techniques of inferred relationships between individuals, research communities and institutions.

Making archival and special collections more accessible: report from OCLC Research

A new report, Making Archival and Special Collections More Accessible, represents the efforts of OCLC Research over the past seven years to support change in the end-to-end process that results in archival and special collections materials being delivered to interested users.

Revealing hidden assets stewarded by research institutions, so they can be made available for research and learning locally and globally, is a prime opportunity for libraries to create and deliver new value. Making Archival and Special Collections More Accessible collects important work OCLC Research has done to help achieve the economies and efficiencies that permit these materials to be effectively described, properly disclosed, successfully discovered and appropriately delivered. Achieving control over these collections in an economical fashion will mean that current resources can have a broader impact or be invested in other activities.

Key highlights from the report:

  • Institutions should undertake an accurate census of their archival collections as a foundation for acting strategically in meeting user needs, allocating available resources and securing additional funding.

  • Accurately describing unprocessed and therefore hidden collections is a daunting but important task that is a necessary prerequisite for discovery.

  • Practitioners should go beyond traditional bounded practices to satisfy their users who have discovered special collections and archival materials.

  • Interlending of actual physical items from special collections for research purposes should be supported.

  • The unique materials stewarded by our institutions need to release their value to a global audience of researchers in ways that will enhance the reputation of the steward. This will happen only when we devote structured effort to the full range of selection, description, discovery and delivery.

Much of the work represented in this compilation was done with the specific advice, guidance or participation of the staff at OCLC Research Library Partnership institutions. OCLC is grateful for this relationship, privileged to provide this venue and committed to listening and leading in this partnership.

This publication will be beneficial to anyone interested in enhancing the library’s value proposition by mobilizing unique materials in the networked environment.

Read/download the report at: http://www.oclc.org/content/dam/research/publications/2015/oclcresearch-making-special-collections-accessible-2015.pdf

More about the OCLC research library partnership: http://www.oclc.org/research/partnership.html

Videos of keynote and plenary sessions from EMEARC annual meeting, The Art of Invention

Members representing more than 30 countries attended the sixth annual meeting of the OCLC Europe, Middle East and Africa Regional Council (EMEARC) held February 10-11 in Florence, Italy. Videos of the five plenary sessions from the conference, entitled “The Art of Invention. Culture, technology and user engagement in the digital age,” are now online at the OCLC Web site.

The keynote and plenary sessions presented were:

  • “Reinventing Invention” – David Weinberger, Senior Researcher, Berkman Center, Harvard University.

  • “Preserving the Born-Digital Cultural and Scientific Record” – James Neal, OCLC Board of Trustees and University Librarian Emeritus, Columbia University.

  • “The Digital Legacy of Sir Thomas Bodley” – Lucie Burgess, Associate Director for Digital Libraries, Bodleian Libraries.

  • “The Experience of Information” – Francesco Bonami, Artistic Director of Fondazione Sandretto Re Rebaudengo.

  • “The Bigger Picture: The Rijksmuseum Digital Revolution” – Lizzie Jongma, Data Manager, Collection Information Department, Rijksmuseum.

3D technologies in libraries and museums: two project briefings from CNI

Two project briefings presented at the Coalition for Networked Information (CNI) Fall Membership Meeting, held on December 8-9, 2014, examined innovative applications of 3D technologies in libraries and museums.

Günter Waibel and Vincent Rossi presented a project briefing entitled Smithsonian X Digitization: Rapid Capture for Vast Collections, 3D Digitization for Iconic Objects. With 138 million objects and specimens housed in 41 facilities, the scale and diversity of the Smithsonian collections present a unique digitization challenge. The Smithsonian’s Digitization Program Office currently runs extended pilot projects that aim to establish high-quality, high-throughput digitization methodologies for different collection types, as well as to apply 3D technologies to the most iconic collection objects. In October 2014, the Smithsonian launched the first conveyor-belt rapid digitization project in the USA, with throughput rates of up to 6,000 items per day. A two-year project to digitize the entire Cooper Hewitt, Smithsonian Design Museum collection of 210,000 objects began in November. Digitization workflows integrate the Smithsonian Transcription Center, where digital volunteers transcribe data from the digital images to make the digital collections searchable.

Smithsonian X 3D brings iconic Smithsonian collection objects and remote research sites to a Web browser near you by applying cutting-edge 3D technology to one-of-a-kind objects and environments. The pilot project investigates the applicability of 3D technology to a cultural heritage setting by focusing on use cases from many of the Smithsonian museums and science centers, such as the 1903 Wright Flyer, Lincoln’s life masks, a 1,500-year-old Buddha sculpture, a prehistoric fossilized whale and a supernova. As presented in the Smithsonian’s 3D explorer (3D.SI.EDU), the 3D models turn online visitors into active investigators. Full datasets for most of the models can be downloaded, which empowers anyone with a 3D printer to create replicas.

Video of the project briefing available on the CNI YouTube channel: https://youtu.be/vsMk84xtwYQ

The Smithsonian’s digitization program office Web site: http://dpo.si.edu/

Trends in 3-D Printing, a project briefing presented by James King, National Institutes of Health, and Kathlin Ray, University of Nevada, Reno, described the 3D printing programs at their institutions’ libraries and explored what is different about 3D printing as a service in a library. In 2013, the National Institutes of Health Library converted a print reference collection space into a “Technology Sandbox” to foster collaboration, partnerships and innovation across the agency. The highest-profile component of that space has been the 3D printer and modeling software. In the first six months of the open prototype phase, over one hundred unique prints were produced, including proteins, viruses and anatomical models, as well as rapid prototypes and custom laboratory equipment. The library also assisted in the creation of the 3D Print Exchange, which creates validated scientific biomedical 3D print models. King’s presentation discussed the lessons learned from a federal agency perspective, as well as future plans.

In 2012, the DeLaMare Library at the University of Nevada, Reno launched a 3D printing service as part of a larger “makerspace” initiative. Wildly popular with students, faculty and community members, 3D printing services have evolved over the past two years to include more robust printers/scanners and a whole new class of student employees called 3D Wranglers. The recent addition of a laser cutter has generated even more traffic. Ray’s presentation discussed the pleasures and pitfalls of providing 3D printing and other makerspace activities within today’s academic library.

Video of Trends in 3-D printing available on the CNI YouTube channel: https://youtu.be/XTBf7iVNmkY

Preparing the workforce for digital curation: report from the National Academies

From distant satellites to medical implants, sensors are collecting unprecedented quantities of digital data across the scientific disciplines. Other sectors – government, business and health – are collecting huge amounts of data and information as well. If accurate and accessible, such information has the potential to speed scientific discovery, spur innovation, inform policy and support transparency.

However, the policies, infrastructure and workforce needed to manage this information have not kept pace with its rapid growth, says a new report from the National Research Council. The immaturity and ad hoc nature of the field of digital curation – the active management and enhancement of digital information assets for current and future use – so far has led to vulnerabilities and missed opportunities for science, business and government.

There is an urgent need for policies, technologies and expertise in digital curation, said the committee that conducted the study and wrote the report. It recommends that the White House Office of Science and Technology Policy (OSTP) lead policy development in digital curation and prioritize strategic resource investments for the field. Research communities, government agencies, commercial firms and educational institutions should work together to speed the development and adoption of digital curation standards and good practices.

The report also offers several recommendations for strengthening the digital curation workforce. Currently, few data are available on how – and how many – digital curation professionals are being trained and on the career paths they follow. Moreover, it is difficult to estimate current and future demand because digital curation takes place in many types of jobs. The Bureau of Labor Statistics, the primary source of statistics on employment in the federal government, does not track digital curation as a separate occupation. However, the committee could estimate the current demand for digital curation professionals by examining data on job openings for related occupations – enterprise architects, data stewards, librarians and archivists, among others. Openings for almost all of these professions at least doubled between 2005 and 2012, the committee found.

Government agencies, private employers and professional associations should develop better mechanisms to track the demand for individuals in jobs where digital curation is the primary focus, the report says. The Bureau of Labor Statistics should add a digital curation occupational title to the Standard Occupational Classification (SOC) when it revises the SOC system in 2018; this recognition would also help to strengthen the attention given to digital curation in workforce preparation. Tracking employment openings for digital curation professionals, enrollments in professional education programs and the career trajectories of their graduates would help balance supply with demand on a national scale.

In addition, OSTP should convene relevant federal organizations, professional associations and private foundations to encourage the development of model curricula, training programs and instructional materials that advance digital curation as a recognized discipline. Educators in institutions offering professional education in digital curation should create partnerships with educators, scholars and practitioners in data-intensive disciplines and established data centers. These partnerships could speed the definition of best practices and guiding principles as they mature and evolve.

The study was sponsored by the Alfred P. Sloan Foundation, the Institute of Museum and Library Services and the National Science Foundation. The National Academy of Sciences, National Academy of Engineering, Institute of Medicine and National Research Council make up the National Academies. They are private, independent nonprofit institutions that provide science, technology and health policy advice under a congressional charter granted in 1863.

Read/download the report: http://www.nap.edu/openbook.php?record_id=18590

UNESCO #NetStudy connects the dots for online access, freedom of expression, privacy and ethics

What measures should UNESCO advocate to ensure that the Internet serves the interests of the largest number of users? What action needs to be taken to apply the rights proclaimed in the Universal Declaration of Human Rights – notably freedom to impart information and opinion and privacy – online, as well as offline?

These questions are the subject of a comprehensive study on Internet-related issues: Keystones to foster inclusive Knowledge Societies, a work in progress born of a year-long open consultation with the gamut of Internet stakeholders. The study examines issues of freedom of expression and privacy, access and ethics on the Internet, and makes proposals to reinforce UNESCO’s work in these areas.

UNESCO’s vision of universal Knowledge Societies builds on a free, open and trusted Internet that enables people not only to access information resources from around the world but also to contribute information and knowledge to local and global communities. What can UNESCO do to move toward the realization of this vision of Internet-enabled Knowledge Societies that can foster inclusive, sustainable human development worldwide?

To address this question, UNESCO has worked with Member States and other stakeholders to analyze four separate but interdependent fields of Internet policy and practice within the Organization’s mandate, perceived to be central to achieving this vision: access to information and knowledge, freedom of expression, privacy, and ethical norms and behavior online. The draft report assesses these four fields by viewing them as keystones for building a free and trusted global Internet that will enable inclusive Knowledge Societies.

The framework of investigating the four key fields for this report is that of Internet Universality, which identifies four normative principles agreed by UNESCO Member States. These are the principles of human rights, openness, accessibility and multistakeholder participation, summarized in the acronym R-O-A-M. The report examines each of the four keystones of the Internet and asks whether and how their development is aligned with these four R-O-A-M principles.

The research served as the basis of discussions at a UNESCO conference, Connecting the Dots (3 and 4 March), that brought together some 300 participants, representatives of governments, civil society, academia, the private sector, the technical community, inter-governmental and international organizations as well as innovators and pioneers.

Connecting the Dots was an opportunity for the options contained in the study to be fine-tuned. Once finalized, UNESCO will present them to its Member States in late 2015, during the next General Conference, which is held every second year to determine the Organization’s program and priorities.

The study also represents a significant contribution to the World Summit on the Information Society (WSIS)+10 Review process and the post-2015 international development agenda. An outcome statement about the consultation at the Connecting the Dots conference will be submitted to UNESCO’s Executive Board, the body in charge of overseeing the Organization’s work, at its next session in April.

UNESCO’s partners and sponsors in organizing the Connecting the Dots conference are: Ministry of Foreign Affairs of Finland; Sweden; The Netherlands; Google; the Walt Disney Company; EURid (European Registry of Internet Domain Names); and the Internet Corporation for Assigned Names and Numbers (ICANN).

Read/download the first published version of the Draft Internet Study: http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/CI/CI/pdf/internet_draft_study.pdf

More about the UNESCO Internet Study: http://www.unesco.org/new/en/communication-and-information/crosscutting-priorities/unesco-internet-study/

Canada federal research agencies adopt new open access policy for research

Making research results as widely available and accessible as possible is an essential part of advancing knowledge and maximizing the impact of publicly-funded research for Canadians. Increased access to the results of publicly-funded research can spur scientific discovery, enable better international collaboration and coordination of research, enhance the engagement of society and support the economy.

The Honorable Ed Holder, Minister of State (Science and Technology), unveiled the new policy as part of a wide-ranging address on the government’s updated Science, Technology and Innovation Strategy delivered to the Economic Club in Toronto. The harmonized Tri-Agency Open Access Policy on Publications requires all peer-reviewed journal publications funded by one of the three federal granting agencies to be freely available online within 12 months. Canada’s three federal granting agencies are the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Social Sciences and Humanities Research Council of Canada (SSHRC). NSERC- and SSHRC-funded researchers must comply with the policy for all grants awarded on or after May 1, 2015. The policy does not change current compliance requirements for CIHR-funded researchers, since a similar policy with the same requirements has been in effect since 2008.

In developing this policy, the three agencies held an online consultation, receiving feedback from over 200 individuals and groups from the research community, institutional libraries, scholarly associations, non-governmental organizations, publishers and journals. The granting agencies will continue to work closely with stakeholders to support and facilitate the transition toward greater open access.

In keeping with the global movement toward open access, the harmonized policy requires that researchers receiving grants from CIHR, NSERC and SSHRC make their resulting peer-reviewed journal articles freely available online within 12 months of publication. Researchers can comply with the open-access policy in one of two ways: “self-archiving”, by depositing their peer-reviewed manuscript in an online repository that will make it freely accessible within 12 months of publication; or by submitting their manuscript to a journal that offers open access within 12 months of publication. CIHR-funded researchers are also required to deposit bioinformatics, atomic and molecular coordinate data into the appropriate public database immediately upon publication of research results. They must also retain original data sets for a minimum of five years (or longer if other policies apply).

The Tri-Agency Open Access Policy on Publications aligns with the objectives of Canada’s Action Plan on Open Government and is a commitment under the updated Science, Technology, and Innovation Strategy.

Read the Tri-Agency Open Access Policy on Publications at: http://www.science.gc.ca/default.asp?lang=En&n=F6765465-1

CLOCKSS and CHORUS partner to support perpetual public access

The CLOCKSS Archive, a not-for-profit joint venture between the world’s leading academic publishers and research libraries, has entered into an agreement with CHORUS (Clearinghouse for the Open Research of the United States), a not-for-profit, cost-effective and sustainable public-access solution, to support the archiving, preservation and perpetual public accessibility of articles reporting on US federally funded research, at no additional cost to taxpayers.

As a result of the agreement, CHORUS Publisher Members will be able to take advantage of a special arrangement with CLOCKSS to establish permanent, perpetual public access to content and ensure the integrity and sustainability of the scholarly record. CHORUS works with US federal agencies, publishers, service providers and researchers to enable, monitor and preserve public access to published articles. Built on CrossRef’s open, interoperable framework, CHORUS services are initiated when researchers submit their articles to publishers.

“Safeguards are necessary to ensure that public access to articles reporting on federally funded research is never disrupted. That is where CLOCKSS comes in”, says Howard Ratner, CHORUS Executive Director. “This agreement allows CHORUS partners to rely on CLOCKSS’ rich and trusted repository of content in the event that publicly accessible research articles become unavailable”.

CLOCKSS has agreements with 200 publishers and has preserved more than 11,000 academic journals from participating scientific publishers. Under the agreement, CLOCKSS will provide public access to the archived publications should a publisher fail to provide public access for longer than 30 days and no other party is providing it.

“By partnering with CHORUS, CLOCKSS is meeting the needs of government funding agencies, publishers, and others in the scholarly community to ensure continued public access to articles reporting on funded research,” states Randy S. Kiefer, Executive Director of the CLOCKSS Archive.

More information about CLOCKSS: http://clockss.org/

More about CHORUS: http://www.chorusaccess.org/

DSpaceDirect now features DSpace 5 improvements

One of the key benefits in choosing the fully hosted DSpaceDirect repository service is that customers are able to use the most recent version of DSpace without having to tie up staff and technical resources to upgrade software. All current and prospective customers now enjoy DSpace 5 benefits: the ability to migrate more easily from older versions of DSpace, to batch import content, and more. The release of DSpace 5 offers users an even easier-to-use and more efficient institutional repository solution.

The following innovative features are now available as part of the DSpaceDirect service:

  • Batch Importing: Adding a lot of content in a “batch” to a DSpaceDirect account just became easier with the batch import feature that is now available through the administrative interface. Users create a SimpleArchiveFormat package, zip it up and upload it through the browser and DSpaceDirect will ingest the content and metadata automatically into the repository.

  • File Download Tracking: Individual file downloads in a DSpaceDirect repository are now tracked and recorded in the Google Analytics interface along with a full list of other repository statistics, allowing repository administrators to know which files are being downloaded the most, the least, from where in the world, etc.

  • Thumbnail Image Enhancements: Any time an image or (now) PDF file is added to a DSpaceDirect repository, a thumbnail for that file is created. With the latest upgrade, these thumbnails are crisper and clearer than ever, allowing users to easily see a preview of the full-size image or PDF file stored in the repository.

  • And coming soon, the new, modern and responsive user interface for DSpaceDirect that will make the repository mobile-friendly and a player in the modern Web technology world. Plans to roll out this new theme will be underway early this summer. For a preview of what is in store for the new interface, called Mirage 2, visit the demo DSpace web site (http://demo.dspace.org/xmlui/).
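As background to the batch-import feature above, a DSpace SimpleArchiveFormat package is a set of item directories, each containing a `dublin_core.xml` metadata file and a `contents` manifest listing the item’s bitstreams. The following minimal sketch builds such a package as a zip file; the file names and metadata values are illustrative examples only, and a real package would typically carry richer metadata.

```python
# Sketch: build a minimal DSpace Simple Archive Format (SAF) package
# suitable for batch import. Names and metadata here are illustrative.
import zipfile


def make_saf_package(zip_path, items):
    """items: list of (title, {filename: bytes}) tuples, one per item."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        for i, (title, files) in enumerate(items):
            item_dir = f"item_{i:03d}"
            # Each item directory needs a dublin_core.xml metadata file...
            dc = ('<?xml version="1.0" encoding="UTF-8"?>\n'
                  '<dublin_core>\n'
                  f'  <dcvalue element="title" qualifier="none">{title}</dcvalue>\n'
                  '</dublin_core>\n')
            zf.writestr(f"{item_dir}/dublin_core.xml", dc)
            # ...and a "contents" manifest listing one bitstream per line.
            zf.writestr(f"{item_dir}/contents",
                        "".join(name + "\n" for name in files))
            # Finally, the bitstream files themselves.
            for name, data in files.items():
                zf.writestr(f"{item_dir}/{name}", data)
    return zip_path


make_saf_package("batch.zip", [("Example Report", {"report.pdf": b"%PDF-1.4 ..."})])
```

The resulting zip is what an administrator would upload through the browser for DSpaceDirect to ingest.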

Two new DSpaceDirect add-on packages are also available, including:

1. PDF Cover Pages: DSpaceDirect will now automatically create a cover page for downloaded PDFs stored in the repository. This citation page can be customized to include the name and URL of the repository; the community and collection where the PDF is stored; the date, title, author and URI of the item; and several other optional fields.

2. ORCID Integration: The ORCID (Open Researcher and Contributor ID) integration add-on package for DSpaceDirect is an option that adds ORCID compatibility to the existing DSpaceDirect author capabilities. This new add-on lowers the barrier to adopting ORCID for institutions interested in exploring its capabilities.

DSpaceDirect subscription plans are available in several different sizes to meet various customer needs. Additionally, add-on packages are available to allow further customization of a DSpaceDirect account.

View a 3-minute video about DSpaceDirect improvements at: http://bit.ly/1dNh8Ez

Read more about DSpace 5 improvements: http://duraspace.org/node/2436

DSpaceDirect: http://dspacedirect.org

ArchivesDirect: affordable, standards-based digital preservation + secure storage

Artefactual Systems and the DuraSpace organization have announced the launch of ArchivesDirect, a complete, hosted, “soup-to-nuts” solution for preserving valuable institutional collections and all types of digital resources.

The powerful, combined Archivematica plus DuraCloud ArchivesDirect service meets all 21 aspects of managing and preserving digital objects identified by the Institute of Museum and Library Services-funded white paper “From Theory to Action”. The aim of the report developed by the POWRR (Preserving Digital Objects With Restricted Resources) Project is to provide an analysis of digital preservation solutions for under-resourced institutions. ArchivesDirect meets these challenges by providing tools and services for ingest, preservation, storage and maintenance over time.

ArchivesDirect is priced competitively at $11,900 for the ArchivesDirect standard plan, and the cost for additional terabytes (TBs) is one-third the price of other services. This cost advantage makes it possible for even small to mid-sized institutions to participate in active digital preservation of their collections without having to spend time and money developing tools and infrastructure in-house.

Users will have access to a robust suite of digital preservation functions via the online dashboard. Archivematica is well known for its ability to produce highly standardized and interoperable Archival Information Packages; these packages will automatically be placed into DuraCloud for long-term secure archival storage, where they will be replicated across Amazon S3 and Glacier storage locations and undergo regular integrity checking. ArchivesDirect is unique among preservation and archiving solutions because it is built with open-source software which is well-documented and freely available. All content is stored using open standards, and users can download their data at any point. This means that users of the new service do not have to worry about data lock-in, and the service can be run locally at any time.

“This offering is really about collaboration between two organizations dedicated to open technologies and standards”, said Artefactual Systems President Evelyn McLellan. “The digital preservation expertise of Artefactual Systems is a perfect match for the cloud services and storage experience of DuraSpace”.
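The “regular integrity checking” described above is, at its core, checksum (fixity) verification: stored packages are periodically re-hashed and compared against a manifest of known-good digests. The sketch below illustrates the general technique; it is not DuraCloud’s actual implementation, and the function names are hypothetical.

```python
# Illustrative sketch of checksum-based fixity checking, the general
# technique behind regular integrity checks of archived packages.
# Not DuraCloud's actual implementation.
import hashlib
from pathlib import Path


def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large archives fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_manifest(manifest):
    """manifest: {path: expected_hexdigest}. Returns paths that fail the check."""
    return [p for p, expected in manifest.items()
            if not Path(p).is_file() or sha256_of(p) != expected]
```

Any path returned by `verify_manifest` signals a missing or corrupted copy, which a replicated store can then repair from another location.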

Artefactual Systems and DuraSpace are extremely grateful to a dedicated group of pilot testers who have provided valuable feedback on functionality, usability and scalability of the service. Because of their rigorous testing efforts, ArchivesDirect has been fine-tuned to handle a wide range of formats and workflows. Thanks go out to the following:

DuraSpace is an independent 501(c)(3) not-for-profit organization providing leadership and innovation for open technologies that promote durable, persistent access to digital data. DuraSpace collaborates with academic, scientific, cultural and technology communities by supporting projects (DSpace, Fedora and VIVO) and creating services (DuraCloud, DSpaceDirect and ArchivesDirect) to help ensure that current and future generations have access to our collective digital heritage.

Artefactual’s mission is to provide the heritage community with vital expertise and technology in the domains of digital preservation and online access. Artefactual develops open-source software (Archivematica and AtoM) and promotes open standards as the best means of enabling archives, libraries and museums to preserve and provide access to society’s cultural assets.

ArchivesDirect hosted digital preservation service: http://archivesdirect.org

Watch a 3-minute Quickbyte video about ArchivesDirect: http://youtu.be/u7Ryyo2UWGA

DuraSpace: http://duraspace.org

Artefactual systems: http://artefactual.com

Perceptions 2014: Marshall Breeding’s international survey of library automation

In February, independent library automation consultant Marshall Breeding posted a report analyzing the results of his 2014 library automation perceptions survey. This eighth annual Library Automation Perceptions Report provides evaluative ratings submitted by individuals representing over 3,000 libraries in 80 countries, describing experiences with 154 different automation products, including both proprietary and open-source systems. The survey results include 994 narrative comments providing candid statements – both positive and negative – about the products and companies involved, or statements of intent regarding future automation plans. The report analyzes the results of the survey, presents a variety of statistical tables based on the data collected and offers some initial observations. It aims to provide information to libraries as they evaluate their options for strategic technology products, and constructive criticism to the organizations providing these products and services to help guide improvements.

Selected survey findings identify top performers in several categories:

  • Polaris continues to receive top ratings in all categories from large public libraries, and in general satisfaction, overall product functionality and print functionality among medium-sized public libraries.

  • Apollo from Biblionix received top ratings in all categories from small and very small public libraries.

  • Alma from Ex Libris received top ratings from large academic libraries in the management of electronic resources, customer support and customer loyalty. Ex Libris Aleph scored best among large academic libraries for print functionality and overall product satisfaction.

  • Sierra from Innovative Interfaces received top ratings from large- and mid-sized academic libraries for overall product functionality.

  • OCLC WorldShare Management Services received top ratings from mid-sized academic libraries for general product satisfaction, functionality for managing electronic resources, product support and company loyalty.

  • Small academic libraries rated Koha (managed independently of a support firm) highest in overall product functionality, management of print materials and product support.

  • Library.Solution from The Library Corporation received top ratings from mid-sized public libraries for product support.

  • School libraries rated OPALS (OPen-source Automated Library System) most positively in response to all survey questions.

View the full report, including narrative comments, and an interactive version of the survey’s statistical results, at: http://librarytechnology.org/perceptions/2014/

Evergreen integrated library system version 2.8.0 released

The Evergreen community has announced the release of version 2.8.0 of the Evergreen open-source integrated library system. New features and enhancements of note in Evergreen 2.8.0 include:

  • Acquisitions improvements to help prevent the creation of duplicate orders and duplicate purchase order names.

  • A new Apache access handler that allows resources hosted on, or proxied via, an Evergreen Web server to be authenticated using a user’s Evergreen credentials.

  • Copy locations can now be marked as deleted. This allows information about disused copy locations to be retained for reporting purposes without cluttering up location selection drop-downs.

  • Support for matching authority records during MARC import. Matches can be made against MARC tag/subfield entries and against a record’s normalized heading and thesaurus.

  • Patron message center: a new mechanism via which messages can be sent to patrons for them to read while logged into the public catalog.

  • A number of enhancements have been made to the public catalog to better support discoverability by Web search engines.

For more information about changes in this release, check out the release notes.

Evergreen download page: http://evergreen-ils.org/egdownloads/

V. 2.8.0 Release notes: http://evergreen-ils.org/documentation/release/RELEASE_NOTES_2_8.html

Evergreen project home page: http://evergreen-ils.org/

Related articles