New & Noteworthy

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 25 February 2014


Citation

(2014), "New & Noteworthy", Library Hi Tech News, Vol. 31 No. 1. https://doi.org/10.1108/LHTN-01-2014-0005

Publisher: Emerald Group Publishing Limited



Article Type: New & Noteworthy From: Library Hi Tech News, Volume 31, Issue 1

University College London and Elsevier launch the UCL Big Data Institute

University College London (UCL) and Elsevier, a world-leading provider of scientific, technical and medical information products and services, announced in December 2013 that they will establish the UCL Big Data Institute. The new collaboration will explore innovative ways to better serve the needs of researchers by applying new technologies and analytics to scholarly content and data.

Elsevier believes that linking analytics and scientific content is one of the key enablers of better service to scientists. The company will fund research and studentships through the new institute, offering exciting new opportunities to research the analysis, use and storage of big data.

The new institute will complement Elsevier’s recent acquisition of Mendeley, a company that operates a global research management and collaboration platform in the heart of East London’s tech start-up community, and Elsevier’s recent investment into building a London-based world-class web analytics group. Elsevier will establish a Centre of Excellence within the web analytics group in connection with the institute, co-staffed with UCL researchers looking at areas outlined earlier. This will offer employment as well as commercialisation activities in collaboration with UCL Business and UCL Consultants.

UCL has a wide range of other research activities and joint initiatives within the broad and expanding area of big data and research analytics. To fully realise the synergy between these initiatives through the sharing of insights and resources, UCL is developing a research domain for “e-Research”. UCL plans to have a connected community, from particle physics to digital humanities, sharing insights into better use of large computational resources in research. The Big Data Institute will be a key addition to that family of activities.

UCL will have access to Elsevier’s world-class research data and enterprise-level technology, opening up new research possibilities with a broad range of applications. One such proven, open source big data technology, HPCC, is being used by Elsevier in the ScienceDirect platform, and across Elsevier’s parent company Reed Elsevier.

Ron Mobed, Elsevier Chief Executive Officer, said:

This is a significant investment by Elsevier in UK science, in an area where we have outstanding expertise, and in a collaboration with a world-leading institution. Our aim is to help scientists do better research and do it faster.

UCL President and Provost, Professor Michael Arthur, said:

The UCL Big Data Institute will keep us at the forefront of addressing pressing issues around the storage of big data, the curation of scientific information, and the production, disclosure and consumption of research information. UCL and Elsevier inevitably have complementary interests in many aspects of research dissemination and will both together and independently continue to develop these for the good of the global research effort.

Rt Hon David Willetts, Minister of State for Universities and Science, who attended the launch event at UCL, said:

Big data is one of the eight great technologies. It has the potential to play a crucial role supporting the growth of the UK economy. It’s estimated that the big data market will create up to 58,000 new UK jobs by 2017. Collaborations such as the one between University College London and Elsevier are vital if we are to take full advantage of the big data revolution and stay ahead in the global race.

Read more at ElsevierConnect: http://www.elsevier.com/connect/university-college-london-and-elsevier-launch-ucl-big-data-institute

Cold Spring Harbor Laboratory launches bioRxiv preprint server for biology

Cold Spring Harbor Laboratory (CSHL) has announced a new stage in the evolution of life science communication: a free service called bioRxiv (pronounced “bio-Archive”). It is a preprint server that enables research scientists to share the results of their work before peer review and publication in a journal. Posting to the server and reading its contents costs nothing. Each paper is given a citable web address and is indexed by internet search engines. The best part: from submission to posting takes only hours.

Journals remain the arbiters of quality and reliability in the publication of research results. In addition to editorial selection, journals provide peer review, a process in which a paper must pass critical scrutiny by experts and perhaps be revised before being accepted for publication. But peer review comes at a cost: it may take many months for a manuscript to go from submission to publication. With current biology moving ahead at a phenomenal pace, both scientists and the public are left wondering: how much is progress slowed during the review period?

The lag between the submission of a manuscript to a journal and its eventual publication can be particularly frustrating for new and early-career scientists, whose future employment and funding depend on the availability of their work for scrutiny and assessment soon after they get their first research grants.

BioRxiv bridges the gap between cutting-edge discoveries and traditional publication. How does it work? It is modeled after arXiv, a well-established service provided by Cornell University for the physics research community. To post to bioRxiv, an investigator submits an unpublished manuscript through a web portal; bioRxiv accepts submissions from all areas of the life sciences. The manuscript is then passed on to scientists who have agreed to be bioRxiv affiliates. If a paper is deemed legitimate science, it is published on the bioRxiv web site where it is freely available to read.

Such preprints are therefore not peer reviewed in the traditional sense – in essence, bioRxiv is a distribution service for manuscripts, not a journal. Instead of traditional peer review, scientists have the opportunity to discuss the merits and weaknesses of a manuscript through web-based, moderated comments or by direct interaction with the author. “Scientists bring skepticism and a critical eye to any form of communication, whether or not it has been formally peer reviewed”, points out Dr John Inglis, Executive Director and Publisher of CSHL Press, one of CSHL’s five education divisions.

“Cold Spring Harbor Laboratory has been an important, not-for-profit center for communication among scientists since the 1930s”, says Inglis. “BioRxiv is the latest step in its evolution, built on the ever-growing importance of digital technologies in all aspects of information transfer”:

We have established BioRxiv in response to an evident desire among many scientists for more collaboration and openness in their work, with the goal of accelerating the pace of discovery in biological science. We look forward to seeing what the community will do with bioRxiv’s capabilities and working with them to improve it.

BioRxiv (beta): http://biorxiv.org/

Reveal Digital: providing an alternative for creating digital collections

Founded by a digital collection pioneer, a new company is providing a path for libraries to publish their own digital collections under a unique cost-recovery model that supports open access. Reveal Digital officially launched in January 2013 at the Midwinter meeting of the American Library Association, with a mission to make special collection material more meaningful and accessible to students, faculty and researchers through aggregation and digitization.

Founded by digitization veteran Jeff Moyer, the Ann Arbor, Michigan-based business is working in true partnership with libraries to bring unique content into the digital world. Using a cost-recovery business model, the company’s approach ensures that resulting collections are affordable and relevant for libraries.

“This framework provides a viable alternative to mass digitization efforts”, said Moyer:

In fact, it puts complete control in the hands of the libraries, from selecting and prioritizing which collections are developed to balancing the tradeoffs between features, functionality and price. All content will ultimately become available in an open access environment. We believe this platform will enable new and unimagined collaborations among libraries and content holders as they work together to create collections of all kinds – from highly specialized to broad and inclusive.

Reveal Digital employs a “cost recovery = open access” business model. In short, all costs associated with producing a collection define its sales threshold. Two years after the sales threshold is reached, the collection moves to open access.

Production costs include the cost of conversion, editorial work (including copyright clearance), royalties (when required), sales and marketing, hosting and ongoing development, and administration. The purchase price is determined by the expected number of libraries that will purchase the collection in its first four years. Libraries will be asked to enroll in a non-binding agreement to help estimate the sales forecast and, thereafter, the pricing for each collection.
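The arithmetic behind the model is straightforward: total production costs set the sales threshold, and that threshold is spread across the libraries forecast to buy the collection. The sketch below illustrates this with entirely invented figures and cost categories; it is not actual Reveal Digital pricing.

```python
# Hypothetical sketch of the "cost recovery = open access" model.
# All dollar figures are invented for illustration only.

def per_library_price(production_costs, expected_purchasers):
    """Total production costs define the sales threshold; the per-library
    price spreads that threshold across the libraries expected to buy
    the collection in its first four years."""
    total = sum(production_costs.values())
    return total / expected_purchasers

costs = {  # cost categories named in the model, with made-up amounts
    "conversion": 400_000,
    "editorial_and_copyright": 150_000,
    "royalties": 50_000,
    "sales_and_marketing": 100_000,
    "hosting_and_development": 200_000,
    "administration": 100_000,
}

price = per_library_price(costs, expected_purchasers=200)
print(f"Sales threshold: ${sum(costs.values()):,}")
print(f"Per-library price (200 forecast buyers): ${price:,.0f}")
```

This also shows why the non-binding enrollment matters: the forecast number of purchasers sits in the denominator, so a better forecast directly sharpens the price.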

Reveal Digital is partnering with LYRASIS, which will be the primary sales channel for collections. LYRASIS membership is not required for libraries to purchase collections. Reveal Digital will work directly with selected consortia.

Reveal Digital’s first collaboration will produce Independent Voices, an innovative series of digital collections of alternative press materials, including full runs of newspapers, magazines and journals drawn from widely scattered academic collections. The project will be developed over four years, 2013-2016, and require digitization of more than 1 million pages of content. The collections (which will be cross-searchable and include key metadata) include:

* Women’s Alternative Press.

* GI Underground Press from the Vietnam War era.

* Campus Underground newspapers.

* Minority Press (Black, Hispanic, Native American, Asian).

* Extreme Right Wing Press.

* Anarchist periodicals.

* GLBT Press.

* Literary (Little) magazines.

Full descriptions of the collections can be found on the company’s web site. Content for this project (and all future collections) is curated by a group of scholars and librarians providing editorial guidance.

Explore the initial release of Independent Voices at: http://www.revealdigital.com/voices

Wiley survey reveals generation gap in authors’ open access views and experience

John Wiley & Sons, Inc., has announced the results of its 2013 author survey on open access, with over 8,000 respondents from across Wiley’s journal portfolio. The survey is a follow-up to Wiley’s 2012 open access author survey and is the second such survey conducted by Wiley. This year, new sections were added, including questions on research funding and article licenses.

Consistencies were seen between the 2012 and 2013 surveys in authors’ desire to publish in a high-quality, respected journal with a good impact factor, but the survey also shed light on differences between early career researchers (respondents aged 26-44 with less than 15 years of research experience) and their more established colleagues in their opinions on quality and licenses. Differences were also seen across funding bodies and in the funding available for open access to different author groups.

Key findings included:

* The number of open access authors has grown significantly: the number of Wiley authors who have published an open access article almost doubled since 2012, up to 59 percent from 32 percent. Over half of responding authors received grant funding (24 percent full funding, 29 percent partial funding) to cover article publication charges (APCs), an increase of 43 percent over last year.

* Quality and profile of open access publications remains a concern: 68 percent of funded authors publish their work open access, but for those who chose not to, the most prominent reasons were concerns about the perceived quality and profile of open access publications.

* There are indications of author confusion around funder mandates. A significant majority of authors funded by institutions mandating Creative Commons (CC) licensing (notably authors funded by Research Councils UK (RCUK) and the Wellcome Trust which require immediate Gold or Green after embargo periods) said there is no specific license requirement when publishing open access.

* Respondents overwhelmingly preferred the more permissive licenses: CC-BY-NC (Creative Commons Attribution-NonCommercial License) was ranked in their top three by 81 percent of respondents and 70 percent ranked CC-BY (Creative Commons Attribution License) in their top three, but preferences vary by age group. Early career professionals were 6 percent more likely to publish under a CC license than more mature researchers, while over half of respondents above the age of 55 preferred not to use CC licenses of any kind.

* Considerable differences emerge between early career professionals and more established colleagues when comparing funding and payments for APCs. Early career professionals were significantly more likely to have APCs paid for by funders or institutions and were far less likely to pay out of their own funds than respondents over the age of 45 with more than 15 years of experience.

“This year’s survey has shown many steady trends in author views on open access and also clear differences between those of early career and more experienced researchers”, said Rachel Burley, Vice President and Director, Open Access, Wiley. “We are continuing to build the Wiley Open Access program to provide more high-quality, peer-reviewed journals and ensure that we offer authors the options that are right for them”.

The survey was circulated to 107,000 corresponding journal article authors of 2012 papers across Wiley’s journal portfolio. Regionally, 36 percent of respondents were from the Americas, 45 percent from countries in Europe, the Middle East and Africa, and 19 percent from the Asia/Pacific region. At the institution level, authors at universities/colleges comprised the majority of respondents (5,377 responses, 64 percent). The second and third largest pools of responses were authors working at research institutes (937 responses, 11 percent) and hospitals/clinics (839 responses, 10 percent).

To view the results in more detail, view the full presentation on Slideshare: http://www.slideshare.net/WileyScienceNewsroom/wileys-2013-open-access-author-survey

You can also examine some of the data with an interactive visualization tool, and find out how open access authors vary by research experience, region, and subject area, at: http://static.wileyprojects.com/oasurvey/

Open Access Clauses in Publishers’ Licenses – report of COAR Task Force published

The Open Access Licenses and Agreements Task Force has announced the release of a new report, Open Access Clauses in Publishers’ Licenses: Current State and Lessons Learned. The Open Access Agreements and Licensing Task Force is a multi-stakeholder group initiated and supported by the Confederation of Open Access Repositories (COAR), with members representing a number of different types of organizations (libraries, licensing agencies, library associations, and open access groups) who share a common interest in promoting sustainable and effective practices for Open Access. The new report presents the results of a review of licenses that secure the rights for authors and/or their institutions to deposit published articles into an OA repository.

As Open Access (OA) policies and laws are being adopted world-wide, the scholarly community is shifting its efforts from advocacy towards practical implementation and support. One of the major routes for making articles open access is through OA repositories. However, the variety and lack of clarity of publishers’ policies regarding article deposit can be a significant barrier to author compliance with OA policies.

In order to overcome this barrier, some organizations have successfully negotiated authors’ or deposit rights with publishers in the context of purchasing content licenses. This report documents the existing OA licensing language that has been implemented by organizations around the world and presents some suggestions for its successful adoption. The report concludes that OA clauses offer a feasible option for institutions to address some of the obstacles to article deposit into repositories.

Read the report: http://www.coar-repositories.org/files/OA-Clauses-in-Publishers-Licenses.pdf

Open Access Agreements and Licensing Task Force home: http://www.coar-repositories.org/activities/repository-content/licenses-task-force/

CC issues version 4.0 of CC license suite

Creative Commons (CC) announced in November 2013 that version 4.0 of its licensing suite is now available for use worldwide.

This announcement comes at the end of a two-year development and consultation process, but in many ways, it began much earlier. Since 2007, CC has been working with legal experts around the world to adapt the 3.0 licenses to local laws in over 35 jurisdictions. In the process, CC and its affiliates learned a lot about how the licenses function internationally. As a result, the 4.0 licenses are designed to function in every jurisdiction around the world, with no need for localized adaptations.

In a blog post celebrating the launch, CC general counsel Diane Peters acknowledged the role that CC’s affiliates played in developing the new licenses:

The 4.0 versioning process has been a truly collaborative effort between the brilliant and dedicated network of legal and public licensing experts and the active, vocal open community. The 4.0 licenses, the public license development undertaking, and the CC organization are stronger because of the steadfast commitment of all participants.

CC is a nonprofit organization that enables the sharing and use of creativity and knowledge through free legal tools. Creators and copyright holders can use its licenses to allow the general public to use and republish their content without asking for permission in advance. There are over half a billion CC-licensed works, spanning the worlds of arts and culture, science, education, business, government data, and more.

The improvements in version 4.0 reflect the needs of a diverse and growing user base. The new licenses include provisions related to database rights, personality rights, data mining, and other issues that have become more pertinent as CC’s user base has grown. “These improvements may go unnoticed by many CC users, but that doesn’t mean they aren’t important”, Peters said. “We worry about the slight nuances of the law so our users don’t have to”.

Launch announcement: http://creativecommons.org/weblog/entry/40768

List of new features in version 4.0: http://creativecommons.org/version4

Browse over a million online books on the DPLA Bookshelf

The Digital Public Library of America (DPLA) strives to contain the full breadth of human expression, from the written word, to works of art and culture, to records of America’s heritage, to the efforts and data of science. Since launching in April 2013, it has aggregated over 5 million items from over 1,000 institutions. At its DPLAfest on October 24-25, 2013 in Boston, the DPLA introduced a new way to browse over a million online books it has added to its collection. DPLA Bookshelf lets the user scroll a visual representation of a bookshelf that provides all the instantaneous power the digital world provides.

When a user of the DPLA site searches for books, the results are displayed as books on a bookshelf; the shelf is shown as a vertical stack so that the titles and authors are more easily readable on their spines. The width of each book represents the actual height of the physical book, and its thickness represents its page count. The spine is colored with one of ten shades of blue to “heatmap” how relevant the work is to the reader’s search.
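The visual encoding described above can be sketched as a simple mapping from book metadata to drawing attributes. This is an illustrative reconstruction, not the actual Stacklife/Bookshelf source code; the function name, pixel scales, and field names are invented.

```python
# Illustrative sketch of the DPLA Bookshelf visual encoding (hypothetical,
# not the real implementation): spine width tracks the physical book's
# height, thickness tracks page count, and one of ten blue shades
# "heatmaps" relevance to the current search.

def spine_attributes(height_cm, page_count, relevance, max_relevance):
    """Map one book's metadata to drawing attributes for its spine."""
    # Bucket relevance into shades 0..9 (darker = more relevant).
    shade = min(9, int(relevance / max_relevance * 10))
    return {
        "width_px": round(height_cm * 4),          # taller book -> wider spine
        "thickness_px": max(8, page_count // 25),  # more pages -> thicker spine
        "blue_shade": shade,
    }

attrs = spine_attributes(height_cm=23.0, page_count=350,
                         relevance=0.9, max_relevance=1.0)
print(attrs)
```

The appeal of this kind of encoding is that it keeps a physical-library cue (a fat, tall book really looks fat and tall on the shelf) while adding something a real shelf cannot show: relevance to the reader’s query.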

When a reader clicks on one of the books, additional information about it is displayed to its right. The reader can launch the e-book with the click of a button. Or, he or she can explore further by clicking on one of the subjects under which the book has been categorized. This replaces the existing shelf with a shelf containing all the other books in the DPLA collection categorized under that same subject.

Further, when a reader clicks on a book, the DPLA Bookshelf displays thumbnails of images within the DPLA collection related to that book’s subject areas. Clicking on a thumbnail displays the image and additional information about it.

DPLA Bookshelf was created by the Harvard Library Innovation Lab, based on its Stacklife project. With the help of a Beta Sprint grant from the DPLA, the group had created an earlier version that mashed up several book sites, including the DPLA’s book offerings.

Dan Cohen, DPLA Executive Director, said:

We knew that when we announced the addition of a large number of books to the DPLA collection that we would have to find an easy, yet powerful, way for our users to browse the collection. This creative new interface adds to our discovery portal the most familiar of library metaphors: a quickly scannable shelf full of books.

The DPLA front-end, including Bookshelf, is an open source project (github.com/dpla/frontend).

Check out the DPLA Bookshelf at: http://dp.la/bookshelf

In other DPLA news, a new DPLA program funded by the Bill & Melinda Gates Foundation will produce workshops and education modules to train public librarians in digitization, metadata creation, and digital technologies. The DPLA announced in October 2013 that it has received a $990,195 grant from the Bill & Melinda Gates Foundation to build upon its network of library professionals and organizations to pilot a national-scale training system for public librarians. Under the grant, the DPLA will collaborate with its service “hubs” – regional digital library partners located in states and regions in the USA – to build curricular resources and implement hands-on training programs that develop digital skills and capacity within the staffs of public libraries.

“From its inception, the DPLA has partnered with libraries across America to help bring their riches to the country and the world”, said Dan Cohen, DPLA’s Executive Director. “With this generous grant from the Gates Foundation, we can extend this partnership to help local libraries and librarians take full advantage of what digital technology has to offer”.

As part of the new program, current librarians and library volunteers around the country will work with the DPLA to acquire, use, and sustain new digital skills using DPLA’s open materials and services, such as metadata creation, digitization, and virtual exhibition curation. Public librarians will receive the training required to produce digitized materials and curate these into virtual exhibitions.

The public libraries that participate in this pilot program will foster a greater understanding of local historical and cultural content, directing their communities to curated materials as well as a mass of relevant items in the DPLA and its associated collections. These libraries will have the opportunity to associate professional metadata with their collections, providing the ability to share their local heritage globally via the DPLA’s portal and platform. The DPLA sees this initiative as an important step in its nascent DPLA Local program, which aims to tailor DPLA’s infrastructure for customized use in communities across the USA.

The funding is part of the Gates Foundation’s global libraries program, which works to ensure that all people, especially those in disadvantaged communities around the world, have access to information through technology in public libraries.

“Public libraries are among the most beloved and trusted institutions in America”, Cohen said. “It’s a privilege to be able to assist them in their mission through this new program”.

DPLA virtual exhibitions can be viewed at: http://dp.la/exhibitions

New research reveals unexpected outlook for future of printed books, eBooks

In December 2013 Ricoh Americas Corporation (Ricoh) announced the findings of its commissioned IT Strategies books study, performed in conjunction with the University of Colorado. Among the key findings of the study: eBooks’ mindshare owes more to popular press headlines than to factual data, and most consumers do not see themselves giving up printed books, given the benefits the physical form offers.

The most surprising results of the study entitled “The Evolution of the Book Industry: Implications for US Book Manufacturers and Printers” include:

* Nearly 70 percent of consumers feel it is unlikely that they will give up on printed books by 2016. Consumers have an emotional and visceral/sensory attachment to printed books, potentially elevating them to a luxury item.

* Despite their perceived popularity, 60 percent of eBooks downloaded in the USA are never read. Since 2012, the growth of eBooks has slowed significantly as dedicated eReader sales decline and tablet PCs are increasingly used for other forms of entertainment.

* College students prefer printed textbooks to eBooks as they help students to concentrate on the subject matter at hand; electronic display devices such as tablet PCs tempt students to distraction.

* Current trends reveal that while fewer copies of books are being sold, more titles are being published.

* Digital printing of “ultra short runs” has empowered book printers to supply books more tightly tied to actual demand.

* The top three reasons consumers choose a printed book are: lack of eye strain when reading from a paper copy vs an eBook; the look and feel of paper; and the ability to add it to a library or bookshelf.

“More than 500 years after the invention of the printing press, book manufacturers and publishers are playing a pivotal role in the next renaissance in books that is happening now”, said George Promis, Vice President of Continuous Forms Production Solutions & Technology Alliances, Ricoh:

To borrow a phrase from Mark Twain, reports of the printed book’s death are greatly exaggerated. Print is alive, well and sought after in today’s book market. At Ricoh, we’re focused on ensuring this stays true for years to come.

Other findings from the study specifically relevant to publishers and book manufacturers include:

1. Publishers are using digital printing in two ways:

* As a test with one to two books placed per retailer, circumventing cumbersome distributor guidelines and storage fees before ordering larger offset or digitally printed quantities.

* For predicted strong titles, digitally printed books are used for reorders as needed to supplement first-run offset printed books.

2. Digital production inkjet printers have opened the door to a business model shift. The study estimates that just 50 production inkjet systems, owned by 25 book manufacturers, together produced more than 10 percent of all printed book pages in the USA in 2012.

3. Offering titles electronically does not correspond to revenue generation or cost savings – even the largest publishers derive revenues of no more than 20-30 percent from eBook sales.

The study surveyed more than 800 respondents, with the following demographic profile:

1. Gender: 55 percent female, 45 percent male.

2. Average age: 39 years old.

3. Education:

* 0.2 percent have not completed high school (one respondent).

* 36 percent have a high school degree.

* 49 percent have an undergraduate degree.

* 15 percent have a graduate or higher degree.

“Despite the perceived growth of eBooks, our research shows that there is a silver lining for the printed books and the digital production print industries”, said Marco Boer, Consulting Partner, IT Strategies:

As book orders become smaller in quantity and more frequent, and as an unprecedented number of titles are introduced each year, digital print is helping book manufacturers tackle potential challenges head on through automation and more intelligent printing.

Download the IT Strategies White Paper, “The evolution of the book industry: implications for US book manufacturers and printers”: http://www.infoprint.com/internet/comnelit.nsf/Files/ITStrategies_FINAL/$File/ITStrategies_FINAL.pdf

Tablet and e-reader ownership update 2013: new research findings from Pew

The number of Americans ages 16 and older who own tablet computers has grown to 35 percent, and the share who have e-reading devices like Kindles and Nooks has grown to 24 percent. Overall, the number of people who have a tablet or an e-book reader among those 16 and older now stands at 43 percent.

More than half of those in households earning $75,000 or more now have tablets, up from 25 percent last year. Among upper-income households, 38 percent now have e-readers, up from 19 percent last year.

These latest figures come from a survey by the Pew Research Center’s internet project which was conducted from July 18 to September 20, 2013 among 6,224 Americans ages 16 and older. The margin of error is plus or minus 1.4 percentage points.

Interviews were conducted via landline (n=3,122) and cell phone (n=3,102, including 1,588 without a landline phone). The survey was conducted by Princeton Survey Research Associates International, and the interviews were administered in English and Spanish by Princeton Data Source. Statistical results are weighted to correct for known demographic discrepancies. Results based on the 5,320 internet users have a margin of sampling error of ±1.5 percentage points.
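As a rough sanity check on the reported margins, the 95 percent margin of error for a simple random sample can be computed from the sample size alone. Pew’s published figures are slightly larger than this back-of-the-envelope calculation because weighted survey designs carry a design effect, which this sketch ignores.

```python
# Simple-random-sample margin of error at 95 percent confidence,
# worst case p = 0.5, in percentage points. Ignores the design
# effect of weighting, so it slightly understates Pew's figures.
import math

def margin_of_error(n, z=1.96, p=0.5):
    """95 percent margin of error in percentage points, SRS assumption."""
    return z * math.sqrt(p * (1 - p) / n) * 100

print(f"All respondents (n=6,224): ±{margin_of_error(6224):.2f} points")
print(f"Internet users (n=5,320):  ±{margin_of_error(5320):.2f} points")
```

The SRS values come out near ±1.2 and ±1.3 points, consistent with Pew’s reported ±1.4 and ±1.5 once weighting is accounted for.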

View or download the full report: http://pewinternet.org/Reports/2013/Tablets-and-ereaders.aspx

For more information on the Pew Research Center’s research related to mobile technology, visit the internet project’s fact sheet (updated September 2013) at: http://pewinternet.org/Commentary/2012/February/Pew-Internet-Mobile.aspx

New technology watch report from DPC on preservation and access for e-journals

“Preservation, Trust and Continuing Access for e-Journals” is the latest in the Digital Preservation Coalition’s (DPC) series of Technology Watch Reports released on October 30. Written by Neil Beagrie, and published in association with Charles Beagrie Ltd, this report was released at the DPC’s e-Journals Summit at the RIBA headquarters at 66 Portland Place, London.

Endorsed by LIBER (The Association for European Research Libraries), the report discusses the critical issues of preservation, trust and continuing access for e-journals, particularly in light of the dynamic and interdependent resources they have become, as well as the ever-growing trend towards open-access.

With extensive experience in this field and a particular reputation for his policy advice on e-journals and the cost/benefits of digital preservation for Jisc and others, Neil tells us that these:

[…] issues have become increasingly important for research libraries as published journals and articles have shifted from print to electronic formats; and as traditional publishing business models and relationships have undergone major transformations as a result of that shift.

With these issues in mind, the report provides a comprehensive review of the latest developments in e-journal preservation, outlining key considerations and an application of best practice standards. The report introduces a range of service providers that now support continuing access and/or preservation of e-journals and how research libraries have increasingly come to trust them.

Neil explains that:

[…] for trust to be established between libraries and digital preservation services there needs to be clear agreements for long-term archiving, and clear procedures and mechanisms for those agreements to be implemented and validated when necessary across all elements of the supply chain.

Matthew Herring from the University of York is sure that the report provides answers to these requirements, calling it:

[…] a clear, comprehensive and informative introduction to the area […] if I was trying to grapple for the first time with long-term e-journal access, I would find this a very helpful guide.

Oya Y. Rieger, Associate University Librarian for Digital Scholarship and Preservation Services at Cornell University Library agrees, adding that:

[…] due to inherent risks associated with digital media, the initial focus of earlier preservation studies was much more on technology issues. Neil’s comprehensive analysis illuminates the complex and integrated nature of technical, policy, business, and trust issues underlying e-journal preservation.

While “Preservation, Trust and Continuing Access for e-Journals” predominantly addresses issues felt most keenly by libraries, scholars and publishers, the report also includes generic lessons on outsourcing and trust, learnt in this field, that are of interest to the wider digital preservation community. It is not solely focused on technology, and covers relevant legal, economic and service issues.

The not-for-profit DPC is an advocate and catalyst for digital preservation. The coalition ensures its members can continue to deliver resilient long-term access to digital content and services through knowledge exchange, capacity building, assurance, advocacy and partnership. Its primary objective is raising awareness of the importance of the preservation of digital material and the attendant strategic, cultural and technological issues. The DPC’s technology watch reports support this objective through an advanced introduction to topics that have a major bearing on its vision to “make our digital memory accessible tomorrow”.

Download Neil Beagrie’s report “Preservation, Trust and Continuing Access for e-Journals” at: http://dx.doi.org/10.7207/twr13-04

Portico e-journal service reaches historic milestone: more than 25 million articles preserved

Portico has announced that it is preserving 25 million journal articles – and counting – through its e-journal preservation service. The articles represent content from more than 287,000 volumes and more than 1 million journal issues.

This is a significant milestone for Portico and the library and publisher community that supports digital preservation. “It is quite remarkable to see the growth in the Portico archive over the last eight years”, commented Marilyn Geller, collection management librarian at Lesley University, an early Portico participant:

When it first started, the Portico system was an untested concept. But in the ensuing years, the system has proven successful and the organization has demonstrated that they can be trusted when unforeseen circumstances happen. We couldn’t be more pleased with Portico’s growth.

Portico developed out of an urgent need for the preservation of a growing number of electronic journals. “As more journals moved online, the community was facing a preservation problem we had never known before”, noted Craig Van Dyck, vice president of global content management at Wiley:

Wiley was an early supporter of digital preservation, and was engaged with Portico’s development from the early days. We knew a collaborative effort between publishers and libraries was necessary to preserve this new medium, and Portico was just the kind of third party that was needed.

Since 2005, the number of titles and types of content has grown significantly, as has participation, which now exceeds 1,000 libraries and publishers. Nearly 17,000 e-journals, more than 220,000 e-books, and 72 digitized historical collections have been entrusted to Portico for long-term preservation. The articles, books, and digital collections currently preserved in the archive translate to nearly 415 million files and 60 terabytes of content.
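A back-of-envelope reading of those archive totals, assuming decimal terabytes (1 TB = 10^12 bytes, a convention the announcement does not specify):

```python
# Roughly 415 million files totalling roughly 60 terabytes implies an
# average file size on the order of 145 KB. Decimal units are assumed;
# the announcement does not state which terabyte convention it uses.
files = 415_000_000
total_bytes = 60 * 10**12

avg_kb = total_bytes / files / 1000  # average file size in kilobytes
print(round(avg_kb))
```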

“We continue to look for ways to extend our service to the community”, said Kate Wittenberg, Managing Director, Portico. “The growth of the archive underscores the demand for electronic content and the need to securely preserve it”.

Portico is a digital preservation service for e-journals, e-books, and digitized historical collections. More than 850 libraries and 220 publishers, representing thousands of professional and scholarly societies, are preserving their content in the Portico archive to ensure that the community’s investment in digital content is protected.

For more information about Portico, visit: http://www.portico.org

OCLC to partner with Infopeople to produce new online learning content for libraries

The Institute of Museum and Library Services (IMLS) recently awarded OCLC a 2013 Laura Bush 21st Century Librarian Program grant aimed at strengthening and sustaining continuing education programs for libraries nationally.

A grant award of $248,982 will produce new online learning content for public libraries. In partnership with Infopeople, OCLC will provide training and guidance for continuing education providers to create effective learning content that can reach larger audiences of library staff. Participants will receive comprehensive formal training focused on how to design and deliver engaging, high-quality e-learning, and then will collaboratively produce up to six new online course modules on topics identified as highest priority for public libraries. The content will be designed for publication on any platform and distributed nationally through WebJunction.org.

“We are pleased to make these significant grants supporting continuing education”, said Susan Hildreth, Director of the Institute of Museum and Library Services. “Supporting library continuing education is a high priority for IMLS. High quality library service depends upon smart investments in the people who work in libraries”.

Infopeople is a federally funded grant project administered by the California State Library. The project provides continuing education for California library staff; offers technical assistance to libraries; explores library applications of new technologies; develops and shares resources; and facilitates statewide communication among libraries. Infopeople’s virtual training is available to, and well used by, the greater library community outside of California.

“Infopeople has long viewed online learning as a natural extension of our mission to provide the California library community with the skills, tools and resources needed to deliver high-quality service in a rapidly changing world”, said Lisa Barnhart, Infopeople Training Coordinator. “We look forward to this opportunity to explore new ways to engage our learners”.

“Libraries and cultural institutions are central to achieving lifelong learning for all Americans”, said Cathy De Rosa, OCLC Vice President for the Americas:

With high demands on librarians’ time and resources, this IMLS investment provides a community-based forum to advance and coordinate our national efforts to create training that supports librarians and all those they serve.

More about Infopeople at: https://infopeople.org/

Makerspaces in libraries survey results

During October and November 2013, John J. Burke, Director, Gardner-Harvey Library, Miami University Middletown (Ohio), conducted a web-based survey on makerspaces in libraries. Respondents were solicited from 12 electronic discussion lists, some tweets, and one Facebook group: Makerspaces and the Participatory Library.

A total of 143 librarians responded to the survey: 41 percent currently provide makerspaces in their libraries (or provide maker activities through their libraries), 36 percent plan to start makerspaces in the near future, and 24 percent neither currently provide makerspaces nor plan to do so. The following responses all come from the 109 librarians who currently provide makerspaces or who plan to start one soon.
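The reported percentages are rounded, and the survey summary does not give the underlying counts. One breakdown consistent with both the rounded percentages and the 109-respondent subset, inferred here rather than taken from the survey, is:

```python
# 143 respondents: 41% with makerspaces, 36% planning one, 24% with no
# plans, and a combined subset of 109 in the first two groups. The raw
# counts below are inferred to fit those rounded figures; they are an
# assumption, not numbers reported by the survey.
total = 143
current, planning, no_plans = 58, 51, 34

assert current + planning + no_plans == total
assert current + planning == 109               # subset used for later questions
assert round(current / total * 100) == 41
assert round(planning / total * 100) == 36
assert round(no_plans / total * 100) == 24
print(current, planning, no_plans)
```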

Makerspaces appear in most types of libraries. 51 percent of respondents are in public libraries, 36 percent are in academic libraries, and 9 percent are in school libraries. The remaining 4 percent chose “other” for their type of library (0 percent selected special libraries) – entering combined school and public libraries (2), a community college library, and an iSchool.

Librarians from 30 US states, from Alabama to Wisconsin, responded to the survey along with librarians from seven other countries (Australia, Canada, China, Denmark, Japan, The Netherlands, and the UK).

Makerspaces tend to be a new addition to most respondents’ libraries. 46 percent of respondents started their makerspace in the last year, 13 percent in the last one to two years, and 11 percent two or more years ago. The remaining responses came from libraries that had not yet started to offer a makerspace.

Burke asked respondents to choose the technologies or forms of making they included in their makerspaces from a list of 55 items. All but six of the items were selected by at least one makerspace (those six were welding, stained glass, metal shop activities, letterpress, glass shop activities, and blacksmithing). The top 15 technologies or forms of making, each of which was chosen by 25 percent or more of the 109 respondents, were:

1. Computer workstations: 67 percent.

2. 3D printing: 46 percent.

3. Photo editing: 45 percent.

4. Video editing: 43 percent.

5. Computer programming/software: 39 percent.

6. Art and crafts: 37 percent.

7. Scanning photos to digital: 36 percent.

8. Creating a web site or online portfolio: 34 percent.

9. Digital music recording: 33 percent.

10. 3D modeling: 31 percent.

11. Arduino/Raspberry Pi: 30 percent.

12. Other: 30 percent (included knitting, Legos, etc.).

13. Animation: 28 percent.

14. High quality scanner: 28 percent.

15. Tinkering: 26 percent.

Librarians reported that training sessions, workshops, or classes in their makerspaces were taught by library staff (49 percent), volunteers (27 percent), paid instructors from beyond the library (13 percent), or “other” (12 percent), which includes IT staff, maker group members, “Student Geek Force”, and “center for teaching and learning”.

Respondents also listed their most popular making activities and technologies, their go-to resources for keeping track of making developments, and what they expect to add to their makerspaces in the next year.

More details from the survey will be available in the forthcoming book Makerspaces: A Practical Guide for Librarians by John Burke (to be published by Rowman & Littlefield Publishers in 2014).

Additional details of the survey results at: http://www.users.miamioh.edu/burkejj/Makerspaces%20in%20Libraries%20Survey%20Results%202013.pdf

Koha version 3.14.0 released

Koha is the first free and open source software library automation package (integrated library system, or ILS). In use worldwide in libraries of all sizes, Koha is a true enterprise-class ILS with comprehensive functionality including basic and advanced options. Koha includes modules for acquisitions, circulation, cataloging, serials management, authorities, flexible reporting, label printing, multi-format notices, offline circulation for when internet access is not available, and much more. Koha will work for consortia of all sizes, multi-branch, and single-branch libraries.

The Koha community has announced the release of version 3.14.0 of the Koha library automation system. Koha 3.14.0 is a major functionality release that includes various new features, including:

* A new default theme, “Bootstrap”, for the public catalog interface. The Bootstrap theme provides a responsive design that works well both for desktop web browsers and mobile devices.

* A new course reserves module.

* A new offline circulation module that is entirely browser-based.

* A revamp of the serials prediction system that allows for significantly more flexibility in managing patterns.

* The ability to create lists of patrons.

* The ability to search for items to check out by keyword rather than barcode.

* The ability to overlay item records during batch imports.

* The ability to define rules (called MARC modification templates) for batch-modifying MARC records during import.

* Numerous acquisitions workflow improvements, including the ability to move orders from one vendor to another, search acquisitions history, merge duplicate invoices, and get warnings if a new order would over-commit a fund.

* Cataloging improvements including the ability to import authority records via Z39.50, the ability to merge authority records, and the ability to save and continue editing bib records.

* Support for schema.org microdata in the OPAC to increase the visibility of your catalog to search engines.
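As a generic sketch of the schema.org microdata pattern that the last feature refers to (not Koha's actual OPAC markup): a record is annotated with `itemscope`/`itemtype` to declare the entity and per-field `itemprop` attributes, which a search-engine crawler can then extract.

```python
from html.parser import HTMLParser

# Illustrative microdata for a catalog record; the markup and the
# sample bibliographic values are hypothetical, not taken from Koha.
record_html = """
<div itemscope itemtype="http://schema.org/Book">
  <span itemprop="name">Moby-Dick</span>
  <span itemprop="author">Herman Melville</span>
  <span itemprop="isbn">978-0-14-243724-7</span>
</div>
"""

class MicrodataProps(HTMLParser):
    """Collect itemprop attribute values, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.props = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "itemprop":
                self.props.append(value)

parser = MicrodataProps()
parser.feed(record_html)
print(parser.props)  # ['name', 'author', 'isbn']
```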

There are also a number of under-the-hood improvements, including:

* Zebra DOM filter support for UNIMARC bib and authority records.

* The introduction of an ORM, DBIx::Class.

* Significantly improved unit test coverage.

And, as always, there are numerous bugfixes and minor improvements.

Koha 3.14.0 can be downloaded from: http://download.koha-community.org/koha-3.14.00.tar.gz

Koha project home: http://koha-community.org/

The evolution of bibliographic data exchange: Information Standards Quarterly special issue

The National Information Standards Organization (NISO) announces the publication of a special themed issue of Information Standards Quarterly (ISQ) on the topic of the evolution of bibliographic data exchange. Libraries are in the midst of moving away from AACR2 and MARC 21 to the new world of the semantic web, linked data, FRBR, and RDA. As noted by Ted Fons, Executive Director, Data Services & WorldCat Quality at OCLC and the guest content editor for this ISQ issue:

[…] the success of the web as a research tool has dramatically changed the library’s role in the exposure of library catalogs […]. The rise of new metadata initiatives reflects the need to respond to this change and to increase our effectiveness in the exchange and management of library metadata.

Fons has gathered together in the Winter 2013 issue of ISQ a set of thoughtful and informative articles about the work that is underway in this bibliographic data evolution.

In the feature article, “Are Current Bibliographic Models Suitable for Integration with the Web?”, Lars Svensson from the German National Library discusses why it is important for libraries to make their metadata an integral part of the web and why libraries need (but do not yet have) an agreed-upon model that can draw in entities across the cultural heritage sector. In his opinion piece, “Replacing MARC: Where to Start”, Paul Moss (OCLC) further emphasizes the need to step away from thinking solely about a single library interchange format and instead consider that “each function MARC serves should be examined independently and may be replaced by a different technology”.

The Library of Congress initiated a bibliographic framework project (BIBFRAME) in 2011 to define a replacement for MARC 21. The George Washington University is one of the early experimenters for BIBFRAME, and in the first in-practice article Jackie Shieh reviews how its participation was a “transformative opportunity” for their staff “to contribute and establish a new standard that would benefit researchers navigating the information sphere”. Another national library, the Bibliothèque nationale de France (BnF), is also a leader in defining and implementing this new bibliographic model. Ted Fons interviews Gildas Illien, Director of BnF’s Bibliographic and Digital Information Department, who discusses the need for a new framework for bibliographic data exchange, what the BnF has already done to transform the way it expresses its bibliographic data, what European libraries have been focusing their metadata management efforts on in the past five years, and what the focus should be in the next two years.

As an example of how the bibliographic community is addressing these issues, in the context of the wider web, Richard Wallis (OCLC) reviews the work of Schema Bib Extend, a W3C Community Group focused on establishing a consensus within the bibliographic community around proposals for extending the Schema.org vocabulary to enhance its capabilities in describing bibliographic resources. Todd Carpenter (NISO) reviews how the NISO Bibliographic Roadmap project is “Charting a Course through a New Exchange Environment” in an effort to ensure that the needs of a variety of affected stakeholders – not just libraries – will be fully integrated into the new bibliographic ecosystem.

“It is my hope”, states Fons:

[…] that this set of thoughtful essays provides you with some insight into the landscape of new metadata initiatives and is a useful continuation of the dialog on how we can improve data exchange.

ISQ is available in open access in electronic format on the NISO web site. Both the entire Winter 2013 Evolution of Bibliographic Data Exchange issue and the individual articles may be freely downloaded. Print copies are available by subscription and as print on demand.

For more information and to access the free electronic version, visit: http://www.niso.org/publications/isq
