New & Noteworthy

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 2 August 2013

Citation

(2013), "New & Noteworthy", Library Hi Tech News, Vol. 30 No. 6. https://doi.org/10.1108/lhtn.2013.23930faa.002

Publisher

Emerald Group Publishing Limited

Copyright © 2013, Emerald Group Publishing Limited



Article Type: New & Noteworthy From: Library Hi Tech News, Volume 30, Issue 6

Free, public maker space to open in Chicago Public Library Innovation Lab

The Chicago Public Library (CPL) is opening the CPL Innovation Lab at the Harold Washington Library Center. Already used in a variety of sectors, from retail and banking to universities, innovation labs offer organizations a place to test new ideas for services, programs and products. The third-floor space will allow CPL to experiment quickly with new ideas and approaches in order to be more customer-focused and better able to adapt to the community’s changing needs.

The first innovation experiment in the space is the Maker Lab, part of the growing movement of hands-on, collaborative learning environments in which people come together to share knowledge and resources to design, create and build items. CPL is the first large urban library to experiment with a maker space. Made possible with a grant from the Institute of Museum and Library Services (IMLS) to the CPL Foundation, the Maker Lab will be open to the public from July 8 through December 31, 2013. While a number of maker spaces exist in Chicago, this will be the first free maker space open to the public.

Created in partnership with the Museum of Science and Industry, the Library’s Maker Lab offers the public an introduction to technology and equipment that are enabling new forms of personal manufacturing and business opportunities. After the six-month run, the library will evaluate the project to determine its fit with the library’s mission and the feasibility of bringing the project, or elements of it, to a wider audience in the neighborhood branches.

The lab will offer access to a variety of software, such as Trimble SketchUp, Inkscape, MeshLab and MakerCAM, and to equipment including three 3D printers, two laser cutters, a milling machine and a vinyl cutter.

In addition to Open Lab hours, during which patrons can work with staff members to master new software and create personal projects, a variety of programs and workshops will be offered throughout the Maker Lab’s seven-day schedule. Family workshops will be offered every Sunday afternoon to foster invention, creation and exploration of science, technology, engineering, art and math (STEAM), the focus of this year’s Summer Learning Challenge.

“We are thrilled to be able to offer Chicagoans the opportunity to learn firsthand new technologies and skills used in today’s manufacturing at the library,” said Commissioner Brian Bannon. “The Maker Lab is the first of several ideas we plan to test over the next few years in the Innovation Lab, as we focus on expanding access to twenty-first century ideas and information to our communities.”

In developing the space and the programs, CPL created an advisory board comprising City Colleges of Chicago, the Northwestern University Department of Mechanical Engineering, Columbia College Chicago, Westport Public Library, Arts Alliance Illinois, the San Francisco Children’s Creativity Museum, New York Public Library, Ann Arbor District Library, Pumping Station: One and FreeGeek Chicago. All of these organizations contributed advice to the process as well as programming elements.

The CPL continues to encourage lifelong learning by welcoming all people and offering equal access to information, entertainment and knowledge through materials, programs and cutting-edge technology.

For more information, visit: www.chipublib.org

Survey of colleges in Scotland shows social media has graduated to academia

A new survey of colleges across Scotland shows that social media, and particularly YouTube, has firmly entered the learning environment as a teaching and learning tool, with its use growing significantly year on year.

In the 2012 Enhanced Training Needs Analysis (ETNA) survey, carried out by the Jisc Regional Support Centre (RSC) Scotland and launched in June at its annual conference in Edinburgh, nearly three quarters of academics in further education agree that social media tools enhance the quality of the learning experience. YouTube is by far the most popular tool, while Facebook and, particularly, Twitter lag well behind. However, the survey also identifies a strong need for staff training in the use of social media.

The 2012 ETNA survey is the fifth of its kind in Scotland; ETNA surveys have been carried out across Scottish colleges for more than a decade, analysing technology in further education and showing trends over time. In 2012, 1,700 staff took part, including more than 700 academics across 40 of the 43 colleges. Together with responses from admin and support staff, managers, learning resource staff, learning technologists, and technical and network staff, it provides a comprehensive picture of technology in the learning landscape.

Of those surveyed:

  • academic staff seemed most in favour of social media: 70 percent agreed that its use enhances the quality of the learning experience and 69 percent agreed that students were at ease using it;

  • some academic staff felt that social media is a distraction to learning;

  • around half of all middle managers said their department uses social media tools for learning and teaching;

  • fewer than 10 percent of staff in any category, however, had received training in social media; and

  • more than a third of staff identified a need for staff training.

Of the media channels:

  • the video world of YouTube stood out strongly, used by 62 percent of academic staff and 40 percent of learning technologists;

  • other media lagged far behind, with Facebook used by only 15 percent of academic staff and Twitter used by just 3 percent;

  • blogs and wikis sat just behind Facebook at 14 and 13 percent of academic staff;

  • emerging platforms such as Pinterest and Flipboard were used by just 1 percent of academics and not at all by managers;

  • Facebook was more popular among admin and support staff, learning resource staff and learning technologists than it was among academic staff; and

  • all social media access was still completely blocked by a significant minority of colleges.

Celeste McLaughlin, advisor: staff development at Jisc RSC Scotland said: “It’s clear from the survey that social media is now here to stay in colleges as learning tools. They offer a familiar environment for students and, at the same time, teaching staff clearly like them. In particular, the ability to share videos online has made YouTube a clear favourite. But training is patchy, so Jisc RSC Scotland aims to help college staff improve their social media skills.”

The 2012 ETNA survey report, Growth and Development – An Analysis of Skills and Attitudes to Technology in Scottish Further Education, was launched at the Jisc RSC Scotland annual conference, Bring Me That Horizon!, on June 7, 2013.

Download the 2012 ETNA survey at: www.rsc-scotland.org/wp-content/uploads/2013/06/2012ETNAVolVFinal.pdf

Smartphone and tablet ownership in 2013 – new research reports from Pew

For the first time since the Pew Research Center’s Internet & American Life Project began systematically tracking smartphone adoption, a majority of Americans now own a smartphone of some kind. The definition of a smartphone owner includes anyone who says “yes” to one – or both – of the following questions:

  • 55 percent of cell phone owners say that their phone is a smartphone.

  • 58 percent of cell phone owners say that their phone operates on a smartphone platform common to the US market.

Taken together, 61 percent of cell phone owners said yes to at least one of these questions and are classified as smartphone owners. Because 91 percent of the adult population now owns some kind of cell phone, this means that 56 percent of all American adults are now smartphone adopters. Roughly one-third (35 percent) have some other kind of cell phone that is not a smartphone, and the remaining 9 percent of Americans do not own a cell phone at all.
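
The percentages fit together with simple arithmetic: 61 percent of the 91 percent of adults who own a cell phone works out to roughly 56 percent of all adults. A minimal sketch of that calculation, using only the rounded figures quoted above, follows.

```python
# Reproduce the smartphone-adoption arithmetic from the rounded figures
# quoted above (small rounding differences are expected).

cell_owners = 0.91               # share of US adults owning any cell phone
smartphone_among_owners = 0.61   # share of cell owners classified as smartphone owners

smartphone_all_adults = cell_owners * smartphone_among_owners        # ~0.56
other_cell_all_adults = cell_owners * (1 - smartphone_among_owners)  # ~0.35
no_cell_all_adults = 1 - cell_owners                                 # 0.09

print(f"Smartphone owners, all adults: {smartphone_all_adults:.0%}")
print(f"Other cell phone, all adults:  {other_cell_all_adults:.0%}")
print(f"No cell phone, all adults:     {no_cell_all_adults:.0%}")
```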

The results in this report are based on data from telephone interviews conducted by Princeton Survey Research Associates International from April 17 to May 19, 2013, among a sample of 2,252 adults, age 18 and older. Telephone interviews were conducted in English and Spanish by landline (1,125) and cell phone (1,127, including 571 without a landline phone). For results based on the total sample, one can say with 95 percent confidence that the error attributable to sampling is plus or minus 2.3 percentage points. For results based on internet users (n=1,895), the margin of sampling error is plus or minus 2.5 percentage points.
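
For readers who want to see where a figure such as plus or minus 2.3 percentage points comes from, the sketch below applies the standard margin-of-error formula for a proportion at 95 percent confidence. It is an approximation only: the simple formula gives about 2.1 points for n = 2,252, and Pew's published margins are slightly larger because they also account for design effects from weighting, which this calculation omits.

```python
import math

# Textbook margin of sampling error for a proportion at 95% confidence.
# Published survey margins (such as the 2.3 points cited above) are usually
# a bit larger because they also include design effects from weighting.

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval, in percentage points."""
    return z * math.sqrt(p * (1 - p) / n) * 100

print(f"Total sample (n=2,252):   +/- {margin_of_error(2252):.1f} points")
print(f"Internet users (n=1,895): +/- {margin_of_error(1895):.1f} points")
```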

Smartphone report: http://pewinternet.org/Reports/2013/Smartphone-Ownership-2013.aspx

For the first time, a third (34 percent) of American adults ages 18 and older own a tablet computer like an iPad, Samsung Galaxy Tab, Google Nexus, or Kindle Fire – almost twice as many as the 18 percent who owned a tablet a year ago.

Demographic groups most likely to own tablets include:

  • those living in households earning at least $75,000 per year (56 percent), compared with lower income brackets;

  • adults ages 35-44 (49 percent), compared with younger and older adults; and

  • college graduates (49 percent), compared with adults with lower levels of education.

“One of the things that is especially interesting about tablet adoption compared to some of the patterns of other devices we’ve studied is how these technologies’ growth has played out between different age groups,” Research Analyst Kathryn Zickuhr said. “With smartphones, for instance, we’ve seen a very strong correlation with age where most younger adults own smartphones, regardless of income level. But when it comes to tablets, adults in their thirties and forties are now significantly more likely than any other age group to own this device.”

The findings in this report are based on data from telephone interviews conducted by Princeton Survey Research Associates International from April 17 to May 19, 2013, among a sample of 2,252 adults ages 18 and older. Telephone interviews were conducted in English and Spanish by landline and cell phone. For results based on the total sample, one can say with 95 percent confidence that the error attributable to sampling is plus or minus 2.3 percentage points. More information is available in the methods section at the end of this report.

Tablet ownership: www.pewinternet.org/Reports/2013/Tablet-Ownership-2013.aspx

Emerging technologies identified in NMC Horizon Report > 2013 Higher Ed Edition

The New Media Consortium (NMC) and EDUCAUSE Learning Initiative (ELI) have jointly released the NMC Horizon Report > 2013 Higher Education Edition. This tenth edition describes annual findings from the NMC Horizon Project, a decade-long research project designed to identify and describe emerging technologies likely to have an impact on learning, teaching, and creative inquiry in education. Six emerging technologies are identified across three adoption horizons over the next one to five years, as well as key trends and challenges expected to continue over the same period, giving campus leaders and practitioners a valuable guide for strategic technology planning.

This year’s NMC Horizon Report identifies massively open online courses (MOOCs) and tablet computing as technologies expected to enter mainstream use in the first horizon of one year or less. Games and gamification, together with learning analytics, are placed in the second horizon of two to three years; 3D printing and wearable technology are seen emerging in the third horizon of four to five years.

The subject matter in this report was identified through a qualitative research process, designed and conducted by the NMC, that engages an international body of experts in education, technology, business, and other fields. That process uses a set of research questions designed to surface significant trends and challenges and to identify emerging technologies with a strong likelihood of adoption in higher education. The NMC Horizon Report > 2013 Higher Education Edition details the areas in which these experts were in strong agreement.

“Campus leaders and practitioners across the world use the report as a springboard for discussion around significant trends and challenges,” says Larry Johnson, Chief Executive Officer of the NMC. “The biggest trend identified by the advisory this year reflects the increasing adoption of openness on and beyond campuses, be it in the form of open content or easy access to data. This transition is promising, but there is now a major need for content curation.”

“Identifying the key emerging technologies for learning is vital at a time when all institutions are forced to make very careful choices about investments in technology,” adds ELI Director Malcolm Brown. “With the sudden emergence of the MOOC [massively open online course], there’s unprecedented attention being given to the teaching and learning mission of higher education. The NMC Horizon Report goes beyond simply naming technologies; it offers examples of how they are being used, which serves to demonstrate their potential. The report also identifies the trends and challenges that will be key for learning across all three adoption horizons. This makes the Horizon Report essential for anyone planning the future of learning at their institution.”

Of the six technologies highlighted in the NMC Horizon Report > 2013 Higher Education Edition, three were also noted in the 2012 edition: tablet computing remains in the one-year-or-less horizon, while games and gamification and learning analytics remain in the two-to-three-year horizon. The 2013 edition refreshes these topics and explores new applications. 3D printing appeared in the very first NMC Horizon Report, released in 2004, but has reemerged this year due to the tremendous activity surrounding the Maker community, while MOOCs and wearable technology are brand new to the NMC Horizon Report series.

The NMC Horizon Report > 2013 Higher Education Edition is available online, free of charge, and is released under a Creative Commons license to facilitate its widespread use, easy duplication, and broad distribution.

Download the NMC Horizon Report > 2013 Higher Education Edition at:

www.nmc.org/publications/2013-horizon-report-higher-ed

NISO publishes recommended practice on improving OpenURLs through analytics

The National Information Standards Organization (NISO) has announced the publication of a new recommended practice, Improving OpenURLs Through Analytics (IOTA): Recommendations for Link Resolver Providers (NISO RP-21-2013). These recommendations are the result of a three-year study performed by the NISO IOTA Working Group in which millions of OpenURLs were analyzed and a Completeness Index was developed as a means of quantifying OpenURL quality. By applying this Completeness Index to their OpenURL data and following the recommendations, providers of link resolvers can monitor the quality of their OpenURLs and work with content providers to improve the provided metadata – ultimately resulting in a higher success rate for end-users. The project is summarized in a technical report, IOTA Working Group Summary of Activities and Outcomes (NISO TR-05-2013), which was published along with the recommended practice.

“OpenURLs are context-sensitive URLs widely used by publishers and libraries to allow end-users to connect to the full-text of e-resources discovered during a search,” explains Aron Wolf, Data Program Analyst with Serials Solutions and member of the IOTA Working Group. “To ensure that the user accesses the most appropriate copy of a resource (one that is preferably free to the user due to a subscription through the user’s library), the OpenURL link connects to a link resolver knowledgebase. The metadata embedded within the OpenURL is compared through the link resolver with what is held in or licensed through the library and the end-user is then presented with the available full-text access options. At a typical academic library, thousands of OpenURL requests are initiated by patrons each week. The problem is that too often these links do not work as expected because the metadata in the OpenURL is incorrect or incomplete, leaving users unable to access the resources they need.”

“Through our analysis, the IOTA Working Group found that there was a pattern to the failures in OpenURLs,” states Adam Chandler, Electronic Resources User Experience Librarian at Cornell University Library and Chair of the IOTA Working Group. “The Completeness Index was developed as a method of predicting the success of OpenURLs from a given provider by examining the data elements that provider includes in the OpenURLs from its site. This metric can serve as a tool to help determine which content providers are more likely to cause linking problems due to missing data elements in their OpenURLs and can identify exactly what the problems are. The Recommended Practice explains how to implement the measures so that problems can be clearly identified and steps taken with the content providers to improve the quality of the metadata.”
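
The precise definition of the Completeness Index is given in the Recommended Practice itself. Purely as a hypothetical illustration of the underlying idea – not IOTA's actual weighting or element list – the sketch below parses a sample OpenURL query string and scores it by how many of an assumed set of core article-level metadata elements carry values.

```python
# Hypothetical, simplified completeness check for OpenURL metadata.
# The real Completeness Index is specified in NISO RP-21-2013; this toy
# version simply counts how many assumed core elements are populated.

from urllib.parse import parse_qs

CORE_ELEMENTS = ["rft.jtitle", "rft.atitle", "rft.issn", "rft.date",
                 "rft.volume", "rft.issue", "rft.spage", "rft_id"]

def completeness(openurl_query: str) -> float:
    """Fraction of core elements present with a non-empty value."""
    fields = parse_qs(openurl_query)
    present = [k for k in CORE_ELEMENTS if fields.get(k, [""])[0].strip()]
    return len(present) / len(CORE_ELEMENTS)

sample = ("url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal"
          "&rft.jtitle=Library+Hi+Tech+News&rft.atitle=New+%26+Noteworthy"
          "&rft.issn=0741-9058&rft.date=2013&rft.volume=30&rft.issue=6")

print(f"Completeness score: {completeness(sample):.2f}")  # 0.75 - spage and rft_id missing
```

An OpenURL missing its start page and article identifier, as in the sample, scores lower than one carrying a full element set – the kind of signal the Working Group used to compare providers.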

“The IOTA Recommended Practice is a perfect complement to the NISO/UKSG KBART Recommended Practice (NISO RP-9-2010),” states Todd Carpenter, NISO’s Executive Director. “While KBART recommends how to improve the data within the link resolver knowledgebase, IOTA is focused on the metadata passed in the OpenURL itself. Together, these recommendations can ensure that OpenURLs will consistently provide the results that libraries, publishers, and end-users have come to expect from this technology.”

The IOTA Recommended Practice and Technical Report are both available for free download from the IOTA Working Group’s page on the NISO web site at: www.niso.org/workrooms/openurlquality

NISO to develop standards and recommended practices for altmetrics

NISO has announced a new two-phase project to study, propose, and develop community-based standards or recommended practices in the field of alternative metrics. Assessment of scholarship is a critical component of the research process, impacting everything from which projects get funded to who gains promotion and tenure to which publications gain prominence. Since Eugene Garfield’s pioneering work in the 1960s, much of the work on research assessment has been based upon citations – a valuable measure, but one that has failed to keep pace with online reader behavior, network interactions with content, social media, and online content management. Exemplified by innovative new platforms like ImpactStory, a new movement is growing to develop more robust alternative metrics – called altmetrics – that complement traditional citation metrics. NISO will first hold several in-person and virtual meetings to identify critical areas where altmetrics standards or recommended practices are needed and then convene a working group to develop consensus standards and/or recommended practices. The project is funded through a $207,500 grant from the Alfred P. Sloan Foundation.

“Citation analysis lacks ways to measure the newer and more prevalent ways that articles generate impact such as through social networking tools like Twitter, Facebook, or blogs,” explains Nettie Lagace, NISO’s Associate Director for Programs. “Additionally, new forms of scholarly outputs, such as datasets, software tools, algorithms, or molecular structures are now commonplace, but they are not easily – if at all – assessed by traditional citation metrics. These are two among the many concerns the growing movement around altmetrics is trying to address.”

“For altmetrics to move out of its current pilot and proof-of-concept phase, the community must begin coalescing around a suite of commonly understood definitions, calculations, and data sharing practices,” states Todd Carpenter, NISO Executive Director. “Organizations and researchers wanting to apply these metrics need to adequately understand them, ensure their consistent application and meaning across the community, and have methods for auditing their accuracy. We must agree on what gets measured, what the criteria are for assessing the quality of the measures, at what granularity these metrics are compiled and analyzed, how long a period the altmetrics should cover, the role of social media in altmetrics, the technical infrastructure necessary to exchange this data, and which new altmetrics will prove most valuable. The creation of altmetrics standards and best practices will facilitate the community trust in altmetrics, which will be a requirement for any broad-based acceptance, and will ensure that these altmetrics can be accurately compared and exchanged across publishers and platforms.”
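
A small, entirely hypothetical example helps show why shared definitions and calculations matter: the same raw attention events produce very different "scores" under two equally plausible rules, so numbers computed by different platforms cannot be meaningfully compared until the community agrees on one.

```python
# Hypothetical illustration: identical altmetric events, two plausible scoring
# rules, two very different results. The event counts and weights are invented.

events = {"tweets": 120, "facebook_posts": 15, "blog_mentions": 4,
          "mendeley_readers": 250, "news_stories": 2}

# Rule A: unweighted total of all events.
score_a = sum(events.values())

# Rule B: weight harder-to-earn forms of attention more heavily.
weights = {"tweets": 0.25, "facebook_posts": 0.25, "blog_mentions": 5,
           "mendeley_readers": 1, "news_stories": 8}
score_b = sum(count * weights[kind] for kind, count in events.items())

print(score_a)  # 391
print(score_b)  # 319.75 - same article, different definition, different number
```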

“Sensible, community-informed, discipline-sensitive standards and practices are essential if altmetrics are to play a serious role in the evaluation of research,” says Joshua M. Greenberg, Director of the Alfred P. Sloan Foundation’s Digital Information Technology Program. “With its long history of crafting just such standards, NISO is uniquely positioned to help take altmetrics to the next level.”

The first phase of the project will gather two groups of invited experts in altmetrics research, traditional publishing, bibliometrics, and faculty assessment for in-person discussions with the goal of identifying key altmetrics issues and those that can best be addressed through standards or recommended practices. This input will form the basis of two virtual meetings open to the public to further refine and prioritize the issues. Additional community input will be sought through an array of electronic and social mechanisms and events coordinated with major community conferences. A report summarizing this input will identify the specific areas where NISO should develop standards or recommended practices, which will be undertaken by a working group convened in phase two. The complete project from initial meetings to publication of standards is expected to take two years. Information about the meetings and other methods for participation will be announced on the NISO web site and in the monthly Newsline e-newsletter (www.niso.org/publications/newsline/).

NISO fosters the development and maintenance of standards that facilitate the creation, persistent management, and effective interchange of information so that it can be trusted for use in research and learning. To fulfill this mission, NISO engages libraries, publishers, information aggregators, and other organizations that support learning, research, and scholarship through the creation, organization, management, and curation of knowledge.

More information about NISO is available on its web site: www.niso.org

Data-Planet™ and EBSCO provide mutual customers with access to statistical data via EBSCO Discovery Service™

EBSCO and Data-Planet™ have reached an agreement that for the first time provides statistical DataSheets within a discovery service. A growing collection of more than 5,000 summary-level data records from Data-Planet will be available within EBSCO Discovery Service™ (EDS), allowing mutual customers to link directly to Data-Planet DataSheets.

The Data-Planet repository of statistical content currently holds more than 5,000 datasets presented in more than two billion views of data (maps, trends, tables, rankings). The datasets are sourced from reputable public and private organizations and cover 16 broad subject areas, including education, population and income, industry, commerce and trade, and housing and construction, among others. All of the data have been standardized and structured, and are described with 37 fields of metadata, including DOI, summary-level visualizations of the data, descriptions, titles, geographic entities, data-specific elements, and standardized citations. Users can access all the available views for each data series and download the data from each DataSheet.

Data-Planet President Matt Dunie says the agreement with EBSCO is a natural extension of Data-Planet’s mission. “Our mission is to make statistical data more findable and more usable and this effort is in sync with the overall objective of discovery.”

Data-Planet is part of a growing list of publishers and other content partners that are taking part in EDS to bring more visibility to their content. Partners include the world’s largest scholarly journal and book publishers – including Elsevier, Wiley Blackwell, Springer Science & Business Media, IEEE and thousands of others – as well as content providers such as LexisNexis, Thomson Reuters (Web of Science), JSTOR, HathiTrust and many others.

In addition to being accessible via EBSCO Discovery Service, the Data-Planet repository is searchable via two complementary interfaces: Data-Planet Statistical Datasets and Data-Planet Statistical Ready Reference.

Data-Planet is also available directly from the producer and technology provider at: http://data-planet.conquestsystems.com/statistical/ip

Kuali Open Library Environment (OLE) release 0.8 now available

A recent post on the Kuali OLE blog announcing Kuali OLE 0.8:

“Kuali OLE 0.8 is now available – meeting our target date of 3rd quarter 2013! This latest release is the first available for the partner libraries to download the software and test-load their own data locally. The release is also available for academic research libraries (and others) to test-drive at: http://demo.ole.kuali.org/portal.jsp

“Kuali OLE 0.8 adds the following features/functions, building on the infrastructure provided in previous releases”:

  • The deliver module incorporates basic circulation features/functions: defining policies; importing and creating patron records; performing checkout and checkin; placing and managing requests (recalls, holds, paging, and copying); and generating and delivering notices and bills.

  • The describe and manage module offers enhanced searching features, the ability to create/modify/delete records (bibliographic, holdings, and items), to transfer holdings and item records, bound-with functionality, and MARC record ingest. Also included in this release is the new Describe Workbench portal designed to provide improved workflow process support for cataloging staff.

  • One of the notable developments in the select and acquire module for 0.8 is workflow support for the license negotiation process (the License Request); see the OLE Guide to Licensing listed in the Driver’s Manual. Other developments worth noting are the integration of order information into the “OLE instance” record, which houses holdings and item information; improvements in receiving items, specifically multi-part orders; and an enhanced import process for EDI files. Major additions to acquisition functionality will be coming soon in the 1.0 release.

“The Kuali OLE Project Team is currently focusing its efforts on extending OLE functionality for two upcoming releases. Release 1.0 (target date Q4 2013) will be the early adopters release and is currently in development. Specifications are being written for release 1.5 (target date Q1 2014).”

“Kuali OLE is the result of collaboration of higher education research libraries including Indiana University, Duke University, Lehigh University, North Carolina State University, University of Chicago, University of Florida, University of Maryland, University of Michigan, University of Pennsylvania and Villanova University. Kuali OLE has also received generous support from The Andrew W. Mellon Foundation.”

Details of the new release are available in the OLE 0.8 User Documentation: https://wiki.kuali.org/display/OLE/OLE+0.8+Milestone+User+Documentation

New WorldCat Metadata API will enable OCLC members, partners to build applications

OCLC has launched the new WorldCat Metadata API that will enable member libraries and partners to build and share applications on the OCLC WorldShare Platform for libraries to catalog their collections in WorldCat.

The WorldCat Metadata API supports a variety of cataloging functionality for libraries working with their collections in WorldCat. With the new API, libraries will be able to create applications that add new WorldCat bibliographic records, enrich existing ones, and maintain WorldCat institution holdings and local bibliographic data.
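
As a rough sketch of the kind of application the announcement envisions – with the host, endpoint paths, authentication header and record format all placeholders rather than the documented interface, which is described on the OCLC Developer Network – a client might read a bibliographic record and register a holding along these lines.

```python
# Illustrative only: the base URL, paths, token handling and formats below are
# placeholders, not the real WorldCat Metadata API. See the OCLC Developer
# Network (http://oclc.org/developer/) for the actual documentation.

import requests

BASE = "https://example.org/worldcat-metadata"  # placeholder host
TOKEN = "YOUR_OAUTH_TOKEN"                      # obtained via OCLC's authentication service

def get_bib_record(oclc_number: str) -> str:
    """Fetch a bibliographic record for inspection or enrichment."""
    resp = requests.get(f"{BASE}/bib/{oclc_number}",
                        headers={"Authorization": f"Bearer {TOKEN}",
                                 "Accept": "application/marcxml+xml"})
    resp.raise_for_status()
    return resp.text

def set_holding(oclc_number: str) -> int:
    """Register the institution's holding on a record (write access)."""
    resp = requests.post(f"{BASE}/holdings/{oclc_number}",
                         headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return resp.status_code

if __name__ == "__main__":
    print(get_bib_record("123456789")[:200])
    print(set_holding("123456789"))
```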

Libraries can continue to catalog their collections in WorldCat using OCLC-built applications such as Connexion and the upcoming WorldShare Metadata Record Manager, or they can create new applications using the WorldCat Metadata API to manage their cataloging workflows.

“The information ecosystem defined by the Web demands access to data, services, applications and partnerships to thrive and meet the demands of today’s information seeker,” said Kathryn Harnish, Director, OCLC Network Experience. “Release of the WorldCat Metadata API will help library staff and partners create applications that can integrate with the services they choose, to help streamline their workflows and get the most from their services and service partners.”

The WorldCat Metadata API is the latest in a series of APIs released by OCLC on the OCLC WorldShare Platform, a global, interconnected web architecture that supports OCLC’s cloud-based services and applications, and provides flexible access to library data through APIs and other web services. Librarians and developers can use these tools to innovate together to build and share solutions for libraries that streamline and enhance library workflows.

The WorldCat Metadata API complements the WorldCat Search API. These web services provide read and write access to libraries’ bibliographic and holdings data. A number of library- and partner-built applications already use the WorldCat Search API and have been well-received by the OCLC membership.

Visit the OCLC Developer Network to learn how libraries can get started using the new WorldCat Metadata API.

WorldCat Metadata API: http://oclc.org/en-US/worldshare-metadata/worldcat-apis.html

OCLC Developer Network: http://oclc.org/developer/

ASIS&T announces management partnership with Dublin Core Metadata Initiative

The Dublin Core Metadata Initiative (DCMI), an internationally renowned organization advancing innovation in metadata design and best practices, will become a project of the Association for Information Science and Technology (ASIS&T) upon DCMI’s wrapping up activities at its current location in Singapore.

On June 30, 2013, DCMI Ltd will cease operations as a company limited by guarantee in Singapore and become a project of ASIS&T. This change for DCMI from independent, non-profit company status in Singapore to a partnership with ASIS&T marks a significant milestone in DCMI’s history. The decision was motivated by the desire of DCMI’s governing Oversight Committee to shape a more flexible and progressive institutional structure, while retaining its mission, goals and objectives and its commitments to an open, consensus-driven community.

Andrew Dillon, President of ASIS&T, stated that the partnering of ASIS&T with DCMI makes excellent sense for both parties. “There is a considerable overlap in participation and subject matter interest in both groups, and information science as a practice and as a discipline will be strengthened by the partnership,” said Dillon. “We look forward to regular cross fertilization through meeting sessions, workshops, webinars and other forums,” Dillon continued.

The change in DCMI institutional structure was guided by extensive investigations by DCMI’s Oversight Committee into alternative structures and potential partnerships that might better serve DCMI’s global community of metadata researchers and practitioners. Mutual and unanimous decisions were reached by the governing bodies of both DCMI and ASIS&T that DCMI become a project of ASIS&T effective July 1, 2013. Going forward, DCMI will maintain its autonomy as a global community through maintenance of the DCMI “brand”, its governance structure and its programmatic commitments and membership programs while actively engaging with ASIS&T as a partner in mutually beneficial activities.

Thus, DCMI and ASIS&T are committed in the coming months and years to seeking out joint activities where both organizations can leverage their strengths to the benefit of both communities. Both DCMI and ASIS&T bring to the new partnership their histories as strong, successful organizations serving research and practice communities.

ASIS&T home: www.asis.org/

DCMI home: http://dublincore.org/

SCOAP3 on track for launch of new Open Access publishing model in 2014

SCOAP3, the Sponsoring Consortium for Open Access Publishing in Particle Physics, is a consortium of High-Energy Physics funding agencies, High-Energy Physics laboratories, and leading national and international libraries and library consortia. Its aim is to facilitate Open Access publishing in High Energy Physics.

As announced on its web site in early June, SCOAP3 is on track to meet its target start date of January 1, 2014.

At a recent meeting in May, the international SCOAP3 partnership reaffirmed its commitment to SCOAP3. The initiative, with the full support of CERN, the European Organization for Nuclear Research, is now ready to move to its next step: signing contracts with publishers and establishing the consortium.

In recent weeks, SCOAP3 partners and publishers have been working hard to make the large-scale conversion of High-Energy Physics subscription journals to Open Access a reality. Reductions to current subscription costs have been calculated, and for many libraries and countries the process is nearing completion.

In a parallel development, the American Physical Society (APS) has restructured its Open Access strategy and, regrettably, Physical Review C and Physical Review D are no longer among the journals participating in SCOAP3.

In partnership with 11 publishers and learned societies, CERN and the SCOAP3 community of hundreds of libraries worldwide continue their ground-breaking journey. Together, this collaboration will provide Open Access to thousands of High-Energy Physics articles published in ten journals. This will include the first-of-its-kind “flip” to Open Access of four entire subscription journals.

SCOAP3 looks forward to welcoming new partners in the Americas, Europe and Asia.

More information: http://scoap3.org/

British Library and Portico will partner on e-journal deposit infrastructure

The British Library and digital preservation specialist Portico have announced that they will be working together to ensure that thousands of electronic journal titles will be collected, preserved and made available to current and future generations of researchers.

The partnership will help the British Library – along with five other legal deposit libraries – to meet regulations that recently became law in the UK and that extend the practice of legal deposit from traditional print publications to non-print publications such as e-journals, blogs and web sites in the UK web domain. The legal deposit libraries are the British Library; National Library of Scotland; National Library of Wales; Bodleian Libraries; Cambridge University Library; Trinity College Library Dublin.

Portico will utilize its established workflow and processes to create standardized and uniform journal content that can be exported to the British Library. The project has started with 1,500 journals from three publishers that are already preserving content with Portico. As necessary, Portico will develop new tools for processing additional publisher content.

The British Library and Portico began their work together through a pilot project in 2012. The two organizations developed their systems as part of the pilot and were thus able to “turn on” automatic delivery of content to the British Library as soon as legislation passed.

Alasdair Ball, Head of Collection Acquisition and Description at the British Library said, “We’re delighted to be working with Portico, using their expertise and technical infrastructure to help us meet the requirements of the new regulations. By working with a respected and well-established provider we’ll gradually enable publishers to deposit their electronic journal content easily and securely, thereby ensuring that it’s collected, preserved and made accessible for future generations of researchers.”

“I am excited that together we have identified a way to ensure that the library community is able to use the same set of tools they’ve invested in through Portico to meet national digital preservation needs,” stated Kate Wittenberg, Managing Director, Portico.

British Library web site: www.bl.uk

Portico: www.portico.org/digital-preservation/

Library of Congress selects inaugural class of National Digital Stewardship Residents

The Library of Congress, in partnership with the IMLS, has selected ten candidates for the inaugural class of the National Digital Stewardship Residency (NDSR) program. The nine-month program begins in September 2013.

The NDSR program offers recent master’s program graduates in specialized fields – library science, information science, museum studies, archival studies and related technology – the opportunity to gain valuable professional experience in digital preservation. Residents will attend an intensive two-week digital stewardship workshop this fall at the Library of Congress. They will then work on a specialized project at one of ten host institutions in the Washington, DC area, including the Library of Congress. These projects will allow them to acquire hands-on knowledge and skills regarding collection, selection, management, long-term preservation and accessibility of digital assets.

The residents listed below were selected by an expert committee of Library of Congress and IMLS staff, with commentary from each host institution.

2013 National Digital Stewardship Residents:

(Name; hometown; university; host institution; project description).

Julia Blasé; Tucson, Ariz.; University of Denver; National Security Archive; to take a snapshot of all archive activities that involve the capture, preservation and publication of digital assets.

Heidi Dowding; Roseville, Mich.; Wayne State University; Dumbarton Oaks Research Library and Collection; to identify an institutional solution for long-term digital asset management, conduct research on a variety of software systems and draft an institutional policy for the appraisal and selection of content destined for preservation.

Maureen Harlow; Clayville, NY; University of North Carolina, Chapel Hill; National Library of Medicine; to create a collection of web content on a specific theme or topic of interest such as medicine and art or the e-patient movement.

Jaime McCurry; Seaford, NY; Long Island University; Folger Shakespeare Library; to establish local routines and best practices for archiving and preserving the institution’s digital content.

Lee Nilsson; Eastpointe, Mich.; Eastern Washington University; Library of Congress, Office of Strategic Initiatives; to analyze the future risk of obsolescence to digital formats used at the library and work with library staff to develop an action plan to prevent the risks.

Margo Padilla; Oakland, Calif.; San Jose State University; Maryland Institute for Technology in the Humanities; to create and share a research report on access models and collection interfaces for born-digital literary materials. She will also submit recommendations for access policies for born-digital collections.

Emily Reynolds; Pleasantville, NY; University of Michigan; The World Bank Group; to facilitate and coordinate the eArchives digitization project, resulting in the creation of a digitized and cataloged historical collection of key archival materials representing more than 60 years of global development work.

Molly Schwartz; Dickerson, Md.; University of Maryland; Association of Research Libraries; to strengthen and expand a new initiative on digital accessibility in research libraries by incorporating a universal design approach to library collections and services.

Erica Titkemeyer; Cary, N.C.; New York University; Smithsonian Institution Archives; to identify the specialized digital and curatorial requirements of time-based media art and establish a benchmark of best practices to ensure that the institution’s archives will stand the test of time.

Lauren Work; Rochester, NY; University of Washington; Public Broadcasting Service; to develop and apply evaluation tools, define selection criteria and outline recommended workflows needed to execute a successful analog digitization initiative for the PBS moving image collection.

For more information about the National Digital Stewardship Residency program, visit: www.loc.gov/ndsr

Internship opportunities in digital technology are available in the Library’s Office of Strategic Initiatives: www.loc.gov/internosi
