Emerald Group Publishing Limited
Copyright © 2005, Emerald Group Publishing Limited
New & Noteworthy
Google to Digitize Collections of Major Research Libraries in USA and UK
As part of its effort to make offline information searchable online, Google Inc. has announced that it is working with the libraries of Harvard, Stanford, the University of Michigan, and the University of Oxford as well as The New York Public Library to digitally scan books from their collections so that users worldwide can search them in Google.
The project addresses several issues posed by the fact that the Internet now serves as a primary source of information inside and outside the academic community. Most of today's online content was "born digital" and often cannot be verified. By contrast, the library materials that will become available through Google originate from fully identified, authoritative sources, and cover every conceivable topic since the advent of printing. To ensure permanent preservation, digitized materials are saved in formats that can be supported by many different software programs on a variety of platforms. Files are stored redundantly on several servers to increase the likelihood of survival.
The announcement is an expansion of the Google Print program, which assists publishers in making books and other offline information searchable online. Google is now working with libraries to digitally scan books from their collections, and over time will integrate this content into the Google index to make it searchable for users world-wide. For publishers and authors, this expansion of the Google Print program will increase the visibility of in-print and out-of-print books, and generate book sales via "Buy this Book" links and advertising. For users, Google's library program will make it possible to search across library collections, including out-of-print books and titles that were previously available nowhere but on a library shelf.
Users searching with Google will see links in their search results page when there are books relevant to their query. Clicking on a title delivers a Google Print page where users can browse the full text of public domain works and brief excerpts and/or bibliographic data of copyrighted material. Library content will be displayed in keeping with copyright law.
More information and examples: http://print.google.com/googleprint/library.html
International Libraries and the Internet Archive to Build Open-Access Text Archives
A number of international libraries have committed to putting their digitized books in open-access archives, starting with one at the Internet Archive. The goal is to ensure permanent and public access to society's published heritage. Anyone with an Internet connection will have access to these collections and to a growing set of tools for making use of them. Though the effort is still early in its evolution, works in dozens of languages are already stored in the Internet Archive's Open-Access Text Archive, offering a breadth of materials to everyone. Over one million books have been committed to the Text Archive; more than 27,000 are currently available, and an additional 50,000 are expected in the first quarter of 2005. Advanced processing of these multilingual books will offer unprecedented access.
The Internet Archive hosts a Text Archive working with a number of other libraries and archives directly and indirectly. At this time, the libraries that have committed to hosting books on the Internet Archive include:
Carnegie Mellon University and the Million Book Project, USA (Raj Reddy).
University of Toronto, Canada (Carole Moore).
Library of Congress American Memory Project, USA (Deanna Marcum).
McMaster University, Canada (Graham Hill).
University of Ottawa, Canada (Leslie Weir).
Bibliotheca Alexandrina, Egypt (Noha Adly).
Indian Institute of Science, India (N. Balakrishnan).
International Institute of Information Technology, India (Dr Jawahar Lakshmi).
Zhejiang University, China (Professor Zhao).
European Archive, The Netherlands (Julien Masanès).
Internet Archive, San Francisco USA (Brewster Kahle).
All libraries and archives are encouraged to join this free association by contacting the Internet Archive at firstname.lastname@example.org
Internet Archive: www.archive.org/
Full text of the announcement: www.archive.org/iathreads/post-view.php?id=25361
IMLS Study to Re-examine Impact of Internet on Use of Libraries
The Institute of Museum and Library Services (IMLS) has awarded a National Leadership Grant for Libraries to the SUNY Buffalo School of Informatics to conduct a follow-up to a 2000 national baseline survey of public use of the internet and the public library. That major national study, conducted by the School of Informatics and the Urban Libraries Council, found five years ago that increased internet use in the USA had not produced a reduction in the public use of libraries.
The researchers now are poised to undertake a much larger national study to see what, if any, changes have taken place over the past five years. Data for the new study will come from a national random digit-dialing survey of 3,000 respondents throughout the USA, plus an in-house questionnaire survey of 10,000 users of libraries in five urban library systems. The original study, "The Impact of Internet Use on Public Library Use," came from a national random-sample telephone survey of 3,097 English- and Spanish-speaking adults.
In that study, the researchers found that the use of libraries and the internet appeared to be complementary.
The new study will be funded by a $266,881 grant from IMLS, which also funded the first study. George D'Elia, professor in the Department of Information and Library Studies in the UB School of Informatics and principal investigator on the first study, also will lead the new study. The first study, which received the 2003 Jesse H. Shera Award for Distinguished Public Research from the American Library Association's Library Research Round Table, was published in 2002 in the Journal of the American Society for Information Science and Technology.
"At that time, we found that 55 percent of the library users surveyed had internet access at home," D'Elia says, "so it was clear that use of the two information sources was not an either-or proposition. Internet users also use the library rather extensively. We expect to get the same results this time that internet use does not reduce library use. We'll see if there is a percentage change in any of the areas studied."
The first study reported that 75.2 percent of internet users also used the library, 60.3 percent of library users also used the internet, and 40 percent of the survey population used both. The researchers also found that library users and internet users were better educated and reported higher household incomes than nonusers; that a higher percentage of females than males reported using the library; and that a higher percentage of males than females reported using the internet.
Additional information: www.buffalo.edu/reporter/vol36/vol36n16/articles/LibraryStudy.html
Pew Internet Project Releases New Reports, New Database of Predictions
The Pew Internet and American Life Project continues to generate significant new reports that explore the impact of the internet on families, communities, work and home, daily life, education, health care, and civic and political life. The project aims to be an authoritative source on the evolution of the internet through collection of data and analysis of real-world developments as they affect the virtual world. The project releases 15-20 pieces of research a year, varying in size, scope, and ambition.
Recent reports include:
The State of Blogging.
Artists, Musicians and the Internet.
The Future of the Internet.
Search Engine Users.
A Decade of Adoption: How the Internet has Woven Itself into American Life.
Additionally, in conjunction with Elon University, the project has released a new database entitled "Imagining the Internet: Predictions Database" of over 4,000 predictions made about the internet and networked communications. In the fall of 2000, Elon University and the Pew Internet and American Life Project formed a partnership to build the Imagining the Internet Predictions Database, giving students and faculty the opportunity to do research about the internet and share their findings with a wider audience.
The first project, completed in February 2001, was called One Neighborhood, One Week on the Internet. It chronicled internet usage by 24 families during a one-week period. Journalism students wrote feature stories about each of the families, and also studied the data collected during the week in family diaries. Janna Quitney Anderson, assistant professor of communications and director of internet projects in the Elon School of Communications, directed the project. The second project, completed in spring 2002, was the pilot Internet Predictions project directed by Connie Ledoux Book, assistant professor of communications in the Elon School of Communications.
The initial predictions project set the stage for the third Elon-Pew project, the building of "Imagining the Internet," the online database, under Anderson's direction. Research to find and log more than 4,200 1990-1995 internet predictions from stakeholders and skeptics took place in 2003; the "Share Your Vision" and "Experts Survey" pieces were completed at the suggestion of Lee Rainie, director of the Pew Internet and American Life Project, in 2004, with the unveiling of the site in early 2005.
Pew Internet and American Life Project web site: www.pewinternet.org/
To sign up for e-mail alerts for new reports: www.pewinternet.org/signup.asp
Imagining the Internet: Predictions Database: www.elon.edu/predictions/
OCLC Research Sponsors Contest for Development of Creative Library Software
OCLC Research has announced a contest intended to give developers an outlet for creative software development of library services. Contest entries will be accepted through midnight (ET) May 15, 2005.
The contest is intended to encourage innovation and development of web-based services for libraries and library users. Contestants will be challenged to think differently about their environments by working with deconstructed functional components of library services. The contestants' mission is to write a program that does something interesting and innovative with the WorldCat data using at least one of the OCLC-provided services. Contestants must also submit a working prototype. The contestants are challenged to convince the judges of why their program is interesting and why it will help libraries and/or library users; other than that, contestants are free to implement whatever strikes their fancy. The prize is US$2,500.
Entries will be judged by a panel of expert practitioners and academicians from OCLC and the library/information community: Thom Hickey (Chief Scientist, OCLC), Tip House (Chief Systems Analyst, OCLC), Elizabeth Lane Lawley (Associate Professor and Director, Lab for Social Computing, Rochester Institute of Technology; blogger), Mike Teets (Executive Director, Product Architecture and Development, OCLC), Roy Tennant (User Services Architect, California Digital Library; owner, Web4Lib and XML4Lib electronic discussions; author), and Jon Udell (Lead Analyst, InfoWorld; author; blogger).
Full details available on the contest web site: www.oclc.org/research/researchworks/contest/
JPEG 2000 in Archives and Libraries
For many years, libraries and archives have used the JPEG and TIFF coding standards to store and make available images in electronic format. Decades of research in image compression, a subfield of signal processing, have yielded advances through the use of wavelet transformation, and some institutions have adopted products based on proprietary wavelet compression implementations such as MrSID.
In the 1990s, under the auspices of the International Organization for Standardization and the standards section of the International Telecommunication Union, the Joint Photographic Experts Group worked to create a new imaging standard using wavelet compression. The committee's work culminated in December 2000 with the ratification of Part 1 of the JPEG 2000 standard.
As JPEG 2000 is embraced by specialized vertical markets (such as medical imaging and national defense intelligence gathering) and appears in the consumer digital camera and scanner markets, it has the potential to revolutionize common practices in libraries and archives. In addition to achieving greater magnitudes of compression with reduced or no loss of image data, JPEG 2000 was designed to embed the technical and descriptive metadata associated with images that has become crucial to long-term usability of the image file as a digital artifact.
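The compression gains described above come from the wavelet transform's separation of a signal into coarse averages and fine details, so that smooth regions compress well. As a minimal illustration only (JPEG 2000 itself uses the CDF 5/3 and 9/7 filter banks plus quantization and entropy coding, not the filter below), here is one level of the much simpler Haar transform applied to a 1-D signal:

```python
def haar_forward(signal):
    """One level of the 1-D Haar wavelet transform: pairwise
    averages (the low-pass band) and differences (the high-pass band)."""
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]   # coarse approximation
    detail = [(a - b) / 2 for a, b in pairs]   # fine detail; near zero in smooth regions
    return approx, detail

def haar_inverse(approx, detail):
    """Exactly reconstruct the original signal from the two bands."""
    out = []
    for s, d in zip(approx, detail):
        out.extend([s + d, s - d])
    return out
```

In smooth regions the detail coefficients cluster near zero, which is exactly what the later quantization and entropy-coding stages of a wavelet coder exploit.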
With funding from the Gladys Krieble Delmas Foundation and the Connecticut State Library, the University of Connecticut convened the Symposium on the Adoption of JPEG 2000 by Archives and Libraries on November 4-5, 2004, to begin the process of understanding, coordinating, and accelerating implementation of the standard by providing a forum for delegates to outline the efforts required to achieve wide-scale adoption.
Out of the symposium came several requests for resources that would aid the community in making decisions about JPEG 2000 practice and provide a forum for sharing information about JPEG 2000 in archives and libraries. The lead request was for a web site where users could register and post their own information on articles, projects, and products that use the standard. That web site is now available at http://j2kArcLib.info/ and contains information such as:
Report on the Symposium: http://j2karclib.info/node/42
Project Briefing at the CNI Fall Task Force Meeting: http://j2karclib.info/node/30
Resources by Focus: http://j2karclib.info/resources/by_focus
Resources by Type: http://j2karclib.info/resources/by_type
The skeleton is in place and waiting for information to be added. Symposium participants will add information as they come across it; feel free to create an account on the site to add your own.
A mailing list has been created where subscribers can discuss the application of JPEG 2000 in archives, libraries, and related fields. One can subscribe through the web (http://listserv.uconn.edu/cgi-bin/wa?SUBED1=j2karclib-l) or by sending an e-mail to email@example.com with "subscribe j2kArcLib-L [your name]" as the first line of the message. Archives of the list will be available at GMANE.ORG shortly.
NMI-R6 Released for NSF Middleware Initiative
The sixth release of the National Science Foundation Middleware Initiative (NMI) - NMI-R6 - offers several new components relating to end-user authorization services. Architected to integrate with academic and research software and infrastructures such as Grids, the release is available to the public under open-source licenses at www.nsf-middleware.org. New and updated components in NMI-R6 include standards-based intra- and inter-institutional authentication and authorization components, frameworks, and related directory schemata. Designed to address specific challenges in research and education security infrastructures, the release includes a rich variety of tools that enable researchers to work more efficiently.
The NMI-R6 release features NMI-EDIT's Signet, a set of privilege management tools based on the Stanford Authority System, in use since 2001. Signet allows the delegated management of resource access privilege information that can then be provisioned to applications or directories. Also included in the release is Grouper, a set of group management tools that integrates with Signet or can be used independently.
The GRIDS Center has announced a new resource to help make sense of the complicated and ever-growing landscape of Grid technologies. The new Grid Ecosystem provides a mix of high-level discussion of architecture and main subsystems, and drills down into technical detail and reference pointers for individual software packages. Also new from the GRIDS team is a pilot community collaboration server.
For the full release, refer to: http://collab.nsf-middleware.org/Lists/NMIR6/announce.aspx
Virtual Data Center Releases OSS Digital Library System Software
The VDC Development Team has announced the official 1.02 Release of the VDC. The Virtual Data Center (VDC) is an OSS digital library system "in a box" for numeric data.
VDC provides a complete open-source, digital library system for the management, dissemination, exchange, and citation of virtual collections of quantitative data. The VDC functionality provides everything necessary to maintain and disseminate an individual collection of research studies: including facilities for the storage, archiving, cataloging, translation, and dissemination of each collection. Online analysis is provided, powered by the R Statistical environment. The system provides extensive support for distributed and federated collections including: location-independent naming of objects, distributed authentication and access control, federated metadata harvesting, remote repository caching, and distributed virtual collections of remote objects.
The 1.02 release:
Adds support for Fedora Core 2, operation behind an HTTP proxy, and numerous small enhancements.
Provides RPMs for RedHat 9.0, Fedora Core 1, Fedora Core 2, and RedHat Advanced Server 3.
Provides all core features and contains no known bugs. Supported standards, protocols, and formats include: DDI, Dublin Core, and MARC for metadata; R, SPSS, SAS, ASCII, and STATA for data; OAI and Z39.50 for queries; UNFs and Handles for naming/citation.
Features planned for the next point release include usability enhancements and a fully integrated user manual and help system.
Virtual Data Center homepage: http://thedata.org/index.php/Main/HomePage
SDL Search Engine Covers Open Access Resources in Library and Information Science
The Documentation Research and Training Center (DRTC) of the Indian Statistical Institute has launched Search for Digital Libraries (SDL), an OAI-compliant harvester. SDL presently harvests digital repositories and open-access journals in library and information science. As of January 2005, SDL provided access to over 2,700 full-text articles in English and other languages. DRTC, part of the Indian Statistical Institute (ISI), is a premier research institution, founded by Professor S.R. Ranganathan.
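An OAI-compliant harvester like SDL issues OAI-PMH requests (for example, ListRecords) and parses the XML responses into searchable metadata. The sketch below parses a minimal, hypothetical ListRecords response carrying Dublin Core records; the sample record content is invented for illustration, but the namespace URIs are the standard OAI-PMH and Dublin Core ones:

```python
import xml.etree.ElementTree as ET

# Standard OAI-PMH and Dublin Core namespace URIs
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

# Hypothetical ListRecords response with one oai_dc record
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example:1</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Facet analysis revisited</dc:title>
          <dc:creator>A. Librarian</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def parse_list_records(xml_text):
    """Extract (identifier, title) pairs from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.iter(OAI + "record"):
        ident = rec.find(OAI + "header/" + OAI + "identifier").text
        title = rec.find(".//" + DC + "title").text
        records.append((ident, title))
    return records
```

A real harvester would fetch this XML over HTTP from the repository's base URL and follow resumptionToken elements to page through large result sets.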
AquaBrowser Library Search Engine Offers Interactive, Intelligent Searching
AquaBrowser Library® is a revolutionary library catalog search tool developed by MediaLab Solutions, The Netherlands. It is designed specifically to enhance the experience of any library catalog searcher. AquaBrowser Library widens the scope of the catalog search by incorporating three distinct design principles - known as search, discover, and refine - into real usable tools.
AquaBrowser Library allows catalog users to search for information using a typical query box. The results are portrayed in a typical browser search list by relevance to the query terms given by the user. Clicking items in the list takes the user to the exact sources, both inside and outside the library.
On the left side of the results screen, the user sees a moving, interactive cloud of words based on the query. These words are suggestions that help the user discover new information and refine the query. The suggestions consist of translations (into other languages), spelling variations (to correct for spelling mistakes or words that can be spelt in more than one way), and associations (free "brainstorming" associations). Clicking any of these terms yields a new search combining that term with the original one, with results ranked by relevance to the combination. Terms the user has clicked appear in a different color, forming a trail of suggestions in the word cloud, so the original term is never lost.
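MediaLab's algorithms for generating the word cloud are proprietary, but the spelling-variation suggestions can be sketched generically with an edit-distance measure. The vocabulary and threshold below are hypothetical; this is one plausible mechanism, not AquaBrowser's actual implementation:

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def spelling_variants(query, vocabulary, max_dist=1):
    """Suggest vocabulary terms within max_dist edits of the query
    (excluding the query itself), sorted for stable display."""
    return sorted(w for w in vocabulary if 0 < edit_distance(query, w) <= max_dist)
```

Variants within one edit of the query surface near-misses such as alternative spellings of the same word.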
On the right side of the results screen are lists of options for refining the result set, such as a specific subject, author, category, or language. The refine section also organizes resources by type, allowing the user to distinguish between books, magazines, newspapers, web pages, etc. In this way, the user can find not only the exact information sought, but also the exact kind of resource it should come from, depending on the user's needs.
Jimmy Thomas, Director of Strategic Products at TLC, provided an extensive product spotlight presentation of the AquaBrowser Library to librarians attending the American Library Association Midwinter Meeting in Boston in January. AquaBrowser Library is internationally available; the National Library of Singapore has implemented it nation-wide. AquaBrowser Library integrates with many integrated library systems and is currently in use in over 40 percent of public libraries in The Netherlands.
AquaBrowser demonstration: http://tlc.medialab.nl/
US Copyright Clearance Center and III Partner to Streamline Copyright Compliance for Electronic Reserves
Copyright Clearance Center and Innovative Interfaces have announced an integration partnership to make copyright compliance for electronic reserves faster and easier. By providing direct access to Copyright Clearance Center's rights database from Innovative's electronic course reserves module, the partnership will let librarians and staff fulfill e-reserve requests for copyright permission more efficiently.
Previously, librarians and staff members interested in posting electronic reserves needed to leave their library system and log into Copyright Clearance Center's web site separately. With the new integration of services, when a user selects a title in Innovative's electronic course reserves, they can click the "get permission" link and be taken to the appropriate permission service and search results page on copyright.com. User and bibliographic information can be automatically transferred to place a copyright permission order, eliminating the need to log in, re-key information or conduct another title search.
The integration was accomplished through Copyright Clearance Center's Copyright Integration Services.
Copyright Clearance Center: http://copyright.com/
Innovative Interfaces: www.iii.com/
WebFeat Granted Patent for Federated Search Technology in the USA
WebFeat, a federated search engine used by over 1,500 leading academic, public, and special libraries, has been granted a patent for its federated search technology by the US Patent and Trademark Office. Granted in the fourth quarter of 2004, US patent No. 6,807,539 covers WebFeat's method and technology for managing the authentication and session management necessary to perform a federated search across licensed resources.
WebFeat's web site: www.webfeat.org
RLG Releases New Tools for Public Use
Descriptive Metadata Guidelines for Cultural Materials
RLG has announced that its new guide to descriptive metadata for unique cultural objects is available online. Written by an expert working group and vetted by a wide community, the Descriptive Metadata Guidelines for RLG Cultural Materials demystifies the complex concepts and acronyms in the field of descriptive metadata.
Not just for RLG Cultural Materials contributors, the overview is useful for anyone across the spectrum of museums, libraries, and archives who is interested in fast-tracking collection items for online access. The guidelines give practical advice to implementers, but also provide context for the decision-maker. They can be used to create or review local best practice in describing collections and their objects - regardless of the specific metadata standards in use.
The new guidelines can be freely accessed at: www.rlg.org/en/page.php?Page_ID=214. This document is a complete revision of the initial guidelines, last updated in 2003. The guide draws upon RLG's experience in aggregating these one-of-a-kind objects for the RLG Cultural Materials database and its spin-off, Trove.net (tm).
EAD Report Card
RLG has also announced a new application for the archival community. RLG's EAD Report Card - the first web application that automatically checks the quality of your EAD encoding - is freely available. Created by popular demand, the tool supplements RLG's Best Practice Guidelines for Encoded Archival Description. Once finding aids have been uploaded, the program will flag any discrepancies. It also cites the relevant part of the guidelines, so that the operator can fix the problem in real time. The EAD Report Card can be accessed here: www.rlg.org/en/page.php?Page_ID=20513
The EAD Best Practice Guidelines are also available: http://www.rlg.org/en/page.php?Page_ID=450
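Automated checking of EAD encoding, as in the Report Card, starts from XML well-formedness and the presence of required top-level elements (<eadheader> and <archdesc> in EAD 2002). The function below is a deliberately rudimentary sketch of that idea, not a reimplementation of RLG's tool, which checks conformance to the full Best Practice Guidelines:

```python
import xml.etree.ElementTree as ET

def check_finding_aid(xml_text, required=("eadheader", "archdesc")):
    """Rudimentary quality check on an EAD finding aid: verify
    well-formedness, then the presence of required child elements."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return ["not well-formed: %s" % exc]
    problems = []
    for tag in required:
        if root.find(tag) is None:
            problems.append("missing <%s> element" % tag)
    return problems
```

A production checker would validate against the EAD DTD or schema and cite the relevant guideline for each flagged discrepancy, as the Report Card does.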
RLG welcomes feedback on these products.
WSDCMBP: New Version of Western States Metadata Best Practices Available
The Western States Digital Standards Group - Metadata Working Group has announced the availability of version 2.0 of the Western States Dublin Core Metadata Best Practices. The majority of the changes made to the WSDCMBP element definitions further clarify their meaning and provide additional examples appropriate for cultural heritage collections. The changes in Version 2.0 include:
"Format.Use" has become "Format" to align use of this field with the Dublin Core Metadata Initiative element set.
"Format.Creation" has become "Digitization Specifications" to reflect that it includes additional preservation information beyond the format of the master digital object.
"Holding Institution" has become "Contributing Institution" in order to accommodate multiple roles that digitization project partners may play. For example, one institution may physically hold the original resource, another may perform the digital imaging, and another may create metadata. Each of these factors contributes to establishing the provenance of the available resource and associated metadata.
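For projects with existing records, the renames above amount to a simple field mapping from version 1.x element names to version 2.0 names. A sketch (the record structure is hypothetical; the rename table comes from the changes listed above):

```python
# Rename table derived from the version 2.0 change list
V1_TO_V2 = {
    "Format.Use": "Format",
    "Format.Creation": "Digitization Specifications",
    "Holding Institution": "Contributing Institution",
}

def migrate_record(record):
    """Return a copy of a v1.x metadata record using v2.0 element names;
    fields not in the rename table pass through unchanged."""
    return {V1_TO_V2.get(field, field): value for field, value in record.items()}
```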
The Metadata Working Group welcomes comments from projects using the WSDCMBP for inclusion in future revisions.
Open eBook Forum Announces Top Selling eBooks of 2004
The Open eBook Forum has released its annual list of Top Selling eBooks for 2004. Dan Brown dominated the 2004 list with The Da Vinci Code topping the list and Brown's Angels & Demons and Deception Point rounding out the top three. At the same time, the industry reported continued growth, with eBook revenues for the third quarter of 2004 up 25 percent and eBook units sold up 11 percent over the same quarter in 2003. The bestseller list and quarterly eBook Statistics Report were both released by the Open eBook Forum (OeBF), the industry trade and standards organization.
The OeBF 2004 eBook Bestseller List includes the bestselling eBooks reported from the leading eBook retailers and distributors including eBooks.com, eReader.com, Fictionwise.com, Mobipocket.com and OverDrive. The top ten bestselling eBooks were:
The Da Vinci Code by Dan Brown (Doubleday - $14.95).
Angels & Demons by Dan Brown (PocketBooks - $6.99).
Deception Point by Dan Brown (PocketBooks - $6.99).
Digital Fortress by Dan Brown (St Martin's Press - $5.99).
Darwin's Radio by Greg Bear (Del Rey - $6.99).
Holy Bible, New International Version - International Bible Society (Zondervan - $14.99).
I, Robot by Isaac Asimov (Spectra - $4.99).
Electronic Pocket Oxford English Dictionary and Thesaurus Value Pack (Oxford University Press - $19.95).
Darwin's Children by Greg Bear (Del Rey - $6.99).
Merriam-Webster's Collegiate® Dictionary (Merriam-Webster - $25.95).
Scott Adams, the creator of the popular Dilbert cartoon strip and books, stated, "eBooks have been a substantial portion of my total book sales. I've reached a lot of readers who don't like the higher cost of hardcover books." Mr Adams' self-published eBook version of God's Debris is No. 19 on the bestseller list.
A significant percentage of the eBook bestsellers for 2004 also appear on year-end print bestseller lists, indicating that eBook purchasing follows the diversity of mainstream reading habits. The full list of 30 bestsellers for 2004 can be found on the OeBF web site at www.openebook.org/bestseller/year04.htm
The OeBF also released details of its Q3 eBook Statistics Report. Compiled from data submitted by 23 of the world's leading eBook publishers and retailers, the findings revealed:
Revenues: retailers logged $3,227,972 in eBook sales in Q3 2004, a 25 percent increase over the same period in 2003, when retailers reported $2,591,469 in sales.
Unit sales: a total of 419,962 eBooks were sold in Q3 2004, an 11 percent increase over the same period in 2003, when 377,095 units were sold.
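The reported growth figures can be reproduced directly from the raw numbers (rounded to the nearest whole percent):

```python
def pct_growth(old, new):
    """Percent change from old to new, rounded to the nearest whole percent."""
    return round(100 * (new - old) / old)

# Revenues: $2,591,469 (Q3 2003) -> $3,227,972 (Q3 2004)
# Unit sales: 377,095 (Q3 2003) -> 419,962 (Q3 2004)
```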
The Open eBook Forum is an international non-profit trade and standards organization for the electronic publishing industry.