Emerald Group Publishing Limited
Copyright © 2000, MCB UP Limited
NISO/NFAIS Workshop Explores Best Practices in Ejournal Publishing
The National Information Standards Organization (NISO) is an ANSI (American National Standards Institute)-accredited standards developer in the area of information standards. Membership is open to all interested parties and currently includes primary and secondary publishers, libraries, library system vendors, government agencies, associations, software developers and hardware manufacturers.
The idea of improving consistency in the identification and presentation of electronic journals has been a matter of increasing interest within the NISO constituency. On February 20, 2000, NISO and the National Federation of Abstracting and Indexing Societies (NFAIS) co-sponsored an invitational meeting on best practices for electronic journals as a preconference workshop to the NFAIS Annual Conference 2000 in Philadelphia, Pennsylvania. The goal was to explore areas in which the development of best practices, or guidelines, or formal standards might be possible and productive. The meeting was attended by commercial publishers, scholarly societies, secondary publishers, librarians, and vendors serving these communities.
The workshop began with an introduction and an overview presentation to set the scene for the rest of the meeting. There followed presentations on the challenges of electronic publishing from four points of view: primary publishing, non-profit/scholarly society publishing, secondary publishing, and the library community. Attendees then broke into four discussion groups on general areas of interest (editorial standards, archiving, linking, and technical standards) and regathered to summarize their conclusions and recommendations.
Pat Harris, the Executive Director of NISO, welcomed meeting attendees on behalf of NISO and NFAIS, noted the diversity of interests represented, and posed the main question: are we all dealing with common problems to which we can work out some common solutions? Until now, the need for more consistency in electronic publishing practices has had to be weighed against the feeling that more time to experiment and develop best practices was required. It now appears that the time might be right to consider standardizing some aspects of ejournal publishing. Pat noted that the very first NISO standard, Z39.1, addressed the format and arrangement of printed periodicals, and expressed optimism that best practices could also be established in the electronic arena to everyone's benefit.
Emily Fayen of RoweCom, Inc. presented an overview of the major stakeholders and issues related to electronic journal publishing.
Interested parties include publishers, authors, scholarly societies, researchers and other users, abstracting and indexing (A&I) services, aggregators, libraries and library consortia. Major issues affecting them all include:
New publishing practices - for example, articles may be published within issues or they may not, or they may be unbundled initially and later bundled into issues, either online or in print.
Keeping up with new content - not only different physical formats of materials, but emendations over time such as annotations and errata.
Quality assurance - how do we know we have everything, or that the electronic version matches print?
Globalization - how to ensure consistency of cataloging and indexing, when there exist multiple editions of journals (e.g. US and European editions).
Changing economics - how do we identify and recover costs?
Emergence of new services - capabilities such as reference linking, which were previously impossible, quickly become user requirements in the electronic environment.
Rights management and archiving - universal concerns, affecting all stakeholder groups.
While there is substantial overlap in interest among the stakeholder groups, certain concerns emerge to the forefront for each:
Publishers are concerned with publishing costs, changing readership, changing user expectations, rights management and archiving.
Authors and scholarly societies are now dealing with self-publishing, new models for scholarly publishing, quality assurance, rights management and archiving.
Researchers want simple access to complex information space, including easy access to full text and reference linking.
A&I services are trying to adjust to the impact of electronic publishing on their print-oriented practices, and to implement reference linking and other value-added services.
Aggregators, a relatively new industry, have the problem of managing content from multiple sources, providing reference linking not just within their own service but to other content providers, ensuring completeness, and rights management and archiving.
Libraries have the challenges of keeping up with the flood of new content and new options, providing their users with easy access to information wherever it may happen to reside, rights management and archiving.
Consortia have become very influential through cooperative purchasing and negotiation of licenses; they are concerned with assured access, and, of course, rights management and archiving.
It is clear that many different groups need to work together. While each of these groups has its own priorities, needs and goals, it is also clear that many common interests and interdependencies exist. Meeting attendees, as representatives of these groups, should consider how NISO can help, and what short-term and longer-term actions and initiatives they might recommend as steps in addressing this set of issues.
The Commercial Primary Publisher
Howard Ratner from Springer-Verlag presented the issues from the point of view of a commercial primary publisher in the area of science, technology and medicine (STM).
According to Ratner, epublishing has allowed Springer-Verlag to increase page count and frequency. Last year Springer introduced its "Online First" service, in which articles appear online before they are later issued in print. This differs from a preprint service, as preprints are traditionally issued before peer review; Online First articles are peer reviewed and include final edits. One big issue was how authors should cite articles before volume, issue and page number were assigned. This was addressed by putting a visible digital object identifier (DOI) on every article, which is retained even after print publication.
The "published" content of journals published both electronically and in print is an issue. PDF versions duplicate the print version, but HTML versions enhance it, containing links to author information, reference links, cited by links, and perhaps links to other supplementary materials. This raises the question of what the boundaries of an article are for archiving purposes.
Some Springer print journals have "online only" sections. In the case of a journal with research articles and case reports, the case reports were published in the online version only. However, this caused A&I services to fail to index them, so now the abstracts of case reports are included in the print version. "Online only" sections of print journals may also include editorials, reviews, discussions and product listings. Again, these raise the question, must this non-research content be archived? How will users know which content is included in which version?
A question related to what to maintain online only is which print content to duplicate online. Originally, only research content was included in LINK, Springer's online full-text electronic journal service, but this has been expanded to "cover-to-cover" coverage. Both of these examples point out the dependencies between primary and secondary publishers.
Online content is more dynamic than print. Publications can be updated piecemeal, not only chapter-by-chapter, but actually chunk-by-chunk, somewhat akin to a loose-leaf publication. This not only complicates archiving but also raises the problem of identifying a particular version for citation purposes.
Various options exist for archiving, including print, issuing an annual CD-ROM, creating a static online archive such as provided by JSTOR, and relying on services such as provided by OCLC or PubMed Central. Publisher commitments vary and in general are not trusted by librarians.
Linking to related data in all forms has become centrally important. This includes not only reference linking to cited articles, but links to databases of scientific data, author Web pages and biographies, patent information, and product information. Everybody wants to link to everybody, but this cannot realistically be done with bilateral contracts, leading to the development of central facilities such as CrossRef (a multi-publisher reference linking service).
Ratner closed by urging the audience to keep learning, keep listening to both customers and competitors, keep evolving, and keep cooperating.
Several issues came up in the question and answer period that followed. One key issue that arose in several different contexts throughout the workshop was the inconvenience of needing different citations to the same article for print and electronic versions. Standard citations to print include page numbers, which often are not available for HTML and other electronic versions, particularly when the online version is issued first. The de facto solution is to use a constant identifier such as the DOI on both print and electronic versions.
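What makes the DOI usable this way is its simple, format-independent syntax: a registrant prefix beginning with "10." and a publisher-chosen suffix, separated by a slash. A minimal sketch of splitting one into its parts (the function name is illustrative, not part of any standard API):

```python
def parse_doi(doi: str):
    """Split a DOI into its registrant prefix and suffix.

    DOIs take the form '10.<registrant>/<suffix>', e.g. '10.1000/182'.
    """
    prefix, sep, suffix = doi.partition("/")
    if not sep or not prefix.startswith("10.") or not suffix:
        raise ValueError(f"not a well-formed DOI: {doi!r}")
    return prefix, suffix

print(parse_doi("10.1000/182"))  # ('10.1000', '182')
```

Because the suffix is opaque and never changes, the same string can be printed on the paper version and used as an actionable link in the electronic one.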
A library school professor pointed out that students often don't have administrative privileges on shared machines in libraries and labs, so they cannot download software and plugins required to access electronic content. Technical administrators try to provide common applications but can't keep up with the variety of tools and formats used in epublishing today. Ratner agreed it would be helpful if publishers could provide a list of key helper applications for the subject areas they covered, but warned this would be difficult, as often there are competing products to perform the same function, and publishers cannot fully control what authors require.
An issue with some resonance among the publishers concerned publication of errata in the online environment. Because it is imperative that the content be frozen and unchangeable at the time an article is published, it is not uncommon to see an article on the Web linking to its own errata. The absurdity of this is not lost on users. Perhaps the inconvenience can be ameliorated by technical devices such as errata windows that pop up inline.
The Non-Profit Society Publisher
John Ewing, of the American Mathematical Society (AMS), gave the point of view of a scholarly society publisher. AMS has about 30,000 members (one-third international) and 230 employees. It focuses on research rather than education. It publishes more than 100 book titles per year, and publishes Math Reviews, the online version of which is known as MathSciNet. This database indexes and reviews the mathematical literature. It contains 1.5 million reviews from 1940 onward and adds about 65,000 new items every year.
AMS itself is a relatively small publisher, with four journals published in both paper and electronic versions, three electronic-only journals, and two member journals. A tradition in math is that subscriptions are almost exclusively institutional. Interestingly, while AMS has always viewed itself as being in the forefront of electronic publishing, the mathematics research community is by and large characterized by valuing print and the historical record more than immediacy.
AMS invested heavily in early electronic services such as gophers and listservs, work which was rendered superfluous by the rise of the Web. Although we like to think we can predict the future more accurately today, this remains to be seen. It also invested a huge amount in development of TeX and mathematical fonts, which it put in the public domain.
In 1994 AMS leadership articulated a vision that included putting existing print journals and Math Reviews online immediately, and developing a line of approximately 40 electronic-only journals. In reality, online versions of the print journals were developed over the next couple of years, as well as MathSciNet, which was an instant success. However, only three electronic-only titles were published and they attracted so few subscriptions they are now given away free as a benefit of membership. In all, less than one-half of one percent of the mathematics literature appears in online-only journals.
Most articles are contributed in LaTeX or some variation such as plain TeX or AMSTeX; everything is converted to a LaTeX structure. In the 1990s, the electronic versions were made available in a variety of formats, including TeX, dvi, PDF, postscript and HTML. HTML was particularly difficult, as it could not represent mathematical fonts and all math had to be converted to little embedded gif files for display. AMS found from analyzing statistics that few people were reading papers online, and the main requirement was a format that printed well. As a result, since January 2000, HTML has been discontinued and linked PDF is being emphasized. However, all papers have associated HTML files containing selected information such as the abstract and references.
AMS also experimented with "online first", encountering the problem of how to cite articles before issues and page numbers are assigned. Currently they are posted with publisher item identifiers (PIIs) for use in pre-pagination citations, but AMS is moving toward use of the DOI. Timeliness, however, is not as important in mathematics as in other fields, and the confusion and problems associated with online-first publication are not worth it.
References in published articles don't link to the full text of the cited paper directly, but to the record for the cited article in Math Reviews. The entry in Math Reviews in turn links to the full text of the article. This is particularly helpful when the cited paper is not yet available online - readers can still get some information by following the link, and when the online version appears the link will already be there and no retrospective updating will be required. However, this only works for papers in Math Reviews, and it is a lot of work for the AMS.
AMS believes ownership and access to electronic versions should be the same as to the print. The subscription process works well, not because the literature is needed now, but because this guarantees it will be available when it is needed. If subscriptions do not guarantee backfile access, they will drop off. Backfile access and archiving are of particular importance to the math research community. AMS allocates one percent of the subscription price to an archiving fund. The problem is not storing the data but converting from one format to another, which is something all publishers will face within the next ten years. AMS faced this problem in the early 1980s in converting Math Reviews. They attempted a machine-conversion of 80,000 pages from STI to TeX, but after many false starts, the data finally had to be rekeyed from paper.
Electronic publishing provides more efficient document delivery to users, may reduce costs eventually, and does provide a better product. However, the real value of electronic publishing is still unknown. In 1922 Thomas Edison predicted the motion picture would replace the textbook in the educational system. Ewing used this anecdote to illustrate the point that it is hard even for experts to predict the future - they tend to get the general ideas right but the details all wrong. Scholarly publishing is a delicate ecosystem, and delicate ecosystems need protection. Even good technology can have bad consequences unexpected by the experts. The goal of scholarly publishers is to protect the literature for future generations; this is particularly true of math, where the literature is very much for the future, not just the present. Therefore participating in and guiding change is an essential responsibility of societies.
The audience appeared to be a bit taken aback by the idea of the mathematical community's reluctance to embrace electronic journals. One speaker noted that the lack of interest in ejournals appeared to be contradicted by the increasing interest in math eprints. However, Ewing pointed out that, although the number of new papers on eprint servers has grown significantly, it still averages only about 200 papers per month, a tiny fraction of the literature. Even within those 200, some will have already been published in print, and some represent fringe literature. In fact, although some subspecialties such as algebraic geometry and mathematical physics have embraced eprints enthusiastically, for the most part the community is still concerned with print, permanence and archiving.
The Secondary Publisher
Helen Atkins, from the Institute for Scientific Information (ISI), spoke from the point of view of A&I services, a type of secondary publisher. These publishers index journal articles for the purpose of helping users identify what has been published and gain access to articles in which they are interested. In the past, when everything was print, users "wrote and walked" from print indexes to print journals. In the future, index and articles will all be electronic, with all materials linked together by actionable identifiers. At present, however, we are in transition, and need to take care to create a future that works for all parties.
Users do not want citations to consist of identifiers alone - they need to know the author, when and where it was published, of what journal it is a part. Identifiers are good for systems but bad for people; both identifiers and bibliographic description are required. Serials information that A&I services generally capture includes journal title, title abbreviation(s), publisher, ISSN (the International Standard Serial Number), CODEN (a designation assigned to a periodical title by the Chemical Abstracts Service), volume and issue numbering, cover date, and page ranges for articles. Which of these will be available for ejournals? Which are important for identifying an article uniquely?
When ISI's flagship product was the print Current Contents, ISI would send publishers a sheet specifying format requirements for a table of contents page to be indexed in Current Contents. These didn't address bibliographic content, but presentation values, such as using black print on white background. Now ISI is less presumptuous about dictating to publishers, but it does have some requests based on the shared interest of both primary and secondary publishers in getting information to users. In this spirit, it is suggested that publishers follow good practices in the areas of content, structure and description.
Suggestions for content:
Select a single definitive archival version.
Include all content of the journal in this version - distributing content between print and various online versions does not help A&I services, libraries or users.
Do not change content once published.
Do not share content among different journals.
Suggestions for structure (publication pattern):
Pick a single structure, communicate it, and stick with it.
Determine a publication schedule (frequency) and cycle (volumes? issues?).
Get the right structure, because it will determine bibliographic description.
Once you put content into a structure, don't change it.
Suggestions for description:
Decide what bibliographic elements will describe each article and make it obvious in the publications.
Decide on a numbering scheme that ensures a unique article description, both within the journal (e.g. article numbers and pagination) and universally.
Keep elements and numbering consistent across the various formats of the publication (e.g. PDF, HTML, and print).
Communicate to users what numbering scheme is being used.
Present the bibliographic and numbering information clearly and consistently; it should be impossible for a user to make a print of an article that does not indicate where it came from.
Create a standard citation format consistent between print and electronic versions; show this on the page, and communicate it to readers.
In response to a question, Atkins noted that ISI has a tremendous amount of communication with primary publishers; it is essentially a full time job to keep up with changes, pose questions and resolve problems. This is probably true for all A&I services.
Regina Reynolds, from the National Serials Data Program (NSDP) of the Library of Congress, spoke on the library community's perspective. NSDP is the center that administers the assignment of ISSNs in the USA.
Reynolds illustrated her presentation with Tenniel's drawings from Alice in Wonderland. She noted that Alice was an appropriate theme for electronic publishing, as, just like in Wonderland, there are many features about ejournals that are familiar; then suddenly they will behave in a strange way. Challenges for libraries include maintaining awareness of what is available for purchase, negotiating licenses, providing access, developing new internal workflows, and establishing bibliographic control, as well as dealing with issues relating to presentation, storage and retention.
Publishers are experimenting with a new medium and need the freedom to try new approaches and models. Librarians need at least some predictability and stability. Each side must work with the other. Publishers must keep librarians informed, give the same information on print and electronic versions, and minimize change for the sake of change. Librarians must realize ejournals are in a state of transition, and expect experimentation and change. Neither side should forget the user.
Bibliographic rules have not responded rapidly enough to the electronic environment and box us in tightly. The definition of a serial is one example - currently the rules say a serial must be published in designated issues with a number or a date. This must be reconsidered now that some electronic journals are published article by article. Another big issue is determining when different formats of an ejournal are the same serial or not; for example the PDF and print versions may be equivalent, while an HTML version is quite different.
The International Standard Bibliographic Description for Serials (ISBD-S) proposes a new umbrella category called "continuing resource" to cover both serials published in volumes and issues, and "integrating resources" that change and add new material over time. ISBD rules and library cataloging rules must be harmonized. Also, rules for assigning ISSNs must be harmonized with library cataloging.
Libraries now must decide whether or not to catalog electronic journals. There are good arguments for making the catalog provide complete, one-stop shopping for all materials. Weighing against this are the expense, difficulty and volume of cataloging required. It may be that user expectations do not justify this investment.
Aggregators are another relatively new factor with both positive and negative effects. They vastly simplify licensing and access to content, but provide their own aggravations, including overlapping content, frequent changes in content, and unpublicized changes in content. Managing aggregations, such as providing notification when the service is down, and providing access to titles from the catalog, is a challenge in its own right.
Some libraries list ejournals on their Web pages (portal pages) as an alternative to cataloging them. However, this is cumbersome when dealing with large numbers of titles, and does not provide good subject access. Others cut corners by not following the rule requirement of creating a separate cataloging record for the electronic version, and instead add links to the ejournal onto the record for the print. This is unsatisfactory when the electronic and print versions differ substantially, or when the title is available through one or more aggregator services.
OCLC's CORC (Cooperative Online Resource Cataloging) project is still in development, but holds some potential for libraries to be able to create cataloging records for electronic resources more economically. Embedded metadata may also hold some promise. Some libraries have begun embedding metadata in HTML headers for their Web pages so that software like CORC can use it to build catalog records automatically. Something similar might be possible if NSDP supplied descriptive metadata to publishers applying for ISSNs, and publishers included this on the home pages of their ejournals.
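As a sketch of how such embedded metadata can be harvested mechanically, the standard-library HTML parser is enough to collect name/content pairs from a page's head; software along the lines of CORC could feed these into catalog records. The Dublin Core-style element names and the journal details below are hypothetical examples, not taken from any actual publisher page:

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collect <meta name="..." content="..."> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.metadata = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                # A name may legitimately repeat (e.g. multiple subjects),
                # so each name maps to a list of values.
                self.metadata.setdefault(d["name"], []).append(d["content"])

# Hypothetical ejournal home page fragment.
html_page = """
<html><head>
<meta name="DC.Title" content="Journal of Hypothetical Studies">
<meta name="DC.Publisher" content="Example Press">
<meta name="DC.Identifier" content="ISSN 0378-5955">
</head><body>...</body></html>
"""

parser = MetaTagExtractor()
parser.feed(html_page)
print(parser.metadata["DC.Title"])  # ['Journal of Hypothetical Studies']
```

If NSDP supplied such descriptive metadata along with the ISSN, publishers would only need to paste it into the page head for it to become machine-harvestable.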
Presentation is extremely important to libraries as well as to A&I services. In an informal, unscientific survey, serials catalogers identified the five problems most prevalent with ejournals:
the publisher providing no information about future access to back issues;
the publisher including back issues under the new title when an old title is changed to a new title - this creates havoc for users with citations to the older title;
multiple titles, or differing presentations of the title in different places on the publication - one title is good, more than one is bad;
the publisher providing no information about permission to download;
the online journal displaying no ISSN, or the ISSN of the print version of the publication.
Other aggravations included frequent changes to URLs, no notification of URL changes, and long and cumbersome URLs. Aggregators came in for criticism for not providing coverage information, being unclear whether full text or abstracts were provided, and giving URLs to a homepage for the aggregator service instead of directly to the journal title itself. Title presentation problems include difficulty finding a title at all, multiple versions of the title, and using a different title on the print and electronic versions. Publishers should realize that cataloging rules require certain pieces of bibliographic data and make some effort to include these online.
An early, and still controversial decision of the NSDP was to give different ISSNs to print and electronic versions. The online ISSN database, which is available by subscription, shows all ISSNs for a title. ISSN problems include ejournals displaying no ISSN or using the ISSN of the printed version. Publishers are urged to display the ISSN of both versions on both print and online publications.
In sum, the librarians' wish list is that publishers:
carry "masthead" information on the journal homepage, including issuing body, publisher and place of publication;
show consistency in title presentation;
retain the old title on back issues following a title change;
maintain stable URLs;
provide ongoing access to back issues; and
give information about the differences between print and various electronic versions.
A special wish list for aggregators is to:
be clear about which titles (and years within titles) are full text and which are abstracts;
notify the subscriber of additions, deletions and other changes;
display the ISSN of both print and electronic versions; and
allow linking directly to a journal, not the homepage of the service.
The Yale Medical School has mounted a prototype service called "Search Jake," which shows, for any journal in the database, all places where that journal is abstracted and indexed and where it is available in full text. Such a service covering all major journals would be invaluable to librarians. Perhaps some cooperative building and maintenance of such a database could be initiated nationally, along the lines of the CONSER project.
Reynolds concluded that this was an appropriate time to consider a standard or set of guidelines addressing presentation. It would provide guidance for new ejournal publishers and help publishers provide reliable and predictable information for secondary publishers and librarians. NSDP already distributes a pamphlet called You Name It to every publisher applying for a pre-publication ISSN. It explains good practices and points to applicable standards. It would be very helpful to have a similar best practices document for ejournal presentation to point to.
Following this presentation there was some discussion of the "appropriate copy" problem. This issue relates to the fact that many libraries have legitimate access to copies of ejournals through services other than those of the publishers themselves. For example, a library may belong to a consortium that loads ejournals locally, or may subscribe through an aggregator service. Other reasons for multiple locations gaining in importance include eprint services and archival storage copies. The problem is that the way reference linking is currently implemented, clicking on an actionable link will take the user to a single copy of the article, usually the copy at the publisher's site, and not necessarily the copy the user is authorized to access. This was once known as "the Harvard problem" because Dale Flecker of the Harvard University Library was the first person to articulate it clearly, but it has subsequently become commonly recognized as an issue in need of a solution.
Breakout Group Recommendations
Editorial Standards: Content and Bibliographic Issues (Helen Atkins reporting)
The group agreed that guidelines or best practices in this area would be useful. A formal standard is not appropriate at this time, but it would be a good target to work towards. Published guidelines would be easier for publishers to adopt, and would allow publishers to continue to experiment creatively.
Guidelines should be developed under the aegis of NISO but should be internationalized as well. Representatives from all stakeholder groups, including system implementers, should be involved in their development.
The guidelines should start with the overarching principles of consistency, predictability, stability, and completeness. The document should describe its own organization and give definitions as preliminaries. Then it should itemize the guidelines themselves, giving the reason for each, and examples of good practice.
Archiving Issues (Kathy Klemperer reporting)
The group found it difficult to limit the discussion to ejournals as archiving applies to all electronic content. They discussed the areas of responsibility, data formats, physical media, metadata, and models.
Responsibility for archiving could fall to publishers, non-profit agencies, or libraries. The group had no recommendation apart from noting pros and cons of each alternative. They did however believe that it would be advisable for publishers to keep their licenses open enough that there can be experimentation with different models for archiving.
There should be standards addressing archival data formats, and these should be sanctioned by a formal standards body. Acceptable formats should not be proprietary and should use a lowest common denominator. Multiple formats should be maintained, and new formats should be incorporated into the standard as they appear.
Technology is evolving too quickly to recommend a standard for physical media. However, best practice would indicate the medium should be refreshable, machinery for reading it must be replaceable, offsite storage should be possible, and the transfer time to copy to new media should not be prohibitively long.
Metadata standards for ejournal archives are appropriate. Administrative, descriptive and structural metadata are all required.
There is an ISO (International Organization for Standardization) standard in process for a reference model for archival storage of digital data (see Reference Model for an Open Archival Information System, OAIS, at http://ftp.ccsds.org/ccsds/documents/pdf/CCSDS-650.0-R-1.pdf). Interoperability between archives should be a goal. It might be useful to have a best practices document for running a digital archive.
Reference Linking Issues (Howard Ratner reporting)
The breakout group acknowledged that it consisted mainly of publishers, and that input from librarians and other players is required. They discussed the assignment of identifiers, presentation issues, related services, and other issues.
Assignment: identifiers should ideally be human-readable, short (20 or fewer characters), and contain a check digit according to a common algorithm.
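The ISSN already uses a check digit of this kind, computed with a weighted mod-11 scheme. As an illustration of how such a common algorithm works (the function name is ours, not part of any recommendation), a sketch in Python:

```python
def issn_check_digit(body: str) -> str:
    """Compute the ISSN check digit for the first seven digits.

    Weighted mod-11 scheme: digits are weighted 8 down to 2,
    and a remainder of 10 is printed as 'X'.
    """
    total = sum(int(d) * w for d, w in zip(body, range(8, 1, -1)))
    remainder = (11 - total % 11) % 11
    return "X" if remainder == 10 else str(remainder)

# ISSN 0378-5955 (a well-known valid ISSN): body "0378595" yields "5"
print(issn_check_digit("0378595"))
```

A check digit like this lets any service detect most transcription errors in an identifier before issuing a lookup.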
Presentation: identifiers should appear on all versions of an article (print and electronic); every identifier should be labeled as to what type of identifier it is. It was unresolved whether to recommend that identifiers appear in reference citations or whether more user-friendly anchor text should be displayed.
Services: a common means of token passing should be supported, so that one service knows that a click is coming from another service; it should be possible to look up metadata for an article using an identifier as query; there should be some standard set of information available to a user to find out exactly what is available from a service (e.g. an XML template with elements for the journal title, multiple ISSNs, starting date and issue).
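The "standard set of information" the group describes could take the shape of a small XML holdings record. As a sketch only, with element names invented for illustration (no such schema was adopted at the workshop), built with Python's standard library:

```python
import xml.etree.ElementTree as ET

def holdings_template(title, issns, start_date, start_issue):
    """Build a hypothetical XML holdings record describing what a
    service makes available for one journal.  Element names here are
    illustrative, not a published schema."""
    root = ET.Element("holdings")
    ET.SubElement(root, "journal-title").text = title
    for issn in issns:  # a journal may carry more than one ISSN
        ET.SubElement(root, "issn").text = issn
    coverage = ET.SubElement(root, "coverage")
    ET.SubElement(coverage, "start-date").text = start_date
    ET.SubElement(coverage, "start-issue").text = start_issue
    return ET.tostring(root, encoding="unicode")

record = holdings_template("Hearing Research", ["0378-5955"], "1978", "1")
print(record)
```

A record in this spirit would let a linking service answer, in a machine-readable way, exactly which journals and date ranges it can resolve.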
Miscellaneous: there should be a common way of representing the location of articles in unpaged journals; article identifiers should be incorporated into existing end-user style guides such as The Chicago Manual of Style; there should be a registry of the various title abbreviation lists and schemes.
Technical Standards: Presentation/Format Issues (Emily Fayen reporting)
Technical standards would be useful to address how information is transmitted from primary publishers to secondary publishers, aggregators, and consortia.
The group agreed that much latitude needs to be allowed in the format of the actual content, but that it could be useful to standardize the descriptive information that accompanies the content in some kind of "wrapper". It is the wrapper that should be addressed here.
Publishers could be surveyed to collect information about the DTDs they currently use. This information could be collected and analyzed to determine whether there is a core set of metadata elements in use by most publishers in ejournal publication.
A NISO standard for citations (Z39.29) is due to be released for review soon. This may contribute data elements to the wrapper as well.
Wrap-up and Conclusions
Emily Fayen summarized the main points of consensus during the day:
There is clearly a need for cooperation among groups.
Epublishing is a moving target. We need to balance the need of primary publishers for the freedom to experiment against the need of secondary publishers for consistency and completeness of information.
The appropriate response at this point is for some best practices and guidelines to steer evolution in a direction that looks right to the many stakeholders.
Scholarly publishing is indeed a fragile ecosystem that we should not attempt to disturb too greatly.
We do need to protect the literature for future generations.
It was noted that this workshop did not cover many other issues relevant to electronic publishing, including access control, rights management, licensing and contracts, and privacy of users.
There was also some discussion of underrepresented stakeholders, particularly end-users (readers). To some extent scholarly publishers, professional societies, and librarians represent their users, but the lack of a mechanism for direct user input haunts most standards processes.
Possible follow-on steps were suggested, including a reconvening of the workshop group at some future time, and the publication of a glossary of terms related to electronic journal publishing.
Workshop attendees were thanked for their participation and creative work. A shorter version of this meeting report will be posted on the NISO Website (http://www.niso.org) along with the PowerPoint presentations of contributors. The meeting planners will reconvene shortly to talk about the agenda that has emerged from the workshop, and to recommend steps for further action.
Priscilla Caplan is Assistant Director for Digital Library Services at the Florida Center for Library Automation, and chair of the NISO Standards Development Committee. email@example.com