Emerald Group Publishing Limited
Copyright © 2001, MCB UP Limited
Danielle Mihram and G. Arthur Mihram
The theme of the 167th National Meeting of the American Association for the Advancement of Science (AAAS-2001), held in San Francisco, CA, February 15-20, 2001, was "Building the Future through Science, Engineering, and Technology". Thankfully, the Meeting featured few futurists attempting to convince its audiences of inevitable events to come.
The American Association for the Advancement of Science (www.aaas.org) is the world's largest federation of scientists, with more than 138,000 individual members and nearly 300 affiliated scientific and engineering societies. This year's Meeting, unlike nearly every Annual Meeting of the past decade (cf. our reports in this journal), included a much smaller number of sessions of direct interest to the information specialist. However, a few sessions provided extremely interesting "windows" on the future of information technology (IT). Archival materials dealing with the content of the Meeting come in two formats[1,2]. Approximately 5,000 persons registered for this year's Meeting.
We shall commence, therefore, with a quick report on a few sessions on the "periphery" of the information specialist's viewpoint, then move to report on the more directly connected sessions.
The Human Genome Report
The publication of the complete sequence of the human genome was timed to coincide with this year's Annual Meeting. Science (AAAS's publication) and Nature (UK) each published the sequence that very week, as a result of collaboration between the two journals and the two organizations which (rather independently) mapped the human genome: Celera Genomics and the US National Institutes of Health's (NIH) National Human Genome Research Institute.
Each organization gave a plenary address (Francis S. Collins of NIH on Saturday, February 17; then J. Craig Venter of Celera Genomics on Sunday, February 18), recalling their joint announcement of the completion at a White House ceremony last year. Free copies of the respective issues of Nature and Science were made available to every registrant.
Mathematics and the Internet
This year's Meeting had a number of sessions on mathematics, which might seldom draw the attention of information specialists, other than those who have noted the widespread affiliation and/or merger of many academic computer science departments with the mathematics departments.
For example, one session ("Mathematical Aspects of Intellectual Property Management on the Internet"), held on Saturday afternoon, was announced to be dealing with both "electronic watermarking" (the electronic insertion throughout a "digital document" of a marking which will survive copying and most attempts at modification: e.g. the NBC logo on a television program or a video produced by that broadcaster) and "electronic fingerprinting" (a unique electronic watermark inserted into the document each time that it is copied or transmitted, facilitating a trace of any illegal copying back to its source).
However, the presentations stressed the need to stay ahead of the latest clever hacker: suggestions included devising ever more complicated encryption structures so as to confuse or confound the illicit hacker, and devising ever more elaborate "keys" so as to deny access to hackers who use code-breaking schemes.
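The notion of "electronic fingerprinting" described above can be made concrete with a toy sketch. The scheme below is entirely hypothetical (it is not any presenter's actual method): each copy of a document carries a unique serial number hidden in the least-significant bits of its bytes, repeated many times, so that a leaked copy can be traced back to its recipient even after scattered modifications.

```python
# Toy "electronic fingerprinting" sketch (hypothetical scheme): a unique
# per-copy serial is embedded cyclically in the least-significant bit of
# each byte, and recovered by majority vote over its many repetitions.

def embed_fingerprint(data: bytes, serial: int, width: int = 32) -> bytes:
    """Repeat the serial's bits cyclically through the LSB of each byte."""
    bits = [(serial >> i) & 1 for i in range(width)]
    return bytes((b & 0xFE) | bits[i % width] for i, b in enumerate(data))

def extract_fingerprint(data: bytes, width: int = 32) -> int:
    """Recover the serial by majority vote, so the mark survives
    scattered modifications to individual bytes."""
    votes = [[0, 0] for _ in range(width)]
    for i, b in enumerate(data):
        votes[i % width][b & 1] += 1
    return sum(1 << i for i, (zeros, ones) in enumerate(votes) if ones > zeros)

document = bytes(range(256)) * 8           # stand-in for a digital document
marked = embed_fingerprint(document, serial=0xCAFE)
assert extract_fingerprint(marked) == 0xCAFE
```

Real watermarking schemes operate on perceptual features of images, audio, or text layout rather than raw bytes, precisely so that the mark survives format conversion; this sketch illustrates only the trace-back idea.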
Indeed, when we asked from the floor (Franklin et al., 2001) about the earlier suggestion (Mihram and Mihram, 2000) that we use the historical perspective to understand why we have for centuries opted for a governmentally-secured (and operated) national postal service and, therefore, to conclude that there is indeed a role for government to play ("enhanced electronic postmarks" and "digital electronic copyright seals"), the response was that this approach "does not seem to be so popular". Perhaps the response was that of an unknowing Democrat to a Republican question (NB: No partisan political orientation necessarily implied here!).
One session on Saturday morning (February 17) was entitled "Nature and Origins of Mathematical Thinking." The audiocassettes (Devlin et al., 2001) of the session would not be of general interest for an audience of information specialists, unless they are concerned with our awareness of how learning takes place (Mihram, 1974). The Session's organizer (Keith Devlin, Saint Mary's College California) noted that there has been growing interest in the nature of mathematical ability: What exactly is it? How does it differ from numerical ability? Does it depend on language? Can chimpanzees "do mathematics"? How did humans acquire this ability? How do children acquire mathematical ability?
One does find that mathematicians have simply learned a (special) language, one which (being the language of logic) demands frequent practice, lest one lose one's adeptness at it. Thus, to learn mathematics is to become skillful in a certain well-defined type of discourse. Although Devlin (Devlin et al.) felt that many (most?) mathematicians do not believe that they think in language, the "jury is still out" on this question. It seems, however, that mathematicians, like all of us, use analogy in order to move from what we know (that we know) to a new conclusion; one's facility in mathematics is probably no greater than one's frequency of use of the language, so that rewarding mathematical analogies come more easily to one who practices the language often.
On Digital Copyright Protection
The Editor of Science Magazine, Donald Kennedy, spoke at 2:00 pm on Saturday, February 17, on "New Tests for Science", remarking that the presentation of large sets of data (such as that of the Genome Project) has always been conducted in the print medium, though at considerable cost. He noted that some editors (e.g. in the not-so-scientific economics) no longer require that the data supporting an author's published conclusions be printed as well, and that some do not even mandate that the author maintain these data for their subsequent (i.e. historical) retrieval.
During the session's audience queries we asked (Kennedy and Mihram, 2001) whether the AAAS has as yet established a general policy for maintaining archival records for scientific data, which (if digitized) possess a considerably shorter "shelf-life" than do printed records. We sought also to know whether means employed by the US Copyright Office (Devlin et al., 2001) will be implemented so as to ensure the integrity of the "original edition" (the copyright-registered edition) of such data, particularly in view of the ease with which such digital records can be altered subsequently. It seems that he took our question "under advisement", though we have no reason to believe that he does not feel that there is indeed a difficulty to be faced by science, as we enter more completely our Age of Tele- and Digital Communications (Mihram, 1975).
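The integrity concern raised in our question can be illustrated with a minimal sketch. The mechanism below is hypothetical (it is not the US Copyright Office's actual procedure): a registrar records a cryptographic digest of the "original edition" at registration time, and anyone can later recompute the digest to detect whether the digital record has been altered since.

```python
# Hypothetical "digital copyright seal" sketch: a cryptographic digest
# of the registered edition exposes any subsequent alteration.
import hashlib

def register_edition(document: bytes) -> str:
    """Digest deposited with the registrar at copyright registration."""
    return hashlib.sha256(document).hexdigest()

def verify_edition(document: bytes, registered_digest: str) -> bool:
    """True only if the document is bit-for-bit the registered edition."""
    return hashlib.sha256(document).hexdigest() == registered_digest

original = b"Complete sequence data, original edition"
seal = register_edition(original)
assert verify_edition(original, seal)
assert not verify_edition(original + b" (quietly amended)", seal)
```

A digest alone proves only that a document matches a recorded value; binding the digest to a date and a registrant (the "postmark" role we raised) would additionally require a trusted authority to sign and timestamp it.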
More on the Mind and the Brain
AAAS-2001 contained a number of sessions dealing with the operation of the brain, probably evidence of the intense interest in the neurosciences during the 1990s. For example, Alison Gopnik, of the University of California-Berkeley, presented on Sunday morning, February 18, "The Scientist in the Crib: What Early Learning Tells Us about the Mind", in which she surely overstepped the definitional bounds of science by claiming that infants and young children aged 2-4, merely because they logically sort out alternative explanations for the phenomena they encounter, are actually "little scientists". In our questioning (Gopnik et al., 2001) she seemed reluctant to accept the rather well-established conclusions of Inhelder and Piaget (1958) that mental development is essentially age-dependent and that scientific thinking cannot occur until after adolescence, the age at which the individual has adequate mental structure and fortitude to build "mental models" of the world, as exemplified by his/her personality.
Then, later on Sunday morning, February 18, a session on "Reprogramming the Human Brain after Injury" was conducted. Here one could learn as much (Chapman et al., 2001) about how the brain/mind works as about the technological advances (MRI: magnetic resonance imaging; fMRI: functional magnetic resonance imaging) which are making possible the non-obtrusive, non-invasive examination of what is happening within the brain while the body is being exposed to (or else denied) various sensory (usually visual) experiences. Speakers discussed new breakthroughs in treating brain injury through neural transplantation, by stimulation approaches, and via the utilization of advanced brain-imaging technology in order to measure reorganization of the central nervous system.
For example, one speaker (Sandra B. Chapman, University of Texas at Dallas) reported that brain plasticity in children must be viewed from a long-term perspective. She noted that, when brain imaging measures are used with yearly assessments of cognitive-linguistic function, emerging evidence indicates (contrary to previous views of greater plasticity at younger ages) that early injury can have a more deleterious impact on long-term recovery. In response to our question, about connecting the research of Inhelder and Piaget (1958) as well as others (e.g. Mihram, 1974) with the possibility that severe trauma in the frontal areas of children might preclude their becoming scientists once they reach adulthood, she essentially concurred ("Brilliant question!"), though she remains hopeful that neurological research would permit eventually full recovery.
The session, "Standard Setting and the Rate of Technology Development" (convened on Saturday morning, February 17) attracted a large crowd.
The setting of technical standards is emerging as a key issue affecting not only the rate but also the direction of technological development in many industries. The USA's failure to agree on domestic standards for cellular telephony, for example, has been identified as the principal reason why Europe was able to capture both the technological and the market lead in that industry over potential US competitors (Motorola). If standards are set too late, one runs the risk that technical development may be stymied by needless uncertainty; yet, if they are set too early, promising lines of inquiry may be stifled. Standards can facilitate innovation and economic growth, but ineffective standards can be impediments.
The panelists in this session were unanimous in noting that standards are fundamental to the diffusion of new technology and to the workings of almost every type of economic unit: they deliver benefits that are realized on scales ranging from individual companies and consumers to local markets, as well as globe-spanning communication systems and world trade networks. The early development of standards was largely retrospective, representing solutions for consumer goods that had already been developed. Their economic role was limited largely to variety reduction, so as to enable compatibility and economies of scale. Today's standards arena is much more dynamic and complex. In the fast-paced high-technology sector, as well as for industries that rely on high-technology products and services, standards activities are no longer routine, and time is of the essence. Standards can be sources of market power and competitive advantage; they also are a form of technical infrastructure, functioning as public goods that yield widely shared benefits. Modern technology-based industries and economies increasingly rely on standards to develop mass markets, enable interoperability, and promote efficiency. As a result, standards are being developed in formal bodies, in consortia, and in the market. These bodies operate at many levels: regional, national, and international.
Among industrialized countries, the US standards infrastructure is characterized uniquely by a loosely coordinated system of Federal, State and local governments, voluntary standards associations, trade and professional organizations, for-profit entities, and industrial semi-permanent and ad hoc groups. The current question for the USA (noted by the speakers) is the following: As the importance of technology grows in all industry sectors, will the US system be effective in sustaining the domestic innovation system and economic growth as well as furthering US interests and innovation in a global economy?
Karen Brown (Director, National Institute of Standards and Technology (NIST)) provided an overview of the work of NIST in her presentation, "Standards Today and Tomorrow: Will there Be a World of Difference?"
Standards, according to Brown, are profoundly important: in her view they rank among the top ten engineering achievements of the last century, equal to the invention of the automobile and the airplane. Unlike in many other nations, where standards carry the force of law, standards in the USA are voluntary.
Because impartiality and neutrality are key to standards-related issues, she stressed that NIST is the neutral third party where people can have a discussion about standards. It does not regulate but facilitates; it coordinates, but does not push for a specific decision. It views itself as a convener and as an "educational" body, providing workshops on current issues. An example of such work is NIST's role in facilitating the discussion of standards for e-books, so as to move the industry along.
Carl Cargill (Director of Standards at Sun Microsystems) echoed Brown's view that standards are apparently contradictory in their effects, and have a pervasive influence on commerce: they can be viewed as beneficial (ensuring health and safety) or as detrimental to progress (for example, the number of regulations pertaining to information technology (IT) has increased). He pointed out that standards are a multi-billion-dollar business for the IT industry (he urged the audience to take a look at the February 27, 1993 issue of the Economist). He also noted that one must distinguish among three types of standards: environmental standards (e.g. product safety standards); attribute standards (quality, ergonomics: "they make products attractive"); and technical interface standards (e.g. JAVA).
Cargill's view is that standardization in the realm of technology is still an art form. Though standard developers are good engineers, they have multiple agendas (their field/discipline, their company, their personal goals). Standard-setting organizations have different goals: trade associations (national, formal organizations) do not necessarily have the same objectives as consortia ("they pop up every other week; their advantage is that they concentrate on a very small focus"), while alliances espouse "values" that may impede commercial goals (e.g. the Open Source movement, which gives away its software code).
Substantial IT investment in consortia and alliances (e.g. the World Wide Web Consortium and the European Telecommunications Standards Institute (a very big alliance)) reflects several facts: information-sharing is based on standardization ("you either share or die"); interconnection and speed are tremendous; competition and cooperation are immediate; and markets and players are multinational. In this constantly changing playing-field, standardization provides "stability": stability of approach and perception, and legitimization of a business solution.
Cargill is not unaware of current concerns regarding standards, for example, intellectual property and licensing: how do you license materials on the Web? The answers reflect multiple and differing national agendas. In addition, Cargill is quite outspoken about the failure of business and engineering schools, where standardization is not taught and attracts little academic interest ("too much work"). Standardization, he noted, is treated as a "discipline", though it is really a management tool: it requires structure and form; it allows for planning and ordered change; it allows current practice to be structured; it sets the rate of technology acceptance; and it allows society to know what is coming.
Though his presentation was entitled "Standards and the Rate of Technology Change (or) Walden Pond Has Been Drained!", at no time did Cargill invoke Walden Pond. The audience did not fail to ask: Why has Walden Pond been drained? Cargill was ready: That pond was about individualism; you cannot have individualism in standards: you must play collectively.
The booklet National Standards Strategy for the United States, published by ANSI (American National Standards Institute), was distributed at the session. It provides an excellent overview of the topic, including sets of principles for successful standards processes and a dozen tactical initiatives for moving forward. The booklet's electronic version is available at: http://web.ansi.org/public/nss.html
The session, "What Is Technological Literacy, and Why Does it Matter?" (Saturday afternoon, February 17), also attracted a very large crowd. The session provided a broad look at the concept of technological literacy, including its importance to K-12 education as well as questions such as "what are the characteristics of the technologically literate citizen?" and "what are the connections between technological literacy and the engineering, science, and mathematics professions?"
W.E. Dugger (Director of the project "Technology for All Americans" of the International Technology Education Association, Blacksburg, VA) reported on the recently published Standards for Technological Literacy: Content for the Study of Technology (Technology Content Standards), published by the International Technology Education Association (ITEA); the online version of the study is available in PDF format at: http://www.iteawww.org/TAA/STLstds.htm. The publication represents the culmination of the work of that Association's "Technology for All Americans Project" (TfAAP), funded from 1996 to 1999 by the National Science Foundation and the National Aeronautics and Space Administration.
The document defines what students should know and be able to do in order to be technologically literate and provides standards that prescribe what should be the outcomes of the study of technology in grades K-12.
In developing the standards, ITEA focused on what students in grades K-12 need to know and be able to do in order to be technologically literate. In the review process of the project, more than 4,000 people contributed to the improvement of the "Technology Content Standards", as it was developed and modified. A detailed overview of the project has been published in Dugger's article, "Standards for Technological Literacy", The Technology Teacher, pp. 8-13.
Technology Content Standards is not a curriculum: it represents a recommendation from educators, engineers, scientists, mathematicians, and parents about what skills are needed in order to become technologically literate. It offers criteria and benchmarks to judge progress toward a vision of technological literacy for all students. The basic features of such standards for the study of technology in grades K-12 are as follows:
It offers a common set of expectations for what students in technology should learn.
It offers specific details about what every student should learn about technology.
It is developmentally appropriate for students.
It provides a basis for developing meaningful, relevant, and articulated curricula at the local and state/ provincial levels.
It promotes content connections with other fields of study in grades K-12.
Learning Science via the Web
In the session, "Communicating the Future: A Research Agenda for Understanding Public Communication of Science and Technology", M.W. Tremayne reported on part of the research undertaken toward his Doctoral dissertation at the University of Wisconsin-Madison. His presentation, entitled, "Using Think-Aloud Protocols to Compare the Usefulness of Science Web Sites to the Public", addressed the following question: As people increasingly use the Web to satisfy their information needs, what effects does this medium have on how information is processed and ultimately stored in individual memories? He reported on the results of a think-aloud study, which examined the impact of interactivity and other Web site characteristics on individuals' ability to roam and "read" two popular science Web sites: The Why Files (http://whyfiles.org/teach/index.html) and the Web site of the San Francisco Exploratorium. His goal was to explore three cognitive strategies: orienting behavior (efforts to keep track of where one is on a site), maintenance (efforts to place new information into short-term memory), and elaboration (efforts to incorporate new information into one's existing understanding of a phenomenon), so as to compare the results of this study with an earlier think-aloud study. The effects of increased social familiarity with the Web will then be analyzed.
The Why Files use news and current events as a springboard to explore science as a human enterprise and as a way of looking at the world. Their target audiences are schools and the "man in the street" interested in learning about science from a layman's perspective. "Beyond portraying the outcomes of science, our overarching goal is to explain the process, culture and people that shape it" (see their Web site). Users can access the information via a "clickable" menu. In contrast, the San Francisco Exploratorium is more interactive: you can choose stories in any order that you want, and some pages allow you to calculate or compute results.
Interactivity of such sites was examined (i.e. how do such sites influence what an individual does after accessing the site? What does he do next?). The question that Tremayne is asking is as follows: How does interactivity influence people's cognitive activities?
His subjects ranged from individuals with little experience of the Web to individuals very experienced with Web-based searches. They were asked to talk aloud as they went through the two sites. Among the results reported was that "elaboration behavior" ("processing information", making connections with content) was greater when subjects were on the San Francisco Exploratorium site. Subjects spent more time there and engaged in more searching activities (the experienced Web users exhibited slightly more activity). In addition, individuals elaborated more (with phrases such as "Oh! I've seen this before in ...") rather than simply reading the text on the screen; while keying they also "thought aloud" ("I wonder where this will take me..."). Tremayne's current conclusion: true interactivity on the Web has to be linked with a format that encourages "elaboration behavior".
Virtual Laboratories and Computer Networking Technologies
The Symposium on the impact of computer networking technology on the academic research community entitled, "Impact of Computer Networking Technologies on Scientific Research" (Monday afternoon, February 19), provided an exciting overview of the current collaborative work undertaken by scientists thanks to the enabling powers of computer networking technology.
Because many of today's research challenges demand not only the networking of investigators but also access to research results and to scientific information and tools, scientists are increasingly relying on collaborative research initiatives. A research institution might face such issues as geographical isolation, lack of facilities, and lack of "critical mass." Because networks create new opportunities for researchers, both in improving communications and in access to resources, partnerships are increasingly being made across disciplinary domains and across institutional boundaries. The presentations at this Symposium dealt with examples from biodiversity, chemical engineering, and space physics.
One particularly interesting presentation was that of T.A. Feinholt (School of Information, University of Michigan, Ann Arbor), entitled "Lessons Learned from a Decade of Collaborative Development and Use."
The Internet has created new possibilities for the organization of joint research, such as "collaboratories" (or computer-supported "laboratories without walls"). The word "collaboratory" (collaborate + laboratory) was coined in 1989 by Bill Wulf (http://www.emsl.pnl.gov:2080/docs/collab/intro/CollabConcept.html).
According to this Web site, "Scientific collaboratories are designed to enable close ties among scientists in a given research area, to promote collaborations involving scientists in diverse areas, to accelerate the development and dissemination of basic knowledge, and to minimize the timelag between discovery and application." In addition the collaboratory relies on the development of new communications technologies (shared computer displays, electronic notebooks, virtual reality collaboration spaces) and an integration of these technologies with current videoconferencing and e-mail capabilities. Joint research by scientists can be undertaken "virtually" and facilities and databases are independent of geographical location.
During the past decade several scientific fields have conducted research via collaboratories: chemistry, medicine, biology, and engineering. Feinholt's presentation focused primarily on the Collaboratory for Research on Electronic Work (CREW), a research unit at the School of Information, University of Michigan. Three projects are currently being developed:
The Great Lakes Regional Center for AIDS Research (CFAR), sponsored by the National Institutes of Health (NIH) and including 12 institutions receiving NIH funding: http://intel.si.umich.edu/crew/Research/resrch_GreatLakesCFAR.htm; and http://www.greatlakescfar.org/cfar.
The NEESgrid (http://www.neesgrid.org): a virtual laboratory for the earthquake engineering community, funded by the National Science Foundation's (NSF) NEES (the George E. Brown Jr Network for Earthquake Engineering Simulation). Congress has authorized NEES for a five-year construction period, from October 1, 1999 through September 30, 2004, for a total of $81.9 million. The goal of NEES is to provide a national, networked collaboratory of geographically distributed, shared-use, next-generation experimental research equipment sites, with teleobservation and teleoperation capabilities. NEES will transform the environment for earthquake engineering research and education through collaborative and integrated experimentation, computation, theory, databases, and model-based simulation, so as to improve the seismic design and performance of US civil and mechanical infrastructure systems. When the construction period is completed in September 2004, the NEES collaboratory will enter its operational period, from October 1, 2004 through September 30, 2014, managed by the NEES Consortium.
Information on the NEES Program awards is available at the following site: http://www.eng.nsf.gov/nees
The Space Physics and Aeronomy Research Collaboratory (SPARC, funded by the National Science Foundation (NSF)): a virtual facility launched in 1993 to study the aurora (the "northern lights") in the Northern Hemisphere. SPARC (the project is administratively located at the University of Michigan, Ann Arbor) brings together diverse experts engaged in data collection and working in many observatories, as well as a community of student learners who are mentored by senior members of SPARC. Access to the data is via a Web-based interface through a standard browser (Internet Explorer or Netscape Navigator). Currently, scientists working in this project have access to approximately 500 data resources. Such an approach to research provides a global perspective (in contrast with the limited perspective gained in a single laboratory) and allows for the solution of complex problems requiring a multi-disciplinary approach and hence complementary expertise.
Feinholt spoke about the advantages as well as the unresolved issues related to this new approach to research. Using SPARC as an example, he noted that, on the positive side, SPARC first transforms the nature of students' participation in research activity, allowing students to take an active role in experiments without travel to distant observatories. Second, SPARC transforms the division of labor in research activity, allowing teams of scientists with complementary expertise to apply their distinctive perspectives without being collocated.
Questions fundamental to the research enterprise, when it is undertaken in such collaborative environments, remain unresolved: for example, how will ownership of data be determined? How will priority of publication be established? Will new forms of organizing knowledge and scientific results emerge? Feinholt cited Margaret Hedstrom, an archivist at the University of Michigan, whose interest is the archiving of electronic information, and whose concerns also include issues relating to the archiving and preservation of evolving data, as well as the determination of priority of research in such collaborative environments.
Expanded GrantsNet: Now Includes Undergraduate Science Programs
GrantsNet (the free Web site at www.grantsnet.org that provides information on grants and fellowships for young biomedical researchers) launched (during a career fair at the conference) a new online database to assist undergraduate science programs. The database will extend GrantsNet's audience to include professors, deans, and other administrators who seek support for undergraduate science education and research programs.
GrantsNet also unveiled several new features, such as the "My GrantsNet" option, which allows users to keep abreast of new funding opportunities through e-mail alerts, and to save searches quickly or to update their information.
The Howard Hughes Medical Institute (HHMI www.hhmi.org) and the American Association for the Advancement of Science are the sponsors of GrantsNet which was launched in 1998. More than 80,000 users have registered to use its extensive database of fellowships and grants, its links to Web sites and online applications of funders, and its tips on applying for grants. Since its inception, the database has been accessed by graduate students, postdoctoral fellows, medical students, and junior faculty in search of information about grants and fellowships. It also provides funders with a single place to pool their information and bring their programs to the attention of potential applicants. With the new database, it becomes possible to improve the "fit" between funders and applicants for undergraduate programs.
Paulson, R. (Ed.) (2001), AAAS 2001 Annual Meeting and Science Innovation Exposition, American Association for the Advancement of Science, Washington, DC.
Audio Visual Education Network [AVEN], Inc. prepared audio cassette tapes of nearly every session at the Meeting, most three-hour sessions requiring two cassettes. Only about 25 sessions were video-recorded: typically the plenary lectures. The audiocassettes are available ($9.50 or $10.00) and the video compact discs can be ordered ($13.50) from AVEN: 10532 Greenwood Ave N.; Seattle, WA 98133.
Danielle Mihram PhD is Assistant Dean, Leavey Library, University of Southern California and G. Arthur Mihram, PhD is an Author and Consultant: Princeton, NJ 08542-1188.
Chapman, S.B. et al. (2001), "Reprogramming the human brain after injury", AVEN Cassette No. AS143 A and B.
Devlin, K.G. et al. (2001), "Nature and origins of mathematical thinking", AVEN Cassettes No. AS141 A and B.
Franklin, M.G. et al. (2001), "Mathematical aspects of intellectual property management on the Internet", AVEN Cassette No. AS115 A.
Gopnik, A. et al. (2001), "The scientist in the crib: what early learning tells us about the mind", AVEN Cassette No. AS119.
Inhelder, B. and Piaget, J. (1958), Growth of Logical Thinking from Childhood to Adolescence, Basic Books, New York, NY.
Kennedy, D. and Mihram, G.A. (2001), "New tests for Science", AVEN Cassette No. AS117.
Mihram, G.A. (1974), "Simulation: methodology for decision theorists", Role and Effectiveness of Theories of Decision in Practice, Hodder & Stoughton, London, pp. 320-27.
Mihram, G.A. (1975), An Epistle to Dr Benjamin Franklin, Exposition-University Press, New York, NY.
Mihram, G.A. and Mihram, D. (2000), "Tele-cybernetics: on some necessary governmental roles in the Internet and the Web", WebNet 2000 (Proceedings, World Conference 2000 on the WWW and the Internet), Association for the Advancement of Computing in Education, Norfolk, VA, pp. 396-401.