Search results

1 – 10 of 495
Article
Publication date: 1 March 1995

Daniel D. Wheeler

The Kidlink project began in 1990 with the goal of creating a global dialogue among the 10‐ to 15‐year‐old youths of the world. The primary medium for this exchange has been through…


Abstract

The Kidlink project began in 1990 with the goal of creating a global dialogue among the 10‐ to 15‐year‐old youths of the world. The primary medium for this exchange has been e‐mail mailing lists. Over four years, the project has grown by nearly an order of magnitude. This growth has entailed major changes both in the way the dialogue itself is structured and in the organization of the volunteers who run the project. Growth is much more of a challenge for Internet projects that provide contact with individuals than for those that primarily provide access to information. The success of KIDLINK provides a useful model for others.

Article
Publication date: 1 March 2002

Christopher Bailey, Hua Lu, Greg Glinski, Daniel Wheeler, Phil Hamilton, Mike Hendriksen and Brian Smith

Flip‐chip assembly, developed in the early 1960s, is now being positioned as a key joining technology to achieve high‐density mounting of electronic components on to printed…

Abstract

Flip‐chip assembly, developed in the early 1960s, is now being positioned as a key joining technology to achieve high‐density mounting of electronic components on to printed circuit boards for high‐volume, low‐cost products. Computer models are now being used early in the product design stage to ensure that optimal process conditions are used. These models capture the governing physics taking place during the assembly process and can also predict relevant defects that may occur. Describes the application of computational modelling techniques able to predict a range of interacting physical phenomena associated with the manufacturing process: for example, in the flip‐chip assembly process we have solder paste deposition, solder joint shape formation, heat transfer, solidification and thermal stress. Illustrates the application of modelling technology used as part of a larger UK study aiming to establish a process route for high‐volume, low‐cost, sub‐100‐micron pitch flip‐chip assembly.

Details

Circuit World, vol. 28 no. 1
Type: Research Article
ISSN: 0305-6120

Keywords

Article
Publication date: 1 August 1998

Chris Bailey, Daniel Wheeler and Mark Cross

Solder materials are used to provide a connection between electronic components and printed circuit boards (PCBs) using either the reflow or wave soldering process. As a board…

Abstract

Solder materials are used to provide a connection between electronic components and printed circuit boards (PCBs) using either the reflow or wave soldering process. As a board assembly passes through a reflow furnace the solder (initially in the form of solder paste) melts, reflows, then solidifies, and finally deforms between the chip and board. A number of defects may occur during this process such as flux entrapment, void formation, and cracking of the joint, chip or board. These defects are a serious concern to industry, especially with trends towards increasing component miniaturisation and smaller pitch sizes. This paper presents a modelling methodology for predicting solder joint shape, solidification, and deformation (stress) during the assembly process.

Details

Soldering & Surface Mount Technology, vol. 10 no. 2
Type: Research Article
ISSN: 0954-0911

Keywords

Article
Publication date: 1 February 1993

Thomas A. Peters

The purpose of this article is to present an overview of the history and development of transaction log analysis (TLA) in library and information science research. Organizing a…

Abstract

The purpose of this article is to present an overview of the history and development of transaction log analysis (TLA) in library and information science research. Organizing a literature review of the first twenty‐five years of TLA poses some challenges and requires some decisions. The primary organizing principle could be a strict chronology of the published research, the research questions addressed, the automated information retrieval (IR) systems that generated the data, the results gained, or even the researchers themselves. The group of active transaction log analyzers remains fairly small in number, and researchers who use transaction logs tend to use this method more than once, so tracing the development and refinement of individuals' uses of the methodology could provide insight into the progress of the method as a whole. For example, if we examine how researchers like W. David Penniman, John Tolle, Christine Borgman, Ray Larson, and Micheline Hancock‐Beaulieu have modified their own understandings and applications of the method over time, we may get an accurate sense of the development of all applications.

Details

Library Hi Tech, vol. 11 no. 2
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 1 June 1900

The decision of the Wolverhampton Stipendiary in the case of “Skim‐milk Cheese” is, at any rate, clearly put. It is a trial case, and, like most trial cases, the reasons for the…


Abstract

The decision of the Wolverhampton Stipendiary in the case of “Skim‐milk Cheese” is, at any rate, clearly put. It is a trial case, and, like most trial cases, the reasons for the judgment have to be based upon first principles of common‐sense, occasionally aided, but more often complicated, by already existing laws, which apply more or less to the case under discussion. The weak point in this particular case is the law which has just come into force, in which cheese is defined as the substance “usually known as cheese” by the public and any others interested in cheese. This reliance upon the popular fancy reads almost like our Government's war policy and “the man in the street,” and is a shining example of a trustful belief in the average common‐sense. Unfortunately, the general public have no direct voice in a police court, and so the “usually known as cheese” phrase is translated according to the fancy and taste of the officials and defending solicitors who may happen to be concerned with any particular case. Not having the general public to consult, the officials in this case had a war of dictionaries which would have gladdened the heart of Dr. JOHNSON; and the outcome of much travail was the following definition: cheese is “coagulated milk or curd pressed into a solid mass.” So far so good, but immediately a second definition question cropped up—namely, What is “milk?”—and it is at this point that the mistake occurred. There is no legal definition of new milk, but it has been decided, and is accepted without dispute, that the single word “milk” means an article of well‐recognised general properties, and which has a lower limit of composition below which it ceases to be correctly described by the one word “milk,” and has to be called “skim‐milk,” “separated milk,” “milk and water,” or other distinguishing names. The lower limits of fat and solids‐not‐fat are recognised universally by reputable public analysts, but there has been no upper limit of fat fixed.
Therefore, by the very definition quoted by the stipendiary, an article made from “skim‐milk” is not cheese, for “skim‐milk” is not “milk.” The argument that Stilton cheese is not cheese because there is too much fat would not hold, for there is no legal upper limit for fat; but if it did hold, it does not matter, for it can be, and is, sold as “Stilton” cheese, without any hardship to anyone. The last suggestion made by the stipendiary would, if carried out, afford some protection to the general public against their being cheated when they buy cheese. This suggestion is that the Board of Agriculture, who by the Act of 1899 have the legal power, should determine a lower limit of fat which can be present in cheese made from milk; but, as we have repeatedly pointed out, it is by the adoption of the Control system that such questions can alone be settled to the advantage of the producer of genuine articles and to that of the public.

Details

British Food Journal, vol. 2 no. 6
Type: Research Article
ISSN: 0007-070X

Book part
Publication date: 17 December 2003

John B. Guerard and Andrew Mark

In this study, we produce mean-variance efficient portfolios for various universes in the U.S. equity market, and show that the use of a composite of analyst earnings forecast…

Abstract

In this study, we produce mean-variance efficient portfolios for various universes in the U.S. equity market, and show that using a composite of analyst earnings forecasts, forecast revisions, and forecast breadth as a portfolio tilt variable, together with an R&D quadratic term, enhances stockholder wealth. The use of the R&D screen creates portfolios in which total active returns generally rise relative to the use of the analyst variable alone. Stock selection may not necessarily rise, as risk index and sector index returns are affected by the use of the R&D quadratic term. R&D expenditures of corporations may be integrated into a mean-variance efficient portfolio creation system to enhance stockholder returns and wealth. The use of an R&D variable enhances stockholder wealth relative to the use of capital expenditures or dividends as the quadratic term. The stockholder return implications of the R&D quadratic variable are particularly interesting given that most corporations allocate more of their resources to capital expenditures than to R&D.
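The construction the abstract outlines — expected returns tilted by an analyst composite plus a quadratic R&D term, fed into a mean-variance optimizer — can be sketched roughly as follows. This is a minimal illustration, not the authors' actual system; all data, coefficient values, and variable names (`ctef`, `gamma1`, `gamma2`, `risk_aversion`) are made-up assumptions.

```python
import numpy as np

# Minimal mean-variance sketch: expected returns are tilted by an analyst
# composite plus a quadratic R&D term, then weights follow the
# unconstrained Markowitz solution. All inputs are illustrative.
rng = np.random.default_rng(0)
n = 5                                    # five hypothetical stocks
base = rng.normal(0.06, 0.02, n)         # baseline expected returns
ctef = rng.normal(0.0, 1.0, n)           # analyst composite (standardised)
rd = rng.uniform(0.0, 0.1, n)            # R&D-to-sales ratio

gamma1, gamma2 = 0.01, 0.5               # hypothetical tilt coefficients
mu = base + gamma1 * ctef + gamma2 * rd ** 2   # tilted expected returns

cov = np.diag(rng.uniform(0.02, 0.05, n))      # toy diagonal covariance
risk_aversion = 4.0
w = np.linalg.solve(cov, mu) / risk_aversion   # unconstrained MV weights
w = w / w.sum()                                # normalise to fully invested

print(np.round(w, 3))
```

In practice such systems add long-only and turnover constraints, which require a quadratic-programming solver rather than the closed-form solution used here.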

Details

Research in Finance
Type: Book
ISBN: 978-1-84950-251-1

Book part
Publication date: 1 May 2012

John B. Guerard

Stock selection models often use momentum and analysts’ expectation data. We find that earnings forecast revisions and direction of forecast revisions are more important than…

Abstract

Stock selection models often use momentum and analysts’ expectation data. We find that earnings forecast revisions and direction of forecast revisions are more important than analysts’ forecasts in identifying mispriced securities. Investing with expectations data and momentum variables is consistent with maximizing the geometric mean and Sharpe ratio over the long run. Additional evidence is revealed that supports the use of multifactor models for portfolio construction and risk control. The anomalies literature can be applied in real-world portfolio construction in the U.S., international, and global equity markets during the 1998–2009 time period. Support exists for the use of tracking error at risk estimation procedures.

While perfection cannot be achieved in portfolio creation and modeling, the estimated model returns pass the Markowitz and Xu data mining corrections test and are statistically different from an average financial model that could have been used to select stocks and form portfolios. We found additional evidence to support the use of Arbitrage Pricing Theory (APT) and statistically-based and fundamentally-based multifactor models for portfolio construction and risk control. Markets are neither efficient nor grossly inefficient; statistically significant excess returns can be earned.
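The two long-run criteria the abstract mentions, the geometric mean return and the Sharpe ratio, are straightforward to compute. The monthly returns and risk-free rate below are made-up illustrative values, not data from the study.

```python
import numpy as np

# Geometric mean return and Sharpe ratio on hypothetical monthly returns.
returns = np.array([0.02, -0.01, 0.03, 0.015, -0.005, 0.01])
rf = 0.002                                    # assumed monthly risk-free rate

geo_mean = np.prod(1 + returns) ** (1 / len(returns)) - 1
sharpe = (returns.mean() - rf) / returns.std(ddof=1)

print(round(geo_mean, 5), round(sharpe, 3))
```

Maximizing the geometric mean penalizes volatile return streams (a large loss is hard to recover from under compounding), which is why it is paired with expectations and momentum variables as a long-run objective.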

Details

Research in Finance
Type: Book
ISBN: 978-1-78052-752-9

Article
Publication date: 1 February 1993

Beth Sandore

The ability to conduct unobtrusive observation of user searching is a potential strength of the method of information retrieval system analysis known as transaction log analysis…

Abstract

The ability to conduct unobtrusive observation of user searching is a potential strength of the method of information retrieval system analysis known as transaction log analysis (TLA). Transaction logs supply unequivocal information about what a user typed while searching. All other methods rely on self‐reporting, which, as Nielsen points out, is not always corroborated by the logs. Regardless of where in an institution information retrieval (IR) system evaluation takes place, TLA is a method that enables library staff at all levels to examine a variety of system and user‐related activities that are recorded on the log. Dominick suggested that TLA can enable the examination of three broad categories of activity: 1) system performance and resource utilization, 2) information retrieval performance, and 3) user interaction with the IR system. This article has been divided into several sections corresponding to functional areas in a library to suggest useful applications of TLA.
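Dominick's three categories can be illustrated with a toy transaction log: counting actions speaks to system resource utilization, while consecutive searches in one session are a crude proxy for user interaction behaviour (query reformulation). The log format, field layout, and reformulation heuristic below are hypothetical, not drawn from any real IR system.

```python
from collections import Counter

# Hypothetical log records: "timestamp session action query-or-target".
log = [
    "09:01 s1 SEARCH dickens",
    "09:02 s1 SEARCH charles dickens",
    "09:02 s1 DISPLAY record-204",
    "09:05 s2 SEARCH solar energy",
    "09:06 s2 SEARCH solar",
]

# Resource utilization: how often each action type occurs.
actions = Counter(line.split()[2] for line in log)

# User interaction: consecutive SEARCHes in the same session,
# a rough signal of query reformulation.
reformulations = sum(
    1 for a, b in zip(log, log[1:])
    if a.split()[1] == b.split()[1]
    and a.split()[2] == b.split()[2] == "SEARCH"
)

print(actions, reformulations)
```

Real transaction logs also record result counts and errors, which would support the second category (retrieval performance) directly.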

Details

Library Hi Tech, vol. 11 no. 2
Type: Research Article
ISSN: 0737-8831

Details

Circuit World, vol. 36 no. 2
Type: Research Article
ISSN: 0305-6120

Article
Publication date: 1 June 1984

Edwin Fleming, Allan Bunch and Wilfred Ashworth

THE European campaign to catch up with the United States and Japan in the provision of information technology took a major step forward at the end of February when the Council of…

Abstract

THE European campaign to catch up with the United States and Japan in the provision of information technology took a major step forward at the end of February when the Council of Ministers of the European Communities adopted the ESPRIT programme. ESPRIT stands for the ‘European Strategic Programme of Information Technology’, and the main areas of research cover microelectronics, software technology, advanced information processing, office systems, and computer integrated manufacturing. The programme will span the years 1984–88 and will cost 1,500,000,000 European Units of Account (£900,000,000), half of which will be contributed by the European Communities Commission, and half by industry. Although the European Community represents over thirty per cent of the world IT market, European industry provides only ten per cent of this market. For further details of the programme, contact Mr W Colin, IT Task Force, 200 Rue de la Loi, B 1049 Brussels, Belgium, tel 235 4477 or 235 2348, telecopier 230 1203, tx 25946.

Details

New Library World, vol. 85 no. 6
Type: Research Article
ISSN: 0307-4803
