Search results: 1–10 of over 1,000
This paper aims to develop and argue for a new research path to advance theory on incumbent firm adaptation to discontinuous technological change. Integrating variance and process epistemologies, it highlights the implications of distinguishing a firm's capacity to adapt from its adaptive choices.
The concepts and argument presented are based on an extensive review and synthesis of the literature on the phenomenon.
Distinguishing resource-based capacity variables from behavioral-based choice variables can fuel progress in the literature on incumbent adaptation to technological change. More attention is needed to the direct, proximate determinants of what occurs in the process of adaptation, e.g. the intermediate choices to adapt, the timing of adaptive actions and the selection of a means for adapting. Work must then associate specific choices with performance outcomes to complete both sides of the mediated cause-effect model connecting characteristics of the decision issue to performance.
Most studies of how incumbent firms adapt to discontinuous technological innovation have used variance analyses to identify firm and technology characteristics that explain adaptation outcomes. Focusing on characteristics and content, however, does not adequately explain why or how firms adapt. Scholars thus continue to lament the lack of clear, practical theory. I contend that one heretofore unaddressed reason for this dissatisfaction is that too much of the research base neglects the importance of understanding choices and the factors affecting them.
The purpose of this paper is to critically analyze the influence of the algorithm used by scholarly search engines (Garfield’s algorithm) and to propose metrics to improve it, so that science can proceed on a more democratic footing.
This paper used a snowball approach to collect data identifying the history and the logic behind Garfield’s algorithm. It then examines the foundations of the existing algorithm and the databases of the major scholarly search engines, and concludes by proposing new metrics intended to overcome these constraints and democratize scientific discourse.
This paper finds that the studied algorithm currently biases scientific discourse toward a narrow perspective, when it should instead take several researcher characteristics into consideration. It proposes replacing the h-index with the number of times the scholar’s most cited work has been cited. Finally, it proposes that works in languages other than English should be included.
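The proposed substitution can be illustrated with a short sketch. The two functions below contrast the h-index with the paper's proposed metric (citations of the single most cited work); the citation counts are invented purely for demonstration and are not from the paper.

```python
# Hypothetical illustration of the metric substitution proposed above.
# Citation counts are invented for demonstration only.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def max_citation(citations):
    """Proposed replacement: citations of the most cited single work."""
    return max(citations) if citations else 0

scholar_a = [50, 9, 8, 2, 1]     # one landmark paper, little else
scholar_b = [12, 11, 10, 10, 9]  # consistently cited body of work

print(h_index(scholar_a), max_citation(scholar_a))  # 3 50
print(h_index(scholar_b), max_citation(scholar_b))  # 5 12
```

As the invented example shows, the two metrics can rank the same scholars in opposite order: the h-index rewards a consistently cited body of work, while the proposed metric rewards a single landmark contribution.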
Broad comprehension of any phenomenon should rest on multiple perspectives; therefore, the inclusion of diverse metrics will extend scientific discourse.
Improving the existing algorithm will increase the chances of contact among different cultures, which stimulates rapid progress in the development of knowledge.
The value of this paper resides in demonstrating that the algorithm used by scholarly search engines biases the development of science. If updated as proposed here, science will become less biased and more bias-aware.
The Library Association of Ireland issued last month the first number of An Leabharlann, their new official journal. The title, for those of us who do not speak the language of Erin, means The Library. It is an extremely interesting venture which will be followed by librarians on the mainland with sympathetic curiosity. In particular our readers would be interested in the first of a series of articles by Father Stephen J. Brown, S.J., on Book Selection. The worthy Father lectures on this subject at University College, Dublin, in the Library School. It is mainly concerned with what should not be selected, and deals in vigorous fashion with the menace of much of current published stuff. No doubt Father Brown will follow with something more constructive. Mr. T. E. Gay, Chairman of the Association, discusses the need for a survey of Irish libraries and their resources. We agree that it is necessary. The Net Books Agreement, the Council, Notes from the Provinces, and an article in Erse—which we honestly believe that most of our Irish friends can read—and an excellent broadcast talk on the Library and the Student by Miss Christina Keogh, the accomplished Librarian of the Irish Central Library, make up a quite attractive first number. A list of broadcast talks given by members of the Association is included.
Examines Secretary of War Elihu Root’s 1903 reorganization of the US Army. Prior to Root, the Army suffered major organizational problems, including no central authority and an ambiguous chain‐of‐command. Post‐Civil War antimilitary sentiment had left the Army poorly funded, undermanned and barely capable of waging war on the Indians. In 1898, the ineptly fought Spanish‐American War highlighted Army deficiencies. Root’s modernization created the Chief of Staff, a senior general who reported to the Secretary of War, controlled the previously independent bureaus, prepared war plans and coordinated military activities with the Navy. Root also increased Army manpower and funding, reformed state militia into what is now the National Guard, and overhauled military training. Root laid the foundation for the complex defense management of the present day. His doctrine of civilian supremacy and concept of clear command relationships are as sound now as in 1903.
A recently published survey found that slightly over 14 million persons age 16 or over hunted in the United States in 1991 and spent over $12 billion on hunting. By comparison, the same survey determined there are over 35 million anglers. Another source estimates that nearly 18 million participants age seven and older hunted with firearms in 1992. That ranks hunting well below the participatory sports of swimming, bicycling, and bowling in popularity, but ahead of football, skiing, tennis, and target shooting. Estimates vary, and while these numbers are substantial, they indicate that hunters comprise well under ten percent of the total U.S. population. Hunters have come under increasing fire from animal rightists and others who claim the sport is cruel and unnecessary. Hundreds of articles and a number of books have been written in recent years on both sides of the issue, or, more accurately, all sides. Many writers as well as the population at large see hunting as not entirely “good” or “bad” but some of each, depending upon the context.
Utilizes the groupthink framework to analyse successive decisions made by the same group of senior executives of the National Broadcasting Company (NBC). These decisions related to NBC’s flagship late‐night television show, The Tonight Show. Based on this analysis, presents an enhanced groupthink framework that attempts to highlight why defective decision making occurred in one decision‐making situation but not in another, consecutive decision. Concludes that the answer lies in the group’s isolation from qualified experts and in the specific leader behaviours of stating a preferred decision choice and failing to encourage member opinions.
Simulation-based methods and simulation-assisted estimators have greatly increased the reach of empirical applications in econometrics. The received literature includes a thick layer of theoretical studies, including landmark works by Gourieroux and Monfort (1996), McFadden and Ruud (1994), and Train (2003), and hundreds of applications. An early and still influential application of the method is Berry, Levinsohn, and Pakes's (1995) (BLP) application to the U.S. automobile market in which a market equilibrium model is cleared of latent heterogeneity by integrating the heterogeneity out of the moments in a GMM setting. BLP's methodology is a baseline technique for studying market equilibrium in empirical industrial organization. Contemporary applications involving multilayered models of heterogeneity in individual behavior such as that in Riphahn, Wambach, and Million's (2003) study of moral hazard in health insurance are also common. Computation of multivariate probabilities by using simulation methods is now a standard technique in estimating discrete choice models. The mixed logit model for modeling preferences (McFadden & Train, 2000) is now the leading edge of research in multinomial choice modeling. Finally, perhaps the most prominent application in the entire arena of simulation-based estimation is the current generation of Bayesian econometrics based on Markov Chain Monte Carlo (MCMC) methods. In this area, heretofore intractable estimators of posterior means are routinely estimated with the assistance of simulation and the Gibbs sampler.
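The core move this abstract describes, integrating latent heterogeneity out of a model by averaging over simulated draws, can be sketched in a few lines. The example below is a minimal illustration, not any of the cited papers' actual estimators: it assumes a binary mixed logit with a single random taste coefficient b ~ N(mu, sigma²) and approximates the choice probability by Monte Carlo averaging of logit probabilities over draws, in the spirit of McFadden and Train's simulated mixed logit.

```python
import numpy as np

# Minimal sketch of a simulated mixed logit choice probability,
# assuming a binary choice with one random coefficient b ~ N(mu, sigma^2).
# The latent heterogeneity is "integrated out" by averaging the logit
# probability over simulated draws of b (a hypothetical example, not the
# BLP or Riphahn et al. estimator).

rng = np.random.default_rng(0)

def simulated_prob(x1, x2, mu, sigma, n_draws=10_000):
    """Approximate P(choose option 1) by averaging the conditional
    logit probability over Monte Carlo draws of the taste parameter."""
    b = rng.normal(mu, sigma, size=n_draws)   # draws of the random coefficient
    u1 = b * x1                               # utility of option 1 per draw
    u2 = b * x2                               # utility of option 2 per draw
    p1 = 1.0 / (1.0 + np.exp(u2 - u1))        # logit probability per draw
    return p1.mean()                          # Monte Carlo average

# Illustrative attribute levels and taste distribution (assumed values)
p = simulated_prob(1.0, 0.5, mu=1.0, sigma=0.5)
```

In a full simulation-assisted estimator, probabilities like `p` would enter a simulated likelihood or GMM moment condition, and `mu` and `sigma` would be chosen to best match the observed choices; increasing `n_draws` trades computation time for a smaller simulation error.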