Search results
1 – 10 of 48
Showkat Ahmad Shah and Md. Saiful Islam
Abstract
Purpose
A wetland is a place of tourist attraction, and tourism values play a key role in economic development. Among various services provided by a wetland, recreational services are increasingly valuable in the tourism sector. This paper aims to unfold the potential recreational values of the Dal Lake in Jammu and Kashmir, India.
Design/methodology/approach
The study uses the individual travel cost method (TCM) and assesses its impact on regional development in terms of income and employment generation. A sample of 200 tourists is selected through an on-site survey at Dal Lake, and the demand for recreational visits and its value are estimated by employing the truncated Poisson regression model (TPRM) and the un-truncated Poisson regression model (UTPRM). The consumer surplus is estimated, and tourists' benefits from visiting the wetland are explored.
Findings
On average, the estimated consumer surplus per visitor is Rs 6,250 (US$96.15) and Rs 25,000 (US$384.61) under the respective models. The annual total recreational value of the lake is estimated at Rs 1,713m (US$26m). This high consumer surplus (CS) and recreational value of the lake indicate a large demand for its recreational facilities.
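In a travel cost model with Poisson demand, per-visit consumer surplus is the negative reciprocal of the travel-cost coefficient, and the lake's total value is that surplus aggregated over annual visitation. The sketch below illustrates only this calculation; the coefficient and visit count are hypothetical, chosen to match the abstract's order of magnitude, and are not the paper's estimates.

```python
# Illustrative consumer-surplus arithmetic for a travel cost model (TCM).
# The coefficient and visitation figures are hypothetical, not the study's.

def consumer_surplus_per_visit(beta_tc: float) -> float:
    """In a (truncated) Poisson travel cost model with
    ln(expected visits) = a + beta_tc * travel_cost,
    per-visit consumer surplus is -1 / beta_tc (beta_tc < 0)."""
    if beta_tc >= 0:
        raise ValueError("travel-cost coefficient must be negative")
    return -1.0 / beta_tc

def annual_recreational_value(cs_per_visit: float, annual_visits: int) -> float:
    """Aggregate per-visit consumer surplus over total annual visitation."""
    return cs_per_visit * annual_visits

beta = -0.00016                           # hypothetical, in Rs^-1
cs = consumer_surplus_per_visit(beta)     # Rs 6,250 per visit
total = annual_recreational_value(cs, 274_000)
```

The two Poisson specifications in the paper differ only in the fitted coefficient they produce; the surplus formula applied afterwards is the same.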
Originality/value
The study is based on primary data and is thus original. The paper has implications for policymakers formulating sustainable management plans for the proper use of Dal Lake and for tourism development.
Details
Keywords
Abstract
Prediction of increased risk of suicide is difficult. We had the opportunity to follow up 20 patients receiving electroconvulsive therapy (ECT) because of severe depression. They filled in the Antonovsky sense of coherence test (SOC) and Beck depression inventory (BDI) before and after a series of ECT treatments. Seventeen surviving patients had a mean observation time of 20.6 months, whereas the three deceased patients had 11.3 months. The deceased had a lower mean age at onset of illness and a longer mean duration of disease. Other clinical parameters did not differ. The surviving patients had a significant decrease on the BDI from 35 to 18 (P<0.001) and an increase on the SOC test after ECT from 2.45 to 3.19 (P<0.001), indicating both less depression and better functioning in life. The deceased had a larger change on the BDI, from 32 to 13, which did not attain significance because of the small number of deceased. The SOC test, however, did not increase to a purported normal level; that is, it moved only from 2.43 to 2.87. Although the SOC scale has been shown to predict mortality in substance abusers, the SOC test has not been part of earlier reviews of predictive power. Tentatively, a pathologically low score on the SOC test may indicate a low sense of coherence in life that might increase the propensity for suicide. These preliminary results need replication in larger studies.
Olga Petricevic and Alain Verbeke
Abstract
Purpose
The purpose of this paper is to explore two distinct subsets of dynamic capabilities that need to be deployed when pursuing innovation through inter-organizational activities, respectively, in the contexts of broad networks and specific alliances. The authors draw distinctions and explore potential interdependencies between these two dynamic capability reservoirs, by integrating concepts from the theoretical perspectives they are derived from, but which have until now largely ignored each other – the social network perspective and the dynamic capabilities view.
Design/methodology/approach
The authors investigate nanotechnology-driven R&D activities in the 1995–2005 period for 76 publicly traded firms in the electronics and electrical equipment industry and in the chemicals and pharmaceuticals industry, which applied for 580 nanotechnology-related patents and engaged in 2,459 alliances during the observation period. The authors used zero-truncated Poisson regression as the estimation method.
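Zero-truncated Poisson regression is used because firms enter the sample only if they have at least one patent, so zero counts are never observed and the Poisson mass must be renormalized over positive counts. A minimal sketch of the underlying distribution, with a hypothetical rate parameter:

```python
# Sketch of the zero-truncated Poisson distribution underlying
# zero-truncated Poisson regression: counts are observed only when at
# least one event occurs (e.g. a firm with >= 1 patent application).
import math

def zt_poisson_pmf(k: int, lam: float) -> float:
    """P(Y = k | Y > 0) for a Poisson rate lam, defined for k >= 1."""
    if k < 1:
        return 0.0
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    # Renormalize by the probability of a positive count
    return poisson / (1.0 - math.exp(-lam))

# The truncated pmf redistributes the mass P(Y = 0) over k >= 1,
# so it sums to 1 over the positive integers (lam = 2.5 is hypothetical)
total = sum(zt_poisson_pmf(k, 2.5) for k in range(1, 60))
```

In the regression version, lam is modeled as exp(x'b) per firm and the truncated likelihood is maximized, so estimates are not biased by the missing zeros.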
Findings
The findings support conceptualizing dynamic capabilities as four distinct subsets, deployed for sensing or seizing purposes, and across the two different inter-organizational contexts. The findings also suggest potential synergies between these subsets of dynamic capabilities, with two subsets being more macro-oriented (i.e. sensing and seizing opportunities within networks) and the two other ones more micro-oriented (i.e. sensing and seizing opportunities within specific alliances).
Practical implications
The authors show that firms differ in their subsets of dynamic capabilities for pursuing different types of inter-organizational, boundary-spanning relationships (such as alliances vs broader network relationships), which ultimately affects their innovation performance.
Originality/value
The authors contribute to the growing body of work on dynamic capabilities and firm-specific advantages by unbundling the dynamic capability subsets, and investigating their complex interdependencies for managing different types of inter-organizational linkages. The main new insight is that the “linear model” of generating more innovations through higher inter-firm collaboration in an emerging field paints an erroneous picture of how high innovation performance is actually achieved.
Koraljka Golub, Osma Suominen, Ahmed Taiye Mohammed, Harriet Aagaard and Olof Osterman
Abstract
Purpose
In order to estimate the value of semi-automated subject indexing in operative library catalogues, the study aimed to investigate five different automated implementations of an open source software package on a large set of Swedish union catalogue metadata records, with Dewey Decimal Classification (DDC) as the target classification system. It also aimed to contribute to the body of research on aboutness and related challenges in automated subject indexing and evaluation.
Design/methodology/approach
On a sample of over 230,000 records with close to 12,000 distinct DDC classes, the open source tool Annif, developed by the National Library of Finland, was applied in the following implementations: lexical algorithm, support vector classifier, fastText, Omikuji Bonsai and an ensemble approach combining the former four. A qualitative study involving two senior catalogue librarians and three students of library and information studies was also conducted to investigate the value and inter-rater agreement of automatically assigned classes, on a sample of 60 records.
Findings
The best results were achieved using the ensemble approach, which reached 66.82% accuracy on the three-digit DDC classification task. The qualitative study confirmed earlier studies reporting low inter-rater agreement but also pointed to the potential value of automatically assigned classes as additional access points in information retrieval.
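Inter-rater agreement of the kind the qualitative study measures is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. A self-contained sketch follows; the DDC-style labels are invented toy data, not the study's 60 records.

```python
# Illustrative Cohen's kappa for two raters assigning classes to the
# same items. Labels below are hypothetical toy data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if both raters assigned labels independently
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1.0 - expected)

a = ["004", "025", "004", "839", "004", "025"]  # hypothetical DDC classes
b = ["004", "025", "839", "839", "025", "025"]
kappa = cohens_kappa(a, b)   # 0.52: moderate agreement on these toy labels
```

Kappa near 0 means agreement no better than chance, which is why low inter-rater agreement complicates using human indexing as a gold standard for evaluation.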
Originality/value
The paper presents an extensive study of automated classification in an operative library catalogue, accompanied by a qualitative study of automated classes. It demonstrates the value of applying semi-automated indexing in operative information retrieval systems.
Scott C. Hewitson, Jonathan D. Ritschel, Edward White and Gregory Brown
Abstract
Purpose
Recent legislation resulted in an elevation of operating and support (O&S) costs' relative importance for decision-making in Department of Defense programs. However, a lack of research in O&S hinders a cost analyst's ability to provide accurate sustainment estimates. Thus, the purpose of this paper is to investigate when Air Force aircraft O&S costs stabilize and to what degree. Next, a parametric O&S model is developed to predict median O&S costs for use as a new tool for cost analyst practitioners.
Design/methodology/approach
Utilizing the Air Force total ownership cost database, 44 programs consisting of 765 observations from 1996 to 2016 are analyzed. First, stability is examined in three areas: total O&S costs, the six O&S cost element structures and by aircraft type. Next, stepwise regression is used to predict median O&S costs per total active inventory (CPTAI) and identify influential variables.
Findings
Stability results vary by category, but stabilization generally occurs approximately five years after initial operating capability. The regression model explains 89.01 per cent of the variance in the data set when predicting median O&S CPTAI. Aircraft type, location of lead logistics center and unit cost are the three largest contributing factors.
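Stepwise regression adds (or drops) predictors one at a time based on how much each improves the fit. The sketch below shows only the simplest forward step, screening one-variable least-squares fits by R-squared; the predictors, names and numbers are hypothetical, not the Air Force total ownership cost data.

```python
# Minimal sketch of a forward stepwise screening step using simple
# one-variable linear fits. Variable names and data are hypothetical.

def r_squared(x, y):
    """R^2 of a one-variable least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

def best_predictor(candidates, y):
    """Pick the candidate variable with the highest R^2 against y."""
    return max(candidates, key=lambda name: r_squared(candidates[name], y))

# Hypothetical predictors of median O&S cost per total active inventory
y = [2.0, 4.1, 6.2, 7.9, 10.1]
candidates = {
    "unit_cost": [1.0, 2.0, 3.0, 4.0, 5.0],   # nearly linear in y
    "fleet_age": [3.0, 1.0, 4.0, 1.0, 5.0],   # weak relationship
}
chosen = best_predictor(candidates, y)        # "unit_cost"
```

A full stepwise procedure repeats this step on the residual fit (with entry and exit significance thresholds) until no remaining variable improves the model.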
Originality/value
Results from this research provide insight to cost analysts on when to start using actual O&S costs as a baseline for estimates in lieu of analogous cost program data and also derives a new parametric O&S estimating tool designed as a cross-check to current estimating methodologies.
James C. Ellis, Edward White, Jonathan D. Ritschel, Shawn M. Valentine, Brandon Lucas and Ian S. Cordell
Abstract
Purpose
There appears to be no empirical-based method in the literature for estimating if an engineering change proposal (ECP) will occur or the dollar amount incurred. This paper aims to present an empirically based approach to address this shortfall.
Design/methodology/approach
Using the cost assessment data enterprise database, 533 contracts were randomly selected via a stratified sampling plan to build two regression models: one to predict the likelihood of a contract experiencing an ECP and the other to determine the expected median per cent increase in baseline contract cost if an ECP was likely. Both models adopted a stepwise approach. A validation set was placed aside prior to any model building.
Findings
Not every contract incurs an ECP; approximately 80 per cent of the contracts in the database did not have one. The likelihood of an ECP and the additional amount incurred appear to be statistically independent of acquisition phase, branch of service, commodity, contract type and every other factor except the basic contract amount and the number of contract line item numbers; both of these latter variables equally affected the contract percentage increase due to an ECP. Overall, the combined model outperformed current anecdotal approaches to ECP withholds.
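The two regression models combine naturally into an expected-withhold calculation: the probability that an ECP occurs times the median percentage cost growth when one does, applied to the baseline contract amount. The sketch below shows only that two-part logic; the probability, percentage and contract value are hypothetical, not the paper's fitted outputs.

```python
# Sketch of the two-part "expected withhold" logic: an occurrence
# probability combined with the median % cost growth given an ECP.
# All inputs are hypothetical, not the paper's fitted coefficients.

def expected_ecp_withhold(p_ecp: float, median_pct_increase: float,
                          base_contract_cost: float) -> float:
    """Expected dollar withhold = P(ECP) * median % growth * base cost."""
    if not 0.0 <= p_ecp <= 1.0:
        raise ValueError("p_ecp must be a probability")
    return p_ecp * median_pct_increase * base_contract_cost

# Roughly 20% of contracts in the database incurred an ECP, so a
# hypothetical $10m contract with a 5% median increase might reserve:
withhold = expected_ecp_withhold(0.20, 0.05, 10_000_000)  # ~$100,000
```

In the paper's setup, p_ecp would come from the occurrence model and the percentage from the conditional cost-growth model, each as a function of basic contract amount and line-item count.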
Originality/value
This paper both serves as a published reference point for ECP withholds in the archival forum and presents an empirically based method for determining the per cent ECP withhold to use.
Salvador del Saz-Salazar, Salvador Gil-Pareja and María José García-Grande
Abstract
Purpose
This study, using a contingent valuation approach, aims to shed light on the economic evaluation of online learning during the first wave of the pandemic.
Design/methodology/approach
A sample of 959 higher education students was asked about their willingness-to-accept (WTA) a monetary compensation for the loss of well-being resulting from the unexpected and mandatory transition to the online space. In explaining WTA determinants, the authors test the appropriateness of the double-hurdle model against the alternative of a Tobit model and find that the factors affecting the participation decision are not the same as those that affect the quantity decision.
Findings
Results show that a vast majority of the respondents think that the abrupt transition to online learning is detrimental to them, while those willing to accept a monetary compensation account for 77% of the sample, with a mean WTA between €448 and €595. As expected, WTA decreases with income and age, and it increases if some member of the family unit is unemployed. By aggregating the mean WTA over the affected population, the total loss of well-being is obtained.
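The aggregation step is simple: the share of the population with a positive WTA times the mean WTA times the size of the affected population. The sketch below mirrors that structure; the 77% share and WTA range come from the abstract, while the population size is hypothetical.

```python
# Sketch of the WTA aggregation step. The share and per-person WTA
# values follow the abstract; the population size is hypothetical.

def total_welfare_loss(share_with_positive_wta: float,
                       mean_wta: float, population: int) -> float:
    """Aggregate mean willingness-to-accept over the affected population."""
    return share_with_positive_wta * mean_wta * population

# 77% of respondents willing to accept compensation, mean WTA of
# 448-595 euros, applied to a hypothetical 100,000 affected students:
low = total_welfare_loss(0.77, 448.0, 100_000)    # lower bound, in euros
high = total_welfare_loss(0.77, 595.0, 100_000)   # upper bound, in euros
```

Reporting the result as a range rather than a point estimate reflects the uncertainty in the mean WTA itself.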
Originality/value
To the best of the authors’ knowledge, to date, this method has not been used to value online learning in a WTA framework, much less in the particular context of the pandemic. Thus, based on the understanding that the economic evaluation of online learning could be very useful in providing guidance for decision-making, this paper contributes to the literature on the economic evaluation of higher education.
Abstract
Purpose
This paper tests whether Bayesian A/B testing yields better decisions than traditional Neyman-Pearson hypothesis testing. It proposes a model and tests it using a large, multiyear Google Analytics (GA) dataset.
Design/methodology/approach
This paper is an empirical study. Competing A/B testing models were used to analyze a large, multiyear GA dataset for a firm that relies entirely on its website and online transactions for customer engagement and sales.
Findings
Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the intellectual property fraud but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits. Frequentist A/B testing identified fraud in bounce rate at 5% significance and in bounces at 10% significance, but was unable to ascertain fraud at the standard significance cutoffs for scientific studies.
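For rate-style metrics such as bounce rate, Bayesian A/B testing typically places Beta posteriors on the two rates and reports the posterior probability that one exceeds the other, which is directly interpretable, unlike a p-value. The sketch below shows this standard Beta-Binomial approach with a uniform prior; the counts are hypothetical, not the firm's GA data, and this is not necessarily the paper's exact model.

```python
# Minimal Bayesian A/B sketch for a conversion-style rate (e.g. bounce
# rate): Beta(1,1) prior, Beta posteriors, compared by Monte Carlo.
# Counts are hypothetical, not the firm's Google Analytics data.
import random

def prob_b_worse(conv_a, n_a, conv_b, n_b, draws=20_000, seed=42):
    """P(rate_B > rate_A) under Beta(1 + conv, 1 + n - conv) posteriors."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        ra = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rb = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        hits += rb > ra
    return hits / draws

# Hypothetical: bounce rate jumps from 30% to 38% after the suspected
# fraud period; the posterior probability of a real increase is near 1.
p = prob_b_worse(300, 1000, 380, 1000)
```

The same posterior draws can be turned into credible intervals for the difference in rates, which is how economically meaningful loss estimates with uncertainty bounds are obtained.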
Research limitations/implications
None within the scope of the research plan.
Practical implications
Bayesian A/B tests of the data not only yielded a clear delineation of the timing and impact of the IP fraud but also calculated the loss of sales dollars, traffic and time on the firm's website, with precise confidence limits.
Social implications
Bayesian A/B testing can derive economically meaningful statistics, whereas frequentist A/B testing only provides p-values, whose meaning may be hard to grasp and whose misuse is widespread and has been a major topic in metascience. While misuse of p-values in scholarly articles may simply be grist for academic debate, the uncertainty surrounding the meaning of p-values in business analytics can actually cost firms money.
Originality/value
There is very little empirical research in e-commerce that uses Bayesian A/B testing. Almost all corporate testing is done via frequentist Neyman-Pearson methods.