Search results
1 – 10 of over 29,000
Karen A.F. Landale, Rene G. Rendon and Timothy G. Hawkins
Abstract
Purpose
The purpose of this research is to explore the effects of supplier selection method on key procurement outcomes such as procurement lead time (PLT), supplier performance and buyer team size.
Design/methodology/approach
Data were collected from a sample of 124 archival contract records from the US Department of Defense. A multiple regression model and multivariate analysis of covariance/analysis of covariance models were used to test the effects of source selection method on pertinent procurement outcomes.
Findings
The trade-off (TO) source selection method increases PLT, as does the number of evaluation factors and the number of proposals received. Substantially larger sourcing teams are also associated with the TO source selection method. Nonetheless, the TO method results in better supplier performance.
Practical implications
TO source selections yield superior supplier performance compared with low-bidder methods. However, they are costly in terms of time and personnel. Any assessment of supplier value should consider not only the price premium for higher performance but also the transaction costs associated with the TO method.
Originality/value
Very little research addresses a buying team’s evaluation of supplier-offered value ex ante and whether that value assessment materializes into actual value-added supplier performance. Low bidder tactics are pervasive, but price (i.e. sacrifice) is only one component of value. Benefits from superior supplier performance may yield greater overall value. If value is critical to the buyer, a TO source selection method – versus a low-bidder approach – is the appropriate tool because of higher supplier performance ex post.
Jared Nystrom, Raymond R. Hill, Andrew Geyer, Joseph J. Pignatiello and Eric Chicken
Abstract
Purpose
To present a method for imputing missing data from a chaotic time series, in this case lightning prediction data, and then to use the completed dataset to create lightning prediction forecasts.
Design/methodology/approach
Spatiotemporal kriging is used to estimate data that are autocorrelated in both space and time. The estimated data are then used in an imputation methodology to complete a dataset used in lightning prediction.
Findings
The techniques provided prove robust to the chaotic nature of the data, and the resulting time series displays evidence of smoothing while also preserving the signal of interest for lightning prediction.
Research limitations/implications
The research is limited to the data collected in support of weather prediction work through the 45th Weather Squadron of the United States Air Force.
Practical implications
These methods are important due to the increasing reliance on sensor systems. These systems often provide incomplete and chaotic data, which must be used despite collection limitations. This work establishes a viable data imputation methodology.
Social implications
Improved lightning prediction, as with any improved prediction method for natural weather events, can save lives and resources by prompting timely, cautious behavior.
Originality/value
To the authors' knowledge, this is a novel application of these imputation and forecasting methods.
Zachary Hornberger, Bruce Cox and Raymond R. Hill
Abstract
Purpose
Large/stochastic spatiotemporal demand data sets can prove intractable for location optimization problems, motivating the need for aggregation. However, demand aggregation induces errors. Significant theoretical research has been performed related to the modifiable areal unit problem and the zone definition problem. Minimal research has been accomplished related to the specific issues inherent to spatiotemporal demand data, such as search and rescue (SAR) data. This study provides a quantitative comparison of various aggregation methodologies and their relation to distance- and volume-based aggregation errors.
Design/methodology/approach
This paper introduces and applies a framework for comparing both deterministic and stochastic aggregation methods using distance- and volume-based aggregation error metrics. This paper additionally applies weighted versions of these metrics to account for the reality that demand events are nonhomogeneous. These metrics are applied to a large, highly variable, spatiotemporal demand data set of SAR events in the Pacific Ocean. Comparisons using these metrics are conducted between six quadrat aggregations of varying scales and two zonal distribution models using hierarchical clustering.
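As a rough illustration of distance- and volume-based aggregation error metrics of the kind described above, here is a minimal sketch with invented toy data; the paper's exact formulations and weighting scheme are not given in the abstract, so these definitions are assumptions for illustration.

```python
import math

def distance_error(points, centroids, assign, weights=None):
    """Weighted mean distance from each demand point to its zone centroid."""
    if weights is None:
        weights = [1.0] * len(points)
    total = sum(w * math.dist(p, centroids[z])
                for p, z, w in zip(points, assign, weights))
    return total / sum(weights)

def volume_error(actual_counts, aggregated_counts):
    """Sum of absolute per-zone differences in demand volume."""
    return sum(abs(a - b) for a, b in zip(actual_counts, aggregated_counts))

# Toy demand set: four events assigned to two zone centroids
points = [(0.0, 0.0), (1.0, 0.0), (10.0, 0.0), (11.0, 0.0)]
centroids = [(0.5, 0.0), (10.5, 0.0)]
assign = [0, 0, 1, 1]
d_err = distance_error(points, centroids, assign)
v_err = volume_error([2, 2], [2, 2])
```

Finer aggregations shrink the distance term (points sit closer to their centroids) but can inflate the volume term when zones become too small to estimate demand counts reliably, which matches the trade-off the study reports.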
Findings
As quadrat fidelity increases, the distance-based aggregation error decreases, while the two deliberate zonal approaches further reduce this error while using fewer zones. However, the higher fidelity aggregations detrimentally affect volume error. Additionally, by splitting the SAR data set into training and test sets, this paper shows the stochastic zonal distribution aggregation method is effective at simulating actual future demands.
Originality/value
This study indicates that no single best aggregation method exists. By quantifying trade-offs in aggregation-induced errors, practitioners can select the method that minimizes the errors most relevant to their study. The study also quantifies the ability of a stochastic zonal distribution method to effectively simulate future demand data.
Adam Biggs and Joseph Hamilton
Abstract
Purpose
Evaluating warfighter lethality is a critical aspect of military performance. Raw metrics such as marksmanship speed and accuracy can provide some insight, yet interpreting subtle differences can be challenging. For example, is a speed difference of 300 milliseconds more important than a 10% accuracy difference on the same drill? Marksmanship evaluations must have objective methods to differentiate between critical factors while maintaining a holistic view of human performance.
Design/methodology/approach
Monte Carlo simulations are one method to circumvent speed/accuracy trade-offs within marksmanship evaluations. They can accommodate both speed and accuracy implications simultaneously without needing to hold one constant for the sake of the other. Moreover, Monte Carlo simulations can incorporate variability as a key element of performance. This approach thus allows analysts to determine consistency of performance expectations when projecting future outcomes.
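The speed/accuracy trade-off described above can be sketched with a toy Monte Carlo drill. All parameters below (shot-time distribution, hit probabilities, time budget) are invented for illustration and are not taken from the paper; the sketch simply shows how simulation lets both dimensions vary at once.

```python
import random

def simulate_drill(mean_shot_time, sd_shot_time, hit_prob,
                   time_budget=5.0, n_trials=10_000, seed=42):
    """Monte Carlo estimate of expected hits in a fixed-time drill.

    Each trial draws per-shot times from a normal distribution and
    scores each completed shot as a hit with probability hit_prob.
    """
    rng = random.Random(seed)
    total_hits = 0
    for _ in range(n_trials):
        elapsed = 0.0
        while True:
            shot_time = max(0.1, rng.gauss(mean_shot_time, sd_shot_time))
            if elapsed + shot_time > time_budget:
                break
            elapsed += shot_time
            if rng.random() < hit_prob:
                total_hits += 1
    return total_hits / n_trials

# Is 300 ms of speed worth 10% of accuracy? Compare two notional shooters.
fast_less_accurate = simulate_drill(0.7, 0.1, 0.80)
slow_more_accurate = simulate_drill(1.0, 0.1, 0.90)
```

Under these hypothetical numbers the faster shooter gets more rounds off inside the budget and wins on expected hits, but shifting the time budget or the variability can reverse the ranking, which is exactly the kind of question the simulation approach is meant to answer.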
Findings
The review divides outcomes into theoretical overview and practical implication sections. Each aspect of the Monte Carlo simulation can be addressed separately, reviewed and then incorporated as a potential component of small arms combat modeling. This application allows new human performance practitioners to adopt the method more quickly for different applications.
Originality/value
Performance implications are often presented as inferential statistics. By using the Monte Carlo simulations, practitioners can present outcomes in terms of lethality. This method should help convey the impact of any marksmanship evaluation to senior leadership better than current inferential statistics, such as effect size measures.
Mei-Ling Cheng, Ching-Wu Chu and Hsiu-Li Hsu
Abstract
Purpose
This paper aims to compare different univariate forecasting methods to provide a more accurate short-term forecasting model of the crude oil price as a reference for managers.
Design/methodology/approach
Six different univariate methods, namely the classical decomposition model, the trigonometric regression model, the regression model with seasonal dummy variables, the grey forecast, the hybrid grey model and the seasonal autoregressive integrated moving average (SARIMA), have been used.
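Of the six methods, the grey forecast lends itself to a compact sketch. The following is a minimal textbook GM(1,1) implementation (the paper's hybrid grey variant is not reproduced here, and the series below is invented): accumulate the series, fit the whitening equation by least squares, then difference the fitted accumulated series to recover point forecasts.

```python
import math

def gm11_forecast(x0, steps=1):
    """Grey GM(1,1) forecast: fit on series x0, predict `steps` ahead."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # Background values: consecutive means of the accumulated series
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]
    y = x0[1:]
    m = len(z)
    # Least-squares normal equations for x0[k] + a*z[k] = b
    szz = sum(v * v for v in z)
    sz = sum(z)
    szy = sum(v * w for v, w in zip(z, y))
    sy = sum(y)
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det      # development coefficient
    b = (szz * sy - sz * szy) / det    # grey input
    def x1_hat(k):                     # fitted accumulated value at index k
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]

series = [1.0, 1.1, 1.21, 1.331]       # ~10% growth per period
next_value = gm11_forecast(series)[0]  # close to 1.4641
```

Because GM(1,1) fits an exponential trend from very few observations, it suits the small-sample setting the study highlights.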
Findings
The authors found that the grey forecast is a reliable forecasting method for crude oil prices.
Originality/value
The contribution of this research study is using a small size of data and comparing the forecasting results of the six univariate methods. Three commonly used evaluation criteria, mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percent error (MAPE), were adopted to evaluate the model performance. The outcome of this work can help predict the crude oil price.
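The three evaluation criteria named above are straightforward to compute; here is a minimal sketch with invented actual/forecast values.

```python
import math

def mae(actual, forecast):
    """Mean absolute error."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean squared error."""
    return math.sqrt(sum((a - f) ** 2
                         for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    """Mean absolute percent error (assumes no zero actuals)."""
    return 100 * sum(abs((a - f) / a)
                     for a, f in zip(actual, forecast)) / len(actual)

actual = [100.0, 110.0, 120.0]
forecast = [98.0, 113.0, 118.0]
scores = (mae(actual, forecast), rmse(actual, forecast),
          mape(actual, forecast))
```

RMSE penalizes large misses more heavily than MAE, and MAPE is scale-free, which is why studies typically report all three rather than relying on one.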
Cengiz Bahadir Karahan and Levent Kirval
Abstract
Purpose
Turkey is a maritime country with its current merchant fleet and shipyards, geographical location, young population and growth potential. Clustering, being one of the important improvement methods of global competition power, is widely used in the maritime sector. Analysing the clustering level and potential of Istanbul, which is the major city of Turkey, in regard to economic and social aspects is a basic step for increasing global competitiveness in this sector. This study aims to measure the clustering level of Istanbul’s maritime sector and also define the effect of clustering level on firm performance.
Design/methodology/approach
The clustering levels of Istanbul’s maritime transportation and supporting firms, shipyards and maritime equipment manufacturers are measured by means of a survey based on Porter’s diamond theory in this paper. The relationship between clustering level and firm performance is defined by using simple linear regression and fuzzy linear regression methods. The weights of the criteria are calculated by means of entropy method.
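The entropy method for criteria weights mentioned above follows a standard formulation, sketched below; the firms and survey criteria in the example are hypothetical, not the paper's data.

```python
import math

def entropy_weights(matrix):
    """Entropy weighting over decision criteria (columns).

    Criteria whose values vary more across alternatives carry more
    information and receive larger weights.
    """
    m = len(matrix)            # alternatives (rows)
    n = len(matrix[0])         # criteria (columns)
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Normalized Shannon entropy of the criterion's proportions
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(m)
        divergences.append(1 - e)
    s = sum(divergences)
    return [d / s for d in divergences]

# Toy matrix: 3 firms scored on 2 criteria; criterion 2 varies more
scores = [[5.0, 1.0],
          [5.0, 5.0],
          [5.0, 9.0]]
w = entropy_weights(scores)
```

A criterion on which every firm scores identically gets weight zero: it cannot discriminate between alternatives, so all the weight flows to the criterion that does.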
Findings
It is concluded that despite its deficits, Istanbul’s maritime sector has significant potential to become a major maritime cluster not only in its region but also worldwide. The effect of clustering level on firm performance was observed to be statistically significant, but not high. The results of the simple linear regression and fuzzy linear regression methods are compared.
Originality/value
To the authors' knowledge, this paper is the first study using fuzzy linear regression and entropy methods to analyse maritime clusters. It evaluates the effect of clustering level on firm performance in the case of Istanbul's maritime sector.
Petar Jackovich, Bruce Cox and Raymond R. Hill
Abstract
Purpose
This paper aims to define the class of fragment constructive heuristics used to compute feasible solutions for the traveling salesman problem (TSP) into edge-greedy and vertex-greedy subclasses. As these subclasses of heuristics can create subtours, two known methodologies for subtour elimination on symmetric instances are reviewed and are expanded to cover asymmetric problem instances. This paper introduces a third novel subtour elimination methodology, the greedy tracker (GT), and compares it to both known methodologies.
Design/methodology/approach
Computational results for all three subtour elimination methodologies are generated across 17 symmetric instances ranging in size from 29 vertices to 5,934 vertices, as well as 9 asymmetric instances ranging in size from 17 to 443 vertices.
Findings
The results demonstrate the GT is the fastest method for preventing subtours for instances below 400 vertices. Additionally, a distinction between fragment constructive heuristics and the subtour elimination methodology used to ensure the feasibility of resulting solutions enables the introduction of a new vertex-greedy fragment heuristic called ordered greedy.
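An edge-greedy fragment construction with subtour elimination can be sketched generically. The abstract does not describe the greedy tracker's bookkeeping, so the classic disjoint-set (union-find) check stands in here as one of the known methodologies; the distance matrix is a toy instance.

```python
from itertools import combinations

def greedy_edge_tour(dist):
    """Edge-greedy fragment heuristic for the symmetric TSP.

    Adds the cheapest edges one by one, skipping any edge that would
    give a vertex degree 3 or close a cycle prematurely (checked with
    a disjoint-set forest), then closes the Hamiltonian path.
    """
    n = len(dist)
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    degree = [0] * n
    edges = sorted(combinations(range(n), 2),
                   key=lambda e: dist[e[0]][e[1]])
    chosen = []
    for u, v in edges:
        if len(chosen) == n - 1:
            break
        if degree[u] < 2 and degree[v] < 2 and find(u) != find(v):
            parent[find(u)] = find(v)
            degree[u] += 1
            degree[v] += 1
            chosen.append((u, v))
    # Close the remaining path into a tour
    ends = [i for i in range(n) if degree[i] < 2]
    chosen.append((ends[0], ends[1]))
    return chosen

# 4 cities on a unit square (toy instance); optimal tour length is 4
D = [[0, 1, 2, 1],
     [1, 0, 1, 2],
     [2, 1, 0, 1],
     [1, 2, 1, 0]]
tour_edges = greedy_edge_tour(D)
tour_length = sum(D[u][v] for u, v in tour_edges)
```

The per-edge feasibility check is where subtour elimination schemes differ, which is why the paper's timing comparison across methodologies matters at scale.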
Originality/value
This research has two main contributions: first, it introduces a novel subtour elimination methodology. Second, the research introduces the concept of ordered lists which remaps the TSP into a new space with promising initial computational results.
Joshua L. McDonald, Edward D. White, Raymond R. Hill and Christian Pardo
Abstract
Purpose
The purpose of this paper is to demonstrate an improved method for forecasting US Army recruiting.
Design/methodology/approach
Time series methods, regression modeling, principal components analysis and marketing research are used in this paper.
Findings
This paper demonstrates how multiple statistical methods, applied in a forecasting context, can account for the effects of inputs that are controlled to some degree by a decision maker.
Research limitations/implications
This work informs US Army recruiting leadership on how the improved methodology can improve the recruitment process.
Practical implications
An improved US Army analytical technique for forecasting recruiting goals.
Originality/value
This work culls data from open sources, using a zip-code-based classification method to develop more comprehensive forecasting methods with which US Army recruiting leaders can better establish recruiting goals.
Denis S. Clayson, Alfred E. Thal, Jr and Edward D. White III
Abstract
Purpose
The purpose of this study was to investigate the stability of the cost performance index (CPI) for environmental remediation projects as the topic is not addressed in the literature. CPI is defined as the earned value of work performed divided by the actual cost of the work, and CPI stability represents the point in time in a project after which the CPI varies by less than 20 percent (measured in different ways).
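The CPI definition above, and one common stability rule, can be sketched directly. The paper measures stability in different ways (and tests it with the Wilcoxon signed-rank test), so the ±20-percent-of-final rule and the monthly figures below are assumptions for illustration only.

```python
def cpi_series(earned, actual):
    """Cumulative cost performance index: earned value / actual cost."""
    cum_e = cum_a = 0.0
    out = []
    for e, a in zip(earned, actual):
        cum_e += e
        cum_a += a
        out.append(cum_e / cum_a)
    return out

def stability_point(cpi, band=0.20):
    """First month index after which CPI stays within +/- band of final CPI."""
    final = cpi[-1]
    for i in range(len(cpi)):
        if all(abs(c - final) <= band * final for c in cpi[i:]):
            return i
    return len(cpi) - 1

# Hypothetical monthly earned value and actual cost for one project
earned = [10, 30, 60, 80, 100, 120]
actual = [25, 45, 70, 85, 105, 125]
cpi = cpi_series(earned, actual)
stab = stability_point(cpi)   # month index where CPI settles
```

Dividing the stability month by the project's total duration gives the percent-complete figure the study reports (41 percent on average for these environmental projects).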
Design/methodology/approach
After collecting monthly earned value management (EVM) data for 136 environmental remediation projects from a United States federal agency in fiscal years 2012 and 2013, the authors used the nonparametric Wilcoxon signed-rank test to analyze CPI stability. The authors also used nonparametric statistical comparisons to identify any significant relationships between CPI stability and independent variables representing project and contract characteristics.
Findings
The CPI for environmental projects did not stabilize until the projects were 41 percent complete with respect to project duration. The most significant factors contributing to CPI stability were categorized into the following managerial insights: contractor qualifications, communication, stakeholder engagement, contracting strategy, competition, EVM factors, and macro project factors.
Originality/value
As CPI stability for environmental remediation projects has not been reported in the literature, this research provides new insights to help project managers understand when the CPIs of environmental remediation projects stabilize and which factors have the most impact on CPI stability.
Manuel Rossetti, Juliana Bright, Andrew Freeman, Anna Lee and Anthony Parrish
Abstract
Purpose
This paper is motivated by the need to assess the risk profiles associated with the substantial number of items within military supply chains. The scale of supply chain management processes creates difficulties in both the complexity of the analysis and in performing risk assessments based on manual (human analyst) assessment methods. Thus, analysts require methods that can be automated and that can incorporate ongoing operational data on a regular basis.
Design/methodology/approach
The approach taken to address the identification of supply chain risk within an operational setting is based on aspects of multiobjective decision analysis (MODA). The approach constructs a risk and importance index for supply chain elements based on operational data. These indices are commensurate in value, leading to interpretable measures for decision-making.
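An additive MODA index of the kind described can be sketched as follows. The criteria, weights and scales here are hypothetical stand-ins; the paper's actual value functions and operational measures are not given in the abstract.

```python
def normalize(values, higher_is_riskier=True):
    """Min-max normalize a raw measure onto a common 0-1 value scale."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    scaled = [(v - lo) / span for v in values]
    return scaled if higher_is_riskier else [1 - s for s in scaled]

def weighted_index(criteria, weights):
    """Additive MODA index: weighted sum of normalized criterion scores."""
    n_items = len(criteria[0])
    return [sum(w * c[i] for w, c in zip(weights, criteria))
            for i in range(n_items)]

# Three items scored on lead time (days) and number of suppliers
lead_time = normalize([30, 90, 180])                          # longer = riskier
n_suppliers = normalize([5, 2, 1], higher_is_riskier=False)   # fewer = riskier
risk = weighted_index([lead_time, n_suppliers], [0.6, 0.4])
```

Because every criterion is mapped onto the same 0-1 value scale before weighting, the resulting risk and importance indices are commensurate across items, which is what makes them directly comparable for decision-making.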
Findings
Risk and importance indices were developed for the analysis of items within an example supply chain. Using the data on items, individual MODA models were formed and demonstrated using a prototype tool.
Originality/value
To better prepare risk mitigation strategies, analysts require the ability to identify potential sources of risk, especially in times of disruption such as natural disasters.