The purpose of this study is to examine the facilitating and inhibiting influence of team-level negative affectivity and conscientiousness on a dyad of emergent states, adopting and comparing both the composition and compilation perspectives.
Data were collected over three time points from 410 undergraduate students nested within cross-functional project teams (N = 62). The data, including individual self-reports and judges’ ratings of team performance, were aggregated to the team level using both composition (mean) and compilation (skewness) approaches.
The findings indicate that mean levels of negative affectivity were associated with decreased psychological safety. Counterintuitively, the use of skewed conscientiousness suggests that having too many highly conscientious members can also be detrimental to psychological safety. Psychological safety, in turn, influences team potency and, ultimately, performance.
The results of this study highlight that the aggregation approach used is important. For example, the use of skewed (but not mean-level) conscientiousness brought an undetected and counterintuitive relationship to light. Future research should use compilation approaches in addition to composition approaches.
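As a rough illustration of the two aggregation approaches contrasted above, the sketch below computes a composition (mean) and a compilation (skewness) score for a single hypothetical team. The team and its scores are invented for illustration, not drawn from the study's data.

```python
# Hypothetical sketch: aggregating individual conscientiousness scores to
# the team level via a composition (mean) versus a compilation (skewness)
# approach. The team below is illustrative only.
from statistics import mean

def skewness(xs):
    # Adjusted Fisher-Pearson sample skewness (the form used by common
    # statistics packages; requires n >= 3).
    n = len(xs)
    m = mean(xs)
    s = (sum((x - m) ** 2 for x in xs) / (n - 1)) ** 0.5
    g1 = sum(((x - m) / s) ** 3 for x in xs) / n
    return (n * (n - 1)) ** 0.5 / (n - 2) * g1

team_conscientiousness = [4.8, 4.7, 4.9, 4.6, 2.1]  # one low outlier
print(round(mean(team_conscientiousness), 2))       # composition: 4.22
print(round(skewness(team_conscientiousness), 2))   # compilation: strongly negative
```

The mean masks the outlier, while the negative skew flags a team of mostly high scorers with one low scorer — the kind of configuration a composition approach alone would miss.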
Control charts are designed to detect a shift in the distribution of a process. Typically, these charts assume that the process data follow an approximately normal distribution or some other known distribution. However, if a data-generating process has a large proportion of zeros, that is, the data are intermittent, then traditional control charts may not adequately monitor these processes. The purpose of this study is to examine proposed control chart methods designed for monitoring a process with intermittent data, to determine whether they produce a sufficiently small percentage of false out-of-control signals. Forecasting techniques for slow-moving/intermittent product demand have been extensively explored, as intermittent data are common in operational management applications (Syntetos & Boylan, 2001, 2005, 2011; Willemain, Smart, & Schwarz, 2004). Extensions and modifications of traditional forecasting models have been proposed to model intermittent or slow-moving demand, including the associated trends, correlated demand, seasonality, and other characteristics (Altay, Litteral, & Rudisill, 2012). Croston’s (1972) method and its adaptations have been among the principal procedures used in these applications. This paper proposes adapting Croston’s methodology to design control charts, similar to Exponentially Weighted Moving Average (EWMA) control charts, that are effective in monitoring processes with intermittent data. A simulation study assesses the performance of the proposed control charts by evaluating their Average Run Lengths (ARLs) or, equivalently, their percentage of false positive signals.
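A minimal sketch of one way such a chart could be assembled, assuming Croston's decomposition (separate smoothing of nonzero demand size and inter-demand interval) feeding an EWMA statistic. All parameter choices (alpha, lambda, L) and the in-control estimation are illustrative; the paper's actual chart design may differ.

```python
# Hypothetical sketch: an EWMA-style chart on the Croston demand-rate
# estimate for an intermittent series, with a crude Monte Carlo estimate
# of the false-signal percentage for in-control series.
import random

def croston_rate(data, alpha=0.1):
    """Yield Croston demand-rate estimates z_t / p_t for a series."""
    z = p = None          # smoothed nonzero size and inter-demand interval
    q = 1                 # periods since the last nonzero demand
    for x in data:
        if x > 0:
            z = x if z is None else z + alpha * (x - z)
            p = q if p is None else p + alpha * (q - p)
            q = 1
        else:
            q += 1
        yield (z / p) if z is not None else 0.0

def false_signal_rate(n_series=200, n_periods=300, lam=0.2, L=3):
    """Percent of in-control intermittent series that trigger a signal."""
    signals = 0
    for _ in range(n_series):
        # In-control process: demand occurs with probability 0.3,
        # sizes are exponential with mean 1 (illustrative assumptions).
        data = [random.expovariate(1.0) if random.random() < 0.3 else 0.0
                for _ in range(n_periods)]
        rates = list(croston_rate(data))
        target = sum(rates) / len(rates)      # crude in-control centre line
        spread = (sum((r - target) ** 2 for r in rates) / len(rates)) ** 0.5
        ewma = target
        for r in rates:
            ewma = lam * r + (1 - lam) * ewma
            # Asymptotic EWMA control limits: target +/- L * sigma * sqrt(lam/(2-lam))
            if abs(ewma - target) > L * spread * (lam / (2 - lam)) ** 0.5:
                signals += 1
                break
    return 100.0 * signals / n_series

random.seed(1)
print(false_signal_rate())
```

Holding the process in control, the reported percentage is the empirical false-alarm rate; its reciprocal (in periods) corresponds to the in-control ARL the study evaluates.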
One aspect of forecasting intermittent demand for slow-moving inventory that has not been investigated in any depth in the literature is seasonality. This is due in part to the unreliability of computed seasonal indexes when many of the periods have zero demand. This chapter proposes an innovative approach that adapts Croston's (1972) method, one of the most popular techniques for forecasting items with intermittent demand, to data with a multiplicative seasonal component. A simulation is conducted to examine the effectiveness of the proposed extension of Croston's (1972) method to incorporate seasonality.
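One plausible way to fold a multiplicative seasonal component into Croston's method is to deseasonalize the nonzero demands, run the usual Croston smoothing, and reseasonalize the resulting demand-rate forecast. The sketch below takes that route; the indices, data, and smoothing constant are illustrative assumptions, not the chapter's actual design.

```python
# Hypothetical sketch: Croston's method on deseasonalized nonzero demands,
# with the forecast reseasonalized by multiplicative indices.

def croston_seasonal_forecast(data, indices, alpha=0.1):
    """One-step-ahead forecast per season, given multiplicative indices."""
    s = len(indices)
    z = p = None          # smoothed deseasonalized size and interval
    q = 1                 # periods since the last nonzero demand
    for t, x in enumerate(data):
        if x > 0:
            d = x / indices[t % s]            # deseasonalized demand size
            z = d if z is None else z + alpha * (d - z)
            p = q if p is None else p + alpha * (q - p)
            q = 1
        else:
            q += 1
    base = z / p if z is not None else 0.0    # deseasonalized demand rate
    return [base * idx for idx in indices]    # reseasonalized, per season

# Quarterly intermittent series with a high-demand fourth quarter.
data = [0, 0, 2, 8, 0, 3, 0, 9, 2, 0, 0, 10]
indices = [0.6, 0.8, 0.8, 1.8]                # must average to 1.0
print(croston_seasonal_forecast(data, indices))
```

The unreliability noted above shows up in estimating the indices themselves when most periods are zero; here they are simply assumed known.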
A Bayesian approach to demand forecasting to optimize spare parts inventory that requires periodic replenishment is examined relative to a non-Bayesian approach when the demand rate is unknown. That is, optimal inventory levels are decided using the two approaches at consecutive time intervals. Simulations were conducted to compare the total inventory cost under the Bayesian and non-Bayesian approaches with a theoretical minimum cost over a variety of demand rate conditions, including the challenging slow-moving or intermittent type of spare parts. Although Bayesian approaches are often recommended, this study’s results reveal that under conditions of large variability across the demand rates of spare parts, the inventory cost using the Bayesian model was not superior to that using the non-Bayesian approach. For spare parts with homogeneous demand rates, the inventory cost using the Bayesian model for forecasting was generally lower than that of the non-Bayesian model. Practitioners may still opt for the non-Bayesian model, since it does not require identifying a prior distribution for the demand.
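To make the Bayesian-versus-non-Bayesian contrast concrete, the sketch below uses a standard conjugate Gamma-Poisson update for an unknown demand rate. This is one textbook mechanism such a comparison might employ; the prior parameters and demand history are illustrative assumptions, not the study's setup.

```python
# Hypothetical sketch: conjugate Bayesian updating of an unknown Poisson
# demand rate, contrasted with the plain sample-mean (non-Bayesian) estimate.

def gamma_poisson_update(a, b, demands):
    """Update a Gamma(shape=a, rate=b) prior with observed Poisson counts."""
    # Posterior: shape gains the total demand, rate gains the number of periods.
    a_post = a + sum(demands)
    b_post = b + len(demands)
    return a_post, b_post

a, b = 1.0, 1.0                           # weakly informative prior, mean rate 1
demands = [0, 0, 2, 0, 1, 0, 0, 3]        # intermittent demand history
a, b = gamma_poisson_update(a, b, demands)
bayes_rate = a / b                        # posterior mean of the demand rate
naive_rate = sum(demands) / len(demands)  # non-Bayesian sample mean
print(round(bayes_rate, 3), round(naive_rate, 3))  # 0.778 0.75
```

The prior shrinks the estimate toward its mean; with heterogeneous demand rates across parts, a single shared prior can pull estimates the wrong way, consistent with the result reported above.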
Research in the area of forecasting and stock inventory control for intermittent demand is designed to provide robust models for the underlying demand, which appears at random, with some time periods having no demand at all. Croston’s method is a popular technique for these models; it uses two single exponential smoothing (SES) models, each involving a smoothing constant. A key issue is the choice of these smoothing constant values, given the sensitivity of the forecasts to changes in demand. Suggested values for the smoothing constants lie between 0.1 and 0.3. Since SES has been shown to be equivalent to an ARIMA(0,1,1) model, an optimal smoothing constant for SES can be derived from the fitted ARIMA model. This chapter conducts simulations to investigate whether using an optimal smoothing constant rather than a suggested value matters. Since SES is designed to be an adaptive method, data are simulated that vary between slow and fast demand.
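The sketch below contrasts a conventional smoothing constant (alpha = 0.1) with one chosen to minimize one-step-ahead squared error over a grid, which plays the role of the ARIMA(0,1,1)-derived optimum (for that model, alpha = 1 + theta). The demand series is illustrative.

```python
# Hypothetical sketch: fixed versus error-minimizing smoothing constant
# for simple exponential smoothing on an intermittent-style series.

def ses_sse(data, alpha):
    """Sum of squared one-step-ahead errors for simple exponential smoothing."""
    level = data[0]                 # initialize the level at the first value
    sse = 0.0
    for x in data[1:]:
        sse += (x - level) ** 2     # one-step-ahead forecast error
        level += alpha * (x - level)
    return sse

data = [12, 0, 0, 15, 0, 9, 0, 0, 20, 0, 11, 0]   # illustrative demand
alphas = [i / 100 for i in range(1, 100)]
best = min(alphas, key=lambda a: ses_sse(data, a))
print(best, round(ses_sse(data, best), 1), round(ses_sse(data, 0.1), 1))
```

By construction the grid-optimal alpha does no worse in-sample than the suggested 0.1; whether that difference matters out of sample across slow and fast demand is exactly the question the simulations address.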
The purpose of this study is to investigate forecast models using data provided by the Texas Commission on Environmental Quality (TCEQ) to monitor and develop forecast models for air quality management.
The models used in this research are the LDF (Fisher Linear Discriminant Function), QDF (Quadratic Discriminant Function), REGF (Regression Function), BPNN (Backpropagation Neural Network), and the RBFN (Radial Basis Function Network). The data used for model evaluations span a 12‐year period from 1990 to 2002. A control chart of the data is also examined for possible shifts in the distribution of ozone present in the Houston atmosphere during this time period.
Results of this research reveal variables that are significantly related to the ozone problem in the Houston area.
Models developed in this paper may assist air quality managers in modeling and forecasting ozone formation using meteorological variables.
This is the first study that has extensively compared the efficiency of LDF, QDF, REGF, BPNN and RBFN forecast models used for tracking air quality. Prior studies have evaluated Neural Networks, ARIMA and regression models.
Understanding large amounts of information and efficiently using that information in improved decision making has become increasingly challenging as businesses collect terabytes of data. Businesses have turned to emerging technology including neural networks, symbolic learning, and genetic algorithms. In the current study, four classification methods were compared using results from an Indonesian contraceptive‐method preference survey. The four methods are linear discriminant analysis, quadratic discriminant analysis, backpropagation neural networks, and modular neural networks. The modular neural network is a more complex and less frequently used neural network model. This comparative study gives insight into its performance on classifying observations from a challenging data set, the 1987 National Indonesia Contraceptive Prevalence Survey.
Project managers in information systems play a central role in the development, maintenance, and enhancement of software. Software metrics assist these managers in identifying opportunities for process improvement and help quantify software characteristics. Weaknesses in the traditional approaches to measuring reliability have led to the development of software metrics. The interpretation of software metrics can be critical to making effective responses in the management information systems’ decision‐making processes. This paper gives insight into the use and understanding of some software metrics.
When forecasting intermittent demand, the method derived by Croston (1972) is often cited. Previous research compared Croston's forecasting method favorably with simple exponential smoothing, assuming nonzero demand occurs as a Bernoulli process with a constant probability. In practice, however, the assumption of a constant probability for the occurrence of nonzero demand is often violated. This research investigates Croston's method under violation of that assumption. In a simulation study, forecasts derived using single exponential smoothing (SES) are compared with forecasts from a modification of Croston's method that uses double exponential smoothing to forecast the time between nonzero demands, assuming a normal distribution for demand size with different standard deviation levels. This methodology may be applicable to forecasting intermittent demand at the beginning or end of a product's life cycle.
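A minimal sketch of the modification described above: Croston's decomposition, but with the time between nonzero demands forecast by double (Holt's linear) exponential smoothing so that a trend in the demand probability can be tracked. The smoothing constants, the Holt variant chosen, and the example series are illustrative assumptions, not the study's exact specification.

```python
# Hypothetical sketch: modified Croston with a trended (Holt's linear)
# forecast of the inter-demand interval, for intermittent demand whose
# occurrence probability is drifting.

def modified_croston(data, alpha=0.1, beta=0.1):
    """One-step-ahead demand-rate forecast with a trended interval estimate."""
    z = None               # smoothed nonzero demand size (plain SES)
    level = trend = None   # Holt components for the inter-demand interval
    q = 1                  # periods since the last nonzero demand
    for x in data:
        if x > 0:
            z = x if z is None else z + alpha * (x - z)
            if level is None:
                level, trend = float(q), 0.0
            else:
                prev = level
                level = alpha * q + (1 - alpha) * (level + trend)
                trend = beta * (level - prev) + (1 - beta) * trend
            q = 1
        else:
            q += 1
    if z is None:
        return 0.0
    interval = max(level + trend, 1.0)   # forecast interval, floored at one period
    return z / interval

# Intervals between demands shrink over time (demand becoming more frequent),
# the kind of pattern seen at the start of a product's life cycle.
data = [5, 0, 0, 0, 0, 6, 0, 0, 0, 4, 0, 0, 5, 0, 7, 6]
print(round(modified_croston(data), 3))
```

With a negative interval trend, the trended estimate anticipates shorter gaps between demands, and the forecast demand rate rises faster than the constant-probability version would allow.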