Search results
1 – 10 of 33
Abstract
Purpose
The purpose of the paper is the construction of confidence intervals for the ratio of the values of process capability index Cpm for two processes. These confidence intervals can be used for comparing the capability of any pair of competitive processes.
Design/methodology/approach
Two methods for constructing confidence intervals for the ratio of the values of process capability index Cpm for two processes are proposed. The suggested techniques are based on a two-step approximation of the doubly non-central F distribution. Their performance is tested via simulation.
Findings
The performance of the suggested techniques seems to be rather satisfactory even for small samples, as illustrated through the use of simulated data.
Practical implications
The suggested techniques can be implemented in real-world applications, since they can be used to compare the capability of any pair of competitive processes.
Originality/value
The paper presents two new methods for constructing confidence intervals for the ratio of the values of process capability index Cpm for two processes.
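As a point of reference for what is being compared, the index itself is straightforward to compute. A minimal Python sketch follows; the specification limits, target, and simulated process data are illustrative, and the result is only a point estimate of the ratio, not one of the paper's confidence intervals:

```python
import math
import random

def cpm(data, lsl, usl, target):
    """Taguchi capability index: Cpm = (USL - LSL) / (6 * sqrt(mean((x - T)^2)))."""
    msd = sum((x - target) ** 2 for x in data) / len(data)  # mean squared deviation from target
    return (usl - lsl) / (6.0 * math.sqrt(msd))

random.seed(1)
lsl, usl, target = 9.0, 11.0, 10.0
proc_a = [random.gauss(10.0, 0.15) for _ in range(50)]  # centred, tight process
proc_b = [random.gauss(10.2, 0.25) for _ in range(50)]  # off-target, wider process

ratio = cpm(proc_a, lsl, usl, target) / cpm(proc_b, lsl, usl, target)
print(round(ratio, 3))  # ratio > 1 favours process A
```

A confidence interval for this ratio is exactly what the paper constructs, via its two-step approximation of the doubly non-central F distribution.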
Xin Li, Jiming Guo and Lv Zhou
Abstract
Purpose
Global positioning system (GPS) kinematic positioning suffers from performance degradation in constrained environments such as urban canyons, which then restricts the application of high-precision vehicle positioning and navigation within the city. In December 2012, the BeiDou Navigation Satellite System (BDS) regional service was announced, and the combined BDS/GPS kinematic positioning has been enabled in the Asia-Pacific area. Previous studies have mainly focused on the performance evaluations of combined BDS/GPS static positioning. Not much work has been performed for kinematic vehicle positioning under constrained observation conditions. This study aims to analyze the performance of BDS/GPS kinematic vehicle positioning in various conditions.
Design/methodology/approach
In this study, three vehicle experiments under three observation conditions, an open suburban area, a less dense non-central urban area and a dense central urban area, are investigated using both the code-based differential global navigation satellite system (DGNSS) and phase-based real-time kinematic (RTK) modes. The comparison between combined BDS/GPS and GPS-only vehicle positioning solutions is conducted in terms of positioning availability and positioning precision.
Findings
Numerical results show that the combined BDS/GPS system significantly outperforms the GPS-only system under poor observation conditions, whereas the improvement is less significant under good observation conditions.
Originality/value
This paper studies the performance of combined BDS/GPS kinematic relative positioning under various observation conditions.
Belmiro P.M. Duarte and Pedro M. Saraiva
Abstract
Purpose
This paper seeks to present an optimization‐based approach to design acceptance sampling plans by variables for controlling non‐conforming proportions in lots of items. Simple and double sampling plans with s known and unknown are addressed. Normal approximation distributions proposed by Wallis are employed to handle plans with s unknown. The approach stands on the minimization of the average sampling number (ASN) taking into account the constraints arising from the two point conditions on the operating characteristic (OC) curve. The resulting optimization problems fall under the class of mixed integer non‐linear programming (MINLP), and are solved employing GAMS. The results obtained strongly agree with classical acceptance sampling plans found in the literature, although outperforming them in some cases, and providing a general approach to address other cases.
Design/methodology/approach
The design of acceptance sampling plans by variables for non‐conforming proportions is formulated as an optimization problem that minimizes the ASN subject to constraints on the acceptance probability at the controlled points of the OC curve; the resulting mathematical programming problems are then solved with mathematical programming algorithms.
Findings
The results are in strong agreement with acceptance sampling plans available in the literature. The approach presented here outperforms the classical plans in some cases and its generality allows one to design other plans without the requirement of additional relations between the parameters and intensive enumerative algorithms.
Originality/value
The paper presents an optimization‐based approach to design robust acceptance sampling plans by variables for non‐conforming proportions that allows a general treatment and disregards the need for computational intensive enumerative‐based procedures.
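For the σ-known single sampling case, the two-point conditions on the OC curve can also be satisfied by a small brute-force search over n. This is a stand-in for, not a reproduction of, the paper's MINLP/GAMS formulation, and the quality levels and risks below are illustrative:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi_inv(q, lo=-10.0, hi=10.0):
    """Inverse of phi by bisection (accurate enough for a sketch)."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if phi(mid) < q:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def design_known_sigma(p1, p2, alpha, beta, n_max=200):
    """Smallest n (with acceptance constant k) such that Pa(p1) >= 1 - alpha and
    Pa(p2) <= beta for the sigma-known single sampling plan by variables:
    accept the lot if (USL - xbar) / sigma >= k, so Pa(p) = phi(sqrt(n) * (z_{1-p} - k))."""
    z1, z2 = phi_inv(1.0 - p1), phi_inv(1.0 - p2)
    for n in range(2, n_max + 1):
        rt = math.sqrt(n)
        k_hi = z1 - phi_inv(1.0 - alpha) / rt  # largest k meeting the p1 condition
        k_lo = z2 - phi_inv(beta) / rt         # smallest k meeting the p2 condition
        if k_lo <= k_hi:
            return n, (k_lo + k_hi) / 2.0
    return None

n, k = design_known_sigma(p1=0.01, p2=0.05, alpha=0.05, beta=0.10)
```

For these inputs the search reproduces the textbook plan size; minimizing the ASN across single and double plans under the same OC constraints is the generalization the paper automates.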
David Ray, John Gattorna and Mike Allen
Abstract
Preface The functions of business divide into several areas and the general focus of this book is on one of the most important although least understood of these—DISTRIBUTION. The particular focus is on reviewing current practice in distribution costing and on attempting to push the frontiers back a little by suggesting some new approaches to overcome previously defined shortcomings.
Abstract
Selects one of Hamaker's procedures for deriving a “σ” method (i.e. known process standard deviation) double sampling plan and exploits some of its properties to develop a system of “s” method (i.e. unknown process standard deviation) double sampling plans by variables that match the system of single specification limit “s” method single sampling plans of the current edition of the international standard on sampling by variables, ISO 3951: 1989. The new system is presented in two forms, the second of which may also be used for combined double specification limits and multivariate acceptance sampling.
Diandian Ma, Xiaojing Song, Mark Tippett and Thu Phuong Truong
Abstract
Purpose
The purpose of this study is to determine distributional properties of the accumulated rate of interest when the instantaneous rate of interest evolves in terms of the Cox et al. (1985) square root process.
Design/methodology/approach
The law of iterated (or double) expectations is used to determine the mean and variance of the accumulated rate of interest on a cash management (or loan) account when interest accumulates at the instantaneous rates of interest implied by the square root process.
Findings
This study demonstrates how the accumulated rate of interest does not satisfy the strong mixing conditions necessary for convergence in distribution to the normal density function.
Originality/value
This study has strong educational value in determining distributional properties of the accumulated rate of interest when the instantaneous rate of interest evolves in terms of the Cox et al. (1985) square root process and demonstrating how the accumulated rate of interest does not satisfy the strong mixing conditions necessary for convergence in distribution to the normal density function.
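As a concrete check, the mean that iterated expectations delivers for the square root process, E[integral of r_s over [0, T]] = theta*T + (r0 - theta)*(1 - exp(-kappa*T))/kappa (a standard property of the CIR model), can be compared with a crude Monte Carlo discretisation. This is a sketch under assumed parameter values, not the paper's calculation:

```python
import math
import random

def simulate_accumulated_rate(r0, kappa, theta, sigma, T, steps, paths, seed=0):
    """Average of the accumulated rate integral(0..T) r_s ds over Euler-discretised
    CIR paths: dr = kappa*(theta - r) dt + sigma*sqrt(r) dW."""
    rng = random.Random(seed)
    dt = T / steps
    total = 0.0
    for _ in range(paths):
        r, acc = r0, 0.0
        for _ in range(steps):
            acc += r * dt
            r += kappa * (theta - r) * dt + sigma * math.sqrt(max(r, 0.0) * dt) * rng.gauss(0.0, 1.0)
            r = abs(r)  # reflection keeps the discretised rate non-negative
        total += acc
    return total / paths

# Closed-form mean via iterated expectations
r0, kappa, theta, sigma, T = 0.03, 0.5, 0.05, 0.08, 5.0  # illustrative values
exact = theta * T + (r0 - theta) * (1.0 - math.exp(-kappa * T)) / kappa
mc = simulate_accumulated_rate(r0, kappa, theta, sigma, T, steps=200, paths=2000)
```

The simulated mean agrees with the closed form; the study's point is that the distribution of the accumulated rate is nonetheless not asymptotically normal.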
Abstract
THE importance of the aircraft engine supercharger is being emphasized by the increasing demands for high altitude performance in the present war. Centrifugal stresses of considerable magnitude are induced in the supercharger impeller by reason of the high rotative speeds necessary to obtain the desired pumping effect. A speed of 20,000 r.p.m. is not uncommon for an impeller of 12 in. outside diameter and over. Consequently, a knowledge of the centrifugal stresses constitutes a basic design consideration. Unfortunately, a direct determination of these stresses is not an easy matter.
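For scale, a thin-ring idealisation (hoop stress sigma = rho * v_tip^2) applied to the quoted figures of 20,000 r.p.m. and a 12 in. diameter already gives stresses of several hundred MPa. The sketch below assumes a steel impeller and is only an order-of-magnitude bound, not the direct determination the text describes as difficult:

```python
import math

rpm = 20000.0
diameter_in = 12.0
radius_m = (diameter_in / 2.0) * 0.0254  # 6 in converted to metres
omega = rpm * 2.0 * math.pi / 60.0       # angular speed, rad/s
v_tip = omega * radius_m                 # impeller tip speed, m/s

rho_steel = 7850.0                       # kg/m^3, assumed material density
sigma = rho_steel * v_tip ** 2           # hoop stress of a thin rotating ring, Pa
print(round(v_tip, 1), "m/s;", round(sigma / 1e6), "MPa")
```

A tip speed above 300 m/s and a ring-stress scale near the yield strength of steel make clear why these stresses constitute a basic design consideration.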
Anindya Chakrabarty, Zongwei Luo, Rameshwar Dubey and Shan Jiang
Abstract
Purpose
The purpose of this paper is to develop a theoretical model of a jump diffusion-mean reversion constant proportion portfolio insurance strategy under the presence of transaction cost and stochastic floor, as opposed to the deterministic floor used in the previous literature.
Design/methodology/approach
The paper adopts Merton’s jump diffusion (JD) model to simulate the price path followed by risky assets and the CIR mean reversion model to simulate the path followed by the short-term interest rate. The floor of the CPPI strategy is linked to the stochastic process driving the value of a fixed income instrument whose yield follows the CIR mean reversion model. The developed model is benchmarked against CNX-NIFTY 50 and is back tested during the extreme regimes in the Indian market using the scenario-based Monte Carlo simulation technique.
Findings
Back testing the algorithm using Monte Carlo simulation across the crisis and recovery phases of the 2008 recession regime revealed that the portfolio performs better than the risky markets during the crisis by hedging the downside risk effectively and performs better than the fixed income instruments during the growth phase by leveraging on the upside potential. This makes it a value-enhancing proposition for the risk-averse investors.
Originality/value
The study modifies the CPPI algorithm by redefining its floor as a stochastic mean-reverting process guided by the movement of the short-term interest rate in the economy. This development is more relevant for two reasons: first, the short-term interest rate changes with time, so assuming a constant yield at each rebalancing step is not practically feasible; second, the literature shows that the short-term interest rate tends to move opposite to the equity market. Thus, during a bear run the floor rises at a higher rate, whereas its growth stagnates during a bull phase, which helps the model capitalize on the upside potential in the growth phase and cut down exposure in the crisis phase.
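Stripped of the jump-diffusion and CIR machinery, the rebalancing rule being modified can be sketched as follows; GBM-style risky returns, a flat short rate, and a floor accruing at that rate are simplifying assumptions standing in for the paper's stochastic dynamics:

```python
import math
import random

def cppi(v0, floor0, multiplier, risky_returns, short_rates, dt):
    """One CPPI path: risky exposure = multiplier * cushion, remainder earns the short rate."""
    v, floor = v0, floor0
    for ret, r in zip(risky_returns, short_rates):
        cushion = max(v - floor, 0.0)
        exposure = min(multiplier * cushion, v)  # no leverage in this sketch
        v = exposure * (1.0 + ret) + (v - exposure) * (1.0 + r * dt)
        floor *= 1.0 + r * dt                    # floor accrues at the short rate
    return v, floor

rng = random.Random(42)
dt = 1.0 / 252.0
n_steps = 252
rates = [0.05] * n_steps  # flat short rate for illustration
risky = [0.07 * dt + 0.20 * math.sqrt(dt) * rng.gauss(0.0, 1.0) for _ in range(n_steps)]
v_T, floor_T = cppi(v0=100.0, floor0=80.0, multiplier=4.0,
                    risky_returns=risky, short_rates=rates, dt=dt)
```

Replacing the flat rate path with a CIR-driven one, so the floor itself mean-reverts, is the modification the study makes.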
Abstract
The curvilinear shape of a bond price‐yield curve implies that risk management based on a linear approximation using duration is only viable for very small changes in interest rates. Not accounting for convexity when there are large yield changes can result in critical errors in measuring or hedging interest rate risk. The linear approximations will under‐ or overestimate the value at risk (VaR) for non‐linear financial instruments. Non‐linearity can be particularly problematic if there are large changes in market risk factors. The large changes are more likely to occur when VaR is computed for high confidence levels and/or longer time horizons. Even if the movements in risk factors are small, estimation errors in VaR would get larger as the degree of non‐linearity in financial instruments increases.
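A small worked example (an illustrative 10-year 5% annual-pay par bond) makes the point concrete: the duration-only estimate is adequate for a 10 bp shock but misses the exact repricing by well over a point per 100 of face for a 200 bp shock, while adding the convexity term recovers most of the gap:

```python
def price(face, coupon, y, n):
    """Price of an n-year annual-pay bond at yield y."""
    return sum(face * coupon / (1 + y) ** t for t in range(1, n + 1)) + face / (1 + y) ** n

face, coupon, y0, n = 100.0, 0.05, 0.05, 10
p0 = price(face, coupon, y0, n)  # par bond, so p0 = 100

# Modified duration and convexity from the cash flows
cfs = [(t, face * coupon + (face if t == n else 0.0)) for t in range(1, n + 1)]
dur = sum(t * cf / (1 + y0) ** t for t, cf in cfs) / p0 / (1 + y0)
conv = sum(t * (t + 1) * cf / (1 + y0) ** (t + 2) for t, cf in cfs) / p0

for dy in (0.001, 0.02):  # 10 bp vs 200 bp yield shock
    exact = price(face, coupon, y0 + dy, n) - p0
    linear = -dur * p0 * dy                     # duration-only estimate
    quad = linear + 0.5 * conv * p0 * dy ** 2   # duration + convexity estimate
    print(dy, round(exact, 4), round(linear, 4), round(quad, 4))
```

For the 200 bp move the duration-only estimate overstates the loss by roughly 1.4 per 100 of face, while the convexity-corrected estimate is within about 0.1, which is precisely the error pattern the passage describes.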
Nina Reynolds and Adamantios Diamantopoulos
Abstract
Although pretesting is an essential part of the questionnaire design process, the range of methodological work on pretesting issues is limited. The present paper concentrates on the effect of the pretest survey method on error detection by contrasting respondents who are interviewed personally with those who receive an impersonal survey method. The interaction between survey method and respondent knowledge of the questionnaire topic is also considered. The findings show that the pretest method does have an effect on the error detection rate of respondents; however, the hypothesised interaction between method and knowledge was not unequivocally supported. The detailed results illustrate which error types are affected by the method used during pretesting. Implications for future research are considered.