Search results

11 – 20 of over 3000
Article
Publication date: 5 March 2018

Preeti Wanti Srivastava and Savita Savita

Abstract

Purpose

Most of the literature on the design of accelerated life test (ALT) plans focuses on a single system (or subsystem), disregarding its internal configuration. Often it is not possible to identify the components that caused a system failure, or the cause can be narrowed down only to a subset of the components, resulting in a masked observation. The purpose of this paper is to deal with the planning of ramp-stress accelerated life testing for a high-reliability parallel system comprising two dependent components, using masked failure data. Such testing may prove useful for, e.g., a twin-engine aircraft. A ramp stress results when the stress applied to the system increases linearly with time.

Design/methodology/approach

A parallel system with two dependent components is considered, with the dependency modeled by the Gumbel-Hougaard copula. The stress-life relationship is modeled using the inverse power law, and a cumulative exposure model is assumed to capture the effect of changing stress. The method of maximum likelihood is used for estimating the design parameters. The optimal plan consists in finding the optimal stress rate using the D-optimality criterion.
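
The Gumbel-Hougaard copula and the parallel-system logic above can be sketched briefly (a minimal Python illustration, not the authors' code; the reliability values are arbitrary):

```python
import math

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula:
    C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def parallel_reliability(r1, r2, theta):
    """A two-component parallel system fails only when both components fail,
    so P(system works) = 1 - C(F1, F2) with failure probabilities Fi = 1 - ri."""
    return 1.0 - gumbel_hougaard(1.0 - r1, 1.0 - r2, theta)
```

With theta = 1 the copula reduces to independence (C(u, v) = u*v); larger theta means stronger dependence between the component lifetimes and therefore less reliability gain from redundancy.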

Findings

The optimal plan consists in finding the optimal stress rate under the D-optimality criterion, i.e. by minimizing the reciprocal of the determinant of the Fisher information matrix. The proposed plan is illustrated with a numerical example and a sensitivity analysis.

Originality/value

The model formulated can help reliability engineers quickly obtain reliability estimates for high-reliability products that are likely to last for several years.

Details

International Journal of Quality & Reliability Management, vol. 35 no. 3
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 April 2000

Fumiyo N. Kondo and Genshiro Kitagawa

Abstract

Access to daily store-level scanner data has become increasingly easy in recent years in Japan, and time series analysis based on a sales response model is becoming practical. Introduces a new method of combining time series analysis and regression analysis of the price promotion effect, which enables simultaneous decomposition of store-level scanner sales into trend (including seasonality), a day-of-the-week effect and an explanatory variable effect due to price promotion. The method was applied to daily store-level scanner sales of milk, showing evidence of a day-of-the-week effect. Further, a method of incorporating several kinds of price-cut variables into the regression analysis is presented, together with the analyzed results.

Details

Marketing Intelligence & Planning, vol. 18 no. 2
Type: Research Article
ISSN: 0263-4503

Article
Publication date: 22 March 2022

Zhanpeng Shen, Chaoping Zang, Xueqian Chen, Shaoquan Hu and Xin-en Liu

Abstract

Purpose

For fast calculation of complex structures in engineering, correlations among input variables are often ignored in uncertainty propagation, even though the effect of ignoring these correlations on the output uncertainty is unclear. This paper aims to quantify the input uncertainty and estimate the correlations among the inputs according to the collected observed data instead of questionable assumptions. Moreover, the small size of the experimental data set should also be considered, as it is a common engineering problem.

Design/methodology/approach

In this paper, a novel method combining the p-box with a copula function for both uncertainty quantification and correlation estimation is explored. The copula function is utilized to estimate correlations among uncertain inputs based upon the observed data. The p-box method is employed to quantify the input uncertainty as well as the epistemic uncertainty associated with the limited amount of observed data. A nested Monte Carlo sampling technique is adopted to ensure that the propagation is always feasible. In addition, a Kriging model is built to reduce the computational cost of uncertainty propagation.
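
The nested (double-loop) Monte Carlo idea — an outer epistemic loop over p-box parameters and an inner aleatory loop — can be sketched as follows (a toy Python illustration with an assumed interval-valued mean and normal variability, not the paper's implementation; the Kriging surrogate is omitted):

```python
import random

def nested_mc_pof(mean_lo, mean_hi, sigma, threshold, n_outer=200, n_inner=500, seed=1):
    """Double-loop (nested) Monte Carlo over a p-box: the outer loop samples the
    epistemically uncertain mean from its interval, the inner loop propagates the
    aleatory normal variable; returns an interval [min, max] on the probability
    that the response exceeds the threshold."""
    rng = random.Random(seed)
    pofs = []
    for _ in range(n_outer):
        mu = rng.uniform(mean_lo, mean_hi)  # epistemic draw from the p-box interval
        exceed = sum(rng.gauss(mu, sigma) > threshold for _ in range(n_inner))
        pofs.append(exceed / n_inner)
    return min(pofs), max(pofs)
```

The spread between the returned minimum and maximum failure probabilities reflects the epistemic uncertainty carried by the p-box, separate from the aleatory scatter handled in the inner loop.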

Findings

To illustrate the application of this method, an engineering example of structural reliability assessment is performed. The results indicate that whether the correlation among input variables is quantified may significantly affect the output uncertainty. Furthermore, the separation of aleatory and epistemic uncertainties provides an additional advantage for risk management.

Originality/value

The proposed method takes advantage of the p-box and the copula function to deal with correlations and a limited amount of observed data, two important issues of uncertainty quantification in engineering. Thus, it is practical and able to predict the response uncertainty or system state accurately.

Details

Engineering Computations, vol. 39 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 3 August 2010

Wei Xu, Guenther Filler, Martin Odening and Ostap Okhrin

Abstract

Purpose

The purpose of this paper is to assess the losses of weather-related insurance at different regional levels. The possibility of spatial diversification of insurance is explored by estimating the joint occurrence of unfavorable weather conditions in different locations, looking particularly at the tail behavior of the loss distribution.

Design/methodology/approach

Joint weather-related losses are estimated using copulas. Copulas avoid the direct estimation of multivariate distributions while allowing for much greater flexibility in modeling the dependence structure of weather risks than simple correlation coefficients.
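
The advantage over simple correlation coefficients is most visible in the tails: the Gaussian copula has zero tail dependence for any correlation below one, whereas Archimedean copulas such as Gumbel and Clayton have closed-form tail-dependence coefficients. A short sketch of these standard formulas (illustrative, not the paper's code):

```python
def gumbel_upper_tail_dependence(theta):
    """Upper tail dependence of the Gumbel copula: lambda_U = 2 - 2^(1/theta)."""
    return 2.0 - 2.0 ** (1.0 / theta)

def clayton_lower_tail_dependence(theta):
    """Lower tail dependence of the Clayton copula: lambda_L = 2^(-1/theta)."""
    return 2.0 ** (-1.0 / theta)
```

For example, a Gumbel copula with theta = 2 retains an upper-tail-dependence coefficient of 2 - sqrt(2) (about 0.59), i.e. a substantial probability of joint extreme losses — exactly the tail behavior relevant to weather-insurance risk.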

Findings

Results indicate that indemnity payments based on temperature as well as on cumulative rainfall show strong stochastic dependence even at a large regional scale. Thus the possibility of reducing risk exposure by increasing the trading area of insurance is limited.

Research limitations/implications

The empirical findings are limited by a rather weak database; with so few observations, the estimation of high-dimensional copulas leads to large estimation errors.

Practical implications

The paper includes implications for the quantification of systemic weather risk which is important for the rate making of crop insurance and reinsurance.

Originality/value

This paper's results highlight how important the choice of the statistical approach is when modeling the dependence structure of weather risks.

Details

Agricultural Finance Review, vol. 70 no. 2
Type: Research Article
ISSN: 0002-1466

Book part
Publication date: 15 January 2010

Jon Crockett, Gerard Andrew Whelan, Caroline Louise Sinclair and Hugh Gillies

Abstract

Interest in car-sharing initiatives, as a tool for improving transport network efficiency in urban areas and on interurban links, has grown in recent years. They have often been proposed as a more cost-effective alternative to other modal-shift and congestion-relief initiatives, such as public transport or highway improvement schemes; however, with little implementation in practice, practitioners have only limited evidence for assessing their likely impacts.

This study reports the findings of a Stated Preference (SP) study aimed at understanding the value that car drivers put on car sharing as opposed to single occupancy trips. Following an initial pilot period, 673 responses were received from a web-based survey conducted in June 2008 amongst a representative sample of car driving commuters in Scotland.

An important methodological aspect of this study was the need to account for differences in behaviour to identify those market segments with the greatest propensity to car share. To this end, we estimated a range of choice model forms and compared the ability of each to consistently identify individual behaviours. More specifically, this included a comparison of:

Standard market segmentation approaches based on multinomial logit, with attribute coefficients estimated separately by reported characteristics (e.g. age, income);

A two-stage mixed logit approach involving the estimation of random parameters logit models, followed by an examination of individual respondents' choices to arrive at estimates of their parameters, conditional on known distributions across the population (following Revelt & Train, 1999); and

A latent-class model involving the specification of C classes of respondent, each with its own coefficients, and assigning each individual a probability of belonging to a given class based upon their observed choices, socioeconomic characteristics and reported attitudes.

As hypothesised, there are significant variations in tastes and preferences across market segments, particularly by household car ownership, gender, age group, interest in car pooling, current journey time and sharing with a stranger (as opposed to a family member/friend). Comparing the sensitivity of demand to a change from a single-occupancy to a car-sharing trip, the latter imposes a ‘penalty’ equivalent to 29.85 IVT minutes under the mixed logit structure and 26.68 IVT minutes under the multinomial specification. Segmenting this latter value according to the number of cars owned per household results in ‘penalties’ equivalent to 46.51 and 26.42 IVT minutes for one and two-plus car-owning households respectively.
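
The ‘penalty in IVT minutes’ quoted above is simply a ratio of utility coefficients, and the underlying multinomial logit probabilities follow the standard softmax form. A brief sketch with made-up coefficients (not those estimated in the chapter):

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities: P_i = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)  # subtract the max for numerical stability
    expv = [math.exp(v - m) for v in utilities]
    total = sum(expv)
    return [e / total for e in expv]

def penalty_in_ivt_minutes(beta_share, beta_ivt):
    """Express the car-sharing disutility in equivalent in-vehicle time (IVT)
    minutes as the ratio of the sharing coefficient to the IVT coefficient."""
    return beta_share / beta_ivt
```

With an illustrative sharing coefficient of -1.5 utils and an IVT coefficient of -0.05 utils per minute, the implied penalty is 30 equivalent IVT minutes, in the same range as the values reported above.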

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Article
Publication date: 17 May 2013

Michael Martin

Abstract

Purpose

Interest rate risk, i.e. the risk of changes in the interest rate term structure, is of high relevance in insurers' risk management. Due to large capital investments in interest-rate-sensitive assets such as bonds, interest rate risk plays a considerable role in deriving the solvency capital requirement (SCR) in the context of Solvency II. This paper seeks to address these issues.

Design/methodology/approach

In addition to the Solvency II standard model, the author applies the model of Gatzert and Martin to introduce a partial internal model for the market risk of bond exposures. After introducing calibration methods for short rate models, the author quantifies interest rate and credit risk for corporate and government bonds and demonstrates that the type of process can have a considerable impact despite comparable underlying input data.
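
The two short rate models compared here differ only in their diffusion terms. A minimal Euler-discretisation sketch of the Vasicek dynamics (illustrative Python, not the author's partial internal model; the CIR model would replace the constant `sigma` with `sigma * sqrt(max(r, 0))`, which keeps rates nonnegative):

```python
import random

def simulate_vasicek(r0, kappa, theta, sigma, dt, n_steps, seed=0):
    """Euler scheme for the Vasicek short rate dr = kappa*(theta - r) dt + sigma dW.
    Mean-reverting toward theta; unlike CIR, rates may go negative."""
    rng = random.Random(seed)
    r, path = r0, [r0]
    for _ in range(n_steps):
        r += kappa * (theta - r) * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(r)
    return path
```

The different diffusion specifications are one reason the two processes can yield different SCR figures from comparable input data, consistent with the point made above.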

Findings

The results show that, in general, the SCR for interest rate risk derived from the standard model of Solvency II tends toward the SCR obtained with the Vasicek short rate model, while applying the Cox, Ingersoll, and Ross model leads to a lower SCR. For low-rated bonds, the internal models approximate each other and, moreover, reveal a considerable underestimation of credit risk in the Solvency II model.

Originality/value

The aim of this paper is to assess model risk with focus on bonds in the market risk module of Solvency II regarding the underlying interest rate process and input parameters.

Details

The Journal of Risk Finance, vol. 14 no. 3
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 24 March 2006

Thomas B. Fomby and Dek Terrell

Abstract

The editors are pleased to offer the following papers to the reader in recognition and appreciation of the contributions to our literature made by Robert Engle and Sir Clive Granger, winners of the 2003 Nobel Prize in Economics. Please see the previous dedication page of this volume. The basic themes of this part of Volume 20 of Advances in Econometrics are time-varying betas of the capital asset pricing model, analysis of predictive densities of nonlinear models of stock returns, modeling multivariate dynamic correlations, flexible seasonal time series models, estimation of long-memory time series models, the application of the technique of boosting in volatility forecasting, the use of different time scales in Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH) modeling, out-of-sample evaluation of the ‘Fed Model’ in stock price valuation, structural change as an alternative to long memory, the use of smooth transition autoregressions in stochastic volatility modeling, the analysis of the “balancedness” of regressions analyzing Taylor-type rules of the Fed Funds rate, a mixture-of-experts approach for the estimation of stochastic volatility, a modern assessment of Clive's first published paper on sunspot activity, and a new class of models of tail-dependence in time series subject to jumps. Of course, we are also pleased to include Rob's and Clive's remarks on their careers and their views on innovation in econometric theory and practice that were given at the Third Annual Advances in Econometrics Conference held at Louisiana State University, Baton Rouge, on November 5–7, 2004.

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-1-84950-388-4

Article
Publication date: 2 October 2020

Xiu Wei Yeap, Hooi Hooi Lean, Marius Galabe Sampid and Haslifah Mohamad Hasim

Abstract

Purpose

This paper investigates the dependence structure and market risk of the currency exchange rate portfolio from the Malaysian ringgit perspective.

Design/methodology/approach

The marginal returns of the five major exchange rate series, i.e. the United States dollar (USD), Japanese yen (JPY), Singapore dollar (SGD), Thai baht (THB) and Chinese yuan renminbi (CNY), are modelled by the Bayesian generalized autoregressive conditional heteroskedasticity (GARCH)(1,1) model with Student's t innovations. In addition, five different copulas, namely Gumbel, Clayton, Frank, Gaussian and Student's t, are applied to model the joint distribution and examine the dependence structure of the five currencies. Moreover, the portfolio risk is measured by Value at Risk (VaR), which accounts for extreme events through extreme value theory (EVT).
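
The GARCH(1,1) variance recursion at the core of each marginal model, together with an empirical quantile-based VaR, can be sketched as follows (a plain Python filter for illustration; the paper's Bayesian estimation, copula coupling and EVT tail modelling are not reproduced):

```python
def garch11_variances(returns, omega, alpha, beta, init_var):
    """GARCH(1,1) conditional variance recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = [init_var]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

def empirical_var(losses, level=0.99):
    """Empirical Value at Risk: the level-quantile of the loss sample."""
    ordered = sorted(losses)
    idx = min(len(ordered) - 1, int(level * len(ordered)))
    return ordered[idx]
```

In the paper's setup the VaR would instead be computed from copula-simulated portfolio losses with an EVT-fitted tail rather than from a raw empirical quantile.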

Findings

The findings show that Gumbel and Student's t are the best-fitting Archimedean and elliptical copulas, respectively, for the five currencies. The dependence structure is asymmetric and heavy-tailed.

Research limitations/implications

The findings of this paper have important implications for diversification decisions and hedging problems for investors involved in foreign currencies. The authors find that the portfolio is diversified once extreme events are taken into consideration. Therefore, investors holding an individual currency with a VaR higher than that of the portfolio may consider adding the other currencies used in this paper for hedging.

Originality/value

This is the first paper estimating VaR of a currency exchange rate portfolio using a combination of Bayesian GARCH model, EVT and copula theory. Moreover, the VaR of the currency exchange rate portfolio can be used as a benchmark of the currency exchange market risk.

Details

International Journal of Emerging Markets, vol. 16 no. 5
Type: Research Article
ISSN: 1746-8809

Article
Publication date: 1 January 2014

Mustapha Djeddou, Hichem Zeher and Younes Nekachtali

Abstract

Purpose

The paper aims to propose a new method for estimating the time of arrival (TOA) of ultra-wideband (UWB) signals under IEEE 802.15.4a multipath channel model.

Design/methodology/approach

The proposed approach is based on a proportionality test and consists in finding out whether two autoregressive (AR) processes, modeling two frames, are proportional. A distance measure is used to assess the proportionality between the two AR processes.
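
The proportionality idea can be illustrated on AR power spectral densities: two AR processes are proportional exactly when the log ratio of their PSDs is constant over frequency, so the variance of that log ratio serves as a distance. A minimal AR(1) sketch (illustrative Python, not the authors' detector; their analytic threshold derived from the false alarm probability is omitted):

```python
import cmath, math

def ar1_psd(a, sigma2, freqs):
    """PSD of the AR(1) process x_t = a*x_{t-1} + e_t with Var(e) = sigma2:
    S(f) = sigma2 / |1 - a * exp(-2*pi*i*f)|^2."""
    return [sigma2 / abs(1.0 - a * cmath.exp(-2j * math.pi * f)) ** 2 for f in freqs]

def proportionality_distance(psd1, psd2):
    """Variance of log(S1/S2) over frequency: (near) zero iff the two spectra
    are proportional, i.e. one process is a scaled version of the other."""
    logratio = [math.log(p / q) for p, q in zip(psd1, psd2)]
    mean = sum(logratio) / len(logratio)
    return sum((x - mean) ** 2 for x in logratio) / len(logratio)
```

A distance near zero indicates the two frames share the same spectral shape; a large distance flags a change between frames, which is the kind of event a TOA detector looks for.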

Findings

The developed technique may be used in two ways, sample-by-sample or block-by-block, according to the required ranging accuracy. Notably, the method offers a trade-off between computational load and estimation accuracy. Moreover, the proposed method uses a threshold that is derived analytically from a preset false alarm probability.

Practical implications

Simulation experiments are conducted to assess the performance of the new TOA estimation algorithm. The comparison is made against the well-known CLEAN algorithm for sample-by-sample TOA estimation and against three energy-detector-based receiver algorithms. The obtained results highlight the effectiveness of the developed approach.

Originality/value

The developed TOA estimation algorithm differs from other techniques in the literature in that it is based on a proportionality test between two sliding frames. The frames are modeled by two AR processes, and a distance measure is then used to check whether their power spectral densities are proportional.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 33 no. 1/2
Type: Research Article
ISSN: 0332-1649

Book part
Publication date: 29 August 2007

Xavier Martin, Anand Swaminathan and Laszlo Tihanyi

Abstract

Strategy deals with decisions about the scope of the firm and related choices about how to compete in various businesses. As such, research in strategy entails the analysis of discrete choices that may not be independent of each other. In this paper, we review the methodological implications of modeling such choices and propose conditional, nested, mixed logit, and hazard rate models as solutions to the issues that arise from non-independence among strategic choices. We describe applications with an emphasis on international strategy, an area where firms face a multiplicity of choices with respect to both location and mode of entry.

Details

Research Methodology in Strategy and Management
Type: Book
ISBN: 978-0-7623-1404-1
