Search results
1 – 10 of over 3000
Abstract
Auditors and accountants have a well-established reputation for being conservative. However, Antle and Nalebuff (1991) conclude from their analytical model of auditor-client negotiations that auditors are not conservative and that a conservative audit report is never issued. This paper extends the Antle and Nalebuff (1991) results. By replacing the Antle and Nalebuff (1991) assumption that an auditor has a symmetric loss function (financial statement overstatements have the same impact as financial statement understatements) with the assumption that an auditor has an asymmetric loss function (losses to an auditor from a financial statement overstatement are greater than the losses from an equal understatement), I find that auditors can be conservative and that conservative audit reports are issued to users.
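The contrast between the two assumptions can be shown with a small sketch (the functional forms and the penalty factor are illustrative assumptions, not the paper's model): a symmetric loss charges the same for an overstatement and an understatement of equal size, while an asymmetric loss penalizes overstatements more heavily.

```python
import numpy as np

def symmetric_loss(error):
    # Squared-error loss: an overstatement (+e) and an
    # understatement (-e) cost the auditor the same.
    return error ** 2

def asymmetric_loss(error, k=3.0):
    # Overstatements (positive errors) are penalized k times
    # more heavily than equal-sized understatements.
    return np.where(error > 0, k * error ** 2, error ** 2)

errors = np.array([-2.0, 2.0])
print(symmetric_loss(errors))   # equal losses
print(asymmetric_loss(errors))  # overstatement costs more
```

Under the asymmetric version, the loss-minimizing report is pulled downward, which is one informal way to see why conservatism can emerge.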
Rodney G. Lim and Peter J. Carnevale
Abstract
Research adopting prospect theory to examine negotiator performance was extended to mediation. We examined whether framing negotiator payoffs in terms of gains or losses affects a mediator's behavior towards negotiators when the mediator has no personal frame. The use of a mediator presents a critical test between an explanation of framing effects based on bargainers' underlying preferences for risk and a simpler explanation based on the psychophysical properties of perceived gains and losses. A computer-based experiment was conducted in which subjects acted as mediators between two disputants (computer programs) in an integrative bargaining task. As predicted, subjects proposed settlements of higher joint value when both disputants had loss frames than when both had gain frames, supporting the psychophysical explanation. Moreover, within mixed-frame disputes, subjects' proposals favored the loss-framed bargainer over the gain-framed bargainer. However, predicted interactions between bargainer frame and concession-making activity were not supported. Implications of the results for real bargainers and mediators are discussed.
Wenbin Wu, Ximing Wu, Yu Yvette Zhang and David Leatham
Abstract
Purpose
The purpose of this paper is to present the development of a flexible model for nonstationary crop yield distributions and its applications to decision-making in crop insurance.
Design/methodology/approach
The authors design a nonparametric Bayesian approach based on Gaussian process regressions to model crop yields over time. Further flexibility is obtained via Bayesian model averaging that results in mixed Gaussian processes.
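As a rough illustration of the basic building block (a single Gaussian process regression on synthetic yearly yields, not the authors' mixed-GP model with Bayesian model averaging; the kernel, length scale and data are all assumptions), a minimal numpy sketch is:

```python
import numpy as np

def gp_predict(x_train, y_train, x_test, length=10.0, noise=0.05):
    # GP regression with an RBF kernel: posterior mean is
    # K_*(K + noise*I)^-1 y; posterior variance subtracts the
    # explained part from the prior variance (1 here) plus noise.
    def rbf(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1) + noise
    return mean, var

# Hypothetical yearly yields with a trend plus noise (illustrative only).
rng = np.random.default_rng(0)
years = np.arange(1990, 2020, dtype=float)
yields = 2.0 + 0.05 * (years - 1990) + rng.normal(0.0, 0.2, len(years))

# Centering the yields plays the role of a zero-mean GP prior.
mu = yields.mean()
mean, var = gp_predict(years, yields - mu, np.array([2020.0]))
mean += mu  # one-year-ahead yield prediction with uncertainty var
```

The appeal for nonstationary yields is that no parametric trend is imposed: the kernel lets the mean yield drift smoothly over time.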
Findings
Simulation results on crop insurance premium rates show that the proposed method compares favorably with conventional estimators, especially when the underlying distributions are nonstationary.
Originality/value
Unlike conventional two-stage estimation, the proposed method models nonstationary crop yields in a single stage. The authors further adopt a decision theoretic framework in its empirical application and demonstrate that insurance companies can use the proposed method to effectively identify profitable policies under symmetric or asymmetric loss functions.
Abstract
Process capability indices have been widely used to measure process capability and performance. In this paper, we propose a new process capability index based on the actual dollar loss caused by defects. The new index is similar to Taguchi's loss function and fully incorporates the distribution of the quality attribute in a process. A strength of the index is that it applies to non-normal or asymmetric distributions. Numerical examples are presented to show the superiority of the new index over Cp, Cpk and Cpm, the most widely used process capability indices.
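For reference, the three benchmark indices named above have standard closed forms; a minimal sketch (the specification limits, target and process moments are made up) is:

```python
import math

def capability_indices(mu, sigma, lsl, usl, target):
    # Cp compares specification width to process spread;
    # Cpk additionally penalizes an off-center mean;
    # Cpm (Taguchi-style) penalizes deviation from the target.
    cp  = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    cpm = (usl - lsl) / (6.0 * math.sqrt(sigma**2 + (mu - target)**2))
    return cp, cpk, cpm

cp, cpk, cpm = capability_indices(mu=10.2, sigma=0.5,
                                  lsl=8.0, usl=12.0, target=10.0)
# For this off-center process, cpk and cpm fall below cp.
```

None of these indices attaches a dollar value to defects, which is the gap the proposed loss-based index is meant to fill.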
Fernando Antonio Moala and Karlla Delalibera Chagas
Abstract
Purpose
The step-stress accelerated test is the most appropriate statistical method to obtain information about the reliability of new products faster than would be possible if the product was left to fail in normal use. This paper presents the multiple step-stress accelerated life test using type-II censored data and assuming a cumulative exposure model. The authors propose a Bayesian inference with the lifetimes of the test items under a gamma distribution. The choice of the loss function is an essential part of Bayesian estimation problems. Therefore, the Bayesian estimators for the parameters are obtained based on different loss functions, and a comparison with the usual maximum likelihood estimation (MLE) approach is carried out. Finally, an example is presented to illustrate the procedure proposed in this paper.
Design/methodology/approach
A Bayesian inference is performed and the parameter estimators are obtained under symmetric and asymmetric loss functions. A sensitivity analysis of these Bayes and MLE estimators is presented via Monte Carlo simulation to verify whether the Bayesian analysis performs better.
Findings
The authors demonstrate that Bayesian estimators give better results than MLE with respect to MSE and bias. The authors also consider three types of loss functions and show that the dominant estimator, with the smallest MSE and bias, is the Bayesian estimator under the general entropy loss function, followed closely by the one under the Linex loss function. In this case, the use of a symmetric loss function such as the SELF is inappropriate for the SSALT, especially with small data sets.
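Given posterior draws of a parameter, the point estimates under the three loss functions compared here have simple forms; a sketch (with an arbitrary gamma "posterior" standing in for the SSALT posterior, not the paper's actual model) is:

```python
import numpy as np

def bayes_self(draws):
    # Squared-error loss (SELF): the posterior mean.
    return np.mean(draws)

def bayes_linex(draws, a=1.0):
    # Linex loss exp(a*d) - a*d - 1 with d = estimate - parameter:
    # the Bayes estimate is -(1/a) * log E[exp(-a * theta)].
    return -np.log(np.mean(np.exp(-a * draws))) / a

def bayes_gel(draws, q=1.0):
    # General entropy loss: the Bayes estimate is (E[theta^-q])^(-1/q).
    return np.mean(draws ** (-q)) ** (-1.0 / q)

rng = np.random.default_rng(1)
post = rng.gamma(shape=5.0, scale=0.4, size=200_000)  # stand-in posterior, mean 2.0

# With a > 0 and q > 0, both asymmetric estimates sit below the
# posterior mean, guarding against overestimation.
```

This makes the asymmetry concrete: the Linex and general entropy estimates deliberately shade away from the posterior mean in the direction the analyst considers less costly.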
Originality/value
Most papers in the literature estimate the SSALT through the MLE. In this paper, the authors develop a Bayesian analysis for the SSALT and discuss the procedures to obtain the Bayes estimators under symmetric and asymmetric loss functions. The choice of the loss function is an essential part of Bayesian estimation problems.
Hamid Baghestani and Bassam M. AbuAl-Foul
Abstract
Purpose
This study evaluates the Federal Reserve (Fed) initial and final forecasts of the unemployment rate for 1983Q1-2018Q4. The Fed initial forecasts in a typical quarter are made in the first month (or immediately after), and the final forecasts are made in the third month of the quarter. The analysis also includes the private forecasts, which are made close to the end of the second month of the quarter.
Design/methodology/approach
In evaluating the multi-period forecasts, the study tests for systematic bias, directional accuracy, symmetric loss, equal forecast accuracy, encompassing and orthogonality. For every test equation, it employs the Newey–West procedure in order to obtain the standard errors corrected for both heteroscedasticity and inherent serial correlation.
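Two of the criteria listed, systematic bias and directional accuracy under symmetric loss, reduce to simple sample statistics; a minimal sketch (hypothetical unemployment figures, and without the Newey-West correction the study applies) is:

```python
import numpy as np

def forecast_diagnostics(actual, forecast):
    err = actual - forecast
    mse = np.mean(err ** 2)   # symmetric (squared-error) loss
    bias = np.mean(err)       # ~0 for an unbiased forecast
    # Directional accuracy: how often the predicted change in the
    # rate has the same sign as the actual change.
    hits = np.sign(np.diff(actual)) == np.sign(np.diff(forecast))
    return mse, bias, np.mean(hits)

actual   = np.array([5.0, 5.2, 5.1, 4.9, 4.8])   # made-up rates
forecast = np.array([5.1, 5.1, 5.2, 4.9, 4.7])
mse, bias, da = forecast_diagnostics(actual, forecast)
```

In the study's formal tests, the same quantities enter regression equations whose standard errors are corrected for heteroscedasticity and serial correlation.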
Findings
Both Fed and private forecasts beat the naïve benchmark and predict directional change under symmetric loss. Fed final forecasts are more accurate than initial forecasts, meaning that predictive accuracy improves as more information becomes available. The private and Fed final forecasts contain distinct predictive information, but the latter produces significantly lower mean squared errors. The results are mixed when the study compares the private with the Fed initial forecasts. Additional results indicate that Fed (private) forecast errors are (are not) orthogonal to changes in consumer expectations about future unemployment. As such, consumer expectations can potentially help improve the accuracy of private forecasts.
Originality/value
Unlike many other studies, this study focuses on the unemployment rate, since it is an important indicator of the social cost of business cycles, and thus its forecasts are of special interest to policymakers, politicians and social scientists. Accurate unemployment rate forecasts, in particular, are essential for policymakers to design an optimal macroeconomic policy.
Abstract
Purpose
The random walk forecast of exchange rate serves as a standard benchmark for forecast comparison. The purpose of this paper is to assess whether this benchmark is unbiased and directionally accurate under symmetric loss. The focus is on the random walk forecasts of the dollar/euro for 1999‐2007 and the dollar/pound for 1971‐2007.
Design/methodology/approach
A forecasting framework to generate the one- to four-quarter-ahead random walk forecasts at varying lead times is designed. This allows comparison of forecast accuracy at different lead times and forecast horizons. Using standard evaluation methods, the paper further evaluates these forecasts in terms of unbiasedness and directional accuracy.
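The benchmark itself is trivial to generate, which is part of its appeal; a sketch (with made-up quarterly average rates) is:

```python
import numpy as np

def random_walk_forecast(series, horizon):
    # The no-change forecast: every 1- to h-quarter-ahead forecast
    # equals the most recent observed rate, so accuracy depends only
    # on how far the rate drifts over the horizon.
    return np.full(horizon, series[-1])

rates = np.array([1.18, 1.21, 1.25, 1.22])   # hypothetical $/euro averages
fc = random_walk_forecast(rates, horizon=4)  # four quarters of 1.22
```

Shortening the lead time updates `series[-1]` with fresher data, which is why forecast accuracy improves with shorter leads even though the rule never changes.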
Findings
The paper shows that forecast accuracy improves with a reduction in the lead time but deteriorates with an increase in the forecast horizon. More importantly, the random walk forecasts are unbiased and accurately predict directional change under symmetric loss and thus are of value to a user who assigns similar cost to incorrect upward and downward move predictions in the exchange rates.
Research limitations/implications
The one- to four-quarter-ahead random walk forecasts evaluated here are for averages of daily figures and not for the end-of-quarter rates at 3, 6, 9 and 12 months. Thus, the framework is of value to a market participant interested in forecasting quarterly average rates rather than end-of-quarter rates.
Originality/value
The exchange rate forecasting framework presented in this paper allows the evaluation of the random walk forecasts in terms of directional accuracy, which (to the best of our knowledge) has not been done before.
Hung‐Chun Liu and Jui‐Cheng Hung
Abstract
Purpose
The purpose of this paper is to apply alternative GARCH‐type models to daily volatility forecasting, and apply Value‐at‐Risk (VaR) to the Taiwanese stock index futures markets that suffered most from the global financial tsunami that occurred during 2008.
Design/methodology/approach
Rather than using squared returns as a proxy for true volatility, this study adopts three range‐based proxies (PK, GK and RS), and one return‐based proxy (realized volatility), for use in the empirical exercise. The forecast evaluation is conducted using various proxy measures based on both symmetric and asymmetric loss functions, while back‐testing and two utility‐based loss functions are employed for further VaR assessment with respect to risk management practice.
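The three range-based proxies named above can be sketched directly from their standard formulas (Parkinson, Garman-Klass and Rogers-Satchell; the OHLC prices below are hypothetical):

```python
import numpy as np

def parkinson(high, low):
    # PK: uses only the daily high-low range.
    return np.log(high / low) ** 2 / (4.0 * np.log(2.0))

def garman_klass(open_, high, low, close):
    # GK: adds the open-to-close move to the range.
    return (0.5 * np.log(high / low) ** 2
            - (2.0 * np.log(2.0) - 1.0) * np.log(close / open_) ** 2)

def rogers_satchell(open_, high, low, close):
    # RS: remains unbiased under a nonzero drift.
    return (np.log(high / close) * np.log(high / open_)
            + np.log(low / close) * np.log(low / open_))

# One hypothetical trading day of index futures prices:
o, h, l, c = 100.0, 102.0, 99.0, 101.0
pk = parkinson(h, l)
gk = garman_klass(o, h, l, c)
rs = rogers_satchell(o, h, l, c)  # daily variance proxies
```

Because these proxies need only freely available daily prices, they offer a parsimonious alternative to squared returns when scoring the competing GARCH forecasts.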
Findings
Empirical results demonstrate that the EGARCH model provides the most accurate daily volatility forecasts, while the performances of the standard GARCH model and the GARCH models with highly persistent and long‐memory characteristics are relatively poor. In the area of risk management, the RV‐VaR model tends to underestimate VaR and has been rejected owing to a lack of correct unconditional coverage. In contrast, the GARCH genre of models can provide satisfactory and reliable daily VaR forecasts.
Originality/value
The unobservable volatility can be proxied using parsimonious daily price range with freely available prices when applied to Taiwanese futures markets. Meanwhile, the GARCH‐type models remain valid downside risk measures for both regulators and firms in the face of a turbulent market.
Salimeh Sadat Aghili, Mohsen Torabian, Mohammad Hassan Behzadi and Asghar Seif
Abstract
Purpose
The purpose of this paper is to develop a double-objective economic statistical design (ESD) of (
Design/methodology/approach
The design used in this study is based on a double-objective economic statistical design of (
Findings
Numerical results indicate that it is not possible to reduce the type II error and costs at the same time: reducing the type II error increases the cost, and reducing the cost increases the type II error, both of which are very important. The choice between them depends on the needs of the industry and on which objective has higher priority. These designs define a Pareto-optimal front of solutions that increases the flexibility and adaptability of the
Practical implications
This research adds to the body of knowledge on flexibility in process quality control. This article may be of interest to quality-systems experts in factories where the trade-off between cost reduction and statistical error reduction can affect the production process.
Originality/value
The cost functions for double-objective uniform and non-uniform sampling schemes with the Weibull shock model based on the Linex loss function are presented for the first time.
Abstract
Purpose
The purpose of this research is to provide a new loss function‐based risk assessment method so the likelihood and consequence resulting from the failure of a manufacturing or environmental system can be evaluated simultaneously.
Design/methodology/approach
Instead of using separate risk matrices for occurrence and consequence to evaluate manufacturing and environmental risks, an integrated approach is proposed that explores the relationship between three process capability indices (Cp, Cpk and Cpm) and three loss functions: Taguchi's loss function, the inverted normal loss function (INLF) and the revised inverted normal loss function (RINLF).
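As an illustration of one ingredient, the inverted normal loss function and its expected value for a normally distributed quality characteristic can be sketched as follows (the target, shape parameter and process moments are made-up values, and this is a Monte Carlo shortcut rather than the paper's method):

```python
import numpy as np

def inlf(y, target, gamma, max_loss=1.0):
    # Inverted normal loss: zero at the target, rising toward
    # max_loss as y drifts away; gamma controls how fast.
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

# Expected loss for a process N(mu, sigma^2), estimated by Monte Carlo.
rng = np.random.default_rng(0)
draws = rng.normal(10.2, 0.5, 500_000)  # hypothetical process output
expected_loss = float(np.mean(inlf(draws, target=10.0, gamma=1.0)))
# Closed form for comparison:
#   1 - (gamma / sqrt(gamma^2 + sigma^2))
#       * exp(-(mu - target)^2 / (2 * (gamma^2 + sigma^2)))
```

Pairing such an expected loss (the consequence) with a capability index (the likelihood) is the kind of linkage the proposed risk assessment builds on.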
Findings
The new method of quantitative risk assessment, linking the likelihood and expected loss of failure, is illustrated by two numerical examples. The results suggest that the revised inverted normal loss function (RINLF) be used in assessing manufacturing and environmental risks.
Practical implications
It gives decision‐makers a concrete tool to assess the likelihood and consequence of their processes. Linking the process capability indices and loss functions is particularly promising, as this provides a useful risk assessment tool for practitioners who want to reduce hazardous waste and manufacturing losses from their facilities.
Originality/value
The manufacturing and environmental risks are determined by pairing the process capability indices and loss functions. From the loss function-based estimation, one can quantify the consequence of a manufacturing loss and obtain the severity rating in an objective way.