Search results
1 – 10 of over 9,000
Devon DelVecchio and Adam W. Craig
Abstract
Purpose
This research aims to integrate two theories of reference price formation and to test the resulting exemplar‐prototype hybrid (EPH) model's predictions. Study 1 tests the predictions of the EPH model regarding price attractiveness ratings. Study 2 helps to document the process by which the EPH model operates.
Design/methodology/approach
The investigation consists of a pair of laboratory experiments that manipulate the skew (positive, negative) of historic price data, and the frequency of the modal price (high, low) in the price history.
Findings
Both the skew of prices to which consumers are exposed and the frequency with which the modal price occurs affect recall of the modal price and evaluations of the attractiveness of subsequent prices.
Research limitations/implications
Consumers rely on information about both the price range and individual price points to form reference prices. Common models of reference price effects may be improved by including a non‐linear estimate of the effect of modal price frequency.
Practical implications
Managers are advised to offer a consistent regular price from which occasional discounts of varying value are offered to create a strong memory trace in consumers' minds for the higher “regular” price and avoid such a trace for any one discounted price.
Originality/value
Prior studies detail aspects of the relationship between reference prices and the attractiveness of market prices. This is the first attempt to integrate, rather than contrast, the two predominant types of theories (range‐based, price‐point based) of reference price formation.
Martin Eling, Simone Farinelli, Damiano Rossello and Luisa Tibiletti
Abstract
Purpose
Recent literature discusses the persistence of skewness and tail risk in hedge fund returns. The aim of this paper is to suggest an alternative skewness measure, Azzalini's skewness parameter delta, which is derived as the normalized shape parameter from the skew‐normal distribution. The paper seeks to analyze the characteristics of this skewness measure compared with other indicators of skewness and to employ it in some typical risk and performance measurements.
Design/methodology/approach
The paper first provides an overview of the skew‐normal distribution and its mathematical formulation. Then it presents some empirical estimations of the skew‐normal distribution for hedge fund returns and discusses the characteristics of using delta with respect to classical skewness coefficients. Finally, it illustrates how delta can be used in risk management and in a performance measurement context.
Findings
The results highlight the advantages of Azzalini's skewness parameter delta, especially with regard to its interpretation. Delta has a limpid financial interpretation as a skewness shock on normally distributed returns. The paper also derives some important characteristics of delta, including that it is more stable than other measures of skewness and inversely related to popular risk measures such as the value‐at‐risk (VaR) and the conditional value‐at‐risk (CVaR).
Originality/value
The contribution of the paper is to apply the skew‐normal distribution to a large sample of hedge fund returns. It also illustrates that using Azzalini's skewness parameter delta as a skewness measure has some advantages over classical skewness coefficients. The use of the skew‐normal and related distributions is a relatively new, but growing, field in finance and not much has been published on the topic. Skewness itself, however, has been the subject of a great deal of research. Therefore, the results contribute to three fields of research: skewed distributions, risk measurement, and hedge fund performance.
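Azzalini's delta is defined from the skew-normal shape parameter as δ = α/√(1 + α²), which bounds it in (−1, 1). The sketch below, which uses simulated data rather than the paper's hedge fund sample, fits `scipy.stats.skewnorm` and normalizes the fitted shape parameter; all variable names are illustrative.

```python
import numpy as np
from scipy import stats

# Simulated negatively skewed "returns" (a stand-in for hedge fund data).
returns = stats.skewnorm.rvs(a=-4.0, loc=0.01, scale=0.05, size=5000,
                             random_state=np.random.default_rng(0))

# Fit the skew-normal distribution; scipy's shape parameter `a` is alpha.
alpha, loc, scale = stats.skewnorm.fit(returns)

# Azzalini's delta: the normalized shape parameter, bounded in (-1, 1).
delta = alpha / np.sqrt(1.0 + alpha ** 2)
print(f"alpha = {alpha:.2f}  delta = {delta:.3f}")
```

Because delta is bounded, it is easier to compare across funds than the unbounded alpha or the classical third-moment skewness coefficient.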
Wenbo Hu and Alec N. Kercheval
Abstract
Portfolio credit derivatives, such as basket credit default swaps (basket CDS), require for their pricing an estimation of the dependence structure of defaults, which is known to exhibit tail dependence as reflected in observed default contagion. A popular model with this property is the (Student's) t-copula; unfortunately there is no fast method to calibrate the degree of freedom parameter.
In this paper, within the framework of Schönbucher's copula-based trigger-variable model for basket CDS pricing, we propose instead to calibrate the full multivariate t distribution. We describe a version of the expectation-maximization algorithm that provides very fast calibration speeds compared to the current copula-based alternatives.
The algorithm generalizes easily to the more flexible skewed t distributions. To our knowledge, we are the first to use the skewed t distribution in this context.
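Schönbucher's pricing framework and the paper's full calibration are not reproduced here; the sketch below shows only the textbook EM iteration for a multivariate t with the degrees of freedom ν held fixed (the paper also calibrates ν and extends to skewed t distributions). All names are illustrative.

```python
import numpy as np

def fit_mvt_em(X, nu, n_iter=500, tol=1e-9):
    """EM estimates of location mu and scatter Sigma for a multivariate
    t distribution with fixed degrees of freedom nu."""
    n, d = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    for _ in range(n_iter):
        diff = X - mu
        # Squared Mahalanobis distance of each row under current estimates.
        m = np.einsum('ij,ij->i', diff @ np.linalg.inv(Sigma), diff)
        # E-step: expected precision of the latent gamma mixing variable.
        w = (nu + d) / (nu + m)
        # M-step: weighted location and scatter updates.
        mu_new = (w[:, None] * X).sum(axis=0) / w.sum()
        diff = X - mu_new
        Sigma_new = (w[:, None] * diff).T @ diff / n
        converged = np.abs(mu_new - mu).max() < tol
        mu, Sigma = mu_new, Sigma_new
        if converged:
            break
    return mu, Sigma

# Demo on simulated t data with known location.
rng = np.random.default_rng(1)
nu, n = 5, 4000
true_mu = np.array([1.0, -2.0, 0.5])
g = rng.chisquare(nu, n) / nu          # latent gamma mixing variable
X = true_mu + rng.standard_normal((n, 3)) / np.sqrt(g)[:, None]
mu_hat, Sigma_hat = fit_mvt_em(X, nu)
```

Each E-step downweights observations far from the current center, which is what makes the t fit robust to the heavy tails that drive default contagion.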
Muhammad Rizwan Iqbal and Sajdah Hassan
Abstract
Purpose
The purpose of this paper is to explore the scope of robust dispersion control charts in a distribution-free environment, which is a specific case of non-normal control charts. These control charts are skewness-based structures designed to monitor skewed-type processes whilst equally performing under symmetric processes. Moreover, the choice of a suitable control chart for a particular non-normal situation is also suggested.
Design/methodology/approach
The probability control limits approach is considered as an alternative way to determine the skewness-based structure of dispersion control charts. The proposals of five robust and two conventional Shewhart-type dispersion control charts are suggested as efficient competitors of skewness correction (SC) dispersion control charts. The evaluation of robust proposals and competing dispersion control charts is done through false alarm rate (FAR) and probability to signal (PTS) measures.
Findings
The proposed dispersion control charts are found to be robust and efficient alternatives to SC dispersion control charts under both normal and non-normal distributions. The FAR and PTS properties of the proposed control charts are impressive in all studied cases, and a real-data example also verifies the dominance of the proposed control charts.
Originality/value
Conventional dispersion control charts quickly lose their efficiency as the underlying process distribution deviates from normality; robust control charts emerge as the most suitable candidates in such situations. This paper proposes the idea of robust dispersion control charts under a distribution-free structure for skewed-type processes, which has not yet been explored.
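The paper's specific robust chart designs are not reproduced here; the sketch below illustrates only the probability-limits idea they build on: place control limits at quantiles of the dispersion statistic's in-control distribution, estimated by Monte Carlo, so the false alarm rate is fixed even under skewness. Function and parameter names are illustrative.

```python
import numpy as np

def probability_limits(sampler, stat, n_sub, far=0.0027, n_sim=100_000, seed=0):
    """Monte Carlo probability control limits for a dispersion statistic.

    `sampler(rng, size)` draws in-control data and `stat` maps one
    subgroup to its dispersion value (range, MAD, ...). Taking the
    far/2 and 1 - far/2 quantiles of the statistic's in-control
    distribution fixes the false alarm rate regardless of skewness
    in the process distribution.
    """
    rng = np.random.default_rng(seed)
    values = np.array([stat(sub) for sub in sampler(rng, (n_sim, n_sub))])
    return np.quantile(values, [far / 2, 1 - far / 2])

# Example: subgroup-range limits for a skewed (lognormal) process.
lcl, ucl = probability_limits(
    sampler=lambda rng, size: rng.lognormal(0.0, 0.5, size),
    stat=np.ptp,              # range = max - min of the subgroup
    n_sub=5,
)
```

Unlike the classical 3-sigma limits, these limits are asymmetric about the center line whenever the statistic's distribution is skewed, which is exactly the case the paper targets.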
Ziwei Ma, Tonghui Wang, Zheng Wei and Xiaonan Zhu
Abstract
Purpose
The purpose of this study is to extend the classical noncentral F-distribution under normal settings to noncentral closed skew F-distribution for dealing with independent samples from multivariate skew normal (SN) distributions.
Design/methodology/approach
Based on generalized Hotelling's T2 statistics, confidence regions are constructed for the difference between location parameters in two independent multivariate SN distributions. Simulation studies show that the confidence regions based on the closed SN model outperform the classical multivariate normal model if the vectors of skewness parameters are not zero. A real data analysis is given for illustrating the effectiveness of our proposed methods.
Findings
This study’s approach is the first in the literature for inference on the difference of location parameters under multivariate SN settings. A real data analysis shows the preference for this new approach over the classical method.
Research limitations/implications
For the real data applications, the authors need to remove outliers first before applying this approach.
Practical implications
This study’s approach may be applied to many multivariate skewed datasets by using SN fits instead of classical normal fits.
Originality/value
The authors’ new approach has many applications for analyzing multivariate skewed data.
Carlos Montes-Galdón and Eva Ortega
Abstract
This chapter proposes a vector autoregressive (VAR) model with structural shocks (SVAR) that are identified using sign restrictions and whose distribution is subject to time-varying skewness. The authors also present an efficient Bayesian algorithm to estimate the model. The model allows tracking joint asymmetric risks to macroeconomic variables included in the SVAR, and provides a structural narrative to the evolution of those risks. When confronted with euro area data, the estimation suggests that there has been significant variation in the skewness of demand, supply and monetary policy shocks. Such variation can explain a significant proportion of the joint dynamics of real GDP growth and inflation, and also generates important asymmetric tail risks in those macroeconomic variables. Finally, compared to the literature on growth- and inflation-at-risk, the authors find that financial stress indicators are not enough to explain all the macroeconomic tail risks.
Abstract
Purpose
The aim of this paper is to propose and analyse policies capable of generating left‐skewed pension distributions. Such policies can deliver large pension values with high probability and hence are of interest to practical fund managers.
Design/methodology/approach
The paper uses a computational method capable of solving stochastic optimal control problems. The optimal strategies obtained through the method are used to simulate dynamic portfolio management.
Findings
The paper finds that optimisation of locally non‐concave performance measures produces left‐skewed payoff distributions with small VaR and CVaR. The distributions remain left‐skewed for relatively large values of the diffusion parameter.
Practical implications
On the basis of the findings, it would seem beneficial for real‐world fund managers to implement this kind of optimising “cautious‐relaxed” policy.
Originality/value
A novel non‐concave performance measure has been proposed in the paper to describe a portfolio manager's aim. The computed “cautious‐relaxed” policies have been shown to realise this aim.
Abstract
This chapter outlines the major analytical efforts performed as part of the overarching research project with the aim to investigate the organizational and environmental circumstances around the extreme negatively skewed performance outcomes regularly observed across firms. It presents the collection and treatment of comprehensive European and North American datasets where subsequent analyses reproduce the contours of performance distributions observed in prior empirical studies. Key theoretical perspectives engaged in prior studies of performance data and the implied risk-return relationships are presented, and these point to emerging commonalities between empirical findings in the management and finance fields. The results from extended analyses of more fine-grained data from North American manufacturing firms uncover the subtle effects of leadership and structural features, and computational simulations demonstrate how the implied adaptive processes can lead to the empirically observed performance distributions. Finally, the findings from the analytical project activities are set in context and the implications of the observed results are discussed to reach a final conclusion.
Liqun Hu, Tonghui Wang, David Trafimow, S.T. Boris Choy, Xiangfei Chen, Cong Wang and Tingting Tong
Abstract
Purpose
The authors’ conclusions are based on mathematical derivations that are supported by computer simulations and three worked examples in applications of economics and finance. Finally, the authors provide a link to a computer program so that researchers can perform the analyses easily.
Design/methodology/approach
Based on a parameter estimation goal, the present work is concerned with determining the minimum sample size researchers should collect so their sample medians can be trusted as good estimates of corresponding population medians. The authors derive two solutions, using a normal approximation and an exact method.
Findings
The exact method provides more accurate answers than the normal approximation method. The authors show that the minimum sample size necessary for estimating the median using the exact method is substantially smaller than that required by the normal approximation method. Therefore, researchers can use the exact method to realise a saving in sample size.
Originality/value
In this paper, the a priori procedure is extended to estimating the population median under skew-normal settings. The mathematical derivation, supported by computer simulations, of the exact method for using the sample median to estimate the population median is new, and a link to a free and user-friendly computer program is provided so researchers can make their own calculations.
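The paper's derivation under skew-normal settings is not reproduced here; the sketch below illustrates only the exact, binomial order-statistic logic behind such a sample-size bound, with precision expressed on the quantile scale rather than in the paper's metric. Function and parameter names are illustrative.

```python
from scipy.stats import binom

def min_n_for_median(f=0.1, conf=0.95, n_max=10001):
    """Smallest odd n so that the sample median falls between the
    population quantiles 0.5 - f and 0.5 + f with probability >= conf.

    Distribution-free: only the quantile bracket is used, so the bound
    holds for any continuous distribution, skew-normal included.
    """
    for n in range(3, n_max, 2):
        k = (n + 1) // 2          # sample median is the k-th order statistic
        # P(median <= q_p) = P(Binomial(n, p) >= k) = binom.sf(k - 1, n, p)
        p_cover = binom.sf(k - 1, n, 0.5 + f) - binom.sf(k - 1, n, 0.5 - f)
        if p_cover >= conf:
            return n
    return None

n_min = min_n_for_median(f=0.1, conf=0.95)
print(f"minimum odd n: {n_min}")
```

Because the binomial computation is exact, no normal approximation error enters, which is why such exact bounds can come out smaller than approximation-based ones.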
David Trafimow, Ziyuan Wang, Tingting Tong and Tonghui Wang
Abstract
Purpose
The purpose of this article is to show the gains that can be made if researchers were to use gain-probability (G-P) diagrams.
Design/methodology/approach
The authors present relevant mathematical equations, invented examples and real data examples.
Findings
G-P diagrams provide a more nuanced understanding of the data than typical summary statistics, effect sizes or significance tests.
Practical implications
Gain-probability diagrams provide a much better basis for making decisions than typical summary statistics, effect sizes or significance tests.
Originality/value
G-P diagrams provide a completely new way to traverse the distance from data to decision-making implications.
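The article's exact construction of a G-P diagram is not reproduced here; one simple reading, sketched below, plots the treatment-minus-control quantile difference (the "gain") against cumulative probability, so a reader sees at which parts of the outcome distribution the gain is concentrated. All names are illustrative.

```python
import numpy as np

def gain_probability_curve(treatment, control, probs=None):
    """Quantile-by-quantile gain between two samples.

    For each probability p, the 'gain' is the treatment sample quantile
    at p minus the control sample quantile at p. (This is one way to
    read a G-P diagram; the article's construction may differ.)
    """
    if probs is None:
        probs = np.linspace(0.01, 0.99, 99)
    gains = np.quantile(treatment, probs) - np.quantile(control, probs)
    return probs, gains

# Demo with a simulated half-unit mean shift.
rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, 10_000)
treatment = rng.normal(0.5, 1.0, 10_000)
probs, gains = gain_probability_curve(treatment, control)
```

A flat curve at 0.5 here would indicate a uniform shift; curvature would reveal gains concentrated in one tail, which is exactly the nuance a single effect size hides.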