Search results

1 – 10 of over 55,000
Article
Publication date: 7 September 2015

Stoyu I. Ivanov

Abstract

Purpose

The purpose of this paper is to determine whether erosion of value exists in grantor trust structured exchange traded funds. The author examines the performance of six currency exchange traded funds’ tracking errors and pricing deviations on an intradaily, one-minute interval basis. All of these exchange traded funds are grantor trusts. The author also studies which metric is of more importance to investors in these exchange traded funds by examining how each performance metric relates to the exchange traded funds’ arbitrage mechanism.

Design/methodology/approach

The Australian Dollar ETF (FXA) is designed to be 100 times the US Dollar (USD) value of the Australian Dollar, the British Pound ETF (FXB) 100 times the USD value of the British Pound, the Canadian Dollar ETF (FXC) 100 times the USD value of the Canadian Dollar, the Euro ETF (FXE) 100 times the USD value of the Euro, the Swiss Franc ETF (FXF) 100 times the USD value of the Swiss Franc and the Japanese Yen ETF (FXY) 10,000 times the USD value of the Japanese Yen. The author uses these proportions to estimate pricing deviations. The author uses a moving average model based on Elton et al. (2002) to estimate whether tracking error or pricing deviation is more relevant in ETF arbitrage and thus to investors.
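
For illustration, the sketch below shows how a pricing deviation and an intradaily tracking error of the kind described above could be computed; the one-minute price, NAV and exchange-rate series are hypothetical, and the return-difference definition of tracking error is an assumption rather than the paper's exact specification.

```python
import pandas as pd

# Hypothetical one-minute observations for a Euro ETF (FXE-style):
# market price, net asset value (NAV), and USD value of one euro.
data = pd.DataFrame({
    "etf_price":   [110.02, 110.05, 109.98, 110.10],
    "nav":         [110.00, 110.04, 110.00, 110.08],
    "usd_per_eur": [1.1001, 1.1004, 1.0999, 1.1009],
})

MULTIPLIER = 100  # FXE is designed to track 100 times the USD value of one euro

# Pricing deviation: ETF market price minus the multiplier-scaled spot value.
data["pricing_deviation"] = data["etf_price"] - MULTIPLIER * data["usd_per_eur"]

# Tracking error (one possible definition): ETF return minus NAV return,
# computed over one-minute intervals and expressed in percent.
data["tracking_error_pct"] = (data["etf_price"].pct_change()
                              - data["nav"].pct_change()) * 100

print(data[["pricing_deviation", "tracking_error_pct"]].describe())
```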

Findings

The author documents that the average intradaily tracking errors for the six currency ETFs are relatively small and stable. The tracking errors are highest for FXF, at 0.000311 percent, and smallest for FXB, at −0.000014 percent; FXB is the only ETF with a negative tracking error. All six ETFs’ average intradaily pricing deviations are negative except for FXA’s, which is a positive $0.17; the remaining pricing deviations are −0.3778 for FXB, −0.3231 for FXC, −0.2697 for FXE, −0.6484 for FXF and −0.9273 for FXY. All exhibit skewness, kurtosis, very high levels of positive autocorrelation and negative trends, which suggests erosion of value. The author also finds that these exchange traded funds’ arbitrage mechanism is more closely related to pricing deviation than to tracking error.

Research limitations/implications

The paper uses high-frequency, one-minute interval data in the analysis of pricing deviation, which might artificially deflate standard errors and thus inflate t-test significance values.

Originality/value

The paper is relevant to ETF investors and contributes to the finance literature’s continuing search for a better ETF performance metric.

Details

International Journal of Managerial Finance, vol. 11 no. 4
Type: Research Article
ISSN: 1743-9132

Keywords

Article
Publication date: 1 March 1992

Ron Garland

Abstract

Electronic scanning check-out systems now operate in most New Zealand supermarkets, and three-quarters of all grocery products bought by New Zealand households are optically scanned. With the introduction of optical scanning technology at point-of-sale comes the debate on price accuracy. Based on a sample of 18,129 products bought in 86 New Zealand supermarkets, the level of pricing errors and the monetary value of pricing errors are examined. Previous research in the USA has suggested that consumers suspect pricing errors may disadvantage them rather than the retailer. However, the monetary consequences of price inaccuracy resulted in a net average undercharge to the consumer of 31 cents in every NZ$100 spent; half of this net average undercharge resulted from uncharged goods, that is, goods free to the consumer. Price inaccuracy in the New Zealand supermarket industry is disadvantaging the supermarket retailer. Extrapolation of the results of this research shows that this industry could be losing nearly NZ$18 million per annum from pricing errors. Recommends detection of pricing errors and greater emphasis on staff training and supervision for check-out operators and for those responsible for price changes.
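
A back-of-the-envelope sketch of the extrapolation step: the 31-cent undercharge rate comes from the abstract, while the annual industry turnover figure below is a placeholder assumption, not a number reported in the article.

```python
# Extrapolating the per-NZ$100 undercharge rate to an industry-wide annual loss.
net_undercharge_per_100 = 0.31        # NZ$ undercharged per NZ$100 spent (from abstract)
assumed_annual_turnover = 5.8e9       # NZ$ per annum -- hypothetical placeholder

estimated_annual_loss = assumed_annual_turnover * net_undercharge_per_100 / 100
print(f"Estimated retailer loss: NZ${estimated_annual_loss / 1e6:.1f} million per annum")
```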

Details

Logistics Information Management, vol. 5 no. 3
Type: Research Article
ISSN: 0957-6053

Keywords

Book part
Publication date: 23 December 2005

David Ng and Mehdi Sadeghi

Abstract

This paper studies the empirical application of an asset pricing model derived from the irrational individual behavior of loss aversion. Previous research on loss aversion asset pricing finds, using simulated data, conclusive evidence that the estimates match the market equity premium and volatility. We find that, in its empirical application, the estimated errors are comparable to the errors estimated from the capital asset pricing model. A study of the correlations between the rational and irrational asset pricing models, based on the empirical results, finds validity for both sets of estimates. Finally, we show the importance of culture, economic development and financial development for asset pricing through an empirical estimation of asset pricing models for five Pacific Basin countries.
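
For context on the CAPM benchmark used in the comparison, the sketch below estimates a CAPM pricing error (the regression alpha) from simulated excess returns; it does not reproduce the loss aversion model itself, and all data are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly excess returns for one asset and the market portfolio.
rng = np.random.default_rng(0)
market_excess = rng.normal(0.005, 0.04, 120)
asset_excess = 0.001 + 1.2 * market_excess + rng.normal(0, 0.02, 120)

# Time-series CAPM regression: the intercept (alpha) is the pricing error.
fit = sm.OLS(asset_excess, sm.add_constant(market_excess)).fit()
alpha, beta = fit.params
print(f"alpha (pricing error): {alpha:.4f}, beta: {beta:.2f}")
```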

Details

Asia Pacific Financial Markets in Comparative Perspective: Issues and Implications for the 21st Century
Type: Book
ISBN: 978-0-76231-258-0

Article
Publication date: 31 May 2019

Siddik Bozkurt and David Gligor

Abstract

Purpose

Although unfavorable pricing errors (UPEs) cost customers billions of dollars each year, research has not yet examined customers’ reactions to UPEs. This paper aims to fill this gap by examining customers’ reactions to UPEs in terms of frequency, magnitude and the interaction between frequency and magnitude. Also, this study explores the moderated mediating role of price consciousness.

Design/methodology/approach

Three experimental studies were conducted to examine customers’ reactions to UPEs in terms of frequency, magnitude and the interaction between frequency and magnitude. PROCESS Models 6 and 84, along with multivariate regression analysis and MANOVA, were used to test the hypotheses.
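
PROCESS is an SPSS/SAS macro, so the sketch below only illustrates the path-analytic idea behind a serial mediation model of the Model 6 type using ordinary regressions; the variable names and simulated data are hypothetical and are not drawn from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data for a serial mediation chain: UPE frequency -> perceived
# deception -> dissatisfaction -> negative word-of-mouth (NWOM).
rng = np.random.default_rng(1)
n = 300
frequency = rng.integers(0, 2, n)                         # low vs high UPE frequency
deception = 0.8 * frequency + rng.normal(0, 1, n)         # first mediator
dissatisfaction = 0.6 * deception + rng.normal(0, 1, n)   # second mediator
nwom = 0.5 * dissatisfaction + 0.2 * frequency + rng.normal(0, 1, n)
df = pd.DataFrame(dict(frequency=frequency, deception=deception,
                       dissatisfaction=dissatisfaction, nwom=nwom))

# Path regressions estimated separately (a-path, b-paths and direct effect).
m1 = smf.ols("deception ~ frequency", df).fit()
m2 = smf.ols("dissatisfaction ~ deception + frequency", df).fit()
m3 = smf.ols("nwom ~ dissatisfaction + deception + frequency", df).fit()

# Serial indirect effect is the product of the three path coefficients.
indirect = (m1.params["frequency"] * m2.params["deception"]
            * m3.params["dissatisfaction"])
print(f"Serial indirect effect (point estimate): {indirect:.3f}")
```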

Findings

The results show that high-frequency and high-magnitude UPEs lead to increased perceived deception and dissatisfaction, resulting in a more negative attitude toward the grocery store, decreased re-patronage intentions and increased negative word-of-mouth (NWOM). The results also show that customers display negative reactions to UPEs regardless of their level of price consciousness.

Research limitations/implications

This paper investigates UPEs only in the brick-and-mortar setting; future studies should examine UPEs in different settings.

Practical implications

The findings show that UPEs can cause significant problems for grocery stores. Thus, managers should take precautionary measures (e.g. constantly checking shelves) to ensure that the advertised price and the checkout price match.

Originality/value

This paper represents the first attempt to empirically examine the relationship between UPE frequency and magnitude, on the one hand, and perceived deception, dissatisfaction, customer attitude, re-patronage intentions, NWOM and price consciousness, on the other.

Details

Journal of Consumer Marketing, vol. 36 no. 6
Type: Research Article
ISSN: 0736-3761

Keywords

Article
Publication date: 20 January 2012

Yuan‐shuh Lii and Monle Lee

Abstract

Purpose

The purpose of this research is to explore the joint effect of compensation frames and product-price levels on consumer attitudinal reactions and behavioral intentions after a service failure involving an online pricing error.

Design/methodology/approach

A 2 (compensation frames: dollar off versus percentage off) × 2 (product-price levels: high-price versus low-price) between-subjects factorial design was used to test the hypotheses.
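
A minimal sketch of how such a 2 × 2 between-subjects design could be analyzed with a two-way ANOVA; the simulated fairness ratings and effect sizes are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated responses for a 2 x 2 between-subjects design:
# compensation frame (dollar vs percentage) by product-price level (high vs low).
rng = np.random.default_rng(2)
frames = np.repeat(["dollar", "percentage"], 100)
prices = np.tile(np.repeat(["high", "low"], 50), 2)
fairness = (5 + 0.6 * (frames == "dollar") * (prices == "high")
            + rng.normal(0, 1, 200))           # stands in for perceived fairness
df = pd.DataFrame(dict(frame=frames, price_level=prices, fairness=fairness))

# Two-way ANOVA with the interaction term, mirroring the factorial design.
model = smf.ols("fairness ~ C(frame) * C(price_level)", df).fit()
print(anova_lm(model, typ=2))
```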

Findings

The findings indicate that consumers perceive a compensatory price reduction framed in dollar terms as more (less) fair than the same price reduction framed in percentage terms for a high-price (low-price) product. The higher consumers’ perceptions of compensation fairness, the more likely they are to report positive post-recovery satisfaction and trust. Consequently, consumers who are satisfied with the compensation effort are more likely to trust the service firm, engage in positive eWOM behavior, and purchase the item.

Practical implications

When online retailers decide to honor erroneous prices posted on their web sites by compensating consumers with a price reduction, they should understand the product-price conditions under which each compensation frame is most effective.

Originality/value

To the best of the authors' knowledge, no previous studies have examined the joint effect of compensation frames and product‐price levels on consumers' attitudinal reactions and behavioral intentions involving service recovery of online pricing errors.

Details

Managing Service Quality: An International Journal, vol. 22 no. 1
Type: Research Article
ISSN: 0960-4529

Keywords

Article
Publication date: 21 July 2020

Shuang Zhang, Song Xi Chen and Lei Lu

Abstract

Purpose

In the presence of pricing errors, the authors consider statistical inference on the variance risk premium (VRP) and the associated implied variance, constructed from option prices and historic returns.

Design/methodology/approach

The authors propose a nonparametric kernel smoothing approach that removes the adverse effects of pricing errors and leads to consistent estimation for both the implied variance and the VRP. The asymptotic distributions of the proposed VRP estimator are developed under three asymptotic regimes regarding the relative sample sizes between the option data and historic return data.
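
The sketch below illustrates the general kernel-smoothing idea with a Gaussian-kernel Nadaraya-Watson estimator applied to noisy option quotes; the data are simulated, and the authors' actual estimator (and its treatment of the three asymptotic regimes) is more involved.

```python
import numpy as np

def nadaraya_watson(x_eval, x_obs, y_obs, bandwidth):
    """Gaussian-kernel Nadaraya-Watson regression estimate at the points x_eval."""
    weights = np.exp(-0.5 * ((x_eval[:, None] - x_obs[None, :]) / bandwidth) ** 2)
    return (weights * y_obs).sum(axis=1) / weights.sum(axis=1)

# Hypothetical noisy option quotes across strikes; the noise plays the role of
# the pricing errors that the smoothing step is meant to remove.
rng = np.random.default_rng(3)
strikes = np.linspace(80, 120, 41)
true_prices = np.maximum(100 - strikes, 0) + 5 * np.exp(-((strikes - 100) / 15) ** 2)
observed = true_prices + rng.normal(0, 0.25, strikes.size)

smoothed = nadaraya_watson(strikes, strikes, observed, bandwidth=2.0)
print(np.round(smoothed[:5], 3))
```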

Findings

This study reveals that existing methods for estimating the implied variance are adversely affected by pricing errors in option prices, which renders the corresponding VRP estimators statistically inconsistent. By analyzing S&P 500 option and return data, the study demonstrates that, compared with other implied variance and VRP estimators, the proposed estimators are more significant variables in explaining variations in excess S&P 500 returns, and the proposed VRP estimates have the smallest out-of-sample forecasting root mean squared error.

Research limitations/implications

This study contributes to the estimation of the implied variance and the VRP and helps in the predictions of future realized variance and equity premium.

Originality/value

This study is the first to propose consistent estimations for the implied variance and the VRP with the presence of option pricing errors.

Details

China Finance Review International, vol. 11 no. 1
Type: Research Article
ISSN: 2044-1398

Keywords

Article
Publication date: 1 May 1994

George C. Philippatos, Nicolas Gressis and Philip L. Baird

Abstract

The Black-Scholes (B-S) model in its various formulations has been the mainstay paradigm of option pricing since its basic formulation in 1973. The model has generally proven empirically robust, despite well-documented evidence of mispricing of deep in-the-money, deep out-of-the-money and, occasionally, at-the-money options with near maturities [see Galai (1983)]. Research on explaining the observed pricing anomalies has focused on the variance of the return of the underlying asset, which, in the case of the B-S model, is assumed to remain invariant over time. The variance term is not directly observable, leading researchers to speculate that pricing discrepancies may be caused by misspecification of this variable. More specifically, interest in the volatility variable has centered on the implied standard deviation (ISD).
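
A minimal sketch of the ISD concept: price a European call with Black-Scholes and then invert the formula numerically to recover the volatility implied by an observed quote. The quote and parameters below are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_sd(price, S, K, T, r):
    """Implied standard deviation (ISD): the volatility that reproduces the quote."""
    return brentq(lambda sigma: bs_call(S, K, T, r, sigma) - price, 1e-6, 5.0)

# Hypothetical near-the-money quote: back out the ISD from the observed price.
print(f"ISD: {implied_sd(price=4.20, S=100, K=100, T=0.25, r=0.03):.4f}")
```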

Details

Managerial Finance, vol. 20 no. 5
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 19 January 2015

Thomas Kokholm and Martin Stisen

Abstract

Purpose

This paper studies the performance of commonly employed stochastic volatility and jump models in the consistent pricing of The CBOE Volatility Index (VIX) and The S&P 500 Index (SPX) options. With the existence of active markets for volatility derivatives and options on the underlying instrument, the need for models that are able to price these markets consistently has increased. Although pricing formulas for VIX and vanilla options are now available for commonly used models exhibiting stochastic volatility and/or jumps, it remains to be shown whether these are able to price both markets consistently. This paper fills this vacuum.

Design/methodology/approach

In particular, the Heston model, the Heston model with jumps in returns and the Heston model with simultaneous jumps in returns and variance (SVJJ) are jointly calibrated to market quotes on SPX and VIX options together with VIX futures.
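
A schematic sketch of joint calibration: residuals from both markets enter one least-squares objective that shares a single parameter vector. The pricing functions below are crude placeholders, not the Heston/SVJ/SVJJ pricers used in the paper, and all quotes are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder pricing functions standing in for a real stochastic-volatility
# model; the point is only that both markets are priced from the same params.
def spx_model_price(params, strike):
    a, b = params
    return a * np.exp(-b * strike / 100.0)   # placeholder, not a real option model

def vix_model_price(params, tenor):
    a, b = params
    return a * b * np.sqrt(tenor)            # placeholder, not a real VIX model

spx_quotes = [(90, 12.1), (100, 6.3), (110, 2.8)]   # (strike, market price), hypothetical
vix_quotes = [(0.25, 16.5), (0.5, 17.8)]            # (tenor, market quote), hypothetical

def joint_residuals(params):
    res_spx = [spx_model_price(params, k) - p for k, p in spx_quotes]
    res_vix = [vix_model_price(params, t) - q for t, q in vix_quotes]
    return np.array(res_spx + res_vix)       # both markets share the same parameters

fit = least_squares(joint_residuals, x0=[20.0, 1.0])
print("calibrated params:", np.round(fit.x, 3))
```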

Findings

Adding the full flexibility of jumps in both returns and volatility to a stochastic volatility model is essential. Moreover, we find that the SVJJ model with the Feller condition imposed and calibrated jointly to SPX and VIX options fits both markets poorly. Relaxing the Feller condition in the calibration improves the performance considerably. Still, the fit is not satisfactory, and we conclude that one needs more flexibility in the model to jointly fit both option markets.

Originality/value

Compared to the existing literature, we derive numerically simpler VIX option and futures pricing formulas in the case of the SVJ model. Moreover, the paper is the first to study the pricing performance of three widely used models on SPX options and VIX derivatives.

Details

The Journal of Risk Finance, vol. 16 no. 1
Type: Research Article
ISSN: 1526-5943

Keywords

Article
Publication date: 11 November 2014

Rick L. Andrews and Peter Ebbes

Abstract

Purpose

This paper aims to investigate the effects of using poor-quality instruments to remedy endogeneity in logit-based demand models. Endogeneity problems in demand models occur when certain factors, unobserved by the researcher, affect both demand and the values of a marketing mix variable set by managers. For example, unobserved factors such as style, prestige or reputation might result in higher prices for a product and higher demand for that product. If not addressed properly, endogeneity can bias the elasticities of the endogenous variable and subsequent optimization of the marketing mix. In practice, instrumental variables (IV) estimation techniques are often used to remedy an endogeneity problem. It is well-known that, for linear regression models, the use of IV techniques with poor-quality instruments can produce very poor parameter estimates, in some circumstances even worse than those that result from ignoring the endogeneity problem altogether. The literature has not addressed the consequences of using poor-quality instruments to remedy endogeneity problems in non-linear models, such as logit-based demand models.

Design/methodology/approach

Using simulation methods, the authors investigate the effects of using poor-quality instruments to remedy endogeneity in logit-based demand models applied to finite-sample data sets. The results show that, even when the conditions for lack of parameter identification due to poor-quality instruments do not hold exactly, estimates of price elasticities can still be quite poor. That being the case, the authors investigate the relative performance of several non-linear IV estimation procedures utilizing readily available instruments in finite samples.

Findings

The study highlights the attractiveness of the control function approach (Petrin and Train, 2010) and readily available instruments, which together reduce the mean squared elasticity errors substantially in experimental conditions where the theory-backed instruments are of poor quality. The authors find important sample-size effects, in particular for the number of brands: endogeneity problems are exacerbated as the number of brands increases, especially when poor-quality instruments are used. In addition, the number of stores is found to be important for likelihood ratio testing. The simulation results are shown to generalize to Nash pricing in oligopolistic markets, to conditions with cross-sectional preference heterogeneity, and to nested logit and probit-based demand specifications. Based on these results, the authors suggest a procedure for managing a potential endogeneity problem in logit-based demand models.
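
A minimal sketch of the control function idea referenced above (Petrin and Train, 2010): regress the endogenous price on an instrument in a first stage, then include the first-stage residual as an extra regressor in the logit. The data-generating process and variable names are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with price endogeneity: an unobserved quality shock raises
# both price and purchase probability. A cost shifter serves as the instrument.
rng = np.random.default_rng(4)
n = 5000
quality = rng.normal(0, 1, n)                 # unobserved by the researcher
cost_instrument = rng.normal(0, 1, n)
price = 10 + 1.0 * cost_instrument + 0.8 * quality + rng.normal(0, 1, n)
utility = 2.0 - 0.5 * price + 1.5 * quality + rng.logistic(0, 1, n)
purchase = (utility > 0).astype(int)

# Control function: first-stage regression of the endogenous price on the
# instrument, then include its residual in the logit demand model.
first_stage = sm.OLS(price, sm.add_constant(cost_instrument)).fit()
control = price - first_stage.fittedvalues

X_cf = sm.add_constant(np.column_stack([price, control]))
logit_cf = sm.Logit(purchase, X_cf).fit(disp=0)
print("price coefficient with control function:", round(logit_cf.params[1], 3))

# For comparison: a naive logit that ignores endogeneity biases the price effect.
logit_naive = sm.Logit(purchase, sm.add_constant(price)).fit(disp=0)
print("price coefficient, naive logit:", round(logit_naive.params[1], 3))
```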

Originality/value

The literature on demand modeling has focused on deriving analytical results on the consequences of using poor-quality instruments to remedy endogeneity problems in linear models. Despite the widespread use of non-linear demand models such as logit, this study is the first to address the consequences of using poor-quality instruments in these models and to make practical recommendations on how to avoid poor outcomes.

Details

Journal of Modelling in Management, vol. 9 no. 3
Type: Research Article
ISSN: 1746-5664

Keywords

Article
Publication date: 1 July 1995

Steven J. Cochran and Robert H. DeFina

Abstract

Several recent studies have indicated the existence of a predictable component in stock prices. This study examines the sources of this serial correlation using error‐correction models. The results show that autocorrelated economic variables can generate serial correlation in stock returns. After these effects are accounted for, however, significant serial correlation in stock prices remains. The activities of noise traders and inefficiencies in the pricing of securities, within the context of limitations to the arbitrage process, are suggested as additional sources of serial correlation in stock prices.
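
A minimal sketch of a two-step error-correction model of the Engle-Granger type, one common way such models are estimated; the cointegrated price and fundamental series below are simulated, and the specification is an assumption rather than the authors' exact model.

```python
import numpy as np
import statsmodels.api as sm

# Simulated illustration: a stock-price series that shares a long-run
# relationship with an integrated economic fundamental.
rng = np.random.default_rng(5)
n = 500
fundamental = np.cumsum(rng.normal(0, 1, n))      # integrated economic variable
price = 2.0 * fundamental + rng.normal(0, 1, n)   # cointegrated with it

# Step 1: long-run (cointegrating) regression in levels.
long_run = sm.OLS(price, sm.add_constant(fundamental)).fit()
ect = long_run.resid                              # error-correction term

# Step 2: short-run dynamics; the lagged ECT pulls prices back toward equilibrium.
d_price = np.diff(price)
d_fund = np.diff(fundamental)
X = sm.add_constant(np.column_stack([d_fund, ect[:-1]]))
ecm = sm.OLS(d_price, X).fit()
print("adjustment coefficient on lagged ECT:", round(ecm.params[2], 3))
```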

Details

Managerial Finance, vol. 21 no. 7
Type: Research Article
ISSN: 0307-4358
