Search results
1 – 10 of 261
Eric Renault and Daniela Scidá
Abstract
Many information-theoretic measures have been proposed for the quantitative assessment of causality relationships. While Gouriéroux, Monfort, and Renault (1987) introduced the so-called “Kullback Causality Measures,” extending Geweke’s (1982) work in the context of Gaussian VAR processes, Schreiber (2000) set a special focus on Granger causality and dubbed the same measure “transfer entropy.” Both papers measure causality in the context of Markov processes. One contribution of this paper is to focus on the interplay between the measurement of (non)-markovianity and the measurement of Granger causality. Both can be framed in terms of prediction: how much does forecast accuracy deteriorate when some relevant conditioning information is forgotten? In this paper we argue that this common feature between (non)-markovianity and Granger causality has led people to overestimate the amount of causality, because what they consider a causality measure may also convey a measure of the amount of (non)-markovianity. We set a special focus on the design of measures that properly disentangle these two components. Furthermore, this disentangling leads us to revisit the equivalence between the Sims and Granger concepts of noncausality and the log-likelihood ratio tests for each of them. We argue that testing for Granger causality amounts to testing non-nested hypotheses.
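The transfer-entropy measure discussed in this abstract can be illustrated with a short sketch. The code below is a plug-in (histogram) estimator for two discrete series with first-order (Markov-1) histories; it illustrates Schreiber's measure in its simplest form, not the estimator used in the paper, and the series `x` and `y` are synthetic.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of transfer entropy T_{X->Y} for discrete series,
    with first-order histories:
    T = sum p(y', y, x) * log[ p(y' | y, x) / p(y' | y) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    singles_y = Counter(y[:-1])                     # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]              # p(y' | y, x)
        p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0]     # p(y' | y)
        te += p_joint * np.log(p_cond_full / p_cond_y)
    return te / np.log(base)   # convert nats to the requested base

# Synthetic example: y copies x with a one-step lag, so X "Granger-causes" Y.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]                     # y_{t+1} is determined by x_t
print(transfer_entropy(x, y))      # close to 1 bit
print(transfer_entropy(y, x))      # close to 0 bits
```

Note that the plug-in estimator is biased upward in finite samples, which is one concrete way a "causality" estimate can overstate the true amount of causality.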
Glenn W. Harrison and J. Todd Swarthout
Abstract
We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature estimates only a subset of CPT parameters, or simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate: they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the model exists “because the data says it should”: it was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If the CPT model is to be taken seriously and rigorously, then it needs to do a much better job of explaining the data than it does here.
Chrystalleni Aristidou, Kevin Lee and Kalvinder Shields
Abstract
A novel approach to modeling exchange rates is presented based on a set of models distinguished by the drivers of the rate and regime duration. The models are combined into a “meta model” using model averaging and non-nested hypothesis-testing techniques. The meta model accommodates periods of stability and slowly evolving or abruptly changing regimes involving multiple drivers. Estimated meta models for five exchange rates provide a compelling characterization of their determination over the last 40 years or so, identifying “phases” during which the influences from policy and financial market responses to news succumb to equilibrating macroeconomic pressures and vice versa.
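The idea of combining candidate models into a "meta model" via model averaging can be sketched in a few lines. The weighting scheme below, which weights each model by its recent out-of-sample log score, is a hypothetical stand-in for the paper's averaging technique, and the forecasts and scores are invented for illustration.

```python
import numpy as np

def average_forecasts(forecasts, log_scores):
    """Combine candidate-model forecasts with weights proportional to
    exp(recent predictive log score) of each model.

    forecasts:  (k,) array of point forecasts from k candidate models
    log_scores: (k,) recent out-of-sample predictive log-likelihoods
    """
    w = np.exp(log_scores - log_scores.max())   # subtract max for stability
    w /= w.sum()                                # weights sum to one
    return w @ forecasts, w

# Hypothetical numbers: three exchange-rate models and their recent scores.
f = np.array([1.30, 1.28, 1.35])     # candidate forecasts of the rate
ls = np.array([-1.1, -0.9, -2.0])    # recent log scores (model 2 fits best)
combined, weights = average_forecasts(f, ls)
```

Because the weights are recomputed as new data arrive, the combination can shift smoothly between "phases" in which different drivers dominate, which is the spirit of the meta model described above.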
Abstract
Dissolving trade barriers, financial deregulation, the hyper-mobility of capital and the rapid diffusion of new information technologies have ushered the Australian economy into the borderless world. The orthodoxy that centralised wage-fixing in Australia has impeded wage flexibility and resulted in high unemployment is unconvincing, partly because in the 1980s Australian labour market institutions were decentralised and decollectivised in response to pressures from the borderless world. The insights garnered from cross-sectional comparative statics, namely that skill-biased Schumpeterian technological change was the major cause of labour immiserisation and that adverse Stolper-Samuelson trade effects were insignificant, need to be reviewed. Parsimonious dynamic time-series models of trade and technology have been formulated using general-to-specific methods, after taking account of stochastic trends through unit root and cointegration tests. Granger causality and non-nested tests applied to these models support the contention that both trade and technology contributed to increasing wage disparity during the borderless era. Moreover, supply-side factors such as female participation and immigration, and institutional factors such as deunionisation, have also increased wage disparity. The deregulation of the Australian labour market by the Workplace Relations Act, whilst an inevitable response to the demands of competitiveness in the borderless world market, would exacerbate wage inequality. Policies aimed at skill accumulation on the one hand, and social welfare policies involving negative income taxes on the other, may have to be implemented to mitigate the deleterious social effects of rising wage inequality.
Sarah Jinhui Wu, Steven A. Melnyk and Morgan Swink
Abstract
Purpose
Operational practices and operational capabilities are critical yet distinct elements in operations strategy. The purpose of this paper is to examine their conceptual differences and explore how they are developed in a portfolio, considering the potential for practices and capabilities to be either compensatory or additive in nature.
Design/methodology/approach
The compensatory model argues that a lack of investment in certain practices or capabilities can be offset by a higher level of investment in other practices or capabilities. In contrast, the additive model argues that the firm must invest in certain practices or capabilities and that trade-offs are impossible. The authors examine evidence for these two competing models using an approach borrowed from studies of multi-attribute consumer preference models and statistical comparisons of non-nested models.
Findings
Data for the study were collected from operations managers who were members of a large professional organization. The findings indicate that the effects of operational practices are additive for some operational outcomes and compensatory for others. However, the combinatorial nature of operational capabilities is purely compensatory.
Practical implications
The results imply that adequate investment in a wide range of operational practices is necessary to enhance operations performance. However, operations units appear to have more flexibility in choosing to develop a distinctive operational capability set.
Originality/value
The study clarifies the different orientation of operational practices and operational capabilities as they contribute to operations strategy. The findings provide guidelines regarding the combinatorial natures of operational practices and operational capabilities. These guidelines have critical strategic implications for resource allocation schemes and how these schemes affect operational performance.
Anastasios G. Malliaris and Ramaprasad Bhar
Abstract
The equity premium of the S&P 500 index is explained in this paper by several variables that can be grouped into fundamental, behavioral, and macroeconomic factors. We hypothesize that the statistical significance of these variables changes across economic regimes. The three regimes we consider are the low-volatility, medium-volatility, and high-volatility regimes, in contrast to previous studies that do not differentiate across economic regimes. Using the three-state Markov switching regime econometric methodology, we confirm that the statistical significance of the independent variables representing fundamentals, macroeconomic conditions, and a behavioral variable changes across economic regimes. Our findings offer a better understanding of what moves the equity premium across economic regimes than single-equation estimation can provide. Our results also confirm the significance of momentum as a behavioral variable across all economic regimes.
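The data-generating process behind a three-state Markov switching regression can be sketched as follows: a latent regime follows a Markov chain, and the slope and volatility of the premium equation depend on the current regime. The transition matrix, slopes, and volatilities below are invented for illustration; estimating them from data (e.g. via the Hamilton filter and EM) is the hard part and is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-state transition matrix (rows sum to 1):
# state 0 = low vol, 1 = medium vol, 2 = high vol.
P = np.array([[0.95, 0.04, 0.01],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])
beta  = np.array([0.2, 0.5, 1.0])   # regime-dependent slope on the predictor
sigma = np.array([0.5, 1.0, 2.0])   # regime-dependent volatility

T = 500
s = np.zeros(T, dtype=int)          # latent regime path
for t in range(1, T):
    s[t] = rng.choice(3, p=P[s[t - 1]])

x = rng.normal(size=T)              # stand-in explanatory variable
premium = beta[s] * x + sigma[s] * rng.normal(size=T)
```

In a single-equation regression on this data, one pooled slope would blur the three regime-specific slopes, which is the motivation for letting significance vary by regime.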
Shehely Parvin, Paul Z. Wang and Jashim Uddin
Abstract
Purpose
The purpose of this paper is to examine two alternative consumer behavioural intention models that have been developed from the marketing and information systems disciplines in a service environment. Specifically, it reports an empirical assessment of the two non-nested structural models in the context of the Australian restaurant industry.
Design/methodology/approach
This study used a web-based survey administered by an online research organization; structural equation modelling with AMOS was used to compare the two non-nested behavioural intention models.
Findings
This study found that the second model, which incorporates expectation-confirmation theory, outperformed the first model in terms of fit with the empirical data.
Practical implications
The findings of this study provide service managers with important insights into the appropriate design of service delivery systems to increase consumer satisfaction, which in turn leads to more positive behavioural intentions. Moreover, the restaurant research setting means that marketing managers in the growing tourism and hospitality industry should also benefit from the study findings.
Originality/value
This study synthesized two consumer behavioural intention models from different disciplines and provided an approach to the empirical comparison of the non-nested structural models.
Abstract
Purpose
Many studies examine the relative information content of earnings and cash flows from operations. Most studies find that earnings have higher information content than cash flows. An interesting question that follows is whether these findings hold after controlling for the extremity of earnings and cash flows. The purpose of this paper is to examine the relative information content of earnings and cash flows in the following four different cases: first, moderate earnings vs moderate cash flows, second, extreme earnings vs moderate cash flows, third, moderate earnings vs extreme cash flows, and fourth, extreme earnings vs extreme cash flows.
Design/methodology/approach
To assess the relative information content of earnings and cash flows for each of the four cases mentioned above, the authors compare the explanatory power of a regression of returns on unexpected earnings relative to a regression of returns on unexpected cash flows. Specifically, the authors compare the adjusted R2 of the model with earnings variables and the model with cash flow variables using Vuong's test, which examines the statistical significance of the difference between the adjusted R2s of the rival (non-nested) models, and interpret a statistically higher adjusted R2 as an indicator of higher relative information content.
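Vuong's test for non-nested models can be sketched for two linear regressions with Gaussian errors: compute the per-observation log-likelihood difference and form a standardized z-statistic. This is a minimal textbook version of the statistic (no adjustment for the number of parameters), and the data below are synthetic, not the paper's.

```python
import numpy as np

def vuong_test(y, X1, X2):
    """Vuong (1989) statistic for two non-nested linear models,
    y ~ X1 vs y ~ X2, both with Gaussian errors.
    A large positive z favours model 1; |z| is compared to
    standard normal critical values."""
    n = len(y)
    def loglik_i(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        e = y - X @ beta
        s2 = e @ e / n                       # ML error variance
        return -0.5 * np.log(2 * np.pi * s2) - e**2 / (2 * s2)
    m = loglik_i(X1) - loglik_i(X2)          # per-observation differences
    return np.sqrt(n) * m.mean() / m.std(ddof=1)

# Synthetic check: y is driven by x1, so model 1 should win clearly.
rng = np.random.default_rng(2)
n = 400
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)
X1 = np.column_stack([np.ones(n), x1])
X2 = np.column_stack([np.ones(n), x2])
print(vuong_test(y, X1, X2))   # large positive z
```

Because neither regressor set nests the other, an ordinary F-test does not apply, which is why comparisons of rival earnings and cash-flow models rely on Vuong-type statistics.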
Findings
The results show that: first, when both earnings and cash flows are moderate, earnings are more highly associated with stock market price changes than cash flows, second, when both earnings and cash flows are extreme, earnings also have greater relative information content than cash flows, third, when the extremity differs between earnings and cash flows, the moderate variable is superior to the extreme variable in explaining security returns. These results suggest that earnings are generally more value relevant than cash flows; only when cash flows from operations are moderate and earnings are extreme do cash flows possess higher information content than earnings.
Practical implications
The explanatory power for stock returns will be higher for earnings or cash flows depending on which is more highly persistent. This result reverses the conventional finding of the superiority of earnings over cash flows in explaining security returns.
Originality/value
In contrast to previous studies, the authors control for the extremity of earnings and cash flows when evaluating the relative information content of earnings and cash flows from operations.