Search results

1–10 of over 10,000 results
Article
Publication date: 23 June 2022

Mohamed A. Ayadi, Anis Chaibi and Lawrence Kryzanowski

Abstract

Purpose

Prior research has documented inconclusive and/or mixed empirical evidence on the timing performance of hybrid funds. Their performance inferences generally do not efficiently control for fixed-income exposure, conditioning information, and cross-correlations in fund returns. This study examines the stock and bond timing performances of hybrid funds while controlling and accounting for these important issues. It also discusses the inferential implications of using alternative bootstrap resampling approaches.

Design/methodology/approach

We examine the stock and bond timing performances of hybrid funds using (un)conditional multi-factor benchmark models with robust estimation inferences. We also rely on the block bootstrap method to account for cross-correlations in fund returns and to separate the effects of luck or sampling variation from manager skill.
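
To make the luck-versus-skill logic concrete, the sketch below estimates a Treynor–Mazuy-style timing regression for a single simulated fund and uses a circular block bootstrap under a "no timing ability" null to judge whether the estimated timing coefficient could arise from sampling variation alone. The simulated data, the single market factor, and the block length are illustrative assumptions, not the paper's multi-factor (un)conditional specifications.

```python
# Minimal sketch: timing regression + circular block bootstrap under a no-timing null.
import numpy as np

rng = np.random.default_rng(42)
T = 240                                                # 20 years of monthly data
mkt = rng.normal(0.005, 0.04, T)                       # market excess returns
fund = 0.001 + 0.9 * mkt + rng.normal(0.0, 0.02, T)    # fund with no true timing ability

def ols(y, X):
    """OLS coefficients and residuals."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

# Timing regression: r_fund = a + b*r_mkt + g*r_mkt^2 + e (g is the timing coefficient)
X_tm = np.column_stack([np.ones(T), mkt, mkt ** 2])
gamma_hat = ols(fund, X_tm)[0][2]

# Null model without timing: r_fund = a + b*r_mkt + e
X0 = np.column_stack([np.ones(T), mkt])
beta0, resid0 = ols(fund, X0)
fitted0 = X0 @ beta0

def circular_block_indices(n, block_len, rng):
    """Circular block bootstrap indices that preserve short-run dependence."""
    n_blocks = -(-n // block_len)                      # ceiling division
    starts = rng.integers(0, n, n_blocks)
    idx = (starts[:, None] + np.arange(block_len)).ravel() % n
    return idx[:n]

B, block_len = 2000, 6
gamma_null = np.empty(B)
for b in range(B):
    idx = circular_block_indices(T, block_len, rng)
    y_star = fitted0 + resid0[idx]                     # pseudo-fund with zero timing
    gamma_null[b] = ols(y_star, X_tm)[0][2]

p_value = np.mean(np.abs(gamma_null) >= abs(gamma_hat))
print(f"timing coefficient = {gamma_hat:.4f}, bootstrap p-value = {p_value:.3f}")
```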

Findings

We find that the timing performance of portfolios of funds is neutral overall but sensitive to the controls for fixed-income exposures and to the choice of timing measurement model. The block-bootstrap analyses of funds in the tails of the distributions of stock timing performance suggest that sampling variation explains the underperformance of extreme left-tail funds, and they confirm both good and bad luck in the bond timing of tail funds. Inferences change depending on whether the Kosowski et al. or the Fama and French bootstrap approach is used.

Originality/value

This study provides extensive and robust evidence on the stock and bond timing performances of hybrid funds and their sensitivity to (un)conditional linear multi-factor benchmark models. It examines the timing performance of funds in the extreme tails of the distribution using the block bootstrap method to efficiently identify (un)skilled fund managers. It also highlights the sensitivity of inferences to the choice of testing methodology.

Details

International Journal of Managerial Finance, vol. 19 no. 3
Type: Research Article
ISSN: 1743-9132

Book part
Publication date: 20 September 2021

John R. Busenbark, Kenneth A. Frank, Spiro J. Maroulis, Ran Xu and Qinyun Lin

Abstract

In this chapter, we explicate two related techniques that help quantify the sensitivity of a given causal inference to potential omitted variables and/or other sources of unexplained heterogeneity. In particular, we describe the Impact Threshold of a Confounding Variable (ITCV) and the Robustness of Inference to Replacement (RIR). The ITCV describes the minimum correlation necessary between an omitted variable and the focal parameters of a study to have created a spurious or invalid statistical inference. The RIR is a technique that quantifies the percentage of observations with nonzero effects in a sample that would need to be replaced with zero effects in order to overturn a given causal inference at any desired threshold. The RIR also measures the percentage of a given parameter estimate that would need to be biased in order to overturn an inference. Each of these procedures is critical to help establish causal inference, perhaps especially for research urgently studying the COVID-19 pandemic when scholars are not afforded the luxury of extended time periods to determine precise magnitudes of relationships between variables. Over the course of this chapter, we define each technique, illustrate how they are applied in the context of seminal strategic management research, offer guidelines for interpreting corresponding results, and delineate further considerations.
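
As a concrete illustration of the RIR logic, the sketch below computes a simplified "percent bias to invalidate an inference": the share of a coefficient estimate that would have to be due to bias for it to fall below the two-sided significance threshold. The function name and the example numbers are hypothetical; this is not the authors' full ITCV/RIR machinery.

```python
# Minimal sketch of a RIR-style "percent bias to invalidate" calculation.
from scipy import stats

def percent_bias_to_invalidate(estimate, std_error, dof, alpha=0.05):
    """Share of the estimate that must be bias for it to lose significance at alpha."""
    t_crit = stats.t.ppf(1 - alpha / 2, dof)        # two-sided critical value
    threshold = t_crit * std_error                  # smallest estimate still 'significant'
    if abs(estimate) <= threshold:
        return 0.0                                  # already not significant
    return 1.0 - threshold / abs(estimate)

# Hypothetical example: b = 0.40, SE = 0.12, 200 residual degrees of freedom
share = percent_bias_to_invalidate(0.40, 0.12, 200)
print(f"{share:.1%} of the estimate would need to be bias to overturn the inference")
```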

Book part
Publication date: 13 May 2017

Otávio Bartalotti and Quentin Brummet

Abstract

Regression discontinuity designs have become popular in empirical studies due to their attractive properties for estimating causal effects under transparent assumptions. Nonetheless, most popular procedures assume i.i.d. data, which is unreasonable in many common applications. To fill this gap, we derive the properties of traditional local polynomial estimators in a fixed-G setting that allows for cluster dependence in the error term. Simulation results demonstrate that accounting for clustering in the data while selecting bandwidths may lead to lower MSE while maintaining proper coverage. We then apply our cluster-robust procedure to an application examining the impact of Low-Income Housing Tax Credits on neighborhood characteristics and low-income housing supply.
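
The sketch below illustrates the setting the chapter studies: a local linear regression discontinuity estimate with a triangular kernel and cluster-robust standard errors. The simulated data, the fixed bandwidth, and the use of standard cluster-robust covariance are illustrative assumptions; the chapter's bandwidth selection and fixed-G asymptotics are not reproduced.

```python
# Minimal sketch: local linear RD estimate with triangular kernel weights
# and cluster-robust standard errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, cutoff, h = 2000, 0.0, 0.5
running = rng.uniform(-1, 1, n)                     # running variable
cluster = rng.integers(0, 40, n)                    # 40 clusters
cluster_shock = rng.normal(0, 0.3, 40)[cluster]     # within-cluster dependence
treat = (running >= cutoff).astype(float)
y = 1.0 + 0.8 * treat + 0.5 * running + cluster_shock + rng.normal(0, 0.5, n)

# Keep observations within the bandwidth and weight them with a triangular kernel
in_bw = np.abs(running - cutoff) <= h
dist = np.abs(running[in_bw] - cutoff) / h
w = 1.0 - dist

X = np.column_stack([
    treat[in_bw],
    running[in_bw] - cutoff,
    (running[in_bw] - cutoff) * treat[in_bw],
])
X = sm.add_constant(X)

fit = sm.WLS(y[in_bw], X, weights=w).fit(
    cov_type="cluster", cov_kwds={"groups": cluster[in_bw]}
)
print(f"RD effect = {fit.params[1]:.3f} (cluster-robust SE = {fit.bse[1]:.3f})")
```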

Details

Regression Discontinuity Designs
Type: Book
ISBN: 978-1-78714-390-6

Book part
Publication date: 13 May 2017

Details

Regression Discontinuity Designs
Type: Book
ISBN: 978-1-78714-390-6

Book part
Publication date: 13 December 2013

Nikolay Gospodinov, Ana María Herrera and Elena Pesavento

Abstract

This article investigates the robustness of impulse response estimators to near unit roots and near cointegration in vector autoregressive (VAR) models. We compare estimators based on VAR specifications determined by pretests for unit roots and cointegration as well as unrestricted VAR specifications in levels. Our main finding is that the impulse response estimators obtained from the levels specification tend to be most robust when the magnitude of the roots is not known. The pretest specification works well only when the restrictions imposed by the model are satisfied. Its performance deteriorates even for small deviations from the exact unit root for one or more model variables. We illustrate the practical relevance of our results through simulation examples and an empirical application.
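
A minimal sketch of the levels specification discussed above, assuming simulated near-unit-root data and the statsmodels VAR implementation: estimate an unrestricted VAR in levels and compute impulse responses. The chapter's pretest-based specifications are not reproduced here.

```python
# Minimal sketch: unrestricted levels VAR on near-unit-root data with impulse responses.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
T = 400
eps = rng.normal(size=(T, 2))
y = np.zeros((T, 2))
for t in range(1, T):
    # Bivariate system with a largest root very close to unity (0.98)
    y[t, 0] = 0.98 * y[t - 1, 0] + eps[t, 0]
    y[t, 1] = 0.5 * y[t - 1, 0] + 0.90 * y[t - 1, 1] + eps[t, 1]

data = pd.DataFrame(y, columns=["x1", "x2"])
results = VAR(data).fit(maxlags=4, ic="aic")        # unrestricted VAR in levels
irf = results.irf(20)                               # impulse responses over 20 horizons
print(irf.irfs[:5])                                 # responses at the first five horizons
```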

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Article
Publication date: 27 May 2022

John Galakis, Ioannis Vrontos and Panos Xidonas

Abstract

Purpose

This study aims to introduce a tree-structured linear and quantile regression framework to the analysis and modeling of equity returns, within the context of asset pricing.

Design/Methodology/Approach

The approach is based on the idea of a binary tree, where every terminal node parameterizes a local regression model for a specific partition of the data. A Bayesian stochastic method is developed, including model selection and estimation of the tree-structure parameters. The framework is applied to numerous U.S. asset pricing models, using alternative mimicking factor portfolios, data frequencies, market indices, and equity portfolios.
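
The sketch below illustrates the tree idea in the simplest possible case: a grid search for a single threshold on one common factor, with a separate median (quantile) regression fitted in each resulting partition. The simulated factors and the grid search are illustrative assumptions; the paper instead develops a Bayesian stochastic search over full tree structures.

```python
# Minimal sketch: one tree split on a factor, quantile regression in each partition.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(3)
n = 1500
mkt = rng.normal(0, 0.05, n)                        # market factor
smb = rng.normal(0, 0.03, n)                        # size factor (split variable)
# Returns whose market beta changes when SMB crosses 0 (a built-in threshold)
ret = np.where(smb < 0, 0.6 * mkt, 1.4 * mkt) + rng.normal(0, 0.02, n)

def partition_loss(threshold, q=0.5):
    """Sum of check-function losses from median regressions on both sides of the split."""
    loss = 0.0
    for mask in (smb < threshold, smb >= threshold):
        X = sm.add_constant(mkt[mask])
        res = QuantReg(ret[mask], X).fit(q=q)
        u = ret[mask] - X @ res.params
        loss += np.sum(u * (q - (u < 0)))           # pinball / check loss
    return loss

grid = np.quantile(smb, np.linspace(0.1, 0.9, 33))  # candidate thresholds
best = min(grid, key=partition_loss)
print(f"estimated split point on SMB: {best:.4f}")
```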

Findings

The findings reveal strong evidence that asset returns exhibit asymmetric effects and non-linear patterns in response to different common factors and, more importantly, that there are multiple thresholds that create several partitions in the common factor space.

Originality/Value

To the best of the authors' knowledge, this paper is the first to explore and apply a tree-structured and quantile regression framework in an asset pricing context.

Details

Review of Accounting and Finance, vol. 21 no. 3
Type: Research Article
ISSN: 1475-7702

Article
Publication date: 21 October 2020

Krittika Banerjee and Ashima Goyal

Abstract

Purpose

After the adoption of unconventional monetary policies (UMPs) in advanced economies (AEs) there were many studies of monetary spillovers to asset prices in emerging market economies (EMEs) but the extent of contribution of EMEs and AEs, respectively, in real exchange rate (RER) misalignments has not been addressed. This paper addresses the gap in a cross-country panel set-up with country specific controls.

Design/methodology/approach

Fixed effects, pooled mean group (Pesaran et al., 1999) and common correlated effects (Pesaran, 2006) estimations are used to examine the relationship. Multiway clustering is taken into account to ensure robust statistical inferences.
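
As a simplified illustration of the panel set-up, the sketch below runs a country-fixed-effects regression of a hypothetical RER misalignment measure on an AE policy variable and a control, with standard errors clustered by country. The variable names and simulated panel are assumptions; the paper's pooled mean group and common correlated effects estimators and its multiway clustering are not reproduced.

```python
# Minimal sketch: country fixed effects with country-clustered standard errors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
countries, years = 30, 20
idx = pd.MultiIndex.from_product([range(countries), range(years)],
                                 names=["country", "year"])
df = pd.DataFrame(index=idx).reset_index()
df["ae_policy_rate"] = np.tile(rng.normal(0, 1, years), countries)   # common AE shock
df["capital_flows"] = rng.normal(0, 1, len(df))
df["rer_misalignment"] = (0.4 * df["ae_policy_rate"]
                          + 0.2 * df["capital_flows"]
                          + rng.normal(0, 1, len(df)))

fit = smf.ols(
    "rer_misalignment ~ ae_policy_rate + capital_flows + C(country)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
print(fit.params[["ae_policy_rate", "capital_flows"]])
print(fit.bse[["ae_policy_rate", "capital_flows"]])
```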

Findings

Robust evidence is found for significant monetary spillovers over 1998–2017 in the form of RER overvaluation of EMEs against AEs, especially through the portfolio rebalancing channel. EME RERs against the US saw significantly more overvaluation in UMP years, indicating a greater role of the US in monetary spillovers. However, in the long run, monetary neutrality holds. EMEs did pursue mercantilist and precautionary policies that undervalued their RERs. Precautionary undervaluation is more evident in the bilateral EME–US RER.

Research limitations/implications

It may be useful for large EMEs to monitor the impact of foreign portfolio flows on short-run deviations in RER. Export diversification reduces EME mercantilist motives against the US. That AE monetary policy significantly appreciates EME RER has implications for future policy cooperation between EMEs and AEs.

Originality/value

To the best of the authors' knowledge, such a comparative analysis between AE and EME policy variables on RER misalignment has not been done previously.

Details

International Journal of Emerging Markets, vol. 17 no. 2
Type: Research Article
ISSN: 1746-8809

Book part
Publication date: 25 January 2023

Yang Yang, Graziano Abrate and Chunrong Ai

Abstract

This chapter provides an overview of the status of applied econometric research in hospitality and tourism management and outlines the econometric toolsets available for quantitative researchers using empirical data from the field. Basic econometric models, cross-sectional models, time-series models, and panel data models are reviewed first, followed by an evaluation of relevant applications. Next, econometric modeling topics that are germane to hospitality and tourism research are discussed, including endogeneity, multi-equation modeling, causal inference modeling, and spatial econometrics. Furthermore, major feasibility issues for applied researchers are examined based on the literature. Lastly, recommendations are offered to promote applied econometric research in hospitality and tourism management.
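
As one concrete example of the endogeneity tools the chapter surveys, the sketch below implements two-stage least squares by hand for a simulated price-demand setting with a single instrument. The data and variable names are invented for illustration and are not an application from the chapter.

```python
# Minimal sketch: two-stage least squares (2SLS) implemented as two OLS stages.
import numpy as np

rng = np.random.default_rng(9)
n = 1000
z = rng.normal(size=n)                              # instrument (e.g., a cost shifter)
u = rng.normal(size=n)                              # unobserved demand shock
price = 1.0 + 0.8 * z + 0.5 * u + rng.normal(0, 0.5, n)   # endogenous regressor
demand = 5.0 - 1.2 * price + u + rng.normal(0, 0.5, n)

def ols(y, X):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: project the endogenous price on the instrument
Z = np.column_stack([np.ones(n), z])
price_hat = Z @ ols(price, Z)

# Stage 2: regress demand on the fitted price
beta_2sls = ols(demand, np.column_stack([np.ones(n), price_hat]))
beta_ols = ols(demand, np.column_stack([np.ones(n), price]))
print(f"OLS price effect:  {beta_ols[1]:.3f} (biased toward zero here)")
print(f"2SLS price effect: {beta_2sls[1]:.3f} (close to the true -1.2)")
```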

Details

Cutting Edge Research Methods in Hospitality and Tourism
Type: Book
ISBN: 978-1-80455-064-9

Article
Publication date: 27 October 2020

Pavana Kumara Bellairu, Shreeranga Bhat and E.V. Gijo

Abstract

Purpose

The aim of this article is to demonstrate the development of environmentally friendly, low-cost natural fibre composites through a robust engineering approach. More specifically, the prime objective of the study is to optimise the composition of natural fibre-reinforced polymer nanocomposites using a robust statistical approach.

Design/methodology/approach

In this research, the material is prepared using multi-walled carbon nanotubes (MWCNT), Cantala fibres and epoxy resin in accordance with ASTM (American Society for Testing and Materials) standards. The composition is then optimised for the flexural strength of the material using the mixture-design approach.
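
The sketch below illustrates the modeling step behind the mixture-design approach: fitting a Scheffé special cubic model of flexural strength on three mixture components whose proportions sum to one. The component labels, proportions, and strength values are simulated assumptions, not the paper's experimental data.

```python
# Minimal sketch: Scheffé special cubic mixture model fitted by OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
m = 30
# Random mixture proportions (x1 + x2 + x3 = 1), with MWCNT kept to a small share
raw = rng.dirichlet([8.0, 3.0, 0.5], size=m)
x1, x2, x3 = raw[:, 0], raw[:, 1], raw[:, 2]        # epoxy, Cantala fibre, MWCNT
strength = 60 * x1 + 80 * x2 + 200 * x3 + 40 * x1 * x2 + rng.normal(0, 2, m)

# Special cubic model: no intercept, linear + binary blend + ternary blend terms
X = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3])
fit = sm.OLS(strength, X).fit()
terms = ["x1", "x2", "x3", "x1*x2", "x1*x3", "x2*x3", "x1*x2*x3"]
for name, coef in zip(terms, fit.params):
    print(f"{name:>9s}: {coef:8.2f}")
```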

Findings

The results of the study indicate that MWCNT plays a vital role in increasing the flexural strength of the composite. Moreover, it is observed that the second-order and third-order interactions among the mixture components are statistically significant, which leads to the proposal of a special cubic model for the novel composite material, supported by residual analysis. The methodology also assists in optimising the mixture component values to maximise the flexural strength of the novel composite material.

Originality/value

This article attempts to include both MWCNT and Cantala fibres to develop a novel composite material. In addition, it employs the mixture-design technique to optimise the composition and predict the model of the study in a step-by-step manner, which will act as a guideline for academicians and practitioners to optimise the material composition with specific reference to natural fibre reinforced nanocomposites.

Details

Multidiscipline Modeling in Materials and Structures, vol. 17 no. 2
Type: Research Article
ISSN: 1573-6105

Article
Publication date: 2 March 2010

George J. Besseris

Abstract

Purpose

The purpose of this paper is to propose a methodology that may aid in assessing information technology (IT) quality characteristic optimisation through the use of simple and robust tools with minimal effort.

Design/methodology/approach

Non-linear saturated fractional factorial designs proposed by Taguchi are analysed robustly using the efficient nonparametric Jonckheere–Terpstra test.
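
The sketch below shows the nonparametric step named above: a hand-rolled Jonckheere–Terpstra test for an ordered alternative across the three levels of one factor from an L9(3^4) array, using a normal approximation without a tie correction. The response values are invented for illustration.

```python
# Minimal sketch: Jonckheere–Terpstra test for an ordered alternative (normal approximation).
import numpy as np
from scipy import stats

# Hypothetical e-mail quality responses grouped by the three levels of one factor
groups = [
    np.array([4.1, 4.4, 3.9]),      # level 1
    np.array([4.8, 5.1, 4.6]),      # level 2
    np.array([5.6, 5.9, 5.4]),      # level 3
]

# J-T statistic: over every ordered pair of groups, count pairs where the
# higher-level observation exceeds the lower-level one (ties count 1/2).
J = 0.0
for i in range(len(groups)):
    for j in range(i + 1, len(groups)):
        diff = groups[j][:, None] - groups[i][None, :]
        J += np.sum(diff > 0) + 0.5 * np.sum(diff == 0)

n = np.array([len(g) for g in groups])
N = n.sum()
mean_J = (N ** 2 - np.sum(n ** 2)) / 4
var_J = (N ** 2 * (2 * N + 3) - np.sum(n ** 2 * (2 * n + 3))) / 72
z = (J - mean_J) / np.sqrt(var_J)
p = 1 - stats.norm.cdf(z)           # one-sided: quality increases with factor level
print(f"J = {J:.1f}, z = {z:.2f}, one-sided p = {p:.4f}")
```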

Findings

The paper finds that e-mail quality improvement is achieved by collecting data through an unreplicated, saturated L9(3^4) design. Active influences are attributed to e-mail volume and receiving hardware type.

Research limitations/implications

The overall efficiency of the method is greatly enhanced by incorporating a nonparametric analysis tool known to perform effectively when data availability is minimal. The method is not undermined by normality assumptions or multi-distributional effects, which can easily handicap decision-making when other mainstream methods are used.

Practical implications

This work has clear professional and pedagogical value for IT quality practitioners, facilitating the implementation of robust techniques while containing quality costs. Notably, nonparametric data processing improves predictive ability over Taguchi's regular Design of Experiments (DOE) formulation under small-sample conditions.

Originality/value

This method combines the design efficiency of non-linear orthogonal arrays with multi-level order statistics, providing the tools to address quality optimisation in complex environments such as IT. Its value may be best appreciated by quality managers and engineers engaged in routine quality improvement projects in information systems, and it augments the general database of quality-related testing cases.

Details

The TQM Journal, vol. 22 no. 2
Type: Research Article
ISSN: 1754-2731
