Search results

1–10 of over 43,000
Article
Publication date: 29 May 2020

Kyuseok Lee

In a recent paper, Yoon and Lee (2019) (YL hereafter) propose a weighted Fama and MacBeth (FMB hereafter) two-step panel regression procedure and provide evidence that their…

Abstract

Purpose

In a recent paper, Yoon and Lee (2019) (YL hereafter) propose a weighted Fama and MacBeth (FMB hereafter) two-step panel regression procedure and provide evidence that their weighted FMB procedure produces more efficient coefficient estimators than the usual unweighted FMB procedure. The purpose of this study is to supplement and improve their weighted FMB procedure, as they provide neither asymptotic results (i.e. consistency and asymptotic distribution) nor evidence on how close their standard error estimator is to the true standard error.

Design/methodology/approach

First, asymptotic results for the weighted FMB coefficient estimator are provided. Second, a finite-sample-adjusted standard error estimator is provided. Finally, the performance of the adjusted standard error estimator compared to the true standard error is assessed.
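
To make the two-step structure concrete, below is a minimal sketch of a weighted FMB estimator in Python. The abstract does not specify Yoon and Lee's (2019) weighting scheme or the finite-sample adjustment, so the period weights and the standard error formula here are illustrative assumptions only.

```python
import numpy as np

def weighted_fama_macbeth(returns, chars, weights=None):
    """Two-step Fama-MacBeth estimator with optional period weights.

    returns : (T, N) array of asset returns
    chars   : (T, N, K) array of firm characteristics (regressors)
    weights : (T,) array of period weights; None reproduces the usual
              unweighted FMB estimator (equal weights)
    """
    T, N, K = chars.shape
    gammas = np.empty((T, K + 1))
    for t in range(T):
        X = np.column_stack([np.ones(N), chars[t]])   # cross-sectional regression at time t
        gammas[t], *_ = np.linalg.lstsq(X, returns[t], rcond=None)
    w = np.ones(T) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    gamma_hat = w @ gammas                            # weighted average of the period estimates
    # illustrative weighted analogue of the usual FMB standard error
    se = np.sqrt((w[:, None] ** 2 * (gammas - gamma_hat) ** 2).sum(axis=0))
    return gamma_hat, se
```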

Findings

It is found that the standard error estimator proposed by Yoon and Lee (2019) is asymptotically consistent, although the finite-sample-adjusted standard error estimator proposed in this study works better and helps to reduce bias. The findings of Yoon and Lee (2019) are confirmed even when the average R² over time is very small, at about 1% or 0.1%.

Originality/value

The findings of this study strongly suggest that the weighted FMB regression procedure, in particular the finite-sample-adjusted procedure proposed here, is a computationally simple but more powerful alternative to the usual unweighted FMB procedure. In addition, to the best of the author's knowledge, this is the first study that presents a formal proof of the asymptotic distribution for the FMB coefficient estimator.

Details

Studies in Economics and Finance, vol. 37 no. 2
Type: Research Article
ISSN: 1086-7376


Article
Publication date: 7 November 2016

Marcel Bolos, Ioana Bradea and Camelia Delcea

The purpose of this paper is to focus on the adjustment of the GM(1, 2) errors for financial data series that measures changes in the public sector financial indicators, taking…

Abstract

Purpose

The purpose of this paper is to focus on the adjustment of the GM(1, 2) errors for financial data series that measure changes in public sector financial indicators, taking into account that the errors in grey models remain a key problem in reconstructing the original data series.

Design/methodology/approach

Adjusting the errors in grey models must follow rules that most often cannot be determined from the chaotic trends these errors register when reconstructing data series. To adjust these errors and improve the robustness of GM(1, 2), an adaptive fuzzy controller was constructed, based on two input variables and one output variable. The input variables of the adaptive fuzzy controller are the absolute error ε_i^(0)(k) [%] of GM(1, 2) and the distance between two values x_i^(0)(k) [%], while the output variable is the error adjustment Aε_i^(0)(k) [%], determined with the help of the two input variables.
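
For illustration only, the following sketch evaluates a two-input, one-output fuzzy controller of the kind described; the triangular membership functions, rule base and crisp adjustment levels are assumptions, not the ones calibrated in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b (a < b < c)."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def adjust_error(abs_error_pct, distance_pct):
    """Mamdani-style evaluation of a toy rule base: larger absolute errors
    receive larger adjustments, modulated by the distance between values."""
    small_e = tri(abs_error_pct, -5.0, 0.0, 5.0)    # 'small error' peaks at 0%
    large_e = tri(abs_error_pct, 2.0, 10.0, 18.0)   # 'large error' peaks at 10%
    close_d = tri(distance_pct, -5.0, 0.0, 5.0)
    far_d   = tri(distance_pct, 2.0, 10.0, 18.0)
    # rule firing strengths (min as AND), each mapped to a crisp adjustment level [%]
    rules = [(min(small_e, close_d), 1.0),
             (min(small_e, far_d),   2.0),
             (min(large_e, close_d), 6.0),
             (min(large_e, far_d),   9.0)]
    num = sum(strength * level for strength, level in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den > 0 else 0.0            # weighted-average defuzzification

print(adjust_error(abs_error_pct=8.0, distance_pct=1.0))   # toy inputs
```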

Findings

The adaptive fuzzy controller has the advantage of setting the error adjustments according to the intensity (size) of the errors, which makes it possible to determine the adjustment value for each element of the reconstructed financial data series.

Originality/value

To ensure a robust process of planning financial resources, the available financial data are used over long periods of time in order to observe the trend of the financial indicators that need to be planned. In this context, the financial data series can be reconstituted using grey models based on sequences of financial data that best describe the status of the analyzed indicators and of the relevant factors of influence. The present study therefore proposes the construction of an adaptive fuzzy controller whose output variable ensures the adjustment of the errors in the data series reconstituted with GM(1, 2).

Details

Grey Systems: Theory and Application, vol. 6 no. 3
Type: Research Article
ISSN: 2043-9377


Book part
Publication date: 13 May 2017

Yang Tang, Thomas D. Cook, Yasemin Kisbu-Sakarya, Heinrich Hock and Hanley Chiang

Relative to the randomized controlled trial (RCT), the basic regression discontinuity (RD) design suffers from lower statistical power and lesser ability to generalize causal…

Abstract

Relative to the randomized controlled trial (RCT), the basic regression discontinuity (RD) design suffers from lower statistical power and lesser ability to generalize causal estimates away from the treatment eligibility cutoff. This chapter seeks to mitigate these limitations by adding an untreated outcome comparison function that is measured along all or most of the assignment variable. When added to the usual treated and untreated outcomes observed in the basic RD, a comparative RD (CRD) design results. One version of CRD adds a pretest measure of the study outcome (CRD-Pre); another adds posttest outcomes from a nonequivalent comparison group (CRD-CG). We describe how these designs can be used to identify unbiased causal effects away from the cutoff under the assumption that a common, stable functional form describes how untreated outcomes vary with the assignment variable, both in the basic RD and in the added outcomes data (pretests or a comparison group’s posttest). We then create the two CRD designs using data from the National Head Start Impact Study, a large-scale RCT. For both designs, we find that all untreated outcome functions are parallel, which lends support to CRD’s identifying assumptions. Our results also indicate that CRD-Pre and CRD-CG both yield impact estimates at the cutoff that have a similarly small bias as, but are more precise than, the basic RD’s impact estimates. In addition, both CRD designs produce estimates of impacts away from the cutoff that have relatively little bias compared to estimates of the same parameter from the RCT design. This common finding appears to be driven by two different mechanisms. In this instance of CRD-CG, potential untreated outcomes were likely independent of the assignment variable from the start. This was not the case with CRD-Pre. However, fitting a model using the observed pretests and untreated posttests to account for the initial dependence generated an accurate prediction of the missing counterfactual. The result was an unbiased causal estimate away from the cutoff, conditional on this successful prediction of the untreated outcomes of the treated.
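
As a rough illustration of the CRD-CG logic described above, the sketch below fits an untreated outcome function on a comparison group's posttests and uses it to predict the treated units' missing counterfactuals; the quadratic functional form and all names are assumptions for illustration, not the study's actual specification.

```python
import numpy as np
import statsmodels.api as sm

def crd_cg_impact(assign, y_post, treated, assign_comp, y_comp):
    """CRD-CG style estimate: fit the untreated outcome function on the
    comparison group's posttests (here a quadratic in the assignment
    variable, purely as an illustrative functional form), predict the
    treated units' untreated counterfactuals, and average the difference.

    assign, y_post : assignment variable and observed outcomes for study units
    treated        : boolean mask, True on the treated side of the cutoff
    assign_comp, y_comp : assignment variable and outcomes for the comparison group
    """
    Xc = sm.add_constant(np.column_stack([assign_comp, assign_comp ** 2]))
    untreated_fn = sm.OLS(y_comp, Xc).fit()
    Xt = sm.add_constant(np.column_stack([assign[treated], assign[treated] ** 2]))
    counterfactual = untreated_fn.predict(Xt)
    return float(np.mean(y_post[treated] - counterfactual))   # impact away from the cutoff
```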

Article
Publication date: 4 December 2017

Chathebert Mudhunguyo

The purpose of this paper is to evaluate accuracy of macro fiscal forecasts done by Government of Zimbabwe and the spillover effects of forecasting errors over the period…


Abstract

Purpose

The purpose of this paper is to evaluate the accuracy of macro fiscal forecasts produced by the Government of Zimbabwe and the spillover effects of forecasting errors over the period 2010-2015.

Design/methodology/approach

In line with the study objectives, the study employed the root mean square error methodology to measure the accuracy of macro fiscal forecasts, borrowing from the work of Calitz et al. (2013). The spillover effects were assessed by running simple regressions in the EViews programme. The data used in the analysis are based on annual national budget forecasts presented to Parliament by the Minister of Finance. Actual data come from the Ministry of Finance budget outturns and the national accounts published by the Zimbabwe Statistical Agency.
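
The two calculations described can be sketched as follows, assuming annual series of forecasts and outturns as inputs; the RMSE formula is standard, and the spillover regression is written here in Python with statsmodels rather than EViews.

```python
import numpy as np
import statsmodels.api as sm

def rmse(forecast, actual):
    """Root mean square error of a forecast series against the outturns."""
    f, a = np.asarray(forecast, float), np.asarray(actual, float)
    return float(np.sqrt(np.mean((f - a) ** 2)))

def spillover(gdp_errors, revenue_errors):
    """Simple regression of revenue forecast errors on GDP forecast errors,
    the same kind of regression described in the design above."""
    X = sm.add_constant(np.asarray(gdp_errors, float))
    return sm.OLS(np.asarray(revenue_errors, float), X).fit()
```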

Findings

The results of the root mean square error analysis revealed relatively high levels of macro-fiscal forecasting errors, with revenue recording the highest. The forecasting errors display a tendency to under-predict the strength of economic recovery during booms and to over-predict it during periods of weakness. Although the study found significant evidence of GDP forecasting errors translating into revenue forecasting inaccuracies, the GDP forecasting errors fail to fully account for the revenue errors. Revenue errors were, however, found to be positive and significant in explaining the budget balance errors.

Originality/value

Other jurisdictions, particularly developed countries, undertake regular evaluations of their forecasts in order to improve their forecasting procedures, which translates into better public service delivery. Zimbabwe lags behind in this respect. Given the poor performance of public service delivery in Zimbabwe, this study contributes by dissecting the sources of the challenge through a comprehensive review of macro fiscal forecasts.

Details

African Journal of Economic and Management Studies, vol. 8 no. 4
Type: Research Article
ISSN: 2040-0705


Article
Publication date: 1 December 2005

Marco Aurélio Stumpf González, Lucio Soibelman and Carlos Torres Formoso

Available literature claims that location is a key attribute in the housing market. However, the impact of this attribute is difficult to measure and the traditional hedonic…


Abstract

Purpose

Available literature claims that location is a key attribute in the housing market. However, the impact of this attribute is difficult to measure, and the traditional hedonic approach using subjective assessments is problematic. This paper seeks to explore the trend surface analysis (TSA) technique, attempting to provide an alternative way of measuring location value.

Design/methodology/approach

TSA works in a similar way to other response surface methods, but it is implemented directly in regression models, using a set of combinations of the co‐ordinates of the properties in several power degrees. It can also be implemented in artificial neural networks (ANNs), taking advantage of their ability to handle non‐linear domains. This work presents a comparison between the traditional regression approach, error modelling, response surfaces and TSA. ANNs are also used to estimate some of the models, and their results are compared. The objective is to verify the behaviour of TSA in hedonic models. A case study was carried out using over 30,000 sales tax records of apartments sold in Porto Alegre, a southern Brazilian city.
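
To show what combinations of the co-ordinates "in several power degrees" mean in practice, here is a minimal TSA sketch; the polynomial degree, the log-price specification and the function names are illustrative assumptions, not the paper's exact model.

```python
import numpy as np
import statsmodels.api as sm

def trend_surface_terms(x, y, degree=2):
    """Polynomial combinations of the property co-ordinates (x, y) up to
    `degree`, as used in trend surface analysis (x, y, x*y, x**2, y**2, ...)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    terms = []
    for p in range(degree + 1):
        for q in range(degree + 1 - p):
            if p + q > 0:                      # skip the constant term; added below
                terms.append((x ** p) * (y ** q))
    return np.column_stack(terms)

def hedonic_tsa(price, attributes, x, y, degree=2):
    """Hedonic regression augmented with a trend surface for location value.
    price, x and y are 1-D arrays; attributes is an (n, k) array of dwelling traits.
    The log-price form is a common hedonic choice, assumed here for illustration."""
    X = sm.add_constant(np.column_stack([np.asarray(attributes, float),
                                         trend_surface_terms(x, y, degree)]))
    return sm.OLS(np.log(np.asarray(price, float)), X).fit()
```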

Findings

The results indicate that TSA is an effective tool for the spatial analysis of real estate: TSA models perform similarly to those of the other approaches but are developed with less expert work.

Originality/value

This paper presents an application of TSA to the real estate market, offering an interesting alternative to traditional measures of location attributes.

Details

Property Management, vol. 23 no. 5
Type: Research Article
ISSN: 0263-7472


Book part
Publication date: 11 August 2014

Ben Amoako-Adu, Vishaal Baulkaran and Brian F. Smith

The chapter investigates three channels through which private benefits are hypothesized to be extracted in dual class companies: excess executive compensation, excess capital…

Abstract

Purpose

The chapter investigates three channels through which private benefits are hypothesized to be extracted in dual class companies: excess executive compensation, excess capital expenditures and excess cash holdings.

Design/methodology/approach

With a propensity score matched sample of S&P 1500 dual class and single class companies with concentrated control, the chapter analyzes the relationship between the valuation discount of dual class companies and measures of excess executive compensation, excess capital expenditure and excess cash holdings.
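
As a rough sketch of the matching step, the code below performs a 1:1 nearest-neighbour propensity score match of dual class firms to single class firms; the covariates and estimation details used in the chapter are not given in the abstract, so everything here is an assumption for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ps_match(dual_flag, covariates):
    """1:1 nearest-neighbour propensity score match (with replacement) of
    dual class firms (dual_flag == 1) to single class firms (dual_flag == 0).

    dual_flag  : (n,) array of 0/1 indicators for dual class status
    covariates : (n, k) array of matching covariates (size, industry dummies, ...)
    """
    dual_flag = np.asarray(dual_flag)
    ps = LogisticRegression(max_iter=1000).fit(covariates, dual_flag).predict_proba(covariates)[:, 1]
    treated = np.flatnonzero(dual_flag == 1)
    control = np.flatnonzero(dual_flag == 0)
    nearest = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]
    return treated, nearest   # indices of dual class firms and their matched single class firms
```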

Findings

Executives in dual class firms earn greater compensation relative to their counterparts in single class firms, and this excess compensation is most pronounced when the executive is a family member. The value of dual class shares is discounted most when the cash holdings and executive compensation of dual class firms are excessive. The dual class discount is not related to excess capital expenditures.

Originality/value

The research shows that the discount in the value of dual class shares in relation to the value of closely controlled single class company shares is directly related to the channels through which controlling shareholder-managers can extract private benefits.

Details

Advances in Financial Economics
Type: Book
ISBN: 978-1-78350-120-5


Article
Publication date: 26 August 2014

Imre Karafiath

In the finance literature, fitting a cross-sectional regression with (estimated) abnormal returns as the dependent variable and firm-specific variables (e.g. financial ratios) as…


Abstract

Purpose

In the finance literature, fitting a cross-sectional regression with (estimated) abnormal returns as the dependent variable and firm-specific variables (e.g. financial ratios) as independent variables has become de rigueur for a publishable event study. In the absence of skewness and/or kurtosis in the explanatory variable, the regression design does not exhibit leverage – an issue that has been addressed in the econometrics literature on the finite sample properties of heteroskedastic-consistent (HC) standard errors, but not in the finance literature on event studies. The paper aims to discuss this issue.

Design/methodology/approach

In this paper, simulations are designed to evaluate the potential bias in the standard error of the regression coefficient when the regression design includes “points of high leverage” (Chesher and Jewitt, 1987) and heteroskedasticity. The empirical distributions of test statistics are tabulated from ordinary least squares, weighted least squares, and HC standard errors.
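
A minimal sketch of this kind of simulation is given below, assuming a single regressor with one artificially extreme value (a point of high leverage) and error variance that grows with the regressor; the HC3 estimator and the Rademacher wild bootstrap are common choices, not necessarily the exact variants examined in the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# illustrative design: one artificially extreme regressor value (a point of
# high leverage) and error variance that grows with the regressor
n = 200
x = rng.normal(size=n)
x[0] = 8.0
y = 1.0 + 0.5 * x + rng.normal(scale=1.0 + np.abs(x))

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                       # conventional OLS standard errors
hc = sm.OLS(y, X).fit(cov_type="HC3")          # heteroskedastic-consistent standard errors

# wild bootstrap with Rademacher weights: resample the studentized slope to
# form critical values in place of the asymptotic normal approximation
t_boot = []
for _ in range(999):
    y_star = ols.fittedvalues + ols.resid * rng.choice([-1.0, 1.0], size=n)
    fit = sm.OLS(y_star, X).fit(cov_type="HC3")
    t_boot.append((fit.params[1] - ols.params[1]) / fit.bse[1])

crit = np.quantile(np.abs(t_boot), 0.95)       # bootstrap 5% critical value for |t|
print(hc.tvalues[1], crit)
```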

Findings

None of the test statistics examined in these simulations are uniformly robust with regard to conditional heteroskedasticity when the regression includes “points of high leverage.” In some cases the bias can be quite large: an empirical rejection rate as high as 25 percent for a 5 percent nominal significance level. Further, the bias in OLS HC standard errors may be attenuated but not fully corrected with a “wild bootstrap.”

Research limitations/implications

If the researcher suspects an event-induced increase in return variances, tests for conditional heteroskedasticity should be conducted and the regressor matrix should be evaluated for observations that exhibit a high degree of leverage.

Originality/value

This paper is a modest step toward filling a gap on the finite sample properties of HC standard errors in the event methodology literature.

Details

International Journal of Managerial Finance, vol. 10 no. 4
Type: Research Article
ISSN: 1743-9132


Article
Publication date: 17 March 2023

Stewart Jones

This study updates the literature review of Jones (1987) published in this journal. The study pays particular attention to two important themes that have shaped the field over the…

Abstract

Purpose

This study updates the literature review of Jones (1987) published in this journal. The study pays particular attention to two important themes that have shaped the field over the past 35 years: (1) the development of a range of innovative new statistical learning methods, particularly advanced machine learning methods such as stochastic gradient boosting, adaptive boosting, random forests and deep learning, and (2) the emergence of a wide variety of bankruptcy predictor variables extending beyond traditional financial ratios, including market-based variables, earnings management proxies, auditor going concern opinions (GCOs) and corporate governance attributes. Several directions for future research are discussed.

Design/methodology/approach

This study provides a systematic review of the corporate failure literature over the past 35 years, with a particular focus on the emergence of new statistical learning methodologies and predictor variables. This synthesis of the literature evaluates the strengths and limitations of different modelling approaches under different circumstances and provides an overall evaluation of the relative contribution of alternative predictor variables. The study aims to provide a transparent, reproducible and interpretable review of the literature. The literature review also takes a theme-centric rather than author-centric approach and focuses on structured themes that have dominated the literature since 1987.

Findings

There are several major findings of this study. First, advanced machine learning methods appear to have the most promise for future firm failure research. Not only do these methods predict significantly better than conventional models, but they also possess many appealing statistical properties. Second, there is now a much wider range of variables being used to model and predict firm failure. However, the literature needs to be interpreted with some caution given the many mixed findings. Finally, a number of unresolved methodological issues arising from the Jones (1987) study still require research attention.
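
For a concrete sense of such a comparison, here is a minimal sketch contrasting a conventional logit with stochastic gradient boosting on the same predictor set; the cross-validated AUC criterion and the scikit-learn defaults are illustrative assumptions, not the evaluation designs of the reviewed studies.

```python
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_failure_models(X, y):
    """Compare a conventional logit with stochastic gradient boosting on the
    same predictor set (financial ratios, market variables, GCOs, etc.),
    using cross-validated AUC.  X is an (n, k) feature matrix, y a 0/1 failure flag."""
    logit = LogisticRegression(max_iter=1000)
    gbm = GradientBoostingClassifier(subsample=0.5)   # subsample < 1 makes the boosting stochastic
    return {
        "logit_auc": cross_val_score(logit, X, y, cv=5, scoring="roc_auc").mean(),
        "gbm_auc": cross_val_score(gbm, X, y, cv=5, scoring="roc_auc").mean(),
    }
```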

Originality/value

The study explains the connections and derivations between a wide range of firm failure models, from simpler linear models to advanced machine learning methods such as gradient boosting, random forests, adaptive boosting and deep learning. The paper highlights the most promising models for future research, particularly in terms of their predictive power, underlying statistical properties and issues of practical implementation. The study also draws together an extensive literature on alternative predictor variables and provides insights into the role and behaviour of alternative predictor variables in firm failure research.

Details

Journal of Accounting Literature, vol. 45 no. 2
Type: Research Article
ISSN: 0737-4607


Book part
Publication date: 4 December 2012

William Coffie and Osita Chukwulobelu

Purpose – The purpose of this chapter is to examine whether or not the Capital Asset Pricing Model (CAPM) reasonably describes the return generating process on the Ghanaian Stock…

Abstract

Purpose – The purpose of this chapter is to examine whether or not the Capital Asset Pricing Model (CAPM) reasonably describes the return generating process on the Ghanaian Stock Exchange using monthly return data of 19 individual companies listed on the Exchange during the period January 2000 to December 2009.

Methodology/approach – We follow a methodology similar to Jensen's (1968) time-series approach. Parameters are estimated using OLS. The study is designed to measure beta risk across different periods by following the time-series approach, and the betas of the individual securities are estimated using time-series data in the excess-return version of the CAPM.
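
A minimal sketch of the Jensen-style time-series regression for a single security is shown below, assuming monthly series for the security return, the market return and the risk-free rate; variable names are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def jensen_regression(r_i, r_m, r_f):
    """Time-series excess-return regression for one security:
        r_i - r_f = alpha + beta * (r_m - r_f) + e.
    Under the Sharpe-Lintner CAPM the intercept alpha should be zero."""
    excess_i = np.asarray(r_i, float) - np.asarray(r_f, float)
    excess_m = np.asarray(r_m, float) - np.asarray(r_f, float)
    fit = sm.OLS(excess_i, sm.add_constant(excess_m)).fit()
    alpha, beta = fit.params
    return alpha, beta, fit.tvalues[0]   # t-statistic on alpha tests the CAPM restriction
```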

Findings – Our test results show that although market beta contributes to the variation in equity returns in Ghana, its contribution is not as significant as predicted by the CAPM and is in some cases very weak. Our results also reject the strictest form of the Sharpe–Lintner CAPM, but we find a positive linear relationship between the equity risk premium and market beta. Instead, our evidence upholds the Jensen (1968) and Jensen, Black, and Scholes (1972) versions of the CAPM.

Research limitations/implications – This study is limited to the single-factor CAPM. Future studies will extend the test to include both size and BE/ME fundamentals and factors relating to P/E ratio, momentum and liquidity.

Practical implications – Our results should make corporate managers cautious when using the CAPM as a basis for determining the cost of equity for investment appraisal purposes, and fund managers cautious when evaluating asset and portfolio performance.

Originality/value – The CAPM is applied to individual securities instead of portfolios, since the model was developed using information on a single security.

Details

Finance and Development in Africa
Type: Book
ISBN: 978-1-78190-225-7


Open Access
Article
Publication date: 23 February 2024

Bonha Koo and Ryumi Kim

Using the next-day and next-week returns of stocks in the Korean market, we examine the association of option volume ratios – i.e. the option-to-stock (O/S) ratio, which is the…

Abstract

Using the next-day and next-week returns of stocks in the Korean market, we examine the association of option volume ratios – i.e. the option-to-stock (O/S) ratio, which is the total volume of put options and call options scaled by total underlying equity volume, and the put-call (P/C) ratio, which is the put volume scaled by total put and call volume – with future returns. We find that O/S ratios are positively related to future returns, but P/C ratios have no significant association with returns. We calculate individual, institutional, and foreign investors’ option ratios to determine which ratios are significantly related to future returns and find that, for all investors, higher O/S ratios predict higher future returns. The predictability of P/C depends on the investors: institutional and individual investors’ P/C ratios are not related to returns, but foreign P/C predicts negative next-day returns. For net-buying O/S ratios, institutional net-buying put-to-stock ratios consistently predict negative future returns. Institutions’ buying and selling put ratios also predict returns. In short, institutional put-to-share ratios predict future returns when we use various option ratios, but individual option ratios do not.
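
For reference, the two ratios defined above can be computed as in the sketch below; the correlation check at the end is an illustration only, not the paper's estimation approach.

```python
import numpy as np
import pandas as pd

def option_ratios(call_vol, put_vol, stock_vol):
    """O/S: total option volume scaled by underlying equity volume.
    P/C: put volume scaled by total put and call volume."""
    call_vol, put_vol, stock_vol = (np.asarray(v, float) for v in (call_vol, put_vol, stock_vol))
    os_ratio = (call_vol + put_vol) / stock_vol
    pc_ratio = put_vol / (call_vol + put_vol)
    return os_ratio, pc_ratio

def predictive_check(os_ratio, pc_ratio, next_day_return):
    """Correlation of today's ratios with next-day returns (illustration only)."""
    return pd.DataFrame({"os": os_ratio, "pc": pc_ratio,
                         "next_day_ret": next_day_return}).corr()
```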

Details

Journal of Derivatives and Quantitative Studies: 선물연구, vol. 32 no. 1
Type: Research Article
ISSN: 1229-988X

