Search results

1 – 10 of over 49000
Book part
Publication date: 30 May 2018

Jeffrey S. Hoch and Pierre Chaussé

This chapter considers the analysis of a cost-effectiveness dataset from an econometrics perspective. We link cost-effectiveness analysis to the net benefit regression framework…

Abstract

This chapter considers the analysis of a cost-effectiveness dataset from an econometrics perspective. We link cost-effectiveness analysis to the net benefit regression framework and explore insights and opportunities from econometrics and their practical implications. As an empirical illustration, we compare various econometric techniques using a cost-effectiveness dataset from a published study. The chapter concludes with a discussion about implications for applied practitioners and future research directions.
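In the net benefit regression framework the chapter builds on, each person's net benefit is nb_i = λ·effect_i − cost_i for a chosen willingness-to-pay λ, and regressing nb on a treatment indicator yields the incremental net benefit as the treatment coefficient. A minimal sketch on synthetic data (the numbers below are invented, not from the chapter's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
treat = np.repeat([0.0, 1.0], n // 2)                   # treatment indicator
cost = 1000 + 500 * treat + rng.normal(0, 100, n)       # treated arm costs more...
effect = 0.70 + 0.05 * treat + rng.normal(0, 0.02, n)   # ...but adds effectiveness

wtp = 50_000                 # willingness-to-pay per unit of effect (lambda)
nb = wtp * effect - cost     # individual-level net benefit

# OLS of net benefit on an intercept and the treatment dummy:
X = np.column_stack([np.ones(n), treat])
beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
incremental_net_benefit = beta[1]  # > 0 favours the new treatment at this lambda
```

The treatment coefficient here recovers the (known) data-generating incremental net benefit of roughly 50,000 × 0.05 − 500 = 2,000.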

Article
Publication date: 26 September 2011

Samih Azar

This paper seeks to reconsider the Euler equation of the Consumption Capital Asset Pricing Model (CCAPM), to derive a regression‐based model to test it, and to present evidence…

Abstract

Purpose

This paper seeks to reconsider the Euler equation of the Consumption Capital Asset Pricing Model (CCAPM), to derive a regression-based model to test it, and to present evidence that the model is consistent with reasonable values for the coefficient of relative risk aversion (CRRA). This runs contrary to the findings of the literature on the equity premium puzzle, but is in agreement with the literature that estimates the CRRA for the purpose of computing the social discount rate, and is in line with the research on labor supply. Tests based on the Generalized Method of Moments (GMM) for the same sample produce results that are extremely disparate and unstable. The paper checks, and finds support for, the robustness of the regression-based tests. Habit formation models are also evaluated with regression-based and GMM tests. However, the validity of the regression-based models depends critically on their functional forms.

Design/methodology/approach

The paper presents empirical evidence that the conventional use of GMM fails because of four pathological features of GMM that are referred to under the general caption of “weak identification”. In addition to GMM, the paper employs linear regression analysis to test the CCAPM, and it is found that the regression residuals follow well‐behaved distributional properties, making valid all statistical inferences, while GMM estimates are highly unstable.

Findings

Four unexpected findings are reported. The first is that the regression-based models are consistent with reasonable values for the CRRA, i.e. estimates below 4. The second is that the regression-based tests are robust, while the GMM-based tests are not. The third is that regression-based tests with habit formation depend crucially on the specification of the model. The fourth is that there is evidence that market stock returns are sensitive to both consumption and dividends. The author calls the latter the “extra sensitivity of market stock returns” and describes it as a new puzzle.

Originality/value

The regression‐based models of the CCAPM Euler equation are novel. The comparison between GMM and regression‐based models for the same sample is original. The regression‐based models with habit formation are new. The equity premium puzzle disappears because the estimates of the CRRA are reasonable. But another puzzle is documented, which is the “extra sensitivity of market stock returns” to consumption and dividends together.
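One standard regression-based route to the Euler equation E[β(C_{t+1}/C_t)^(−γ)(1+R_{t+1})] = 1, not necessarily the exact specification derived in this paper, log-linearizes it under lognormality so that expected log returns are approximately linear in log consumption growth with slope equal to the CRRA γ. A sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
gamma_true = 3.0                      # CRRA used to generate the data
dc = rng.normal(0.02, 0.01, T)        # log consumption growth
r = 0.01 + gamma_true * dc + rng.normal(0, 0.05, T)  # log asset return

# OLS of returns on consumption growth; the slope estimates the CRRA.
X = np.column_stack([np.ones(T), dc])
beta, *_ = np.linalg.lstsq(X, r, rcond=None)
crra_hat = beta[1]
```

On this simulated sample the regression recovers an estimate near the true γ of 3, i.e. in the "reasonable, below 4" region the abstract refers to.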

Article
Publication date: 1 March 2006

Philip Gharghori, Howard Chan and Robert Faff

Daniel and Titman (1997) contend that the Fama‐French three‐factor model’s ability to explain cross‐sectional variation in expected returns is a result of characteristics that…

Abstract

Daniel and Titman (1997) contend that the Fama‐French three‐factor model’s ability to explain cross‐sectional variation in expected returns is a result of characteristics that firms have in common rather than any risk‐based explanation. The primary aim of the current paper is to provide out‐of‐sample tests of the characteristics versus risk factor argument. The main focus of our tests is to examine the intercept terms in Fama‐French regressions, wherein test portfolios are formed by a three‐way sorting procedure on book‐to‐market, size and factor loadings. Our main test focuses on ‘characteristic‐balanced’ portfolio returns of high minus low factor loading portfolios, for different size and book‐to‐market groups. The Fama‐French model predicts that these regression intercepts should be zero while the characteristics model predicts that they should be negative. Generally, despite the short sample period employed, our findings support a risk‐factor interpretation as opposed to a characteristics interpretation. This is particularly so for the HML loading‐based test portfolios. More specifically, we find that: the majority of test portfolios tend to reveal higher returns for higher loadings (while controlling for book‐to‐market and size characteristics); the majority of the Fama‐French regression intercepts are statistically insignificant; for the characteristic‐balanced portfolios, very few of the Fama‐French regression intercepts are significant.
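The intercept tests at the heart of these findings are time-series regressions of portfolio excess returns on the three factors; under the Fama-French model the intercept (alpha) should be indistinguishable from zero. A minimal sketch on simulated factor data (not the paper's sample):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000
mkt = rng.normal(0.005, 0.04, T)   # market excess return
smb = rng.normal(0.002, 0.02, T)   # size factor
hml = rng.normal(0.003, 0.02, T)   # value factor

# A test portfolio generated WITH zero alpha (the risk-factor story):
excess = 1.1 * mkt + 0.4 * smb + 0.6 * hml + rng.normal(0, 0.01, T)

# Fama-French time-series regression: intercept should be near zero.
X = np.column_stack([np.ones(T), mkt, smb, hml])
beta, *_ = np.linalg.lstsq(X, excess, rcond=None)
alpha = beta[0]
```

Because the simulated portfolio loads on the factors with no abnormal return, the estimated alpha is statistically indistinguishable from zero, which is the pattern the paper reports for most of its test portfolios.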

Details

Pacific Accounting Review, vol. 18 no. 1
Type: Research Article
ISSN: 0114-0582


Article
Publication date: 1 October 2000

D. Dutta Majumder and Prasun Kumar Roy

Aims to investigate the causative factors and clinical applicability of spontaneous regression of malignant tumours without treatment, a paradoxical phenomenon with many…

Abstract

Aims to investigate the causative factors and clinical applicability of spontaneous regression of malignant tumours without treatment, a paradoxical phenomenon with considerable therapeutic potential. Analyses past cases and finds that the commonest cause is a preceding episode of high fever, whose thermal fluctuation produces fluctuations in biochemical and immunological parameters. Using Prigogine-Glansdorff-Langevin stability theory and biocybernetic principles, develops the theoretical foundation of a tumour's self-control, homeostasis and regression induced by thermal, radiation or oxygenation fluctuations. Derives a threshold condition on perturbations for producing regression. Presents striking confirmations of such fluctuation-induced regression in Ewing tumour, clear cell cancer and Lewis lung carcinoma. Using experimental data on patients, elucidates a novel therapeutic approach of multi-modal hyper-fluctuation utilizing radiotherapeutic hyper-fractionation, temperature and immune status.

Details

Kybernetes, vol. 29 no. 7/8
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 20 October 2011

Renkuan Guo, Danni Guo and YanHong Cui

The purpose of this paper is to propose an uncertain regression model with an intrinsic error structure facilitated by an uncertain canonical process.

Abstract

Purpose

The purpose of this paper is to propose an uncertain regression model with an intrinsic error structure facilitated by an uncertain canonical process.

Design/methodology/approach

This model is suited to expert knowledge expressed as small to medium-sized imprecise datasets. To give the new regression model a rigorous mathematical treatment, this paper sequentially establishes a series of new uncertainty concepts, such as the joint multivariate uncertainty distribution, the distribution of products of uncertain variables, and uncertain covariance and correlation, on the axiomatic foundation of uncertainty theory. Two examples are given to illustrate small-data regression analysis.

Findings

The uncertain regression model is formulated and the estimation of the model coefficients is developed.

Practical implications

The paper is devoted to a regression model to handle a small amount of data with mathematical rigor.

Originality/value

The theory and the methodology of the uncertain canonical process regression is proposed for the first time. It addresses the practical challenges of small data size modelling.

Article
Publication date: 7 March 2016

Georges Hübner

The Treynor and Mazuy framework is a widely used return-based model of market timing. However, existing corrections to the regression intercept can be manipulated through…

Abstract

Purpose

The Treynor and Mazuy framework is a widely used return-based model of market timing. However, existing corrections to the regression intercept can be manipulated through derivatives trading. Because they are conceptually flawed, these corrections produce biased performance measures. This paper aims to get back to Henriksson and Merton’s initial idea of option replication to overcome this issue and adapt the market timing model to various kinds of trading strategies and return-generating processes.

Design/methodology/approach

This paper proposes a theoretical adjustment based on Merton’s option replication approach adapted to the Treynor and Mazuy specification. The linear and quadratic coefficients of the regression are exploited to assess the cost of the replicating option that yields similar convexity for a passive portfolio. A similar reasoning applies for various timing patterns and in multi-factor models.

Findings

The proposed framework induces a potential rebalancing risk and involves the delicate issue of choosing the cheapest option. This paper shows that these issues can be overcome for reasonable tolerance levels, making option replication workable for practical applications.

Originality/value

The adaptation of Merton’s reasoning to the Treynor and Mazuy model has surprisingly never been proposed so far. This paper has the potential to correct for a pervasive bias in the estimation of the performance of a market timer in the context of this very popular quadratic regression setup. Because of the power of the option replication approach, the reasoning is shown to be applicable to multi-factor models, negative timing and market neutral strategies. This paper could fuel empirical studies that would shed new light on the genuine market timing skills of active portfolio managers.
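The underlying Treynor and Mazuy specification regresses fund excess returns on the market excess return and its square, with a positive quadratic coefficient signalling the convexity associated with market timing. The sketch below shows only this base regression on simulated data, not the paper's option-replication correction:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 2000
rm = rng.normal(0.004, 0.05, T)   # market excess return
# A fund with genuine timing skill: payoff is convex in the market (gamma > 0).
rp = 0.001 + 0.9 * rm + 2.0 * rm**2 + rng.normal(0, 0.01, T)

# Treynor-Mazuy quadratic regression.
X = np.column_stack([np.ones(T), rm, rm**2])
alpha, beta, gamma = np.linalg.lstsq(X, rp, rcond=None)[0]
# gamma > 0: convexity, the signature of market timing
```

The paper's point is precisely that the intercept alpha in this setup is a biased performance measure, which motivates pricing the replicating option implied by beta and gamma instead.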

Details

Studies in Economics and Finance, vol. 33 no. 1
Type: Research Article
ISSN: 1086-7376


Article
Publication date: 17 August 2021

Md Vaseem Chavhan, M. Ramesh Naidu and Hayavadana Jamakhandi

This paper aims to propose the artificial neural network (ANN) and regression models for the estimation of the thread consumption at multilayered seam assembly stitched with lock…

Abstract

Purpose

This paper aims to propose the artificial neural network (ANN) and regression models for the estimation of the thread consumption at multilayered seam assembly stitched with lock stitch 301.

Design/methodology/approach

In the present study, generalized regression and neural network models are developed by considering the fabric types, woven, nonwoven and multilayer combinations thereof, together with basic sewing parameters: sewing thread linear density, stitch density, needle count and fabric assembly thickness. A feed-forward backpropagation network is used to build the ANN, and the trainlm training function of MATLAB is used to adjust the weights and bias values according to Levenberg-Marquardt optimization. Network performance is measured in terms of mean squared error, and the layer output is set according to the sigmoid transfer function.

Findings

The proposed ANN and regression models are able to predict thread consumption more accurately for multilayered seam assemblies. The predictability of thread consumption from available geometrical models, regression models and industrial empirical techniques is compared with the proposed linear regression, quadratic regression and neural network models. The proposed quadratic regression model showed a good correlation with practical thread consumption values and greater prediction accuracy, with an overall 4.3% error, compared to the other techniques for the given multilayer substrates. Further, the developed ANN also showed good accuracy in predicting thread consumption.

Originality/value

The estimation of the thread consumed while stitching is a prerequisite for inventory management in the garment industry, especially with the introduction of costly high-performance sewing threads. In practice, different types of fabrics are stitched in multilayer combinations at different locations of the stitched product. The ANN and regression models are developed for multilayered seam assemblies of woven and nonwoven fabric compositions for better prediction of thread consumption.
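As a toy illustration of the regression side of such models (the actual models use thread linear density, stitch density, needle count and assembly thickness; the data below are invented), a quadratic fit of thread consumption against stitch density can be obtained by least squares:

```python
import numpy as np

# Hypothetical data: stitch density vs thread consumed per metre of seam
density = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # stitches/cm
thread = np.array([3.1, 4.0, 5.2, 6.6, 8.3, 10.2])   # metres of thread

# Quadratic regression: thread ≈ c2*density**2 + c1*density + c0
c2, c1, c0 = np.polyfit(density, thread, 2)
pred = np.polyval([c2, c1, c0], density)
rmse = np.sqrt(np.mean((pred - thread) ** 2))
```

A positive curvature coefficient (c2 > 0) captures the accelerating consumption this invented data exhibits; the same fitting machinery extends to several predictors at once.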

Details

Research Journal of Textile and Apparel, vol. 26 no. 4
Type: Research Article
ISSN: 1560-6074


Article
Publication date: 16 February 2015

Chang Li, Wei Zheng, Philip Chang and Shanmin Li

As the literature argues that managers’ personalities affect both corporate governance structures and corporate performance, the correlation between them is a mixed result. The…

Abstract

Purpose

The literature argues that managers’ personalities affect both corporate governance structures and corporate performance, yet the estimated correlation between governance and performance is mixed. The purpose of this paper is to separate the different routes leading to this mixed correlation, naming the separated routes the regime effect and the signal effect.

Design/methodology/approach

Through theoretical analysis, the authors list three routes leading to a correlation between corporate governance and corporate performance. Routes 1 and 2 show that governance can directly and indirectly change performance, while route 3 shows that both governance and performance are results of managers’ personalities and that governance has no influence on performance, meaning the correlation produced by route 3 is spurious. By designing a new econometric methodology, this paper separates the mixed correlation between corporate governance and performance, naming the correlation produced by routes 1 and 2 the regime effect and the correlation produced by route 3 the signal effect.

Findings

In an empirical study of Chinese listed corporations, the authors find that the correlations between these corporations’ market value and the main corporate governance factors can be separated into regime effects and signal effects. Some factors (Share of Institutional Investors, Share of Real Controller and its Square, Dummy of Identical CEO and Chairman, Ownership Concentration) show only regime effects; some factors (Separating Extent of Ownership and Controlling Right, Dummy of Provincial State-Owned Firms) show only signal effects; and some factors (Dummy of Republic State-Owned Firms, Scale of Board) show both. Moreover, the authors find an interesting result: state ownership has no negative regime effect on the performance of Chinese SOEs but a very significantly negative signal effect. The authors suggest this means the key negative factor for Chinese SOEs is not the state-owned ownership structure but managerial corruption.

Practical implications

Only factors with regime effects can directly or indirectly affect corporate performance, while factors with signal effects indicate that managers’ personalities are affecting both governance and performance. The separation method in this paper can therefore help shareholders identify which governance factors will help improve performance and which merely signal managers’ diligence or corruption.

Originality/value

The paper separates the regime effect and the signal effect within the correlation between corporate governance and performance.

Details

China Finance Review International, vol. 5 no. 1
Type: Research Article
ISSN: 2044-1398


Article
Publication date: 22 February 2022

Kura Alemayehu Beyene

Modeling helps to determine how structural parameters of fabric affect the surface of a fabric and also identify the way they influence fabric properties. Moreover, it helps to…

Abstract

Purpose

Modeling helps to determine how the structural parameters of a fabric affect its surface and to identify the way they influence fabric properties. Moreover, it helps to estimate and evaluate fabric properties without complex and time-consuming experimental procedures. The purpose of this study is to develop and select the best regression model equations for predicting and evaluating the surface roughness of plain-woven fabrics.

Design/methodology/approach

In this study, linear and quadratic regression models were developed for predicting and evaluating the surface roughness of plain-woven fabrics, and the accuracy and reliability of the two model equations were assessed with the root mean square error (RMSE). The Design-Expert AE11 software was used to develop the two model equations and for analysis of variance (ANOVA). Count and density were used to develop both the linear model equation (“SMD1”) and the quadratic model equation (“SMD2”).

Findings

From the results, the effects of count and density and their interaction on the roughness of plain-woven fabric were found to be statistically significant for both the linear and quadratic models at a 95% confidence interval. Count has a positive correlation with surface roughness, while density has a negative correlation. The models were strongly correlated at a 95% confidence interval, with an adjusted R² of 0.8483 and an R² of 0.9079, respectively. The RMSE values of the quadratic and linear model equations were 0.1596 and 0.0747, respectively.

Originality/value

Thus, the quadratic model equation has better accuracy and reliability in predicting and evaluating surface roughness than the linear model. These models can be used to select a suitable fabric for various end applications and to test and predict the surface roughness of plain-woven fabrics. The regression models help to reduce the gap between subjective and objective surface roughness measurement methods.
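The linear-versus-quadratic comparison in the study can be mimicked on invented data: fit both models to the same observations and compare their RMSE values. Because the quadratic model nests the linear one, its in-sample RMSE can never be larger; the data below are hypothetical, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(4)
count = rng.uniform(20, 60, 40)      # yarn count (hypothetical units)
density = rng.uniform(40, 100, 40)   # fabric density (hypothetical units)
# Invented response: roughness rises with count, falls with density, with curvature.
smd = 5 + 0.08 * count - 0.03 * density + 0.002 * count**2 + rng.normal(0, 0.2, 40)

def fit_rmse(X, y):
    """OLS fit; return in-sample root mean square error."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sqrt(np.mean((X @ beta - y) ** 2))

ones = np.ones_like(count)
rmse_linear = fit_rmse(np.column_stack([ones, count, density]), smd)
rmse_quad = fit_rmse(
    np.column_stack([ones, count, density, count**2, density**2, count * density]), smd
)
```

When comparing models of different flexibility in earnest, an out-of-sample or cross-validated RMSE is the fairer yardstick, since the richer model always wins in-sample.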

Details

Research Journal of Textile and Apparel, vol. 27 no. 2
Type: Research Article
ISSN: 1560-6074


Article
Publication date: 1 February 2005

Trefor P. Williams

Ratios were constructed using bidding data for highway construction projects in Texas to study whether there are useful patterns in project bids that are indicators of the project…


Abstract

Purpose

Ratios were constructed using bidding data for highway construction projects in Texas to study whether there are useful patterns in project bids that are indicators of the project completion cost. The use of the ratios to improve predictions of completed project cost was studied.

Design/methodology/approach

Ratios were calculated relating the second lowest bid, mean bid, and maximum bid to the low bid for the highway construction projects. Regression and neural network models were developed to predict the completed cost of the highway projects using bidding data. Models including the bidding ratios, low bid, second lowest bid, mean bid and maximum bid were developed. Natural log transformations were applied to the data to improve model performance.

Findings

Analysis of the bidding ratios indicates some relationship between high values of the ratios and final project costs that deviate significantly from the low bid amount. Adding the ratios to the neural network and regression models used to predict completed project cost was not found to enhance the predictions. The best-performing regression model used only the low bid as input; the best-performing neural network model used the low bid and second-lowest bid.

Originality/value

The nature of bid ratios that can describe the pattern of bids submitted for a project and the relationship of the ratios to project outcomes were studied. The ratio values may be useful indicators of project outcome that can be used by construction managers.
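The bidding ratios described can be computed directly from a project's list of bids; the bid amounts below are invented for illustration:

```python
def bid_ratios(bids):
    """Ratios of the second-lowest, mean and maximum bid to the low bid."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ordered = sorted(bids)
    low = ordered[0]
    return {
        "second_to_low": ordered[1] / low,
        "mean_to_low": sum(bids) / len(bids) / low,
        "max_to_low": ordered[-1] / low,
    }

# Invented bid amounts (dollars) for one hypothetical highway project:
ratios = bid_ratios([1_200_000, 1_250_000, 1_400_000, 1_150_000])
```

Ratios well above 1 indicate a low bid far below the rest of the field, the kind of bid dispersion the study relates to final costs deviating from the low bid amount.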

Details

Engineering, Construction and Architectural Management, vol. 12 no. 1
Type: Research Article
ISSN: 0969-9988

