Search results
1 – 10 of 385
Abstract
Purpose
The linear regression technique is widely used to determine empirical parameters of the fatigue life profile, but its results may not depend continuously on the experimental data. The Tikhonov-Morozov method is therefore utilized here to regularize the linear regression results and consequently reduce the influence of measurement noise without notably distorting the fatigue life distribution. The paper aims to discuss these issues.
Design/methodology/approach
The Tikhonov-Morozov regularization method is shown to effectively reduce the influence of measurement noise without distorting the fatigue life distribution. Moreover, since iterative regularization methods are known to be an attractive alternative to Tikhonov regularization, four gradient iterative methods, namely the simple iteration, minimum error, steepest descent and conjugate gradient methods, are examined with an appropriate initial guess of the regularized coefficients.
Findings
It is shown that in the case of sparse fatigue life measurements, linear regression results may not depend continuously on the experimental data, and measurement error could lead to misinterpretation of the solution. Therefore, from an engineering safety point of view, utilizing a regularization method can successfully reduce the influence of measurement noise without significantly distorting the fatigue life distribution.
Originality/value
An excellent initial guess for the mixed iterative-direct algorithm is introduced, and it is shown that the combination of the Newton iterative approach and the Morozov discrepancy principle is an interesting strategy for determining the regularization parameter, with an excellent rate of convergence. Moreover, since iterative methods are known to be an attractive alternative to Tikhonov regularization, four gradient descent methods are examined here for regularization of the linear regression problem. It is found that, with an appropriate initial guess of the regularized coefficients, all four gradient descent methods converge excellently to the Tikhonov-Morozov regularization results.
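As a rough illustration of the approach described above, the following sketch applies Tikhonov regularization to a synthetic linear regression and selects the regularization parameter by the Morozov discrepancy principle, here via simple geometric bisection rather than the paper's Newton iteration; the design matrix and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))                 # synthetic design matrix
x_true = np.array([1.0, -2.0, 0.5])
delta = 0.5                                  # assumed known noise level
e = rng.normal(size=20)
b = A @ x_true + delta * e / np.linalg.norm(e)   # noise scaled to norm delta

def tikhonov(lam):
    """Minimize ||Ax - b||^2 + lam * ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

def residual(lam):
    return np.linalg.norm(A @ tikhonov(lam) - b)

# Morozov discrepancy principle: choose lam so that residual(lam) == delta.
# The residual is monotone increasing in lam, so bisection in log-lam works.
lo, hi = 1e-10, 1e6
for _ in range(200):
    mid = np.sqrt(lo * hi)
    lo, hi = (mid, hi) if residual(mid) < delta else (lo, mid)
lam_star = np.sqrt(lo * hi)
x_reg = tikhonov(lam_star)
```

The fitted residual norm matches the assumed noise level, so the estimate does not chase the noise below its known magnitude.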
Details
Keywords
Abstract
Purpose
The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment.
Design/methodology/approach
Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can solve a mathematics problem correctly based on how well they solved other problems in the past. The usefulness of the model was evaluated by comparing the predicted probability of correct problem solving to the actual problem solving performance on the data set that was not used in the model building process.
Findings
The regularized logistic regression model showed better predictive power than the standard Bayesian Knowledge Tracing model, the most frequently used quantitative model of student learning in educational data mining research.
Originality/value
Providing instructional scaffolding is critical to facilitating student learning. However, most computer-based learning environments use heuristics or rely on the discretion of students to determine whether instructional scaffolding needs to be provided. The predictive model of the problem solving performance of students can be used as a quantitative guideline to help make better decisions on when to provide instructional support and guidance in the computer-based learning environment, which can potentially maximize the learning outcomes of students.
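The kind of model the abstract describes might be sketched as follows; the features, synthetic data and penalty value are invented stand-ins, not the authors' data set or tuning:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
X = rng.normal(size=(n, 3))        # e.g. past accuracy, attempts, hint usage
w_true = np.array([2.0, -1.0, 0.5])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ w_true)))).astype(float)
X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

def fit_logreg(X, y, lam=0.05, lr=0.1, steps=2000):
    """L2-regularized logistic regression fit by plain gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * (X.T @ (p - y) / len(y) + lam * w)   # penalized gradient
    return w

w_hat = fit_logreg(X_tr, y_tr)
# evaluate on held-out data, as the abstract describes
p_te = 1.0 / (1.0 + np.exp(-(X_te @ w_hat)))
accuracy = float(np.mean((p_te > 0.5) == y_te))
```

The held-out evaluation mirrors the paper's design: the model is judged only on problems not used in fitting.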
Details
Keywords
Ajit Kumar and A.K. Ghosh
Abstract
Purpose
The purpose of this study is to estimate aerodynamic parameters using regularized regression-based methods.
Design/methodology/approach
The regularized regression methods used are LASSO, ridge and elastic net.
Findings
Regularized regression-based methods are found to be a viable option for aerodynamic parameter estimation.
Practical implications
Efficacy of the methods is examined on flight test data.
Originality/value
This study provides regularized regression-based methods for aerodynamic parameter estimation from the flight test data.
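A minimal coordinate-descent sketch of the elastic net (which reduces to the lasso when the l2 weight is zero and toward ridge when the l1 weight is zero), with synthetic stand-ins for the flight-test regressors; the data, penalties and sparsity pattern are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))          # stand-ins for flight-data regressors
w_true = np.array([3.0, 0.0, -1.5, 0.0, 0.0])   # sparse true coefficients
y = X @ w_true + 0.1 * rng.normal(size=200)

def soft_threshold(a, t):
    return np.sign(a) * np.maximum(np.abs(a) - t, 0.0)

def elastic_net(X, y, l1=0.05, l2=0.01, sweeps=200):
    """Cyclic coordinate descent for (1/2n)||y - Xw||^2 + l1||w||_1 + (l2/2)||w||^2."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(sweeps):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]      # partial residual without x_j
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, l1) / (col_sq[j] + l2)
    return w

w_en = elastic_net(X, y)
```

The l1 term zeroes out weak coefficients while the l2 term stabilizes the rest, which is the usual motivation for the elastic net over either penalty alone.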
Details
Keywords
Fengjun Tian, Yang Yang, Zhenxing Mao and Wenyue Tang
Abstract
Purpose
This paper aims to compare the forecasting performance of different models with and without big data predictors from search engines and social media.
Design/methodology/approach
Using daily tourist arrival data to Mount Longhu, China in 2018 and 2019, the authors estimated ARMA, ARMAX, Markov-switching auto-regression (MSAR), lasso, elastic net and post-lasso and post-elastic net models to conduct one- to seven-day-ahead forecasting. Search engine data and social media data from WeChat, Douyin and Weibo were incorporated to improve forecasting accuracy.
Findings
Results show that search engine data can substantially reduce forecasting error, whereas social media data has very limited value. Compared to the ARMAX/MSAR model without big data predictors, the corresponding post-lasso model reduced forecasting error by 39.29% based on mean square percentage error, 33.95% based on root mean square percentage error, 46.96% based on root mean squared error and 45.67% based on mean absolute scaled error.
Practical implications
Results highlight the importance of incorporating big data predictors into daily demand forecasting for tourism attractions.
Originality/value
This study represents a pioneering attempt to apply regularized regression (e.g. lasso and elastic net models) in tourism forecasting and to explore various daily big data indicators across platforms as predictors.
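The post-lasso idea used above, select predictors with the lasso, then refit by ordinary least squares on the selected support to remove the shrinkage bias, can be sketched on synthetic data; the predictors are hypothetical stand-ins for lagged search-index features, not the study's series:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p_dim = 300, 10
X = rng.normal(size=(n, p_dim))     # stand-ins for lagged big-data predictors
beta = np.zeros(p_dim)
beta[[0, 3]] = [2.0, -1.0]          # only two predictors truly matter
y = X @ beta + 0.5 * rng.normal(size=n)

def lasso_cd(X, y, l1=0.2, sweeps=200):
    """Lasso via cyclic coordinate descent with soft-thresholding."""
    w = np.zeros(X.shape[1])
    col = (X ** 2).sum(axis=0) / len(y)
    for _ in range(sweeps):
        for j in range(X.shape[1]):
            rho = X[:, j] @ (y - X @ w + X[:, j] * w[j]) / len(y)
            w[j] = np.sign(rho) * max(abs(rho) - l1, 0.0) / col[j]
    return w

support = np.flatnonzero(lasso_cd(X, y))    # step 1: variable selection
beta_pl = np.zeros(p_dim)                   # step 2: OLS refit on the support
beta_pl[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
```

The refit step is what distinguishes post-lasso from the plain lasso: selected coefficients are re-estimated without the l1 shrinkage.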
Details
Keywords
Abstract
Survival (default) data are frequently encountered in financial (especially credit risk), medical, educational, and other fields, where the “default” can be interpreted as the failure to fulfill debt payments of a specific company or the death of a patient in a medical study or the inability to pass some educational tests.
This paper introduces the basic ideas of Cox's original proportional hazards model and extends the model within a general framework of statistical data mining procedures. By employing regularization, basis expansion, boosting, bagging, Markov chain Monte Carlo (MCMC) and many other tools, we effectively calibrate a large and flexible class of proportional hazard models.
The proposed methods have important applications in the setting of credit risk. For example, the model for the default correlation through regularization can be used to price credit basket products, and the frailty factor models can explain the contagion effects in the defaults of multiple firms in the credit market.
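A bare-bones sketch of a ridge-penalized Cox proportional hazards fit, using synthetic default times with no censoring and no tied event times; this illustrates the model class, not the paper's full calibration procedure, and the covariates are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
X = rng.normal(size=(n, 2))                  # e.g. leverage, profitability
b_true = np.array([1.0, -0.5])
t = rng.exponential(1.0 / np.exp(X @ b_true))    # hazard rate exp(x'b)
order = np.argsort(t)
X, t = X[order], t[order]                    # sort by ascending event time

def fit_cox(lam=0.01, lr=0.1, steps=1000):
    """Gradient descent on the ridge-penalized negative log partial likelihood."""
    b = np.zeros(2)
    for _ in range(steps):
        w = np.exp(X @ b)
        denom = np.cumsum(w[::-1])[::-1]     # sum of w over each risk set
        num = np.cumsum((w[:, None] * X)[::-1], axis=0)[::-1]
        # score: observed covariate minus risk-set weighted average, per event
        grad = -(X - num / denom[:, None]).sum(axis=0) / n + lam * b
        b -= lr * grad
    return b

b_hat = fit_cox()
```

Because the times are sorted, each risk set is a suffix of the sample, so the risk-set sums reduce to reversed cumulative sums.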
Virok Sharma, Mohd Zaki, Kumar Neeraj Jha and N. M. Anoop Krishnan
Abstract
Purpose
This paper aims to use a data-driven approach towards optimizing construction operations. To this extent, it presents a machine learning (ML)-aided optimization approach, wherein the construction cost is predicted as a function of time, resources and environmental impact, which is further used as a surrogate model for cost optimization.
Design/methodology/approach
Taking a dataset from the literature, the paper applies various ML algorithms, namely, simple and regularized linear regression, random forest, gradient boosted trees, neural network and Gaussian process regression (GPR), to predict the construction cost as a function of time, resources and environmental impact. Further, the trained models were used to optimize the construction cost, applying single-objective (with and without constraints) and multi-objective optimizations and employing Bayesian optimization, particle swarm optimization (PSO) and the non-dominated sorting genetic algorithm.
Findings
The results presented in the paper demonstrate that the ensemble methods, such as gradient boosted trees, exhibit the best performance for construction cost prediction. Further, it shows that multi-objective optimization can be used to develop a Pareto front for two competing variables, such as cost and environmental impact, which directly allows a practitioner to make a rational decision.
Research limitations/implications
Note that the sequential nature of events which dictates the scheduling is not considered in the present work. This aspect could be incorporated in the future to develop a robust scheme that can optimize the scheduling dynamically.
Originality/value
The paper demonstrates that an ML approach coupled with optimization can enable the development of an efficient and economical strategy for planning construction operations.
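The surrogate-plus-optimizer loop might look as follows in miniature: fit a cheap quadratic surrogate of cost, then minimize the surrogate with a small particle swarm. The cost function, variable bounds and PSO hyperparameters are all invented for illustration and are not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(5)

def true_cost(x):                      # unknown in practice; sampled here
    d, crew = x[..., 0], x[..., 1]     # hypothetical: duration, crew size
    return (d - 30) ** 2 + 5 * (crew - 8) ** 2 + 100

# 1) sample noisy "project" data and fit a quadratic surrogate by least squares
Xs = rng.uniform([10, 2], [60, 20], size=(200, 2))
ys = true_cost(Xs) + rng.normal(scale=5, size=200)

def feats(X):
    return np.column_stack([np.ones(len(X)), X, X ** 2])

coef = np.linalg.lstsq(feats(Xs), ys, rcond=None)[0]

def surrogate(X):
    return feats(X) @ coef

# 2) minimize the surrogate with a tiny particle swarm (inertia 0.7, c1=c2=1.5)
pos = rng.uniform([10, 2], [60, 20], size=(30, 2))
vel = np.zeros_like(pos)
best_p, best_f = pos.copy(), surrogate(pos)
for _ in range(200):
    g = best_p[np.argmin(best_f)]      # swarm-best position so far
    vel = (0.7 * vel + 1.5 * rng.random((30, 2)) * (best_p - pos)
                     + 1.5 * rng.random((30, 2)) * (g - pos))
    pos = np.clip(pos + vel, [10, 2], [60, 20])
    f = surrogate(pos)
    improved = f < best_f
    best_p[improved], best_f[improved] = pos[improved], f[improved]
x_opt = best_p[np.argmin(best_f)]
```

Optimizing the surrogate rather than the true cost is the point: each surrogate evaluation is cheap, so the swarm can afford thousands of them.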
Details
Keywords
Alex A.T. Rathke, Amaury José Rezende and Christoph Watrin
Abstract
Purpose
This study investigates the impact of different transfer pricing rules on tax-induced profit shifting. Existing studies create different enforcement rankings of countries based on specific transfer pricing provisions on the assumption that larger penalties and more extensive information requirements imply higher tax enforcement. This assumption carries limitations related to the impact of transfer pricing rules in different countries and to the interaction of different tax rules. Instead, the authors propose a nonordered segregation of groups of countries with different transfer pricing rules, and they empirically investigate the impact of these transfer pricing rules on the profit-shifting behavior of firms.
Design/methodology/approach
The authors apply the hierarchical clustering method to analyze 57 observable quantitative and qualitative characteristics of transfer pricing rules of each country. This approach allows the creation of groups of countries based on a comprehensive set of regulatory characteristics, to investigate evidence of profit shifting for each of these separate groups. Profit-shifting behavior is measured by the variation in the volume of import and export transactions between local firms and related parties located in other countries.
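The clustering step can be illustrated in miniature: agglomerative, average-linkage grouping of synthetic "countries" described by binary regulatory features. The real study uses 57 observed characteristics; everything below, including the two latent regimes, is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
# two latent transfer-pricing regimes, 8 countries, 12 binary rule features
base = np.array([np.zeros(12), np.ones(12)])
labels = np.repeat([0, 1], 4)
F = (base[labels] + (rng.random((8, 12)) < 0.05)) % 2   # 5% feature noise

clusters = [[i] for i in range(8)]

def avg_dist(a, b):
    """Average pairwise Hamming distance between two groups of countries."""
    return np.mean([np.abs(F[i] - F[j]).sum() for i in a for j in b])

while len(clusters) > 2:               # merge closest pair until two groups
    pairs = [(a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))]
    a, b = min(pairs, key=lambda q: avg_dist(clusters[q[0]], clusters[q[1]]))
    clusters[a] += clusters.pop(b)
```

Stopping at two groups is arbitrary here; in practice the dendrogram would be cut where the regulatory characteristics suggest distinct rule families.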
Findings
The results indicate that firms have a higher volume of intrafirm transactions with related parties located in countries with a lower tax rate. This result is consistent with the profit-shifting hypothesis. Moreover, the results show that relevant differences in transfer pricing rules across countries produce different effects on the volume of intrafirm transactions. The authors observe that the existence of domestic transfer pricing rules that override the OECD Transfer Pricing Guidelines may inhibit profit shifting. In addition, the results suggest that the OECD guidelines may facilitate profit shifting. Overall, it is observed that some transfer pricing rules may be more effective than others in curbing profit shifting and that firms are still able to manipulate transfer prices under some tax rules.
Research limitations/implications
(1) The authors focus on the Brazilian context, which provides a suitable set of profit-shifting incentives for the analysis, since it combines an extreme corporate tax rate, a highly complex tax system and a unique set of transfer pricing rules. (2) Profit-shifting behavior is captured by the volume of intrafirm transactions. The authors would prefer to observe transfer prices directly; however, this information is not disclosed by firms, which may represent a limitation of the investigation. Nonetheless, theory shows that profit-shifting behavior is reflected in the manipulation of both transfer prices and intrafirm outputs.
Practical implications
The authors find that the volume of intrafirm transactions may decrease or increase, depending on the transfer pricing system of the foreign country (including the tax-differential effect). It suggests that some transfer pricing rules are more effective than others in curtailing the profit-shifting behavior and that firms are still able to find vulnerabilities in current rules and take advantage of them in deploying a profit-shifting strategy.
Social implications
Results provide knowledge about how key differences in transfer pricing rules across countries influence profit-shifting behavior. The results of the study may have valuable application in solving regulatory mismatches, eliminating blind spots in transfer pricing rules and thus contributing to the current review of the OECD guidelines and to the global tax reset movement.
Originality/value
Recent studies suggest that if tax-avoidance incentives are somewhat weak, it becomes difficult to observe the shifting behavior of firms. The puzzle is to check whether profit shifting is nonexistent under weak incentives or whether this is a matter of methodological limitations. The authors' analysis is applied to a complex tax background with strong profit-shifting incentives; thus, it allows the authors to obtain robust evidence of the shifting behavior and of the effect of different transfer pricing rules.
Details
Keywords
Adrian Gepp, Martina K. Linnenluecke, Terrence J. O’Neill and Tom Smith
Abstract
This paper analyses the use of big data techniques in auditing, and finds that the practice is not as widespread as it is in other related fields. We first introduce contemporary big data techniques to promote understanding of their potential application. Next, we review existing research on big data in accounting and finance. In addition to auditing, our analysis shows that existing research extends across three other genealogies: financial distress modelling, financial fraud modelling, and stock market prediction and quantitative modelling. Auditing is lagging behind the other research streams in the use of valuable big data techniques. A possible explanation is that auditors are reluctant to use techniques that are far ahead of those adopted by their clients, but we refute this argument. We call for more research and a greater alignment to practice. We also outline future opportunities for auditing in the context of real-time information and in collaborative platforms and peer-to-peer marketplaces.
Details
Keywords
Mohammad Arshad Rahman and Shubham Karnawat
Abstract
This article is motivated by the lack of flexibility in Bayesian quantile regression for ordinal models where the error follows an asymmetric Laplace (AL) distribution. The inflexibility arises because the skewness of the distribution is completely specified once a quantile is chosen. To overcome this shortcoming, we derive the cumulative distribution function (and the moment-generating function) of the generalized asymmetric Laplace (GAL) distribution – a generalization of the AL distribution that separates the skewness from the quantile parameter – and construct a working likelihood for the ordinal quantile model. The resulting framework is termed flexible Bayesian quantile regression for ordinal (FBQROR) models. However, its estimation is not straightforward. We address estimation issues and propose an efficient Markov chain Monte Carlo (MCMC) procedure based on Gibbs sampling and a joint Metropolis–Hastings algorithm. The advantages of the proposed model are demonstrated in multiple simulation studies, and the model is applied to analyze public opinion on homeownership as the best long-term investment in the United States following the Great Recession.
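The AL connection underlying the article can be illustrated without any MCMC: maximizing an asymmetric Laplace likelihood at quantile p is equivalent to minimizing the check loss rho_p(u) = u * (p - 1{u < 0}), which is exactly why the AL skewness is pinned down once p is chosen. A subgradient-descent sketch of the p = 0.75 conditional quantile on synthetic data (not the article's GAL model or data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4000
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(size=n)   # true 0.75-quantile: 1 + 2x + 0.6745
X = np.column_stack([np.ones(n), x])
p = 0.75

b = np.zeros(2)
for _ in range(4000):
    u = y - X @ b
    # subgradient of the mean check loss: -x * (p - 1{u < 0}), averaged
    b += 0.5 * (X.T @ (p - (u < 0))) / n
b0_hat, b1_hat = b
```

The GAL distribution of the article adds a separate shape parameter so that skewness is no longer tied to p; this sketch only shows the restrictive AL baseline it generalizes.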
Details
Keywords
Francesco Moscone, Veronica Vinciotti and Elisa Tosetti
Abstract
This chapter reviews graphical modeling techniques for estimating large covariance matrices and their inverse. The chapter provides a selective survey of different models and estimators proposed by the graphical modeling literature and offers some practical examples where these methods could be applied in the area of health economics.
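The central fact these graphical methods exploit, that zeros in the precision (inverse covariance) matrix encode conditional independence, can be checked numerically on a toy three-variable chain; the data are synthetic and the chain structure is an assumption chosen for clarity:

```python
import numpy as np

rng = np.random.default_rng(8)
Theta = np.array([[1.0, 0.4, 0.0],     # chain 1-2-3: edges (1,2) and (2,3),
                  [0.4, 1.0, 0.4],     # no edge between variables 1 and 3
                  [0.0, 0.4, 1.0]])
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(3), Sigma, size=20000)

Theta_hat = np.linalg.inv(np.cov(X, rowvar=False))

def pcorr(T, i, j):
    """Partial correlation of variables i and j given the rest."""
    return -T[i, j] / np.sqrt(T[i, i] * T[j, j])
```

Variables 1 and 3 are marginally correlated through variable 2, yet their estimated partial correlation is near zero, recovering the missing edge; penalized estimators such as the graphical lasso exploit exactly this sparsity when the dimension is large.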
Details