Search results

1 – 10 of over 7000
Open Access
Article
Publication date: 13 August 2020

Mariam AlKandari and Imtiaz Ahmad


Abstract

Solar power forecasting will have a significant impact on the future of large-scale renewable energy plants. Predicting photovoltaic power generation depends heavily on climate conditions, which fluctuate over time. In this research, we propose a hybrid model that combines machine-learning methods with the Theta statistical method for more accurate prediction of future solar power generation from renewable energy plants. The machine-learning models include long short-term memory (LSTM), gated recurrent unit (GRU), AutoEncoder LSTM (Auto-LSTM) and a newly proposed Auto-GRU. To enhance the accuracy of the proposed Machine Learning and Statistical Hybrid Model (MLSHM), we employ two diversity techniques, i.e. structural diversity and data diversity. To combine the predictions of the ensemble members in the proposed MLSHM, we exploit four combining methods: simple averaging, weighted averaging using a linear approach, weighted averaging using a non-linear approach, and combination through inverse variance. The proposed MLSHM scheme was validated on two real time-series datasets, namely Shagaya in Kuwait and Cocoa in the USA. The experiments show that the proposed MLSHM, using all the combination methods, achieved higher accuracy than the traditional individual models. Results demonstrate that a hybrid model combining machine-learning methods with a statistical method outperformed a hybrid model that combines only machine-learning models without a statistical method.
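As a rough illustration of the four combining methods named in this abstract, the sketch below implements simple averaging, linear and non-linear weighted averaging, and inverse-variance combination; the specific weighting schemes, function names and numbers are illustrative assumptions, not details from the paper.

```python
import numpy as np

def simple_average(preds):
    """Plain mean of the ensemble members; preds has shape (n_models, n_points)."""
    return preds.mean(axis=0)

def weighted_average_linear(preds, errors):
    """Weights proportional to the inverse of each member's validation error."""
    w = 1.0 / errors
    return (w / w.sum()) @ preds

def weighted_average_nonlinear(preds, errors):
    """Non-linear (softmax-style) sharpening of the weights."""
    w = np.exp(-errors)
    return (w / w.sum()) @ preds

def inverse_variance(preds):
    """Members weighted by the inverse variance of their own predictions."""
    w = 1.0 / preds.var(axis=1)
    return (w / w.sum()) @ preds

# Three ensemble members forecasting five time steps (invented numbers).
preds = np.array([[1.00, 1.10, 0.90, 1.20, 1.00],
                  [0.90, 1.00, 1.00, 1.10, 0.95],
                  [1.20, 1.30, 1.10, 1.40, 1.25]])
errors = np.array([0.10, 0.08, 0.20])   # e.g. validation RMSE per member
print(simple_average(preds))
print(weighted_average_linear(preds, errors))
print(weighted_average_nonlinear(preds, errors))
print(inverse_variance(preds))
```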

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964

Keywords

Open Access
Article
Publication date: 13 April 2022

Florian Schuberth, Manuel E. Rademaker and Jörg Henseler


Abstract

Purpose

This study aims to examine the role of an overall model fit assessment in the context of partial least squares path modeling (PLS-PM). In doing so, it explains when it is important to assess the overall model fit and provides ways of assessing the fit of composite models. Moreover, it resolves major concerns about model fit assessment that have been raised in the literature on PLS-PM.

Design/methodology/approach

This paper explains when and how to assess the fit of PLS path models. Furthermore, it discusses the concerns raised in the PLS-PM literature about the overall model fit assessment and provides concise guidelines on assessing the overall fit of composite models.

Findings

This study explains that the model fit assessment is as important for composite models as it is for common factor models. To assess the overall fit of composite models, researchers can use a statistical test and several fit indices known through structural equation modeling (SEM) with latent variables.
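For illustration, the sketch below computes one fit index known from SEM, the standardized root mean square residual (SRMR), as the root mean square difference between the empirical and model-implied correlation matrices; the matrices here are invented, and this is a generic SEM quantity rather than this study's procedure.

```python
import numpy as np

def srmr(S, Sigma):
    """S: empirical correlation matrix; Sigma: model-implied counterpart."""
    idx = np.tril_indices(S.shape[0])     # unique elements (lower triangle)
    resid = (S - Sigma)[idx]
    return float(np.sqrt(np.mean(resid ** 2)))

S = np.array([[1.00, 0.42, 0.35],
              [0.42, 1.00, 0.50],
              [0.35, 0.50, 1.00]])
Sigma = np.array([[1.00, 0.40, 0.38],
                  [0.40, 1.00, 0.47],
                  [0.38, 0.47, 1.00]])
print(round(srmr(S, Sigma), 4))  # values near zero indicate good fit
```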

Research limitations/implications

Researchers who use PLS-PM to assess composite models that aim to understand the mechanism of an underlying population and draw statistical inferences should take the concept of the overall model fit seriously.

Practical implications

To facilitate the overall fit assessment of composite models, this study presents a two-step procedure adopted from the literature on SEM with latent variables.

Originality/value

This paper clarifies that the necessity to assess model fit is not a question of which estimator will be used (PLS-PM, maximum likelihood, etc.), but of the purpose of statistical modeling. Whereas model fit assessment is paramount in explanatory modeling, it is not imperative in predictive modeling.

Details

European Journal of Marketing, vol. 57 no. 6
Type: Research Article
ISSN: 0309-0566

Keywords

Open Access
Article
Publication date: 17 October 2019

Petros Maravelakis


Abstract

Purpose

The purpose of this paper is to review some of the statistical methods used in the field of social sciences.

Design/methodology/approach

This paper reviews some of the statistical methodologies used in areas such as survey methodology, official statistics, sociology, psychology, political science, criminology, public policy, marketing research, demography, education and economics.

Findings

Several areas are presented, such as parametric modeling, nonparametric modeling and multivariate methods. Focus is also given to time-series modeling, the analysis of categorical data, sampling issues and other useful techniques for the analysis of data in the social sciences. Indicative references are given for all of the above methods, along with some insights into the application of these techniques.

Originality/value

This paper reviews some statistical methods that are used in social sciences, and the authors draw researchers' attention to less popular methods. The purpose is not to give technical details, nor to refer to all the existing techniques or to all the possible areas of statistics. The focus is mainly on the applied aspect of the techniques, and the authors give insights into techniques that can be used to answer problems in the abovementioned areas of research.

Details

Journal of Humanities and Applied Social Sciences, vol. 1 no. 2
Type: Research Article
ISSN:

Keywords

Content available
Book part
Publication date: 27 December 2016

Abstract

Details

Bad to Good
Type: Book
ISBN: 978-1-78635-333-7

Open Access
Article
Publication date: 25 April 2023

Manuela Cazzaro and Paola Maddalena Chiodini


Abstract

Purpose

Although the Net Promoter Score (NPS) index is simple, it has weaknesses that can make its interpretation misleading. The main criticism is that identical index values can correspond to different levels of customer loyalty, which makes it difficult to determine whether the company is improving or deteriorating between two different years. The authors describe the application of statistical tools to establish whether identical values may or may not be considered similar under statistical hypotheses.

Design/methodology/approach

Equal NPSs with a “similar” component composition should have a two-way table satisfying the marginal homogeneity hypothesis. The authors compare the marginals using a cumulative marginal logit model that assumes a proportional odds structure: the model has the same effect for each logit, and marginal homogeneity corresponds to a null effect. If the marginal homogeneity hypothesis is rejected, the cumulative odds ratio becomes a tool for measuring the proportionality between the odds.
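As a rough sketch of a marginal homogeneity test on such a two-way table, the snippet below applies the Stuart-Maxwell chi-square test as a simple stand-in for the cumulative marginal logit model described above; the NPS transition counts are invented.

```python
import numpy as np
from scipy.stats import chi2

def stuart_maxwell(table):
    """table: square contingency table, rows = time 1, columns = time 2."""
    k = table.shape[0]
    d = table.sum(axis=1)[:-1] - table.sum(axis=0)[:-1]   # marginal differences
    S = np.zeros((k - 1, k - 1))
    for i in range(k - 1):
        for j in range(k - 1):
            if i == j:
                S[i, j] = table[i].sum() + table[:, i].sum() - 2 * table[i, i]
            else:
                S[i, j] = -(table[i, j] + table[j, i])
    stat = d @ np.linalg.solve(S, d)
    return stat, chi2.sf(stat, k - 1)

# Rows/columns: Detractor, Passive, Promoter at two time points (invented).
table = np.array([[30.0, 15.0, 5.0],
                  [10.0, 40.0, 20.0],
                  [5.0, 10.0, 65.0]])
stat, p = stuart_maxwell(table)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")  # small p rejects marginal homogeneity
```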

Findings

The authors propose an algorithm that helps managers in their decision-making process. The authors' methodology provides a statistical tool to recognize customer base compositions. The authors suggest a statistical test of the marginal homogeneity of the table representing the index compositions at two points in time. Through the calculation of cumulative odds ratios, the authors can discriminate between NPS values that are genuinely equal and those that are merely numerically identical.

Originality/value

The authors' contribution provides a statistical alternative that business operators can easily implement to fill the known shortcomings of the index in the context of customer satisfaction. This paper confirms that although a single number summarizes and communicates a complex situation very quickly, the number is ambiguous and unreliable if not accompanied by other tools.

Details

The TQM Journal, vol. 35 no. 9
Type: Research Article
ISSN: 1754-2731

Keywords

Content available
Book part
Publication date: 2 July 2004

Abstract

Details

Functional Structure and Approximation in Econometrics
Type: Book
ISBN: 978-0-44450-861-4

Open Access
Article
Publication date: 17 August 2022

Jörg Henseler and Florian Schuberth


Abstract

Purpose

In their paper titled “A Miracle of Measurement or Accidental Constructivism? How PLS Subverts the Realist Search for Truth,” Cadogan and Lee (2022) cast serious doubt on PLS’s suitability for scientific studies. The purpose of this commentary is to discuss the claims of Cadogan and Lee, correct some inaccuracies, and derive recommendations for researchers using structural equation models.

Design/methodology/approach

This paper uses scenario analysis to show which estimators are appropriate for reflective measurement models and composite models, and formulates the statistical model that underlies PLS Mode A. It also contrasts two different perspectives: PLS as an estimator for structural equation models vs. PLS-SEM as an overarching framework with a sui generis logic.
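To make the discussion concrete, below is a compact sketch of the basic two-block PLS algorithm with Mode A (correlation-weight) outer estimation and a centroid inner scheme; the synthetic data, names and convergence settings are illustrative assumptions, and the snippet is not the authors' formulation of the statistical model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
eta = rng.normal(size=(n, 2)) @ np.array([[1.0, 0.6], [0.0, 0.8]])  # correlated scores
X1 = eta[:, [0]] @ np.ones((1, 3)) + 0.5 * rng.normal(size=(n, 3))  # block 1 indicators
X2 = eta[:, [1]] @ np.ones((1, 3)) + 0.5 * rng.normal(size=(n, 3))  # block 2 indicators

def standardize(A):
    return (A - A.mean(0)) / A.std(0)

X1, X2 = standardize(X1), standardize(X2)
w1, w2 = np.ones(3), np.ones(3)
for _ in range(100):
    y1 = standardize(X1 @ w1)                       # outer estimation: composites
    y2 = standardize(X2 @ w2)
    sgn = np.sign(np.corrcoef(y1, y2)[0, 1])        # centroid inner scheme
    z1, z2 = sgn * y2, sgn * y1
    w1_new = X1.T @ z1 / n                          # Mode A: correlation weights
    w2_new = X2.T @ z2 / n
    w1_new /= np.linalg.norm(w1_new)
    w2_new /= np.linalg.norm(w2_new)
    done = max(np.abs(w1_new - w1).max(), np.abs(w2_new - w2).max()) < 1e-10
    w1, w2 = w1_new, w2_new
    if done:
        break

print("outer weights, block 1:", np.round(w1, 3))
print("composite correlation:",
      round(np.corrcoef(standardize(X1 @ w1), standardize(X2 @ w2))[0, 1], 3))
```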

Findings

There are different variants of PLS, which include PLS, consistent PLS, PLSe1, PLSe2, proposed ordinal PLS and robust PLS, each of which serves a particular purpose. All of these are appropriate for scientific inquiry if applied properly. It is not PLS that subverts the realist search for truth, but some proponents of a framework called “PLS-SEM.” These proponents redefine the term “reflective measurement,” argue against the assessment of model fit and suggest that researchers could obtain “confirmation” for their model.

Research limitations/implications

Researchers should be more conscious, open and respectful regarding different research paradigms.

Practical implications

Researchers should select a statistical model that adequately represents their theory, not necessarily a common factor model, and formulate their model explicitly. Particularly for instrumentalists, pragmatists and constructivists, the composite model appears promising. Researchers should be concerned about their estimator’s properties, not about whether it is called “PLS.” Further, researchers should critically evaluate their model, not seek confirmation or blindly believe in its value.

Originality/value

This paper critically appraises Cadogan and Lee (2022) and reminds researchers who wish to use structural equation modeling, particularly PLS, for their statistical analysis, of some important scientific principles.

Content available
Article
Publication date: 26 October 2012


Abstract

Details

Journal of Modelling in Management, vol. 7 no. 3
Type: Research Article
ISSN: 1746-5664

Open Access
Article
Publication date: 15 February 2024

Di Kang, Steven W. Kirkpatrick, Zhipeng Zhang, Xiang Liu and Zheyong Bian


Abstract

Purpose

Accurately estimating the severity of derailment is a crucial step in quantifying train derailment consequences and, thereby, mitigating its impacts. The purpose of this paper is to propose a simplified approach aimed at addressing this research gap by developing a physics-informed 1-D model. The model is used to simulate train dynamics through a time-stepping algorithm, incorporating derailment data after the point of derailment.

Design/methodology/approach

In this study, a simplified approach is adopted that applies a 1-D kinematic analysis to data obtained from various derailments. These data include the length and weight of the rail cars behind the point of derailment, the train braking effects, derailment blockage forces, the grade of the track and the train rolling and aerodynamic resistance. Since train braking effects and derailment blockage forces are not always available for historical or potential train derailments, it is also necessary to fit the historical data and find optimal parameters to estimate these two variables. Using these fitted parameters, a detailed comparison can be performed between the physics-informed 1-D model and previous statistical models in predicting derailment severity.
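A minimal sketch of such a 1-D time-stepping ride-down computation is given below; the force terms follow the list above (braking, blockage, grade, rolling and aerodynamic resistance), but every parameter value and function name is an illustrative assumption rather than a figure from the paper.

```python
G = 9.81  # gravitational acceleration, m/s^2

def ride_down_distance(v0, mass, grade, brake_force, blockage_force,
                       c_roll=0.002, c_aero=4.0, dt=0.01):
    """Integrate the deceleration of the consist behind the point of
    derailment until it stops; returns the travel distance in metres.

    v0: initial speed (m/s); mass: consist mass (kg); grade: rise/run
    (positive = uphill); forces in newtons.
    """
    v, x = v0, 0.0
    while v > 0.0:
        resistance = c_roll * mass * G + c_aero * v * v   # rolling + aerodynamic
        decel = (brake_force + blockage_force + resistance) / mass + G * grade
        if decel <= 0.0:        # downhill grade overwhelms the retarding forces
            return float("inf")
        v = max(v - decel * dt, 0.0)
        x += v * dt
    return x

# Example: 2,000 t behind the point of derailment at 25 m/s on level track.
print(f"stopping distance: {ride_down_distance(25.0, 2.0e6, 0.0, 4.0e5, 2.0e5):.0f} m")
```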

Findings

The results show that the proposed model outperforms the Truncated Geometric model (the latest statistical model used in prior research) in estimating derailment severity. The proposed model contributes to the understanding and prevention of train derailments and hazmat release consequences, offering improved accuracy for certain scenarios and train types.

Originality/value

This paper presents a simplified physics-informed 1-D model, which could help understand the derailment mechanism and, thus, is expected to estimate train derailment severity more accurately for certain scenarios and train types compared with the latest statistical model. The performance of the braking response and the 1-D model is verified by comparing known ride-down profiles with estimated ones. This validation process ensures that both the braking response and the 1-D model accurately represent the expected behavior.

Details

Smart and Resilient Transportation, vol. 6 no. 1
Type: Research Article
ISSN: 2632-0487

Keywords

Open Access
Article
Publication date: 8 April 2020

Isabel María Parra Oller, Salvador Cruz Rambaud and María del Carmen Valls Martínez


Abstract

Purpose

The main purpose of this paper is to determine the discount function which best fits the individuals' preferences through the empirical analysis of the different functions used in the field of intertemporal choice.

Design/methodology/approach

After an in-depth review of the existing literature, and unlike most studies, which focus only on exponential and hyperbolic discounting, this manuscript compares how well the data fit six different discount functions. To do this, the analysis applies standard statistical methods and non-linear least-squares regression, using the Gauss-Newton algorithm to estimate the models' parameters; finally, the AICc criterion is used to compare the six proposed models.
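As a rough sketch of this fit-and-compare workflow, the snippet below fits one candidate discount function by non-linear least squares (SciPy's trust-region solver standing in for Gauss-Newton) and scores it with AICc; the simplified q-exponential form shown, the data points and the starting values are all illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import least_squares

def q_exponential(t, k, q):
    """Illustrative q-exponential discount function (q < 1)."""
    return (1.0 + k * (1.0 - q) * t) ** (-1.0 / (1.0 - q))

t = np.array([1.0, 3.0, 6.0, 12.0, 24.0, 36.0])      # delays (e.g. months)
v = np.array([0.95, 0.85, 0.75, 0.60, 0.45, 0.35])   # observed discounted values

res = least_squares(lambda p: q_exponential(t, *p) - v,
                    x0=[0.05, 0.5], bounds=([1e-6, 0.0], [10.0, 0.99]))
k_hat, q_hat = res.x

# Corrected Akaike criterion for least squares with p parameters, n points:
# AICc = n*ln(RSS/n) + 2p + 2p(p+1)/(n-p-1); lower values win the comparison.
n, p = len(t), 2
rss = float(np.sum(res.fun ** 2))
aicc = n * np.log(rss / n) + 2 * p + 2 * p * (p + 1) / (n - p - 1)
print(f"k = {k_hat:.3f}, q = {q_hat:.3f}, AICc = {aicc:.2f}")
```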

Findings

This paper shows that the so-called q-exponential function deformed by the amount is the model which best explains the individuals' preferences on both delayed gains and losses. To the best of the authors' knowledge, this is the first time that a function different from the general hyperbola has provided the best fit to the individuals' preferences.

Originality/value

This paper contributes to the search for an alternative model able to explain individual behavior in a more realistic way.

Details

European Journal of Management and Business Economics, vol. 30 no. 1
Type: Research Article
ISSN: 2444-8451

Keywords
