Search results
1 – 10 of over 96,000
Muhammad Zahir Khan and Muhammad Farid Khan
Abstract
Purpose
A significant number of studies have been conducted to analyze and understand the relationship between gas emissions and global temperature using conventional statistical approaches. However, these techniques follow the assumptions of probabilistic modeling, whose results can be associated with large errors. Furthermore, such traditional techniques cannot be applied to imprecise data. The purpose of this paper is to avoid strict assumptions when studying the complex relationships between variables by using three innovative, up-to-date statistical modeling tools: adaptive neuro-fuzzy inference systems (ANFIS), artificial neural networks (ANNs) and fuzzy time series models.
Design/methodology/approach
These three approaches enabled us to effectively represent the relationship between global carbon dioxide (CO2) emissions from the energy sector (oil, gas and coal) and the increase in average global temperature over the period 1900-2012. Investigations were conducted into the predictive power and performance of different fuzzy techniques against conventional methods and among the fuzzy techniques themselves.
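The fuzzy time series machinery behind these models can be sketched in outline. The paper's models (Singh, Abbasov–Mamedova, NFTS) are more elaborate; the sketch below uses Chen's classical FTS method as a simpler stand-in, and the toy temperature-anomaly series is invented, not the paper's data.

```python
# Chen-style fuzzy time series forecast: partition the data range into
# intervals, record which interval follows which (fuzzy logical
# relationships), and forecast with the mean midpoint of the consequents.
def chen_fts_forecast(series, n_intervals=5):
    """One-step-ahead in-sample forecasts for series[1:]."""
    lo, hi = min(series), max(series)
    pad = 0.1 * (hi - lo)
    lo, hi = lo - pad, hi + pad
    width = (hi - lo) / n_intervals
    mid = [lo + (i + 0.5) * width for i in range(n_intervals)]

    def fuzzify(x):  # index of the interval containing x
        return min(int((x - lo) / width), n_intervals - 1)

    states = [fuzzify(x) for x in series]

    # Fuzzy logical relationship groups: antecedent -> set of consequents
    groups = {}
    for a, b in zip(states, states[1:]):
        groups.setdefault(a, set()).add(b)

    forecasts = []
    for s in states[:-1]:
        cons = sorted(groups.get(s, {s}))
        forecasts.append(sum(mid[c] for c in cons) / len(cons))
    return forecasts

anomalies = [0.10, 0.12, 0.15, 0.13, 0.18, 0.22, 0.21, 0.25, 0.30, 0.28]
print(chen_fts_forecast(anomalies)[:3])
```

The variants compared in the paper differ mainly in how the intervals are chosen and how the relationship groups are weighted, but the fuzzify-relate-defuzzify structure is the same.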
Findings
A performance comparison of the ANFIS model against conventional techniques showed that the root mean square errors (RMSE) of ANFIS and the conventional techniques were 0.1157 and 0.1915, respectively. On the other hand, the correlation coefficients of the ANN and the conventional technique were computed to be 0.93 and 0.69, respectively. Furthermore, the fuzzy-based time series analysis of CO2 emissions and average global temperature using three fuzzy time series modeling techniques (Singh, Abbasov–Mamedova and NFTS) showed that the RMSE of the fuzzy and conventional time series models were 110.51 and 1237.10, respectively.
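The RMSE figures quoted above follow the standard definition, sketched here on invented observed/predicted values rather than the study's series.

```python
# Root mean square error: square root of the mean squared deviation
# between observed and predicted values.
import math

def rmse(observed, predicted):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

obs = [0.2, 0.3, 0.5, 0.4]     # toy observed values
pred = [0.25, 0.28, 0.45, 0.42]  # toy predictions
print(round(rmse(obs, pred), 4))
```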
Social implications
The paper raises awareness of the application of fuzzy techniques in studies of CO2 emissions.
Originality/value
These techniques can be extended to other models to assess the impact of CO2 emissions from other sectors.
Marc Gürtler and Thomas Paulsen
Abstract
Purpose
Empirical publications on the time series modeling and forecasting of electricity prices vary widely regarding the conditions, and the findings make it difficult to generalize results. Against this background, it is surprising that there is a lack of statistics-based literature reviews on the forecasting performance when comparing different models. The purpose of the present study is to fill this gap.
Design/methodology/approach
The authors conduct a comprehensive literature analysis from 2000 to 2015, covering 86 empirical studies on the time series modeling and forecasting of electricity spot prices. Various statistics are presented to characterize the empirical literature on electricity spot price modeling, and the forecasting performance of several model types and modifications is analyzed. The key issue of this study is to offer a comparison between different model types and modeling conditions regarding their forecasting performance, which is referred to as a quasi-meta-analysis, i.e. the analysis of analyses to achieve more general findings independent of the circumstances of single studies.
Findings
The authors find evidence that generalized autoregressive conditional heteroscedasticity models outperform their autoregressive–moving-average counterparts and that the consideration of explanatory variables improves forecasts.
Originality/value
To the best knowledge of the authors, this paper is the first to apply the methodology of meta-analyses in a literature review of the empirical forecasting literature on electricity spot markets.
Abstract
In the last few decades, there has been growing interest in forecasting with computational intelligence, and both fuzzy time series (FTS) and artificial neural networks (ANNs) have gained particular popularity, among others. Unlike conventional methods (e.g. econometrics), FTS and ANN models are usually thought to be immune to fundamental concepts such as stationarity, theoretical causality and post-sample control, among others. On the other hand, a number of studies have clearly indicated that these fundamental controls are required by the theory of forecasting, and that even the application of such essential procedures substantially improves forecasting accuracy. The aim of this paper is to fill the existing gap on modeling and forecasting in the FTS and ANN methods and to set out the fundamental concepts in a comprehensive work through merits and common failures in the literature. Beyond these merits, this paper may also serve as a guideline for eliminating unethical empirical settings in forecasting studies.
Marc Gürtler and Thomas Paulsen
Abstract
Purpose
Study conditions of empirical publications on time series modeling and forecasting of electricity prices vary widely, making it difficult to generalize results. The key purpose of the present study is to offer a comparison of different model types and modeling conditions regarding their forecasting performance.
Design/methodology/approach
The authors analyze the forecasting performance of AR (autoregressive), MA (moving average), ARMA (autoregressive moving average) and GARCH (generalized autoregressive conditional heteroscedasticity) models, with and without the explanatory variables power consumption and power generation from wind and solar. Additionally, the authors vary the detailed model specifications (choice of lag terms) and data transformations (differenced time series, log-prices) and thereby obtain individual results from various perspectives. All analyses are conducted on rolling calibration and testing time horizons between 2010 and 2014 on the German/Austrian electricity spot market.
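The rolling calibrate-then-test scheme described above can be illustrated in miniature. As a stand-in for the AR/MA/ARMA/GARCH specifications, the sketch below refits a simple AR(1) model by ordinary least squares on a sliding window; the price series is invented, not actual German/Austrian spot data.

```python
# Rolling one-step-ahead forecasting: at each step, refit the model on
# the most recent `window` observations, then forecast the next value.
def fit_ar1(y):
    """OLS fit of y[t] = a + b*y[t-1]; returns (a, b)."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    cov = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z))
    var = sum((xi - mx) ** 2 for xi in x)
    b = cov / var
    return mz - b * mx, b

def rolling_forecast(prices, window=8):
    """One-step forecasts, recalibrating on a rolling window."""
    out = []
    for t in range(window, len(prices)):
        a, b = fit_ar1(prices[t - window:t])
        out.append(a + b * prices[t - 1])
    return out

prices = [30, 32, 31, 35, 34, 36, 38, 37, 40, 39, 42, 41]  # toy spot prices
print(rolling_forecast(prices))
```

The paper's design varies what happens inside `fit_ar1` (model family, lags, transformation) while keeping this outer rolling loop fixed, which is what makes the forecasts comparable.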
Findings
The main result is that the best forecasts are generated by ARMAX models after spike preprocessing and differencing the data.
Originality/value
The present study extends the existing literature on electricity price forecasting by conducting a comprehensive analysis of the forecasting performance of different time series models under varying market conditions. The results of this study, in general, support the decision-making of electricity spot price modelers or forecasting tools regarding the choice of data transformation, segmentation and the specific model selection.
Abstract
Purpose
The keywords in patent documents contain a great deal of information about technology. If we analyze the time series of keywords, we can understand even more about technological evolution. Previous research on time series processes in patent analysis was based on time series regression or the Box-Jenkins methodology, methods that deal with continuous time series data. However, keyword time series data in patent analysis are not continuous; they are integer-valued frequencies. A new methodology based on integer-valued time series models is therefore needed. The purpose of this paper is to propose integer-valued time series modeling for patent analysis.
Design/methodology/approach
To model the frequency data of keywords, the authors use an integer-valued generalized autoregressive conditional heteroskedasticity (INGARCH) model with Poisson and negative binomial distributions. Using the proposed models, the authors forecast the future trends of Apple's target keywords in order to anticipate Apple's future technology.
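The core of an integer-valued GARCH model with a Poisson response is a recursion for the conditional mean, which also serves as the one-step count forecast. The sketch below shows an INGARCH(1,1) recursion; the parameter values and toy keyword frequencies are invented, not estimates from the Apple patent data.

```python
# INGARCH(1,1) with Poisson response: counts x[t] ~ Poisson(lam[t]),
# where lam[t] = omega + alpha*x[t-1] + beta*lam[t-1].
def ingarch_means(counts, omega=0.5, alpha=0.4, beta=0.3):
    """Conditional means; lam[t] is the one-step forecast of counts[t]."""
    lam = [omega / (1 - alpha - beta)]  # start at the stationary mean
    for x in counts[:-1]:
        lam.append(omega + alpha * x + beta * lam[-1])
    return lam

yearly_counts = [2, 3, 1, 4, 6, 5, 7, 4]  # toy keyword frequencies per year
lams = ingarch_means(yearly_counts)
print([round(l, 3) for l in lams])
```

Unlike a Gaussian ARMA fit to the counts, the Poisson (or negative binomial) response respects the discrete, non-negative nature of keyword frequencies, which is the paper's motivation for the model class.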
Findings
The authors carry out a case study to illustrate how the methodology can be applied to a real problem. They collect the patent documents issued by Apple and analyze them to find the company's technological trends. From the results of the Apple case study, the authors can identify which technological keywords are more important or critical in the entire structure of Apple's technologies.
Practical implications
This paper contributes to research and development planning for producing new products. Through a complete understanding of technological keyword trends, a company can develop and launch innovative products that improve its technological competitiveness.
Originality/value
Patent documents retrieved from patent databases are not suitable for statistical analysis, so the authors have to transform the documents into structured data suitable for statistics. In general, the structured data form a matrix of patents (rows) and keywords (columns), whose elements are the occurrence frequencies of a keyword in each patent. This data type is discrete, not continuous; however, most previous research analyzed such data with statistical methods designed for continuous data. In this paper, the authors build a statistical model based on discrete data.
Abstract
Purpose
The purpose of this paper is to propose a new temporal disaggregation method for time series based on the accumulated and inverse accumulated generating operations in grey modeling and the interpolation method.
Design/methodology/approach
This disaggregation method includes three main steps: accumulation, interpolation and differentiation (AID). First, a low-frequency flow series is transformed into the corresponding stock series through the accumulated generating operation. Then, the values of the stock series at unobserved times are estimated through an appropriate interpolation method. Finally, the disaggregated stock series is transformed back into a high-frequency flow series through the inverse accumulated generating operation.
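The three AID steps can be sketched directly. Linear interpolation is used here as a stand-in for whatever interpolant the paper selects, and the annual sales figures are invented.

```python
# AID temporal disaggregation: accumulate flows into a stock series,
# interpolate the stock at unobserved sub-period times, then difference
# back to recover high-frequency flows.
def aid_disaggregate(flows, k=4):
    """Split each low-frequency flow value into k sub-period flows."""
    # 1. Accumulation: stock at the end of each period (AGO).
    stock = [0.0]
    for f in flows:
        stock.append(stock[-1] + f)
    # 2. Interpolation: stock at k evenly spaced points per period.
    fine = []
    for i in range(len(flows)):
        for j in range(1, k + 1):
            fine.append(stock[i] + (stock[i + 1] - stock[i]) * j / k)
    # 3. Differentiation: inverse AGO recovers the sub-period flows.
    prev, out = 0.0, []
    for s in fine:
        out.append(s - prev)
        prev = s
    return out

annual_sales = [100.0, 120.0, 90.0]  # toy annual flow series
print([round(q, 2) for q in aid_disaggregate(annual_sales)])
```

By construction the sub-period flows sum back exactly to each original value. With linear interpolation the result is uniform within each period; a smoother interpolant (e.g. a cubic spline over the stock series) would yield a non-uniform, smoother profile, which is presumably why the choice of interpolant matters in the method.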
Findings
The AID method is tested on a sales series. Results show that the disaggregated sales data are satisfactory and reliable compared with the original data and with data disaggregated using a time series model. The AID method is applicable both to long time series and to grey series with insufficient information.
Practical implications
The AID method can be easily used to disaggregate low frequency flow series.
Originality/value
The AID method combines the grey modeling technique with an interpolation method. Compared with other disaggregation methods, the AID method is simple and does not require the auxiliary information or plausible minimizing criterion that other disaggregation methods rely on.
A. Kullaya Swamy and Sarojamma B.
Abstract
Purpose
Data mining plays a major role in forecasting the opening price of the stock market. However, it fails to address dimensionality and the expectations of a naive investor. Hence, this paper aims to implement a future prediction model based on time series modeling.
Design/methodology/approach
In this model, the stock market data are fed to the proposed deep belief network (DBN), and the number of hidden neurons is optimized by a modified JAYA algorithm (JA) based on the fitness function. The algorithm is hence termed the fitness-oriented JA (FJA), and the proposed model is termed FJA-DBN. The primary objective of this open price forecasting model is the minimization of the error function between the modeled and actual output.
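The JAYA update rule that FJA builds on moves each candidate toward the current best solution and away from the worst, with no algorithm-specific tuning parameters. The sketch below applies plain JAYA to a toy one-dimensional fitness standing in for the DBN forecasting-error fitness; the optimum location and bounds are invented.

```python
# Plain JAYA: candidate update x' = x + r1*(best - |x|) - r2*(worst - |x|),
# accepted only if it improves the fitness (here: minimization).
import random

def jaya_minimize(fitness, lo, hi, pop_size=10, iters=50, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(iters):
        best = min(pop, key=fitness)
        worst = max(pop, key=fitness)
        new_pop = []
        for x in pop:
            r1, r2 = rng.random(), rng.random()
            cand = x + r1 * (best - abs(x)) - r2 * (worst - abs(x))
            cand = min(max(cand, lo), hi)  # keep within bounds
            new_pop.append(cand if fitness(cand) < fitness(x) else x)
        pop = new_pop
    return min(pop, key=fitness)

# Toy fitness: pretend the forecast error is minimized near 37 neurons.
best = jaya_minimize(lambda n: (n - 37.0) ** 2, lo=1, hi=100)
print(round(best, 2))
```

In FJA-DBN the fitness evaluation would involve training the DBN with the candidate hidden-neuron count and measuring its forecast error, which is far costlier than the toy quadratic used here.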
Findings
The performance analysis demonstrates that FJA-DBN predicts the open prices of the Tata Motors, Reliance Power and Infosys data with smaller deviation, performing better in terms of mean error percentage, symmetric mean absolute percentage error, mean absolute scaled error, mean absolute error, root mean square error, L1-norm, L2-norm and infinity-norm (least infinity error).
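Two of the less common error measures in that list, sMAPE and MASE, are sketched below on invented actual/forecast values rather than the study's stock data.

```python
# sMAPE scales each error by the average magnitude of actual and
# forecast; MASE scales the mean absolute error by that of the naive
# one-step (random walk) forecast.
def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    return 100.0 / len(actual) * sum(
        abs(f - a) / ((abs(a) + abs(f)) / 2.0)
        for a, f in zip(actual, forecast))

def mase(actual, forecast):
    """Mean absolute scaled error against the naive one-step forecast."""
    mae = sum(abs(f - a) for a, f in zip(actual, forecast)) / len(actual)
    naive = sum(abs(b - a)
                for a, b in zip(actual, actual[1:])) / (len(actual) - 1)
    return mae / naive

actual = [100.0, 110.0, 105.0, 120.0]    # toy actual prices
forecast = [102.0, 108.0, 109.0, 118.0]  # toy forecasts
print(round(smape(actual, forecast), 3), round(mase(actual, forecast), 3))
```

A MASE below 1 means the model beats the naive forecast on average, which makes it a convenient scale-free companion to the raw error norms.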
Research limitations/implications
The proposed model can be used to forecast the open price details.
Practical implications
Investors constantly review past pricing history and use it to influence their future investment decisions. This analysis rests on some basic assumptions: first, that everything significant about a company is already priced into the stock; second, that prices move in trends.
Originality/value
This paper presents a technique for time series modeling using JA. This is the first work that uses FJA-based optimization for stock market open price prediction.
Doris Chenguang Wu, Haiyan Song and Shujie Shen
Abstract
Purpose
The purpose of this paper is to review recent studies published from 2007 to 2015 on tourism and hotel demand modeling and forecasting with a view to identifying the emerging topics and methods studied and to pointing future research directions in the field.
Design/methodology/approach
Articles on tourism and hotel demand modeling and forecasting, published mostly in science citation index and social sciences citation index journals, were identified and analyzed.
Findings
This review finds that fewer studies have focused on hotel demand than on tourism demand. It is also observed that more and more studies have moved away from aggregate tourism demand analysis, whereas disaggregate markets and niche products have attracted increasing attention. Some studies have gone beyond neoclassical economic theory to seek additional explanations of the dynamics of tourism and hotel demand, such as environmental factors, tourist online behavior and consumer confidence indicators, among others. More sophisticated techniques, such as nonlinear smooth transition regression, mixed-frequency modeling and nonparametric singular spectrum analysis, have also been introduced to this research area.
Research limitations/implications
The main limitation of this review is that it covers only the English-language literature. Future reviews of this kind should also include articles published in other languages. The review provides a useful guide for researchers who are interested in future research on tourism and hotel demand modeling and forecasting.
Practical implications
This review provides important suggestions and recommendations for improving the efficiency of tourism and hospitality management practices.
Originality/value
The value of this review is that it identifies the current trends in tourism and hotel demand modeling and forecasting research and points out future research directions.
Pawel D. Domanski and Mateusz Gintrowski
Abstract
Purpose
This paper aims to present the results of a comparison between different approaches to the prediction of electricity prices. It is well known that the properties of the data generation process may favor some modeling methods over others. Data originating in social or market processes are characterized by an unexpectedly wide realization space, resulting in long tails in the probability density function. Such data may not be easy to predict using standard approaches based on normal distribution assumptions. Electricity prices on the deregulated market fall into this category.
Design/methodology/approach
The paper presents alternative approaches, i.e. memory-based prediction and a fractal approach, compared with the established nonlinear method of neural networks. The appropriate interpretation of results is supported by statistical data analysis and data conditioning. These algorithms have been applied to the problem of energy price prediction on the deregulated electricity market, with data from the Polish and Austrian energy stock exchanges.
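The memory-based idea is to search the price history for the segment most similar to the most recent pattern and reuse what followed it as the forecast. The matching rule below (Euclidean distance over a fixed window) is a generic stand-in for the paper's customized history-searching routine, and the prices are invented.

```python
# Nearest-neighbor (analog) forecasting: find the past window closest
# to the latest pattern and predict its successor value.
def memory_based_forecast(history, window=3):
    """Predict the next value by nearest-neighbor pattern matching."""
    pattern = history[-window:]
    best_dist, best_next = None, None
    # Slide over all past windows that have a known successor value.
    for i in range(len(history) - window):
        segment = history[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(segment, pattern))
        if best_dist is None or dist < best_dist:
            best_dist, best_next = dist, history[i + window]
    return best_next

prices = [50, 52, 55, 53, 50, 52, 55, 54, 49, 51, 55, 53, 50, 52, 55]
print(memory_based_forecast(prices))
```

This approach makes no distributional assumption at all, which is what makes it attractive for the heavy-tailed price data described above; the cost is that every forecast rescans the stored history.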
Findings
The first outcome of the analysis is that there are several situations in time series prediction where the standard modeling assumption, namely that each change is independent of the last and follows a random Gaussian bell pattern, may not be true. In this paper, such a case was considered: price data from energy markets, which are biased by human nature. It is shown that the Cauchy probability distribution was more relevant to the data's properties. The results showed that alternative approaches may be used; for both data sets, the memory-based approach yielded the best performance.
Research limitations/implications
“Personalization” of the model is a crucial aspect of the whole methodology. All available knowledge about the forecasted phenomenon should be used and incorporated into the model. In the case of memory-based modeling, this means a specific design of the history-searching routine that exploits an understanding of the process features. The emphasis should shift toward methodology structure design and algorithm customization, and only then to parameter estimation. Such a modeling approach may be more descriptive for the user, enabling understanding of the process and further iterative improvement in a continuous striving for perfection.
Practical implications
Memory-based modeling can be applied in practice, and these models have large potential that is worth exploiting. One disadvantage of this modeling approach is the large computational effort connected with the need for constant evaluation of large data sets. It was shown that a graphics processing unit (GPU) approach, through parallel calculation on graphics cards, can improve this dramatically.
Social implications
The modeling of electricity prices has a big impact on the daily operation of electricity traders and distributors. On the one hand, appropriate modeling can improve performance, mitigating the risks associated with the process. Thus, end users should ultimately receive a higher quality of service, with lower prices and a minimized risk of energy loss incidents.
Originality/value
The use of alternative approaches, such as memory-based reasoning or fractals, is very rare in the field of electricity price forecasting. Thus, it gives a new impetus to further research, enabling the development of better solutions that incorporate all available process knowledge and customized hybrid algorithms.