Search results

1 – 10 of over 1000
Article
Publication date: 22 July 2014

Qingfeng Hu, Bin Lei, Kaifeng Ma and Tiesheng Wang

In order to study the laws of surface subsidence due to mining coal under thick alluvium of Quandian mine, two survey lines were designed on the surface above the first face. By…

Abstract

In order to study the laws of surface subsidence caused by mining coal under the thick alluvium of Quandian mine, two survey lines were designed on the surface above the first face. By analyzing the position of the maximum subsidence point of the dip survey line, the zenith angle was obtained, confirming the correctness of using bedrock thickness instead of average mining depth to design the position of the strike survey line in this mining area. Analysis of the measured data further showed that, under these geological and mining conditions, the critical size of the gob for full extraction did not follow the general rule that the gob reaches critical size when the panel has advanced a distance of 1.2H0 - 1.4H0. These results indicate that the bedrock plays the major role in the propagation of overburden subsidence under these conditions, while the loose layer has less effect. In addition, based on the measured data, several rock movement parameters and probability integral prediction parameters for surface subsidence at Quandian mine were obtained. The study has theoretical significance and practical value for mining beneath buildings, water bodies and railways in this coalfield, as well as for subsidence prevention and the control of damage to surface structures.
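
For readers unfamiliar with the probability integral prediction parameters mentioned above, the sketch below shows how the classical probability integral method computes a subsidence profile over a semi-infinite panel. The parameter names and values are illustrative only and are not the Quandian results reported in the paper.

```python
import numpy as np
from scipy.special import erf

def probability_integral_subsidence(x, m, q, alpha_deg, H, tan_beta):
    """Surface subsidence W(x) over a semi-infinite extraction under the
    classical probability integral method (illustrative parameters only).

    x         -- horizontal distance from the panel edge (m)
    m         -- mining height (m)
    q         -- subsidence factor (dimensionless)
    alpha_deg -- seam dip angle (degrees)
    H         -- mining depth (m)
    tan_beta  -- tangent of the main influence angle
    """
    W0 = m * q * np.cos(np.radians(alpha_deg))  # maximum possible subsidence
    r = H / tan_beta                            # main influence radius
    return 0.5 * W0 * (erf(np.sqrt(np.pi) * x / r) + 1.0)

x = np.linspace(-400, 400, 9)
print(probability_integral_subsidence(x, m=3.0, q=0.8, alpha_deg=5.0, H=500.0, tan_beta=2.0))
```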

Details

World Journal of Engineering, vol. 11 no. 3
Type: Research Article
ISSN: 1708-5284

Keywords

Book part
Publication date: 1 December 2016

Jacob Dearmon and Tony E. Smith

Statistical methods of spatial analysis are often successful at either prediction or explanation, but not necessarily both. In a recent paper, Dearmon and Smith (2016) showed that…

Abstract

Statistical methods of spatial analysis are often successful at either prediction or explanation, but not necessarily both. In a recent paper, Dearmon and Smith (2016) showed that by combining Gaussian Process Regression (GPR) with Bayesian Model Averaging (BMA), a modeling framework could be developed in which both needs are addressed. In particular, the smoothness properties of GPR together with the robustness of BMA allow local spatial analyses of individual variable effects that yield remarkably stable results. However, this GPR-BMA approach is not without its limitations. In particular, the standard (isotropic) covariance kernel of GPR treats all explanatory variables in a symmetric way that limits the analysis of their individual effects. Here we extend this approach by introducing a mixture of kernels (both isotropic and anisotropic) which allow different length scales for each variable. To do so in a computationally efficient manner, we also explore a number of Bayes-factor approximations that avoid the need for costly reversible-jump Monte Carlo methods.

To demonstrate the effectiveness of this Variable Length Scale (VLS) model in terms of both predictions and local marginal analyses, we employ selected simulations to compare VLS with Geographically Weighted Regression (GWR), which is currently the most popular method for such spatial modeling. In addition, we employ the classical Boston Housing data to compare VLS not only with GWR but also with other well-known spatial regression models that have been applied to this same data. Our main results show that VLS not only compares favorably with spatial regression at the aggregate level but is also far more accurate than GWR at the local level.
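
As a rough illustration of the per-variable length-scale idea (not the authors' GPR-BMA implementation), the sketch below contrasts an isotropic RBF kernel with an anisotropic one in scikit-learn; the fitted length scales indicate how relevant each variable is.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))           # e.g. two covariates plus a spatial coordinate
y = np.sin(6 * X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.05, 200)

# isotropic kernel: one shared length scale for every variable
iso = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=1.0), alpha=1e-3)

# anisotropic kernel: a separate length scale per variable,
# similar in spirit to the variable-length-scale idea described above
ard = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=[1.0, 1.0, 1.0]), alpha=1e-3)

for name, gp in [("isotropic", iso), ("per-variable", ard)]:
    gp.fit(X, y)
    print(name, gp.kernel_)   # fitted length scales reveal each variable's relevance
```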

Details

Spatial Econometrics: Qualitative and Limited Dependent Variables
Type: Book
ISBN: 978-1-78560-986-2

Keywords

Book part
Publication date: 30 November 2011

Massimo Guidolin

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov…

Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. In particular, distinct attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states on the basis of the data, to provide a powerful tool for testing hypotheses formulated in light of financial theories, and to their forecasting performance with reference to both point and density predictions. The review covers papers from a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
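
As a minimal illustration of regime filtering (not drawn from the chapter itself), the sketch below fits a two-regime Markov switching model to simulated returns with statsmodels and extracts the smoothed regime probabilities.

```python
import numpy as np
import statsmodels.api as sm

# simulate returns that alternate between a calm and a volatile regime
rng = np.random.default_rng(1)
regime = np.zeros(500, dtype=int)
for t in range(1, 500):                        # sticky two-state Markov chain
    regime[t] = regime[t - 1] if rng.uniform() < 0.95 else 1 - regime[t - 1]
returns = np.where(regime == 0,
                   rng.normal(0.05, 0.5, 500),
                   rng.normal(-0.10, 2.0, 500))

# two regimes, switching mean and switching variance
res = sm.tsa.MarkovRegression(returns, k_regimes=2, switching_variance=True).fit()
print(res.params)
print(res.smoothed_marginal_probabilities[:5])  # inferred regime probabilities
```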

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6

Keywords

Book part
Publication date: 3 June 2008

Nathaniel T. Wilcox

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with…

Abstract

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with “structural” theories of choice under risk. Stochastic models are substantive theoretical hypotheses that are frequently testable in and of themselves, and they also serve as identifying restrictions for hypothesis tests, estimation and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), choices of stochastic models may be far more consequential than choices of structures such as expected utility or rank-dependent utility.
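
As a hypothetical illustration of how a stochastic model layers onto a structural theory (a generic logit-on-expected-utility sketch, not one of the five models compared in the chapter), consider:

```python
import numpy as np

def expected_utility(lottery, crra):
    """Expected utility of a lottery [(prob, outcome), ...] under CRRA utility."""
    u = lambda x: x ** (1 - crra) / (1 - crra)
    return sum(p * u(x) for p, x in lottery)

def logit_choice_prob(lottery_a, lottery_b, crra, precision):
    """Probability of choosing A over B under a logit stochastic model
    layered on an expected-utility structure (illustrative only)."""
    diff = expected_utility(lottery_a, crra) - expected_utility(lottery_b, crra)
    return 1.0 / (1.0 + np.exp(-precision * diff))

A = [(0.5, 30.0), (0.5, 20.0)]   # safer lottery
B = [(0.5, 50.0), (0.5, 5.0)]    # riskier lottery
print(logit_choice_prob(A, B, crra=0.5, precision=0.8))
```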

Details

Risk Aversion in Experiments
Type: Book
ISBN: 978-1-84950-547-5

Article
Publication date: 11 October 2019

Wei Qin, Huichun Lv, Chengliang Liu, Datta Nirmalya and Peyman Jahanshahi

With the growing adoption of lithium-ion batteries, it is increasingly important to ensure their safe use. The purpose of this paper is to analyze the battery operation…

Abstract

Purpose

With the growing adoption of lithium-ion batteries, it is increasingly important to ensure their safe use. The purpose of this paper is to analyze battery operation data, estimate the remaining life of the battery and provide effective information to the user to avoid the risk of battery accidents.

Design/methodology/approach

The particle filter (PF) algorithm is taken as the core: the double-exponential model is used as the state equation and an artificial neural network is used as the observation equation. After the importance resampling process, the posterior parameters are obtained and the battery degradation curve is reconstructed, from which the system can estimate the remaining useful life (RUL).
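
A minimal sketch of the filtering step is given below. It uses the double-exponential degradation model as the state equation but, for simplicity, a direct noisy capacity measurement as the observation equation instead of the paper's artificial neural network; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def capacity(params, k):
    """Double-exponential degradation model: Q_k = a*exp(b*k) + c*exp(d*k)."""
    a, b, c, d = params.T
    return a * np.exp(b * k) + c * np.exp(d * k)

# synthetic capacity "measurements" from assumed true parameters (illustrative only)
true = np.array([[1.0, -0.002, 0.1, -0.02]])
cycles = np.arange(1, 101)
obs = capacity(true, cycles[:, None]).ravel() + rng.normal(0, 0.005, cycles.size)

# particle filter over the four model parameters
N = 2000
particles = true + rng.normal(0, [0.05, 0.001, 0.05, 0.005], size=(N, 4))  # prior guess near truth, for brevity
weights = np.full(N, 1.0 / N)
sigma_obs, sigma_proc = 0.01, np.array([1e-3, 1e-5, 1e-3, 1e-4])

for k, z in zip(cycles, obs):
    particles = particles + rng.normal(0, sigma_proc, size=(N, 4))   # state transition (random walk)
    lik = np.exp(-0.5 * ((z - capacity(particles, k)) / sigma_obs) ** 2)
    weights = weights * lik + 1e-300
    weights /= weights.sum()
    idx = rng.choice(N, size=N, p=weights)                           # importance resampling
    particles, weights = particles[idx], np.full(N, 1.0 / N)

# extrapolate the posterior-mean degradation curve to an end-of-life threshold
post = particles.mean(axis=0)
threshold = 0.77                                                     # EOL capacity (illustrative)
future = np.arange(cycles[-1], 1000)
pred = capacity(post[None, :], future[:, None]).ravel()
below = np.nonzero(pred < threshold)[0]
rul = future[below[0]] - cycles[-1] if below.size else None
print("posterior parameters:", post)
print("estimated RUL (cycles):", rul)
```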

Findings

Experiments were carried out using a public data set. The results show that the Bayesian posterior estimation model has a good predictive effect and fits the degradation curve of the battery well, and that the prediction accuracy increases gradually as the number of cycles increases.

Originality/value

This paper combines the advantages of the data-driven method and the PF algorithm. The proposed method has good prediction accuracy and provides an uncertainty expression for the RUL of the battery. Besides, the method is relatively easy to implement in a battery management system, which gives it high practical value and allows battery usage risk to be reduced effectively for driver safety.

Details

Industrial Management & Data Systems, vol. 120 no. 2
Type: Research Article
ISSN: 0263-5577

Keywords

Article
Publication date: 17 October 2023

Zhixun Wen, Fei Li and Ming Li

The purpose of this paper is to apply the concept of equivalent initial flaw size (EIFS) to the anisotropic nickel-based single crystal (SX) material, and to predict the fatigue…

Abstract

Purpose

The purpose of this paper is to apply the concept of equivalent initial flaw size (EIFS) to the anisotropic nickel-based single crystal (SX) material and to predict fatigue life on this basis. The crack propagation law of the SX material at different temperatures is also investigated, and the weak dependence of EIFS values on the loading conditions is verified.

Design/methodology/approach

A three-parameter time to crack initiation (TTCI) method with multiple reference crack lengths under different loading conditions is established, which includes the TTCI backstepping method and the EIFS fitting method. Subsequently, the optimized EIFS distribution is obtained based on the random crack propagation rate and maximum likelihood estimation of the median fatigue life. Then, an effective driving force based on the anisotropic and mixed crack propagation mode is proposed to describe the crack propagation rate in the small-crack stage. Finally, the fatigue life of ESE(T) standard specimens at three different temperatures is predicted based on the EIFS values under different survival rates.
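
As a generic illustration of how an EIFS value feeds a fracture-mechanics life prediction (using a plain long-crack Paris law rather than the paper's anisotropic effective driving force, and with made-up parameter values), consider:

```python
import numpy as np
from scipy.integrate import quad

# Generic Paris-law life integration from an equivalent initial flaw size (EIFS)
# to a critical crack length; values are illustrative, consistent units assumed.
C, m = 1e-11, 3.0           # Paris constants in da/dN = C * (dK)^m
Y = 1.12                    # geometry factor
dsigma = 400.0              # stress range, MPa
a_eifs = 0.05               # equivalent initial flaw size, mm
a_crit = 2.0                # critical crack length, mm

def dN_da(a):
    dK = Y * dsigma * np.sqrt(np.pi * a)   # stress intensity factor range
    return 1.0 / (C * dK ** m)

life, _ = quad(dN_da, a_eifs, a_crit)
print(f"predicted life: {life:.0f} cycles")
```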

Findings

The optimized EIFS distribution based on the EIFS fitting and maximum likelihood estimation (MLE) method has the highest accuracy in predicting the total fatigue life, with EIFS values in the range of about [0.0028, 0.0875] mm and a mean EIFS of 0.0506 mm. The fatigue life predicted from the crack propagation rate and the EIFS distribution at survival rates from 5% to 95% agrees with the experimental life to within a two-fold dispersion band.

Originality/value

This paper systematically proposes a new EIFS prediction method for anisotropic materials, establishing a fracture-mechanics framework for predicting the fatigue life of SX material at different temperatures that avoids reliance on inaccurate anisotropic constitutive models and fatigue damage accumulation theory.

Details

Multidiscipline Modeling in Materials and Structures, vol. 19 no. 6
Type: Research Article
ISSN: 1573-6105

Keywords

Article
Publication date: 3 March 2021

Ye Li, Yuanping Ding, Yaqian Jing and Sandang Guo

The purpose of this paper is to construct an interval grey number NGM(1,1) direct prediction model (abbreviated as IGNGM(1,1)), which does not require transforming interval grey numbers…

Abstract

Purpose

The purpose of this paper is to construct an interval grey number NGM(1,1) direct prediction model (abbreviated as IGNGM(1,1)), which does not require transforming interval grey number sequences into real number sequences; a Markov model is then used to optimize the residual sequences of the IGNGM(1,1) model.

Design/methodology/approach

A defining equation of the IGNGM(1,1) model is proposed, and its time response function is solved by a recursive iteration method. Next, the optimal weight of the development coefficients of the two boundaries is obtained by a genetic algorithm designed to minimize the time-weighted average relative error. In addition, a Markov model is used to modify the residual sequences.
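
The construction rests on the standard GM(1,1) machinery (accumulated generating operation, background values and a least-squares estimate of the development coefficient and grey input). A minimal real-valued GM(1,1) sketch is given below; the paper's IGNGM(1,1) applies a related construction directly to the lower and upper bounds of the interval sequence, with the boundary weight optimized by the genetic algorithm.

```python
import numpy as np

def gm11_forecast(x0, horizon=3):
    """Classical GM(1,1) grey prediction for a real-valued sequence.
    Shown only to illustrate the basic mechanics behind the interval model."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # development coefficient, grey input
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a # time response function
    return np.concatenate([[x0[0]], np.diff(x1_hat)]) # restored (inverse AGO) series

series = [2.87, 3.28, 3.34, 3.47, 3.52, 3.75]         # illustrative data
print(gm11_forecast(series, horizon=2))
```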

Findings

Interval grey number sequences can be predicted directly by the IGNGM(1,1) model, and its residual sequences can be amended by the Markov model. A case study shows that the proposed model has higher prediction accuracy.

Practical implications

Uncertainty and volatility information is widespread in practical applications, and such information can be characterized by interval grey numbers. This paper proposes a direct prediction model for interval grey numbers, providing a method for predicting uncertain information in the real world.

Originality/value

The main contribution of this paper is to propose an IGNGM(1,1) model that can predict interval grey numbers without transforming them into real numbers, and to solve for the optimal weight of the integrated development coefficient by genetic algorithm so as to avoid distortion of the prediction results. Moreover, the Markov model is used to modify the residual sequences to further improve modeling accuracy.

Details

Grey Systems: Theory and Application, vol. 12 no. 1
Type: Research Article
ISSN: 2043-9377

Keywords

Article
Publication date: 1 September 2000

H. Saranga and J. Knezevic

Mathematical models for reliability prediction under multiple condition parameters are discussed. The models are developed under the assumption that the different condition…


Abstract

Mathematical models for reliability prediction under multiple condition parameters are discussed. The models are developed under the assumption that the different condition parameters can be independent or dependent. A model is presented based on the continuous-time Markov chain, in which the rate of degradation depends on the state of the system and in which the different parameters that cause system degradation can be dependent. The models are illustrated with an example.
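
A minimal sketch of the continuous-time Markov chain idea (with an illustrative three-state generator matrix, not the parameter structure of the paper) is:

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = good, 1 = degraded, 2 = failed (absorbing).
# Transition rates are state-dependent; the values are illustrative only.
Q = np.array([[-0.010,  0.008,  0.002],
              [ 0.000, -0.050,  0.050],
              [ 0.000,  0.000,  0.000]])   # generator matrix, rows sum to zero

p0 = np.array([1.0, 0.0, 0.0])             # system starts in the good state

for t in (100.0, 500.0, 1000.0):           # operating time, e.g. hours
    p_t = p0 @ expm(Q * t)                 # state probabilities at time t
    # the failed state is absorbing, so reliability = 1 - P(failed by t)
    print(f"t = {t:6.0f}  reliability = {1.0 - p_t[2]:.4f}")
```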

Details

Journal of Quality in Maintenance Engineering, vol. 6 no. 3
Type: Research Article
ISSN: 1355-2511

Keywords

Article
Publication date: 11 July 2022

Peng Jiang and Yi-Chung Hu

In contrast to point forecasts, interval forecasts provide information on future variability. This research thus aimed to develop interval prediction models by addressing two…

Abstract

Purpose

In contrast to point forecasts, interval forecasts provide information on future variability. This research thus aimed to develop interval prediction models by addressing two significant issues: (1) a simple average with an additive property is commonly used to derive combined forecasts, but this unreasonably ignores the interaction among sequences used as sources of information, and (2) the time series often does not conform to any statistical assumptions.

Design/methodology/approach

To develop an interval prediction model, the fuzzy integral was applied to nonlinearly combine forecasts generated by a set of grey prediction models, and a sequence including the combined forecasts was then used to construct a neural network. All required parameters relevant to the construction of an interval model were optimally determined by the genetic algorithm.
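
The non-additive combination step can be pictured with a discrete Choquet integral, in which the fuzzy measure assigns weights to coalitions of the grey models rather than to single models; in the paper the measure and the other parameters are identified by the genetic algorithm. A hand-coded sketch with made-up numbers:

```python
import numpy as np

def choquet_integral(forecasts, mu):
    """Discrete Choquet (fuzzy) integral of individual model forecasts with
    respect to a fuzzy measure mu, given as a dict mapping frozensets of
    model indices to measure values (mu[empty set] = 0, mu[full set] = 1)."""
    order = np.argsort(forecasts)                         # ascending forecast order
    f_sorted = np.asarray(forecasts, dtype=float)[order]
    total, prev = 0.0, 0.0
    for i in range(len(order)):
        coalition = frozenset(int(j) for j in order[i:])  # models with forecast >= current
        total += (f_sorted[i] - prev) * mu[coalition]
        prev = f_sorted[i]
    return total

# three grey-model forecasts for one period (illustrative values)
forecasts = [102.0, 98.0, 105.0]
# a non-additive measure: the coalition {0, 2} is worth more than the sum of its parts
mu = {frozenset(): 0.0,
      frozenset({0}): 0.3, frozenset({1}): 0.2, frozenset({2}): 0.4,
      frozenset({0, 1}): 0.55, frozenset({0, 2}): 0.85, frozenset({1, 2}): 0.6,
      frozenset({0, 1, 2}): 1.0}
print(choquet_integral(forecasts, mu))   # combined forecast lies between the min and max
```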

Findings

The empirical results for tourism demand showed that the proposed non-additive interval model outperformed the other interval prediction models considered.

Practical implications

The private and public sectors in economies with high tourism dependency can benefit from the proposed model by using the forecasts to help them formulate tourism strategies.

Originality/value

In light of the usefulness of combined point forecasts and interval model forecasting, this research contributed to the development of non-additive interval prediction models on the basis of combined forecasts generated by grey prediction models.

Details

Grey Systems: Theory and Application, vol. 13 no. 1
Type: Research Article
ISSN: 2043-9377

Keywords

Article
Publication date: 4 September 2018

Muhannad Aldosary, Jinsheng Wang and Chenfeng Li

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in…

Abstract

Purpose

This paper aims to provide a comprehensive review of uncertainty quantification methods supported by evidence-based comparison studies. Uncertainties are widely encountered in engineering practice, arising from such diverse sources as heterogeneity of materials, variability in measurement, lack of data and ambiguity in knowledge. Academia and industry have long been researching uncertainty quantification (UQ) methods to quantitatively account for the effects of various input uncertainties on the system response. Despite the rich literature of relevant research, UQ is not an easy subject for novice researchers/practitioners, as many different methods and techniques coexist with inconsistent input/output requirements and analysis schemes.

Design/methodology/approach

This confusing state of affairs significantly hampers the research progress and practical application of UQ methods in engineering. In the context of engineering analysis, the research efforts on UQ are mostly concentrated in two largely separate fields: structural reliability analysis (SRA) and the stochastic finite element method (SFEM). This paper provides a state-of-the-art review of SRA and SFEM, covering both technology and application aspects. Moreover, unlike standard survey papers that focus primarily on description and explanation, a thorough and rigorous comparative study is performed to test all UQ methods reviewed in the paper on a common set of representative examples.

Findings

Over 20 uncertainty quantification methods in the fields of structural reliability analysis and stochastic finite element methods are reviewed and rigorously tested on carefully designed numerical examples. They include FORM/SORM, importance sampling, subset simulation, the response surface method, surrogate methods, polynomial chaos expansion, the perturbation method, the stochastic collocation method, etc. The review and comparison tests comment and conclude not only on the accuracy and efficiency of each method but also on their applicability to different types of uncertainty propagation problems.
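
For orientation, the sketch below works through the simplest possible reliability problem (a linear limit state with two independent normal variables, with values invented for this example), comparing the FORM reliability index with a crude Monte Carlo estimate of the failure probability; the paper's comparisons of course use far richer benchmark problems.

```python
import numpy as np
from scipy.stats import norm

# Toy limit state g = R - S with independent normal resistance R and load effect S.
mu_R, sig_R = 200.0, 20.0
mu_S, sig_S = 150.0, 25.0

# FORM: for a linear limit state with normal variables, the reliability index is exact
beta = (mu_R - mu_S) / np.hypot(sig_R, sig_S)
pf_form = norm.cdf(-beta)

# crude Monte Carlo estimate of the failure probability P(g < 0)
rng = np.random.default_rng(0)
n = 1_000_000
g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
pf_mc = np.mean(g < 0)

print(f"beta = {beta:.3f}, Pf (FORM) = {pf_form:.4e}, Pf (MC) = {pf_mc:.4e}")
```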

Originality/value

The research fields of structural reliability analysis and stochastic finite element methods have largely been developed separately, although both tackle uncertainty quantification in engineering problems. For the first time, all major uncertainty quantification methods in both fields are reviewed and rigorously tested on a common set of examples. Critical opinions and concluding remarks are drawn from the rigorous comparative study, providing objective evidence-based information for further research and practical applications.

Details

Engineering Computations, vol. 35 no. 6
Type: Research Article
ISSN: 0264-4401

Keywords
