Search results
1 – 10 of 67
Jingli Yang, Zhen Sun and Yinsheng Chen
Abstract
Purpose
This paper aims to enhance the reliability of self-validating multifunctional sensors.
Design/methodology/approach
An effective fault detection, isolation and data recovery (FDIR) strategy is developed using kernel principal component analysis (KPCA) coupled with gray bootstrap and fault reconstruction methods.
Findings
The proposed FDIR strategy is able to address the fault detection, isolation and data recovery problems of self-validating multifunctional sensors efficiently.
Originality/value
A KPCA-based model, which can overcome the limitation of existing linear models, is used to achieve the fault detection task. By using the gray bootstrap method, the positions of all faulty sensitive units can be calculated even under the multiple-fault situation. A reconstruction-based contribution method is adopted to evaluate the amplitudes of the fault signals, and the fault-free output of the faulty sensitive units can be used to replace the faulty output.
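The detection step described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: an RBF-kernel KPCA model is trained on fault-free data, and a new sample is flagged as faulty when its squared prediction error (SPE) in kernel feature space exceeds a threshold taken from the training residuals. All parameter values and the threshold rule are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KPCAFaultDetector:
    """Train on fault-free data; flag samples whose squared prediction
    error (SPE) in kernel feature space exceeds the training maximum."""

    def __init__(self, n_components=2, gamma=0.5):
        self.k, self.gamma = n_components, gamma

    def fit(self, X):
        self.X = np.asarray(X, dtype=float)
        n = len(self.X)
        K = rbf_kernel(self.X, self.X, self.gamma)
        self.J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
        Kc = self.J @ K @ self.J
        w, V = np.linalg.eigh(Kc)
        top = np.argsort(w)[::-1][: self.k]
        # scale eigenvectors so feature-space projections are orthonormal
        self.alpha = V[:, top] / np.sqrt(np.maximum(w[top], 1e-12))
        self.row_mean = K.mean(axis=1)
        self.grand_mean = K.mean()
        spe_train = np.array(
            [Kc[i, i] - (self.alpha.T @ Kc[:, i]) @ (self.alpha.T @ Kc[:, i])
             for i in range(n)]
        )
        self.threshold = spe_train.max() + 1e-9
        return self

    def spe(self, x):
        k = rbf_kernel(self.X, np.asarray(x, dtype=float)[None]).ravel()
        kc = self.J @ (k - self.row_mean)               # centered test kernel
        t = self.alpha.T @ kc                           # KPCA scores
        kxx_c = 1.0 - 2.0 * k.mean() + self.grand_mean  # RBF: k(x, x) = 1
        return kxx_c - t @ t

    def is_faulty(self, x):
        return self.spe(x) > self.threshold
```

In the paper's setting, fault isolation and data recovery would follow from the gray bootstrap and reconstruction-based contribution steps; this sketch covers only the KPCA detection statistic.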
Lydie Myriam Marcelle Amelot, Ushad Subadar Agathee and Yuvraj Sunecher
Abstract
Purpose
This study constructs time series models, artificial neural networks (ANNs) and statistical topologies to examine the volatility of and forecast foreign exchange rates. The Mauritian forex market has been utilized as a case study, and daily data for the nominal spot rate (over a five-year period spanning 2014 to 2018) for EUR/MUR, GBP/MUR, CAD/MUR and AUD/MUR have been applied for the predictions.
Design/methodology/approach
Autoregressive integrated moving average (ARIMA) and generalized autoregressive conditional heteroskedasticity (GARCH) models are used as a basis for time series modelling for the analysis, along with the non-linear autoregressive network with exogenous inputs (NARX) neural network backpropagation algorithm utilizing different training functions, namely, Levenberg–Marquardt (LM), Bayesian regularization and scaled conjugate gradient (SCG) algorithms. The study also features a hybrid kernel principal component analysis (KPCA) using the support vector regression (SVR) algorithm as an additional statistical tool to conduct financial market forecasting modelling. Mean squared error (MSE) and root mean square error (RMSE) are employed as indicators for the performance of the models.
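The hybrid KPCA–SVR component can be illustrated with a minimal scikit-learn sketch on synthetic data (not the authors' data or settings): lagged rate windows are compressed with kernel PCA, and the next-day rate is regressed with SVR. All hyperparameters and the synthetic series are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
# synthetic random-walk series standing in for a daily exchange rate
rate = np.cumsum(rng.normal(0.0, 0.003, 600)) + 40.0

# lagged feature windows: predict rate[t] from the previous 5 days
lags = 5
X = np.stack([rate[i : i + lags] for i in range(len(rate) - lags)])
y = rate[lags:]
split = 500

model = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=3, kernel="rbf", gamma=0.1),
    SVR(C=10.0, epsilon=0.001),
)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
```

The study's comparison would pit this pipeline's MSE/RMSE against ARIMA, GARCH and NARX forecasts of the same series; the pipeline structure (scale, compress, regress) is the point of the sketch.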
Findings
The results demonstrated that the GARCH model performed better in terms of volatility clustering and prediction compared to the ARIMA model. On the other hand, the NARX model indicated that the LM and Bayesian regularization training algorithms are the most appropriate methods for forecasting the different currency exchange rates, as their MSE and RMSE were the lowest among the training functions tested. Meanwhile, the results reported that the NARX and KPCA–SVR topologies outperformed the linear time series models due to the theory based on the structural risk minimization principle. Finally, the comparison between the NARX model and KPCA–SVR illustrated that the NARX model outperformed the statistical prediction model. Overall, the study deduced that the NARX topology achieves better prediction performance compared to the time series and statistical models.
Research limitations/implications
The foreign exchange market is considered to be unstable owing to uncertainties in the economic environment of any country, and thus accurate forecasting of foreign exchange rates is crucial for any foreign exchange activity. The study has an important economic implication as it will help researchers, investors, traders, speculators and financial analysts, users of financial news in banking and financial institutions, money changers, non-banking financial companies and stock exchange institutions in Mauritius to take investment decisions in terms of international portfolios. Moreover, currency rate instability might raise transaction costs and diminish the returns from international trade. Exchange rate volatility raises the need to implement highly organized risk management measures so as to disclose the future trend and movement of foreign currencies, which could act as essential guidance for foreign exchange participants. In this way, they will be more alert before conducting any forex transaction, including hedging, asset pricing or speculation, and can take corrective actions that prevent potential losses and improve profits.
Originality/value
This is one of the first studies applying artificial intelligence (AI) while making use of time series modelling, the NARX neural network backpropagation algorithm and hybrid KPCA–SVR to predict forex using multiple currencies in the foreign exchange market in Mauritius.
PengPeng Hu, Taku Komura, Duan Li, Ge Wu and Yueqi Zhong
Abstract
Purpose
The purpose of this paper is to present a novel framework of reconstructing the 3D textile model with synthesized texture.
Design/methodology/approach
First, a pipeline of 3D textile reconstruction based on KinectFusion is proposed to obtain a better 3D model. Second, the “DeepTextures” method is applied to generate new textures for various three-dimensional textile models.
Findings
Experimental results show that the proposed method can conveniently reconstruct a three-dimensional textile model with synthesized texture.
Originality/value
A novel pipeline is designed to obtain high-quality 3D textile models based on KinectFusion. The accuracy and robustness of KinectFusion are improved via a turntable. To the best of the authors’ knowledge, this is the first paper to explore synthesized textile texture for the 3D textile model. The work does not simply map the texture onto the 3D model, but also explores the application of artificial intelligence in the textile field.
PengPeng Hu, Duan Li, Ge Wu, Taku Komura, Dongliang Zhang and Yueqi Zhong
Abstract
Purpose
Currently, a common method of reconstructing a mannequin is based on body measurements or body features, which preserve only the body size and lack accurate information about the body's geometric shape. However, identical body measurements do not imply identical body shapes. This may result in a garment that does not fit the target human body. The purpose of this paper is to propose a novel scanning-based pipeline to reconstruct a personalized mannequin that preserves both body size and body shape information.
Design/methodology/approach
The authors first capture the body of a subject via 3D scanning, and a statistical body model is fit to the scanned data. This results in a skinned articulated model of the subject. The scanned body is then adjusted to be pose-symmetric via linear blending skinning. The mannequin part is then extracted. Finally, a slice-based method is proposed to generate a shape-symmetric 3D mannequin.
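The slice-based symmetrization step might look like the following simplified sketch (a hypothetical reconstruction, not the authors' algorithm): each slice boundary is resampled as radii at uniform angles around the slice centroid, and each radius is averaged with the radius at its mirror angle across the sagittal plane. The polar parameterization is an assumption for illustration.

```python
import numpy as np

def symmetrize_slice(radii):
    """radii[i]: boundary radius at angle 2*pi*i/n around the slice centroid
    (n even). Averaging each radius with the one at its mirror angle
    (pi - theta) yields a left-right symmetric slice profile."""
    n = len(radii)
    step = 2 * np.pi / n
    out = np.empty(n)
    for i in range(n):
        theta = i * step
        mirror = (np.pi - theta) % (2 * np.pi)
        j = int(round(mirror / step)) % n    # grid index of the mirror angle
        out[i] = 0.5 * (radii[i] + radii[j])
    return out
```

Stacking symmetrized slices along the body's vertical axis would then give the shape-symmetric mannequin surface described above; the pose-symmetrization via linear blend skinning happens before this step.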
Findings
A personalized 3D mannequin can be reconstructed from the scanned body. Compared to conventional methods, the proposed method preserves both the size and shape of the original scanned body. The reconstructed mannequin can be imported directly into apparel CAD software. The proposed method provides a step toward digitizing apparel manufacturing.
Originality/value
Compared to the conventional methods, the main advantage of the authors’ system is that the authors can preserve both size and geometry of the original scanned body. The main contributions of this paper are as follows: decompose the process of the mannequin reconstruction into pose symmetry and shape symmetry; propose a novel scanning-based pipeline to reconstruct a 3D personalized mannequin; and present a slice-based method for the symmetrization of the 3D mesh.
Abstract
Purpose
The paper provides a detailed historical account of Douglass C. North's early intellectual contributions and analytical developments in pursuing a Grand Theory for why some countries are rich and others poor.
Design/methodology/approach
The author approaches the discussion using a theoretical and historical reconstruction based on published and unpublished materials.
Findings
The systematic, continuous and profound attempt to answer the Smithian social coordination problem shaped North's journey from being a young serious Marxist to becoming one of the founders of New Institutional Economics. In the process, he was converted in the early 1950s into a rigid neoclassical economist, being one of the leaders in promoting New Economic History. The success of the cliometric revolution exposed the frailties of the movement itself, namely, the limitations of neoclassical economic theory to explain economic growth and social change. Incorporating transaction costs, the institutional framework in which property rights and contracts are measured, defined and enforced assumes a prominent role in explaining economic performance.
Originality/value
In the early 1970s, North adopted a naive theory of institutions and property rights still grounded in neoclassical assumptions. Institutional and organizational analysis is modeled as a social maximizing efficient equilibrium outcome. However, the increasing tension between the neoclassical theoretical apparatus and its failure to account for contrasting political and institutional structures, diverging economic paths and social change propelled the modification of its assumptions and progressive conceptual innovation. In the later 1970s and early 1980s, North abandoned the efficiency view and gradually became more critical of the objective rationality postulate. In this intellectual movement, North's avant-garde research program contributed significantly to the creation of New Institutional Economics.
Li Lijun, Guan Tao, Ren Bo, Yao Xiaowen and Wang Cheng
Abstract
Purpose
The purpose of this paper is to propose a novel registration method using Euclidean reconstruction and natural features tracking for AR‐based assembly guidance systems.
Design/methodology/approach
The method operates in two steps: offline Euclidean reconstruction and online tracking. The offline stage obtains the structure of the scene using a Euclidean reconstruction technique. The classification trees are constructed using affine transforms for online initialization. In tracking, the classification-based wide-baseline matching strategy and the Td,d test are used to obtain a fast and accurate initialization for the first frame, after which a modified optical flow tracker is used to fulfill the task of feature tracking in the real-time video sequences. The four specified points are transferred to the current image to compute the registration matrix for augmentation.
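The final point-transfer step relies on a planar projective mapping estimated from point correspondences. A minimal sketch of that building block, assuming the standard direct linear transform (DLT) rather than the authors' exact formulation:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from >= 4
    correspondences (no three collinear) via the direct linear transform."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 3)          # null-space vector, up to scale

def transfer(H, pts):
    """Apply H to (n, 2) points and dehomogenize."""
    q = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return q[:, :2] / q[:, 2:3]
```

In the registration pipeline above, the tracked features would play the role of the correspondences, and the transferred points would feed the computation of the registration matrix for augmentation.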
Findings
First, Euclidean reconstruction was used instead of projective reconstruction to obtain the projections of predefined features. Compared with the six points needed in the projective reconstruction-based method, this method can run normally even when only four features are successfully tracked. Second, an adaptive strategy was proposed to adjust the classification trees using the tracked features in the online stage, by which one can initialize or reinitialize the system even with a large difference between the first and reference images.
Originality/value
Some indoor and outdoor experiments are provided to validate the performance of the proposed method.
En-Ze Rui, Guang-Zhi Zeng, Yi-Qing Ni, Zheng-Wei Chen and Shuo Hao
Abstract
Purpose
Current methods for flow field reconstruction mainly rely on data-driven algorithms which require an immense amount of experimental or field-measured data. Physics-informed neural network (PINN), which was proposed to encode physical laws into neural networks, is a less data-demanding approach for flow field reconstruction. However, when the fluid physics is complex, it is tricky to obtain accurate solutions under the PINN framework. This study aims to propose a physics-based data-driven approach for time-averaged flow field reconstruction which can overcome the hurdles of the above methods.
Design/methodology/approach
A multifidelity strategy leveraging PINN and a nonlinear information fusion (NIF) algorithm is proposed. Plentiful low-fidelity data are generated from the predictions of a PINN which is constructed purely using Reynolds-averaged Navier–Stokes equations, while sparse high-fidelity data are obtained by field or experimental measurements. The NIF algorithm is performed to elicit a multifidelity model, which blends the nonlinear cross-correlation information between low- and high-fidelity data.
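A deliberately simplified, one-dimensional sketch of the multifidelity idea follows; a linear regression stands in for the NIF algorithm, and both "models" here are synthetic assumptions. Dense low-fidelity predictions are corrected by regressing a handful of high-fidelity samples on the features (1, x, y_low).

```python
import numpy as np

def true_field(x):            # stands in for sparse field measurements
    return np.sin(3 * x) + 0.3 * x

def low_fidelity(x):          # biased cheap model (e.g. a physics-only PINN)
    return 0.8 * np.sin(3 * x)

x_dense = np.linspace(0.0, 2.0, 200)   # where the reconstruction is wanted
x_hf = np.linspace(0.0, 2.0, 8)        # sparse high-fidelity samples

# linear fusion: regress y_high on the features [1, x, y_low(x)]
A = np.column_stack([np.ones_like(x_hf), x_hf, low_fidelity(x_hf)])
coef, *_ = np.linalg.lstsq(A, true_field(x_hf), rcond=None)

A_dense = np.column_stack(
    [np.ones_like(x_dense), x_dense, low_fidelity(x_dense)]
)
fused = A_dense @ coef                 # corrected field everywhere
```

The fused model inherits the shape of the low-fidelity predictions while the sparse high-fidelity samples fix its bias, which is the mechanism the paper exploits, with the NIF algorithm capturing nonlinear rather than linear cross-correlations.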
Findings
Two experimental cases are used to verify the capability and efficacy of the proposed strategy through comparison with other widely used strategies. It is revealed that the missing flow information within the whole computational domain can be favorably recovered by the proposed multifidelity strategy with the use of sparse measurement/experimental data. The elicited multifidelity model inherits the underlying physics inherent in low-fidelity PINN predictions and rectifies the low-fidelity predictions over the whole computational domain. The proposed strategy is much superior to the other strategies compared in terms of reconstruction accuracy.
Originality/value
In this study, a physics-informed data-driven strategy for time-averaged flow field reconstruction is proposed which extends the applicability of the PINN framework. In addition, embedding physical laws when training the multifidelity model leads to less data demand for model development compared to purely data-driven methods for flow field reconstruction.
Abstract
Purpose
In the early 1930s, Nicholas Kaldor could be classified as an Austrian economist. The author reconstructs the intertwined paths of Kaldor and Friedrich A. Hayek to disequilibrium economics through the theoretical deficiencies exposed by the Austrian theory of capital and its consequences on equilibrium analysis.
Design/methodology/approach
The author approaches the discussion using a theoretical and historical reconstruction based on published and unpublished materials.
Findings
The integration of capital theory into a business cycle theory by the Austrians and its shortcomings – e.g. criticized by Piero Sraffa and Gunnar Myrdal – called attention to the limitation of the theoretical apparatus of equilibrium analysis in dynamic contexts. This was a central element to Kaldor’s emancipation in 1934 and his subsequent conversion to John Maynard Keynes’ The General Theory of Employment, Interest, and Money (1936). In addition, it was pivotal to Hayek’s reformulation of equilibrium as a social coordination problem in “Economics and Knowledge” (1937). It also had implications for Kaldor’s mature developments, such as the construction of the post-Keynesian models of growth and distribution, the Cambridge capital controversy, and his critique of neoclassical equilibrium economics.
Originality/value
The close encounter between Kaldor and Hayek in the early 1930s, the developments during that decade and its mature consequences are unexplored in the secondary literature. The author attempts to construct a coherent historical narrative that integrates many intertwined elements and personas (e.g. the reception of Knut Wicksell in the English-speaking world; Piero Sraffa’s critique of Hayek; Gunnar Myrdal’s critique of Wicksell, Hayek, and Keynes; the Hayek-Knight-Kaldor debate; the Kaldor-Hayek debate, etc.) that were not connected until now by previous commentators.
Lucas Silva and Alfredo Gay Neto
Abstract
Purpose
When establishing a mathematical model to simulate solid mechanics, considering realistic geometries, special tools are needed to translate measured data, possibly with noise, into idealized geometrical entities. As an engineering application, wheel-rail contact interactions are fundamental in the dynamic modeling of railway vehicles. Many approaches used to solve the contact problem require a continuous parametric description of the geometries involved. However, measured wheel and rail profiles are often available as sets of discrete points. A reconstruction method is needed to transform discrete data into a continuous geometry.
Design/methodology/approach
The authors present an approximation method based on optimization to solve the problem of fitting a set of points with an arc spline. It consists of an initial guess based on a curvature function estimated from the data, followed by a least-squares optimization to improve the approximation. The authors also present a segmentation scheme that allows the method to increment the number of segments of the spline, trying to keep it at a minimal value, to satisfy a given error tolerance.
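The basic building block of such an arc spline fit is the least-squares fit of a single circular arc. A short sketch using the algebraic (Kåsa) circle fit, which is an assumption here rather than the authors' exact optimization:

```python
import numpy as np

def fit_circle(points):
    """Kåsa algebraic least-squares circle fit: find D, E, F minimizing
    ||x^2 + y^2 + D*x + E*y + F|| over the sample points, then recover
    the center (-D/2, -E/2) and radius sqrt(D^2/4 + E^2/4 - F)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - F)
    return (cx, cy), r
```

An arc spline method in the spirit of the paper would fit one such arc per segment, seeded from an estimated curvature function, and then refine the joint parameters by nonlinear least squares while adding segments until the error tolerance is met.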
Findings
The paper provides a better understanding of arc splines and how they can be deformed. Examples with parametric curves and slightly noisy data from realistic wheel and rail profiles show that the approach is successful.
Originality/value
The developed methods have theoretical value. Furthermore, they have practical value since the approximation approach is better suited to deal with the reconstruction of wheel/rail profiles than interpolation, which most methods use to some degree.
Ayman Assem, Sherif Abdelmohsen and Mohamed Ezzeldin
Abstract
Purpose
Cities lying within conflict zones have continually faced the hardships of both war aftermath and long-term sustainable reconstruction. Challenges have surpassed the typical questions of recovering from post-conflict trauma and preserving urban heritage and iconic elements of the built environment, to include issues of critical decision making, rebuilding effectiveness and funding mechanisms, leading to time-consuming processes that lack adequate, consistent long-term management. Some approaches have explored methods of effective long-term city reconstruction management but have not fully developed comprehensive frameworks that ease the management of such complex processes. The paper aims to discuss these issues.
Design/methodology/approach
The authors devise an approach for the smart management of post-conflict city reconstruction. The authors focus on evaluation, strategic planning, reconstruction projects and implementation. The authors integrate building information modeling and geographic/geospatial information systems in a platform that allows for real-time analysis, reporting, strategic planning and decision making for managing reconstruction operations and projects among involved stakeholders including government agencies, funding organizations, city managers and public participants.
Findings
The approach suggested a smart management system for the reconstruction process of post-conflict cities. Implementing this system was shown to provide a multi-objective solution for post-conflict city reconstruction based on its interlinked modules.
Research limitations/implications
Results may lack generalizability and require testing on several cases to provide rigorous findings for different case studies.
Practical implications
Implications include developing smart management systems for use by city managers and government authorities in post-conflict zones, as well as bottom-up decision making by including participant citizens especially populations in the diaspora.
Originality/value
The approach offers an integrated platform that informs city reconstruction decision makers, allowing for strategic planning tools for efficient planning, monitoring tools for continuous management during and after reconstruction, and effective platforms for communication among all stakeholders.