Search results
1 – 10 of over 7000
Ambuj Sharma, Sandeep Kumar and Amit Tyagi
Abstract
Purpose
The real challenges in online crack detection testing based on guided waves are random noise and narrow-band coherent noise; an efficient structural health assessment methodology requires effective extraction of noise and analysis of the signals. The purpose of this paper is to provide an optimal noise filtering technique for Lamb waves in the diagnosis of structural singularities.
Design/methodology/approach
Filtration of the time-frequency information of guided elastic waves from the noisy signal is investigated in the present analysis using a matched filtering technique, which "sniffs" out the signal buried in noise, and denoising methods based on the most favourable mother wavelet. The optimal wavelet function is selected using Shannon's entropy criterion and verified by analysis of the root mean square error of the filtered signal.
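The wavelet-selection step described here can be sketched briefly. A minimal illustration, assuming coefficient sets for each candidate mother wavelet are already available; the wavelet names and numbers below are illustrative, not from the paper:

```python
import math

def shannon_entropy(coeffs):
    """Shannon entropy of the normalized energy distribution of wavelet coefficients."""
    energies = [c * c for c in coeffs]
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log2(p) for p in probs)

def rmse(filtered, reference):
    """Root mean square error between a filtered signal and a reference."""
    n = len(reference)
    return math.sqrt(sum((f - r) ** 2 for f, r in zip(filtered, reference)) / n)

# Candidate mother wavelets are scored by the entropy of their coefficients;
# lower entropy means the signal energy is concentrated in fewer coefficients.
candidates = {
    "db4":  [0.9, 0.05, 0.03, 0.02],   # illustrative coefficient sets
    "sym5": [0.5, 0.3, 0.15, 0.05],
}
best = min(candidates, key=lambda w: shannon_entropy(candidates[w]))
print(best)
```

The RMSE of the filtered signal against a reference then serves as the verification step the abstract mentions.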
Findings
The wavelet matched filter method, a filtering technique newly developed in this work that combines the wavelet transform with matched filtering, significantly improves the accuracy of the filtered signal and identifies relatively small damage, especially in highly noisy data. A comparative study is also performed using statistical tools to assess the acceptability and practicability of the filtered signals for guided wave applications.
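As a minimal illustration of the matched-filtering half of this idea (not the authors' implementation): cross-correlating a noisy trace with a known excitation template peaks at the lag where the template is buried.

```python
def matched_filter(signal, template):
    """Cross-correlate a signal with a known template (valid lags only).

    Returns the correlation value at each lag; the lag with the maximum
    value is the most likely arrival position of the template.
    """
    n, m = len(signal), len(template)
    return [sum(signal[lag + i] * template[i] for i in range(m))
            for lag in range(n - m + 1)]

template = [1.0, -1.0, 1.0]
# Template embedded at index 4 of an otherwise flat, slightly noisy trace.
signal = [0.1, 0.0, -0.1, 0.05, 1.0, -1.0, 1.0, 0.0, 0.1]
scores = matched_filter(signal, template)
print(scores.index(max(scores)))   # lag of the strongest match
```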
Practical implications
The proposed filtering techniques can be utilized in online monitoring of civil and mechanical structures. The method's algorithm is easy to implement and was found to detect damage accurately.
Originality/value
Although many techniques have been developed over the past several years to suppress random noise in Lamb wave signals, the filtration of interference between wave modes and boundary reflections is not yet mature and needs further investigation. The present study contains detailed information about various noise filtering methods, the newly developed filtration technique and their efficacy in handling the above-mentioned issues.
Details
Keywords
Ambuj Sharma, Sandeep Kumar and Amit Tyagi
Abstract
Purpose
The presence of random noise as well as narrow-band coherent noise makes structural health monitoring genuinely challenging; an efficient structural health assessment methodology requires effective extraction of noise and analysis of the signals. The purpose of this paper is to provide an optimal noise filtering technique for Lamb waves in the diagnosis of structural singularities.
Design/methodology/approach
Filtration of the time-frequency information of multimode Lamb waves from the noisy signal is investigated in the present analysis using a matched filtering technique and wavelet denoising methods. The optimal wavelet function is selected using Shannon's entropy criterion and verified via analysis of the root mean square error of the filtered signal.
Findings
The authors propose the wavelet matched filter method, a combination of the wavelet transform and matched filtering, which can significantly improve the accuracy of the filtered signal and identify relatively small damage, especially in highly noisy data. The correlation coefficient and root mean square error are additionally computed to evaluate filter performance.
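The two evaluation metrics named in these findings are standard and might be computed as follows (the signals below are illustrative, not from the paper):

```python
import math

def correlation_coefficient(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmse(x, y):
    """Root mean square error between two equal-length signals."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

clean    = [0.0, 1.0, 0.0, -1.0, 0.0]
filtered = [0.1, 0.9, 0.0, -1.1, 0.1]
print(round(correlation_coefficient(clean, filtered), 3))
print(round(rmse(clean, filtered), 3))
```

A correlation near 1 and a small RMSE together indicate a filtered signal that tracks the clean reference closely.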
Originality/value
The present study provides detailed information about various noise filtering methods and is a first attempt to apply a combination of different signal-processing techniques to the structural health monitoring application. A comparative study is performed using statistical tools to determine whether the filtered signals obtained through three different methods are acceptable and practicable for guided wave applications.
Abstract
Recent developments in hardware and software mean that the automation of visual fabric inspection tasks is becoming feasible at low cost. This paper investigates techniques that can be used to replace the repetitive, tedious and physically demanding human inspection of shirt collars for defects. The faults studied in this work are the nine types of defects that can be present on shirt collar panels. Two statistical methods, moving group average and moving divided group average, are proposed. In addition, highlighting and variance techniques are applied to an image together with moving group average and signature counting. These techniques showed the fast computation times needed in manufacturing to detect defects in an image, and could be applied to most automated inspection systems.
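A hedged 1D sketch of the moving-group-average idea (the paper works on 2D collar images; the window size, threshold and intensity values here are illustrative assumptions): the group mean over a sliding window flags positions where local intensity deviates from the global average.

```python
def moving_group_average(values, window):
    """Mean of each consecutive group of `window` values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

def flag_defects(values, window, threshold):
    """Flag window positions whose group average deviates from the
    global mean by more than `threshold` -- a crude stand-in for
    defect detection on a single image row."""
    averages = moving_group_average(values, window)
    global_mean = sum(values) / len(values)
    return [i for i, a in enumerate(averages) if abs(a - global_mean) > threshold]

# Uniform fabric intensity with a dark streak (simulated defect) around index 5.
row = [100, 101, 99, 100, 100, 60, 58, 100, 101, 100]
print(flag_defects(row, window=2, threshold=15))
```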
Ravinder Singh and Kuldeep Singh Nagla
Abstract
Purpose
The purpose of this research is to provide the necessary and resourceful information regarding range sensors so that the best-fit sensor can be selected for robust autonomous navigation. Autonomous navigation is an emerging segment in the field of mobile robots in which the robot navigates the environment with a high level of autonomy and little human interaction. Sensor-based perception is a prevailing aspect of the autonomous navigation of a mobile robot, along with localization and path planning. Various range sensors are used to obtain an efficient perception of the environment, but selecting the best-fit sensor for the navigation problem is still a vital task.
Design/methodology/approach
Autonomous navigation relies on the sensory information of various sensors, and each sensor relies on various operational parameters/characteristics for reliable functioning. A simple strategy is shown in this study for selecting the best-fit sensor based on parameters such as environment, 2D/3D navigation, accuracy, speed and environmental conditions, for the reliable autonomous navigation of a mobile robot.
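A parameter-driven selection like the one described could be sketched as a simple lookup over a characteristic table. The table values and constraint names below are illustrative assumptions, not data from the paper:

```python
# Illustrative characteristic table for common range sensors
# (accuracy and cost are relative 1-5 scores, assumed for this sketch).
SENSORS = {
    "2D LiDAR":   {"dim": "2D", "accuracy": 4, "outdoor": True,  "cost": 3},
    "3D LiDAR":   {"dim": "3D", "accuracy": 5, "outdoor": True,  "cost": 5},
    "ultrasonic": {"dim": "2D", "accuracy": 2, "outdoor": True,  "cost": 1},
    "RGB-D":      {"dim": "3D", "accuracy": 3, "outdoor": False, "cost": 2},
}

def best_fit(dim, outdoor, max_cost):
    """Pick the highest-accuracy sensor satisfying the navigation constraints."""
    candidates = [name for name, s in SENSORS.items()
                  if s["dim"] == dim
                  and (s["outdoor"] or not outdoor)   # must work outdoors if required
                  and s["cost"] <= max_cost]
    return max(candidates, key=lambda n: SENSORS[n]["accuracy"], default=None)

print(best_fit(dim="3D", outdoor=True, max_cost=5))
```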
Findings
This paper provides a comparative analysis of the diverse range sensors used in mobile robotics with respect to aspects such as accuracy, computational load, 2D/3D navigation and environmental conditions, so that the best-fit sensors can be selected for achieving robust navigation of an autonomous mobile robot.
Originality/value
This paper provides a straightforward platform for researchers to select the best range sensor for diverse robotics applications.
Ka-fai Choi, Yunan Gong and Kwok-wing Yeung
Abstract
Two-dimensional band-pass filters can be used to enhance the edges of defects in fabric images. In this paper, we design two types of 2D band-pass filters for the automatic detection of defects: the matched Gabor filter and the matched Mexican hat wavelet. Experiments show that the matched Gabor filter is more suitable for defects of higher frequency, while the matched Mexican hat wavelet is more effective for defects of lower frequency. Based on the two types of band-pass filters, an automatic fabric defect detection system was designed that offers good accuracy and high speed.
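A minimal sketch of the second filter type: a sampled 1D Mexican hat (Ricker) kernel convolved over an intensity profile, whose response peaks near an edge. The paper builds 2D filters matched to specific defects; this 1D version, with assumed parameters, only illustrates the band-pass edge-enhancing behaviour.

```python
import math

def mexican_hat_kernel(half_width, sigma):
    """Sampled 1D Mexican hat (Ricker) wavelet on [-half_width, half_width]:
    psi(x) = (1 - (x/sigma)^2) * exp(-x^2 / (2 sigma^2))."""
    return [(1 - (x / sigma) ** 2) * math.exp(-(x ** 2) / (2 * sigma ** 2))
            for x in range(-half_width, half_width + 1)]

def convolve_valid(signal, kernel):
    """Valid-mode convolution (no padding)."""
    n, m = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[m - 1 - j] for j in range(m))
            for i in range(n - m + 1)]

kernel = mexican_hat_kernel(half_width=2, sigma=1.0)
# A step edge in an intensity profile; the filter response peaks near the edge.
profile = [0, 0, 0, 0, 10, 10, 10, 10]
response = convolve_valid(profile, kernel)
print(response.index(max(response)))
```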
Fatemeh Alyari and Nima Jafari Navimipour
Abstract
Purpose
This paper aims to identify, evaluate and integrate the findings of all relevant, high-quality individual studies addressing one or more research questions about recommender systems, and to perform a comprehensive study of empirical research on recommender systems, which the authors divide into five main categories. To achieve this aim, the authors use the systematic literature review (SLR) as a powerful method to collect and critically analyze the research papers. The authors also discuss the selected recommender systems and their main techniques, as well as their benefits and drawbacks in general.
Design/methodology/approach
In this paper, the SLR method is utilized to identify, evaluate and integrate the findings of all relevant, high-quality individual studies addressing one or more research questions about recommender systems, and to perform a comprehensive study of empirical research on recommender systems divided into five main categories. The authors also discuss recommender systems and their techniques in general, without restriction to a specific domain.
Findings
The major developments in the categories of recommender systems are reviewed, and new challenges are outlined. Furthermore, insights into open issues and guidelines for future research are provided. This paper also presents a systematic analysis of the recommender system literature since 2005. The authors identified 536 papers, which were reduced to 51 primary studies through the paper selection process.
Originality/value
This survey will directly support academics and practicing professionals in their understanding of developments in recommender systems and their techniques.
Abstract
Computer matching is a mass surveillance technique involving the comparison of data about many people, which have been acquired from multiple sources. Its use offers potential benefits, particularly financial savings. It is also error‐prone, and its power results in threats to established patterns and values. The imperatives of efficiency and equity demand that computer matching be used, and the information privacy interest demands that it be used only where justified, and be subjected to effective controls. Provides background to this important technique, including its development and application in the USA and in Australia, and a detailed technical description. Contends that the technique, its use, and controls over its use are very important issues which demand research. Computing, telecommunications and robotics artefacts which have the capacity to change society radically need to be subjected to early and careful analysis, not only by sociologists, lawyers and philosophers, but also by information technologists themselves.
Jayalaxmi Anem, G. Sateeshkumar and R. Madhu
Abstract
Purpose
The main aim of this paper is to design a technique for improving the quality of EEG signals by removing artefacts introduced during acquisition. Initially, pre-processing is performed on the EEG signal for quality improvement. Then, feature extraction is performed using the wavelet transform (WT). The artefacts present in the EEG are removed using a deep convLSTM, which is trained by the proposed fractional calculus-based flower pollination optimisation algorithm.
Design/methodology/approach
Nowadays, EEG signals play a vital role in neurophysiologic research. Human brain activity can be analysed using EEG signals. These signals are frequently affected by noise during acquisition and by other external disturbances, which degrade the signal quality. Denoising of EEG signals is necessary for their effective use in any application. This paper proposes a new technique, the flower pollination fractional calculus optimisation (FPFCO) algorithm, for the removal of artefacts from EEG signals through a deep learning scheme. The FPFCO algorithm integrates flower pollination optimisation and fractional calculus, taking the advantages of both, and is used to train the deep convLSTM. The existing FPO algorithm is used for solution updates through global and local pollination. Here, the fractional calculus (FC) method incorporates the past solution by including the second-order derivative. As a result, the proposed FPFCO algorithm approaches the best solution faster than the existing flower pollination optimisation (FPO) method. Initially, five EEG signals are contaminated with artefacts such as EMG, EOG, EEG and random noise. These contaminated EEG signals are pre-processed to remove baseline and power-line noise. Feature extraction is then performed using the WT, and the extracted features are applied to the deep convLSTM, which is trained by the proposed fractional calculus-based flower pollination optimisation algorithm. FPFCO is used for the effective removal of artefacts from the EEG signal. The proposed technique is compared with existing techniques in terms of SNR and MSE.
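The global/local pollination updates mentioned above can be sketched generically. This is a toy 1D minimisation with standard FPO-style updates, where a Gaussian step stands in for the usual Lévy flight and the fractional-calculus memory term of FPFCO is omitted; it is not the authors' implementation:

```python
import random

random.seed(1)

def objective(x):
    """Toy objective to minimise (global optimum at x = 2)."""
    return (x - 2.0) ** 2

def fpo_minimise(pop_size=10, iters=200, switch_prob=0.8):
    """Greedy flower-pollination-style search on a 1D objective."""
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    best = min(pop, key=objective)
    for _ in range(iters):
        for i in range(pop_size):
            if random.random() < switch_prob:
                # Global pollination: step relative to the current best solution
                # (Gaussian step as a stand-in for the Levy flight of standard FPO).
                candidate = pop[i] + random.gauss(0, 1) * (best - pop[i])
            else:
                # Local pollination: mix two randomly chosen solutions.
                j, k = random.sample(range(pop_size), 2)
                candidate = pop[i] + random.random() * (pop[j] - pop[k])
            if objective(candidate) < objective(pop[i]):
                pop[i] = candidate
        best = min(pop, key=objective)
    return best

print(round(fpo_minimise(), 3))
```

In FPFCO, by the abstract's description, the update would additionally carry a fractional-order memory of past solutions to accelerate convergence.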
Findings
The proposed technique is compared with existing techniques in terms of SNR, RMSE and MSE.
Originality/value
100%.
Xiaoming Zhang, Mingming Meng, Xiaoling Sun and Yu Bai
Abstract
Purpose
With the advent of the era of Big Data, the scale of knowledge graphs (KGs) in various domains is growing rapidly, and they hold a huge amount of knowledge that surely benefits question answering (QA) research. However, a KG, which is constituted of entities and relations, is structurally inconsistent with a natural language query. Thus, QA systems based on KGs still face difficulties. The purpose of this paper is to propose a method for answering domain-specific questions based on a KG, providing convenience for information queries over domain KGs.
Design/methodology/approach
The authors propose a method, FactQA, to answer factual questions about a specific domain. A series of logical rules is designed to transform factual questions into triples, in order to resolve the structural inconsistency between the user's question and the domain knowledge. Then, query expansion strategies and filtering strategies are proposed at two levels (i.e. the words and the triples in the question). For matching the question with domain knowledge, not only the similarity values between the words in the question and the resources in the domain knowledge but also the tag information of these words is considered. The tag information is obtained by parsing the question using Stanford CoreNLP. In this paper, a KG in the metallic materials domain is used to illustrate the FactQA method.
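A toy sketch of the rule-based question-to-triple step: each rule maps a question pattern to a (subject, relation, object) template, with "?" marking the unknown slot a KG query must fill. The patterns and the metallic-materials example below are illustrative assumptions, not the paper's rule set:

```python
import re

# Each rule pairs a question pattern with a triple builder;
# '?' marks the unknown slot the knowledge-graph query must fill.
RULES = [
    (re.compile(r"what is the (?P<rel>[\w ]+) of (?P<subj>[\w ]+)\?", re.I),
     lambda m: (m.group("subj"), m.group("rel"), "?")),
    (re.compile(r"which (?P<type>[\w ]+) has (?P<rel>[\w ]+) (?P<val>[\w .]+)\?", re.I),
     lambda m: ("?", m.group("rel"), m.group("val"))),
]

def question_to_triple(question):
    """Apply the first matching rule; return None when no rule fires."""
    for pattern, build in RULES:
        m = pattern.match(question)
        if m:
            return build(m)
    return None

print(question_to_triple("What is the melting point of copper?"))
```

The triple produced here would then go through the expansion and filtering stages the abstract describes before being matched against the domain KG.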
Findings
The designed logical rules are stable over time for transforming factual questions into triples. Additionally, after filtering the synonym-expansion results of the words in the question, the expansion quality of the triple representation of the question is improved. The tag information of the words in the question is considered in the data-matching process, which helps to filter out wrong matches.
Originality/value
Although FactQA is proposed for domain-specific QA, it can be applied to any domain besides metallic materials. For a question that cannot be answered, FactQA generates a new related question to answer, providing the user with as much of the information they probably need as possible. FactQA could facilitate the user's information queries over emerging KGs.
Abstract
Gives introductory remarks about chapter 1 of this group of 31 papers, from the ISEF 1999 Proceedings, on methodologies for field analysis in the electromagnetic community. Observes that computer package implementation theory contributes to clarification. Discusses the areas covered by some of the papers, such as artificial intelligence using fuzzy logic. Includes applications such as permanent magnets and looks at eddy current problems. States that the finite element method is currently the most popular method used for field computation. Closes by pointing out the amalgam of topics.