Ema Utami, Irwan Oyong, Suwanto Raharjo, Anggit Dwi Hartanto and Sumarni Adi
Abstract
Purpose
Gathering knowledge regarding personality traits has long been the interest of academics and researchers in the fields of psychology and in computer science. Analyzing profile data from personal social media accounts reduces data collection time, as this method does not require users to fill any questionnaires. A pure natural language processing (NLP) approach can give decent results, and its reliability can be improved by combining it with machine learning (as shown by previous studies).
Design/methodology/approach
In this study, cleaning the dataset and extracting relevant potential features (as assessed by psychological experts) are essential, as Indonesians tend to mix formal words, informal words, slang and abbreviations when writing social media posts. For this article, raw data were derived from a predefined dominance, influence, stability and conscientiousness (DISC) quiz website, returning 316,967 tweets from 1,244 Twitter accounts (filtered to include only personal, Indonesian-language accounts). Using a combination of NLP techniques and machine learning, the authors aim to develop a better approach and a more robust model, especially for the Indonesian language.
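The kind of normalization step described above can be sketched as follows. This is a minimal, illustrative sketch, not the authors' actual pipeline: the small slang dictionary (`SLANG_MAP`) and the example tweet are invented for demonstration, and a real lexicon would be curated with linguistic and psychological experts.

```python
import re

# Illustrative Indonesian slang/abbreviation dictionary (invented sample).
SLANG_MAP = {"gk": "tidak", "yg": "yang", "bgt": "banget"}

def normalize_tweet(text):
    """Lowercase, strip URLs/mentions/hashtag marks, and expand informal tokens."""
    text = text.lower()
    text = re.sub(r"https?://\S+|@\w+|#", " ", text)  # drop URLs, mentions, '#'
    tokens = re.findall(r"[a-z]+", text)
    return " ".join(SLANG_MAP.get(t, t) for t in tokens)

print(normalize_tweet("Gk nyangka bgt! cek https://t.co/x @user"))
# -> "tidak nyangka banget cek"
```

After this normalization, the formalized text can be fed to standard feature extraction (e.g. bag-of-words or embeddings) before classification.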
Findings
The authors find that employing a SMOTETomek re-sampling technique and hyperparameter tuning boosts the model’s performance on formalized datasets by 57% (as measured through the F1-score).
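To illustrate the resampling idea, here is a toy, pure-Python sketch of the SMOTE half of SMOTETomek (synthetic minority samples interpolated toward nearest neighbours). The Tomek-link cleaning step is omitted, and in practice one would use the `SMOTETomek` class from the imbalanced-learn library; all data points below are invented.

```python
import random

def smote_oversample(minority, n_new, k=2, seed=0):
    """Generate synthetic minority samples by interpolating each sample
    toward one of its k nearest neighbours (the core idea of SMOTE)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest neighbours of a (squared Euclidean distance)
        nbrs = sorted((s for s in minority if s is not a),
                      key=lambda s: sum((x - y) ** 2 for x, y in zip(a, s)))[:k]
        b = rng.choice(nbrs)
        t = rng.random()
        synthetic.append(tuple(x + t * (y - x) for x, y in zip(a, b)))
    return synthetic

# Tiny invented minority class; synthetic points stay inside its region.
minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
new = smote_oversample(minority, n_new=4)
```

Because each synthetic point lies on a segment between two real minority samples, the new points never leave the minority region, which is what makes the technique safer than naive duplication.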
Originality/value
Cleaning the dataset and extracting relevant potential features assessed by psychological experts are essential because Indonesian people tend to mix formal words, informal words, slang and abbreviations when writing tweets. Organic data derived from a predefined DISC quiz website yielded 1,244 Twitter accounts and 316,967 tweets.
Oladosu Oyebisi Oladimeji and Ayodeji Olusegun J. Ibitoye
Abstract
Purpose
Diagnosing brain tumors is a process that demands a significant amount of time and is heavily dependent on the proficiency and accumulated knowledge of radiologists. Over the traditional methods, deep learning approaches have gained popularity in automating the diagnosis of brain tumors, offering the potential for more accurate and efficient results. Notably, attention-based models have emerged as an advanced, dynamically refining and amplifying model feature to further elevate diagnostic capabilities. However, the specific impact of using channel, spatial or combined attention methods of the convolutional block attention module (CBAM) for brain tumor classification has not been fully investigated.
Design/methodology/approach
To selectively emphasize relevant features while suppressing noise, ResNet50 coupled with the CBAM (ResNet50-CBAM) was used for the classification of brain tumors in this research.
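As a rough illustration of what CBAM contributes, the following dependency-free sketch implements only the channel-attention branch (per-channel average and max pooling, a shared two-layer MLP, sigmoid gating). The spatial branch, a 7×7 convolution over channel-pooled maps, is omitted, and the weights and feature map below are illustrative, not trained values.

```python
import math

def channel_attention(fmap, w1, w2):
    """CBAM channel attention: per-channel avg- and max-pooling, a shared
    two-layer MLP (ReLU hidden layer), sigmoid gating, then channel-wise
    rescaling of the input feature map.
    fmap: C x H x W nested lists; w1: hidden x C weights; w2: C x hidden."""
    def mlp(v):
        hidden = [max(0.0, sum(wi * x for wi, x in zip(row, v))) for row in w1]
        return [sum(wi * h for wi, h in zip(row, hidden)) for row in w2]
    avg = [sum(sum(r) for r in ch) / (len(ch) * len(ch[0])) for ch in fmap]
    mx  = [max(max(r) for r in ch) for ch in fmap]
    gate = [1 / (1 + math.exp(-(a + m))) for a, m in zip(mlp(avg), mlp(mx))]
    return [[[g * x for x in row] for row in ch] for ch, g in zip(fmap, gate)]

fmap = [[[1.0, 2.0], [3.0, 4.0]],   # channel 0 (invented activations)
        [[0.0, 0.0], [0.0, 1.0]]]   # channel 1
w1 = [[0.5, 0.5]]                   # C=2 reduced to 1 hidden unit
w2 = [[1.0], [1.0]]
out = channel_attention(fmap, w1, w2)
```

Each channel is multiplied by a single learned gate in (0, 1), which is how the module selectively emphasizes informative channels while suppressing noisy ones.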
Findings
ResNet50-CBAM outperformed existing deep learning classification methods such as the convolutional neural network (CNN), achieving 99.43% accuracy, 99.01% recall, 98.7% precision and 99.25% AUC on the same dataset.
Practical implications
Since ResNet-CBAM fusion can capture the spatial context while enhancing feature representation, it can be integrated into the brain classification software platforms for physicians toward enhanced clinical decision-making and improved brain tumor classification.
Originality/value
This research has not been published anywhere else.
Abstract
Purpose
This study focuses on the classification of targets with varying shapes using radar cross section (RCS), which is influenced by the target’s shape. This study aims to develop a robust classification method by considering an incident angle with minor random fluctuations and using a physical optics simulation to generate data sets.
Design/methodology/approach
The approach involves several supervised machine learning and classification methods, including traditional algorithms and a deep neural network classifier. It uses histogram-based definitions of the RCS for feature extraction, with an emphasis on resilience against noise in the RCS data. Data enrichment techniques are incorporated, including the use of noise-impacted histogram data sets.
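The histogram-feature plus nearest-neighbour idea can be sketched as follows. The bin range, synthetic RCS values and class labels are invented for illustration and do not reflect the study's data or its deep network classifier.

```python
from collections import Counter

def rcs_histogram(samples, n_bins=8, lo=-30.0, hi=10.0):
    """Fixed-range histogram of RCS values (dBsm), normalized to sum to 1,
    so targets observed for different sweep lengths stay comparable."""
    counts = [0] * n_bins
    width = (hi - lo) / n_bins
    for s in samples:
        idx = min(n_bins - 1, max(0, int((s - lo) / width)))
        counts[idx] += 1
    return [c / len(samples) for c in counts]

def knn_predict(train, query, k=3):
    """Majority vote among the k training histograms closest to the query."""
    nearest = sorted(train, key=lambda t: sum((a - b) ** 2
                                              for a, b in zip(t[0], query)))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Invented RCS sweeps: one class clustered near -10 dBsm, one near 0 dBsm.
train = ([(rcs_histogram([-10.0, -9.0, -8.0] * 4), "sphere")] * 3 +
         [(rcs_histogram([0.0, 1.0, 2.0] * 4), "plate")] * 3)
query = rcs_histogram([-9.5, -8.5, -10.0])
```

The normalized histogram discards the time ordering of the sweep, which is part of what makes the feature robust to small fluctuations in the incident angle.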
Findings
The classification algorithms are extensively evaluated, highlighting their efficacy in feature extraction from RCS histograms. Among the studied algorithms, the K-nearest neighbour is found to be the most accurate of the traditional methods, but it is surpassed in accuracy by a deep learning network classifier. The results demonstrate the robustness of the feature extraction from the RCS histograms, motivated by mm-wave radar applications.
Originality/value
This study presents a novel approach to target classification that extends beyond traditional methods by integrating deep neural networks and focusing on histogram-based methodologies. It also incorporates data enrichment techniques to enhance the analysis, providing a comprehensive perspective for target detection using RCS.
Andreas Gschwentner, Manfred Kaltenbacher, Barbara Kaltenbacher and Klaus Roppert
Abstract
Purpose
For accurate numerical simulations of electrical drives, precise knowledge of the local magnetic material properties is of utmost importance. Owing to the various manufacturing steps, e.g. heat treatment or cutting techniques, the magnetic material properties can vary strongly from place to place, and the assumption of homogenized global material parameters is no longer feasible. This paper aims to present the general methodology and two different solution strategies for determining the local magnetic material properties using reference and simulation data.
Design/methodology/approach
The general methodology combines methods based on measurement, numerical simulation and the solution of an inverse problem. A sensor-actuator system is used to characterize electrical steel sheets locally. Based on the measurement data and results from the finite element simulation, the inverse problem is solved with two different solution strategies. The first is a quasi-Newton method (QNM) that uses Broyden's update formula to approximate the Jacobian, and the second is an adjoint method. To compare both methods regarding convergence and efficiency, an artificial example with a linear material model is considered.
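A minimal sketch of Broyden's update on a small artificial system (not the paper's material-identification problem) might look like this; the starting point, initial Jacobian approximation and test system are invented for illustration.

```python
def broyden_solve(F, x0, B0, tol=1e-10, max_iter=50):
    """Broyden's 'good' method: solve F(x) = 0 in R^2 without re-deriving
    the Jacobian, updating an approximation B by a rank-one formula."""
    x, B = list(x0), [row[:] for row in B0]
    fx = F(x)
    for _ in range(max_iter):
        # Solve B dx = -fx (2x2 system via Cramer's rule for brevity)
        det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
        dx = [(-fx[0] * B[1][1] + fx[1] * B[0][1]) / det,
              (-fx[1] * B[0][0] + fx[0] * B[1][0]) / det]
        x = [xi + di for xi, di in zip(x, dx)]
        fx_new = F(x)
        if max(abs(v) for v in fx_new) < tol:
            return x
        # Broyden update: B += ((dF - B dx) dx^T) / (dx . dx)
        df = [a - b for a, b in zip(fx_new, fx)]
        Bdx = [sum(B[i][j] * dx[j] for j in range(2)) for i in range(2)]
        denom = sum(d * d for d in dx)
        for i in range(2):
            for j in range(2):
                B[i][j] += (df[i] - Bdx[i]) * dx[j] / denom
        fx = fx_new
    return x

# Artificial system: x^2 + y^2 = 2, x - y = 0, with root (1, 1).
F = lambda v: [v[0] ** 2 + v[1] ** 2 - 2.0, v[0] - v[1]]
root = broyden_solve(F, [1.5, 0.8], [[3.0, 1.6], [1.0, -1.0]])
```

Here `B0` is seeded with the true Jacobian at the starting point; each subsequent step reuses and corrects it, which is the efficiency argument for a QNM over a full Newton iteration.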
Findings
The QNM and the adjoint method show similar convergence behavior for two different cutting-edge effects. Furthermore, taking a priori information into account improves the convergence rate; however, no impact on the stability or the remaining error is observed.
Originality/value
The presented methodology enables a fast and simple determination of the local magnetic material properties of electrical steel sheets without the need for a large number of samples or special preparation procedures.
Xue Xin, Yuepeng Jiao, Yunfeng Zhang, Ming Liang and Zhanyong Yao
Abstract
Purpose
This study aims to ensure reliable analysis of dynamic responses in asphalt pavement structures. It investigates noise reduction and data mining techniques for pavement dynamic response signals.
Design/methodology/approach
The paper first conducts time-frequency analysis on pavement dynamic response signals. It then applies two common noise reduction methods, namely, low-pass filtering and wavelet decomposition and reconstruction, to evaluate their effectiveness in reducing noise in these signals. Furthermore, as these signals are generated in response to vehicle loading, they contain a substantial amount of data and are prone to environmental interference, potentially resulting in outliers. Hence, it becomes crucial to extract dynamic strain response features (e.g. peaks and peak intervals) in real time and efficiently.
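A minimal sketch of low-pass filtering such a signal, assuming a single-pole IIR filter rather than whatever filter design the study used; the sampling rate, cutoff frequency and synthetic strain pulse below are invented.

```python
import math

def lowpass_first_order(signal, fs, fc):
    """Single-pole IIR low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1]),
    with the smoothing factor a derived from cutoff fc and sample rate fs."""
    a = 1 - math.exp(-2 * math.pi * fc / fs)
    out, y = [], 0.0
    for x in signal:
        y += a * (x - y)
        out.append(y)
    return out

# Synthetic 2 Hz strain response buried in 60 Hz noise, sampled at 500 Hz.
fs, fc = 500.0, 10.0
t = [n / fs for n in range(500)]
clean = [math.sin(2 * math.pi * 2 * ti) for ti in t]
noisy = [c + 0.5 * math.sin(2 * math.pi * 60 * ti) for c, ti in zip(clean, t)]
filtered = lowpass_first_order(noisy, fs, fc)
```

With the cutoff between the response band and the noise band, the 60 Hz component is strongly attenuated while the 2 Hz strain waveform passes with only a small phase lag.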
Findings
The study introduces an improved density-based spatial clustering of applications with noise (DBSCAN) algorithm for identifying outliers in denoised data. The results demonstrate that low-pass filtering is highly effective in reducing noise in pavement dynamic response signals within specified frequency ranges. The improved DBSCAN algorithm effectively identifies outliers in these signals through testing. Furthermore, the peak detection process, using the enhanced findpeaks function, consistently achieves excellent performance in identifying peak values, even when complex multi-axle heavy-duty truck strain signals are present.
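The peak-detection behaviour described (Matlab's findpeaks with height and minimum-distance constraints) can be approximated in a short pure-Python sketch; this is an analogue of the idea, not the study's enhanced function, and the strain trace below is invented.

```python
def find_peaks(signal, min_height=0.0, min_distance=1):
    """Local maxima above min_height, at least min_distance samples apart
    (taller peaks win) -- the behaviour Matlab's findpeaks exposes via its
    'MinPeakHeight' and 'MinPeakDistance' options."""
    candidates = [i for i in range(1, len(signal) - 1)
                  if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
                  and signal[i] >= min_height]
    # Greedily keep the tallest peaks, discarding any too close to a keeper.
    kept = []
    for i in sorted(candidates, key=lambda i: -signal[i]):
        if all(abs(i - j) >= min_distance for j in kept):
            kept.append(i)
    return sorted(kept)

# Two axle peaks plus a small noise bump between them (invented trace).
strain = [0, 1, 5, 1, 0, 0.5, 2, 0.5, 0, 1, 6, 1, 0]
print(find_peaks(strain, min_height=3.0, min_distance=4))  # -> [2, 10]
```

The height threshold rejects the noise bump, and the distance constraint prevents one axle pass from being reported as several peaks.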
Originality/value
The authors identified a suitable frequency domain range for low-pass filtering in asphalt road dynamic response signals, revealing minimal amplitude loss and effective strain information reflection between road layers. Furthermore, the authors introduced the DBSCAN-based anomaly data detection method and enhancements to the Matlab findpeaks function, enabling the detection of anomalies in road sensor data and automated peak identification.
Stefano Francesco Musso and Giovanna Franco
Abstract
Purpose
This article sets out to show how the principles and methodological questions that underlie one way of interpreting the discipline of conservation and restoration can yield results in research and studies aimed at a more conscious reuse process. The occasion is very recent research on the former Church of Saints Gerolamo and Francesco Saverio in Genoa, Italy, the Jesuit church annexed to the 17th-century College of the order. A small Baroque jewel in the heart of the ancient city and a former University Library, it is currently abandoned, forgotten for years, inaccessible and awaiting a new use.
Design/methodology/approach
The two-year work on the monumental building was conducted according to a study and research methodology developed and refined over the years within the activities of the School of Specialisation in Architectural Heritage and Landscape of the University of Genoa. It is a multidisciplinary and rigorous approach, which aims to train high-level professionals who are up to date and aware of the multiple problems that interventions on existing buildings, especially those of a monumental nature, involve.
Findings
The biennial study was carried out within the activities of the Post-Graduate Programme in Architectural Heritage and Landscape of the University of Genoa. The working methodology addresses the challenges of contemporary complexity, raised by the progressive broadening of the concept of cultural "heritage" and by the problems of its conservation, active safeguarding and reuse: safety with respect to seismic, fire and hydrogeological risk; universal accessibility (cognitive, physical and alternative); resource efficiency; comfort and savings in energy consumption; sustainability; and communication with and involvement of local communities and stakeholders.
Originality/value
The goals of the work were the following: understanding the architectural heritage through the correlated study of its geometries, elements and construction materials, surfaces, structures, spaces and functions; understanding the transformations that the building has undergone over time, relating the results of historical reconstructions from indirect sources to those of direct archaeological analysis; assessing the state of conservation of the building, recognising the phenomena of deterioration, damage, faults and deficits that affect materials, construction elements, systems and structures; identifying the causes and extent of damage, faults and deficits, and assessing the vulnerability and level of exposure of the asset to aggressive environmental factors and related risks; evaluating the compatibility between the characteristics of the available spaces, the primary needs of conservation, the demands of regeneration and possible new uses; and defining criteria and guidelines for planning conservation, restoration and redevelopment interventions.