Search results
1 – 10 of 84
Mahmood Al-khassaweneh and Omar AlShorman
Abstract
In the big data era, image compression is of significant importance. In particular, compression of large-sized images is required for everyday tasks, including electronic data communications and internet transactions. Two important measures should be considered for any compression algorithm: the compression factor and the quality of the decompressed image. In this paper, we use the Frei-Chen bases technique and Modified Run Length Encoding (RLE) to compress images. The Frei-Chen bases technique is applied in the first stage, in which the average subspace is applied to each 3 × 3 block. Blocks with the highest energy are replaced by a single value that represents the average value of the pixels in the corresponding block. Even though the Frei-Chen bases technique provides lossy compression, it maintains the main characteristics of the image while enhancing the compression factor. In the second stage, RLE is applied to further increase the compression factor without adding any distortion to the decompressed image. Integrating RLE with the Frei-Chen bases technique, as described in the proposed algorithm, ensures high-quality decompressed images and a high compression rate. The results of the proposed algorithm are shown to be comparable in quality and performance with other existing methods.
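The second, lossless stage of the pipeline is run-length encoding. As a minimal sketch of plain RLE in Python (the paper uses a modified variant whose details are not given in the abstract):

```python
def rle_encode(values):
    """Encode a flat sequence of pixel values as (value, run_length) pairs."""
    encoded = []
    prev, count = values[0], 1
    for v in values[1:]:
        if v == prev:
            count += 1
        else:
            encoded.append((prev, count))
            prev, count = v, 1
    encoded.append((prev, count))
    return encoded

def rle_decode(pairs):
    """Invert rle_encode: expand (value, run_length) pairs back to a sequence."""
    out = []
    for value, count in pairs:
        out.extend([value] * count)
    return out
```

Because the first-stage averaging produces runs of identical block values, RLE compresses its output well while remaining exactly invertible.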
Francois Du Rand, André Francois van der Merwe and Malan van Tonder
Abstract
Purpose
This paper aims to discuss the development of a defect classification system that can be used to detect and classify powder bed surface defects from captured layer images without the need for specialised computational hardware. The idea is to develop this system by making use of more traditional machine learning (ML) models instead of using computationally intensive deep learning (DL) models.
Design/methodology/approach
The approach that is used by this study is to use traditional image processing and classification techniques that can be applied to captured layer images to detect and classify defects without the need for DL algorithms.
Findings
The study showed that a defect classification algorithm could be developed using traditional ML models with a high degree of accuracy, and that images could be processed at higher speeds than typically reported in the literature for DL models.
Originality/value
This paper addresses a need that has been identified for a high-speed defect classification algorithm that can detect and classify defects without the need for specialised hardware that is typically used when making use of DL technologies. This is because when developing closed-loop feedback systems for these additive manufacturing machines, it is important to detect and classify defects without inducing additional delays to the control system.
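The abstract does not name the specific features or classifier used; as an illustrative sketch of the traditional-ML idea (hand-crafted image features plus a simple classifier, with hypothetical feature choices), consider:

```python
import numpy as np

def layer_features(img):
    """Simple hand-crafted features for a grayscale layer image:
    mean intensity, intensity spread, and mean absolute horizontal gradient."""
    img = np.asarray(img, dtype=float)
    grad = np.abs(np.diff(img, axis=1))
    return np.array([img.mean(), img.std(), grad.mean()])

class NearestCentroid:
    """Minimal traditional-ML classifier: one centroid per class in feature space."""
    def fit(self, feats, labels):
        self.classes_ = sorted(set(labels))
        self.centroids_ = {
            c: np.mean([f for f, l in zip(feats, labels) if l == c], axis=0)
            for c in self.classes_
        }
        return self

    def predict(self, feat):
        # Assign the class whose centroid is closest in feature space.
        return min(self.classes_, key=lambda c: np.linalg.norm(feat - self.centroids_[c]))
```

Such pipelines avoid GPU-bound inference entirely, which is the point the paper makes about closed-loop feedback without specialised hardware.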
Abstract
Purpose
Transaction cost becomes significant when one holds many securities in a large portfolio whose capital allocations are frequently rebalanced due to variations in the non-stationary statistical characteristics of the asset returns. The purpose of this paper is to employ a sparsing method to sparse the eigenportfolios, so that transaction cost can be reduced without any loss of performance.
Design/methodology/approach
In this paper, the authors designed pdf-optimized mid-tread Lloyd-Max quantizers based on the distribution of each eigenportfolio and then employed them to sparse the eigenportfolios, so that small-size orders can be ignored (sparsed); as a result, trading costs are reduced.
Findings
The authors find that the sparsing technique addressed in this paper is methodical, easy to implement for large portfolios, and offers a significant reduction in transaction cost without any loss of performance.
Originality/value
In this paper, the authors investigated the performance of the sparsed eigenportfolios of stock returns in the S&P 500 Index. It is shown that the sparsing method is simple to implement and provides high levels of sparsity without causing PnL loss. Therefore, the transaction cost of managing a large portfolio is reduced by employing such an efficient sparsity method.
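The sparsing effect of a mid-tread quantizer comes from its dead zone around zero: weights that quantize to the zero level are dropped and the corresponding small orders are never placed. A simplified sketch with a fixed uniform step (not the paper's pdf-optimized Lloyd-Max design):

```python
import numpy as np

def sparse_weights(weights, step):
    """Mid-tread-style sparsing: portfolio weights that fall in the dead zone
    around zero (|w| < step / 2, i.e. they quantize to the zero level) are
    set to zero; the remaining weights are kept as-is."""
    w = np.asarray(weights, dtype=float)
    return np.where(np.abs(w) < step / 2, 0.0, w)
```

A pdf-optimized quantizer would instead place the decision boundaries according to each eigenportfolio's empirical weight distribution, but the zero-level dead zone is what produces the sparsity in both cases.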
Alenka Kavčič Čolić and Andreja Hari
Abstract
Purpose
The current predominant delivery format resulting from digitization is PDF, which is not appropriate for the blind, partially sighted and people who read on mobile devices. To meet the needs of both communities, as well as broader ones, alternative file formats are required. With the findings of the eBooks-On-Demand-Network Opening Publications for European Netizens project research, this study aims to improve access to digitized content for these communities.
Design/methodology/approach
In 2022, the authors conducted research on the digitization experiences of 13 EODOPEN partners at their organizations. The authors distributed the same sample of scans in English with different characteristics, and in accordance with Web content accessibility guidelines, the authors created 24 criteria to analyze their digitization workflows, output formats and optical character recognition (OCR) quality.
Findings
In this contribution, the authors present the results of a trial implementation among EODOPEN partners regarding their digitization workflows, used delivery file formats and the resulting quality of OCR results, depending on the type of digitization output file format. It was shown that partners using the OCR tool ABBYY FineReader Professional and producing scanning outputs in tagged PDF and PDF/UA formats achieved better results according to set criteria.
Research limitations/implications
The trial implementations were limited to the 13 project partners' organizations.
Originality/value
This research paper can be a valuable contribution to the field of massive digitization practices, particularly in terms of improving the accessibility of the output delivery file formats.
Pingan Zhu, Chao Zhang and Jun Zou
Abstract
Purpose
The purpose of the work is to provide a comprehensive review of the digital image correlation (DIC) technique for those who are interested in performing the DIC technique in the area of manufacturing.
Design/methodology/approach
No methodology was used because the paper is a review article.
Findings
No separate findings are reported, as the paper is a review article.
Originality/value
Herein, the historical development, main strengths and measurement setup of DIC are introduced. Subsequently, the basic principles of the DIC technique are outlined in detail. The analysis of measurement accuracy associated with experimental factors and correlation algorithms is discussed and some useful recommendations for reducing measurement errors are also offered. Then, the utilization of DIC in different manufacturing fields (e.g. cutting, welding, forming and additive manufacturing) is summarized. Finally, the current challenges and prospects of DIC in intelligent manufacturing are discussed.
Reginald Masimba Mbona and Kong Yusheng
Abstract
Purpose
The Chinese telecoms industry has been growing rapidly since 2001, making an analysis of the financial performance of the three giants in this industry very important. However, it is difficult to know how few ratios can be used with little information loss. The paper aims to discuss this issue.
Design/methodology/approach
A total of 18 financial ratios were calculated from the financial statements of three companies, namely China Mobile, China Unicom and China Telecom, over a period of 17 years. A principal component analysis was run to select, from each component, the variables with significance values above 0.5.
Findings
The authors conclude that financial performance can be analysed using 12 ratios instead of a costly analysis of too many ratios that may be complex to interpret. The results also showed that the ratios are all related, as they come from the same statements; hence, a few can be used to represent the rest with limited loss of information.
Originality/value
This study will help different stakeholders who are interested in the financial performance of each company by giving them a shorter way to analyse performance. It will also assist those who do financial reporting on picking the ratios which matter in reflecting the performance of their companies. The use of PCA gives unbiased ratios that are most significant in assessing performance.
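The dimensionality-reduction step can be sketched as ordinary PCA on the correlation matrix of the standardized ratios; the function below is illustrative and not the authors' exact procedure:

```python
import numpy as np

def pca_loadings(X):
    """PCA on a (samples x ratios) matrix via the correlation matrix.
    Returns eigenvalues (variance explained per component) and loadings
    (eigenvectors), both sorted in descending eigenvalue order."""
    corr = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    vals, vecs = np.linalg.eigh(corr)   # eigh: corr is symmetric
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]
```

Ratio selection would then keep, per retained component, the ratios whose loadings exceed the chosen threshold (0.5 in the paper), which is how 18 ratios reduce to 12 with limited information loss.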
Xiaojie Xu and Yun Zhang
Abstract
Purpose
Forecasts of commodity prices are vital issues to market participants and policy makers. Those of corn are no exception, considering its strategic importance. In the present study, the authors assess the forecast problem for the weekly wholesale price index of yellow corn in China over the period from January 1, 2010 to January 10, 2020.
Design/methodology/approach
The authors employ the nonlinear auto-regressive neural network as the forecast tool and evaluate forecast performance of different model settings over algorithms, delays, hidden neurons and data splitting ratios in arriving at the final model.
Findings
The final model is relatively simple and leads to accurate and stable results. Particularly, it generates relative root mean square errors of 1.05%, 1.08% and 1.03% for training, validation and testing, respectively.
Originality/value
Through the analysis, the study shows usefulness of the neural network technique for commodity price forecasts. The results might serve as technical forecasts on a standalone basis or be combined with other fundamental forecasts for perspectives of price trends and corresponding policy analysis.
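The relative root mean square errors reported above can be computed as RMSE normalized by the mean actual level; this is one common definition, and the paper's exact formula may differ:

```python
import numpy as np

def relative_rmse(actual, predicted):
    """Root mean square error expressed as a percentage of the mean
    actual level (one common definition of relative RMSE)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return 100.0 * rmse / actual.mean()
```

Comparing this metric across the training, validation and testing splits, as the authors do, is a quick check that the network is not overfitting: the three values (1.05%, 1.08%, 1.03%) are nearly identical.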
Alberto Giubilini and Paolo Minetola
Abstract
Purpose
The purpose of this study is to evaluate the 3D printability of a multimaterial, fully self-supporting auxetic structure. This will contribute to expanding the application of additive manufacturing (AM) to new products, such as automotive suspensions.
Design/methodology/approach
An experimental approach for sample fabrication on a multiextruder 3D printer and characterization by compression testing was conducted along with numerical simulations, which were used to support the design of different auxetic configurations for the jounce bumper.
Findings
The effect of stacking different auxetic cell modules was discussed, and the findings demonstrated that a one-piece printed structure has a better performance than one composed of multiple single modules stacked on top of each other.
Research limitations/implications
The quality of the 3D printing process affected the performance of the final components and reproducibility of the results. Therefore, researchers are encouraged to further study component fabrication optimization to achieve a more reliable process.
Practical implications
This research work can help improve the manufacturing and functionality of a critical element of automotive suspension systems, such as the jounce bumper, which can efficiently reduce noise, vibration and harshness by absorbing impact energy.
Originality/value
In previous research, auxetic structures for the application of jounce bumpers have already been suggested. However, to the best of the authors’ knowledge, in this work, an AM approach was used for the first time to fabricate multimaterial auxetic structures, not only by co-printing a flexible thermoplastic polymer with a stiffer one but also by continuously extruding multilevel structures of auxetic cell modules.
Jorge Manuel Mercado-Colmenero, M. Dolores La Rubia, Elena Mata-García, Moisés Rodriguez-Santiago and Cristina Martin-Doñate
Abstract
Purpose
Because of the anisotropy of the process and the variability in the quality of printed parts, finite element analysis is not directly applicable to recycled materials manufactured using fused filament fabrication. The purpose of this study is to investigate the numerical-experimental mechanical behavior modeling of the recycled polymer, that is, recyclable polyethylene terephthalate (rPET), manufactured by a deposition FFF process under compressive stresses for new sustainable designs.
Design/methodology/approach
In all, 42 test specimens were manufactured and analyzed according to the ASTM D695-15 standard. Eight numerical analyses were performed on a real design manufactured with rPET, using Young's compression modulus from the experimental tests. Finally, eight additional experimental tests under uniaxial compression loads were performed on the real sustainable design to validate its mechanical behavior against the computational numerical tests.
Findings
The experimental tests show that rPET behaves linearly until it reaches the elastic limit along each manufacturing axis. The results confirmed the design's structural safety under the load scenario and operating boundary conditions. Experimental and numerical results differ by 0.001–0.024 mm, allowing rPET to be configured as isotropic in numerical simulation software without modifying its material modeling equations.
Practical implications
The results obtained are of great help to industry, designers and researchers because they validate the use of recycled rPET for the ecological production of real-sustainable products using MEX technology under compressive stress and its configuration for numerical simulations. Major design companies are now using recycled plastic materials in their high-end designs.
Originality/value
Validation results have been presented on test specimens and real items, comparing experimental material configuration values with numerical results. Specifically, to the best of the authors’ knowledge, no industrial or scientific work has been conducted with rPET subjected to uniaxial compression loads for characterizing experimentally and numerically the material using these results for validating a real case of a sustainable industrial product.
María Belén Prados-Peña, George Pavlidis and Ana García-López
Abstract
Purpose
This study aims to analyze the impact of Artificial Intelligence (AI) and Machine Learning (ML) on heritage conservation and preservation, and to identify relevant future research trends, by applying scientometrics.
Design/methodology/approach
A total of 1,646 articles, published between 1985 and 2021, concerning research on the application of ML and AI in cultural heritage were collected from the Scopus database and analyzed using bibliometric methodologies.
Findings
The findings of this study have shown that although there is a very important increase in academic literature in relation to AI and ML, publications that specifically deal with these issues in relation to cultural heritage and its conservation and preservation are significantly limited.
Originality/value
This study enriches the academic outline by highlighting the limited literature in this context and therefore the need to advance the study of AI and ML as key elements that support heritage researchers and practitioners in conservation and preservation work.