Search results
1 – 8 of 8
Praveen Kumar Lendale and N.M. Nandhitha
Abstract
Purpose
Speckle noise removal in ultrasound images is one of the important tasks in biomedical-imaging applications. Many filtering-based despeckling methods have been discussed in existing works. Two-dimensional (2-D) transforms are also used extensively to reduce speckle noise in ultrasound medical images. In recent years, many soft computing-based intelligent techniques have been applied to noise removal and segmentation. However, there is a need to improve despeckling accuracy using hybrid approaches.
Design/methodology/approach
The work focuses on a double filter bank structure based on the framelet transform combined with a Gaussian filter (GF), together with a fuzzy clustering approach, for despeckling ultrasound medical images. The presented transform efficiently rejects speckle noise through gray-scale relative thresholding, while the directional filter bank (DFB) preserves edge information.
Findings
The proposed approach is evaluated with several performance indicators: mean square error (MSE), peak signal-to-noise ratio (PSNR), speckle suppression index (SSI), mean structural similarity and edge preservation index (EPI). The proposed methodology is found to be superior on all of these indicators.
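As an illustration only (not the authors' implementation), the first two of these indicators can be computed directly from a reference image and its despeckled output; the 2×2 arrays below are toy stand-ins for ultrasound images:

```python
import math

def mse(ref, out):
    """Mean squared error between two equal-sized gray-scale images (nested lists)."""
    n = sum(len(row) for row in ref)
    return sum((r - o) ** 2 for rr, oo in zip(ref, out) for r, o in zip(rr, oo)) / n

def psnr(ref, out, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less residual speckle."""
    e = mse(ref, out)
    return float("inf") if e == 0 else 10.0 * math.log10(peak ** 2 / e)

ref = [[100, 110], [120, 130]]  # toy reference image
out = [[102, 108], [121, 129]]  # toy despeckled output
print(round(mse(ref, out), 2))   # 2.5
print(round(psnr(ref, out), 2))  # 44.15
```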
Originality/value
Fuzzy clustering methods have been shown to outperform conventional thresholding methods for noise removal. The algorithm gives a considerable improvement over other modern speckle-reduction procedures, as it preserves geometric features even after noise removal.
Tze Huey Tam, Muhammad Zulkarnain Abdul Rahman, Sobri Harun, Shamsuddin Shahid, Sophal Try, Mohamad Hidayat Jamal, Zamri Ismail, Khamarrul Azahari Razak, Mohd Khairolden Ghani and Yusrin Faiz Abdul Wahab
Abstract
Purpose
The present study aims to evaluate the effect of climate change on the flood hazard potential in the Kelantan River Basin using current and future scenarios.
Design/methodology/approach
The intensity-duration-frequency (IDF) was used to estimate the current 50- and 100-year return period 24-h design rainfall, and the climate change factor (CCF) was used to compute the future design rainfall. The CCF was calculated from the rainfall projections of two global climate models, CGCM1 and CCSM3, with different pre-processing steps applied to each. The IDF data were used in the rainfall-runoff-inundation model to simulate current and future flood inundation scenarios.
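The CCF adjustment described above is multiplicative: future design rainfall is the current design rainfall scaled by the station's climate change factor. A minimal sketch (the rainfall depth and CCF values are illustrative, not from the study):

```python
def future_design_rainfall(current_rainfall_mm, ccf):
    """Scale the current 24-h design rainfall by the climate change factor (CCF).

    A CCF > 1 projects intensified rainfall (as every station showed for CGCM1);
    a CCF < 1 projects a reduction (as some stations showed for CCSM3).
    """
    return current_rainfall_mm * ccf

# Hypothetical 100-year, 24-h design rainfall of 300 mm at one station:
print(future_design_rainfall(300.0, 1.25))  # 375.0 mm under a CGCM1-style CCF
print(future_design_rainfall(300.0, 0.75))  # 225.0 mm under a CCSM3-style CCF
```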
Findings
The estimated CCF values demonstrate a contrast, whereby each station had a CCF value greater than one for CGCM1, while some stations had a CCF value of less than one for CCSM3. Therefore, CGCM1 projected an aggravation and CCSM3 a reduction of flood hazard for future scenarios. The study reveals that topography plays an essential role in calculating the CCF.
Originality/value
To the best of the authors' knowledge, this is the first study to examine flood projections in the Kelantan River Basin. It is, therefore, hoped that these results could benefit local managers and authorities by enabling them to make informed decisions regarding flood risk mitigation under climate change.
Issahaku Haruna and Charles Godfred Ackah
Abstract
Purpose
Africa's business environment (BE) is characteristically unfriendly and poses severe development challenges. This study evaluates the impact of business climate on productivity in sub-Saharan Africa (SSA).
Design/methodology/approach
Macroeconomic data for 51 sub-Saharan African economies from 1990 to 2018 are employed for the analysis. The seemingly unrelated regression model is used to address inter-sectorial linkages.
Findings
The study uncovers several findings. First, high start-up costs substantially reduce productivity by limiting the funds available for investment in productivity-enhancing labour and technology and by limiting the number of businesses that see the light of day. The productivity impact of start-up costs is largest for industry, followed by services and agriculture. Second, economies with favourable financing environments tend to be more productive both economy-wide and sector-wise. Third, high taxes and tax inefficiency lower productivity by reducing the resource envelope of firms, thus lowering investment. Fourth, poor business infrastructure inflicts the most damage on productivity. Lastly, business administration and macroeconomic environments affect sectoral and economy-wide productivity.
Practical implications
SSA economies must strive to lower the cost of starting a business as high start-up costs injure productivity. One way of reducing start-up costs is to create a one-stop shop for registering and formalising a business. Another way is to automate business registration and administrative processes to reduce red tape and corruption.
Originality/value
The authors extend the body of knowledge by analysing sectoral and economy-wide productivity effects of various business climate indicators while accounting for inter-sectoral linkages, cross-sectional dependence and endogeneity.
Anwesa Kar and Rajiv Nandan Rai
Abstract
Purpose
The concept of sustainable product design (SPD) is gaining significant attention in recent research. However, due to the inherent uncertainties associated with new product development and the incorporation of multiple qualitative and quantitative criteria, SPD is a complex and challenging task. The purpose of this paper is to introduce a novel approach that integrates quality function deployment (QFD), a multi-criteria decision-making (MCDM) technique and Six Sigma evaluation to facilitate SPD in the context of Industry 4.0.
Design/methodology/approach
The customer requirements are evaluated through the neutrosophic-decision-making trial and evaluation laboratory-analytic network process (DEMATEL-ANP)-based approach followed by utilizing QFD matrix to estimate the weights of the engineering characteristics (EC). The Six Sigma method is then employed to evaluate the alternatives’ design based on the ECs’ values.
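In a QFD matrix, each engineering characteristic's weight is conventionally the sum of customer-requirement weights multiplied by the relationship scores linking them. A simplified sketch of that aggregation step (the CR weights and relationship matrix below are invented for illustration; the paper derives CR weights via neutrosophic DEMATEL-ANP):

```python
def ec_weights(cr_weights, relationship):
    """Aggregate customer-requirement (CR) weights into engineering-characteristic
    (EC) weights via the QFD relationship matrix.

    relationship[i][j] scores how strongly CR i relates to EC j
    (commonly 1 / 3 / 9 in QFD practice).
    """
    n_ec = len(relationship[0])
    raw = [sum(w * row[j] for w, row in zip(cr_weights, relationship))
           for j in range(n_ec)]
    total = sum(raw)
    return [r / total for r in raw]  # normalized so the EC weights sum to 1

# Two CRs, three ECs, illustrative values only:
cr_w = [0.6, 0.4]
rel = [[9, 3, 1],
       [1, 9, 3]]
print([round(w, 3) for w in ec_weights(cr_w, rel)])  # [0.446, 0.415, 0.138]
```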
Findings
The effectiveness of the suggested approach is illustrated through an example. The result indicates that utilization of the neutrosophic MCDM technique with integration of Six Sigma methodology provides a simple, effective and computationally inexpensive method for SPD.
Practical implications
The proposed approach is helpful in upstream evaluation of the product design with limited experimental/numerical data, maintaining a strong competitive position in the market and enhancing customer satisfaction.
Originality/value
This work provides a novel approach to objectively quantify performance of SPD under the paradigm of Industry 4.0 using the integration of QFD-based hybrid MCDM with Six Sigma method.
Marcus Achenbach and Guido Morgenthal
Abstract
Purpose
The design check regarding the fire resistance of concrete slabs can be easily performed using tabulated values. These tables are based on experimental results, but the level of safety, which is obtained by this approach, is not known. On the other hand, performance-based methods are more accepted, but require a target reliability as performance criterion. Hence, there is a need for calibration of the performance-based methods using the results of the “traditional” descriptive approach.
Design/methodology/approach
The calibration is performed for a single span concrete slab, where the axis distance of the reinforcement is chosen according to Eurocode 2 for a defined fire rating. A “standard” compartment is selected to cover typical fields of application. The opening factor is considered as parameter to obtain the maximum peak temperatures in the compartment. A Monte Carlo simulation, in combination with a response surface method, is set up to calculate the probabilities of failure.
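A Monte Carlo estimate of a failure probability draws random realizations of the basic variables and counts how often the limit state is violated. A minimal sketch with a hypothetical limit state g = R − S (resistance minus load effect) and invented distributions, not the paper's response-surface model:

```python
import random

def monte_carlo_pf(n_samples, seed=0):
    """Estimate P(failure) = P(g <= 0) for g = R - S, with
    R ~ Normal(10, 1) and S ~ Normal(6, 1.5).
    These distributions are illustrative, not from the study.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.gauss(10.0, 1.0)   # resistance
        s = rng.gauss(6.0, 1.5)    # load effect (e.g. fire-induced demand)
        if r - s <= 0.0:
            failures += 1
    return failures / n_samples

pf = monte_carlo_pf(100_000)
print(pf)  # roughly 0.013; the exact value varies with the seed
```

The reliability index then follows from the estimated probability of failure via the inverse standard-normal transformation.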
Findings
The results indicate that the calculated reliability index is within the range that has been used for the derivation of safety and combination factors in the Eurocodes. It can be observed that members designed for a fire rating of R90 show a significant increase in structural safety under natural fires compared to a design for a fire rating of R30.
Originality/value
The level of safety, which is obtained by a design based on tabulated values, is quantified for concrete slabs. The results are a necessary input for the calibration of performance-based methods and could stimulate discussions among scientists and building authorities.
Olumide O. Olaoye and Mulatu F. Zerihun
Abstract
Purpose
The study examined the roles of fiscal and monetary policy in reducing poverty in sub-Saharan Africa (SSA), while accounting for macroeconomic disruptions. In particular, the study examined the complementarity of fiscal and monetary policy to mitigate shocks and reduce poverty in SSA.
Design/methodology/approach
The study adopts the fixed-effects (within) regression model to account for country-specific characteristics and a cross-sectional dependence-consistent model to control for potential cross-sectional dependence in panel data modelling. The study used a dummy-variable approach to account for macroeconomic shocks: the authors assigned 1 to the years 2008, 2014 and 2020, and 0 otherwise, to capture the global financial crisis, the commodity terms-of-trade shock and the COVID-19 pandemic, respectively.
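The dummy-variable coding described above maps each sample year to 1 if it is one of the shock years and 0 otherwise; a minimal sketch:

```python
# Shock years from the study: 2008 (global financial crisis),
# 2014 (commodity terms-of-trade shock), 2020 (COVID-19 pandemic).
SHOCK_YEARS = {2008, 2014, 2020}

def shock_dummy(year):
    """Return 1 for a macroeconomic-shock year, 0 otherwise."""
    return 1 if year in SHOCK_YEARS else 0

years = list(range(2006, 2022))
dummies = [shock_dummy(y) for y in years]
print(sum(dummies))  # 3 shock years fall in 2006-2021
```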
Findings
The study found that fiscal policy (particularly, government spending on health and education) has the greater capacity to reduce the level of poverty in SSA. The results also indicate that fiscal policy and monetary policy can work in tandem to reduce the negative effects of a pandemic. However, the study found an optimal threshold level of monetary policy beyond which monetary policy reduces the effectiveness of fiscal policy to reduce poverty in SSA. The research and policy implications are discussed.
Originality/value
The study, unlike previous studies, accounts for the impact of macroeconomic shocks in the monetary/fiscal policy and poverty literature.
B. Vasavi, P. Dileep and Ulligaddala Srinivasarao
Abstract
Purpose
Aspect-based sentiment analysis (ASA) is a sentiment-analysis task that requires predicting the sentiment polarity of each aspect in a given sentence. Many traditional techniques use graph-based mechanisms, which reduce prediction accuracy and introduce large amounts of noise. A further problem with graph-based mechanisms is that the sentiment of some context words changes depending on the aspect, so polarity cannot be inferred from those words alone. ASA is challenging because a given sentence can express complicated feelings about multiple aspects.
Design/methodology/approach
This research proposed an optimized attention-based DL model known as optimized aspect and self-attention aware long short-term memory for target-based semantic analysis (OAS-LSTM-TSA). The proposed model goes through three phases: preprocessing, aspect extraction and classification. Aspect extraction is done using a double-layered convolutional neural network (DL-CNN). The optimized aspect and self-attention embedded LSTM (OAS-LSTM) is used to classify aspect sentiment into three classes: positive, neutral and negative.
Findings
The OAS-LSTM model is used to detect and classify the sentiment polarity of each aspect. The results of the proposed method reveal that it achieves a high accuracy of 95.3 per cent for the restaurant dataset and 96.7 per cent for the laptop dataset.
Originality/value
The novelty of the research work is the addition of two effective attention layers in the network model, loss function reduction and accuracy enhancement, using a recent efficient optimization algorithm. The loss function in OAS-LSTM is minimized using the adaptive pelican optimization algorithm, thus increasing the accuracy rate. The performance of the proposed method is validated on four real-time datasets, Rest14, Lap14, Rest15 and Rest16, for various performance metrics.
Nehal Elshaboury, Eslam Mohammed Abdelkader, Abobakr Al-Sakkaf and Ashutosh Bagchi
Abstract
Purpose
The energy efficiency of buildings has been emphasized along with the continual development in the building and construction sector that consumes a significant amount of energy. To this end, the purpose of this research paper is to forecast energy consumption to improve energy resource planning and management.
Design/methodology/approach
This study proposes the application of the convolutional neural network (CNN) for estimating the electricity consumption in the Grey Nuns building in Canada. The performance of the proposed model is compared against that of long short-term memory (LSTM) and multilayer perceptron (MLP) neural networks. The models are trained and tested using monthly electricity consumption records (i.e. from May 2009 to December 2021) available from Concordia’s facility department. Statistical measures (e.g. determination coefficient [R2], root mean squared error [RMSE], mean absolute error [MAE] and mean absolute percentage error [MAPE]) are used to evaluate the outcomes of models.
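As a hedged sketch (toy consumption figures, not the Grey Nuns records), the four statistical measures used to score the models can be computed as follows:

```python
def r2(actual, pred):
    """Determination coefficient: share of variance explained by the forecast."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, pred))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

def rmse(actual, pred):
    """Root mean squared error, in the units of the data (kWh here)."""
    return (sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)) ** 0.5

def mae(actual, pred):
    """Mean absolute error, in the units of the data."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    """Mean absolute percentage error, in per cent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

# Toy monthly consumption (kWh) vs. hypothetical forecasts:
actual = [200.0, 220.0, 250.0, 240.0]
pred = [210.0, 215.0, 245.0, 250.0]
print(round(rmse(actual, pred), 2), round(mae(actual, pred), 2),
      round(mape(actual, pred), 2), round(r2(actual, pred), 3))
# 7.91 7.5 3.36 0.831
```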
Findings
The results reveal that the CNN model outperforms the other models' predictions for 6 and 12 months ahead. It improves on the performance metrics reported by the LSTM and MLP models in terms of R2, RMSE, MAE and MAPE by more than 4%, 6%, 42% and 46%, respectively. The proposed model therefore uses the available data to predict electricity consumption 6 and 12 months ahead. In June and December 2022, the overall electricity consumption is estimated to be 195,312 kWh and 254,737 kWh, respectively.
Originality/value
This study discusses the development of an effective time-series model that can forecast future electricity consumption in a Canadian heritage building. Deep learning techniques are being used for the first time to anticipate the electricity consumption of the Grey Nuns building in Canada. Additionally, it evaluates the effectiveness of deep learning and machine learning methods for predicting electricity consumption using established performance indicators. Recognizing electricity consumption in buildings is beneficial for utility providers, facility managers and end users by improving energy and environmental efficiency.