Search results

1 – 10 of over 9000
Article
Publication date: 12 October 2010

Hong‐jun Li, Zhi‐min Zhao and Xiao‐lei Yu

Abstract

Purpose

Traditional total variation (TV) models in the wavelet domain apply thresholding directly in coefficient selection, and the Gibbs phenomenon is known to appear. However, the nonzero coefficient index set selected by hard thresholding may not be the best choice for obtaining the least oscillatory reconstructions near edges. This paper aims to propose an image denoising method based on TV and grey theory in the wavelet domain to address this shortcoming of traditional methods.

Design/methodology/approach

In this paper, the authors divide the wavelet domain into two parts, a low-frequency area and a high-frequency area, and apply different methods in each. Grey theory is applied to wavelet coefficient selection. The new algorithm provides a new method of selecting wavelet coefficients, resolves the sorting of nonzero coefficients, and achieves a good image denoising result while reducing the Gibbs phenomenon.
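The split-and-threshold pipeline described above can be sketched with a one-level Haar transform: the low-frequency subband is left untouched while the high-frequency subband is soft-thresholded. This is a minimal illustration of the conventional wavelet-thresholding baseline the paper builds on; the grey-theory coefficient selection itself is not reproduced, and all function names are assumptions.

```python
import numpy as np

def haar_decompose(signal):
    """One-level Haar transform: split an even-length signal into
    low-frequency (approximation) and high-frequency (detail) subbands."""
    even, odd = signal[::2], signal[1::2]
    low = (even + odd) / np.sqrt(2.0)    # approximation subband
    high = (even - odd) / np.sqrt(2.0)   # detail subband
    return low, high

def haar_reconstruct(low, high):
    """Invert the one-level Haar transform."""
    even = (low + high) / np.sqrt(2.0)
    odd = (low - high) / np.sqrt(2.0)
    out = np.empty(low.size + high.size)
    out[::2], out[1::2] = even, odd
    return out

def denoise(signal, threshold):
    """Keep the low-frequency subband intact and soft-threshold the
    high-frequency subband, where most of the noise energy lives."""
    low, high = haar_decompose(signal)
    high = np.sign(high) * np.maximum(np.abs(high) - threshold, 0.0)
    return haar_reconstruct(low, high)
```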

Findings

The results show that the method proposed in this paper can distinguish between the information of image and noise accurately and also reduce the Gibbs artifacts. From the comparisons, the model proposed preserves the important information of the image very well and shows very good performance.

Originality/value

The proposed image denoising model introducing grey relation analysis in the wavelet coefficients selecting and modifying is original. The proposed model provides a viable tool to engineers for processing the image.

Details

Engineering Computations, vol. 27 no. 7
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 27 July 2021

Papangkorn Pidchayathanakorn and Siriporn Supratid

Abstract

Purpose

A major success factor for proficient Bayes threshold denoising is noise variance estimation. This paper focuses on assessing different noise variance estimations in three Bayes threshold models on two brain lesion/tumor magnetic resonance images (MRIs) with different characteristics.

Design/methodology/approach

Here, three Bayes threshold denoising models based on different noise variance estimations in the stationary wavelet transform (SWT) domain are assessed and compared with state-of-the-art non-local means (NLMs). The three models, D1, GB and DR, depend respectively on the detail wavelet subband at the first resolution level, on all detail subbands globally, and on the detail subband in each direction and resolution. Explicit and implicit denoising performance are assessed in turn through threshold denoising results and segmentation identification results.
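As a rough sketch of what such a noise-variance-driven Bayes threshold looks like, the snippet below combines the classic median-absolute-deviation noise estimate from a detail subband with the BayesShrink rule T = σ_n²/σ_x. This is a generic textbook formulation under assumed names, not the paper's D1, GB or DR implementations.

```python
import numpy as np

def noise_sigma_mad(detail):
    """Robust noise estimate from a detail subband (Donoho's MAD rule):
    sigma ~= median(|d|) / 0.6745."""
    return np.median(np.abs(detail)) / 0.6745

def bayes_threshold(subband, noise_sigma):
    """BayesShrink threshold T = sigma_n^2 / sigma_x, where sigma_x is the
    estimated standard deviation of the noise-free signal in the subband."""
    total_var = np.mean(subband ** 2)
    signal_var = max(total_var - noise_sigma ** 2, 0.0)
    if signal_var == 0.0:
        return np.max(np.abs(subband))  # subband is all noise: kill it
    return noise_sigma ** 2 / np.sqrt(signal_var)
```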

Findings

In the implicit performance assessment, GB and DR yield the best and second-best accuracy, with Dice similarity coefficients (Dice) of 0.9181 and 0.9048, respectively. Reliability is indicated by DR's 45.66% Dice drop when the noise level on the brain lesion MRI increases from 0.2 to 0.9, compared with 53.38%, 61.03% and 35.48% for D1, GB and NLMs. For the brain tumor MRI at a 0.2 noise level, DR achieves the best accuracy of 0.9592 Dice; however, its Dice drops by 8.09%, relative to 6.72%, 8.85% and 39.36% for D1, GB and NLMs. NLMs clearly exhibits the lowest explicit and implicit denoising performance.

Research limitations/implications

A future improvement in denoising performance could come from a semi-supervised denoising conjunction model. Such a model would use the denoised MRIs produced by the DR and D1 thresholding models as the uncorrupted image versions, together with the noisy MRIs as the corrupted versions, during the autoencoder training phase, in order to reconstruct the original clean image.

Practical implications

This paper should be of interest to readers in the areas of technologies of computing and information science, including data science and applications, computational health informatics, especially applied as a decision support tool for medical image processing.

Originality/value

In most cases, DR and D1 provide the best and second-best implicit performance in terms of accuracy and reliability on both the simulated, low-detail, small-size region-of-interest (ROI) brain lesion MRIs and the realistic, high-detail, large-size ROI brain tumor MRIs.

Article
Publication date: 23 August 2013

Li Hong‐jun, Hu Wei, Xie Zheng‐guang and Wang Wei

Abstract

Purpose

The paper aims to extend research on grey relational analysis applied to the wavelet transform and proposes a grey relational threshold algorithm for image denoising. The study tries to suppress noise while retaining edges and important structures as much as possible.

Design/methodology/approach

The paper analyzes how noise and edges are distributed across the different subbands, then uses grey relational values to quantify the relationship between scale, direction and noise deviation. Taking these grey relational values of scale, direction and noise deviation as influencing factors, it proposes a grey relational threshold algorithm.
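The grey relational values mentioned above can be illustrated with Deng's standard grey relational analysis, in which each comparison sequence is scored against a reference sequence and the distinguishing coefficient ζ is conventionally 0.5. This is a generic GRA sketch under assumed names, not the authors' threshold formula.

```python
import numpy as np

def grey_relational_grades(reference, candidates, zeta=0.5):
    """Deng's grey relational analysis: grade of each candidate sequence
    against the reference. d_min/d_max are taken over ALL candidates,
    as in the standard formulation; zeta is the distinguishing coefficient."""
    ref = np.asarray(reference, float)
    cands = np.asarray(candidates, float)
    delta = np.abs(cands - ref)            # shape (n_candidates, n_points)
    d_min, d_max = delta.min(), delta.max()
    if d_max == 0.0:
        return np.ones(len(cands))         # all candidates identical to ref
    coeff = (d_min + zeta * d_max) / (delta + zeta * d_max)
    return coeff.mean(axis=1)              # grade = mean coefficient
```

A higher grade means the candidate sequence tracks the reference more closely, which is the basis for ranking coefficients or influence factors.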

Findings

Using grey relational analysis in threshold setting offers an advantage in image denoising; the simulation results confirm this in terms of both visual effect and peak signal-to-noise ratio (PSNR).

Originality/value

This paper applies grey relational theory to image denoising and proposes a grey relational threshold algorithm, providing a novel method for image denoising.

Details

Grey Systems: Theory and Application, vol. 3 no. 2
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 13 May 2020

Scott B. Beyer, J. Christopher Hughen and Robert A. Kunkel

Abstract

Purpose

The authors examine the relation between noise trading in equity markets and stochastic volatility by estimating a two-factor jump diffusion model. Their analysis shows that contemporaneous price deviations in the derivatives market are statistically significant in explaining movements in index futures prices and option-market volatility measures.

Design/methodology/approach

To understand the impact noise may have in the S&P 500 derivatives market, the authors first measure and evaluate the influence noise exerts on futures prices and then investigate its influence on option volatility.
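As a hedged illustration of the kind of model being estimated, the following Euler scheme simulates a generic stochastic-volatility jump diffusion (a Heston-style variance process plus Poisson jumps). The parameter values are purely illustrative defaults and the specification is not the authors' exact two-factor noise model.

```python
import numpy as np

def simulate_jump_diffusion(s0=100.0, mu=0.05, v0=0.04, kappa=2.0,
                            theta=0.04, xi=0.3, lam=0.5, jump_mu=-0.05,
                            jump_sigma=0.1, n_steps=252, dt=1 / 252, seed=0):
    """Euler scheme for a stochastic-volatility jump diffusion:
    dS/S = mu dt + sqrt(v) dW1 + J dN,
    dv   = kappa (theta - v) dt + xi sqrt(v) dW2.
    All parameter values are illustrative, not estimates from the paper."""
    rng = np.random.default_rng(seed)
    s, v = np.empty(n_steps + 1), np.empty(n_steps + 1)
    s[0], v[0] = s0, v0
    for t in range(n_steps):
        dw1 = rng.standard_normal() * np.sqrt(dt)
        dw2 = rng.standard_normal() * np.sqrt(dt)
        jump = 0.0
        if rng.random() < lam * dt:                 # Poisson jump arrival
            jump = rng.normal(jump_mu, jump_sigma)  # normal log-jump size
        v[t + 1] = max(v[t] + kappa * (theta - v[t]) * dt
                       + xi * np.sqrt(v[t]) * dw2, 1e-8)  # keep variance > 0
        s[t + 1] = s[t] * np.exp((mu - 0.5 * v[t]) * dt
                                 + np.sqrt(v[t]) * dw1 + jump)
    return s, v
```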

Findings

In the period from 1996 to 2003, this study finds significant changes in the volatility and mean reversion in the noise level and a significant increase in its relation to implied volatility in option prices. The results are consistent with a bubble in technology stocks that occurred with significant increases in noise trading.

Research limitations/implications

This study provides estimates for this model during the periods preceding and during the technology bubble. The study analysis shows that the volatility and mean reversion in the noise level are much stronger during the bubble period. Furthermore, the relation between noise trading and implied volatility in the futures market was of a significantly larger magnitude during this period. The study results support the importance of noise trading in market bubbles.

Practical implications

Bloomfield, O'Hara and Saar (2009) find that noise traders lower bid–ask spreads and improve liquidity through increases in trading volume and market depth. Such improved market conditions could have positive effects on market quality, and this impact could be evidenced by lower implied volatility when noise traders are more active. Indeed, the results in this study indicate that the level and characteristics of noise trading are fundamentally different during the technology bubble, and this noise trading activity has a larger impact during this period on implied volatility in the options market.

Originality/value

This paper uniquely analyzes derivatives on the S&P 500 Index in order to detect the presence and influence of noise traders. The authors derive and implement a two-factor jump diffusion noise model. In their model, noise rectifies the difference of analysts' opinions, market information and beliefs among traders. By incorporating a reduced-form temporal expression of heterogeneities among traders, the model is rich enough to capture salient time-series characteristics of equity prices (i.e. stochastic volatility and jumps). A singular feature of the authors’ model is that stochastic volatility represents the random movements in asset prices that are attributed to nonmarket fundamentals.

Details

Managerial Finance, vol. 46 no. 9
Type: Research Article
ISSN: 0307-4358

Open Access
Article
Publication date: 17 January 2020

Erkki Kalervo Laitinen

Abstract

Purpose

The purpose of this study is to introduce a matching function approach to analyze matching in financial reporting.

Design/methodology/approach

The matching function is first analyzed analytically. It is specified as a multiplicative Cobb-Douglas-type function of three categories of expenses (labor expense, material expense and depreciation). The specified matching function is solved by the generalized reduced gradient method (GRG) for 10-year time series from 8,226 Finnish firms. The coefficient of determination of the logarithmic model (CODL) is compared with the linear revenue-expense correlation coefficient (REC) that is generally used in previous studies.
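Because the multiplicative Cobb-Douglas specification is linear in logarithms, a simplified stand-in for the GRG optimization is to estimate the elasticities by ordinary least squares on log revenue; CODL is then the R² of that logarithmic model. The function below is an illustrative sketch under assumed names, not the study's estimation procedure.

```python
import numpy as np

def fit_matching_function(revenue, labor, material, depreciation):
    """Fit log R = b0 + b1 log L + b2 log M + b3 log D by ordinary least
    squares (a simple stand-in for the paper's GRG optimization).
    Returns the expense elasticities (b1, b2, b3) and the coefficient of
    determination of the logarithmic model (CODL)."""
    y = np.log(np.asarray(revenue, float))
    X = np.column_stack([np.ones_like(y), np.log(labor),
                         np.log(material), np.log(depreciation)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    codl = 1.0 - resid.var() / y.var()   # R^2 of the log model
    return beta[1:], codl
```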

Findings

Empirical evidence showed that REC is outperformed by CODL. CODL was found independent of or weakly negatively dependent on the matching elasticity of labor expense, positively dependent on the material expense elasticity and negatively dependent on depreciation elasticity. Therefore, the differences in matching accuracy between industries emphasizing different expense categories are significant.

Research limitations/implications

The matching function is a general approach to assessing matching accuracy, but in this study it is specified multiplicatively for three categories of expenses. Moreover, only one algorithm is tested in the empirical estimation of the function. The analysis concentrates on ten-year time series from a limited sample of Finnish firms.

Practical implications

The matching function approach provides a large set of important information for examining the matching process in practice. It can also prove useful to accounting standard-setters and other specialists such as managers, consultants and auditors.

Originality/value

This study is the first study to apply the new matching function approach.

Details

Journal of Financial Reporting and Accounting, vol. 18 no. 1
Type: Research Article
ISSN: 1985-2517

Article
Publication date: 21 November 2022

Aissa Boucedra and Madani Bederina

Abstract

Purpose

This paper aims to characterize and develop a new ecological lightweight concrete reinforced by the addition of palm plant fibers (from vegetal waste) for use in the thermal and acoustical insulation of local constructions. Date palm fibers are characterized by their low sensitivity to chemical reactions, low cost and wide availability in local regions. The newly obtained lightweight concrete may therefore be of great interest, as it appears able to offer good technical, economic and ecological solutions to local construction problems.

Design/methodology/approach

The experimental program focused on developing the composition of palm-fiber-reinforced concrete by studying the effect of fiber length (10, 20, 30 and 40 mm) and mass percentage (0.5%, 1%, 1.5% and 2%) on the mechanical and acoustical properties of the composite. The main measured parameters were the compressive and flexural strength, the sound absorption coefficient and the noise reduction coefficient (NRC), among others. These tests were complemented by measurements of density and water absorption, as well as microstructure analyses: to fully appreciate the behavior of the material, observations under an optical microscope and scanning electron microscope analyses were carried out.
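For reference, the NRC reported in such tests is conventionally computed (e.g. per ASTM C423) as the mean of the sound absorption coefficients at 250, 500, 1000 and 2000 Hz, rounded to the nearest 0.05:

```python
def noise_reduction_coefficient(alpha_by_freq):
    """NRC per ASTM C423: mean of the sound absorption coefficients at
    250, 500, 1000 and 2000 Hz, rounded to the nearest 0.05.
    alpha_by_freq maps frequency (Hz) to absorption coefficient."""
    freqs = (250, 500, 1000, 2000)
    mean = sum(alpha_by_freq[f] for f in freqs) / 4.0
    return round(mean / 0.05) * 0.05
```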

Findings

The addition of plant fibers to concrete made it possible to formulate a new lightweight concrete with interesting properties. Adding date palm fibers significantly decreased the density of the concrete and consequently reduced its mechanical strength, particularly in compression. Acceptable compressive strength values were still obtained, depending on the fiber content, while better values were obtained in flexure. On the other hand, good acoustical performance was achieved: a considerable increase in the sound absorption coefficient and the NRC was recorded, depending on the content and length of the fibers. The rheological behavior also improved with the addition of fibers, but only with short fibers.

Originality/value

Over recent decades, many studies have sought more sustainable and environmentally friendly building materials. This work therefore studies the possibility of using waste from date palm trees as fibers in concrete in place of conventionally used fibers. Although much research has already been conducted on the effect of palm plant fibers on the mechanical and physical properties of concrete, no information is available on either the formulation of this type of concrete or its acoustical properties. Given the scarcity of raw materials and excessive energy consumption, the use of plant fibers as natural, renewable resources is very attractive. This is therefore a major project in waste recycling and the recovery of local materials.

Details

World Journal of Engineering, vol. 21 no. 1
Type: Research Article
ISSN: 1708-5284

Article
Publication date: 1 September 1996

A. Caddemi and M. Sannino

Abstract

Noise parameters of high electron mobility transistors (HEMTs) at microwave frequencies are a subject of active research, since knowledge of their performance is of key importance when using these devices to design low-noise amplifiers. Employs a simple noise model to derive analytical expressions for the device noise parameters F0, Γ0 and N in terms of the electrical elements of the basic equivalent circuit of an HEMT. Analyses these expressions to establish some fundamental relationships, as well as the expected noise performance of the device when parasitic elements representing package effects are included.
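For context, the noise figure of a two-port as a function of the source reflection coefficient Γs is commonly written in the standard four-parameter form (Fmin, Γopt, Rn, Z0) shown below; the paper's (F0, Γ0, N) set is an alternative parameterization of the same dependence. The helper is an illustrative sketch, not the paper's derivation.

```python
def noise_figure(gamma_s, f_min, gamma_opt, r_n, z0=50.0):
    """Noise figure versus source reflection coefficient, in the standard
    four-parameter form:
    F = Fmin + (4 Rn / Z0) |Gs - Gopt|^2 / [(1 - |Gs|^2) |1 + Gopt|^2]."""
    num = 4.0 * r_n / z0 * abs(gamma_s - gamma_opt) ** 2
    den = (1.0 - abs(gamma_s) ** 2) * abs(1.0 + gamma_opt) ** 2
    return f_min + num / den
```

The minimum F = Fmin is reached exactly at Γs = Γopt, and F grows as the source match moves away from the optimum.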

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 15 no. 3
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 1 February 2018

Chunhua Ren, Xiaoming Hu, Poyun Qin, Leilei Li and Tong He

Abstract

Purpose

Measurement-while-drilling (MWD) systems provide trajectory and inclination parameters of oil and gas wells. The fluxgate magnetometer is a traditional choice for an MWD system; however, it cannot obtain effective trajectory parameters in nonmagnetic environments. The fiber-optic gyroscope (FOG) inclinometer system is a favorable substitute for the fluxgate magnetometer, as it avoids the flaws associated with magnetic monitoring devices; however, it suffers from limitations and growing survey errors under high-impact conditions. This paper aims to overcome these imperfections of the FOG inclinometer system.

Design/methodology/approach

To overcome the imperfections, filtering algorithms are used to improve the precision of the equipment. The authors compare the low-pass filtering algorithm with the wavelet de-noising algorithm applied to real experimental data. Quantitative comparison of the error between the true and processed signal revealed that the wavelet de-noising method outperformed the low-pass filtering method. To achieve optimal positioning effects, the wavelet de-noising algorithm is finally used to inhibit the interference caused by the impact.
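The quantitative comparison described above reduces to an error metric between the true and processed signals. A minimal sketch, with a moving-average low-pass filter as one of the two candidates and function names that are assumptions, is:

```python
import numpy as np

def lowpass_moving_average(signal, window=5):
    """Simple FIR low-pass: centered moving average of the given window."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def rmse(reference, processed):
    """Root-mean-square error between the true and processed signal,
    the kind of quantitative comparison described above."""
    diff = np.asarray(reference, float) - np.asarray(processed, float)
    return float(np.sqrt(np.mean(diff ** 2)))
```

Computing `rmse` for the low-pass output and for a wavelet-denoised output over the same ground-truth segment reproduces the style of comparison the authors describe.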

Findings

The experimental results show that the proposed method keeps the azimuth error within ±2 degrees and the inclination error within ±0.15 degrees under interval-impact conditions. The method overcomes the interference generated by impact in the well, which makes the instrument suitable for measurement in small-diameter cased wells.

Originality/value

After conducting the wavelet threshold filtering on the raw data of accelerometers, the noise generated by the impact is successfully suppressed, which is expected to meet the special requirement of the down-hole survey environment.

Details

Sensor Review, vol. 38 no. 3
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 5 June 2020

Hiren Mewada, Amit V. Patel, Jitendra Chaudhari, Keyur Mahant and Alpesh Vala

Abstract

Purpose

In clinical analysis, medical image segmentation is an important step to study the anatomical structure. This helps to diagnose and classify abnormality in the image. The wide variations in the image modality and limitations in the acquisition process of instruments make this segmentation challenging. This paper aims to propose a semi-automatic model to tackle these challenges and to segment medical images.

Design/methodology/approach

The authors propose Legendre polynomial-based active contour to segment region of interest (ROI) from the noisy, low-resolution and inhomogeneous medical images using the soft computing and multi-resolution framework. In the first phase, initial segmentation (i.e. prior clustering) is obtained from low-resolution medical images using fuzzy C-mean (FCM) clustering and noise is suppressed using wavelet energy-based multi-resolution approach. In the second phase, resultant segmentation is obtained using the Legendre polynomial-based level set approach.
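The prior-clustering step in the first phase can be sketched with plain fuzzy C-means on intensities; the wavelet multi-resolution stage and the Legendre-polynomial level set of the second phase are not reproduced here, and the implementation below is a generic textbook version under assumed names.

```python
import numpy as np

def fuzzy_c_means(data, n_clusters=2, m=2.0, n_iter=50):
    """Plain fuzzy C-means on 1-D intensities (a sketch of the prior
    clustering step only). m > 1 is the fuzzification exponent."""
    x = np.asarray(data, float).reshape(-1, 1)
    # Deterministic init: spread centers over the data quantiles.
    centers = np.quantile(x, (np.arange(n_clusters) + 0.5) / n_clusters)
    for _ in range(n_iter):
        dist = np.abs(x - centers) + 1e-12          # avoid division by zero
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)    # memberships sum to 1
        w = u ** m
        centers = (w * x).sum(axis=0) / w.sum(axis=0)
    return centers, u
```

`u` plays the role of the soft (fuzzy) segmentation that seeds the subsequent level-set evolution.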

Findings

The proposed model is tested on different medical images, such as x-ray images for brain tumor identification, magnetic resonance imaging (MRI), spine images, blood cells and blood vessels. A rigorous analysis of the model is carried out by calculating the improvement against noise, the required processing time and the accuracy of the segmentation. The comparative analysis concludes that the proposed model withstands noise and succeeds in segmenting any type of medical modality, achieving an average accuracy of 99.57%.

Originality/value

The proposed design is an improvement on the Legendre level set (L2S) model. The integration of FCM and the wavelet transform into L2S makes the model insensitive to noise and intensity inhomogeneity, and hence it succeeds in segmenting the ROI from a wide variety of medical images, even images that L2S fails to segment.

Details

Engineering Computations, vol. 37 no. 9
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 16 November 2018

Michael J. McCord, Sean MacIntyre, Paul Bidanset, Daniel Lo and Peadar Davis

Abstract

Purpose

Air quality, noise and proximity to urban infrastructure can arguably have an important impact on the quality of life. Environmental quality (the price of good health) has become a central tenet for consumer choice in urban locales when deciding on a residential neighbourhood. Unlike the market for most tangible goods, the market for environmental quality does not yield an observable per unit price effect. As no explicit price exists for a unit of environmental quality, this paper aims to use the housing market to derive its implicit price and test whether these constituent elements of health and well-being are indeed capitalised into property prices and thus implicitly priced in the market place.

Design/methodology/approach

A considerable number of studies have used hedonic pricing models incorporating spatial effects to assess the impact of air quality, noise and proximity to noise pollutants on property market pricing. This study presents a spatial analysis of air quality and noise pollution and their association with house prices, using 2,501 sale transactions from 2013. To assess the impact of the pollutants, three different spatial modelling approaches are used, namely, ordinary least squares with spatial dummies, a geographically weighted regression (GWR) and a spatial lag model (SLM).
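The first of the three approaches, OLS with spatial dummies, can be sketched as a hedonic regression in which zone indicator variables absorb location effects. The variable names and data shapes below are assumptions, not the authors' specification.

```python
import numpy as np

def hedonic_ols(log_price, attrs, zone_ids):
    """Hedonic price equation estimated by OLS with spatial (zone) dummies:
    log P = X b + zone effects. attrs is an (n, k) matrix of dwelling and
    environmental attributes (e.g. a pollution index); zone_ids labels the
    spatial area of each sale. Returns the attribute coefficients (implicit
    prices in log terms) and the zone fixed effects."""
    zones = np.unique(zone_ids)
    dummies = (np.asarray(zone_ids)[:, None] == zones).astype(float)
    X = np.column_stack([attrs, dummies])   # zone dummies absorb the intercept
    beta, *_ = np.linalg.lstsq(X, np.asarray(log_price, float), rcond=None)
    k = np.asarray(attrs).shape[1]
    return beta[:k], dict(zip(zones.tolist(), beta[k:].tolist()))
```

The coefficient on an environmental attribute is then read as its implicit (hedonic) price in log terms, which is the quantity of interest in the study.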

Findings

The findings suggest that air quality pollutants have an adverse impact on house prices, which fluctuate across the urban area. The analysis suggests that the noise level does matter, although this varies significantly over the urban setting and varies by source.

Originality/value

Air quality and environmental noise pollution are important concerns for health and well-being. Noise impact seems to depend not only on the noise intensity to which dwellings are exposed but also on the nature of the noise source. This may suggest the presence of other externalities that arouse social aversion. This research presents an original study utilising advanced spatial modelling approaches. The research has value in further understanding the market impact of environmental factors and in providing findings to support local air zone management strategies, noise abatement and management strategies and is of value to the wider urban planning and public health disciplines.

Details

Journal of European Real Estate Research, vol. 11 no. 3
Type: Research Article
ISSN: 1753-9269
