Search results
1 – 10 of 132
Anirudh Singh and Madhumita Chakraborty
This paper analyzes how air pollution and the public attention to it influence the returns of stocks in the Indian context.
Abstract
Purpose
This paper analyzes how air pollution and the public attention to it influence the returns of stocks in the Indian context.
Design/methodology/approach
The study uses firm-level data for stocks listed on the National Stock Exchange of India. Air quality is measured using the Air Quality Index (AQI) values provided by the US Embassy and Consulates' Air Quality Monitor in India. The Google Search Volume Index (GSVI) of relevant search terms serves as the measure of public attention. Appropriate regression models are used to examine how AQI and attention influence stock returns.
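As a reading aid only, a minimal sketch of the kind of return regression described above, assuming hypothetical column names (ret, aqi, gsvi, mkt_ret, firm_id) and an invented input file; the paper's actual specification and controls are not reproduced here.

# Illustrative sketch; all names and the clustering choice are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("firm_day_panel.csv")  # hypothetical firm-day panel of NSE stocks

# Interact air quality with attention, since the abstract stresses their joint effect.
model = smf.ols("ret ~ aqi * gsvi + mkt_ret", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["firm_id"]}
)
print(model.summary())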
Findings
It is observed that degrading air quality alone is unable to explain stock returns; it is the combined effect of increasing AQI and the subsequent rise in associated public attention that negatively impacts returns. Returns of firms with a poor environment component in their environmental, social and governance (ESG) scores are more negatively affected than those of firms with higher environment scores.
Practical implications
Investors can make use of this knowledge to formulate effective trading strategies and improve their chances of profitability in the share market.
Originality/value
To the knowledge of the authors, no earlier study has investigated the effects of AQI and attention together to explain stock price movements. The study is conducted in the Indian context, providing a unique opportunity to study the behavioral impact of these effects in one of the fastest-growing global economies, which is also plagued by an alarming increase in ambient air pollution.
Xiaoxia Zhang, Jin Zhang, Peiyan Du and Guohe Wang
In this paper, the brain potential changes caused by touching fabrics for handle evaluation were recorded using the event-related potential (ERP) method and compared with subjective…
Abstract
Purpose
In this paper, the brain potential changes caused by touching fabrics during handle evaluation were recorded using the event-related potential (ERP) method and compared with subjective evaluation scores and the physical indices of the KES (Kawabata Evaluation System), in order to explore the cognitive mechanism by which tactile sensations evoked by subtle mechanical stimuli, such as fabric material, texture, density and morphology, are transformed into neural impulses. By combining the subjective evaluation of fabric tactile sensation, the objective physical properties of fabrics and objective neurobiological signals, the study explores the neurophysiological mechanism of tactile cognition as well as the signal characteristics and time course of tactile information processing.
Design/methodology/approach
ERP technology was first proposed by the British psychologist Grey Walter. It is a noninvasive brain-imaging technique in which the recorded potential changes are related to physical and mental activity. ERP differs from electroencephalography (EEG) and evoked potentials (EP) in that it can record not only the physical information of a stimulus transmitted to the brain, but also the responses to psychological activities related to attention, identification, comparison, memory, judgment and cognition, as well as the neurophysiological changes caused by the cognitive processing of the stimulus.
Findings
According to the potential changes in the cerebral cortex evoked by touching four types of silk fabrics, the human brain receives the physical stimulation in the early stage (around 50 ms) of fabric handle evaluation, and the amplitude of the P50 component showed a negative correlation with the smoothness sensation of the fabrics. Around 200 ms after tactile stimulus onset, the amplitude of the P200 component showed a positive correlation with the softness sensation of the silk fabrics. The relationship between the amplitude of P300 and the sensations of smoothness and softness requires further evidence.
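For orientation only, a minimal sketch of how such amplitude–rating correlations are typically computed; the numbers below are invented placeholders, not the paper's data.

# Hypothetical per-fabric means, for illustration only.
import numpy as np
from scipy.stats import pearsonr

p50_amplitude = np.array([4.1, 3.6, 2.9, 2.2])      # mean P50 amplitude per silk fabric
smoothness_rating = np.array([2.0, 2.8, 3.5, 4.3])  # mean subjective smoothness score

r, p = pearsonr(p50_amplitude, smoothness_rating)
print(f"r = {r:.2f}, p = {p:.3f}")  # a negative r would mirror the reported P50 finding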
Originality/value
The brain potential changes caused by touching fabrics during handle evaluation were recorded using the event-related potential (ERP) method and compared with subjective evaluation scores and the physical indices of the KES. The results show that the maximum amplitude of the P50 component evoked by fabric touching is related to the perceived smoothness and roughness of the fabrics, which means that in the early stage of tactile processing, rougher fabrics arouse more attention. In addition, the amplitude of the P200 component shows a positive correlation with the softness sensation of the silk fabrics.
Winston T. Su, Zach W.Y. Lee, Xinming He and Tommy K.H. Chan
The global market for cloud gaming is growing rapidly. How gamers evaluate the service quality of this emerging form of cloud service has become a critical issue for both…
Abstract
Purpose
The global market for cloud gaming is growing rapidly. How gamers evaluate the service quality of this emerging form of cloud service has become a critical issue for both researchers and practitioners. Building on the literature on service quality and software as a service, this study develops and validates a gamer-centric measurement instrument for cloud gaming service quality.
Design/methodology/approach
A three-step measurement instrument development process, including item generation, scale development and instrument testing, was adopted to conceptualize and operationalize cloud gaming service quality.
Findings
Cloud gaming service quality consists of two second-order constructs of support service quality and technical service quality with seven first-order dimensions, namely rapport, responsiveness, reliability, compatibility, ubiquity, smoothness and comprehensiveness. The instrument exhibits desirable psychometric properties.
Practical implications
Practitioners can use this new measurement instrument to evaluate gamers' perceptions toward their service and to identify areas for improvement.
Originality/value
This study contributes to the service quality literature by utilizing qualitative and quantitative approaches to develop and validate a new measurement instrument of service quality in the context of cloud gaming and by identifying new dimensions (compatibility, ubiquity, smoothness and comprehensiveness) specific to it.
J.I. Ramos and Carmen María García López
The purpose of this paper is to analyze numerically the blowup in finite time of the solutions to a one-dimensional, bidirectional, nonlinear wave model equation for the…
Abstract
Purpose
The purpose of this paper is to analyze numerically the blowup in finite time of the solutions to a one-dimensional, bidirectional, nonlinear wave model equation for the propagation of small-amplitude waves in shallow water, as a function of the relaxation time, linear and nonlinear drift, power of the nonlinear advection flux, viscosity coefficient, viscous attenuation, and amplitude, smoothness and width of three types of initial conditions.
Design/methodology/approach
An implicit, first-order accurate in time, finite difference method valid for semipositive relaxation times has been used to solve the equation in a truncated domain for three different initial conditions, a first-order time derivative initially equal to zero and several constant wave speeds.
Findings
The numerical experiments show a very rapid transient from the initial conditions to the formation of a leading propagating wave, whose duration depends strongly on the shape, amplitude and width of the initial data as well as on the coefficients of the bidirectional equation. The blowup times for the triangular conditions have been found to be larger than those for the Gaussian ones, and the latter are larger than those for rectangular conditions, thus indicating that the blowup time decreases as the smoothness of the initial conditions decreases. The blowup time has also been found to decrease as the relaxation time, degree of nonlinearity, linear drift coefficient and amplitude of the initial conditions are increased, and as the width of the initial condition is decreased, but it increases as the viscosity coefficient is increased. No blowup has been observed for relaxation times smaller than one-hundredth, viscosity coefficients larger than ten-thousandths, quadratic and cubic nonlinearities, and initial Gaussian, triangular and rectangular conditions of unity amplitude.
Originality/value
The blowup of a one-dimensional, bidirectional equation that is a model for the propagation of waves in shallow water, longitudinal displacement in homogeneous viscoelastic bars, nerve conduction, nonlinear acoustics and heat transfer in very small devices and/or at very high transfer rates has been determined numerically as a function of the linear and nonlinear drift coefficients, power of the nonlinear drift, viscosity coefficient, viscous attenuation, and amplitude, smoothness and width of the initial conditions for nonzero relaxation times.
Douglas Ramalho Queiroz Pacheco
This study aims to propose and numerically assess different ways of discretising a very weak formulation of the Poisson problem.
Abstract
Purpose
This study aims to propose and numerically assess different ways of discretising a very weak formulation of the Poisson problem.
Design/methodology/approach
We use integration by parts twice to shift smoothness requirements to the test functions, thereby allowing low-regularity data and solutions.
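For orientation, a standard way of writing such a very weak (primal) formulation for the Poisson problem with homogeneous Dirichlet data; this is an assumed illustrative setting, not necessarily the paper's exact formulation.

% Multiplying -\Delta u = f by a test function and integrating by parts twice
% moves all derivatives onto the test function:
\[
  \text{find } u \in L^2(\Omega): \quad
  -\int_\Omega u \,\Delta v \,\mathrm{d}x = \int_\Omega f\, v \,\mathrm{d}x
  \quad \text{for all } v \in H^2(\Omega)\cap H^1_0(\Omega).
\]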
Findings
Various conforming discretisations are presented and tested, with numerical results indicating good accuracy and stability in different types of problems.
Originality/value
This is one of the first articles to propose and test concrete discretisations for very weak variational formulations in primal form. The numerical results, which include a problem based on real MRI data, indicate the potential of very weak finite element methods for tackling problems with low regularity.
Claudio Columbano, Lucia Biondi and Enrico Bracci
This paper aims to contribute to the debate over the desirability of introducing an accrual-based accounting system in the public sector by examining whether accrual-based…
Abstract
Purpose
This paper aims to contribute to the debate over the desirability of introducing an accrual-based accounting system in the public sector by examining whether accrual-based accounting information is superior to cash-based information in the context of public sector entities.
Design/methodology/approach
This paper applies a quantitative research method to assess the degree of smoothness and relevance of the accrual components of income recorded by 302 entities of the Italian National Health Service (INHS) over the period 2014–2020.
Findings
The analysis reveals that net income is smoother than cash flows as a summary measure of economic results and that accounting for accruals improves the predictability of future cash flows. However, the authors' novel disaggregation of accrual accounts reveals that those accounts that contribute the most to making income smoother than cash flows – noncurrent assets and liabilities – are also those that contribute the least to predicting future cash flows.
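A minimal sketch, under assumed column names rather than the authors' exact specification, of how income smoothness and the predictive relevance of accruals are often operationalized:

# Hypothetical entity-year panel; file and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("inhs_entity_year.csv").sort_values(["entity_id", "year"])

# Smoothness: net income is "smoother" when it varies less than operating cash flow.
smoothness = panel.groupby("entity_id").apply(
    lambda g: g["net_income"].std() / g["cash_flow"].std()
)

# Relevance: do accruals add predictive power for next-year cash flows?
panel["cash_flow_next"] = panel.groupby("entity_id")["cash_flow"].shift(-1)
model = smf.ols("cash_flow_next ~ cash_flow + accruals", data=panel.dropna()).fit()
print(smoothness.median(), model.rsquared)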
Originality/value
The disaggregation of accrual accounts makes it possible to identify the sources of the informational benefits of accrual accounting and to document the existence of an informational “trade-off” between smoothness and relevance in the context of public sector entities.
Wenxue Wang, Qingxia Li and Wenhong Wei
Community detection of dynamic networks provides more effective information than static network community detection in the real world. The mainstream method for community…
Abstract
Purpose
Community detection in dynamic networks provides more effective information than static network community detection in the real world. The mainstream method for community detection in dynamic networks is evolutionary clustering, which uses the temporal smoothness of community structures to connect snapshots of networks in adjacent time intervals. However, error accumulation limits the effectiveness of evolutionary clustering. While a multi-objective evolutionary approach can avoid the fixed settings of the two objective-function weight parameters in the evolutionary clustering framework, the traditional multi-objective evolutionary approach lacks self-adaptability.
Design/methodology/approach
This paper proposes a community detection algorithm that integrates evolutionary clustering and decomposition-based multi-objective optimization methods. In this approach, a benchmark correction procedure is added to the evolutionary clustering framework to prevent the division results from drifting.
Findings
Experimental results demonstrate the superior accuracy of this method compared to similar algorithms in both real and synthetic dynamic datasets.
Originality/value
To enhance the clustering results, adaptive variances and crossover probabilities are designed based on the relative change amounts of the subproblems decomposed by MOEA/D (A Multiobjective Optimization Evolutionary Algorithm based on Decomposition) to dynamically adjust the focus of different evolutionary stages.
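A rough sketch of this adaptive-parameter idea, not the paper's implementation; the scaling rule and constants below are assumptions for illustration.

# Sketch: scale variation operators by each MOEA/D subproblem's relative change.
import numpy as np

def adaptive_rates(prev_fit, curr_fit, base_cr=0.9, base_f=0.5, eps=1e-12):
    # Subproblems that are still improving keep stronger variation;
    # nearly converged subproblems get gentler crossover/mutation.
    rel_change = np.abs(curr_fit - prev_fit) / (np.abs(prev_fit) + eps)
    scale = rel_change / (rel_change.max() + eps)
    return base_cr * (0.5 + 0.5 * scale), base_f * (0.5 + 0.5 * scale)

# Example with hypothetical per-subproblem fitness values before and after one generation.
cr, f = adaptive_rates(np.array([1.00, 0.80, 0.60]), np.array([0.90, 0.79, 0.30]))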
Hongping Xing, Yu Liu and Xiaodan Sun
When an earthquake induces vibrations in high-speed railway (HSR) bridges, the smoothness of the rail on the bridge may exceed the allowable standard, which may threaten the safety…
Abstract
Purpose
When an earthquake induces vibrations in high-speed railway (HSR) bridges, the smoothness of the rail on the bridge may exceed the allowable standard, which may threaten the safety of running trains. However, few studies have evaluated the probability of rail displacement exceeding the allowable standard. The purposes of this article are to provide a method for investigating the exceeding probability of rail displacement of HSRs under seismic excitation and to calculate that probability.
Design/methodology/approach
In order to investigate the exceeding probability of rail displacement under different seismic excitations, a workflow for analyzing the smoothness of the rail based on incremental dynamic analysis (IDA) is proposed, and the intensity measure and limit state for the exceeding probability analysis of HSRs are defined. A finite element model (FEM) of an assumed HSR track-bridge system is then constructed, comprising a five-span simply-supported girder bridge supporting a finite-length CRTS II ballastless track. Under different seismic excitations, the seismic displacement response of the rail is calculated, the characteristics of the rail displacement are analyzed, and the probability of the vertical rail displacement exceeding the allowable standard (2 mm) is investigated.
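For illustration only, a minimal sketch of the final exceedance step implied above, using invented peak-displacement values rather than the paper's IDA results.

import numpy as np

# Hypothetical peak vertical rail displacements (mm) from IDA runs at one intensity level.
peak_disp_mm = np.array([1.2, 2.4, 0.9, 2.1, 1.8, 3.0, 1.5, 2.6])
limit_mm = 2.0  # allowable standard cited in the abstract
p_exceed = float(np.mean(peak_disp_mm > limit_mm))
print(f"P(displacement > {limit_mm} mm) = {p_exceed:.2f}")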
Findings
The results show that: (1) the bridge-abutment joint position may form a step-like irregularity under seismic excitation, threatening the running safety of high-speed trains, and the rail displacements at mid-span positions are larger than those at other positions on the bridge; (2) the exceeding probability of rail displacement is up to about 44% when PGA = 0.01 g, which corresponds to the level-five risk probability and can be described as 'very likely to happen'; and (3) the exceeding probability of the rail at mid-span positions is larger than that at other positions of the bridge, so the mid-span positions of the track-bridge system may be the most hazardous areas for the running safety of trains when high-speed trains run on bridges under seismic excitation.
Originality/value
The work extends the seismic hazard analysis of HSRs, leading to a better understanding of the exceeding probability of rail displacement under seismic excitation and providing better references for the alerting of HSR operations.
Johnny Kwok Wai Wong, Mojtaba Maghrebi, Alireza Ahmadian Fard Fini, Mohammad Amin Alizadeh Golestani, Mahdi Ahmadnia and Michael Er
Images taken from construction site interiors often suffer from low illumination and poor natural colors, which restrict their application for high-level site management purposes…
Abstract
Purpose
Images taken from construction site interiors often suffer from low illumination and poor natural colors, which restrict their application for high-level site management purposes. State-of-the-art low-light image enhancement methods provide promising image enhancement results. However, they generally require a longer execution time to complete the enhancement. This study aims to develop a refined image enhancement approach to improve execution efficiency and performance accuracy.
Design/methodology/approach
To develop the refined illumination enhancement algorithm, named enhanced illumination quality (EIQ), a quadratic expression was first added to the initial illumination map. Subsequently, an adjusted weight matrix was added to improve the smoothness of the illumination map. A coordinate descent optimization algorithm was then applied to minimize the processing time. Gamma correction was also applied to further enhance the illumination map. Finally, a frame comparison and averaging method was used to identify interior site progress.
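As a simplified sketch of the general illumination-map idea only (max-RGB initialization plus gamma correction), not the EIQ algorithm itself; the quadratic expression, weight-matrix refinement and coordinate-descent steps are omitted.

import numpy as np

def enhance_low_light(image, gamma=0.6, eps=1e-3):
    # image: float RGB array scaled to [0, 1]
    illumination = image.max(axis=2, keepdims=True)           # initial illumination map
    illumination = np.clip(illumination, eps, 1.0) ** gamma   # gamma-corrected map
    return np.clip(image / illumination, 0.0, 1.0)            # Retinex-style enhancement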
Findings
The proposed refined approach took around 4.36–4.52 s to achieve the expected results while outperforming the current low-light image enhancement method. EIQ demonstrated a lower lightness-order error and provided higher object resolution in the enhanced images. EIQ also achieved a higher structural similarity index and peak signal-to-noise ratio, indicating better image reconstruction performance.
Originality/value
The proposed approach provides an alternative that shortens the execution time, improves the equalization of the illumination map and provides better image reconstruction. The approach could be applied to low-light video enhancement tasks and to other dark or poor-quality jobsite images for object detection processes.
Mohamed Marzouk and Mohamed Zaher
Facility management gained profound importance due to the increasing complexity of different systems and the cost of operation and maintenance. However, due to the increasing…
Abstract
Purpose
Facility management has gained profound importance due to the increasing complexity of different systems and the cost of operation and maintenance. This increasing complexity also means that facility managers may suffer from a lack of information. The purpose of this paper is to propose a new facility management approach that links segmented assets to the vital data required for managing facilities.
Design/methodology/approach
Automatic point cloud segmentation is one of the most crucial processes required for modelling building facilities. In this research, laser scanning is used for point cloud acquisition. The research utilises the region-growing algorithm, the colour-based region-growing algorithm and the Euclidean cluster extraction algorithm.
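A rough Python analogue of Euclidean cluster extraction, for illustration only; the paper's laser-scanning pipeline is not reproduced, and the file name and tolerances are assumptions.

import numpy as np
from sklearn.cluster import DBSCAN

points = np.loadtxt("interior_scan.xyz")  # hypothetical N x 3 point cloud from a laser scan
# DBSCAN with a small distance tolerance behaves much like Euclidean distance-based clustering.
labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(points)
print(f"{labels.max() + 1} segments found, {int(np.sum(labels == -1))} points left unassigned")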
Findings
A case study is worked out to test the accuracy of the considered point cloud segmentation algorithms using the precision, recall and F-score metrics. The results indicate that Euclidean cluster extraction and the region-growing algorithm achieved high segmentation accuracy.
Originality/value
The research presents a comparative approach for selecting the most appropriate segmentation approach required for accurate modelling. As such, the segmented assets can be linked easily with the data required for facility management.