Search results
1 – 10 of 838
Igor Menezes, Ana Cristina Menezes, Elton Moraes and Pedro P. Pires
Abstract
Purpose
This study investigates organizational climate under the thriving at work perspective using a network approach. The authors demonstrate how organizational climate functions as a complex system and what relationships between variables from different dimensions are the most important to characterize the construct.
Design/methodology/approach
By surveying 119,266 workers from 284 companies based in Brazil, the authors estimated a Gaussian graphical model with LASSO regularization for the complete dataset and for two subsets of cases randomly drawn from the whole dataset. The walktrap algorithm was applied for community detection, and a strong model for measurement invariance was fit to test whether the organizational climate is perceived similarly across groups.
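The estimation step described here can be sketched in a few lines. This is an illustrative reconstruction on simulated data, not the authors' code (they additionally run walktrap community detection, available in, e.g., igraph, and test measurement invariance); all names and parameter values below are ours:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Simulated stand-in for survey data: 500 respondents x 9 climate items,
# drawn from a multivariate normal with three correlated blocks ("communities").
rng = np.random.default_rng(0)
cov = np.kron(np.eye(3), np.full((3, 3), 0.6)) + 0.4 * np.eye(9)
X = rng.multivariate_normal(np.zeros(9), cov, size=500)

# Gaussian graphical model with LASSO regularization: a sparse precision
# matrix, with the penalty strength chosen by cross-validation.
model = GraphicalLassoCV().fit(X)
P = model.precision_

# Network edge weights are the partial correlations -p_ij / sqrt(p_ii * p_jj);
# entries shrunk to (near) zero correspond to absent edges.
d = np.sqrt(np.diag(P))
partial_corr = -P / np.outer(d, d)
np.fill_diagonal(partial_corr, 0.0)
```

The resulting weighted network would then be passed to a community-detection routine such as walktrap.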
Findings
Results show that the networks estimated for both groups are quite consistent, with a similar number of communities and items detected. The same pattern was found for the expected influence of each item. Measurement invariance was confirmed, showing that organizational climate is perceived similarly in both groups. The most important community detected, whose items also have the highest levels of centrality, was organizational commitment, followed by a community centered around macro-organizational aspects covering cultural integrity, organizational agility and responsible leadership.
Research limitations/implications
Studies in the field have attested to the possibility of investigating the phenomenon from four (Campbell et al., 1970) to over 80 dimensions (Koys and DeCottis, 1991). As a result, since several dimensions have been produced to investigate organizational climate, there is no consensus on the quality and number of dimensions that should be considered to measure such a vast and multifaceted construct. Built on the thriving at work perspective, eight dimensions were devised to cover a wide range of characteristics that distinguish organizational climate, including those related to Industry 4.0 (Coetzee, 2019). However, one may argue that a few additional dimensions, namely social responsibility or diversity and inclusion, or even more items describing work-life balance, could expand the depth and breadth of the instrument and potentially trigger new associations that might eventually impose a new logic on the comprehension of climate as a system. Future studies combining the dimensions investigated in this study with other dimensions are therefore highly recommended for an even more comprehensive investigation.
Practical implications
The results of this investigation show how to apply psychological networks to gain insights into different variables and dimensions of organizational climate. These findings can be used for the development of organizational policies focused on the most relevant aspects of organizational climate. This information would allow organizations to go beyond simply describing the individual frequencies for each item and could even be used to create a weighted scoring model that could prioritize variables with higher levels of centrality.
Originality/value
To the authors’ knowledge, this is the first study that investigates organizational climate using psychological networks; it provides a better understanding of the relationships established between items from different dimensions as opposed to the common cause framework whose focus is on the investigation of dimensions separately.
Dimitrios Dimitriou, Eleftherios Goulas, Christos Kallandranis, Alexandros Tsioutsios and Thi Ngoc Bich Ta
Abstract
Purpose
This paper aims to examine potential diversification benefits between the Eurozone (i.e. EURO STOXX 50) and key Asia markets: HSI (Hong Kong), KOSPI (South Korea), NIKKEI 225 (Japan) and TSEC (Taiwan). The sample covers the period from 04-01-2008 to 19-10-2023 at daily frequency.
Design/methodology/approach
The empirical investigation is based on the wavelet coherence analysis, which is a localized correlation coefficient in the time and frequency domain.
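The idea of a correlation that is localized in both time and frequency can be illustrated with a minimal, self-contained Morlet-wavelet coherence computation. This is a didactic sketch (smoothing here is a plain moving average along time only; production analyses typically use dedicated packages with scale-dependent smoothing and significance testing), not the authors' implementation:

```python
import numpy as np

def morlet_cwt(x, scales, omega0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet."""
    out = np.empty((len(scales), len(x)), dtype=complex)
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        psi = np.pi ** -0.25 * np.exp(1j * omega0 * t / s - (t / s) ** 2 / 2)
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same") / np.sqrt(s)
    return out

def smooth(a, w=15):
    """Moving-average smoothing along the time axis."""
    k = np.ones(w) / w
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, a)

def wavelet_coherence(x, y, scales):
    """Localized squared correlation: rows are scales, columns are time."""
    wx, wy = morlet_cwt(x, scales), morlet_cwt(y, scales)
    num = np.abs(smooth(wx * np.conj(wy))) ** 2
    return num / (smooth(np.abs(wx) ** 2) * smooth(np.abs(wy) ** 2))

# Two series sharing a 24-sample cycle but phase-shifted: coherence at the
# matching scale is high even though the series are not identical.
t = np.arange(600)
x = np.sin(2 * np.pi * t / 24)
y = np.sin(2 * np.pi * t / 24 + 0.8)
coh = wavelet_coherence(x, y, scales=np.array([12.0, 24.0, 48.0]))
```

Low coherence at a given scale band would indicate diversification potential at that horizon.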
Findings
The results provide evidence that long-term diversification benefits exist between EURO STOXX and NIKKEI and between EURO STOXX and KOSPI (after 2015), and there are signs of such benefits for the pair EURO STOXX-TSEC (after 2014). In the short term, there are signs of diversification benefits throughout the sample period. In the medium term, however, the diversification benefits seem to diminish.
Originality/value
These results have crucial implications for investors regarding the benefits of international portfolio diversification.
Vivian W.Y. Tam and Khoa N. Le
Abstract
Purpose
Various methods have been used by organisations in the construction industry to improve quality, employing mainly two major techniques: management techniques, such as quality control, quality assurance and total quality management; and statistical techniques, such as cost of quality, customer satisfaction and the six sigma principle. The purpose of this paper is to show that it is possible to employ the six sigma principle in the field of construction management, provided that sufficient information on a particular population is obtained.
Design/methodology/approach
Statistical properties of the hyperbolic distribution are given and quality factors such as population in range, number of defects, yield percentage and defects per million opportunities are estimated. Graphical illustrations of the hyperbolic and Gaussian distributions are also given. From these, detailed comparisons of the two distributions are numerically obtained. The impacts of these quality factors are briefly discussed to give rough guidance to organisations in the construction industry on how to lower costs and improve project quality by prevention. A case study on a construction project is given in which it is shown that the hyperbolic distribution is better suited to the cost data than the Gaussian distribution. Cost and quality data of all projects in the company are collected over a period of eight years. Each project may consist of a number of phases, typically spanning about three months. Each phase can be considered as a member of the project population. Quality factors of this population are estimated using the six sigma principle.
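The quality factors named above follow directly from the fitted distribution's probability of falling inside the specification limits. A minimal sketch, with an illustrative symmetric hyperbolic density (parameters and spec limits are ours, not the paper's):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def hyperbolic_pdf(x, mu=0.0, alpha=1.5, beta=0.0, delta=1.0):
    """Hyperbolic density: the log-density is a hyperbola, so the tails
    decay exponentially, heavier than the Gaussian's quadratic decay."""
    f = lambda t: np.exp(-alpha * np.sqrt(delta**2 + (t - mu) ** 2) + beta * (t - mu))
    c, _ = quad(f, -np.inf, np.inf)          # normalising constant
    return f(x) / c

def quality_factors(p_in_range):
    """Six-sigma style factors from the probability of meeting spec."""
    return {
        "population_in_range": p_in_range,
        "yield_pct": 100.0 * p_in_range,
        "dpmo": (1.0 - p_in_range) * 1e6,    # defects per million opportunities
    }

lsl, usl = -3.0, 3.0                          # illustrative spec limits
gauss_in = norm.cdf(usl) - norm.cdf(lsl)
hyp_in, _ = quad(hyperbolic_pdf, lsl, usl)
# The heavier hyperbolic tails put more mass outside spec, so the same
# spec limits imply a higher DPMO than the Gaussian fit would suggest.
```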
Findings
The paper finds that by using a suitable distribution, it is possible to improve quality factors such as population in range, yield percentage and number of defects per million opportunities.
Originality/value
This paper is of value in assessing the suitability of the hyperbolic and Gaussian distributions in modelling the population and showing that hyperbolic distribution can be more effectively used to model the cost data than the Gaussian distribution.
Tuan-Hui Shen and Cong Lu
Abstract
Purpose
This paper aims to develop a method to improve the accuracy of tolerance analysis considering the spatial distribution characteristics of part surface morphology (SDCPSM) and local surface deformations (LSD) of planar mating surfaces during the assembly process.
Design/methodology/approach
First, this paper proposes a skin modeling method considering SDCPSM based on Non-Gaussian random field. Second, based on the skin model shapes, an improved boundary element method is adopted to solve LSD of nonideal planar mating surfaces, and the progressive contact method is adopted to obtain relative positioning deviation of mating surfaces. Finally, the case study is given to verify the proposed approach.
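The first step, generating a non-Gaussian random field for the surface morphology, can be sketched in a simplified form: filter white noise to obtain spatial correlation, then apply a pointwise translation to impose non-Gaussian moments. This is a generic illustration, not the authors' skin modeling method (which is based on a Non-Gaussian random field tailored to measured surface data):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import skew

rng = np.random.default_rng(42)

# Step 1: spatially correlated Gaussian random field. White noise is passed
# through a Gaussian low-pass filter, then standardised.
g = gaussian_filter(rng.standard_normal((64, 64)), sigma=4.0)
g = (g - g.mean()) / g.std()

# Step 2: pointwise polynomial translation (a low-order Hermite-type
# expansion) imposing positive skewness on the height distribution.
surface = g + 0.3 * g**2
sk = skew(surface.ravel())
```

The `sigma` of the filter controls the spatial correlation length of the morphology, and the polynomial coefficients control how far the height distribution departs from Gaussian.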
Findings
Through the case study, the results show that different SDCPSM have different influences on tolerance analysis, and that LSD have a nonnegligible influence that varies with the SDCPSM considered. In addition, the LSD have a greater influence on translational deviation along the z-axis than on rotational deviation around the x- and y-axes.
Originality/value
The surface morphology with different spatial distribution characteristics leads to different contact behavior of planar mating surfaces, especially when considering the LSD of mating surfaces during the assembly process, which will have further influence on tolerance analysis. To address the above problem, this paper proposes a tolerance analysis method with skin modeling considering SDCPSM and LSD of mating surfaces, which can help to improve the accuracy of tolerance analysis.
Pawel D. Domanski and Mateusz Gintrowski
Abstract
Purpose
This paper aims to present the results of a comparison between different approaches to the prediction of electricity prices. It is well known that the properties of the data generation process may favor some modeling methods over others. Data originating in social or market processes are characterized by an unexpectedly wide realization space, resulting in long tails in the probability density function. Such data may not be easy to handle in time series prediction using standard approaches based on normal distribution assumptions. Electricity prices on the deregulated market fall into this category.
Design/methodology/approach
The paper presents alternative approaches, i.e. memory-based prediction and fractal approach compared with established nonlinear method of neural networks. The appropriate interpretation of results is supported with the statistical data analysis and data conditioning. These algorithms have been applied to the problem of the energy price prediction on the deregulated electricity market with data from Polish and Austrian energy stock exchanges.
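The memory-based idea, predicting from the most similar episodes in the stored history rather than from a fitted parametric model, can be sketched as a nearest-neighbour search over past windows. This is a minimal illustration on synthetic price-like data, not the authors' history-searching routine:

```python
import numpy as np

def memory_based_forecast(history, window=24, k=5):
    """Predict the next value by locating the k past windows most similar
    to the most recent one and averaging what followed each of them."""
    recent = history[-window:]
    dists, nexts = [], []
    for start in range(len(history) - window):
        dists.append(np.linalg.norm(history[start:start + window] - recent))
        nexts.append(history[start + window])
    best = np.argsort(dists)[:k]
    return float(np.mean(np.asarray(nexts)[best]))

# Illustrative series: a 24-step daily pattern plus heavy-tailed (Student-t)
# noise standing in for spiky electricity prices.
rng = np.random.default_rng(1)
t = np.arange(24 * 60)
prices = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.standard_t(df=3, size=t.size)
pred = memory_based_forecast(prices)
```

The brute-force distance scan over the whole history is exactly the part that, as noted under Practical implications, benefits from GPU parallelisation.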
Findings
The first outcome of the analysis is that there are several situations in time series prediction when the standard modeling approach, based on the assumption that each change is independent of the last and follows a random Gaussian bell pattern, may not hold true. In this paper, such a case was considered: price data from energy markets. Electricity price data are biased by human nature. It is shown that the Cauchy probability distribution was more relevant to the data properties. Results show that alternative approaches may be used; for both datasets, the memory-based approach delivered the best prediction performance.
Research limitations/implications
“Personalization” of the model is a crucial aspect of the whole methodology. All available knowledge on the forecasted phenomenon should be used and incorporated into the model. In the case of memory-based modeling, this takes the form of a specific design of the history-searching routine that uses an understanding of the process features. The emphasis should shift toward methodology structure design and algorithm customization, and only then to parameter estimation. Such a modeling approach may be more descriptive for the user, enabling understanding of the process and further iterative improvement in a continuous striving for perfection.
Practical implications
Memory-based modeling can be applied in practice. These models have large potential that is worth exploiting. One disadvantage of this modeling approach is the large computational effort connected with the need for constant evaluation of large datasets. It was shown that a graphics processing unit (GPU) approach, through parallel calculation on graphics cards, can improve this dramatically.
Social implications
The modeling of electricity prices has a big impact on the daily operation of electricity traders and distributors. On the one hand, appropriate modeling can improve performance, mitigating the risks associated with the process. Thus, end users should receive a higher quality of service, ultimately with lower prices and a minimized risk of energy loss incidents.
Originality/value
The use of the alternative approaches, such as memory-based reasoning or fractals, is very rare in the field of the electricity price forecasting. Thus, it gives a new impact for further research enabling development of better solutions incorporating all available process knowledge and customized hybrid algorithms.
A. Reyana, Sandeep Kautish, A.S. Vibith and S.B. Goyal
Abstract
Purpose
In a traffic monitoring system, moving vehicles are detected by fitting static cameras in the traffic scene. Background subtraction, a commonly used method, separates moving objects in the foreground from the background. The method applies a Gaussian Mixture Model, which can easily be contaminated by slow-moving or momentarily stopped vehicles.
Design/methodology/approach
This paper proposes the Enhanced Gaussian Mixture Model to overcome the addressed issue, efficiently detecting vehicles in complex traffic scenarios.
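The per-pixel Gaussian background model underlying this family of methods can be sketched in a simplified single-Gaussian form (the paper's Enhanced GMM, like the standard GMM, keeps several Gaussians per pixel and additional logic for stopped vehicles; this minimal version only conveys the mechanism):

```python
import numpy as np

class RunningGaussianBackground:
    """Simplified per-pixel background model: one Gaussian per pixel,
    updated exponentially where the pixel matches the background."""

    def __init__(self, lr=0.05, k=2.5):
        self.mean = None
        self.lr, self.k = lr, k

    def apply(self, frame):
        frame = frame.astype(float)
        if self.mean is None:                      # initialise on first frame
            self.mean = frame.copy()
            self.var = np.full(frame.shape, 15.0)
            return np.zeros(frame.shape, dtype=bool)
        d = frame - self.mean
        fg = d ** 2 > (self.k ** 2) * self.var     # foreground mask
        bg = ~fg
        self.mean[bg] += self.lr * d[bg]           # update background pixels only
        self.var[bg] += self.lr * (d[bg] ** 2 - self.var[bg])
        return fg

# Synthetic sequence: static background (grey value 50) with a bright
# "vehicle" patch appearing in the final frame.
model = RunningGaussianBackground()
rng = np.random.default_rng(0)
for _ in range(30):                                # learn the background
    model.apply(50 + rng.normal(0, 2, (32, 32)))
frame = 50 + rng.normal(0, 2, (32, 32))
frame[10:15, 10:15] = 200                          # moving object
mask = model.apply(frame)
```

Because the model is only updated where pixels match the background, a stopped vehicle is not immediately absorbed into it, which is the contamination problem the paper addresses.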
Findings
The model was evaluated in experiments conducted on real-world on-road travel videos. The evidence indicates that the proposed model outperforms other approaches, achieving an accuracy of 0.9759 compared with the existing Gaussian mixture model (GMM), and avoids contamination by slow-moving or momentarily stopped vehicles.
Originality/value
The proposed method effectively combines, tracks and classifies the traffic vehicles, resolving the contamination problem caused by slow-moving or momentarily stopped vehicles.
Putta Hemalatha and Geetha Mary Amalanathan
Abstract
Purpose
Adequate resources for learning and training on the data are an important prerequisite for developing an efficient classifier with outstanding performance. The data usually follow a biased distribution of classes, reflecting an unequal distribution of classes within a dataset. This issue is known as the imbalance problem and is one of the most common issues in real-time applications. Learning from imbalanced datasets is a ubiquitous challenge in the field of data mining. Imbalanced data degrade the performance of a classifier by producing inaccurate results.
Design/methodology/approach
In the proposed work, a novel fuzzy-based Gaussian synthetic minority oversampling (FG-SMOTE) algorithm is proposed to process the imbalanced data. The mechanism of the Gaussian SMOTE technique is based on finding the nearest neighbour concept to balance the ratio between minority and majority class datasets. The ratio of the datasets belonging to the minority and majority class is balanced using a fuzzy-based Levenshtein distance measure technique.
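The Gaussian-SMOTE core of this idea, generating synthetic minority samples between nearest minority neighbours with a Gaussian rather than uniform interpolation coefficient, can be sketched as follows. This is a generic illustration; the paper's fuzzy Levenshtein-distance component is not reproduced, and all parameter values are ours:

```python
import numpy as np

def gaussian_smote(X_min, n_new, k=5, sigma=0.1, rng=None):
    """Oversample the minority class: pick a minority sample and one of its
    k nearest minority neighbours, then interpolate between them with a
    Gaussian-distributed coefficient centred at 0.5."""
    if rng is None:
        rng = np.random.default_rng(0)
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    nn = np.argsort(d, axis=1)[:, 1:k + 1]              # k nearest neighbours
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = nn[i, rng.integers(k)]
        lam = np.clip(rng.normal(0.5, sigma), 0.0, 1.0)  # Gaussian, not uniform
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

rng = np.random.default_rng(7)
X_majority = rng.normal(0.0, 1.0, (200, 2))
X_minority = rng.normal(3.0, 0.5, (20, 2))
X_synth = gaussian_smote(X_minority, n_new=180, rng=rng)  # 20 + 180 = 200
```

After augmentation the minority class matches the majority class in size, which is the balancing step the classifier is then trained on.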
Findings
The performance and accuracy of the proposed algorithm are evaluated using a deep belief network classifier. The results show the efficiency of the fuzzy-based Gaussian SMOTE technique, which achieved an AUC of 93.7%, an F1 score of 94.2% and a geometric mean score of 93.6%, computed from the confusion matrix.
Research limitations/implications
The proposed research still retains some challenges that need to be addressed, such as applying FG-SMOTE to multiclass imbalanced datasets and evaluating the dataset imbalance problem in a distributed environment.
Originality/value
The proposed algorithm fundamentally solves the data imbalance issues and challenges involved in handling the imbalanced data. FG-SMOTE has aided in balancing minority and majority class datasets.
Laura K. Taylor and Celia Bähr
Abstract
Purpose
Over 60% of armed conflicts re-occur; the seed of future conflict is sown even as a peace agreement is signed. The cyclical nature of war calls for a focus on youth who can disrupt this pattern over time. Addressing this concern, the developmental peace-building model calls for a dynamic, multi-level and longitudinal approach. Using an innovative statistical approach, this study aims to investigate the associations among four youth peace-building dimensions and quality peace.
Design/methodology/approach
Multi-level time-series network analysis of a data set containing 193 countries and spanning the years between 2011 and 2020 was performed. This statistical approach allows for complex modelling that can reveal new patterns of how different youth peace-building dimensions (i.e. education, engagement, information, inclusion), identified through rapid evidence assessment, promote quality peace over time. Such a methodology not only assesses between-country differences but also within-country change.
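The temporal (lagged-effects) part of such a network can be illustrated with a pooled lag-1 regression over a simulated country panel. This is a deliberately simplified stand-in (the study fits a proper multi-level model separating within- and between-country effects, typically via dedicated packages); the two variables, coefficients and panel sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_countries, T = 50, 10                # panel: countries x yearly observations
A_true = np.array([[0.5, 0.3],         # row = outcome, column = lagged predictor
                   [0.0, 0.4]])

# Simulate a two-variable system per country (say, "education" and
# "quality peace"), each year driven by the previous year's values.
def simulate():
    x = np.zeros((T, 2))
    for t in range(1, T):
        x[t] = A_true @ x[t - 1] + rng.normal(0.0, 1.0, 2)
    return x

panels = [simulate() for _ in range(n_countries)]

# Temporal network, pooled across countries: regress each year's values on
# the previous year's, stacking all countries' year-to-year transitions.
X = np.vstack([p[:-1] for p in panels])   # predictors at t-1
Y = np.vstack([p[1:] for p in panels])    # outcomes at t
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

The entries of `A_hat` play the role of the directed lagged edges in the temporal network (e.g. education at year t-1 predicting quality peace at year t).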
Findings
While the within-country contemporaneous network shows positive links for education, the temporal network shows significant lagged effects for all four dimensions on quality peace. The between-country network indicates significant direct effects of education and information, on average, and indirect effects of inclusion and engagement, on quality peace.
Originality/value
This approach demonstrates a novel application of multi-level time-series network analysis to explore the dynamic development of quality peace, capturing both stability and change. The analysis illustrates how youth peace-building dimensions impact quality peace in the macro-system globally. This investigation of quality peace thus illustrates that the science of peace does not necessitate violent conflict.
F.A. DiazDelaO and S. Adhikari
Abstract
Purpose
In the dynamical analysis of engineering systems, running a detailed high‐resolution finite element model can be expensive even for obtaining the dynamic response at few frequency points. To address this problem, this paper aims to investigate the possibility of representing the output of an expensive computer code as a Gaussian stochastic process.
Design/methodology/approach
The Gaussian process emulator method is discussed and then applied to both simulated and experimentally measured data from the frequency response of a cantilever plate excited by a harmonic force. The dynamic response over a frequency range is approximated using only a small number of response values, obtained both by running a finite element model at carefully selected frequency points and from experimental measurements. The results are then validated applying some adequacy diagnostics.
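The emulation workflow, training a Gaussian process on a handful of expensive runs and predicting the whole frequency band with uncertainty, can be sketched as below. The frequency response function here is a cheap synthetic stand-in for the finite element model, and all kernel choices are ours, not the paper's:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Stand-in for the expensive finite element solve: a frequency response
# function (FRF) with two resonances; the emulator sees only a few points.
def frf(f):
    return np.abs(1 / (1 - (f / 40) ** 2 + 0.05j) + 1 / (1 - (f / 90) ** 2 + 0.05j))

f_train = np.linspace(5, 120, 12)[:, None]    # a small number of "runs"
y_train = np.log(frf(f_train.ravel()))        # log response is smoother

kernel = RBF(length_scale=20.0) + WhiteKernel(1e-4, (1e-8, 1e-1))
gp = GaussianProcessRegressor(kernel, normalize_y=True, n_restarts_optimizer=3)
gp.fit(f_train, y_train)

# Predict the whole frequency band, with uncertainty, from 12 evaluations.
f_test = np.linspace(5, 120, 200)[:, None]
y_pred, y_std = gp.predict(f_test, return_std=True)
```

The predictive standard deviation `y_std` is what the adequacy diagnostics mentioned above would be checked against.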
Findings
It is shown that the Gaussian process emulator method can be an effective predictive tool for medium and high‐frequency vibration problems, whenever the data are expensive to obtain, either from a computer‐intensive code or a resource‐consuming experiment.
Originality/value
Although Gaussian process emulators have been used in other disciplines, the authors are not aware of their having been implemented for structural dynamic analysis, an area of engineering in which the method has good potential.
Abstract
Purpose
The purpose of this paper is to provide an analysis of the dependence structure between returns from real estate investment trusts (REITS) and a stock market index. Further, the aim is to illustrate how copula approaches can be applied to model the complex dependence structure between the assets and for risk measurement of a portfolio containing investments in REIT and equity indices.
Design/methodology/approach
The usually suggested multivariate normal, or variance-covariance, approach is applied, as well as various copula models, in order to investigate the dependence structure between returns of Australian REITS and the Australian stock market. Different models, including the Gaussian, Student t, Clayton and Gumbel copulas, are estimated and goodness-of-fit tests are conducted. For the return series, both a Gaussian and a non-parametric estimate of the distribution are applied. A risk analysis is provided based on Monte Carlo simulations for the different models. The value-at-risk measure is also applied for quantification of the risks of a portfolio combining investments in real estate and stock markets.
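The Monte Carlo step, simulating from a copula, mapping to marginal return distributions and reading off portfolio value-at-risk, can be sketched for the Gaussian versus Student t case. The marginals, correlation and weights below are illustrative, not estimates from the paper's Australian data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
rho, n = 0.6, 100_000
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

# Gaussian copula: correlated normals mapped to uniforms via the normal CDF.
u_gauss = stats.norm.cdf(L @ rng.standard_normal((2, n)))

# Student-t copula (df=4): same correlation but with tail dependence, so
# joint extreme losses are more likely than under the Gaussian copula.
df = 4
w = rng.chisquare(df, n) / df
u_t = stats.t.cdf((L @ rng.standard_normal((2, n))) / np.sqrt(w), df)

def var99(u):
    """99% value-at-risk of an equally weighted two-asset portfolio with
    illustrative marginals (normal REIT returns, Student-t equity returns)."""
    r_reit = stats.norm.ppf(u[0], loc=0.0, scale=0.01)
    r_eq = 0.01 * stats.t.ppf(u[1], 5)
    return -np.quantile(0.5 * r_reit + 0.5 * r_eq, 0.01)

vg, vt = var99(u_gauss), var99(u_t)
```

Comparing `vg` and `vt` reproduces in miniature the paper's point: the same marginals combined through a tail-dependent copula yield a larger VaR than the variance-covariance (Gaussian) view suggests.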
Findings
The findings suggest that the multivariate normal model is not appropriate to measure the complex dependence structure between the returns of the two asset classes. Instead, a model using non‐parametric estimates for the return series in combination with a Student t copula is clearly more suitable. It further illustrates that the usually applied variance‐covariance approach leads to a significant underestimation of the actual risk for a portfolio consisting of investments in REITS and equity indices. The nature of risk is better captured by the suggested copula models.
Originality/value
To the authors' knowledge, this is one of the first studies to apply and test different copula models in real estate markets. The results help international investors and portfolio managers to deepen their understanding of the dependence structure between returns from real estate and equity markets. Additionally, the results should be helpful for implementing more adequate risk management for portfolios containing investments in both REITS and equity indices.