Search results
1 – 10 of 87
Rose Opengart, Peter M. Ralston and Steve LeMay
Abstract
Purpose
The purpose of this paper is to extend the concept of myopia and introduce the concept of labor market myopia (LMM), as well as the role that human resources management (HRM) plays in its prevention and resolution. LMM, a more specific form of factor market myopia (FMM), is a myopic view of labor needs. LMM is only going to increase as human capital becomes increasingly scarce due to labor shortages.
Design/methodology/approach
This conceptual review focuses on research on factor market rivalry (FMR) in the supply chain. Using three sample job categories, the concept of myopia is applied to the human resources context to propose a new term describing a failure to consider future labor needs.
Findings
The authors position HRM/talent management as critical in preventing and addressing LMM at both firm and industry levels, and highlight the critical role of labor markets in FMR. HR strategies suggested to prevent LMM include: expanding the available workforce; increasing current workforce productivity; economic remedies such as paying higher wages; and proactively assessing and forecasting current and future human resource capacity and needs.
Practical implications
Labor needs to be considered as a factor in the same realm of importance as other resources. The HR strategies discussed are key to preventing LMM and improving organizational performance and effectiveness.
Originality/value
The authors argue that organizations not only compete for resources downstream (i.e. customers and markets) but also upstream, such as with human resources. The authors introduce a new concept/term to frame the effect on organizations when supply chain planning and HR strategy do not take labor into consideration. This was accomplished by first narrowing the concept of marketing myopia to FMM, and in this conceptual paper, it was subsequently narrowed to introduce the term LMM.
Everton Boos, Fermín S.V. Bazán and Vanda M. Luchesi
Abstract
Purpose
This paper aims to reconstruct the spatially varying orthotropic conductivity based on a two-dimensional inverse heat conduction problem described by a partial differential equation (PDE) model with mixed boundary conditions. The proposed discretization uses a highly accurate technique and allows simple implementations. Also, the authors solve the related inverse problem in such a way that smoothness is enforced on the iterations, showing promising results in synthetic examples and real problems with a moving heat source.
Design/methodology/approach
The discretization procedure applied to the model for the direct problem uses a pseudospectral collocation strategy in the spatial variables and Crank–Nicolson method for the time-dependent variable. Then, the related inverse problem of recovering the conductivity from temperature measurements is solved by a modified version of Levenberg–Marquardt method (LMM) which uses singular scaling matrices. Problems where data availability is limited are also considered, motivated by a face milling operation problem. Numerical examples are presented to indicate the accuracy and efficiency of the proposed method.
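The damped least-squares update at the heart of the Levenberg–Marquardt method can be sketched on a generic scalar-parameter problem. This is an illustrative toy (a decay-rate fit with an ordinary, nonsingular damping term), not the authors' singular-scaling implementation; all function names and data are invented:

```python
import math

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, iters=50):
    """Scalar-parameter Levenberg-Marquardt: minimise sum_i r_i(x)^2.

    The damping term lam plays the role of the scaling matrix; the paper's
    variant generalises this to (possibly singular) scaling matrices."""
    x = x0
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        g = sum(Ji * ri for Ji, ri in zip(J, r))   # gradient J^T r
        h = sum(Ji * Ji for Ji in J)               # Gauss-Newton Hessian J^T J
        step = -g / (h + lam)                      # damped Gauss-Newton step
        cost_old = sum(ri * ri for ri in r)
        cost_new = sum(ri * ri for ri in residual(x + step))
        if cost_new < cost_old:
            x += step
            lam *= 0.5                             # accept: relax damping
        else:
            lam *= 2.0                             # reject: increase damping
    return x

# Toy inverse problem: recover the decay rate k from y(t) = exp(-k t).
ts = [0.1 * i for i in range(1, 11)]
k_true = 1.7
data = [math.exp(-k_true * t) for t in ts]
res = lambda k: [math.exp(-k * t) - d for t, d in zip(ts, data)]
jac = lambda k: [-t * math.exp(-k * t) for t in ts]
k_est = levenberg_marquardt(res, jac, x0=0.5)
```

The accept/reject rule on the damping parameter is the standard trust-region-like control; the inverse conduction problem replaces the scalar residual with temperature misfits and the scalar damping with a scaling matrix.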
Findings
The paper presents a discretization of the PDE model aimed at simple implementation and good numerical performance. The modified version of the LMM, which uses singular scaling matrices, demonstrates its capability to recover the sought quantities accurately within a small number of iterations. Numerical results showed a good fit between exact and approximate solutions for synthetic noisy data, and quite acceptable inverse solutions when experimental data are inverted.
Originality/value
The paper is significant because of the pseudospectral approach, known for its high precision and easy implementation, and because of the use of singular regularization matrices in the LMM iterations, which, unlike classic implementations of the method, impacts positively on the reconstruction process.
Chao Xu, Xianqiang Yang and Xiaofeng Liu
Abstract
Purpose
This paper aims to investigate a probabilistic mixture model for the nonrigid point set registration problem in the computer vision tasks. The equations to estimate the mixture model parameters and the constraint items are derived simultaneously in the proposed strategy.
Design/methodology/approach
The problem of point set registration is expressed as a Laplace mixture model (LMM) instead of a Gaussian mixture model. Three constraint items, namely, distance, transformation and correspondence, are introduced to improve accuracy. The expectation-maximization (EM) algorithm is used to optimize the objective function, and the transformation matrix and correspondence matrix are obtained concurrently.
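EM for a Laplace mixture differs from the Gaussian case mainly in the density and in the location update: a weighted median, rather than a weighted mean, maximises the Laplace likelihood. A minimal 1-D sketch with invented data, omitting the paper's registration-specific constraint items:

```python
import math

def laplace_pdf(x, mu, b):
    return math.exp(-abs(x - mu) / b) / (2.0 * b)

def em_step(xs, mus, bs, pis):
    """One EM iteration for a 1-D two-component Laplace mixture.
    E-step: responsibilities; M-step: weighted median and weighted
    mean absolute deviation (not mean/variance as in the Gaussian case)."""
    # E-step: posterior responsibility of each component for each point
    resp = []
    for x in xs:
        w = [pi * laplace_pdf(x, mu, b) for pi, mu, b in zip(pis, mus, bs)]
        s = sum(w)
        resp.append([wi / s for wi in w])
    # M-step
    new_mus, new_bs, new_pis = [], [], []
    for k in range(len(mus)):
        wk = [r[k] for r in resp]
        tot = sum(wk)
        # weighted median: smallest x whose cumulative weight reaches half
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        cum, mu_k = 0.0, xs[order[-1]]
        for i in order:
            cum += wk[i]
            if cum >= tot / 2.0:
                mu_k = xs[i]
                break
        b_k = sum(w * abs(x - mu_k) for w, x in zip(wk, xs)) / tot
        new_mus.append(mu_k)
        new_bs.append(max(b_k, 1e-6))
        new_pis.append(tot / len(xs))
    return new_mus, new_bs, new_pis

# Two well-separated clusters; EM should move the centers onto them.
xs = [-5.2, -4.9, -5.1, -4.8, 4.9, 5.1, 5.2, 4.8]
mus, bs, pis = [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5]
for _ in range(20):
    mus, bs, pis = em_step(xs, mus, bs, pis)
```

The registration setting adds the transformation and correspondence structure on top of this basic alternation, but the Laplace-specific E- and M-step shapes are the same.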
Findings
Although many researchers have studied the nonrigid registration problem, most have not considered the LMM. This paper treats the nonrigid registration problem within the LMM together with the constraint items. Three experiments are performed to verify the method's effectiveness, robustness and validity.
Originality/value
A novel method that solves the nonrigid point set registration problem in the presence of the constraint items, using the EM algorithm, is put forward in this work.
Abstract
Purpose
The paper aims to propose a consistent and robust pricing/hedging methodology for callable fixed income structures with embedded caplet‐linked options.
Design/methodology/approach
A range of recently published (1997‐2003) works on the Libor Market Model (LMM) tackle the problems of modelling the forward curve with more than two factors and calibrating it to caps and/or swaps. Other articles involve the pricing of Bermudan options using Monte Carlo simulation. In the form of a case study, the very popular structure of multicallable range accrual bonds is used. A complete calibration methodology is described in detail, which links the structure's price to market cap and swaption prices as well as to the historical correlations between forward rates. We present the direct implementation of the Monte Carlo technique for this particular problem. Furthermore, we explore the application of the Longstaff–Schwartz least squares algorithm and its variations for the estimation of the expected value of continuation.
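The core of the Longstaff–Schwartz step is an ordinary least-squares regression of discounted future cash flows on basis functions of the current state, which estimates the expected value of continuation at an exercise date. A minimal sketch with a quadratic basis and synthetic data (illustrative only; a production LMM pricer would regress path-wise cash flows from a full forward-rate simulation):

```python
import random

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def continuation_fit(spots, discounted_cashflows):
    """Least-squares fit E[V | S] ~ a + b*S + c*S^2 via the normal
    equations: the core regression of the Longstaff-Schwartz algorithm."""
    basis = [[1.0, s, s * s] for s in spots]
    A = [[sum(bi[r] * bi[c] for bi in basis) for c in range(3)] for r in range(3)]
    rhs = [sum(bi[r] * y for bi, y in zip(basis, discounted_cashflows)) for r in range(3)]
    return solve3(A, rhs)

# Synthetic check: if continuation values truly are quadratic in the spot,
# the regression should recover them up to the simulated noise.
random.seed(0)
spots = [80.0 + 40.0 * random.random() for _ in range(500)]
cash = [5.0 - 0.04 * s + 0.0002 * s * s + random.gauss(0.0, 0.01) for s in spots]
a, b, c = continuation_fit(spots, cash)
```

At each exercise date the fitted continuation value is compared with the immediate exercise value, path by path, to decide the optimal stopping rule.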
Findings
This paper succeeds in producing a consistent and robust pricing/hedging methodology for callable fixed income structures with embedded caplet‐linked options.
Practical implications
The increased complexity of similar fixed income structures makes traditional approaches like Black–Derman–Toy or Hull–White trees inadequate for the task of consistent pricing and hedging. Therefore, care must be taken to ensure consistent hedging across the different volatility markets.
Originality/value
This article explores variations and settings of the popular LMM and the Longstaff–Schwartz algorithm that can be relatively consistent with both the cap and swaption volatility markets. The framework is built using the most liquid fixed income structure as a benchmark so that it can be tested for robustness.
Yuan Fangyang and Chen Zhongli
Abstract
Purpose
The purpose of this paper is to develop new types of direct expansion method of moments (DEMM) by using the n/3th moments for simulating nanoparticle Brownian coagulation in the free molecule regime. The feasibility of the newly proposed DEMMs with n/3th moments is investigated for describing the evolution of the aerosol size distribution, and some of the models will be applied in further simulation of physical processes.
Design/methodology/approach
The accuracy and efficiency of several methods of moments are compared, including the quadrature method of moments (QMOM), the Taylor-expansion method of moments (TEMOM), the log-normal preserving method of moments proposed by Lee (LMM) and the DEMM derived in this paper. QMOM with 12 quadrature approximation points is taken as a reference to evaluate the other methods.
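The log-normal preserving closure used in LMM-type methods has a compact statement: for a lognormal size distribution, ln M_k is exactly quadratic in the moment order k, so any fractional n/3th moment follows from M_0, M_1 and M_2. A sketch under that assumption (the parameter values are invented):

```python
import math

def lognormal_closure(M0, M1, M2):
    """Return a function k -> M_k under the log-normal closure:
    ln M_k is quadratic in k, fitted through k = 0, 1, 2."""
    a = math.log(M0)
    c = (math.log(M0) - 2.0 * math.log(M1) + math.log(M2)) / 2.0
    b = math.log(M1) - a - c
    return lambda k: math.exp(a + b * k + c * k * k)

# Check against an exact lognormal, where M_k = N * exp(k*mu + k^2*sig^2/2):
# the closure reproduces every fractional moment exactly in that case.
N, mu, sig = 1e6, math.log(50e-9), 0.4   # number conc., ln(geometric mean size), ln(sigma_g)
exact = lambda k: N * math.exp(k * mu + 0.5 * k * k * sig * sig)
Mk = lognormal_closure(exact(0.0), exact(1.0), exact(2.0))
M43 = Mk(4.0 / 3.0)                      # a fractional n/3th moment
```

For non-lognormal distributions the closure is only an approximation, which is the accuracy trade-off the paper's comparison against QMOM quantifies.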
Findings
The newly derived models, namely DEMM(4/3,4) and DEMM(2,6), as well as the previous DEMM(2,4), are considered qualified models due to their high accuracy and efficiency. They are confirmed to be valid alternative models for describing the evolution of the aerosol size distribution in particle dynamical processes involving the n/3th moments.
Originality/value
The n/3th moments, which have clear physical interpretations when n stands for the first several integers, are introduced for the first time in the DEMM method for simulating nanoparticle Brownian coagulation in the free molecule regime.
Katie Russell, Nima Moghaddam, Anna Tickle, Gina Campion, Christine Cobley, Stephanie Page and Paul Langthorne
Abstract
Purpose
By older adulthood, the majority of individuals will have experienced at least one traumatic event. Trauma-informed care (TIC) is proposed to improve the effectiveness of health-care provision and to reduce the likelihood of services causing retraumatisation. This study aims to assess the effectiveness of staff training in TIC in older adult services.
Design/methodology/approach
TIC training was delivered across eight Older Adult Community Mental Health Teams in the same UK organisation. Questionnaires were administered before and after training: a psychometrically robust measure, the Attitudes Related to Trauma-Informed Care, was used to assess TIC-related attitudes, and a service-developed scale was used to measure changes in TIC competence. Data were analysed using linear mixed effects modelling (LMM). Qualitative data regarding the impact of training were gathered one month after training through a free-text questionnaire.
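For a balanced two-occasion design like this one, the random-intercept logic behind an LMM analysis can be illustrated with simple moment estimators: the fixed time effect is the mean pre-to-post change, the pre/post covariance estimates the between-subject variance, and half the variance of the change scores estimates the residual variance. A toy sketch with invented scores, not the study's data or its actual likelihood-based fit:

```python
def mean(xs):
    return sum(xs) / len(xs)

def random_intercept_prepost(pre, post):
    """Moment-based estimates for the paired model
    y_it = beta0 + beta1*time_t + u_i + e_it:
    beta1 from mean change, Var(u_i) from Cov(pre, post),
    Var(e_it) from Var(post - pre) / 2."""
    n = len(pre)
    diffs = [b - a for a, b in zip(pre, post)]
    beta1 = mean(diffs)                                   # fixed time effect
    mp, mq = mean(pre), mean(post)
    var_u = sum((a - mp) * (b - mq)                       # between-subject
                for a, b in zip(pre, post)) / (n - 1)
    var_e = sum((d - beta1) ** 2 for d in diffs) / (n - 1) / 2.0  # residual
    return beta1, var_u, var_e

# Invented competence ratings for six staff members, before and after training
pre  = [2.1, 2.8, 3.0, 2.5, 3.2, 2.4]
post = [3.0, 3.5, 3.9, 3.1, 4.0, 3.2]
beta1, var_u, var_e = random_intercept_prepost(pre, post)
```

These closed-form moment estimators coincide with the LMM decomposition only in this balanced special case; unbalanced or multi-occasion data require an iterative (ML/REML) fit.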
Findings
There were 45 participants, all of whom were white British. LMM on pre- and post-data revealed that staff training significantly increased competencies across all measured TIC domains. Overall, staff attitudes were also significantly more trauma-informed after training. Qualitatively, staff identified time as the only additional resource required to deliver the skills and knowledge gained from training.
Practical implications
Training was found to be effective in increasing TIC-related skills and attitudes. Organisations aiming to become trauma-informed should consider staff training as one aspect of a wider development plan.
Originality/value
To the best of the authors’ knowledge, this paper is the first to examine TIC training for staff working in Older Adults Mental Health Services. Recommendations for services aiming to develop a trauma-informed culture have been provided.
Abid Hussain, Amjid Khan and Pervaiz Ahmad
Abstract
Purpose
As a part of doctoral study, this study aims to analyze research on library management models (LMMs) by conducting a systematic literature review (SLR).
Design/methodology/approach
A Preferred Reporting Items for Systematic Reviews and Meta-Analyses approach was used to search four databases. The search criteria included studies published in English until 2022, resulting in 9,125 records. Of these, a total of 36 studies were selected for final analysis.
Findings
The results show a positive attitude among researchers toward the development of LMMs for libraries globally. More than one-third (39%) of the target population comprised academic staff and students. The majority (91.76%) of studies were conducted using surveys, and quantitative methods were predominant (89%) in LMM research. A significant number of studies were conducted in 2016. The country-wise distribution shows that the USA and China each contributed 20% of the studies.
Practical implications
The findings of this research could assist policymakers and authorities in reconciling the LMMs applied in libraries for providing efficient access to information resources and services to end users.
Originality/value
To the best of the authors’ knowledge, this study is unique as no comprehensive study has been conducted on LMMs using the SLR method.
Khalil Nimer, Ahmed Bani-Mustafa, Anas AlQudah, Mamoon Alameen and Ahmed Hassanein
Abstract
Purpose
This paper aims to explore how the perception of good public governance reduces tax evasion (TE). Besides, this study investigates whether the nexus between public governance and TE differs between developed and developing economies.
Design/methodology/approach
Apart from the ordinary least squares (OLS) model, this study uses the linear mixed modeling technique. The World Governance Indicators and the multiple indicators multiple causes (MIMIC) method are used to measure public governance. The shadow economy is used as a proxy for TE.
Findings
The results show that people's perceptions of public governance and the quality of government institutions are core elements that influence tax-evasion behavior. Besides, the rule of law (RoL) and political stability (PS) significantly impact tax-evasion behavior in developing countries. Nevertheless, the RoL, the control of corruption and PS are the most critical tax-evasion determinants among public governance indicators for developed countries. Regulatory quality shows a substantial positive relationship with TE in developed but not developing countries.
Practical implications
This paper provides a guide for policymakers on reducing tax-evasion behavior by paying more attention to maintaining the RoL and PS and fighting corruption. Additionally, this study highlights the importance of people's perceptions of the government's pursuit of the above policy-related improvements, which, in turn, affect their tax behavior.
Originality/value
To the best of the authors’ knowledge, this study is the first to explore the role of people's perceptions of improvements in public governance and how this can reduce TE behavior in developed and developing economies. Unlike prior studies, this study used the linear mixed model method, which is more advantageous than OLS and produces robust estimators.
Shi‐Woei Lin and Ming‐Tsang Lu
Abstract
Purpose
Methods and techniques of aggregating preferences or priorities in the analytic hierarchy process (AHP) usually ignore variation or dispersion among experts and are vulnerable to extreme values (generated by particular viewpoints or experts trying to distort the final ranking). The purpose of this paper is to propose a modelling approach and a graphical representation to characterize inconsistency and disagreement in the group decision making in the AHP.
Design/methodology/approach
The authors apply a regression approach for estimating the decision weights of the AHP using linear mixed models (LMM). They also test the linear mixed model and the multi‐dimensional scaling graphical display using a case of strategic performance management in education.
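The fixed-effects core of such a regression formulation is the classical log-least-squares model ln P_ij ≈ ln w_i − ln w_j + ε, whose solution is the normalised row geometric means of the pairwise comparison matrix; the paper's LMM adds expert-level random effects on top of this. A minimal single-expert sketch (the weights and matrix are invented):

```python
import math

def lls_weights(P):
    """Log-least-squares AHP weights from a pairwise comparison matrix P.
    Minimising sum_ij (ln P[i][j] - ln w_i + ln w_j)^2 yields the
    normalised row geometric means."""
    n = len(P)
    g = [math.exp(sum(math.log(P[i][j]) for j in range(n)) / n)
         for i in range(n)]          # row geometric means
    s = sum(g)
    return [gi / s for gi in g]

# A perfectly consistent matrix built from known weights: the regression
# estimate should recover them exactly (residuals are all zero).
w = [0.5, 0.3, 0.2]
P = [[wi / wj for wj in w] for wi in w]
est = lls_weights(P)
```

With several experts, each comparison contributes one regression observation, and the deviation of an expert's log-judgments from the common fixed effects becomes the random-effect term whose variance the LMM estimates.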
Findings
In addition to determining the weight vectors, this model also allows the authors to decompose the variation or uncertainty in experts' judgment. Well‐known statistical theories can estimate and rigorously test disagreement among experts, the residual uncertainty due to rounding errors in the AHP scale, and the inconsistency within individual experts' judgments. Other than characterizing different sources of uncertainty, this model allows the authors to rigorously test other factors that might significantly affect weight assessments.
Originality/value
This study provides a model to better characterize different sources of uncertainty. This approach can improve decision quality by allowing analysts to view the aggregated judgments in a proper context and pinpoint the uncertain component that significantly affects decisions.
Mingyu Nie, Zhi Liu, Xiaomei Li, Qiang Wu, Bo Tang, Xiaoyan Xiao, Yulin Sun, Jun Chang and Chengyun Zheng
Abstract
Purpose
This paper aims to estimate endmembers and relative abundances simultaneously in hyperspectral image unmixing. Hyperspectral unmixing, an important step before image classification and recognition, is a challenging issue because of the limited resolution of image sensors and the complex diversity of nature. Unmixing can be performed using different methods, such as blind source separation and semi-supervised spectral unmixing. However, these methods have disadvantages such as inaccurate results or the need for the spectral library to be known a priori.
Design/methodology/approach
This paper proposes a novel method for hyperspectral unmixing called fuzzy c-means unmixing, which obtains endmembers and relative abundances simultaneously through repeated iterative analysis.
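The paper's exact formulation is not reproduced here, but the standard fuzzy c-means alternation it builds on is easy to sketch: memberships are updated from inverse distances to the current centers, and centers from membership-weighted means. In the unmixing reading, centers play the role of endmembers and memberships the role of relative abundances. A 1-D toy with invented data:

```python
def fcm_memberships(data, centers, m=2.0):
    """Fuzzy c-means membership update: u_ik proportional to
    d(x_i, c_k)^(-2/(m-1)), normalised over the clusters so each
    row of memberships sums to 1 (like abundances)."""
    U = []
    for x in data:
        d = [abs(x - c) + 1e-12 for c in centers]     # 1-D distance; avoid /0
        inv = [1.0 / (di ** (2.0 / (m - 1.0))) for di in d]
        s = sum(inv)
        U.append([v / s for v in inv])
    return U

def fcm_centers(data, U, m=2.0):
    """Center (endmember) update: mean weighted by memberships^m."""
    k = len(U[0])
    return [sum((U[i][j] ** m) * x for i, x in enumerate(data)) /
            sum(U[i][j] ** m for i in range(len(data)))
            for j in range(k)]

# Two clusters of 1-D "pixels"; alternation pulls the centers onto them.
data = [0.1, 0.15, 0.12, 0.9, 0.85, 0.95]
centers = [0.0, 1.0]
for _ in range(30):
    U = fcm_memberships(data, centers)
    centers = fcm_centers(data, U)
```

Real hyperspectral unmixing replaces the scalar pixels with spectral vectors and the absolute difference with a vector distance, but the two-step alternation is the same.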
Findings
Experimental results demonstrate that the proposed method can effectively implement hyperspectral unmixing with high accuracy.
Originality/value
The proposed method presents an effective framework for the challenging field of hyperspectral image unmixing.