Search results

1 – 10 of 22
Article
Publication date: 5 September 2024

Leyla Orudzheva, Manjula S. Salimath and Robert Pavur

Abstract

Purpose

The consequences of corporate corruption control (CCC) have been investigated either outside the firm (e.g. foreign direct investment inflows) or inside the firm (e.g. profitability). Yet prior research addresses these implications separately, treating them as distinct phenomena and ignoring questions at their intersection. However, corruption control can be leveraged to benefit both organizations (internally) and their environments (externally). In line with open systems theory, this study aims to explore a ripple effect of corruption control not only inside organizations (efficiency through the adoption of sustainable resource management practices) but also outside them [community-centered corporate social performance (CSP)].

Design/methodology/approach

Drawing on a longitudinal sample of multinational enterprises from the Forbes list of “The World’s Largest Public Companies,” the authors use a cross-lagged panel design to provide clarity regarding causal effects.
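
As a rough illustration of the cross-lagged logic, the sketch below simulates a hypothetical two-wave firm panel and estimates the two cross-lagged regressions; the variable names (ccc, csp), the simulated data and the statsmodels-based setup are assumptions for illustration, not the authors' data or code.

```python
# Sketch of a two-wave cross-lagged panel analysis on simulated firm data.
# "ccc" = corruption control, "csp" = community CSP; both names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
ccc_t1 = rng.normal(size=n)
csp_t1 = 0.3 * ccc_t1 + rng.normal(size=n)
ccc_t2 = 0.6 * ccc_t1 + 0.1 * csp_t1 + rng.normal(size=n)
csp_t2 = 0.5 * csp_t1 + 0.3 * ccc_t1 + rng.normal(size=n)
df = pd.DataFrame(dict(ccc_t1=ccc_t1, csp_t1=csp_t1, ccc_t2=ccc_t2, csp_t2=csp_t2))

# Each wave-2 variable is regressed on both wave-1 variables; comparing the
# cross-lagged coefficients (ccc_t1 -> csp_t2 vs. csp_t1 -> ccc_t2) is what
# speaks to causal directionality.
m_csp = smf.ols("csp_t2 ~ csp_t1 + ccc_t1", data=df).fit()
m_ccc = smf.ols("ccc_t2 ~ ccc_t1 + csp_t1", data=df).fit()
print(m_csp.params["ccc_t1"], m_ccc.params["csp_t1"])
```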

Findings

Results confirm causal directionality and support the positive effect of corruption control on resource management and community CSP, contributing toward understanding implications at the organization–environment interface.

Originality/value

The authors examine both internal and external implications of CCC. The use of a cross-lagged design, which is relatively novel to the management field, allows the authors to check for causal effects between CSP elements that were previously assumed to have reciprocal causal effects.

Details

Society and Business Review, vol. 19 no. 4
Type: Research Article
ISSN: 1746-5680

Article
Publication date: 15 April 2022

Md Rasel Al Mamun, Victor R. Prybutok, Daniel A. Peak, Russell Torres and Robert J. Pavur

Abstract

Purpose

This study aims to examine the relationship between emotional attachment (EA) and intelligent personal assistant (IPA) continuance intention. While existing theories emphasize purely rational and goal-oriented factors in explaining information technology (IT) continuance intention, this research examines how users' EA toward technology impacts their continuance intention in the absence of cognitive and habitual factors.

Design/methodology/approach

This study contextualizes attachment theory from the social psychology/consumer psychology literature to an information technology setting, formulating and testing a new model of IPA continuance. Five research hypotheses developed from this contextualization and application of the theory were posited in a structural model and empirically validated using survey results from IPA users.

Findings

The results show that users' EA to IPA use significantly influences their IPA continuance intention, along with emotional trust and interaction quality with the IPA.

Originality/value

This study contextualizes attachment theory developed in the social psychology/consumer psychology literature to formulate and test a new model in the context of IPA continuance. This work contributes to theoretical understanding by investigating IPA continuance intention in the absence of cognitive or habitual factors and fills a critical research gap in the IT post-adoption literature. IPAs are just one example of technologies to which individuals can form attachments, and this research provides an important foundation for future work by positing and testing the value of EA in IT post-adoption behavior. This research also contributes to practical knowledge by suggesting that IPA manufacturers, managers and vendors could extend their revenue streams by integrating product features that capture emotion.

Details

Information Technology & People, vol. 36 no. 2
Type: Research Article
ISSN: 0959-3845

Article
Publication date: 5 May 2020

Kathryn Ostermeier, Mark Davis and Robert Pavur

Abstract

Purpose

The purpose of this study is to examine the facilitating and inhibiting influence of team-level negative affectivity and conscientiousness on a dyad of emergent states, adopting and comparing both the composition and compilation perspectives.

Design/methodology/approach

Data were collected over three time points from 410 undergraduate students nested within cross-functional project teams (N = 62). The data, including individual self-reports and judges’ ratings of team performance, were aggregated to the team-level using both composition (mean) and compilation (skewness) approaches.
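
For readers unfamiliar with the two aggregation approaches, the minimal sketch below shows, on made-up member-level scores, how a composition (mean) and a compilation (skewness) team-level measure are computed with pandas and SciPy; the column names and values are hypothetical, not the study's instrument.

```python
# Sketch of composition (mean) vs. compilation (skewness) aggregation of
# member-level scores to the team level; columns and values are made up.
import pandas as pd
from scipy.stats import skew

members = pd.DataFrame({
    "team": [1, 1, 1, 2, 2, 2, 2],
    "conscientiousness": [4.2, 3.8, 4.9, 2.5, 4.8, 4.7, 4.6],
    "negative_affect":   [1.5, 2.0, 1.8, 3.1, 1.2, 1.4, 1.3],
})

team_level = members.groupby("team").agg(
    consc_mean=("conscientiousness", "mean"),               # composition
    consc_skew=("conscientiousness", lambda s: skew(s)),    # compilation
    na_mean=("negative_affect", "mean"),
)
print(team_level)
```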

Findings

The findings indicate that mean levels of negative affectivity were associated with decreased psychological safety. The use of skewed conscientiousness counterintuitively suggests that too many highly conscientious members can also be detrimental to psychological safety. Psychological safety, in turn, influences team potency and ultimately performance.

Originality/value

The results of this study highlight that the aggregation approach used is important. For example, the use of skewed (but not mean-level) conscientiousness brought a previously undetected and counterintuitive relationship to light. Future research should use compilation approaches in addition to composition approaches.

Details

Team Performance Management: An International Journal, vol. 26 no. 3/4
Type: Research Article
ISSN: 1352-7592

Article
Publication date: 29 December 2022

Xiaoguang Tian, Robert Pavur, Henry Han and Lili Zhang

Abstract

Purpose

Studies on mining text and generating intelligence from human resource documents are rare. This research aims to use artificial intelligence and machine learning techniques to facilitate the employee selection process through latent semantic analysis (LSA), bidirectional encoder representations from transformers (BERT) and support vector machines (SVM). The research also compares the performance of different machine learning, text vectorization and sampling approaches on human resource (HR) resume data.

Design/methodology/approach

LSA and BERT are used to discover and understand the hidden patterns from a textual resume dataset, and SVM is applied to build the screening model and improve performance.
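
As a rough sketch of this kind of pipeline, the example below chains TF-IDF vectorization, LSA via truncated SVD and a linear SVM with cross-validation in scikit-learn on a toy set of resume snippets; the texts, labels and parameter choices are illustrative assumptions, and the BERT component of the authors' approach is omitted for brevity.

```python
# Sketch of an LSA + SVM screening pipeline with cross-validation on toy
# resume snippets; texts, labels and parameters are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

resumes = [
    "python machine learning sql data pipelines",
    "deep learning nlp transformers pytorch",
    "statistics forecasting regression modeling",
    "accounts payable invoicing reconciliation",
    "payroll benefits onboarding employee relations",
    "budget planning financial reporting audits",
]
labels = [1, 1, 1, 0, 0, 0]   # 1 = shortlist for a data role (toy labels)

pipeline = make_pipeline(
    TfidfVectorizer(),                                # text vectorization
    TruncatedSVD(n_components=3, random_state=0),     # LSA: latent topics
    LinearSVC(),                                      # screening classifier
)
print(cross_val_score(pipeline, resumes, labels, cv=3))
```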

Findings

Based on the results of this study, LSA and BERT prove useful in retrieving critical topics, and SVM can optimize prediction model performance with the help of cross-validation and variable-selection strategies.

Research limitations/implications

The technique and its empirical conclusions provide a practical and theoretical basis, as well as a reference, for HR research.

Practical implications

The novel methods proposed in the study can assist HR practitioners in designing and improving their existing recruitment processes. The topic detection techniques used in the study give HR practitioners insights for identifying the skill set required for a particular position.

Originality/value

To the best of the authors’ knowledge, this research is the first study that uses LSA, BERT, SVM and other machine learning models in human resource management and resume classification. Compared with existing machine learning-based resume screening systems, the proposed system can provide more interpretable insights that help HR professionals understand the recommendation results through the topics extracted from the resumes. The findings of this study can also help organizations find a better and more effective approach to resume screening and evaluation.

Details

Business Process Management Journal, vol. 29 no. 1
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 18 May 2022

Amit Malhan, Ila Manuj, Lou Pelton and Robert Pavur

Abstract

Purpose

Warren Buffett asserted that the greatest issue confronting American business and the economy is rising health-care costs, which have reached 17% of gross domestic product. Public policymakers, health-care providers and other stakeholders grapple with containing costs and increasing the efficiency of health-care delivery. There is a paucity of theory-driven research addressing how information technology, specifically electronic health records (EHR), may supply a managerial mechanism for increasing bottom-line hospital performance, thereby attaining competitive advantage.

Design/methodology/approach

A systematic interdisciplinary literature review motivated by resource advantage theory (RAT) offers a conceptual foundation for analyzing the financial, informational and physical workflows that are core elements of supply chain management in a hospital.

Findings

RAT links EHR to profitability, competitive advantage and macromarketing factors in hospital supply chains. The literature review provides a research synthesis of the implementation and adoption of EHR to reveal its impact on a hospital’s competitive advantage. Although legislative initiatives such as the 2009 Health Information Technology for Economic and Clinical Health Act and the Affordable Care Act encourage EHR adoption, hospitals remain reluctant to adopt them.

Originality/value

The extant literature precedes the relevant legislation, has incomplete data or focuses solely on patient outcomes.

Details

Records Management Journal, vol. 32 no. 2
Type: Research Article
ISSN: 0956-5698

Book part
Publication date: 26 October 2017

Matthew Lindsey and Robert Pavur

Abstract

Control charts are designed to be effective in detecting a shift in the distribution of a process. Typically, these charts assume that the data for these processes follow an approximately normal distribution or some known distribution. However, if a data-generating process has a large proportion of zeros, that is, the data is intermittent, then traditional control charts may not adequately monitor these processes. The purpose of this study is to examine proposed control chart methods designed for monitoring a process with intermittent data to determine if they have a sufficiently small percentage of false out-of-control signals. Forecasting techniques for slow-moving/intermittent product demand have been extensively explored as intermittent data is common to operational management applications (Syntetos & Boylan, 2001, 2005, 2011; Willemain, Smart, & Schwarz, 2004). Extensions and modifications of traditional forecasting models have been proposed to model intermittent or slow-moving demand, including the associated trends, correlated demand, seasonality and other characteristics (Altay, Litteral, & Rudisill, 2012). Croston’s (1972) method and its adaptations have been among the principal procedures used in these applications. This paper proposes adapting Croston’s methodology to design control charts, similar to Exponentially Weighted Moving Average (EWMA) control charts, to be effective in monitoring processes with intermittent data. A simulation study is conducted to assess the performance of these proposed control charts by evaluating their Average Run Lengths (ARLs), or equivalently, their percent of false positive signals.
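
As a minimal sketch of the forecasting recursion that such charts build on, the code below implements Croston-style updating of the smoothed demand size and inter-demand interval on a simulated intermittent series; the smoothing constant, the initialization and the demand process are illustrative assumptions, and the control limits themselves are not reproduced here.

```python
# Sketch of Croston's updating recursion on a simulated intermittent series;
# alpha, the initialization and the demand process are illustrative choices.
import numpy as np

def croston_rates(demand, alpha=0.1):
    """Return the per-period demand-rate estimate z_t / p_t."""
    z = p = None            # smoothed demand size and inter-demand interval
    periods_since = 1
    rates = []
    for d in demand:
        if d > 0:
            if z is None:   # initialize at the first nonzero demand
                z, p = float(d), float(periods_since)
            else:
                z = alpha * d + (1 - alpha) * z
                p = alpha * periods_since + (1 - alpha) * p
            periods_since = 1
        else:
            periods_since += 1
        rates.append(np.nan if z is None else z / p)
    return np.array(rates)

rng = np.random.default_rng(1)
demand = rng.poisson(2, 50) * (rng.random(50) < 0.3)   # roughly 70% zero periods
print(croston_rates(demand)[-5:])
```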

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78743-069-3

Book part
Publication date: 13 March 2013

Matthew Lindsey and Robert Pavur

Abstract

One aspect of forecasting intermittent demand for slow-moving inventory that has not been investigated to any depth in the literature is seasonality. This is due in part to the unreliability of computed seasonal indexes when many of the periods have zero demand. This chapter proposes an innovative approach that adapts Croston's (1970) method to data with a multiplicative seasonal component. Adaptations of Croston's (1970) method, one of the most popular techniques for forecasting items with intermittent demand, are common in the literature. A simulation is conducted to examine the effectiveness of the proposed technique, which extends Croston's (1970) method to incorporate seasonality.
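
A minimal sketch of the seasonal preprocessing such an extension requires is shown below: multiplicative seasonal indexes are computed from a simulated intermittent series and used to deseasonalize the demand before smoothing. The season length, the demand process and the guard for seasons with no observed demand are illustrative assumptions rather than the chapter's procedure.

```python
# Sketch of multiplicative seasonal indexes for an intermittent series; the
# season length, demand process and zero-season guard are illustrative.
import numpy as np

rng = np.random.default_rng(2)
season_len = 4
demand = rng.poisson([1, 1, 4, 1] * 12) * (rng.random(48) < 0.4)

by_season = demand.reshape(-1, season_len)
indexes = by_season.mean(axis=0) / demand.mean()     # multiplicative indexes
indexes = np.where(indexes > 0, indexes, 1.0)        # guard seasons with no demand
deseasonalized = demand / np.tile(indexes, len(demand) // season_len)
print(np.round(indexes, 2))
```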

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78190-331-5

Book part
Publication date: 12 November 2014

Matthew Lindsey and Robert Pavur

Abstract

A Bayesian approach to demand forecasting for optimizing spare parts inventory that requires periodic replenishment is examined relative to a non-Bayesian approach when the demand rate is unknown. That is, optimal inventory levels are decided using these two approaches at consecutive time intervals. Simulations were conducted to compare the total inventory cost under a Bayesian approach and a non-Bayesian approach with a theoretical minimum cost over a variety of demand rate conditions, including the challenging slow-moving or intermittent type of spare parts. Although Bayesian approaches are often recommended, this study’s results reveal that under conditions of large variability across the demand rates of spare parts, the inventory cost using the Bayesian model was not superior to that using the non-Bayesian approach. For spare parts with homogeneous demand rates, the inventory cost using the Bayesian model for forecasting was generally lower than that of the non-Bayesian model. Practitioners may still opt for the non-Bayesian model since it does not require identifying a prior distribution for demand.
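
As a rough sketch of the contrast being simulated, the code below updates a Gamma-Poisson posterior for an unknown demand rate and compares the resulting rate estimate, and a simple service-level base stock, with the non-Bayesian sample-mean estimate; the prior parameters, the demand process and the 95% service target are illustrative assumptions, not the chapter's cost model.

```python
# Sketch of a Gamma-Poisson (Bayesian) vs. sample-mean (non-Bayesian) estimate
# of an unknown demand rate; prior, demand process and 95% target are illustrative.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
true_rate = 0.4                        # a slow-moving part
demand = rng.poisson(true_rate, 24)    # demand over 24 review periods

a0, b0 = 1.0, 1.0                      # Gamma(a0, b0) prior on the demand rate
bayes_rate = (a0 + demand.sum()) / (b0 + len(demand))   # posterior mean
freq_rate = demand.mean()                               # non-Bayesian estimate

# Base-stock level covering ~95% of next-period demand under each estimate.
print(poisson.ppf(0.95, bayes_rate), poisson.ppf(0.95, freq_rate))
```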

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78441-209-8

Book part
Publication date: 18 July 2016

Matthew Lindsey and Robert Pavur

Abstract

Research in the area of forecasting and stock inventory control for intermittent demand is designed to provide robust models for the underlying demand, which appears at random, with some time periods having no demand at all. Croston’s method is a popular technique for these models; it uses two single exponential smoothing (SES) models, each involving a smoothing constant. A key issue is the choice of these constants, because the forecasts are sensitive to changes in demand. Suggested values for the smoothing constants lie between 0.1 and 0.3. Since SES has been shown to be equivalent to an ARIMA model, an optimal smoothing constant for SES can be selected from the fitted ARIMA model. This chapter conducts simulations to investigate whether using an optimal smoothing constant rather than a suggested one is important. Since SES is designed to be an adaptive method, data are simulated that vary between slow and fast demand.
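
As a minimal sketch of that idea, the code below fits the ARIMA(0,1,1) model that is equivalent to SES on a simulated demand series with statsmodels and recovers the implied smoothing constant from the MA coefficient; the data and parameter choices are assumptions for illustration, not the chapter's simulation design.

```python
# Sketch of recovering an SES smoothing constant from the equivalent
# ARIMA(0,1,1) fit; the simulated demand series is illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
demand = rng.poisson(3, 100) * (rng.random(100) < 0.6)   # mixed slow/fast demand
y = pd.Series(demand, dtype=float)

fit = ARIMA(y, order=(0, 1, 1)).fit()
theta = fit.params["ma.L1"]
alpha_opt = 1 + theta   # SES with constant alpha <=> ARIMA(0,1,1) with theta = alpha - 1
print(round(alpha_opt, 3))
```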

Details

Advances in Business and Management Forecasting
Type: Book
ISBN: 978-1-78635-534-8

Article
Publication date: 2 January 2009

A.B.M. Abdullah, David Mitchell and Robert Pavur

Abstract

Purpose

The purpose of this study is to investigate and develop forecast models for air quality management, using data provided by the Texas Commission on Environmental Quality (TCEQ).

Design/methodology/approach

The models used in this research are the LDF (Fisher Linear Discriminant Function), QDF (Quadratic Discriminant Function), REGF (Regression Function), BPNN (Backprop Neural Network), and the RBFN (Radial Basis Function Network). The data used for model evaluations span a 12‐year period from 1990 to 2002. A control chart of the data is also examined for possible shifts in the distribution of ozone present in the Houston atmosphere during this time period.
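
A rough sketch of this kind of model comparison, using scikit-learn stand-ins on synthetic data, appears below; logistic regression substitutes for the regression function and an RBF-kernel SVM for the radial basis function network, and the features are random placeholders rather than the TCEQ meteorological variables.

```python
# Sketch of comparing these kinds of classifiers with cross-validation on
# synthetic data; stand-ins are used for REGF and RBFN, and the features
# are placeholders, not TCEQ variables.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy stand-ins for meteorological predictors (wind, temperature, etc.).
X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                           n_redundant=0, random_state=0)

models = {
    "LDF": LinearDiscriminantAnalysis(),
    "QDF": QuadraticDiscriminantAnalysis(),
    "REGF (logistic stand-in)": LogisticRegression(max_iter=1000),
    "BPNN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
    "RBFN (RBF-SVM stand-in)": SVC(kernel="rbf"),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```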

Findings

Results of this research reveal variables that are significantly related to the ozone problem in the Houston area.

Practical implications

Models developed in this paper may assist air quality managers in modeling and forecasting ozone formations using meteorological variables.

Originality/value

This is the first study that has extensively compared the efficiency of LDF, QDF, REGF, BPNN and RBFN forecast models used for tracking air quality. Prior studies have evaluated Neural Networks, ARIMA and regression models.

Details

Management of Environmental Quality: An International Journal, vol. 20 no. 1
Type: Research Article
ISSN: 1477-7835
