Search results

1 – 10 of over 3000
Article
Publication date: 12 August 2022

Qianqian Chen, Zhen Tian, Tian Lei and Shenghan Huang

Abstract

Purpose

Cross operation is a common working method in modern building construction. Because operations overlap, crews disturb one another's work and their risks interact; this superimposition of risks deserves attention. The study aims to develop a model for analyzing cross-operation risks that can quantify the correlations among risk factors.

Design/methodology/approach

The concept of cross operation and the types of crossover involved are first clarified, and risk factors are extracted from records of cross-operation accidents. Association rule mining (ARM) is then used to analyze the accidents associated with each type of cross operation. With the help of visualization tools, the intensity distribution and correlation paths of the relationships between factors are obtained, and a complete cross-operation risk analysis model is established.
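
As a rough illustration of the ARM step described above, the sketch below mines rules from a handful of accident records using the mlxtend library; the accident records, risk-factor labels and thresholds are hypothetical placeholders, not the study's data or settings.

```python
# Illustrative association rule mining over accident records, in the spirit of
# the ARM step described above. Records, labels and thresholds are hypothetical.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Each accident record lists the risk factors observed together (hypothetical labels).
accidents = [
    ["inadequate_inspection", "missing_guardrail", "fall_from_height"],
    ["inadequate_inspection", "unclear_handover", "struck_by_object"],
    ["inadequate_inspection", "missing_guardrail", "fall_from_height"],
    ["fatigue", "unclear_handover", "struck_by_object"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(accidents).transform(accidents), columns=te.columns_)

# Frequent factor combinations, then rules ranked by lift
# (high-lift chains are the ones the findings suggest cutting off first).
frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
print(rules.sort_values("lift", ascending=False)[
    ["antecedents", "consequents", "support", "confidence", "lift"]])
```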

Findings

Applying the ARM method shows that risk correlations differ markedly across the different types of cross operation. A high-frequency risk common to all cross operations concerns on-site safety inspection and process supervision, although the problems that follow from it differ by type. Cutting off high-lift risk chains promptly, according to the ARM results, can reduce or eliminate the danger posed by high-frequency risk factors.

Originality/value

This is the first systematic analysis of cross-operation risk in construction. The study establishes priorities for risk management, and the results support targeted control of cross operations to reduce the accidents they cause.

Details

Engineering, Construction and Architectural Management, vol. 30 no. 10
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 21 November 2023

Hua Pan and Rong Liu

Abstract

Purpose

This paper has two aims. The first is to better understand residents' differentiated power consumption behaviors and to derive household characteristic labels from the perspective of electricity consumption stability. The second is to address the lack of causal relationships in existing research associating residential electricity consumption behavior with basic household information.

Design/methodology/approach

First, the density-based spatial clustering of applications with noise (DBSCAN) method is used to extract residents' typical daily load curves. Second, electricity consumption stability is described from three perspectives: the daily minimum load rate, the daily load rate and the daily load fluctuation rate, and is evaluated comprehensively using the entropy weight method. Finally, residential customer labels are constructed from sociological characteristics, residential characteristics and energy-use attitudes, and an enhanced FP-growth algorithm is employed to investigate potential links between each factor and electricity consumption stability.
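
A minimal sketch of the clustering and entropy-weight steps is given below; the load data, indicator definitions and parameters are illustrative assumptions, not those of the paper.

```python
# Sketch of the clustering and entropy-weight steps. Load data, indicator
# definitions and parameters are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
loads = rng.uniform(0.2, 1.0, size=(200, 24))        # hypothetical 24-point daily load curves

# Step 1: DBSCAN over normalized curve shapes to find typical daily load patterns
# (a label of -1 marks noise points that belong to no cluster).
shapes = loads / loads.max(axis=1, keepdims=True)
cluster_labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(shapes)

# Step 2: three stability indicators per household (assumed definitions),
# oriented so that larger values mean more stable consumption.
min_load_rate = loads.min(axis=1) / loads.max(axis=1)         # daily minimum load rate
load_rate = loads.mean(axis=1) / loads.max(axis=1)            # daily load rate
fluctuation = loads.std(axis=1) / loads.mean(axis=1)          # daily load fluctuation rate
X = np.column_stack([min_load_rate, load_rate, 1.0 - fluctuation])

# Step 3: entropy weight method to combine the indicators into one stability score.
Z = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)         # min-max normalization
P = (Z + 1e-12) / (Z + 1e-12).sum(axis=0)                     # column-wise proportions
entropy = -(P * np.log(P)).sum(axis=0) / np.log(len(Z))
weights = (1.0 - entropy) / (1.0 - entropy).sum()
stability_score = Z @ weights
print(cluster_labels[:10], weights, stability_score[:5])
```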

Findings

Compared with the original FP-growth algorithm, the improved algorithm can mine rules containing specific attribute labels, which improves mining efficiency. Regarding the factors influencing electricity consumption stability, characteristics such as a large number of family members, being well employed, having children in the household and living in a newer dwelling may all lead to poorer electricity stability, whereas residents' attitudes toward energy use and dwelling type are not significantly associated with it.
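
The improved FP-growth algorithm itself is not described in the abstract; the sketch below only approximates its effect by filtering standard FP-growth rules (via mlxtend) to those containing a target stability label. The household labels and thresholds are hypothetical.

```python
# Approximation of "rules containing specific attribute labels": mine with
# standard FP-growth, then keep rules whose consequent is a stability label.
# This is not the authors' improved algorithm; labels are hypothetical.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth, association_rules

households = [
    ["members>=4", "employed", "has_children", "low_stability"],
    ["members<=2", "retired", "no_children", "high_stability"],
    ["members>=4", "employed", "new_dwelling", "low_stability"],
    ["members<=2", "employed", "no_children", "high_stability"],
]
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(households).transform(households), columns=te.columns_)

freq = fpgrowth(onehot, min_support=0.3, use_colnames=True)
rules = association_rules(freq, metric="confidence", min_threshold=0.7)

# Keep only rules whose consequent involves the target stability attribute.
target = {"low_stability", "high_stability"}
stability_rules = rules[rules["consequents"].apply(lambda c: bool(c & target))]
print(stability_rules[["antecedents", "consequents", "confidence", "lift"]])
```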

Originality/value

This paper uncovers the household socioeconomic traits that influence the stability of home electricity use and sheds light on the connections between them. First, the power consumption characteristics of residential users are refined from the perspective of electricity stability, and the entropy weight method is used to evaluate that stability comprehensively. Second, labels for residential users' household characteristics are screened and organized. Finally, the improved FP-growth algorithm is used to mine the household characteristic labels that are strongly associated with electricity consumption stability.

Highlights

  1. The stability of electricity consumption is important to the stable operation of the grid.

  2. An improved FP-growth algorithm is employed to explore the influencing factors.

  3. The improved algorithm enables the mining of rules containing specific attribute labels.

  4. Residents' attitudes toward energy use are largely unrelated to the stability of electricity use.

Details

Management of Environmental Quality: An International Journal, vol. 35 no. 3
Type: Research Article
ISSN: 1477-7835

Article
Publication date: 26 March 2024

Yuanwen Han, Jiang Shen, Xuwei Zhu, Bang An and Xueying Bao

Abstract

Purpose

This study aims to develop an interface management risk interaction modeling and analysis methodology applicable to complex systems in high-speed rail construction projects, reveal the interaction mechanism of interface management risk and provide theoretical support for project managers to develop appropriate interface management risk response strategies.

Design/methodology/approach

This paper introduces the association rule mining technique to improve the complex network modeling method. Taking China as an example and adopting a stakeholder perspective, the risk factors and significant accident types in interface management of high-speed rail construction projects are systematically identified, and a database is established. The Apriori algorithm is then used to mine and analyze the strong association rules among the factors in the database, a complex network is constructed from these rules, and its topological characteristics are analyzed to reveal the interaction mechanism of interface management risk in high-speed rail construction projects.
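
A simplified sketch of turning mined rules into a risk-interaction network and inspecting its topology, using networkx, is shown below; the rules, factor names and chosen metrics are hypothetical illustrations of the general approach, not the authors' model.

```python
# Sketch: build a directed risk-interaction network from strong association
# rules and examine its topology. Rules and weights are hypothetical.
import networkx as nx

# (antecedent factor, consequent factor, lift of the rule) -- hypothetical
strong_rules = [
    ("owner_schedule_pressure", "design_change", 1.5),
    ("design_change", "technical_interface_gap", 1.6),
    ("technical_interface_gap", "contractor_violation", 1.8),
    ("contractor_violation", "mechanical_failure", 2.1),
    ("adverse_environment", "mechanical_failure", 1.4),
]
G = nx.DiGraph()
G.add_weighted_edges_from(strong_rules)

# Topological characteristics: centrality and clustering are the kinds of
# quantities used to rank factors and to judge scale-free / small-world structure.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
clustering = nx.average_clustering(G.to_undirected())
print(sorted(betweenness.items(), key=lambda kv: -kv[1]))
print("degree centrality:", degree)
print("average clustering:", clustering)
```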

Findings

The results show that the network is both scale-free and small-world, implying that construction accidents are not random events but rather the result of strong interactions between numerous interface management risks. Contractors, technical interfaces, mechanical equipment, and environmental factors are the primary direct causal factors of accidents, while owners and designers are essential indirect causal factors. The global importance of stakeholders such as owners, designers, and supervisors rises significantly after considering the indirect correlations between factors. This theoretically explains the need to consider the interactions between interface management risks.

Originality/value

The interaction mechanism between interface management risks is unclear, and it is an essential factor influencing the choice of risk response measures. This study proposes a new methodology for analyzing interface management risk response strategies that incorporates quantitative analysis methods and considers the interactions among interface management risks.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Open Access
Article
Publication date: 7 July 2023

Marcello Braglia, Francesco Di Paco, Marco Frosolini and Leonardo Marrazzini

Abstract

Purpose

This paper presents Quick Changeover Design (QCD), which is a structured methodological approach for Original Equipment Manufacturers to drive and support the design of machines in terms of rapid changeover capability.

Design/methodology/approach

To improve performance in terms of setup time, QCD approaches machine design from a single-minute exchange of die (SMED) perspective. Although conceived to aid the design of completely new machines, QCD can also be adapted to support simple design upgrades of pre-existing machines. QCD is structured in three consecutive steps, each supported by specific tools and analysis forms that facilitate and better structure the designers' activities.

Findings

QCD helps equipment manufacturers understand the current and future needs of their customers in order to: (1) anticipate the requirements for new and different set-up processes; (2) prioritize the possible technical solutions; and (3) build machines and equipment that are easy and fast to set up in variable contexts. When applied to a production system consisting of machines subject to frequent or time-consuming set-up processes, QCD enhances both responsiveness to external market demands and internal control of factory operations.

Originality/value

The QCD approach is a support system for the development of completely new machines and is also particularly effective in upgrading existing ones. QCD's practical application is demonstrated using a case study concerning a vertical spindle machine.

Details

Journal of Manufacturing Technology Management, vol. 34 no. 9
Type: Research Article
ISSN: 1741-038X

Open Access
Article
Publication date: 2 April 2024

Koraljka Golub, Osma Suominen, Ahmed Taiye Mohammed, Harriet Aagaard and Olof Osterman

Abstract

Purpose

In order to estimate the value of semi-automated subject indexing in operative library catalogues, the study aimed to investigate five different automated implementations of an open source software package on a large set of Swedish union catalogue metadata records, with Dewey Decimal Classification (DDC) as the target classification system. It also aimed to contribute to the body of research on aboutness and related challenges in automated subject indexing and evaluation.

Design/methodology/approach

On a sample of over 230,000 records with close to 12,000 distinct DDC classes, the open source tool Annif, developed by the National Library of Finland, was applied in the following implementations: a lexical algorithm, a support vector classifier, fastText, Omikuji Bonsai and an ensemble approach combining the former four. A qualitative study involving two senior catalogue librarians and three students of library and information studies was also conducted on a sample of 60 records to investigate the value of the automatically assigned classes and the inter-rater agreement about them.

Findings

The best results were obtained with the ensemble approach, which achieved 66.82% accuracy on the three-digit DDC classification task. The qualitative study confirmed earlier reports of low inter-rater agreement but also pointed to the potential value of automatically assigned classes as additional access points in information retrieval.
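
For readers unfamiliar with the two evaluation quantities, the sketch below shows one way three-digit DDC accuracy and inter-rater agreement (Cohen's kappa) might be computed; it does not use Annif itself, and the class codes and ratings are hypothetical examples, not data from the study.

```python
# Sketch of the two evaluation quantities reported above. Labels are hypothetical.
from sklearn.metrics import cohen_kappa_score

gold      = ["823.912", "641.5", "004.678", "839.7", "155.4"]
predicted = ["823.914", "641.5", "004.6",   "839.7", "306.8"]

# Accuracy at the three-digit (main class) level, as in the 66.82% figure.
acc3 = sum(g[:3] == p[:3] for g, p in zip(gold, predicted)) / len(gold)

# Inter-rater agreement between two judges rating the usefulness of suggested classes.
rater_a = ["useful", "useful", "not_useful", "useful", "not_useful"]
rater_b = ["useful", "not_useful", "not_useful", "useful", "useful"]
kappa = cohen_kappa_score(rater_a, rater_b)
print(acc3, kappa)
```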

Originality/value

The paper presents an extensive study of automated classification in an operative library catalogue, accompanied by a qualitative study of automated classes. It demonstrates the value of applying semi-automated indexing in operative information retrieval systems.

Details

Journal of Documentation, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 29 January 2024

Kai Wang

Abstract

Purpose

Identifying network user relationships in Fancircle communities helps quantify the violence index of user text and mine the internal correlations among users' network behaviors, providing necessary data support for the construction of a knowledge graph.

Design/methodology/approach

A correlation identification method based on sentiment analysis (CRDM-SA) is put forward that extracts user semantic information and introduces violent sentiment membership. Specifically, the topics used for topology mapping in the community are obtained from extracted user text information on the basis of a self-built violent sentiment dictionary (VSD). The violence index of the user text is then calculated to quantify the fuzzy sentiment relationship between users and topics. Finally, multi-granularity violence association rules are mined from user text by constructing a violence fuzzy concept lattice.
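
As a loose illustration of a dictionary-based violence index with a fuzzy membership, a minimal sketch follows; it is not the authors' CRDM-SA implementation, and the dictionary, weights and thresholds are hypothetical.

```python
# Minimal, hypothetical illustration of a violence index derived from a
# sentiment dictionary, plus a simple fuzzy membership over that index.
violent_dictionary = {"attack": 0.9, "destroy": 0.8, "insult": 0.6, "mock": 0.4}

def violence_index(tokens):
    """Average violent-sentiment weight over the tokens of a user post."""
    if not tokens:
        return 0.0
    return sum(violent_dictionary.get(t, 0.0) for t in tokens) / len(tokens)

def high_violence_membership(index, low=0.2, high=0.6):
    """Fuzzy membership in 'highly violent': 0 below low, 1 above high, linear between."""
    return min(1.0, max(0.0, (index - low) / (high - low)))

post = ["they", "attack", "and", "insult", "everyone"]
idx = violence_index(post)
print(idx, high_violence_membership(idx))
```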

Findings

The method helps reveal the internal relationships of online violence in a complex network environment, so that users' sentiment dependence can be characterized from a granular perspective.

Originality/value

Violent sentiment membership degrees are introduced into user relationship recognition in Fancircle communities, and a text sentiment association recognition method based on the VSD is proposed. By calculating the violent sentiment value of user text, violent sentiment is annotated along the topic dimension of the text, and the partial-order relations between fuzzy violence concepts above an effective confidence threshold are used to obtain the association relations.

Details

Data Technologies and Applications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 16 April 2024

Amir Schreiber and Ilan Schreiber

Abstract

Purpose

In the modern digital realm, while artificial intelligence (AI) technologies pave the way for unprecedented opportunities, they also give rise to intricate cybersecurity issues, including threats like deepfakes and unanticipated AI-induced risks. This study aims to address the insufficient exploration of AI cybersecurity awareness in the current literature.

Design/methodology/approach

Using in-depth surveys across varied sectors (N = 150), the authors analyzed the relationship between the absence of AI-risk content in organizational cybersecurity awareness programs and employee awareness of AI threats.
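
One common way to test such a relationship is a chi-square test of independence; the sketch below illustrates the idea with a purely hypothetical contingency table, not the authors' survey data.

```python
# Hypothetical illustration of testing association between AI-risk training
# and awareness of AI threats; the counts below are made up, not survey results.
from scipy.stats import chi2_contingency

#                 aware  not aware
table = [[30, 10],   # received AI-risk training
         [20, 60]]   # no AI-risk training
chi2, p_value, dof, expected = chi2_contingency(table)
print(chi2, p_value)
```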

Findings

A significant AI-risk knowledge void was observed among users: despite frequent interaction with AI tools, a majority remain unaware of specialized AI threats. A pronounced knowledge gap existed between those who had been trained in AI risks and those who had not, and it was more apparent among non-technical personnel and in sectors managing sensitive information.

Research limitations/implications

This study paves the way for thorough research, allowing for refinement of awareness initiatives tailored to distinct industries.

Practical implications

It is imperative for organizations to emphasize AI risk training, especially among non-technical staff. Industries handling sensitive data should be at the forefront.

Social implications

Ensuring employees are aware of AI-related threats can lead to a safer digital environment for both organizations and society at large, given the pervasive nature of AI in everyday life.

Originality/value

Unlike most papers about AI risks, this study does not rely on subjective, second-hand data but uses objective, authentic data from the authors' own up-to-date anonymous survey.

Details

Information & Computer Security, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 2 May 2024

Obafemi Olekanma, Christian Harrison, Adebukola E. Oyewunmi and Oluwatomi Adedeji

Abstract

Purpose

This empirical study aims to explore how actors in specific human resource practices (HRPs), such as line managers (LMs), impact employee productivity measures in the context of financial institution (FI) banks.

Design/methodology/approach

This cross-country study adopted a qualitative methodology. It employed semi-structured interviews to collect data from 12 purposefully selected business-facing directors (BFDs) working in the top 10 banks in Nigeria and the UK. The data collected were analysed with the help of the trans-positional cognition approach (TPCA), a phenomenological method.

Findings

The TPCA analysis implies that, in the UK's and Nigeria's FIs, the human resources practices of BFDs' line managers (LMHRPs) resulted in a highly regulated workplace, knowledge gaps, service operations challenges and subjective, quantitatively driven key performance indicators, which are considered paradoxical elements of service productivity. Although the practices in the UK and Nigerian FIs had similar labels, their aggregates were underpinned by different contextual issues.

Practical implications

To support LMs in better understanding and managing FI BFDs' productivity measures and outcomes, we propose the Managerial Employee Productivity Operational Definition framework as part of their toolkit. This study will be helpful for banking sectors, their regulators, policymakers, other FI industry stakeholders and future researchers in the field.

Originality/value

Within the context of the UK and Nigeria’s FIs, this study is the first attempt to understand how LMHRPs impact BFDs productivity in this manner. It confirms that LMHRPs result in service productivity paradoxical elements with perceived or lost productivity implications.

Details

International Journal of Productivity and Performance Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1741-0401

Case study
Publication date: 27 October 2023

Joe Anderson, Mahendra Joshi and Susan K. Williams

Abstract

Theoretical basis

This compact case provides a relatively large data set that students explore using visualization and a Tableau dynamic dashboard that they create. Students are asked to describe what the data set contains in relation to the employee attrition experience of Baca Beverage Distributors (BBD). The application and managerial questions are set in human resources, at a company facing high attrition during the pandemic.

Research methodology

BBD shared their data and problem scenario for this compact case. The protagonist, Morgan Matthews, was the authors’ contact and provided significant clarification and guidance about the data. Both the company and the protagonist have been disguised. Some of the job positions have been rephrased. All names of employees, supervisors and managers have been replaced with codes.

Case overview/synopsis

During the 2020–2022 pandemic years, BBD experienced, like many companies, a higher than usual employee turnover rate and Morgan Matthews, Director of People, was concerned. Not only was it time-consuming, expensive and disruptive but the company had prided itself on being a good place to work. Were they hiring the right people, people that fit the company culture and people that fit the positions for which they were hired? The company had been using the Predictive Index [1] when on-boarding employees. In addition, there were results from self-reviews and manager reviews that could be used. Morgan wondered if data visualization and visual analytics would be useful in describing their employees and whether it would reveal any opportunities to improve the turnover rate. Before seeking a solution for the high turnover, it was important to step back and learn what the data said about who was leaving and the reasons they gave for leaving.

Complexity academic level

This compact case can be used in courses that include visualization using Tableau and dashboards. As it is a compact case, it requires less preparation time from the students and less class time for discussion. The case is for students who have been recently introduced to business analytics, specifically visualization and data storytelling with Tableau. For this reason, significant guidance has been provided in the case assignment. The level of the case can be adjusted by the amount of guidance provided in the case assignment. Courses include introduction to business analytics, descriptive analytics and visualization, communication through data storytelling. The case can be used for all modalities – in person, hybrid, online. The authors use it here for visualization and dynamic dashboards but using the same data set and compact case description, exploratory data analysis could be assigned.
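
As a hint of the exploratory analysis the case invites, the sketch below uses pandas (rather than Tableau) on a hypothetical miniature of the data set; the column names and values are placeholders, not BBD's data.

```python
# Hypothetical miniature of the case data set, used only to illustrate the kind
# of exploratory questions the case poses (who is leaving, and why).
import pandas as pd

df = pd.DataFrame({
    "position": ["driver", "driver", "warehouse", "sales", "warehouse", "sales"],
    "tenure_years": [0.5, 2.0, 4.0, 1.5, 7.0, 0.8],
    "left_company": [1, 0, 1, 1, 0, 1],
    "exit_reason": ["pay", None, "schedule", "pay", None, "management"],
})

# Attrition rate by position and by tenure band.
attrition_by_position = df.groupby("position")["left_company"].mean().sort_values(ascending=False)
df["tenure_band"] = pd.cut(df["tenure_years"], bins=[0, 1, 3, 5, 10])
attrition_by_tenure = df.groupby("tenure_band", observed=True)["left_company"].mean()

# Stated reasons for leaving, among leavers only.
reasons = df.loc[df["left_company"] == 1, "exit_reason"].value_counts()
print(attrition_by_position, attrition_by_tenure, reasons, sep="\n\n")
```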

Supplementary material

Supplementary material for this article can be found online.

Article
Publication date: 23 October 2023

Mariam Bader, Jiju Antony, Raja Jayaraman, Vikas Swarnakar, Ravindra S. Goonetilleke, Maher Maalouf, Jose Arturo Garza-Reyes and Kevin Linderman

Abstract

Purpose

The purpose of this study is to examine the critical failure factors (CFFs) linked to various types of process improvement (PI) projects, such as Kaizen, Lean, Six Sigma, Lean Six Sigma and Agile, and to propose a corresponding mitigation framework.

Design/methodology/approach

This research undertakes a systematic literature review of 49 papers relevant to the scope of the study, identified through four prominent databases: Google Scholar, Scopus, Web of Science and EBSCO.

Findings

The analysis identifies 39 factors that contribute to the failure of PI projects. Among these, significant emphasis is placed on issues such as "resistance to cultural change," "insufficient support from top management," "inadequate training and education," "poor communication" and "lack of resources" as primary causes of PI project failures. To address and overcome these failures, the authors propose a mitigation framework based on change management models. The authors also present future research directions that aim to enhance both the theoretical understanding and the practical handling of PI project failures.

Practical implications

Through this study, researchers and project managers can benefit from well-structured guidelines and invaluable insights that will help them identify and address potential failures, leading to successful implementation and sustainable improvements within organizations.

Originality/value

To the best of the authors' knowledge, this paper is the first study of its kind to examine the CFFs of five PI methodologies, and it introduces a novel approach derived from change management theory as a solution to minimize the risk associated with PI failure.

Details

International Journal of Lean Six Sigma, vol. 15 no. 3
Type: Research Article
ISSN: 2040-4166
