Search results
1 – 10 of over 210,000
Abstract
The process approach to multi-level organizational behavior is based on the assumption that multi-level organizational behavior is processual in nature. This article defines group and organizational processes and their representation as process frameworks. Both functional and inclusional classes of levels exist, each of which has at least five categories of levels. All ten categories are special cases of process frameworks. This article provides examples of each category of level, which it uses to illustrate new models of organizational work, extended models of interdependence, a new typology of theories based on their levels of processes, and a new tool for survey research called knobby analyses. After explaining the basic idea of knobby analysis, the article briefly describes the processual theory of the organizational hologram, the use of linear programming, and causal-chain analysis to provide multi-level explanations of employee opinion data. These ideas are embodied in conducting a strategic organizational diagnosis, which is the first stage of organizational design. Organizational design encompasses multiple stages, each of which itself involves multiple, multi-level phenomena and analyses. The basic point is that the processual nature of multi-level organizational phenomena gives more hope for improvements in theory building and its application if one uses the process approach rather than a variable approach.
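Since the abstract invokes linear programming as a way to explain employee opinion data, a small illustration may help readers unfamiliar with the technique. The following sketch is a hypothetical, minimal formulation (invented data and factor structure; not the article's actual model): it fits aggregate opinion scores to process-level factors by least absolute deviations, expressed as a linear program via scipy.optimize.linprog.

```python
# Hypothetical sketch: explain aggregate employee-opinion scores as a
# weighted mix of process-level factors via least-absolute-deviations
# regression, written as a linear program. Data and factors are invented.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(8, 3))    # 8 work units x 3 process factors
y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.1, 8)  # opinion scores

n, k = X.shape
# Minimize sum(e) subject to |y - Xw| <= e, w >= 0, e >= 0
c = np.concatenate([np.zeros(k), np.ones(n)])          # only residuals cost
A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])   # Xw - e <= y; -Xw - e <= -y
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (k + n))
print("factor weights:", res.x[:k].round(3))
```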
Abstract
Purpose
Frequent and increasingly potent cyber-attacks, arising from the lack of an optimal mix of technical and non-technical IT controls, have led organizations to adopt more security governance controls. The purpose of this paper, thus, is to construct and empirically validate an information security governance (ISG) process model through the plan–do–check–act (PDCA) cycle model of Deming.
Design/methodology/approach
This descriptive research, conducted within an interpretive paradigm, follows a qualitative methodology based on expert interviews with five respondents working in the ISG domain in the United Arab Emirates (UAE) to validate the theoretical model.
Findings
The findings of this paper suggest the primacy of the PDCA Deming cycle for initiating ISG through a risk-based approach assisted by industry-wide best practices in ISG. Regarding the selection of ISG frameworks, respondents preferred to have ISO 27K supported by NIST as the core framework, with other relevant ISG frameworks/standards forming the peripheral layer. The implementation focus of the ISG model is on mapping ISO 27K/NIST IT controls to relevant IT controls selected from other ISG frameworks, from both a horizontal and a vertical perspective. Respondents also advocated automating the measurement and control mechanism to support the feedback loop of the PDCA cycle.
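To make the feedback loop concrete, here is a minimal, hypothetical sketch of a PDCA-driven ISG iteration in which an automated "check" step flags controls for the next "plan". The control identifiers, targets, and framework mapping are invented placeholders, not the model validated in the paper.

```python
# Hypothetical PDCA-driven ISG loop with an automated "check" step feeding
# gaps back into "plan". Control IDs, thresholds, and the cross-framework
# mapping are invented placeholders for illustration only.
from dataclasses import dataclass

@dataclass
class Control:
    core_id: str                 # a core-framework control (placeholder ID)
    peripheral_ids: list[str]    # mapped controls from peripheral frameworks
    target: float                # desired compliance level (0..1)
    measured: float = 0.0        # latest automated measurement

def pdca_cycle(controls: list[Control], measure) -> list[str]:
    """One pass of plan-do-check-act; returns controls flagged for re-planning."""
    flagged = []
    for c in controls:            # DO: controls assumed implemented elsewhere
        c.measured = measure(c)   # CHECK: automated measurement
        if c.measured < c.target: # ACT: feed gaps into the next PLAN
            flagged.append(c.core_id)
    return flagged

controls = [Control("ISO-A.5.1", ["NIST-PR.IP-1"], target=0.9),
            Control("ISO-A.8.2", ["NIST-PR.DS-5"], target=0.8)]
print(pdca_cycle(controls, measure=lambda c: 0.85))   # flags the first control
```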
Originality/value
The validated model helps academics and practitioners gain insight into the methodology of the phased implementation of an information security governance process through the PDCA model, as well as the positioning of IT governance (ITG) and ITG frameworks within ISG. Practitioners can glean valuable insights from the empirical section of the research, where experts detail the success factors, the sequential steps and the justification of these factors in the ISG implementation process.
Abdulrahman Alrabiah and Steve Drew
Abstract
Purpose
This paper first aims to examine how business process change decisions (BPCDs) were implemented in a government organisation bound by tightly coupled temporal constraints (TTCs). Second, it focuses on how to achieve optimal and efficient BPCDs that require tight compliance with regulators’ temporal constraints. Finally, it formulates a rigorous framework that can facilitate the execution of optimal BPCDs with maximum efficiency and minimal effort, time and cost.
Design/methodology/approach
Decision-making biases by individuals or groups in organisations can impede optimal BPC implementation; to demonstrate this, a case study is investigated and the formulated framework is applied to tackle these failings.
Findings
The case study analysis shows 76 per cent of the BPCDs implemented were inefficient, mostly because of poor decisions, and these resulted in negative ripple effects. In response, the newly developed hierarchical change management structure (HCMS) framework was used to empower organisations to execute high-velocity BPCDs, enabling them to handle any temporal constraints imposed by regulators or other exogenous factors. The HCMS framework was found to be highly effective, scoring an average improvement of more than 100 per cent when measured using decision quality dimensions. This paper would be of value for business executives and strategic decision makers engaging with BPC.
Research limitations/implications
The HCMS framework has been applied in a single case study as a proof of concept. Future research could extend its application to broader domains that have multi-attribute structures and environments. The evaluation processes of the proposed framework are based on subjective metrics. Causal links from the framework to business process metrics will provide a more complete performance picture.
Practical implications
The outcome of this research assists in formulating a systematic BPCD framework that was otherwise unavailable. Practical use of the proposed framework would potentially improve quality outcomes for organisations. The model is derived from decision trees and the analytic hierarchy process and is tailored to address this problematic area. The proposed HCMS framework would help organisations to execute efficient BPCDs with minimal time, effort and cost. The HCMS framework also contributes to the academic literature on BPCDs that leverage diverse stakeholders to engage in BPC initiatives.
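Because the HCMS framework is described as derived from decision trees and the analytic hierarchy process (AHP), a generic AHP calculation may clarify the mechanics. The sketch below computes priority weights from an invented 3x3 pairwise-comparison matrix and checks consistency; it illustrates AHP in general, not the HCMS model itself.

```python
# Generic analytic-hierarchy-process (AHP) priority calculation on an
# invented pairwise-comparison matrix; illustrative only, not the HCMS model.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 0.58                               # Saaty random index RI = 0.58 for n = 3
print("priorities:", weights.round(3), "consistency ratio:", round(cr, 3))
```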
Originality/value
The research presents a novel framework – HCMS – that provides a platform for organisations to easily determine and solve hierarchical decision structure problems, thereby allowing them to efficiently automate and institutionalise optimal BPCDs.
Jonathan Robin Crusoe and Karin Ahlin
Abstract
Purpose
This paper aims to develop a user process framework with activities and their variations for the use of open government data (OGD), based on empirical material and previous research. OGD is interoperable data that is shared by public organisations (publishers) for anyone (users) to reuse without restrictions to create new digital products and services. The user process was roughly identified in previous research but lacks an in-depth description. This gap can hamper efforts to encourage use and to develop related theories.
Design/methodology/approach
A three-stage research approach was used. First, a tentative framework was created from previous research and empirical material. This stage involved three different literature reviews, data mapping and seven interviews with OGD experts. The empirical material was analysed with inductive analysis, and previous research was integrated into the framework through concept mapping. Second, the tentative framework was reviewed by informed OGD experts. Third, the framework was finalised with additional literature reviews, eight interviews with OGD users, and a member check, including all the respondents. The framework was used to guide the data collection and as a tool in the analysis.
Findings
The user process framework covers activities and related variations across five phases: start, identify, acquire, enrich and deploy. The start phase varies with the intended use of the OGD. In the identify phase, the user explores the accessible data to decide whether the data are relevant. In the acquire phase, the user prepares for the delivery of the data from the publisher and receives it. In the enrich phase, the user combines and transforms the data to create something new. In the final deploy phase, the user has a product or service that can be provided to end-users.
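For readers who want the phase structure at a glance, the sketch below encodes the five phases as an ordered enumeration. The phase names and descriptions paraphrase the abstract; the representation as a simple pipeline is an assumption for illustration.

```python
# A minimal sketch of the OGD user process as an ordered pipeline of phases.
# Phase names follow the abstract; the pipeline form is an illustrative guess.
from enum import Enum

class Phase(Enum):
    START = "decide on the intended use of the OGD"
    IDENTIFY = "explore accessible data and judge relevance"
    ACQUIRE = "arrange delivery from the publisher and receive the data"
    ENRICH = "combine and transform the data into something new"
    DEPLOY = "provide the resulting product or service to end-users"

def run_user_process():
    for phase in Phase:                     # Enum preserves definition order
        print(f"{phase.name:>8}: {phase.value}")

run_user_process()
```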
Research limitations/implications
The framework development has some limitations: the framework needs testing and development in different contexts and further verification. The implications are that the framework can help guide researchers towards relevant and essential data on the user process, be used as a point of comparison in analysis, and serve as a skeleton for more precise theories.
Practical implications
The framework has some practical implications for users, publishers and portals. It can introduce users to the user process and help them plan its execution. The framework can help publishers understand how users can work with their data and what can be expected of them. The framework can help portal owners understand the portal’s role between users and publishers and what functionality and features they can provide to support the user.
Originality/value
In previous research, no in-depth description of the user process was identified; several studies have offered only a rough outline. This research therefore provides an in-depth description of the user process and its variations. The framework can support practice and opens new research avenues.
Mohammad Ehson Rangiha, Marco Comuzzi and Bill Karakostas
Abstract
Purpose
The purpose of this paper is to present a framework for social business process management (BPM) in which social tagging is used to capture process knowledge emerging during the design and enactment of the processes. Process knowledge concerns both the type of activities chosen to fulfil a certain goal and the skills and experience of users in executing specific tasks. This knowledge is exploited by recommendation tools to support the design and enactment of current and future process instances.
Design/methodology/approach
The literature on traditional BPM is analysed to highlight its limitations in managing ad hoc and semi-structured processes. Having identified this gap, an innovative BPM framework based on social tagging is proposed to address these limitations. The model is exemplified in a real case scenario and evaluated through the implementation of a prototype and a case study in a real-world non-profit organisation.
Findings
An overview of the social BPM framework is presented, introducing the concepts of role and task recommendation, which are supported by social tagging. The prototype shows the feasibility of building the social BPM framework as an extension of a Wiki platform. The case study demonstrates that the social BPM framework improves user collaboration in designing and executing process instances.
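As an illustration of how social tags could drive task or role recommendation, the sketch below ranks candidate performers by Jaccard similarity between a task's tags and the tags attached to each user's past work. This is a generic tag-matching approach assumed for illustration; the paper's actual recommendation mechanism may differ.

```python
# Generic tag-based task/role recommendation via Jaccard similarity.
# Illustrative only; not the paper's actual recommendation mechanism.
def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_performers(task_tags, user_tag_history, top_n=2):
    """Rank users by overlap between task tags and their past work's tags."""
    scores = {user: jaccard(set(task_tags), set(tags))
              for user, tags in user_tag_history.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

history = {"alice": ["budget", "report", "finance"],
           "bob": ["design", "wiki", "layout"],
           "carol": ["report", "finance", "audit"]}
print(recommend_performers(["finance", "report"], history))
```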
Research limitations/implications
The applicability of the framework is targeted to ad hoc and possibly semi-structured business processes and it does not extend to highly procedural and codified processes. A single case study limits the generalisability of the evaluation results.
Originality/value
The social BPM framework is the first to introduce task and role recommendation supported by social tagging to overcome the limitations of traditional BPM models.
Edward Kabaale and Geoffrey Mayoka Kituyi
Abstract
Purpose
Requirements engineering (RE) process improvement has been identified as one of the key factors for improving software quality. Despite this, little scholarly work has been done on developing ways to improve the RE process. The situation is even worse in small and medium enterprises that produce software, and consequently the quality of software produced by these companies has continued to deteriorate. The purpose of this paper is to design a framework that will help small and medium software companies improve their RE processes in order to compete favorably with larger software companies, especially in terms of software quality.
Design/methodology/approach
A qualitative research approach was adopted. Four software companies in Uganda were purposively selected to participate in the study. Data were collected using questionnaires. The requirements for designing the framework were gathered and refined from both primary and secondary data.
Findings
The key requirements for process improvement in small and medium software companies were identified as user involvement, use of an evolutionary requirements engineering process improvement (REPI) strategy, change management, training and education, and management support and commitment.
Practical implications
The designed framework was validated to ensure that it can be applied to RE process improvement in small and medium software companies. Validation results show that the proposed framework is applicable and can be used to improve RE processes in small and medium software companies.
Originality/value
The paper presents an improvement of the systematic approach to REPI by Kabaale and Nabukenya, decomposed for easy understanding by non-technical readers and users.
Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest
Abstract
Purpose
This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the “Analyze” stage of Six Sigma’s DMAIC (Define, Measure, Analyze, Improve and Control) cycle.
Design/methodology/approach
The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.
Findings
This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.
Originality/value
This research paper introduces a first-of-its-kind Quality 4.0 framework – the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. According to the SLR conducted, it is the first Quality 4.0 framework to utilize the principal component analysis (PCA) technique as a substitute for “Screening Design” in the Design of Experiments phase and the K-means clustering technique for multivariable analysis, identifying process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.
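The PCA-plus-K-means pairing highlighted above can be illustrated generically with scikit-learn. The sketch below (synthetic data; not the IQ4.0F's actual pipeline) standardizes process variables, projects them onto two principal components, and clusters the scores to surface groups of similar process conditions.

```python
# Generic PCA + K-means sketch on synthetic process data (scikit-learn).
# Illustrative only; not the IQ4.0F's actual analysis pipeline.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 6))                 # 200 runs x 6 process variables

X_std = StandardScaler().fit_transform(X)     # standardize before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)             # project onto 2 components
print("explained variance:", pca.explained_variance_ratio_.round(2))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("runs per cluster:", np.bincount(labels))
```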
Diane-Laure Arjaliès, Daniela Laurel-Fois and Nicolas Mottis
Abstract
Purpose
This article seeks to unravel the mechanisms through which financial actors agreed upon a sustainability accounting standard without financializing social and environmental issues, i.e. without assigning a monetary value to sustainability.
Design/methodology/approach
The article examines the Reporting and Assessment Framework created by the United Nations Principles for Responsible Investment (UN-PRI), the leading reporting sustainability framework in the asset management industry. It relies on a longitudinal case study that draws upon interviews, participant observation, and archival data.
Findings
The article demonstrates that the conception of the framework was a funnelling process of sustainability valuation comprising two co-constituted mechanisms: a process of valorization – judging what is deemed of value – and a process of evaluation – agreeing on how to assess value. This valuation process unfolded through the creation of the framework, thanks to two enabling conditions: the creation of non-prescriptive evaluative criteria that avoided financialization, and the valuation support of an enabling organization.
Originality/value
The article helps in understanding how an industry can encompass the diversity of motives and practices associated with the adoption of sustainability by its economic actors while suggesting a common framework to report on and assess those practices. It uncovers alternatives to the financialization of sustainability accounting standards. The article also offers insights into the advantages and drawbacks of such a framework, and enriches the literature on the sociology of valuation, financialization, and sustainability accounting.
Euthemia Stavrulaki and Mark Davis
Abstract
Purpose
As supply chain management has become more strategic (rather than transactional) in nature, the need for a more integrated perspective on how products and processes should be aligned with strategic decisions to enhance competitive advantage has been amplified. The purpose of this paper is to provide a better understanding of how this alignment should be done.
Design/methodology/approach
A conceptual framework was developed that emphasizes the need for alignment between the key aspects of a product and its supply chain processes and highlights the links between supply chain processes and supply chain strategy.
Findings
Products can be produced with one of four distinct supply chain structures: make to stock, assemble to order, build to order and design to order. Each supply chain structure is appropriate for different products based on their demand characteristics. Each supply chain structure orients its production and logistics processes differently based on its strategic priorities.
Practical implications
High volume, low demand uncertainty products should be matched with lean supply chains enabled by efficient processes, whereas low volume, high uncertainty products should be matched with agile supply chains enabled by flexible processes. Medium volume and medium demand uncertainty products should use leagile supply chains that use a combination of efficient and flexible processes.
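The volume/uncertainty matching rule lends itself to a tiny decision function. The thresholds and normalization below are invented for illustration; the paper presents this mapping conceptually rather than computationally.

```python
# A toy decision rule matching demand characteristics to a supply chain
# strategy, following the lean/agile/leagile mapping described above.
# Thresholds are invented for illustration.
def supply_chain_strategy(volume: float, demand_uncertainty: float) -> str:
    """volume and demand_uncertainty are normalized to the 0..1 range."""
    if volume >= 0.7 and demand_uncertainty <= 0.3:
        return "lean (efficient processes)"
    if volume <= 0.3 and demand_uncertainty >= 0.7:
        return "agile (flexible processes)"
    return "leagile (mix of efficient and flexible processes)"

print(supply_chain_strategy(0.9, 0.1))   # high volume, low uncertainty
print(supply_chain_strategy(0.2, 0.8))   # low volume, high uncertainty
print(supply_chain_strategy(0.5, 0.5))   # medium volume and uncertainty
```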
Originality/value
After thoroughly reviewing and synthesizing important findings from the existing literature, an integrated framework is derived that highlights how products are best matched with their production and logistics processes. The framework is also compared with two well-known, process-oriented supply chain frameworks: the supply chain operations reference (SCOR) and the global supply chain forum (GSCF) models.
Michele Rubino, Filippo Vitolla and Antonello Garzoni
Abstract
Purpose
The purpose of this paper is to analyze how an IT governance framework [Control Objectives for Information and related Technology (COBIT)] influences the control environment and the internal control system. In particular, it aims to illustrate how COBIT’s structure and processes impact the seven categories of factors that compose the control environment.
Design/methodology/approach
This paper highlights how an IT governance framework and its processes enable organizations to improve the assessment and implementation of the control environment.
Findings
The analysis indicates that the implementation of the COBIT framework provides useful indications for managers and auditors who must implement or assess an internal control system.
Practical implications
The adoption of the framework allows managers to focus effectively on integrating, aligning and linking processes. This improves the understanding of the key aspects connected to the control environment. In addition, adopting the framework makes it possible to overcome some limitations of the Committee of Sponsoring Organizations (COSO) framework.
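As a purely illustrative aid, the sketch below models the kind of mapping the paper discusses: IT-governance process identifiers linked to the control-environment factor categories they influence. The process IDs and their links are hypothetical placeholders; only the seven factor names follow the COSO control environment.

```python
# Hypothetical mapping from IT-governance processes to the seven COSO
# control-environment factor categories. Process IDs and links are invented
# placeholders; only the seven factor names follow the COSO framework.
FACTORS = [
    "integrity and ethical values",
    "commitment to competence",
    "board of directors and audit committee",
    "management's philosophy and operating style",
    "organizational structure",
    "assignment of authority and responsibility",
    "human resource policies and practices",
]

process_to_factors = {             # hypothetical governance processes
    "P1-define-it-organisation": ["organizational structure",
                                  "assignment of authority and responsibility"],
    "P2-manage-it-human-resources": ["commitment to competence",
                                     "human resource policies and practices"],
}

# For an auditor: list which factors a given process implementation touches
for process, touched in process_to_factors.items():
    assert all(f in FACTORS for f in touched)   # sanity-check the mapping
    print(f"{process}: {', '.join(touched)}")
```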
Originality/value
This paper addresses an area of relevance to both practitioners and academics. The analysis focuses on Accounting Information Systems themes and, through the examination of an IT governance framework, suggests solutions and tools that can help managers and auditors to address the control environment assessment.