Search results
11–20 of over 258,000

Thomas R. Gulledge and Rainer A. Sommer
Abstract
The management of the US Department of Defense (DoD) enterprise must change. Years of under‐funding have led to a wide gap between enterprise support requirements and resources. Private sector firms have faced similar choices. This paper shows how the public enterprise can be changed. Our hypothesis is that private sector implementations of standard software will lead to increased effectiveness and efficiency in public sector organizations. Sufficient detail is provided on how to transition to a modern integrated public sector enterprise, and the steps for implementing such a project are outlined, following standard private sector implementation practices. To explain the problem and solution, the DoD installation management enterprise is used as an example.
Martin Aruldoss, Miranda Lakshmi Travis and V. Prasanna Venkatesan
Abstract
Purpose
Bankruptcy is the financial failure of a business or organization. Many techniques have been proposed to predict it, but they are restricted to the prediction itself and do not address the associated activities, such as acquiring suitable data and delivering the processed results to the user. This situation calls for a comprehensive, intelligent solution for bankruptcy prediction. The paper aims to discuss these issues.
Design/methodology/approach
To model a Business Intelligence (BI) solution for bankruptcy prediction, the concept of a reference model is used. A Reference Model for Business Intelligence to Predict Bankruptcy (RMBIPB) is designed by applying unit operations in a hierarchical structure with abstract components. The layers of RMBIPB are constructed from this hierarchical structure and from the components of the reference model. Each layer is designed according to the functional requirements of the Business Intelligence System (BIS).
Findings
The reference model exhibits the non-functional software qualities intended for the appropriate unit operations. Its flexible design allows techniques to be selected with minimal effort when conducting bankruptcy prediction, and the same reference model can be implemented in another domain with different kinds of bankruptcy prediction techniques.
Research limitations/implications
The model is designed using unit operations, and the software qualities exhibited by RMBIPB are limited by those unit operations. The data set applied in RMBIPB is limited to Indian banks.
Originality/value
A comprehensive bankruptcy prediction model using BI with customized reporting.
Martin Hubert Ofner, Kevin Straub, Boris Otto and Hubert Oesterle
Abstract
Purpose
The purpose of the paper is to propose a reference model describing a holistic view of the master data lifecycle, including strategic, tactical and operational aspects. The Master Data Lifecycle Management (MDLM) map provides a structured approach to analyze the master data lifecycle.
Design/methodology/approach
Embedded in a design oriented research process, the paper applies the Component Business Model (CBM) method and suggests a reference model which identifies the business components required to manage the master data lifecycle. CBM is a patented IBM method to analyze the key components of a business domain. The paper uses a participative case study to evaluate the suggested model.
Findings
Based on a participative case study, the paper shows how the reference model makes it possible to analyze the master data lifecycle on a strategic, a tactical and an operational level, and how it helps identify areas of improvement.
Research limitations/implications
The paper presents design work and a participative case study. The reference model is grounded in existing literature and represents a comprehensive framework forming the foundation for future analysis of the master data lifecycle. Furthermore, the model represents an abstraction of an organization's master data lifecycle. Hence, it forms a “theory for designing”. More research is needed in order to more thoroughly evaluate the presented model in a variety of real‐life settings.
Practical implications
The paper shows how the reference model enables practitioners to analyze the master data lifecycle and how it helps identify areas of improvement.
Originality/value
The paper reports on an attempt to establish a holistic view of the master data lifecycle, including strategic, tactical and operational aspects, in order to provide more comprehensive support for its analysis and improvement.
Aggeliki Tsohou, Habin Lee, Zahir Irani, Vishanth Weerakkody, Ibrahim H. Osman, Abdel L. Anouze and Tunc Medeni
Abstract
Purpose
Evaluating and optimizing e-government services is imperative for governments, especially given the capacity of e-services to transform public administrations and assist governments' interactions with citizens, businesses and other government agencies. Existing, widely applied evaluation approaches neglect to incorporate citizens' satisfaction measures. The purpose of this paper is twofold: to contribute to the understanding of citizen-centric e-government evaluation and unify existing key performance indicators (KPIs); and to propose a reference process model for a novel evaluation approach that uses the unified KPIs to facilitate the creation of a "know-how" repository.
Design/methodology/approach
The authors adopt a quantitative research approach to the evaluation of e-government services based on data envelopment analysis (DEA). A survey was conducted for the empirical investigation, and data were collected from 13 e-government services in Turkey. Based on the empirical application of the e-government evaluation method, a reference process model is designed.
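The core of the evaluation method, data envelopment analysis, computes a relative efficiency score for each decision-making unit by solving a small linear program per unit. As a minimal sketch, the input-oriented CCR multiplier model can be solved with `scipy.optimize.linprog`; the three services and their input/output figures below are invented for illustration and are not the paper's Turkish survey data.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up data: three e-services, two inputs (e.g. cost, staff hours)
# and one output (e.g. transactions served). Not the paper's data set.
X = np.array([[2.0, 3.0],
              [4.0, 3.0],
              [3.0, 2.0]])           # inputs, one row per service
Y = np.array([[5.0], [4.0], [6.0]])  # outputs, one row per service

def ccr_efficiency(k, X, Y):
    """CCR efficiency of unit k: maximize u.y_k subject to
    v.x_k = 1 and u.y_j - v.x_j <= 0 for every unit j (u, v >= 0)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])             # linprog minimizes
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]  # v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

scores = [ccr_efficiency(k, X, Y) for k in range(len(X))]
# Units scoring 1.0 lie on the efficient frontier; lower scores flag
# "insufficient" services and quantify how far they fall short.
```

This is how DEA yields richer output than a single statistic: the envelopment (dual) form of the same program additionally identifies the efficient peers an inefficient service should emulate, which is the basis for improvement suggestions.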
Findings
The proposed evaluation method was proved valid and able to provide assessment with richer explanations than traditional statistical measurements. DEA enabled the identification of insufficient e‐government services and the provision of suggested improvements.
Research limitations/implications
The reference process model is constructed from the experience gained by applying the method in a single cultural setting, i.e. e-government services in Turkey.
Practical implications
The proposed evaluation method, in comparison to other user‐oriented ones, provided assessments with richer explanations than traditional statistical measurements, such as structured equation modelling. The reference process model constructed based on the empirical research is expected to accelerate the citizen‐oriented evaluation of e‐government and promote impact‐oriented indicators.
Originality/value
This is the first application of DEA in the e‐government field, although it has been widely applied for performance measurement in other fields, especially operations research. The novelty of DEA is that the assessment results provide suggestions for strategic improvement of the e‐services.
Thomas R. Gulledge, Phil Hayes, Alexander Lotterer and Georg Simon
Abstract
The US Department of Defense (DoD) is engaged in a multi‐year transformation of logistics planning and execution, known as the Future Logistics Enterprise (FLE). It is currently being defined in policy documents and an implementation plan known as the Future Logistics Architecture (FLA). The systems strategy for the FLE is still emerging, but it is anticipated that commercial standard software will play a significant role in the enablement of the new logistics business processes. A number of products are available for implementation, but this paper focuses on mySAP.com from SAP AG. We show the strategy for aligning the SAP reference hierarchy and the associated reference business process models with the FLA. The result of the mapping and associated analysis is an SAP reference model for the FLE, which can be used as a guide for the software vendor for future product development strategies. This paper reports on the development of the FLA, its alignment with mySAP.com and the development of the SAP reference model.
Luiz C.R. Carpinetti, Thiago Buosi and Mateus C. Gerólamo
Abstract
This paper presents a reference model for the process of quality management and improvement. It is based on a conceptual framework for systematically deriving improvement actions from customer expectations and strategic decisions through business processes, and for prioritising the actions that will contribute most to the achievement of strategic objectives. After some theoretical background on the need to manage quality and improvement systematically, and on the contribution of mapping business processes, the process reference model is described and detailed by means of activity-tree and event-driven process chain (EPC) diagrams. Finally, some considerations are made on the benefits of using such an approach.
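The EPC notation used to detail the reference model can be illustrated with a toy example: an event-driven process chain alternates passive events and active functions, starting and ending with an event. The chain and labels below are invented; real EPC diagrams also contain logical connectors (AND/OR/XOR), omitted here for brevity.

```python
from typing import NamedTuple

class Node(NamedTuple):
    kind: str   # "event" or "function"
    label: str

# An invented improvement-derivation chain in EPC style.
chain = [
    Node("event", "customer expectation captured"),
    Node("function", "analyse affected business process"),
    Node("event", "improvement opportunity identified"),
    Node("function", "prioritise improvement action"),
    Node("event", "improvement action scheduled"),
]

def is_valid_epc(chain):
    """A plain EPC path starts and ends with an event, and events and
    functions strictly alternate along the chain."""
    if not chain or chain[0].kind != "event" or chain[-1].kind != "event":
        return False
    return all(a.kind != b.kind for a, b in zip(chain, chain[1:]))
```

Encoding the diagrams this way makes the well-formedness rules of the notation checkable by machine rather than by visual inspection.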
David Little, Matthew Peck, Ralph Rollins and Keith Porter
Abstract
For the past 20 years, production planning and control has been dominated by manufacturing resource planning (MRPII) and its antecedents. The authors are completing case-study-based research aimed at developing novel planning and scheduling reference models for industrial sectors where the MRPII paradigm is not appropriate. The paper outlines the process mapping approach adopted for data capture within the case study companies and the use of ARIS, Scheer's enterprise modelling tool, for the production of sector reference models.
Thomas Gulledge and Tamer Chavusholu
Abstract
Purpose
This paper aims to automate the supply chain operations reference (SCOR) model as an enabler for process‐oriented supply chain business intelligence.
Design/methodology/approach
The hypothesis is that SCOR model automation is possible using data extracted directly from integrated enterprise systems. To test the hypothesis, the authors developed an alignment product that automates the SCOR model with information extracted directly from the Oracle E-Business Suite.
Findings
In order to achieve the full benefits from the SCOR model, effective business process management and the SCOR key performance indicators (KPIs) must be implemented and used. Unless data collection to support KPI construction is automated, it is difficult to institutionalize the SCOR model as a measurement and benchmarking framework. We have demonstrated that automated support for KPIs is feasible and achievable.
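As an illustration of what automated KPI construction means in practice, the sketch below computes one widely used SCOR level-1 metric, Perfect Order Fulfillment, from order records. The record fields are invented stand-ins for rows that a real deployment would extract from the enterprise system; this is not the Oracle E-Business Suite schema.

```python
from dataclasses import dataclass

@dataclass
class Order:
    """One delivered order, with the four conditions SCOR's Perfect
    Order Fulfillment checks. Field names are illustrative only."""
    complete: bool      # delivered in full
    on_time: bool       # delivered by the committed date
    damage_free: bool   # no damage on receipt
    docs_correct: bool  # accompanying documentation accurate

    @property
    def perfect(self) -> bool:
        return (self.complete and self.on_time
                and self.damage_free and self.docs_correct)

def perfect_order_fulfillment(orders):
    """Share of orders meeting all four conditions simultaneously."""
    return sum(o.perfect for o in orders) / len(orders)

orders = [
    Order(True, True, True, True),
    Order(True, False, True, True),   # late delivery
    Order(True, True, True, True),
    Order(False, True, True, True),   # short shipment
]
kpi = perfect_order_fulfillment(orders)
```

The point of automation is exactly this: once each condition maps onto fields already captured by the enterprise system, the KPI can be recomputed continuously instead of being assembled by hand for each benchmarking exercise.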
Research limitations/implications
The E‐Business Suite is a single enterprise solution, but we assert that the same procedures could be followed with other enterprise solutions or even applied in a legacy system environment.
Originality/value
The developed solution described in the paper can immediately be applied to the design, development, and deployment of corporate performance management systems.
Sarra Dahmani, Xavier Boucher, Didier Gourc, Sophie Peillon and François Marmier
Abstract
Purpose
The paper proposes an innovative systemic method helping decision-makers to control servitization transition process, through decision process risk diagnosis.
Design/methodology/approach
The proposed method is based on the modeling of decision processes and on risk identification and analysis. It was developed through an action-research approach, in close relationship with two companies (SMEs). The paper develops the feasibility experiment at the Automelec company.
Findings
The method was successfully implemented and delivered concrete diagnosis results.
Research limitations/implications
The generalization of the applicability of the method needs to be tested on several different cases.
Practical implications
The first practical implication is the method's efficiency in helping decision-makers in a servitization context to limit uncertainty and obtain a global view of the weaknesses of their decision-making process; it raises their awareness of the servitization transition for their companies. Furthermore, the method helps to explain the strategy of a servitization transition. It enhances the maturity of the company's decision process and can be used as a training/learning tool for managers.
Social implications
The research gives the decision-making boards of organizations undergoing a servitization transition, especially SMEs, better control over the servitization decision process and its related risks. This increases the economic stability of the company and its vision over long, medium and short horizons, has a positive impact on the overall economic and social environment and networks of the servitized SME, and enhances the confidence of coworkers, subcontractors and clients.
Originality/value
The first originality of the paper is a new way of considering risk: not only as an analysis criterion, but as the central driver in steering a strategic transition for the company, such as servitization. The second is the assessment of risk occurrence over a decision-making process through decision reliability and decision confidence.
Frank Teuteberg, Martin Kluth, Frederik Ahlemann and Stefan Smolnik
Abstract
Purpose
The purpose of this paper is to illustrate and evaluate the semantic process benchmarking concept.
Design/methodology/approach
The authors' approach includes the use of metamodels and ontologies, which make the process models syntactically and semantically comparable. Furthermore, a software prototype is presented to analyze and compare individual process models and their performance information. Thereafter, the technical, conceptual, and economic perspectives of the approach's evaluation are aligned with their respective outcomes.
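The idea of making differently worded process models semantically comparable can be sketched in a few lines: once each activity label is annotated with a concept from a shared ontology, simple set operations expose overlaps and gaps between two models. The ontology, labels and processes below are invented for illustration; the authors' actual approach additionally uses metamodels and attaches performance information to the compared processes.

```python
# Invented ontology: maps surface activity labels to shared concepts.
ontology = {
    "check invoice": "InvoiceVerification",
    "verify invoice": "InvoiceVerification",
    "approve payment": "PaymentApproval",
    "release payment": "PaymentExecution",
    "pay supplier": "PaymentExecution",
}

def annotate(process):
    """Replace each activity label with its ontology concept, making
    two differently worded models comparable."""
    return [ontology[step] for step in process]

# Two invented process models that describe overlapping work in
# different vocabulary.
model_a = ["check invoice", "approve payment", "release payment"]
model_b = ["verify invoice", "pay supplier"]

a, b = set(annotate(model_a)), set(annotate(model_b))
shared = a & b   # activities both processes perform
only_a = a - b   # activities only the first process performs
```

Purely syntactic comparison would find no common step between these two models; the semantic annotation is what makes "release payment" and "pay supplier" benchmarkable against each other.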
Findings
The evaluation proves that this approach is generally suitable to generate novel and useful information on different process models and their performance within the same problem domain. However, the initial set‐up costs are high and will only pay off once process models are used regularly.
Practical implications
The proposed approach depends strongly on the availability of appropriate metrics and ontologies, as well as on the annotation of process models with these ontologies, which is a time-consuming task. If large benchmarking clearing centers are established, the approach will become more cost-effective. The developed SEMAT prototype, which demonstrates the proposed approach's general viability, supports cost-effective ontology engineering and annotation in the context of semantic process benchmarking initiatives.
Originality/value
To date, process benchmarking has primarily been a manual process. In this article, the authors suggest an approach that allows time‐consuming and costly process analysis to be partially automated, which makes the performance indicators, as well as qualitative differences between processes, apparent.