Search results

1–10 of over 59,000
Article
Publication date: 1 February 2011

Risto Silvola, Olli Jaaskelainen, Hanna Kropsu‐Vehkapera and Harri Haapasalo

Abstract

Purpose

This paper aims to provide a framework of the multidimensional concept of one master data. Preconditions required for successful one master data implementation and usage in large high‐tech companies are presented and related current challenges companies have today are identified.

Design/methodology/approach

This paper is qualitative in nature. First, literature was studied to find out the elements of one master data. Second, an interview study was carried out in eight high‐tech companies and in three expert companies.

Findings

The one master data management framework is a composition of data, processes and information systems. The key data-related challenges are that the definitions of master data are unclear and overall data quality is poor. The process-related challenges in managing master data are inadequately defined data ownership, incoherent data management practices and a lack of continuous data quality practices. Integration between applications is a fundamental challenge to tackle when constructing holistic one master data.

Research limitations/implications

The studied companies are vanguards in the area of master data management (MDM), providing good views on topical issues in large companies. This study offers a general view of the topic but does not describe specific company situations, as companies need to adapt the presented concepts to their own case. A significant implication for future research is that MDM can no longer be classified and discussed as only an IT problem; it is a managerial challenge that requires a structural change in the mindset with which issues are handled.

Practical implications

This paper provides a better understanding of the issues impacting the implementation of one master data. The preconditions for implementing and executing one master data are: an organization-wide, defined data model; clear data ownership definitions; proactive data quality surveillance; a data-friendly company culture; clear definitions of roles and responsibilities; an organizational structure that supports data processes; clear data process definitions; support from the managerial level; and information systems that utilize the unified data model. The breadth of this list of preconditions also reflects the incoherence of current understanding of MDM. The list helps business managers to understand the extent of the concept and to see that master data management is not only an IT issue.

Originality/value

The existing practical research on master data management is limited and, for example, the general challenges have not been reported earlier. This paper offers practical research on one master data. The obtained results illustrate the extent of the topic and the fact that business-relevant data management is not only an IT (application) issue but requires understanding of the data, its utilization in the organization, and supporting practices such as data ownership.

Details

Industrial Management & Data Systems, vol. 111 no. 1
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 19 July 2013

Martin Hubert Ofner, Kevin Straub, Boris Otto and Hubert Oesterle

Abstract

Purpose

The purpose of the paper is to propose a reference model describing a holistic view of the master data lifecycle, including strategic, tactical and operational aspects. The Master Data Lifecycle Management (MDLM) map provides a structured approach to analyze the master data lifecycle.

Design/methodology/approach

Embedded in a design oriented research process, the paper applies the Component Business Model (CBM) method and suggests a reference model which identifies the business components required to manage the master data lifecycle. CBM is a patented IBM method to analyze the key components of a business domain. The paper uses a participative case study to evaluate the suggested model.

Findings

Based on a participative case study, the paper shows how the reference model makes it possible to analyze the master data lifecycle on a strategic, a tactical and an operational level, and how it helps identify areas of improvement.

Research limitations/implications

The paper presents design work and a participative case study. The reference model is grounded in existing literature and represents a comprehensive framework forming the foundation for future analysis of the master data lifecycle. Furthermore, the model represents an abstraction of an organization's master data lifecycle. Hence, it forms a “theory for designing”. More research is needed in order to more thoroughly evaluate the presented model in a variety of real‐life settings.

Practical implications

The paper shows how the reference model enables practitioners to analyze the master data lifecycle and how it helps identify areas of improvement.

Originality/value

The paper reports on an attempt to establish a holistic view of the master data lifecycle, including strategic, tactical and operational aspects, in order to provide more comprehensive support for its analysis and improvement.

Details

Journal of Enterprise Information Management, vol. 26 no. 4
Type: Research Article
ISSN: 1741-0398

Details

Process Automation Strategy in Services, Manufacturing and Construction
Type: Book
ISBN: 978-1-80455-144-8

Article
Publication date: 10 April 2017

Riikka Vilminko-Heikkinen and Samuli Pekkola

Abstract

Purpose

Master data management (MDM) aims to improve the value of an organization’s most important data, such as customer data, by bridging the silos between organizational units and information systems. However, incorporating data management practices into an organization is not a simple task. The purpose of this paper is to provide a new understanding of the challenges in establishing and developing the MDM function within an organization.

Design/methodology/approach

This paper reports an ethnographic study within a municipality. The data were collected from two consecutive MDM development projects over a period of 32 months by observing MDM-related activities and interviewing the appropriate actors. Observations, interviews, and impressions were documented in a diary that was later qualitatively analyzed. Various project documents were also used.

Findings

In total, 15 challenges were identified, seven of which had not been identified earlier in the literature. The new challenges included legislation-driven challenges, mutual understanding of master data domains, and the level of granularity for those domains. Eight issues, such as data ownership and data definitions, were MDM specific, the others being more generic. All of the issues were identified as preconditions for, or factors affecting, the others, and three were identified as pivotal. The issues emphasize the need for strong alignment between the complex concept of MDM and the organization adopting it.

Research limitations/implications

This research was based on a single qualitative case study, and caution should be exercised with regard to generalizations. The findings increase understanding of complex organizational phenomena. The study offers public and private sector practitioners insights into the organizational issues that establishing an MDM function can encounter.

Originality/value

The issues discovered in the research shed light on the strong alignment between the complex concept of MDM and the organization. The results of this study assist researchers in their endeavor to understand the organizational aspects of MDM, and to build theoretical models, frameworks, practices, and explanations.

Details

Journal of Enterprise Information Management, vol. 30 no. 3
Type: Research Article
ISSN: 1741-0398

Article
Publication date: 19 April 2011

Anders Haug and Jan Stentoft Arlbjørn

Abstract

Purpose

While few would disagree that high data quality is a precondition for the efficiency of a company, this remains an area to which many companies do not give adequate attention. Thus, this paper aims to identify the most important barriers preventing companies from achieving high data quality. By improving awareness of the barriers on which to concentrate, companies are put in a better position to achieve high-quality data.

Design/methodology/approach

First, a literature review of data quality and data quality barriers is carried out. Based on this literature review, the paper identifies a set of overall barriers to ensuring high data quality. The significance of these barriers is investigated by a questionnaire study, which includes responses from 90 Danish companies. Because of the fundamental difference between master data and transaction data, the questionnaire is limited to focusing only on master data.

Findings

The results of the survey indicate that a lack of delegation of responsibilities for maintaining master data is the single aspect which has the largest impact on master data quality. Also, the survey shows that the vast majority of the companies believe that poor master data quality does have significant negative effects.

Research limitations/implications

The contributions of this paper represent a step towards an improved understanding of how to increase the level of master data quality in companies. This knowledge may have a positive impact on the data quality in companies. However, since the study presented in this paper appears to be the first of its kind, the conclusions drawn need further investigation by other research studies in the future.

Practical implications

This paper identifies the main barriers for ensuring high master data quality and investigates which of these factors are the most important. By focusing on these barriers, companies will have better chances of increasing their data quality.

Originality/value

The study presented in this paper appears to be the first of its kind, and it represents an important step towards understanding better why companies find it difficult to achieve satisfactory data quality levels.

Details

Journal of Enterprise Information Management, vol. 24 no. 3
Type: Research Article
ISSN: 1741-0398

Article
Publication date: 20 November 2017

Jacqueline Edana Tyler

Abstract

Purpose

The purpose of this paper is to share the experience of the document discovery process, during the implementation of an asset management system for a rail company. This system will deliver comprehensive enterprise asset management information from a single source, with information provided to mobile devices, for use by field workers. This case study presents the challenges encountered in the search, retrieval and management of documentation for use on a daily basis for civil standard maintenance tasks.

Design/methodology/approach

Evidence gathered for this paper was a result of direct and participant observation over a period of 18 months from 2014 to 2016. As a member of the project team, certain privileges were accorded to the researcher who was placed in a unique position to act as the main research instrument, able to collect data on the systems used as well as the everyday practices on information capture and document production.

Findings

Document quality and standards can be overlooked or deemed not crucial; the value, significance and importance of documentation are lost when no one takes ownership; and the understanding and application of standards, quality management and governance can have a direct bearing on the effective management and control of documents and the records subsequently produced.

Research limitations/implications

Research is limited, as this is a single case study.

Practical implications

By highlighting the challenges faced and the resolutions used, this paper hopes to offer a level of practical guidance on the document discovery process for maintenance tasks within the civil assets discipline of a rail network.

Originality/value

This case study contributes to the understanding of quality management and the role it plays in document management and in turn the search and retrieval process. It provides evidence that documents must be systematically managed and controlled to limit risk both internally and externally.

Details

Records Management Journal, vol. 27 no. 3
Type: Research Article
ISSN: 0956-5698

Article
Publication date: 8 March 2013

Anders Haug, Jan Stentoft Arlbjørn, Frederik Zachariassen and Jakob Schlichter

Abstract

Purpose

The development of IT has enabled organizations to collect and store many times more data than they were able to just decades ago. This means that companies are now faced with managing huge amounts of data, which represents new challenges in ensuring high data quality. The purpose of this paper is to identify barriers to obtaining high master data quality.

Design/methodology/approach

This paper defines relevant master data quality barriers and investigates their mutual importance by organizing data quality barriers identified in the literature into a framework for the analysis of data quality. The importance of the different classes of data quality barriers is investigated through a large questionnaire study, including answers from 787 Danish manufacturing companies.

Findings

Based on a literature review, the paper identifies 12 master data quality barriers. The relevance and completeness of this classification is investigated by a large questionnaire study, which also clarifies the mutual importance of the defined barriers and the differences in importance in small, medium, and large companies.

Research limitations/implications

The defined classification of data quality barriers provides a point of departure for future research by pointing to relevant areas for investigation of data quality problems. The limitations of the study are that it focuses only on manufacturing companies and master data (i.e. not transaction data).

Practical implications

The classification of data quality barriers can give companies increased awareness of why they experience data quality problems. In addition, the paper suggests giving primary focus to organizational issues rather than perceiving poor data quality as an IT problem.

Originality/value

Compared to extant classifications of data quality barriers, the contribution of this paper represents a more detailed and complete picture of what the barriers to data quality are. Furthermore, the presented classification has been investigated by a large questionnaire study, and it is therefore founded on a more solid empirical basis than existing classifications.

Details

Industrial Management & Data Systems, vol. 113 no. 2
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 3 October 2019

Hannu Hannila, Joni Koskinen, Janne Harkonen and Harri Haapasalo

Abstract

Purpose

The purpose of this paper is to analyse current challenges and to articulate the preconditions for data-driven, fact-based product portfolio management (PPM) based on commercial and technical product structures, critical business processes, corporate business IT and company data assets. Here, data assets were classified from a PPM perspective in terms of (product/customer/supplier) master data, transaction data and Internet of Things data. The study also addresses the supporting role of corporate-level data governance.

Design/methodology/approach

The study combines a literature review and qualitative analysis of empirical data collected from eight international companies of varying size.

Findings

Companies’ current inability to analyse products effectively based on existing data is surprising. The present findings identify a number of preconditions for data-driven, fact-based PPM, including mutual understanding of company products (to establish a consistent commercial and technical product structure), product classification as strategic, supportive or non-strategic (to link commercial and technical product structures with product strategy) and a holistic, corporate-level data model for adjusting the company’s business IT (to support product portfolio visualisation).

Practical implications

The findings provide a logical and empirical basis for fact-based, product-level analysis of product profitability and analysis of the product portfolio over the product life cycle, supporting a data-driven approach to the optimisation of commercial and technical product structure, business IT systems and company product strategy. As a virtual representation of reality, the company data model facilitates product visualisation. The findings are of great practical value, as they demonstrate the significance of corporate-level data assets, data governance and business-critical data for managing a company’s products and portfolio.

Originality/value

The study contributes to the existing literature by specifying the preconditions for data-driven, fact-based PPM as a basis for product-level analysis and decision making, emphasising the role of company data assets and clarifying the links between business processes, information systems and data assets for PPM.

Details

Journal of Enterprise Information Management, vol. 33 no. 1
Type: Research Article
ISSN: 1741-0398

Article
Publication date: 2 November 2015

Lukas Prorokowski and Hubert Prorokowski

Abstract

Purpose

BCBS 239 sets out a challenging standard for risk data processing and reporting. Any bank striving to comply with the principles will be keen to introspect how risk data is organized and what execution capabilities are at their disposal. With this in mind, the current paper advises banks on the growing number of solutions, tools and techniques that can be used to support risk data management frameworks under BCBS 239.

Design/methodology/approach

This paper, based on a survey with 29 major financial institutions, including G-SIBs and D-SIBs from diversified geographical regions such as North America, Europe and APAC, aims to advise banks and other financial services firms on what is needed to become ready and compliant with BCBS 239. This paper discusses best practice solutions for master data management, data lineage and end user implementations.

Findings

The primary conclusion of this paper is that banks should not treat BCBS 239 as yet another compliance exercise. The BCBS 239 principles constitute a driving force to restore viability and improve risk governance. In light of the new standards, banks can benefit from making significant progress towards risk data management transformation. This report argues that banks need to invest in a solution that empowers those who use the data to manage risk data; operational complexities are then lifted, and no data operations team is needed for proprietary coding of the data. Only then will banks stay abreast of the competition while becoming fully compliant with the BCBS 239 principles.

Practical implications

As noted by Prorokowski (2014), “Increasingly zero accountability, imposed, leveraged omnipresent vast endeavors, yielding ongoing understanding […] of the impact of the global financial crisis on the ways data should be treated” sparked off international debates addressing the need for an effective solution to risk data management and reporting.

Originality/value

This paper discusses the forthcoming regulatory change that will have a significant impact on the banking industry. The Basel Committee on Banking Supervision published its Principles for effective risk data aggregation and risk reporting (BCBS 239) in January last year. The document contains 11 principles that Global Systemically Important Banks (G-SIBs) will need to comply with by January 2016. The BCBS 239 principles are regarded as the least known components of the new regulatory reforms. As it transpires, the principles require many banks to undertake a significant amount of technical work and investment in IT infrastructure. Furthermore, BCBS 239 urges financial services firms to review their definitions of the completeness of risk data.

Details

Journal of Investment Compliance, vol. 16 no. 4
Type: Research Article
ISSN: 1528-5812

Article
Publication date: 13 April 2012

Boris Otto

Abstract

Purpose

The paper seeks to investigate the question as to how the business benefits of product data management (PDM) can be assessed and realized. In particular, it aims at understanding the means‐end relationship between PDM and product data on the one hand and a company's business goals on the other hand.

Design/methodology/approach

The paper uses a case study research approach. The case of Festo is unique and allows for detailed examination of both the business benefits of PDM and of the inter‐dependencies of various business benefit enablers. Due to the limited amount of scientific knowledge with regard to the management of PDM business benefits, the study is exploratory in nature. The conceptual framework used to guide the study combines business engineering concepts and the business dependency network technique.

Findings

The findings are threefold. First, the paper explicates and details the understanding of the nature of PDM business benefits. Second, it provides insight into the complexity and interdependency of various “means” – such as data ownership, product data standards, for example – and the “ends” of PDM, namely the contribution to a company's business goals. Third, the paper forms the baseline for a comprehensive method supporting the management of PDM business benefits.

Research limitations/implications

Single‐case studies require further validation of findings. Thus, future research should aim at replicating the findings and at developing a comprehensive method for the management of PDM business benefits.

Practical implications

Companies may take up the results as a “blueprint” for their own PDM activities and may reflect their own business benefits against the case of Festo.

Originality/value

The paper is one of the first contributions focusing on the means‐end relationship between PDM and product data on the one hand and a company's business goals on the other.
