Search results

1 – 10 of over 259,000
Article
Publication date: 1 September 2005

Z.M. Ma

To provide a selective bibliography for researchers and practitioners interested in database modeling of engineering information with sources which can help them develop…

Abstract

Purpose

To provide a selective bibliography for researchers and practitioners interested in database modeling of engineering information with sources which can help them develop engineering information systems.

Design/methodology/approach

Identifies the requirements for engineering information modeling and then investigates how current database models satisfy these requirements at two levels: conceptual data models and logical database models.

Findings

Presents the relationships among the conceptual data models and the logical database models for engineering information modeling viewed from database conceptual design.

Originality/value

Currently few papers provide comprehensive discussions about how current engineering information modeling can be supported by database technologies. This paper fills this gap. The contribution of the paper is to identify the direction of database study viewed from engineering applications and provide a guidance of information modeling for engineering design, manufacturing, and production management.

Details

Industrial Management & Data Systems, vol. 105 no. 7
Type: Research Article
ISSN: 0263-5577

Keywords

Article
Publication date: 2 November 2012

Martin H. Ofner, Boris Otto and Hubert Österle

The purpose of this paper is to conceptualize data quality (DQ) in the context of business process management and to propose a DQ oriented approach for business process modeling…

Abstract

Purpose

The purpose of this paper is to conceptualize data quality (DQ) in the context of business process management and to propose a DQ oriented approach for business process modeling. The approach is based on key concepts and metrics from the data quality management domain and supports decision‐making in process re‐design projects on the basis of process models.

Design/methodology/approach

The paper applies a design oriented research approach, in the course of which a modeling method is developed as a design artifact. To do so, method engineering is used as a design technique. The artifact is theoretically founded and incorporates DQ considerations into process re‐design. Furthermore, the paper uses a case study to evaluate the suggested approach.

Findings

The paper shows that the DQ oriented process modeling approach facilitates and improves managerial decision‐making in the context of process re‐design. Data quality is considered as a success factor for business processes and is conceptualized using a rule‐based approach.
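The rule-based conceptualization of data quality mentioned above can be pictured with a minimal sketch (the record fields, rules, and scoring function here are invented for illustration, not taken from the paper's method):

```python
# Hypothetical sketch of a rule-based DQ conceptualization: a DQ rule is a
# predicate over a process data record; the DQ score of a record set is the
# fraction of records satisfying every rule. All field names are illustrative.

def dq_score(records, rules):
    """Fraction of records that satisfy every DQ rule (1.0 = perfect)."""
    if not records:
        return 1.0
    ok = sum(1 for r in records if all(rule(r) for rule in rules))
    return ok / len(records)

# Example rules for a hypothetical order record.
rules = [
    lambda r: r.get("customer_id") is not None,        # completeness
    lambda r: r.get("quantity", 0) > 0,                # validity
    lambda r: len(str(r.get("postcode", ""))) == 5,    # format conformance
]

records = [
    {"customer_id": 1, "quantity": 2, "postcode": "80331"},
    {"customer_id": None, "quantity": 1, "postcode": "80331"},
    {"customer_id": 3, "quantity": 0, "postcode": "803"},
]

print(dq_score(records, rules))  # 1 of 3 records passes all rules
```

A process model annotated with such scores lets a re-design project compare process variants by their expected data quality, which is the kind of decision support the abstract describes.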

Research limitations/implications

The paper presents design research and a case study. More research is needed to triangulate the findings and to allow generalizability of the results.

Practical implications

The paper supports decision‐makers in enterprises in taking a DQ perspective in business process re‐design initiatives.

Originality/value

The paper reports on integrating DQ considerations into business process management in general and into process modeling in particular, in order to provide more comprehensive decision‐making support in process re‐design projects. The paper represents one of the first contributions to literature regarding a contemporary phenomenon of high practical and scientific relevance.

Details

Business Process Management Journal, vol. 18 no. 6
Type: Research Article
ISSN: 1463-7154

Keywords

Article
Publication date: 12 January 2015

Ângelo Márcio Oliveira Sant'Anna

The purpose of this paper is to propose a decision-making framework to aid practitioners in modeling and optimizing experimental data to improve the quality of industrial…

Abstract

Purpose

The purpose of this paper is to propose a decision-making framework to aid practitioners in modeling and optimizing experimental data to improve the quality of industrial processes, reinforcing the idea that planning and conducting data modeling are as important as the formal analysis.

Design/methodology/approach

The paper presents an application of experimental data modeling carried out at a mining company, with support from partnership projects with the Catholic University. The literature seems to be more focussed on the data analysis than on providing a sequence of operational steps or decision support which would lead to the best regression model for the problem the researcher is confronted with. The authors use the statistical regression technique called generalized linear models.
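As a hedged illustration of the generalized linear models mentioned above (on synthetic data, not the mining company's), a Poisson GLM with log link can be fitted by iteratively reweighted least squares:

```python
import numpy as np

# Sketch of fitting a generalized linear model (Poisson response, log link)
# by iteratively reweighted least squares (IRLS). Data are synthetic; the
# true coefficients are chosen arbitrarily for the demonstration.

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.uniform(0, 2, 200)])
beta_true = np.array([0.5, 1.2])
y = rng.poisson(np.exp(X @ beta_true))

beta = np.array([np.log(y.mean()), 0.0])  # safe starting point
for _ in range(25):
    mu = np.exp(X @ beta)            # mean under the log link
    W = mu                           # Poisson: variance equals mean
    z = X @ beta + (y - mu) / mu     # working response
    XtW = X.T * W                    # apply weights to X^T
    # Weighted least squares step: solve (X'WX) beta = X'Wz
    beta = np.linalg.solve(XtW @ X, XtW @ z)

print(beta)  # close to beta_true
```

The point of the framework in the abstract is that choosing the response family and link (the "planning" step) matters as much as this fitting step.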

Findings

The authors analyze a relevant case study at a mining company, based on the best statistical regression models. Starting from this analysis, the results of the industrial case study illustrate the strong relationship between the improvement process and the presented framework in practice. Moreover, the case study consolidates a fundamental advantage of regression models: guided modeling provides more knowledge about products, processes and technologies, even in unsuccessful case studies.

Research limitations/implications

The study's advances in regression models for data modeling are applicable to several types of industrial processes and random phenomena. Unsuccessful data modeling can occur due to a lack of knowledge of the statistical technique.

Originality/value

An essential point is that the study is based on feedback from practitioners and industrial managers, which grounds the analyses and conclusions in practical points of view rather than requiring deep theoretical knowledge of the relationships among the process variables. A regression model has its own characteristics related to the response variable and factors, and misspecification of the regression model or its components can yield inappropriate inferences and erroneous experimental results.

Article
Publication date: 15 June 2010

Emad Samadiani and Yogendra Joshi

The purpose of this paper is to review the available reduced order modeling approaches in the literature for predicting the flow and especially temperature fields inside data…

Abstract

Purpose

The purpose of this paper is to review the available reduced order modeling approaches in the literature for predicting the flow and especially temperature fields inside data centers in terms of the involved design parameters.

Design/methodology/approach

This paper begins with a motivation for flow/thermal modeling needs for designing an energy‐efficient thermal management system in data centers. Recent studies on air velocity and temperature field simulations in data centers through computational fluid dynamics/heat transfer (CFD/HT) are reviewed. Meta‐modeling and reduced order modeling are tools to generate accurate and rapid surrogate models for a complex system. These tools, with a focus on low‐dimensional models of turbulent flows, are reviewed. Reduced order modeling techniques based on the identification of turbulent coherent structures, in particular the proper orthogonal decomposition (POD), are explained and reviewed in more detail. Then, the available approaches for rapid thermal modeling of data centers are reviewed. Finally, recent studies on generating POD‐based reduced order thermal models of data centers are reviewed, and representative results are presented and compared for a case study.
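The POD step at the core of the reviewed reduced order models can be sketched on synthetic snapshot data (the "temperature fields" below are invented one-dimensional profiles, not data center measurements):

```python
import numpy as np

# Sketch of proper orthogonal decomposition (POD) on a snapshot matrix.
# Synthetic snapshots: random combinations of two known spatial modes, so
# the decomposition should recover a two-mode low-dimensional structure.

rng = np.random.default_rng(1)
n_points, n_snaps = 400, 50
x = np.linspace(0, 1, n_points)
modes_true = np.stack([np.sin(np.pi * x), np.sin(2 * np.pi * x)])
amps = rng.normal(size=(n_snaps, 2)) * np.array([5.0, 1.0])
snapshots = amps @ modes_true  # shape (n_snaps, n_points)

# POD: subtract the mean field, then take the SVD of the centered snapshots;
# rows of Vt are the spatial POD modes, s**2 their captured energy.
mean_field = snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
energy = s**2 / np.sum(s**2)

print(energy[:3])  # almost all variance captured by the first two modes
```

Keeping only the leading modes gives the rapid surrogate model: a new field is approximated by the mean plus a few mode amplitudes, which is far cheaper to evaluate than a full CFD/HT solve.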

Findings

It is concluded that low‐dimensional models are needed in order to predict the multi‐parameter dependent thermal behavior of data centers accurately and rapidly for design and control purposes. POD‐based techniques have shown strong approximation capability for multi‐parameter thermal modeling of data centers. It is believed that wavelet‐based techniques, due to their ability to separate coherent from incoherent structures – something that POD cannot do – can be considered promising new tools for reduced order thermal modeling of complex electronic systems such as data centers.

Originality/value

The paper reviews different numerical methods and provides the reader with some insight for reduced order thermal modeling of complex convective systems such as data centers.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 20 no. 5
Type: Research Article
ISSN: 0961-5539

Keywords

Book part
Publication date: 6 September 2021

Rachel S. Rauvola, Cort W. Rudolph and Hannes Zacher

In this chapter, the authors consider the role of time for research in occupational stress and well-being. First, temporal issues in studying occupational health longitudinally…

Abstract

In this chapter, the authors consider the role of time for research in occupational stress and well-being. First, they discuss temporal issues in studying occupational health longitudinally, focusing in particular on the role of time lags and their implications for observed results (e.g., effect detectability), analyses (e.g., handling unequal durations between measurement occasions), and interpretation (e.g., result generalizability, theoretical revision). Then, they discuss time-based assumptions made when modeling lagged effects in occupational health research, providing a focused review of how past research has handled (or ignored) these assumptions, and of the relative benefits and drawbacks of these approaches. Finally, they provide recommendations for readers, an accessible tutorial (including example data and code), and a discussion of a new structural equation modeling technique, continuous time structural equation modeling, that can “handle” time in longitudinal studies of occupational health.
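One way to picture the lag-choice issue the chapter addresses is a minimal sketch, assuming a first-order continuous time model with a hypothetical drift value (the numbers are illustrative, not from the chapter):

```python
import math

# Sketch of why the measurement lag matters: in a first-order continuous
# time model with drift a < 0, the discrete autoregressive effect observed
# at lag dt is exp(a * dt). The "same" underlying process therefore yields
# different cross-lag coefficients depending on the measurement interval.

a = -0.5  # hypothetical continuous time drift (per month)

for dt in (1, 3, 6, 12):
    print(dt, math.exp(a * dt))  # observed effect shrinks as the lag grows
```

This is the sense in which studies using different lags can report apparently inconsistent effect sizes while being consistent with one continuous time model.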

Details

Examining and Exploring the Shifting Nature of Occupational Stress and Well-Being
Type: Book
ISBN: 978-1-80117-422-0

Keywords

Article
Publication date: 9 May 2016

Margarida Jerónimo Barbosa, Pieter Pauwels, Victor Ferreira and Luís Mateus

Building information modeling (BIM) is most often used for the construction of new buildings. By using BIM in such projects, collaboration among stakeholders in an architecture…

Abstract

Purpose

Building information modeling (BIM) is most often used for the construction of new buildings. By using BIM in such projects, collaboration among stakeholders in an architecture, engineering and construction project is improved. To even further improve collaboration, there is a move toward the production and usage of BIM standards in various countries. These are typically national documents, including guides, protocols, and mandatory regulations, that introduce guidelines about what information should be exchanged at what time between which partners and in what formats. If a nation or a construction team agrees on these guidelines, improved collaboration can come about on top of the collaboration benefits induced by the mere usage of BIM. This scenario might also be targeted for interventions in existing buildings. The paper aims to discuss these issues.

Design/methodology/approach

In this paper, the authors investigate the general content and usage of existing BIM standards for new constructions, describing specifications about BIM deliverable documents, modeling, and collaboration procedures. The authors suggest to what extent the content in the BIM standards can also be used for interventions in existing buildings. These suggestions rely heavily on literature study, supported by on-site use case experiences.

Findings

From this research, the authors can conclude that the existing standards give a solid basis for BIM collaboration in existing building interventions, but that they need to be extended in order to be of better use in any intervention project in an existing building. This extension should happen at the data modeling level, where other kinds of data formats need to be considered, coming from terrestrial laser scanning and automatic digital photogrammetry tools; at the data exchange level, where exchange requirements should include explicit statements about modeling tolerances and levels of (un)certainty; and at the process modeling level, where business process models should include information exchange processes from the very start of the building survey (BIM→facility management→BIM or regular audit).

Originality/value

BIM environments are not often used to document existing buildings or interventions in existing buildings. The authors propose to improve the situation by using BIM standards and/or guidelines, and the authors give an initial overview of components that should be included in such a standard and/or guideline.

Details

Structural Survey, vol. 34 no. 2
Type: Research Article
ISSN: 0263-080X

Keywords

Book part
Publication date: 1 November 2007

Irina Farquhar and Alan Sorkin

This study proposes targeted modernization of the Department of Defense's (DoD's) Joint Forces Ammunition Logistics information system by implementing the optimized innovative…

Abstract

This study proposes targeted modernization of the Department of Defense's (DoD's) Joint Forces Ammunition Logistics information system by implementing an optimized, innovative information technology open architecture design and integrating Radio Frequency Identification Device data technologies and real-time optimization and control mechanisms as the critical technology components of the solution. The innovative information technology, which pursues focused logistics, will be deployed in 36 months at an estimated cost of $568 million in constant dollars. We estimate that the Systems, Applications, Products (SAP)-based enterprise integration solution that the Army currently pursues will cost another $1.5 billion through the year 2014; however, it is unlikely to deliver the intended technical capabilities.

Details

The Value of Innovation: Impact on Health, Life Quality, Safety, and Regulatory Research
Type: Book
ISBN: 978-1-84950-551-2

Details

Handbook of Transport Geography and Spatial Systems
Type: Book
ISBN: 978-1-615-83253-8

Article
Publication date: 1 May 2006

Rajugan Rajagopalapillai, Elizabeth Chang, Tharam S. Dillon and Ling Feng

In data engineering, view formalisms are used to provide flexibility to users and user applications by allowing them to extract and elaborate data from the stored data sources…

Abstract

In data engineering, view formalisms are used to provide flexibility to users and user applications by allowing them to extract and elaborate data from the stored data sources. Meanwhile, since its introduction, the eXtensible Markup Language (XML) has been fast emerging as the dominant standard for storing, describing, and interchanging data among various web and heterogeneous data sources. In combination with XML Schema, XML provides rich facilities for defining and constraining user‐defined data semantics and properties, a feature that is unique to XML. In this context, it is interesting to investigate traditional database features, such as view models and view design techniques, for XML. However, traditional view formalisms are strongly coupled to the data language and its syntax, so it proves to be a difficult task to support views in the case of semi‐structured data models. Therefore, in this paper we propose a Layered View Model (LVM) for XML with conceptual and schemata extensions. Our work here is threefold: first, we propose an approach to separate the implementation and conceptual aspects of the views that provides a clear separation of concerns, thus allowing analysis and design of views to be separated from their implementation. Second, we define representations to express and construct these views at the conceptual level. Third, we define a view transformation methodology for XML views in the LVM, which carries out automated transformation to a view schema and a view query expression in an appropriate query language. Also, to validate and apply the LVM concepts, methods and transformations developed, we propose a view-driven application development framework with the flexibility to develop web and database applications for XML, at varying levels of abstraction.
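The general notion of a view over XML data can be loosely illustrated in miniature (element names and the "view query" below are invented for the example; the LVM itself defines views at the conceptual and schema levels, not as ad hoc code):

```python
import xml.etree.ElementTree as ET

# Toy illustration of an XML view: a derived document that extracts and
# restructures elements from a stored source. The schema here is invented.

source = ET.fromstring("""
<orders>
  <order id="1"><customer>Ana</customer><total>120</total></order>
  <order id="2"><customer>Ben</customer><total>80</total></order>
</orders>
""")

def big_spenders_view(root, threshold):
    """A 'view query': customers whose order total exceeds the threshold."""
    view = ET.Element("bigSpenders")
    for order in root.findall("order"):
        if float(order.findtext("total")) > threshold:
            ET.SubElement(view, "customer").text = order.findtext("customer")
    return view

view = big_spenders_view(source, 100)
print(ET.tostring(view, encoding="unicode"))
```

A conceptual-level view model, as the paper proposes, would specify this extraction declaratively and generate both the view schema and the query expression, rather than hand-coding them as above.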

Details

International Journal of Web Information Systems, vol. 2 no. 2
Type: Research Article
ISSN: 1744-0084

Keywords

Article
Publication date: 3 April 2017

Pawel D. Domanski and Mateusz Gintrowski

This paper aims to present the results of the comparison between different approaches to the prediction of electricity prices. It is well-known that the properties of the data…

Abstract

Purpose

This paper aims to present the results of the comparison between different approaches to the prediction of electricity prices. It is well-known that the properties of the data generation process may favor some modeling methods over others. Data originating in social or market processes are characterized by an unexpectedly wide realization space, resulting in long tails in the probability density function. Such data may not be easy to predict with standard time series approaches based on normal distribution assumptions. Electricity prices on the deregulated market fall into this category.

Design/methodology/approach

The paper presents alternative approaches, i.e. memory-based prediction and fractal approach compared with established nonlinear method of neural networks. The appropriate interpretation of results is supported with the statistical data analysis and data conditioning. These algorithms have been applied to the problem of the energy price prediction on the deregulated electricity market with data from Polish and Austrian energy stock exchanges.

Findings

The first outcome of the analysis is that there are several situations in time series prediction in which the standard modeling approach, based on the assumption that each change is independent of the last and follows a random Gaussian bell pattern, may not hold. In this paper, such a case was considered: price data from energy markets. Electricity price data are biased by human nature. It is shown that the Cauchy probability distribution was more relevant to the data's properties. Results have shown that alternative approaches may be used, and for both data sets the memory-based approach resulted in the best prediction performance.
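The heavy-tail contrast between Gaussian and Cauchy assumptions can be checked empirically with simulated draws (a sketch, not the market data from the paper):

```python
import numpy as np

# Sketch of the tail argument: extreme moves are vastly more probable under
# a Cauchy law than under a Gaussian with comparable scale, so a Gaussian
# model badly underestimates the chance of price spikes.

rng = np.random.default_rng(2)
n = 100_000
gauss = rng.normal(0, 1, n)
cauchy = rng.standard_cauchy(n)

p_gauss = np.mean(np.abs(gauss) > 5)    # essentially zero
p_cauchy = np.mean(np.abs(cauchy) > 5)  # roughly an eighth of all draws

print(p_gauss, p_cauchy)
```

This is why a model calibrated under Gaussian assumptions can look adequate on typical days yet fail precisely on the spike days that matter most for traders.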

Research limitations/implications

“Personalization” of the model is a crucial aspect of the whole methodology. All available knowledge about the forecasted phenomenon should be used and incorporated into the model. In the case of memory-based modeling, it is the specific design of the history searching routine that uses the understanding of the process features. Importance should shift toward methodology structure design and algorithm customization, and then to parameter estimation. Such a modeling approach may be more descriptive for the user, enabling understanding of the process and further iterative improvement in a continuous striving for perfection.

Practical implications

Memory-based modeling can be practically applied. These models have large potential that is worth exploiting. One disadvantage of this modeling approach is the large calculation effort connected with the need to constantly evaluate large data sets. It was shown that a graphics processing unit (GPU) approach, through parallel calculation on graphics cards, can improve it dramatically.
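Memory-based prediction, as commonly understood, can be sketched as nearest-neighbor search over historical windows (an illustration of the general idea on a toy periodic series, not the authors' specific search routine or its GPU implementation):

```python
import numpy as np

# Sketch of memory-based prediction: find the k historical windows most
# similar to the most recent one and average the values that followed them.
# The constant re-scanning of history is the cost that GPUs parallelize well.

def memory_predict(series, window, k):
    series = np.asarray(series, dtype=float)
    # all complete historical windows that still have a successor value
    hist = np.lib.stride_tricks.sliding_window_view(series[:-1], window)
    query = series[-window:]
    dist = np.linalg.norm(hist - query, axis=1)
    nearest = np.argsort(dist)[:k]        # k most similar window starts
    successors = series[nearest + window]  # value right after each window
    return successors.mean()

# A noiseless periodic series: the next value is recovered from memory.
t = np.arange(200)
series = np.sin(2 * np.pi * t / 20)
pred = memory_predict(series[:-1], window=10, k=3)
print(pred, series[-1])  # prediction close to the true next value
```

On real price data the distance function and search routine would encode the process knowledge the paper calls "personalization"; the brute-force scan above is exactly the part that benefits from parallel GPU evaluation.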

Social implications

The modeling of electricity prices has a big impact on the daily operation of electricity traders and distributors. On one side, appropriate modeling can improve performance, mitigating risks associated with the process. Thus, the end users should receive a higher quality of services, ultimately with lower prices and a minimized risk of energy loss incidents.

Originality/value

The use of alternative approaches, such as memory-based reasoning or fractals, is very rare in the field of electricity price forecasting. Thus, it gives a new impetus to further research, enabling the development of better solutions incorporating all available process knowledge and customized hybrid algorithms.

Details

International Journal of Energy Sector Management, vol. 11 no. 1
Type: Research Article
ISSN: 1750-6220

Keywords
