Search results

1 – 10 of over 2000
Article
Publication date: 1 February 2005

Mike Tao Zhang and Ken Goldberg

Abstract

Purpose

The semiconductor manufacturing industry requires highly accurate robot operation with short install/setup downtime.

Design/methodology/approach

We develop a fast, low-cost and easy-to-operate calibration system for wafer-handling robots. The system combines a fixture with a simple compensation algorithm. Given robot repeatability, end-effector uncertainties and the tolerance requirements of wafer placement points, we derive fixture design and placement specifications from a statistical tolerance model.
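As a rough illustration of the statistical tolerance reasoning the abstract describes (not the authors' actual model), the sketch below combines independent error sources by root-sum-square and checks the budget against a wafer placement tolerance; all numbers are hypothetical placeholders.

import math

def rss_tolerance(*sigmas):
    # Root-sum-square combination of independent 1-sigma error sources.
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical error budget (mm, 1-sigma): robot repeatability, residual
# end-effector uncertainty after fixture-based compensation, and fixture
# placement tolerance. These values are illustrative, not from the paper.
combined = rss_tolerance(0.05, 0.10, 0.02)
wafer_tolerance = 0.5  # hypothetical 3-sigma placement requirement (mm)

print(f"combined 1-sigma error: {combined:.3f} mm")
print("requirement met:", 3 * combined <= wafer_tolerance)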

Findings

By employing fixture-based calibration, we relax the tolerance requirement of the end effector by a factor of 20.

Originality/value

Semiconductor manufacturing requires fast and easy-to-operate calibration systems for wafer-handling robots. In this paper, we describe a new methodology that solves this problem using fixtures. We develop fixture design criteria and a simple compensation algorithm to satisfy calibration requirements, and we verify our approach with a physical example.

Details

Industrial Robot: An International Journal, vol. 32 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 21 January 2022

Maximilien de Zordo-Banliat, Xavier Merle, Gregory Dergham and Paola Cinnella

Abstract

Purpose

The Reynolds-averaged Navier–Stokes (RANS) equations represent the computational workhorse for engineering design, despite their numerous flaws. Improving and quantifying the uncertainties associated with RANS models is particularly critical in view of the analysis and optimization of complex turbomachinery flows.

Design/methodology/approach

First, an efficient strategy is introduced for calibrating turbulence model coefficients from high-fidelity data. The results are highly sensitive to the flow configuration (called a calibration scenario) used to inform the coefficients. Second, the bias introduced by the choice of a specific turbulence model is reduced by constructing a mixture model by means of Bayesian model-scenario averaging (BMSA). The BMSA model makes predictions of flows not included in the calibration scenarios as a probability-weighted average of a set of competing turbulence models, each supplemented with multiple sets of closure coefficients inferred from alternative calibration scenarios.
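The mixture structure of BMSA can be sketched as a double probability-weighted average over models and calibration scenarios. The snippet below is a minimal illustration of that structure with toy numbers; the actual method propagates full posterior distributions of the closure coefficients rather than point predictions.

import numpy as np

def bmsa_prediction(predictions, model_probs, scenario_probs):
    # predictions[m][s]: prediction of turbulence model m calibrated on
    # scenario s (array over grid points); model_probs and scenario_probs
    # are the corresponding probability weights.
    mean = np.zeros_like(predictions[0][0], dtype=float)
    for m, p_m in enumerate(model_probs):
        for s, p_s in enumerate(scenario_probs):
            mean += p_m * p_s * predictions[m][s]
    # Mixture variance: spread of the component predictions about the mean.
    var = np.zeros_like(mean)
    for m, p_m in enumerate(model_probs):
        for s, p_s in enumerate(scenario_probs):
            var += p_m * p_s * (predictions[m][s] - mean) ** 2
    return mean, var

# Toy example: 2 models x 2 calibration scenarios, 3 grid points.
preds = [[np.array([1.0, 2.0, 3.0]), np.array([1.2, 2.1, 2.9])],
         [np.array([0.8, 1.9, 3.2]), np.array([1.1, 2.2, 3.1])]]
mean, var = bmsa_prediction(preds, model_probs=[0.6, 0.4],
                            scenario_probs=[0.5, 0.5])
print(mean, np.sqrt(var))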

Findings

Different choices for the scenario probabilities are assessed for the prediction of the NACA65 V103 cascade at off-design conditions. In all cases, BMSA improves the solution accuracy with respect to the baseline turbulence models, and the estimated uncertainty intervals encompass the reference data reasonably well. The BMSA results were found to be only weakly sensitive to the user-defined scenario-weighting criterion, both in terms of the average prediction and the estimated confidence intervals.

Originality/value

A delicate step in the BMSA is the selection of suitable scenario-weighting criteria, i.e. suitable prior probability mass functions (PMFs) for the calibration scenarios. The role of such PMFs is to assign higher probability to calibration scenarios more likely to provide an accurate estimate of model coefficients for the new flow. In this paper, three mixture models are constructed, based on alternative choices of the scenario probabilities. The authors then compare the capabilities of three different criteria.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 32 no. 4
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 30 July 2021

Bing Zhang, Raiyan Seede, Austin Whitt, David Shoukr, Xueqin Huang, Ibrahim Karaman, Raymundo Arroyave and Alaa Elwany

Abstract

Purpose

There is recent emphasis on designing new materials and alloys specifically for metal additive manufacturing (AM) processes, in contrast to AM of existing alloys that were developed for other traditional manufacturing methods involving considerably different physics. Process optimization to determine processing recipes for newly developed materials is expensive and time-consuming. The purpose of the current work is to use a systematic printability assessment framework developed by the co-authors to determine windows of processing parameters to print defect-free parts from a binary nickel-niobium alloy (NiNb5) using laser powder bed fusion (LPBF) metal AM.

Design/methodology/approach

The printability assessment framework integrates analytical thermal modeling, uncertainty quantification and experimental characterization to determine processing windows for NiNb5 in an accelerated fashion. Test coupons and mechanical test samples were fabricated on a ProX 200 commercial LPBF system. A series of density, microstructure and mechanical property characterization was conducted to validate the proposed framework.
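One classical analytical thermal model that such a framework could build on is the Rosenthal moving point-source solution, sketched below to estimate melt pool depth and flag lack-of-fusion risk. This is an assumption for illustration: the material properties and process parameters are generic placeholders, not the calibrated NiNb5 values from the paper, and the paper's own thermal model may differ.

import numpy as np

def rosenthal_temperature(x, y, z, P, v, T0, k, alpha, eta=0.4):
    # Steady-state moving point-source (Rosenthal) solution; x is measured
    # along the scan direction relative to the source, eta is absorptivity.
    R = np.sqrt(x**2 + y**2 + z**2) + 1e-12
    return T0 + eta * P / (2 * np.pi * k * R) * np.exp(-v * (R + x) / (2 * alpha))

# Hypothetical properties loosely representative of a Ni-based alloy.
k, alpha = 25.0, 5e-6        # conductivity (W/m K), diffusivity (m^2/s)
Tm, T0 = 1600.0, 298.0       # melting and ambient temperature (K)
P, v = 200.0, 0.8            # laser power (W), scan speed (m/s)

# Melt pool depth: deepest point below the source that exceeds melting.
depths = np.linspace(1e-6, 300e-6, 600)
molten = rosenthal_temperature(0.0, 0.0, depths, P, v, T0, k, alpha) >= Tm
pool_depth = depths[molten].max() if molten.any() else 0.0

layer_thickness = 30e-6      # hypothetical powder layer thickness (m)
print(f"melt pool depth ~ {pool_depth * 1e6:.0f} um")
print("lack-of-fusion risk:", pool_depth < layer_thickness)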

Findings

Near fully dense parts with more than 99% density were successfully printed using the proposed framework. Furthermore, the mechanical properties of the as-printed parts showed low variability, good tensile strength of up to 662 MPa, and tensile ductility 51% higher than previously reported in the literature.

Originality/value

Although many literature studies investigate process optimization for metal AM, there is a lack of a systematic printability assessment framework to determine manufacturing process parameters for newly designed AM materials in an accelerated fashion. Moreover, the majority of existing process optimization approaches involve either time- and cost-intensive experimental campaigns or require the use of proprietary computational materials codes. Through the use of a readily accessible analytical thermal model coupled with statistical calibration and uncertainty quantification techniques, the proposed framework achieves both efficiency and accessibility to the user. Furthermore, this study demonstrates that following this framework results in printed parts with low degrees of variability in their mechanical properties.

Article
Publication date: 26 July 2013

Fasil Ejigu Eregno, Chong‐Yu Xu and Nils‐Otto Kitterød

Abstract

Purpose

Recent advances in hydrological impact studies indicate that assessing the response of specific catchments to climate change scenarios with a single-model approach is questionable. This study investigates the impact of climate change on three river basins, in China, Ethiopia and Norway, using the WASMOD and HBV hydrological models.

Design/methodology/approach

First, hydrological models' parameters were determined using current hydro‐climatic data inputs. Second, the historical time series of climatic data was adjusted according to the climate change scenarios. Third, the hydrological characteristics of the catchments under the adjusted climatic conditions were simulated using the calibrated hydrological models. Finally, comparisons of the model simulations of the current and possible future hydrological characteristics were performed. Responses were evaluated in terms of runoff, actual evapotranspiration and soil moisture change for incremental precipitation and temperature change scenarios.
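The delta-change workflow described above can be sketched as: perturb the historical forcing by the scenario increments and re-run the calibrated model. The toy bucket model below is only a stand-in for WASMOD or HBV, driven by synthetic data; it illustrates the workflow, not the study's models.

import numpy as np

def simple_bucket_model(precip, temp, capacity=100.0, rc=0.2):
    # Toy lumped water-balance model: soil moisture accounting with
    # saturation-excess runoff and linear-reservoir baseflow.
    storage, runoff = 0.0, []
    for p, t in zip(precip, temp):
        pet = max(0.0, 0.3 * t)                # crude temperature-indexed PET
        storage = max(0.0, storage + p - pet)
        quick = max(0.0, storage - capacity)   # saturation-excess runoff
        storage -= quick
        slow = rc * storage                    # linear-reservoir baseflow
        storage -= slow
        runoff.append(quick + slow)
    return np.array(runoff)

rng = np.random.default_rng(0)
precip_hist = rng.gamma(2.0, 2.0, size=365)    # synthetic daily series (mm)
temp_hist = 10 + 8 * np.sin(np.linspace(0, 2 * np.pi, 365))  # deg C

# Incremental scenario: +2 C and +10% precipitation (delta-change method).
q_base = simple_bucket_model(precip_hist, temp_hist)
q_scen = simple_bucket_model(precip_hist * 1.10, temp_hist + 2.0)
print(f"runoff change under scenario: {100 * (q_scen.sum() / q_base.sum() - 1):+.1f}%")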

Findings

From the results obtained, it can be inferred that two equally well calibrated models gave different hydrological responses to hypothetical climatic scenarios. The authors' findings support the concern that climate change analysis using lumped hydrological models may lead to unreliable conclusions.

Practical implications

Extrapolating the driving forces (temperature and precipitation) beyond the range used for parameter calibration yields unreliable responses. Reducing this model ambiguity is beyond the scope of this study, but reducing the uncertainty is a challenge for further research.

Originality/value

The research used primary time-series data and two existing hydrological models to quantify the differences in magnitude one can expect when different hydrological models are used to simulate the hydrological response to climate change in different climate zones.

Details

International Journal of Climate Change Strategies and Management, vol. 5 no. 3
Type: Research Article
ISSN: 1756-8692

Article
Publication date: 1 June 2012

Teodor Sommestad, Hannes Holm and Mathias Ekstedt

Abstract

Purpose

The purpose of this paper is to identify the importance of the factors that influence the success rate of remote arbitrary code execution attacks, i.e. attacks which use software vulnerabilities to execute the attacker's own code on targeted machines. Both attacks against servers and attacks against clients are studied.

Design/methodology/approach

The success rates of attacks are assessed for 24 scenarios: 16 scenarios for server‐side attacks and eight for client‐side attacks. The assessment is made through domain experts and is synthesized using Cooke's classical method, an established method for weighting experts' judgments. The variables included in the study were selected based on the literature, a pilot study, and interviews with domain experts.
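A simplified sketch of Cooke's classical method: each expert's calibration score comes from a likelihood-ratio test of how often realizations of seed questions fall in the expert's interquantile bins, and the weight is calibration times information, cut off at a significance level. The counts and information scores below are invented, and the full method also involves the intrinsic range and optimization of the cutoff.

import numpy as np
from scipy.stats import chi2

P_THEORY = np.array([0.05, 0.45, 0.45, 0.05])  # mass between 5/50/95% quantiles

def calibration_score(hits, n_seed):
    # Likelihood-ratio test of the expert's empirical interquantile
    # frequencies against the theoretical ones (0 * log 0 taken as 0).
    s = hits / n_seed
    mask = s > 0
    stat = 2 * n_seed * np.sum(s[mask] * np.log(s[mask] / P_THEORY[mask]))
    return chi2.sf(stat, df=len(P_THEORY) - 1)

def cooke_weights(hit_table, info_scores, n_seed, alpha=0.05):
    # Performance-based weight = calibration x information, zeroed below
    # the significance cutoff alpha, then normalized across experts.
    cal = np.array([calibration_score(h, n_seed) for h in hit_table])
    w = cal * np.array(info_scores) * (cal >= alpha)
    return w / w.sum() if w.sum() > 0 else np.full(len(w), 1.0 / len(w))

# Toy data: three experts, 20 seed questions; counts of realizations that
# fell in each interquantile bin, plus invented information scores.
hits = np.array([[1, 9, 9, 1], [5, 5, 5, 5], [0, 12, 8, 0]])
print(cooke_weights(hits, info_scores=[1.0, 1.2, 0.9], n_seed=20))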

Findings

Depending on the scenario in question, the expected success rate varies between 15 and 67 percent for server‐side attacks and between 43 and 67 percent for client‐side attacks. Based on these scenarios, the influence of different protective measures is identified.

Practical implications

The results of this study offer guidance to decision makers on how to best secure their assets against remote code execution attacks. These results also indicate the overall risk posed by this type of attack.

Originality/value

Attacks that use software vulnerabilities to execute code on targeted machines are common and pose a serious risk to most enterprises. However, there are no quantitative data on how difficult such attacks are to execute or on how effective security measures are against them. The paper provides such data using a structured technique to combine expert judgments.

Article
Publication date: 7 October 2013

Hannes Holm and Mathias Ekstedt

Abstract

Purpose

The purpose of this paper is to estimate the effectiveness of web application firewalls (WAFs) at preventing injection attacks by professional penetration testers given presence or absence of four conditions: whether there is an experienced operator monitoring the WAF; whether an automated black box tool has been used when tuning the WAF; whether the individual tuning the WAF is an experienced professional; and whether significant effort has been spent tuning the WAF.

Design/methodology/approach

Estimates on the effectiveness of WAFs are made for 16 operational scenarios utilizing judgments by 49 domain experts participating in a web survey. The judgments of these experts are pooled using Cooke's classical method.
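Once expert weights are available, a pooled estimate can be read from a weighted combination of the experts' assessed distributions. The sketch below builds piecewise-linear CDFs from hypothetical 5/50/95 percent assessments, forms a linear opinion pool and inverts it at the median; it illustrates the idea only and is not the exact pooling used in the study.

import numpy as np

def pooled_quantile(expert_quantiles, weights, q=0.5):
    # Build a piecewise-linear CDF for each expert from their 5/50/95%
    # assessments, average the CDFs with the given weights (linear opinion
    # pool) and invert the pooled CDF at probability q.
    grid = np.linspace(0.0, 100.0, 1001)       # prevention rate in percent
    levels = np.array([0.05, 0.50, 0.95])
    pooled_cdf = np.zeros_like(grid)
    for w, quants in zip(weights, expert_quantiles):
        pooled_cdf += w * np.interp(grid, quants, levels, left=0.0, right=1.0)
    return float(np.interp(q, pooled_cdf, grid))

# Toy data: three experts' 5/50/95% assessments of the WAF prevention
# rate (%), combined with hypothetical performance-based weights.
experts = [[60, 80, 95], [40, 70, 90], [55, 85, 98]]
print(pooled_quantile(experts, weights=[0.5, 0.2, 0.3]))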

Findings

The results show that the median prevention rate of a WAF is 80 percent if all measures have been employed. If no measure is employed then its median prevention rate is 25 percent. Also, there are no strong dependencies between any of the studied measures.

Research limitations/implications

The results are only valid for the attacker profile of a professional penetration tester who spends one week preparing an attack against a web application (WA) protected by a WAF.

Practical implications

The competence of the individual(s) tuning a WAF, employment of an automated black box tool for tuning and the manual effort spent on tuning are of great importance for the effectiveness of a WAF. The presence of an operator monitoring it has minor positive influence on its effectiveness.

Originality/value

WA vulnerabilities are widely considered a serious concern. To manage them in deployed software, many enterprises employ WAFs. However, the effectiveness of this type of countermeasure under different operational scenarios is largely unknown.

Details

Information Management & Computer Security, vol. 21 no. 4
Type: Research Article
ISSN: 0968-5227

Article
Publication date: 1 May 2002

Alireza Lari

Abstract

Information systems currently used in quality management and the ISO 9000 certification process are typically limited to administration and documentation. Existing software does not satisfy the technical information needs of ISO 9000. There is a need for tools that can help management decide on technical aspects such as proper corrective and preventive actions or design verification and validation activities. This paper analyzes the information requirements of ISO 9000 standards and identifies the areas where a decision support system can be used. Further, it proposes a conceptual framework for company-wide information management and explains a modular approach to system development by introducing and empirically testing a prototype corrective and preventive actions module. The proposed system will provide the conceptual structure for a quality assurance information system within organizations.

Details

Business Process Management Journal, vol. 8 no. 2
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 17 October 2008

Marcus Selart, Svein Tvedt Johansen, Tore Holmesland and Kjell Grønhaug

Abstract

Purpose

The purpose of this paper is to clarify how IT managers' decision styles affect their evaluation of information technology.

Design/methodology/approach

Four different decision styles were assessed in a leadership test directed towards IT managers. Each style included two dimensions: confidence judgment ability and decision heuristic usage. Participants belonging to each style were interviewed and their answers analysed with regard to their reasoning about central areas of IT management.

Findings

Results suggest that a decision style combining intuitive and analytical capabilities leads to better evaluations of information technology.

Originality/value

The results of the present study are valuable for understanding how decision styles affect IT management in everyday practice.

Details

Management Decision, vol. 46 no. 9
Type: Research Article
ISSN: 0025-1747

Article
Publication date: 1 June 2015

Abdykappar Ashimov and Yuriy V. Borovskiy

Abstract

Purpose

The purpose of this paper is to demonstrate the effectiveness of several new methods, proposed within parametric control theory, for testing macroeconomic models for practical applicability.

Design/methodology/approach

The paper applies systems-analysis approaches to building and calibrating the mathematical models, together with provisions of parametric control theory, both for numerically testing the calibrated models for practical applicability and for solving parametric control problems.

Findings

First, a global computable general equilibrium (CGE) model is built and calibrated. Second, in testing this model for practical applicability, the effectiveness of two newly developed numerical algorithms is demonstrated; these algorithms estimate stability indicators and the stability (in the sense of the theory of smooth-mapping stability) of mappings defined by the model. Third, on the basis of the tested CGE model, solution results are presented for a number of parametric control problems aimed at promoting economic growth and reducing regional economic disparities.
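A crude stand-in for such a stability indicator is a finite-difference sensitivity estimate: perturb the calibrated parameters in random directions and record the worst ratio of relative output change to relative input change. The toy mapping below is hypothetical and far simpler than a CGE model; the paper's actual algorithms follow the theory of smooth-mapping stability.

import numpy as np

def stability_indicator(model, theta, delta=1e-3, n_trials=64, seed=0):
    # Worst observed ratio of relative output change to relative parameter
    # change over random perturbation directions of relative size delta.
    rng = np.random.default_rng(seed)
    y0 = model(theta)
    worst = 0.0
    for _ in range(n_trials):
        d = rng.standard_normal(theta.shape)
        d *= delta * np.linalg.norm(theta) / np.linalg.norm(d)
        rel_out = np.linalg.norm(model(theta + d) - y0) / np.linalg.norm(y0)
        rel_in = np.linalg.norm(d) / np.linalg.norm(theta)
        worst = max(worst, rel_out / rel_in)
    return worst

# Toy stand-in for a calibrated mapping from parameters to model outputs;
# a real CGE model would be far higher-dimensional.
A = np.array([[1.0, 0.3], [0.2, 0.9]])
model = lambda th: A @ th + 0.1 * th ** 2
print(stability_indicator(model, theta=np.array([1.0, 2.0])))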

Originality/value

Using the developed CGE model as an example, the paper demonstrates a parametric-control-theory approach to testing macroeconomic models for practical applicability.

Article
Publication date: 1 April 1991

Michael Scott

Abstract

The use of the Near Infra-Red (NIR) part of the electromagnetic spectrum for laboratory measurement has been well established for many years. It was the food industry and its related agricultural base that pioneered the technique, and names like Dicky-John, Technicon and Neotech have become commonly used terms.

Details

Sensor Review, vol. 11 no. 4
Type: Research Article
ISSN: 0260-2288
