Search results

1–10 of over 19,000
Article
Publication date: 11 January 2022

Rajkumar Bhimgonda Patil, Suyog Subhash Patil, Gajanand Gupta and Anand K. Bewoor

Abstract

Purpose

The purpose of this paper is to carry out a reliability analysis of a mechanical system, considering degraded states, to gain a proper understanding of system behavior and its propagation towards complete failure.

Design/methodology/approach

A reliability analysis of computerized numerical control machine tools (CNCMTs) is carried out using a multi-state system (MSS) approach that considers various degraded states rather than a binary (working/failed) approach. The failures of the CNCMT are classified into five states: one fully operational state, three degraded states and one failed state.
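
As a rough illustration of the multi-state idea (not the authors' model), the following sketch evaluates a five-state continuous-time Markov chain with one fully operational state, three degraded states and one failed state; the transition rates are hypothetical.

```python
# A minimal multi-state reliability sketch, not the authors' model.
# The five states follow the paper's classification (one fully operational,
# three degraded, one failed); the transition rates below are hypothetical.
import numpy as np
from scipy.linalg import expm

# Generator matrix Q for a 5-state continuous-time Markov chain.
# Q[i, j] is the rate of moving from state i to state j; the diagonal
# is set so that every row sums to zero.
rates = {  # hypothetical rates per 1000 operating hours
    (0, 1): 0.8, (1, 2): 0.5, (2, 3): 0.3, (3, 4): 0.2,  # progressive degradation
    (0, 4): 0.05,                                         # sudden complete failure
}
Q = np.zeros((5, 5))
for (i, j), r in rates.items():
    Q[i, j] = r
np.fill_diagonal(Q, -Q.sum(axis=1))

p0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])    # start fully operational
for t in (1.0, 5.0, 10.0):                  # thousands of hours
    p_t = p0 @ expm(Q * t)                  # state probabilities at time t
    availability = p_t[:4].sum()            # any state except complete failure
    print(f"t={t}: state probs={np.round(p_t, 3)}, availability={availability:.3f}")
```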

Findings

The analysis of failure data collected from the field and of tests conducted in the laboratory provided detailed insight into the quality and failure behavior of the materials used in the design, and into the capability of the manufacturing system. The present work identified that Class II (major failure) is critical from a maintainability perspective, whereas Class III (moderate failure) and Class IV (minor failure) are critical from a reliability perspective.

Research limitations/implications

This research applies to reliability data analysis of systems that consider various degraded states.

Practical implications

The MSS reliability analysis approach helps to identify the degraded states of the system that affect performance and productivity, and to improve system reliability, availability and performance.

Social implications

Industrial system designers recognize that reliability and maintainability are critical design attributes. Reliability studies using a binary-state approach are insufficient for systems with degraded failure states; such analysis can give incorrect results and increase cost. The proposed MSS approach is more suitable than the binary-state approach for complex systems such as the CNCMT.

Originality/value

This paper presents a generalized framework for MSS failure and repair data analysis, which has been developed and applied to a CNCMT.

Details

International Journal of Quality & Reliability Management, vol. 39 no. 7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 February 2001

LEO M. TILMAN and PAVEL BRUSILOVSKIY

Abstract

Value-at-Risk (VaR) has become a mainstream risk management technique employed by a large proportion of financial institutions. A substantial amount of research deals with the ex post evaluation of VaR forecasts, most commonly referred to as VaR backtesting. A new generation of "self-learning" VaR models (Conditional Autoregressive Value-at-Risk, or CAViaR) combines backtesting results with ex ante VaR estimates in an ARIMA framework in order to forecast P/L distributions more accurately. In this commentary, the authors present a systematic overview of several classes of applied statistical techniques that can make VaR backtesting more comprehensive and provide valuable insights into the analytical properties of VaR models in various market environments. In addition, they discuss the challenges associated with extending traditional backtesting approaches to VaR horizons longer than one day and propose solutions to this important problem.
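
For readers unfamiliar with the basic mechanics, the sketch below runs a standard one-day exception-count backtest (Kupiec's proportion-of-failures test) on simulated P/L and VaR series; it is far simpler than the CAViaR-style and long-horizon techniques the authors discuss, and the data are made up.

```python
# A minimal one-day VaR backtesting sketch via Kupiec's proportion-of-failures
# (POF) test. The P/L series and VaR forecasts below are simulated.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
T, p = 500, 0.01                      # trading days, nominal exception rate (99% VaR)
pnl = rng.normal(0.0, 1.0, T)         # simulated daily P/L
var_99 = np.full(T, -2.326)           # ex ante 99% VaR forecasts (constant for illustration)

exceptions = pnl < var_99             # days where the loss exceeded the VaR forecast
x = int(exceptions.sum())

# Kupiec likelihood-ratio statistic: observed exception frequency vs nominal p,
# compared against a chi-squared distribution with one degree of freedom.
pi_hat = x / T
log_lik_null = (T - x) * np.log(1 - p) + x * np.log(p)
log_lik_alt = ((T - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
               if 0 < x < T else log_lik_null)
lr_pof = -2 * (log_lik_null - log_lik_alt)
p_value = 1 - chi2.cdf(lr_pof, df=1)
print(f"exceptions: {x}/{T}, LR_POF={lr_pof:.2f}, p-value={p_value:.3f}")
```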

Details

The Journal of Risk Finance, vol. 2 no. 3
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 6 February 2019

Soni Bisht and S.B. Singh

Abstract

Purpose

The purpose of this paper is to evaluate various reliability measures such as reliability, expected lifetime (mean time to failure) and signature reliability, and to compare networks based on different flows.

Design/methodology/approach

The reliability characteristics of complex bridge networks have been evaluated using different algorithms based on the universal generating function (UGF). Further, the signature reliability of the considered networks has been determined using Owen's method.
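
As a rough stand-in for the UGF-based algorithms (which are not reproduced here), the sketch below computes the exact two-terminal reliability of the classic five-edge bridge network by exhaustive state enumeration; the edge reliabilities are hypothetical.

```python
# Exact two-terminal reliability of the classic five-edge bridge network by
# exhaustive state enumeration -- a brute-force stand-in for the UGF-based
# algorithms the paper proposes. Edge reliabilities are hypothetical.
from itertools import product

# Bridge network: source s, sink t, intermediate nodes a and b.
edges = {            # edge name -> (endpoints, reliability)
    "e1": (("s", "a"), 0.9),
    "e2": (("s", "b"), 0.9),
    "e3": (("a", "b"), 0.85),   # the "bridge" edge
    "e4": (("a", "t"), 0.9),
    "e5": (("b", "t"), 0.9),
}

def connected(up_edges):
    """Return True if s and t are connected using only the working edges."""
    reach, frontier = {"s"}, ["s"]
    while frontier:
        node = frontier.pop()
        for (u, v), _ in (edges[e] for e in up_edges):
            for nxt in ((v,) if u == node else (u,) if v == node else ()):
                if nxt not in reach:
                    reach.add(nxt)
                    frontier.append(nxt)
    return "t" in reach

names = list(edges)
reliability = 0.0
for state in product([0, 1], repeat=len(names)):   # all 2**5 edge states
    prob, up = 1.0, []
    for name, working in zip(names, state):
        r = edges[name][1]
        prob *= r if working else (1 - r)
        if working:
            up.append(name)
    if connected(up):
        reliability += prob
print(f"two-terminal (s-t) reliability: {reliability:.4f}")
```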

Findings

The present paper proposes an efficient algorithm to compute the reliability indices of complex bridge networks having i.i.d. lifetime components (nodes, edges) with the help of UGF and Owen's method. This study reveals that a slight change in the complex bridge network affects the reliability significantly. Finally, using the reliability structure function, the proposed algorithms are applied to find the signature and the mean time to failure (MTTF). From the signature, the failure probabilities corresponding to the edges of the network are determined.

Originality/value

In this work, the reliability characteristics and the signature reliability of complex bridge networks have been evaluated using the UGF method and Owen's method, respectively, which has not been done before.

Details

International Journal of Quality & Reliability Management, vol. 36 no. 2
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 10 August 2010

Gordon Burt

Abstract

The Wikipedia (2008) entry for mathematical sociology cites four books with ‘mathematical sociology’ in the title: Coleman (1964), Fararo (1973), Leik and Meeker (1975) and Bonacich (2008). Fararo (1973, pp. 764–766) provides a guide to the literature in mathematical sociology covering journals, bibliographies, reviews and expository essays, readers, texts, original monographs and research papers. Many of the references are either broader than mathematical sociology, for example, concerning the behavioural sciences in general, or narrower, dealing with a particular topic within sociology, or concerning a related field such as social psychology. Three classical original monographs are identified: Dodd (1942), Zipf (1949) and Rashevsky (1951). Included in a second generation of monographs is Coleman's (1964) ‘An Introduction to Mathematical Sociology’. Could it be that this is the first use of the phrase ‘mathematical sociology’?

Details

Conflict, Complexity and Mathematical Social Science
Type: Book
ISBN: 978-1-84950-973-2

Article
Publication date: 13 April 2015

Craig Hatcher

Abstract

Purpose

This paper aims to problematise the relation between “legality” and the state, through a case study analysis of law at work within the built environment. In doing so, the paper argues that studies on law and geography should consider the broader processes of state “law making” to understand the production of illegal space.

Design/methodology/approach

The liminal boundary of illegal/legal and its relation with the state is developed through a case study on the legalisation process of a “squatter” settlement located on the outskirts of Bishkek, the capital of Kyrgyzstan. The paper draws on primary qualitative research (semi-structured interviews) and legal analysis undertaken in Kyrgyzstan at various times over seven months between 2011 and 2013.

Findings

Examining law as static and pre-existing is problematic in developing an understanding of the production of illegal and legal spaces within the built environment. An emphasis on law-making and the process of legalisation draws attention to the different groups, practices and policies involved and reframes the relation between the state and legality.

Originality/value

Using a case study that anchors the analysis within law’s constitutive and contested presence in the built environment, the paper addresses a theoretical and empirical lacuna in legal geography by unpacking the “legal” with reference to its plurality within the state. Moreover, studies on law and geography have tended to focus on European or North American contexts, whereas this paper draws on data from Central Asia.

Details

International Journal of Law in the Built Environment, vol. 7 no. 1
Type: Research Article
ISSN: 1756-1450

Article
Publication date: 13 May 2019

Danny Woosik Choi, Seoki Lee and Manisha Singal

Abstract

Purpose

The purpose of this study is to examine how the lodging markets and state economies affected by Hurricane Sandy have recovered from the damage sustained. Specifically, this study examines and predicts the influence of revenue management key performance indicators (KPIs) on recovery and lodging revenue in the affected states and on the states’ economies. These KPIs include average daily rate (ADR), occupancy and revenue per available room (RevPAR).

Design/methodology/approach

Secondary financial data were collected for the states most damaged by Hurricane Sandy. Subsequently, pooled ordinary least squares (OLS) regression was conducted, combining time-dependent and non-time-dependent variables based on the states and the radius from the landfall.
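
A minimal sketch of this kind of pooled OLS specification is given below; the variable names, data and functional form are hypothetical and only illustrate the design, including the KPI identity RevPAR = ADR × occupancy.

```python
# A hypothetical pooled OLS on panel-style lodging data, illustrating the study
# design only; the variables, data and specification are made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "state": rng.choice(["NJ", "NY", "CT"], size=n),        # affected states (illustrative)
    "months_since_landfall": rng.integers(0, 36, size=n),   # time-dependent variable
    "adr": rng.normal(120, 15, size=n),                     # average daily rate ($)
    "occupancy": rng.uniform(0.5, 0.9, size=n),             # occupancy rate
})
df["revpar"] = df["adr"] * df["occupancy"]                  # KPI identity: RevPAR = ADR * occupancy
# Simulated lodging revenue responding to the KPI and recovery time (for illustration).
df["lodging_revenue"] = (10 * df["revpar"] + 5 * df["months_since_landfall"]
                         + rng.normal(0, 50, size=n))

# Pooled OLS combining time-dependent and non-time-dependent variables plus state dummies.
X = pd.get_dummies(df[["revpar", "months_since_landfall", "state"]],
                   columns=["state"], drop_first=True, dtype=float)
X = sm.add_constant(X)
fit = sm.OLS(df["lodging_revenue"], X).fit()
print(fit.summary())
```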

Findings

The results indicate that although the lodging market and the state economies have recovered since the onslaught of Hurricane Sandy, certain KPIs still need to improve.

Practical implications

Managerial implications are suggested in terms of dynamic pricing, market-based recovery, the KPIs, federal aid and facility management.

Originality/value

Despite its importance, research on the effects of climate change in the hospitality context has not progressed much since Hurricane Katrina. Time-dependent and non-time-dependent variables are combined in this analysis to gain a richer understanding of the impact on, and recovery of, the KPIs, lodging-market revenue and the states’ economies. Additional analysis based on the radius from the hurricane’s landfall was performed to examine impact and recovery by geographical proximity.

Details

International Journal of Contemporary Hospitality Management, vol. 31 no. 5
Type: Research Article
ISSN: 0959-6119

Article
Publication date: 1 September 2005

Brian L. Withrow and Brien Bolin

Abstract

Purpose

To document the police protective custody (PPC) process and in doing so develop a predictive model to better inform police decision makers on the factors that are more likely to result in the state maintaining custody of a child.

Design/methodology/approach

Data for the current study were gathered through a series of focus groups and 6,607 existing records of PPC admissions into the Wichita Children's Home (WCH) in Kansas. Systematic predictive modeling (logistic regression) was used to differentiate between children who are likely to need continued involvement of the child welfare system and those who could remain in the custody of their families.
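
The sketch below illustrates the general form of such a predictive model with a logistic regression on simulated admission records; the features and outcome rule are hypothetical, not the study's actual variables.

```python
# A hypothetical logistic regression sketch of the kind of predictive model described.
# Features, data and the outcome-generating rule are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.integers(0, 18, n),        # child's age (hypothetical feature)
    rng.integers(0, 2, n),         # prior PPC admission (0/1)
    rng.integers(0, 2, n),         # referral outside normal hours (0/1)
    rng.integers(1, 6, n),         # number of prior police contacts
])
# Outcome: 1 if the state maintains custody, 0 if the child returns to the family
# (simulated with a simple linear rule plus noise, for illustration only).
logits = 0.08 * X[:, 0] + 1.2 * X[:, 1] + 0.4 * X[:, 2] + 0.3 * X[:, 3] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
print("coefficients:", model.coef_.round(2))
```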

Findings

Documents the PPC process by which a law enforcement agency refers a child to be housed at the WCH. Reports on the design of a decision model that identifies the factors affecting the outcome of the PPC process.

Originality/value

Provides recommendations for streamlining the PPC process as well as the improvement of police policies and procedures.

Details

Policing: An International Journal of Police Strategies & Management, vol. 28 no. 3
Type: Research Article
ISSN: 1363-951X

Article
Publication date: 5 January 2010

Ron Layman, Samy Missoum and Jonathan Vande Geest

Abstract

Purpose

The use of stent‐grafts to canalize aortic blood flow for patients with aortic aneurysms is subject to serious failure mechanisms such as a leak between the stent‐graft and the aorta (Type I endoleak). The purpose of this paper is to describe a novel computational approach to understand the influence of relevant variables on the occurrence of stent‐graft failure and quantify the probability of failure for aneurysm patients.

Design/methodology/approach

A parameterized fluid‐structure interaction finite element model of aortic aneurysm is built based on a multi‐material formulation available in LS‐DYNA. Probabilities of failure are assessed using an explicit construction of limit state functions with support vector machines (SVM) and uniform designs of experiments. The probabilistic approach is applied to two aneurysm geometries to provide a map of probabilities of failure for various design parameter values.
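
The sketch below illustrates the SVM-based explicit design-space decomposition idea on a toy two-variable limit state; the analytic failure function stands in for the LS-DYNA fluid-structure simulation, and the threshold and sample sizes are arbitrary.

```python
# SVM-based explicit design-space decomposition on a toy two-variable limit state.
# In the paper each sample would be labelled safe/failed by the full simulation;
# here an inexpensive analytic function plays that role, purely for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)

def failed(x):
    """Toy limit state: 'leak' occurs when the combined response exceeds a threshold."""
    return (x[:, 0] ** 2 + 1.5 * x[:, 1] > 1.2).astype(int)

# Uniform design of experiments over the normalized design space [0, 1]^2.
X_doe = rng.uniform(0.0, 1.0, size=(60, 2))
y_doe = failed(X_doe)

# Explicit decision boundary approximating the limit state (safe vs failed).
svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_doe, y_doe)

# Probability of failure by Monte Carlo sampling on the cheap SVM surrogate.
X_mc = rng.uniform(0.0, 1.0, size=(100_000, 2))
p_fail = svm.predict(X_mc).mean()
print(f"estimated probability of failure: {p_fail:.3f}")
```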

Findings

Parametric studies conducted in the course of this research successfully identified intuitive failure regions in the parameter space, and failure probabilities were calculated using both a simplified and more complex aneurysmal geometry.

Originality/value

This research introduces the use of SVM‐based explicit design space decomposition for probabilistic assessment applied to bioengineering problems. This technique allows one to efficiently calculate probabilities of failure. It is particularly suited for problems where outcomes can only be classified as safe or failed (e.g. leak or no leak). Finally, the proposed fluid‐structure interaction simulation accounts for the initiation of Type I endoleak between the graft and the aneurysm due to simultaneous fluid and solid forces.

Details

Engineering Computations, vol. 27 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 13 July 2018

M. Arif Wani and Saduf Afzal

Abstract

Purpose

Many strategies have been put forward for training deep network models; however, stacking several layers of non-linearities typically results in poor propagation of gradients and activations. The purpose of this paper is to explore a two-step strategy in which an initial deep learning model is first obtained by unsupervised learning and is then optimized by fine-tuning. A number of fine-tuning algorithms are explored in this work for optimizing deep learning models, including a newly proposed algorithm in which backpropagation with adaptive gain is integrated with the dropout technique; the authors evaluate its performance in fine-tuning the pretrained deep network.

Design/methodology/approach

The parameters of the deep neural networks are first learnt using greedy layer-wise unsupervised pretraining. The proposed technique is then used to perform supervised fine-tuning of the deep neural network model. An extensive experimental study is performed to evaluate the performance of the proposed fine-tuning technique on three benchmark data sets: USPS, Gisette and MNIST. The authors have tested the approach on data sets of varying size, using randomly chosen training samples of 20, 50, 70 and 100 percent of the original data set.
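
A minimal sketch of the two-step strategy is shown below: greedy layer-wise autoencoder pretraining followed by supervised fine-tuning with dropout. Plain SGD stands in for the authors' adaptive-gain backpropagation, and random data stand in for USPS, Gisette and MNIST.

```python
# Two-step strategy sketch: greedy layer-wise unsupervised pretraining, then
# supervised fine-tuning with dropout. Plain SGD replaces the authors'
# adaptive-gain backpropagation; data are random stand-ins for the benchmarks.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
X = torch.rand(512, 64)                       # stand-in inputs
y = torch.randint(0, 10, (512,))              # stand-in class labels
layer_sizes = [64, 128, 64]

# Step 1: greedy layer-wise unsupervised pretraining with per-layer autoencoders.
encoders, inputs = [], X
for d_in, d_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    enc, dec = nn.Linear(d_in, d_out), nn.Linear(d_out, d_in)
    opt = torch.optim.SGD(list(enc.parameters()) + list(dec.parameters()), lr=0.1)
    for _ in range(100):                      # reconstruct this layer's input
        opt.zero_grad()
        recon = dec(torch.sigmoid(enc(inputs)))
        loss = F.mse_loss(recon, inputs)
        loss.backward()
        opt.step()
    encoders.append(enc)
    inputs = torch.sigmoid(enc(inputs)).detach()  # activations feed the next layer

# Step 2: supervised fine-tuning of the pretrained stack with dropout.
model = nn.Sequential(
    encoders[0], nn.Sigmoid(), nn.Dropout(0.5),
    encoders[1], nn.Sigmoid(), nn.Dropout(0.5),
    nn.Linear(layer_sizes[-1], 10),
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = F.cross_entropy(model(X), y)
    loss.backward()
    opt.step()
print("final training loss:", float(loss))
```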

Findings

Through an extensive experimental study, it is concluded that the two-step strategy and the proposed fine-tuning technique yield promising results in the optimization of deep network models.

Originality/value

This paper proposes employing several algorithms for fine-tuning deep network models. A new approach that integrates the adaptive-gain backpropagation (BP) algorithm with the dropout technique is proposed for fine-tuning deep networks. An evaluation and comparison of the various fine-tuning algorithms on three benchmark data sets is presented in the paper.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 11 no. 3
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 1 April 1994

Hamid Z. Fardi

Abstract

A numerical device simulation is developed to study the steady-state and transient current-voltage characteristics of a double-heterostructure AlGaAs/GaAs PNPN electro-photonic device when its performance is influenced by interface and bulk recombination mechanisms. The simulation results show that the holding current and voltage and the breakover point are strongly affected by varying the minority carrier lifetime at the outer heterojunctions. Numerical results also indicate that shortening the minority carrier lifetime in the inner PN homojunction region only increases the OFF-state current. These results are in agreement with experimental data on AlGaAs/GaAs PNPN devices. The numerical modelling approach taken in this study is shown to be essential in the design and optimization of the PNPN switch.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 13 no. 4
Type: Research Article
ISSN: 0332-1649
