Search results

1 – 10 of 30
Open Access
Article
Publication date: 11 August 2023

Niansheng Xi and Hongmin Xu


Abstract

Purpose

The study aims to provide a basis for the effective use of safety-related information data and a quantitative method for assessing the occurrence probability of safety risks such as fatigue fracture of key components.

Design/methodology/approach

The fatigue crack growth rate is dispersive, and this dispersion is often described accurately with a probability density. In view of the external dispersion caused by the load, a simple and applicable probability expression for the fatigue crack growth rate is adopted based on fatigue growth theory. Considering the isolation among the pairs of crack length a and crack formation time t (a∼t data) obtained from the same kind of structural parts, a statistical analysis approach for the distribution of t is proposed, which divides the crack length into several segments. Furthermore, according to the compatibility criterion of crack growth, that is, the statistical correspondence in development among the a∼t data, the probability model of the crack growth rate is established.

Findings

The results show that the crack growth rate in the stable growth stage can be approximately expressed by the crack growth control curve da/dt = Q•a, where the probability density of the crack growth parameter Q represents the external dispersion; at given values of a, t follows a two-parameter Weibull distribution.

Originality/value

The probability density f(Q) can be estimated by using the probability model of crack growth rate, and a calculation example shows that the estimation method is effective and practical.
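The growth law da/dt = Q•a and the segment-wise Weibull fits can be illustrated with simulated data; the lognormal choice for Q and all parameter values below are assumptions for illustration, not taken from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a0 = 1.0                                              # initial crack length (arbitrary units)
Q = rng.lognormal(mean=-2.0, sigma=0.2, size=2000)    # assumed external dispersion in Q

# da/dt = Q*a integrates to a(t) = a0*exp(Q*t), so the crack formation time
# at length a is t = ln(a/a0)/Q.  Divide the crack length into segments and
# fit a two-parameter Weibull (location fixed at 0) to t at each segment.
scales = []
for a in (2.0, 4.0, 8.0):
    t = np.log(a / a0) / Q                            # formation times at length a
    shape, loc, scale = stats.weibull_min.fit(t, floc=0.0)
    scales.append(scale)
```

As expected under the model, the fitted Weibull scale grows with ln(a), so the a∼t data at successive segments are statistically compatible.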

Details

Railway Sciences, vol. 2 no. 3
Type: Research Article
ISSN: 2755-0907


Open Access
Article
Publication date: 7 August 2019

Jinbao Zhang, Yongqiang Zhao, Ming Liu and Lingxian Kong



Abstract

Purpose

A generalized distribution with a wide range of skewness and elongation is well suited to data mining and robust to misspecification of the distribution. Hence, the purpose of this paper is to present a distribution-based approach for estimating degradation reliability under these conditions.

Design/methodology/approach

Tukey’s g-and-h distribution, with its quantile expression, is introduced to fit the degradation paths of the population over time. The Newton–Raphson algorithm is used to approximately evaluate the reliability. Simulation verification of parameter estimation with particle swarm optimization (PSO) is carried out. The effectiveness and validity of the proposed approach for degradation reliability are verified by a two-stage verification and by comparison with others’ work.

Findings

Simulation studies have proved the effectiveness of PSO in parameter estimation. Two degradation datasets, of GaAs laser devices and of crack growth, are analyzed with the proposed approach. The results show that it matches the initial failure time well and is more compatible than the normal and Weibull distributions.

Originality/value

Tukey’s g-and-h distribution is proposed for the first time to investigate the influence of the tail and the skewness on degradation reliability. In addition, the parameters of Tukey’s g-and-h distribution are estimated by PSO with the root-mean-square error as the objective function.
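The quantile form of Tukey’s g-and-h distribution and a PSO fit with an RMSE objective can be sketched as follows. The synthetic data, parameter bounds and swarm settings are all assumptions for illustration; the paper’s actual estimation details are not reproduced:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def gh_quantile(z, A, B, g, h):
    # Tukey g-and-h quantile transform of a standard-normal quantile z (g != 0)
    return A + B * (np.expm1(g * z) / g) * np.exp(h * z**2 / 2.0)

# Synthetic "degradation" sample drawn from a known g-and-h distribution
true_params = (10.0, 2.0, 0.3, 0.1)
sample = gh_quantile(rng.standard_normal(5000), *true_params)

# RMSE between model and empirical quantiles: the objective assigned to PSO
probs = np.linspace(0.01, 0.99, 99)
emp_q = np.quantile(sample, probs)
zq = stats.norm.ppf(probs)

def rmse(theta):
    return np.sqrt(np.mean((gh_quantile(zq, *theta) - emp_q) ** 2))

def pso(obj, lo, hi, n=40, iters=200, w=0.7, c1=1.5, c2=1.5):
    # A deliberately small particle swarm optimizer (illustrative, not tuned)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, size=(n, lo.size))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([obj(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, float(pval.min())

theta, err = pso(rmse, lo=[0.0, 0.1, 0.01, 0.0], hi=[20.0, 5.0, 1.0, 0.5])
```

The lower bound on g avoids the g → 0 singularity of the quantile expression; in the limit the g term reduces to z.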

Details

Engineering Computations, vol. 36 no. 5
Type: Research Article
ISSN: 0264-4401


Open Access
Article
Publication date: 21 September 2022

Sang Won Lee, Su Bok Ryu, Tae Young Kim and Jin Q. Jeon



Abstract

This paper examines how the macroeconomic environment affects the determinants of prepayment of mortgage loans from October 2004 to February 2020. For more accurate analysis, the authors define the timing of prepayment not only as before the loan maturity but also as the time when 50% or more of the loan principal is repaid. The results show that, during the global financial crisis as well as the recent period of low interest rates, macroeconomic variables such as interest rate spreads and housing prices have a different effect compared to the normal situation. Also, significant explanatory variables, such as the debt to income (DTI) ratio, loan amount ratio and poor credit score, have different effects depending on the macroenvironment. On the other hand, in all periods, the possibility of prepayment increases as the comprehensive loan to value (CLTV) increases, as borrowers are younger and as loan maturities are shorter. The results suggest that, in the case of the ultralong (40-year) mortgage loans recently introduced to support young people purchasing houses, the prepayment risk can be, at least partially, mitigated by offsetting the increase in prepayment by young people against the decrease in prepayment due to the long loan maturity. In addition, this study confirms that the accelerated failure time model has the potential to be more appropriate than the logit model and the Cox proportional hazards model as a prepayment model for individual borrower analysis in terms of explanatory power.

Details

Journal of Derivatives and Quantitative Studies: 선물연구, vol. 30 no. 4
Type: Research Article
ISSN: 1229-988X


Open Access
Article
Publication date: 17 July 2023

Lei Xu, K. Praveen Parboteeah and Hanqing Fang


Abstract

Purpose

The authors enrich and extend existing institutional anomie theory (IAT) in the hope of sharpening the understanding of the joint effects of selected cultural values and social institutional changes on women's pre-entrant entrepreneurial attempts. The authors theorize that women are culturally discouraged from pursuing pre-entrant entrepreneurial attempts or wealth accumulation in a specific culture. This discouragement creates an anomic strain that motivates women to deviate from cultural prescriptions by engaging in pre-entrant entrepreneurial attempts at a faster speed. Building on this premise, the authors hypothesize that changes in social institutions facilitate women's means of achievement due to the potential opportunities inherent in such institutional changes.

Design/methodology/approach

Using a randomly selected sample of 1,431 registered active individual users with a minimum of 10,000 followers on a leading entertainment live-streaming platform in the People's Republic of China, the authors examined a unique mix of cultural and institutional changes and their effects on the speed of women's engagement in live-streaming platform activity.

Findings

The authors find support for the impact of the interaction between changes in social institution conditions and cultural values. Unexpectedly, the authors also find a negative impact of cultural values on women's speed of engaging in pre-entrant entrepreneurial attempts.

Originality/value

The authors add institutional change to the IAT framework and provide a novel account for the variation in the pre-entrant entrepreneurial attempts by women on the platform.

Details

New England Journal of Entrepreneurship, vol. 26 no. 2
Type: Research Article
ISSN: 2574-8904


Open Access
Article
Publication date: 12 April 2018

Chunlan Li, Jun Wang, Min Liu, Desalegn Yayeh Ayal, Qian Gong, Richa Hu, Shan Yin and Yuhai Bao



Abstract

Purpose

Extreme high temperatures are a significant feature of global climate change and have become more frequent and intense in recent years. They pose a significant threat to both human health and economic activity and are thus receiving increasing research attention. Understanding the hazards posed by extreme high temperatures is important for selecting intervention measures aimed at reducing socioeconomic and environmental damage.

Design/methodology/approach

In this study, detrended fluctuation analysis is used to identify extreme high-temperature events, based on homogenized daily minimum and maximum temperatures from nine meteorological stations in a major grassland region, Hulunbuir, China, over the past 56 years.

Findings

Compared with other commonly used functions, the Weibull distribution was selected to simulate extreme high-temperature scenarios. An increasing trend of extreme high temperature was found, and the probability of its indices increased significantly, with regional differences. The extreme high temperatures in four return periods exhibited an extremely low hazard in the central region of Hulunbuir and increased from the center to the periphery. As the length of the return period increased, the area of high and extremely high hazard increased. Topography and anomalous atmospheric circulation patterns may be the main factors influencing the occurrence of extreme high temperatures.

Originality/value

These results may contribute to better insight into the hazard of extreme high temperatures and facilitate the development of appropriate adaptation and mitigation strategies to cope with their adverse effects.
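The link between a fitted Weibull distribution and return-period hazard levels can be illustrated on a simulated temperature series; the data and parameters below are stand-ins, not the Hulunbuir station records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Stand-in "annual maximum temperature" series (degrees C), 56 values to
# echo the paper's 56-year record length
annual_max = 30.0 + stats.weibull_min.rvs(c=2.5, scale=5.0, size=56, random_state=rng)

# Fit a Weibull with the location pinned just below the sample minimum, then
# read off return levels: the T-year level x satisfies P(X > x) = 1/T,
# i.e. x = F^{-1}(1 - 1/T).
loc = annual_max.min() - 1e-6
c, loc_hat, scale = stats.weibull_min.fit(annual_max, floc=loc)

return_levels = {T: stats.weibull_min.ppf(1 - 1 / T, c, loc_hat, scale)
                 for T in (10, 20, 50, 100)}
```

Longer return periods give higher levels, which matches the finding that high-hazard area grows with return-period length.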

Details

International Journal of Climate Change Strategies and Management, vol. 11 no. 1
Type: Research Article
ISSN: 1756-8692


Open Access
Article
Publication date: 11 February 2019

Peter Burggraef, Johannes Wagner, Matthias Dannapfel and Sebastian Patrick Vierschilling



Abstract

Purpose

The purpose of this paper is to investigate the benefit of pre-emptive disruption management measures for assembly systems towards the target dimension adherence to delivery times.

Design/methodology/approach

The research was conducted by creating simulation models of typical assembly systems and measuring their varying throughput times under changes in their disruption profiles. Because assembly systems vary widely, key influencing factors were investigated and used as a foundation for the simulation setup. Additionally, a disruption profile was developed for each simulated process, using the established disruption categories material, information and capacity. The categories are described by statistical distributions defining the interval between disruptions and the disruption duration. Using a statistical experimental design, the effect of a reduced disruption potential on the throughput time was investigated.

Findings

Pre-emptive disruption management is beneficial, but its benefit depends on the operated assembly system and its organisation form, such as line or group assembly. Measures have, on average, a higher beneficial impact on group assemblies than on line assemblies. Furthermore, it was shown that the benefit, in the form of better adherence to delivery times, per unit of reduced disruption potential declines and approaches a distinct maximum.

Originality/value

Characterising the benefit of pre-emptive disruption management measures enables managers to use this concept in daily production to minimise overall costs. Although the influence of pre-emptive disruption measures is hard to predict, these results can be implemented in a heuristic for choosing such measures efficiently.
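The simulation ingredients described in the approach, statistically distributed disruption intervals and durations that inflate throughput time, can be sketched coarsely. All parameters are hypothetical and this single-station model is far simpler than the authors' simulation models:

```python
import numpy as np

rng = np.random.default_rng(6)

def mean_throughput_time(mtbd, mttr, n_units=5000, cycle=1.0):
    """Mean time per unit through one station with random disruptions.

    mtbd: mean time between disruptions (exponential interval),
    mttr: mean disruption duration (exponential repair time).
    A unit's processing is extended by any disruption that strikes it.
    """
    t = 0.0
    next_fail = rng.exponential(mtbd)
    total = 0.0
    for _ in range(n_units):
        start = t
        t += cycle
        while next_fail < t:                   # disruption interrupts the unit
            t += rng.exponential(mttr)         # repair extends the throughput time
            next_fail = t + rng.exponential(mtbd)
        total += t - start
    return total / n_units

base = mean_throughput_time(mtbd=20.0, mttr=5.0)
improved = mean_throughput_time(mtbd=40.0, mttr=5.0)   # halved disruption potential
```

Even this toy model reproduces the qualitative point: reducing the disruption potential shortens mean throughput time, and the residual time stays bounded below by the undisturbed cycle time.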

Details

Journal of Modelling in Management, vol. 14 no. 1
Type: Research Article
ISSN: 1746-5664


Open Access
Article
Publication date: 9 April 2021

Rosane Hungria-Gunnelin, Fredrik Kopsch and Carl Johan Enegren



Abstract

Purpose

The role of list price is often discussed in a narrative describing sellers’ preferences or sellers’ price expectations. This paper aims to investigate a set of list price strategies that real estate brokers have available to influence the outcome of the sale, which may often be self-serving.

Design/methodology/approach

By analyzing real estate brokers’ arguments on the choice of the list price level, a couple of hypotheses are formulated with regard to different expected outcomes that depend on the list price. This study empirically tests two hypotheses about the underlying incentives in the choice of list price from the real estate broker’s perspective: H1, a list price below market value leads to a higher sales price; H2, a list price below market value leads to a quicker sale. To investigate the two hypotheses, this paper adopts different methodological frameworks: H1 is tested with a classical hedonic model, while H2 is tested with a duration model. The hypotheses are further tested by splitting the full sample into two price segments: above and below the median list price.

Findings

The results show that H1 is rejected for the full sample and for the two sub-samples. That is, contrary to the common narrative among brokers that underpricing leads to a higher sales price, underpricing leads to a lower sales price. H2, however, is supported for the full sample and for the two sub-samples. The latter result suggests that brokers may be tempted to recommend a list price significantly below the expected selling price to minimize their effort while showing a high turnover of apartments.

Originality/value

Although a large number of previous studies have analyzed list price strategies in the housing market, this paper is one of the few empirical studies to address the effect of the chosen list price level on the auction outcomes of non-distressed housing sales.
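The two methodological pieces, a hedonic price regression (H1) and a sale-time relation (H2), can be sketched on simulated data whose effect directions are chosen to mimic the findings. All magnitudes and variable definitions are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Simulated listings: log market value and a fractional underpricing variable
# (how far the list price sits below market value)
log_value = rng.normal(15.0, 0.4, n)
underprice = rng.uniform(0.0, 0.15, n)

# Sales price: underpricing lowers it (H1 rejected, as in the findings)
log_sale = log_value - 0.5 * underprice + rng.normal(0.0, 0.05, n)

# Hedonic regression of log sales price on log value and underpricing
X = np.column_stack([np.ones(n), log_value, underprice])
beta, *_ = np.linalg.lstsq(X, log_sale, rcond=None)

# Time on market: underpricing shortens the expected sale time (H2 supported)
tom = rng.exponential(scale=30.0 * np.exp(-5.0 * underprice))
```

The coefficient `beta[2]` comes out negative and time on market falls with underpricing, the two directions the paper reports; a real test would of course use the transaction data and a proper duration model rather than this simulation.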

Details

International Journal of Housing Markets and Analysis, vol. 14 no. 3
Type: Research Article
ISSN: 1753-8270


Open Access
Article
Publication date: 27 July 2022

Ruilin Yu, Yuxin Zhang, Luyao Wang and Xinyi Du



Abstract

Purpose

Time headway (THW) is an essential parameter in traffic safety and is used as a typical control variable by many vehicle control algorithms, especially in safety-critical ADAS and automated driving systems. However, due to the randomness of human drivers, THW cannot be represented accurately, which hinders deeper research.

Design/methodology/approach

In this work, two data sets are used as the experimental data to calculate the goodness-of-fit of 18 commonly used distribution models of THW to select the best distribution model. Subsequently, the characteristic parameters of traffic flow are extracted from the data set, and three variables with higher importance are extracted using the random forest model. Combining the best distribution model parameters of the data set, this study obtained a distribution model with adaptive parameters, and its performance and applicability are verified.


Originality/value

The results show that the proposed model achieves a 62.7% performance improvement over the distribution model with fixed parameters. Moreover, the parameter function of the distribution model can be regarded as a quantitative analysis of the degree to which the traffic flow state influences THW.
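The distribution-selection step of the approach can be sketched as follows. The headway sample is simulated, only four candidate families stand in for the paper's 18, and the Kolmogorov–Smirnov statistic is used here as one plausible goodness-of-fit measure, not necessarily the authors' choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulated time-headway sample (seconds); real THW data are right-skewed,
# so a lognormal stand-in is used here
thw = rng.lognormal(mean=0.7, sigma=0.5, size=3000)

# Fit each candidate family and rank by the Kolmogorov-Smirnov statistic
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma,
              "weibull_min": stats.weibull_min, "norm": stats.norm}
scores = {}
for name, dist in candidates.items():
    params = dist.fit(thw)
    scores[name] = stats.kstest(thw, dist.cdf, args=params).statistic

best = min(scores, key=scores.get)
```

The adaptive-parameter step would then regress the winning family's parameters on traffic flow features (for example, via the random forest importances the abstract mentions).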

Details

Journal of Intelligent and Connected Vehicles, vol. 5 no. 3
Type: Research Article
ISSN: 2399-9802


Open Access
Article
Publication date: 29 May 2023

Christopher Amaral, Ceren Kolsarici and Mikhail Nediak



Abstract

Purpose

The purpose of this study is to understand the profit implications of analytics-driven centralized discriminatory pricing at the headquarter level compared with sales force price delegation in the purchase of an aftermarket good through an indirect retail channel with symmetric information.

Design/methodology/approach

Using individual-level loan application and approval data from a North American financial institution and segment-level customer risk as the price discrimination criterion for the firm, the authors develop a three-stage model that accounts for the salesperson’s price decision within the limits of the latitude provided by the firm; the firm’s decision to approve or not approve a sales application; and the customer’s decision to accept or reject a sales offer conditional on the firm’s approval. Next, the authors compare the profitability of this sales force price delegation model to that of a segment-level centralized pricing model where agent incentives and consumer prices are simultaneously optimized using a quasi-Newton nonlinear optimization algorithm (i.e. Broyden–Fletcher–Goldfarb–Shanno algorithm).

Findings

The results suggest that implementation of analytics-driven centralized discriminatory pricing and optimal sales force incentives leads to double-digit lifts in firm profits. Moreover, the authors find that the high-risk customer segment is less price-sensitive and firms, upon leveraging this segment’s willingness to pay, not only improve their bottom-line but also allow these marginalized customers with traditionally low approval rates access to loans. This points out the important customer welfare implications of the findings.

Originality/value

Substantively, to the best of the authors’ knowledge, this paper is the first to empirically investigate the profitability of analytics-driven segment-level (i.e. discriminatory) centralized pricing compared with sales force price delegation in indirect retail channels (i.e. where agents are external to the firm and have access to competitor products). The analysis takes into account the decisions of the three key stakeholders of the process, namely, the consumer, the salesperson and the firm, and simultaneously optimizes the sales commission and the centralized consumer price.

Details

European Journal of Marketing, vol. 57 no. 13
Type: Research Article
ISSN: 0309-0566


Open Access
Article
Publication date: 19 August 2021

Linh Truong-Hong, Roderik Lindenbergh and Thu Anh Nguyen



Abstract

Purpose

Terrestrial laser scanning (TLS) point clouds have been widely used in deformation measurement for structures. However, the reliability and accuracy of the resulting deformation estimates depend strongly on the quality of each step of the workflow, which is not fully addressed. This study aims to give insight into the errors of these steps, and its results would serve as guidelines for the practical community to either develop a new workflow or refine an existing one for deformation estimation based on TLS point clouds. Thus, the main contributions of the paper are: investigating how point cloud registration error affects the resulting deformation estimates; identifying an appropriate segmentation method for extracting data points of a deformed surface; investigating a methodology to determine an un-deformed or reference surface for estimating deformation; and proposing a methodology to minimize the impact of outliers, noisy data and/or mixed pixels on deformation estimation.

Design/methodology/approach

In practice, the quality of the point clouds and of the surface extraction strongly impacts the resulting deformation estimates based on laser scanning point clouds, which can lead to an incorrect decision on the state of the structure if uncertainty is present. To gain more comprehensive insight into those impacts, this study addresses four issues: data errors due to registration of data from multiple scanning stations (Issue 1), methods used to extract point clouds of structure surfaces (Issue 2), selection of the reference surface Sref against which to measure deformation (Issue 3), and the presence of outliers and/or mixed pixels (Issue 4). The investigation is demonstrated by estimating the deformation of a bridge abutment, a building and an oil storage tank.

Findings

The study shows that both random sample consensus (RANSAC) and region growing-based methods [cell-based/voxel-based region growing (CRG/VRG)] can extract data points of surfaces, but RANSAC is applicable only to a primary primitive surface (e.g. a plane in this study) subjected to a small deformation (case studies 2 and 3) and cannot eliminate mixed pixels. On the other hand, CRG and VRG are suitable methods for deformed, free-form surfaces. In addition, in practice, a reference surface of a structure is mostly not available. Using a plane fitted to a point cloud of the current surface would produce unrealistic and inaccurate deformation estimates, because outlier data points and data points of damaged areas affect the accuracy of the fitted plane. This study therefore recommends using a reference surface determined from a design concept/specification. A smoothing method with a spatial interval can effectively minimize the negative impact of outliers, noisy data and/or mixed pixels on deformation estimation.
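A minimal RANSAC plane fit of the kind compared in these findings might look like the sketch below. The points are synthetic, the threshold and iteration count are illustrative, and this is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic scan of a nearly planar surface: inliers on z = 0.02 m with
# 2 mm noise, plus scattered outliers mimicking mixed pixels
n_in, n_out = 900, 100
xy = rng.uniform(0.0, 10.0, size=(n_in, 2))
inliers = np.column_stack([xy, 0.02 + rng.normal(0.0, 0.002, n_in)])
outliers = rng.uniform([0.0, 0.0, -0.5], [10.0, 10.0, 0.5], size=(n_out, 3))
pts = np.vstack([inliers, outliers])

def ransac_plane(pts, iters=200, tol=0.005):
    """Return (normal, d, inlier_mask) for the plane n.p + d = 0."""
    best_n, best_d, best_mask = None, None, None
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        n = n / norm
        d = -n @ sample[0]
        mask = np.abs(pts @ n + d) < tol  # points within tol of the plane
        if best_mask is None or mask.sum() > best_mask.sum():
            best_n, best_d, best_mask = n, d, mask
    return best_n, best_d, best_mask

normal, d, mask = ransac_plane(pts)
```

As the findings note, such a fit recovers a primitive plane well but simply discards non-planar detail, which is why region-growing methods are preferred for deformed, free-form surfaces.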

Research limitations/implications

Due to logistical difficulties, an independent measurement could not be established to assess the deformation accuracy based on the TLS point clouds in the case studies of this research. However, common laser scanners using the time-of-flight or phase-shift principle provide point clouds with accuracy on the order of 1–6 mm, while point clouds from triangulation scanners have sub-millimetre accuracy.

Practical implications

This study gives insight into the errors of these steps, and its results would serve as guidelines for the practical community to either develop a new workflow or refine an existing one for deformation estimation based on TLS point clouds.

Social implications

The results of this study would provide guidelines for the practical community to either develop a new workflow or refine an existing one for deformation estimation based on TLS point clouds. A low-cost method can be applied for deformation analysis of structures.

Originality/value

Although a large number of studies have used laser scanning to measure structural deformation over the last two decades, the methods applied mainly measured change between two states (or epochs) of the structure surface and focused on quantifying deformation based on TLS point clouds. Those studies proved that a laser scanner can be an alternative instrument for acquiring spatial information for deformation monitoring. However, there are still challenges in establishing an appropriate procedure for collecting high-quality point clouds and in developing methods for interpreting the point clouds to obtain reliable and accurate deformation estimates when uncertainty, in both data quality and reference information, is present. Therefore, this study demonstrates the impact on deformation estimation of data quality, in terms of point cloud registration error, of the methods selected for extracting point clouds of surfaces, of the identification of reference information, and of the presence of outliers, noisy data and/or mixed pixels.

Details

International Journal of Building Pathology and Adaptation, vol. 40 no. 3
Type: Research Article
ISSN: 2398-4708

