Search results

1 – 10 of 208
Article
Publication date: 6 February 2007

L. Tang, L.C. Thomas, S. Thomas and J‐F. Bozzetto

The purpose of this research is to undertake an examination of the impacts of socio‐demographic and economic variables on the probability of purchasing financial products. There…

Abstract

Purpose

The purpose of this research is to undertake an examination of the impacts of socio‐demographic and economic variables on the probability of purchasing financial products. Relatively little empirical research has been undertaken to understand how the underlying economy affects customers' subsequent financial product purchase behaviours. Understanding this influence would improve prediction of when purchases will occur and is hence important for the customer lifetime value models of financial service organisations.

Design/methodology/approach

Two proportional hazard modelling approaches – Cox and Weibull – are compared in terms of predictive ability on a data set from a major insurance company. The risk factors for purchase are both economic and socio‐demographic.
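The kind of head-to-head comparison of predictive ability described above is usually scored with a concordance index (c-index) on observed purchase times. The following is a minimal, self-contained sketch of that metric; the toy times, risk scores and event flags are invented, not the paper's data:

```python
from itertools import combinations

def concordance_index(times, scores, events):
    """Fraction of comparable pairs in which the subject with the
    higher risk score experiences the event first (ties count half)."""
    concordant = ties = comparable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue
        first = i if times[i] < times[j] else j
        # the pair is only comparable if the earlier time is an observed event
        if not events[first]:
            continue
        comparable += 1
        other = j if first == i else i
        if scores[first] > scores[other]:
            concordant += 1
        elif scores[first] == scores[other]:
            ties += 1
    return (concordant + 0.5 * ties) / comparable

# perfectly ranked toy data: highest risk purchases first
print(concordance_index([1, 2, 3, 4], [4, 3, 2, 1], [1, 1, 1, 1]))  # 1.0
```

A c-index of 0.5 corresponds to random ranking, so the Cox and Weibull fits can be compared by which one scores higher on a held-out sample.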

Findings

The results show that the external economic environment is an extremely important influence driving customers' financial product purchasing behaviours. Furthermore, the results indicate that Cox's proportional hazard models are superior to Weibull proportional hazard models in this case because of an annual purchase effect.

Practical implications

Financial organisations need to consider the current economic conditions before determining how much marketing effort to undertake.

Originality/value

The originality of this paper is that it considers economic conditions and socio‐demographic variables in modelling the long run purchase behaviour of customers for insurance and savings products. It has a large data set from a major insurance company. It is also one of the first papers to make a detailed comparison between the semi‐parametric and parametric proportional hazard models in the bank marketing area.

Details

International Journal of Bank Marketing, vol. 25 no. 1
Type: Research Article
ISSN: 0265-2323

Article
Publication date: 1 March 2022

Qiang Zhang, Xinyu Zhu, J. Leon Zhao and Liang Liang

Digital platforms have grown significantly in recent years. Although high platform failure risks (PFR) have plagued the industry, the literature has only given this issue scant…

Abstract

Purpose

Digital platforms have grown significantly in recent years. Although high platform failure risks (PFR) have plagued the industry, the literature has given this issue only scant treatment. Customer sentiments are crucial for platforms, and a growing body of knowledge addresses their analysis. However, previous studies have overlooked the rich contextual information embedded in user-generated content (UGC). Confronting the research gap on digital platform failure and the drawbacks of customer sentiment analysis, we aim to detect signals of PFR based on our advanced customer sentiment analysis approach for UGC and to illustrate how customer sentiments could predict PFR.

Design/methodology/approach

We develop a deep-learning based approach to improve the accuracy of customer sentiment analysis for further predicting PFR. We leverage a unique dataset of online P2P lending, i.e., a typical setting of transactional digital platforms, including 97,876 pieces of UGC for 2,467 platforms from 2011 to 2018.

Findings

Our results show that the proposed approach can improve the accuracy of measuring customer sentiment by integrating a word embedding technique and bidirectional long short-term memory (Bi-LSTM). On top of that, we show that customer sentiment can improve the accuracy of predicting PFR by 10.96%. Additionally, we do not focus only on a single type of customer sentiment in a static view; we also discuss how the predictive power varies across positive, neutral and negative customer sentiments and during different time periods.
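As a rough illustration of these findings, the embedding-plus-bidirectional-recurrence idea can be sketched in a few lines of NumPy. This is a toy with random weights, plain tanh cells standing in for LSTM gates, and invented dimensions, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, emb_dim, hidden = 50, 8, 4

E = rng.normal(size=(vocab, emb_dim))              # word-embedding table
Wf = rng.normal(size=(hidden, emb_dim + hidden))   # forward recurrent weights
Wb = rng.normal(size=(hidden, emb_dim + hidden))   # backward recurrent weights
Wo = rng.normal(size=(3, 2 * hidden))              # classes: neg / neutral / pos

def step(W, h, x):
    # one recurrent step (tanh cell in place of an LSTM gate stack)
    return np.tanh(W @ np.concatenate([x, h]))

def classify(token_ids):
    """Sentiment distribution for a token sequence, read both ways."""
    xs = [E[t] for t in token_ids]
    hf = np.zeros(hidden)
    for x in xs:                  # forward pass over the sequence
        hf = step(Wf, hf, x)
    hb = np.zeros(hidden)
    for x in reversed(xs):        # backward pass over the sequence
        hb = step(Wb, hb, x)
    logits = Wo @ np.concatenate([hf, hb])
    p = np.exp(logits - logits.max())
    return p / p.sum()            # softmax over the three sentiment classes
```

Concatenating the final forward and backward states is what lets the classifier use context on both sides of each review, which is the property the paper exploits.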

Originality/value

Our research results contribute to the literature stream on digital platform failure with online information processing and offer implications for digital platform risk management with advanced customer sentiment analysis.

Details

Industrial Management & Data Systems, vol. 122 no. 3
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 1 April 2005

Marc J. LeClere

Research in the area of financial distress often uses a proportional hazards model to determine the influence of covariates on the duration of time that precedes financial…

Abstract

Research in the area of financial distress often uses a proportional hazards model to determine the influence of covariates on the duration of time that precedes financial distress. A critical issue in the use of a proportional hazards model is the choice between time‐invariant and time‐dependent covariates. Time‐invariant covariates remain fixed while time‐dependent covariates change during the estimation of the model. Although the choice of covariates might substantially affect the estimation of the proportional hazards model, the existing literature often fails to consider the potential effect of this choice on model estimation. This paper reviews the distinction between time‐invariant and time‐dependent covariates and the effect of covariate selection on the estimation of a proportional hazards model. Using a sample of financially distressed and non‐financially distressed firms, this paper suggests that the choice of time dependence substantially influences model estimation and that covariate selection should be given more serious consideration in financial distress research.
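In practice, time-dependent covariates enter a proportional hazards model through counting-process (start, stop] rows, with a subject's spell split whenever a covariate changes. A minimal sketch of that expansion, with hypothetical inputs:

```python
def split_episodes(end_time, event, covariate_path):
    """Expand one subject into (start, stop, value, event) rows for a
    single time-dependent covariate. `covariate_path` is a list of
    (change_time, value) pairs starting at time 0."""
    rows = []
    for i, (start, value) in enumerate(covariate_path):
        stop = (covariate_path[i + 1][0]
                if i + 1 < len(covariate_path) else end_time)
        # the event indicator belongs only to the final interval
        rows.append((start, stop, value, event if stop == end_time else 0))
    return rows

# a firm observed to time 10 (distress occurs), ratio re-measured at t=4 and t=7
rows = split_episodes(10, 1, [(0, 1.2), (4, 2.5), (7, 0.9)])
```

Each row then contributes to the partial likelihood only over its own interval, which is exactly how the time-dependent specification differs from fixing the covariate at its initial value.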

Details

Review of Accounting and Finance, vol. 4 no. 4
Type: Research Article
ISSN: 1475-7702

Book part
Publication date: 1 December 2008

Zhen Wei

Survival (default) data are frequently encountered in financial (especially credit risk), medical, educational, and other fields, where the “default” can be interpreted as the…

Abstract

Survival (default) data are frequently encountered in financial (especially credit risk), medical, educational, and other fields, where the “default” can be interpreted as the failure to fulfill debt payments of a specific company or the death of a patient in a medical study or the inability to pass some educational tests.

This paper introduces the basic ideas of Cox's original proportional hazards model and extends the model within a general framework of statistical data-mining procedures. By employing regularization, basis expansion, boosting, bagging, Markov chain Monte Carlo (MCMC) and many other tools, we effectively calibrate a large and flexible class of proportional hazard models.

The proposed methods have important applications in the setting of credit risk. For example, the model for the default correlation through regularization can be used to price credit basket products, and the frailty factor models can explain the contagion effects in the defaults of multiple firms in the credit market.

Details

Econometrics and Risk Management
Type: Book
ISBN: 978-1-84855-196-1

Article
Publication date: 10 April 2020

Jiang Hu and Fuheng Ma

The purpose of this study is to develop and verify a methodology for a zoned deformation prediction model for super high arch dams, which is indeed a panel data-based regression…

Abstract

Purpose

The purpose of this study is to develop and verify a methodology for a zoned deformation prediction model for super high arch dams, which is indeed a panel data-based regression model with the hierarchical clustering on principal components.

Design/methodology/approach

The hierarchical clustering method is used to highlight the main features of the time series: it selects the typical points of the measured ambient and concrete temperatures as predictors and divides the deformation observation points into groups. Based on this, the panel data of each zone can be established, and its type can be judged using F and Hausman tests successively. Then hydrostatic–temperature–time–season models for the zones can be constructed. Through comparative analyses of the distributions and the fitted coefficients of these zones, the spatial deformation mechanism of a dam can be identified. A super high arch dam is taken as a case study.
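To give a rough flavour of such a prediction model for one zone, the sketch below fits a hydrostatic-season-time style regression by least squares. Annual harmonics stand in for the measured ambient and concrete temperatures the paper uses as predictors, and the data are entirely synthetic:

```python
import numpy as np

def hst_design(head, t_days):
    """Design matrix in the spirit of a hydrostatic-temperature-time-
    season model: head polynomial, annual harmonics, time-effect terms."""
    w = 2 * np.pi * t_days / 365.25
    return np.column_stack([
        np.ones_like(head), head, head**2,            # hydrostatic component
        np.sin(w), np.cos(w),                         # seasonal (temperature) proxy
        t_days / 365.25, np.log1p(t_days / 365.25),   # time-dependent effect
    ])

rng = np.random.default_rng(0)
t = np.arange(1460.0)                                 # four years of daily data
head = 50 + 10 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 3, t.size)
X = hst_design(head, t)
beta_true = np.array([2.0, 0.05, 1e-3, 0.5, -0.3, 0.2, 0.1])
y = X @ beta_true + rng.normal(0, 0.01, t.size)       # synthetic displacements

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)      # fitted zone model
```

In the paper's setting one such model is fitted per cluster of pendulums, and comparing the fitted coefficients across zones is what reveals the spatial deformation mechanism.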

Findings

According to the measured radial displacements during the initial operation period, the investigated pendulums are divided into four zones. After the tests, fixed-effect regression models are established. The comparative analyses show that the dam deformation conforms to the natural condition. Factors such as the unstable temperature field and the nonlinear time-dependent effect have obvious effects on the dam deformation. The results show the efficiency of the proposed methodology in zoning and prediction modeling for the deformation of super high arch dams and its potential for mining the dam deformation mechanism.

Originality/value

A zoned deformation prediction model for super high arch dams is proposed where hierarchical clustering on principal component method and panel data model are combined.

Details

Engineering Computations, vol. 37 no. 9
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 19 July 2005

Rodolphe Durand and Zahia Guessoum

The aim of this paper is to give empirical evidence of the fundamental mechanisms underlying the resource systemics: time compression diseconomies, asset mass efficiency, and…

Abstract

The aim of this paper is to give empirical evidence of the fundamental mechanisms underlying the resource systemics: time compression diseconomies, asset mass efficiency, and interconnectedness of assets. It assumes that time, resource properties and interactions are the critical elements leading to the accumulation of idiosyncratic resources, firm performance and survival. Results from a Cox regression on a simulated dataset confirm the protective effects of time compression diseconomies, asset mass efficiency, and interconnectedness of assets against a firm's death.

Details

Competence Perspectives on Resources, Stakeholders and Renewal
Type: Book
ISBN: 978-0-76231-170-5

Article
Publication date: 8 October 2018

Tanmoy Hazra, C.R.S. Kumar and Manisha Nene

The purpose of this paper is to propose a model for a two-agent multi-target-searching scenario in a two-dimensional region, where some places of the region have limited resource…

Abstract

Purpose

The purpose of this paper is to propose a model for a two-agent multi-target-searching scenario in a two-dimensional region, where some places of the region have a limited resource capacity in terms of the number of agents that can simultaneously pass through them, and a few places of the region are unreachable and expand with time. The proposed cooperative search model and Petri net model facilitate the search operation under the constraints mentioned in the paper. The Petri net model graphically illustrates different scenarios and helps the agents to validate the strategies.

Design/methodology/approach

In this paper, the authors have applied an optimization approach to determine the optimal locations of base stations, a cooperative search model together with the inclusion–exclusion principle and the Cartesian product to optimize the search operation, and a Petri net model to validate the search technique.

Findings

The proposed approach finds the optimal locations of the base stations in the region. The proposed cooperative search model accommodates various constraints such as resource capacity, time-dependent unreachable places/obstacles, fuel capacities of the agents, two types of targets assigned to two agents and limited sortie lengths. On the other hand, the Petri net model graphically represents whether collisions/deadlocks between the two agents are possible for a particular combination of paths, and it also illustrates the effect of time-dependent unreachable places for different combinations of paths.
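The capacity-constraint idea can be sketched with a minimal place/transition net in which a single token models a corridor that only one agent may occupy at a time. All names and numbers below are hypothetical, not taken from the paper:

```python
class PetriNet:
    """Minimal place/transition net with weighted arcs."""

    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (consume, produce)

    def add_transition(self, name, consume, produce):
        self.transitions[name] = (consume, produce)

    def enabled(self, name):
        consume, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in consume.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        consume, produce = self.transitions[name]
        for p, n in consume.items():
            self.marking[p] -= n
        for p, n in produce.items():
            self.marking[p] = self.marking.get(p, 0) + n

# a corridor with capacity 1, contested by agents A and B
net = PetriNet({"A_out": 1, "B_out": 1, "corridor_slot": 1})
net.add_transition("A_enter", {"A_out": 1, "corridor_slot": 1}, {"A_in": 1})
net.add_transition("B_enter", {"B_out": 1, "corridor_slot": 1}, {"B_in": 1})

net.fire("A_enter")                       # A takes the slot;
                                          # B_enter is now disabled, so no collision
```

Checking which transitions are enabled in each reachable marking is precisely how a Petri net exposes potential collisions or deadlocks for a given combination of paths.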

Practical implications

The problem addressed in this paper is similar to various real-time problems such as rescue operations during/after flood, landslide, earthquake, accident, patrolling in urban areas, international borders, forests, etc. Thus, the proposed model can benefit various organizations and departments such as rescue operation authorities, defense organizations, police departments, etc.

Originality/value

To the best of the authors’ knowledge, the problem addressed in this paper has not been completely explored, and the proposed cooperative search model to conduct the search operation considering the above-mentioned constraints is new. To the best of the authors’ knowledge, no paper has modeled time-dependent unreachable places with the help of Petri net.

Details

International Journal of Intelligent Unmanned Systems, vol. 6 no. 4
Type: Research Article
ISSN: 2049-6427

Article
Publication date: 1 March 2003

Yasuhiko Nishio and Tadashi Dohi

The software reliability models that describe the reliability growth phenomenon are formulated by a stochastic point process with a state‐dependent or time‐dependent intensity…

Abstract

The software reliability models that describe the reliability growth phenomenon are formulated by a stochastic point process with a state‐dependent or time‐dependent intensity function. On the other hand, to deal with environmental data, which consist of covariates influencing times to software failure, it may be useful to apply Cox's proportional hazards model for assessing software reliability. In this paper, we review the proportional hazards software reliability models and discuss the problem of determining the optimal software release time under the expected total software cost criterion. Numerical examples examine the dependence of the covariate structure in both the software reliability prediction and the optimal software release decision.
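The release-time problem can be illustrated with a stand-in model: the sketch below minimises an expected total cost under a Goel-Okumoto mean value function rather than the paper's proportional hazards models, and the cost parameters are invented:

```python
import numpy as np

def m(t, a=100.0, b=0.1):
    """Goel-Okumoto mean value function: expected faults detected by time t
    (a = total expected faults, b = detection rate)."""
    return a * (1 - np.exp(-b * t))

def total_cost(T, c_test_fix=1.0, c_field_fix=5.0, c_test_time=0.5,
               a=100.0, b=0.1):
    """Expected total cost of releasing at time T: faults fixed during
    testing, residual faults fixed in the field, plus testing effort."""
    return (c_test_fix * m(T, a, b)
            + c_field_fix * (a - m(T, a, b))
            + c_test_time * T)

# grid search for the cost-minimising release time
grid = np.linspace(0.0, 100.0, 100001)
T_star = grid[np.argmin(total_cost(grid))]
```

For these parameters the first-order condition gives T* = ln((c_field_fix - c_test_fix) * a * b / c_test_time) / b, so the grid search and the closed form can be checked against each other.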

Details

Journal of Quality in Maintenance Engineering, vol. 9 no. 1
Type: Research Article
ISSN: 1355-2511

Article
Publication date: 22 March 2013

Rudi Meijer and Sandjai Bhulai

The purpose of this paper is to study the optimal pricing problem that retailers are challenged with when dealing with seasonal products. The friction between expected demand and…

Abstract

Purpose

The purpose of this paper is to study the optimal pricing problem that retailers are challenged with when dealing with seasonal products. The friction between expected demand and realized demand creates a risk that supply during the season is not cleared, forcing the retailer to mark down overstocked supply.

Design/methodology/approach

The authors propose a framework based on a Cox regression analysis to determine optimal markdown paths. They illustrate this framework by a case study on a large department store.

Findings

The framework allows one to determine when and how much to mark down in order to optimize expected total profit given the available supply. When the law of demand holds at a disaggregated level, i.e. the individual retailer, it is also possible to optimize the markdown path.
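A toy version of this markdown-timing problem: assume a proportional hazard of sale that falls with price, and pick the markdown week that maximises expected revenue. The hazard parameters and prices below are invented, whereas the paper estimates the hazard from transactional data:

```python
import numpy as np

def sell_prob(price, weeks, h0=0.08, beta=0.02):
    """Chance a unit sells within `weeks`, under a price-dependent
    hazard of sale h(p) = h0 * exp(-beta * p)."""
    return 1.0 - np.exp(-h0 * np.exp(-beta * price) * weeks)

def expected_revenue(full_price, md_week, md_price, season_weeks=12):
    """Sell at full price until md_week, then at the markdown price."""
    p_full = sell_prob(full_price, md_week)
    p_md = sell_prob(md_price, season_weeks - md_week)
    return full_price * p_full + md_price * (1.0 - p_full) * p_md

# enumerate candidate markdown weeks for a 100 -> 60 markdown
best_week = max(range(1, 12),
                key=lambda w: expected_revenue(100.0, w, 60.0))
```

Marking down early raises the probability of clearing stock at the lower price; marking down late preserves full-price revenue but risks leftover supply, and the enumeration trades the two off.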

Originality/value

This paper provides a framework for the complex dynamic pricing problem in retail using transactional data. The case study shows that significant revenues can be generated when applying this framework.

Details

International Journal of Retail & Distribution Management, vol. 41 no. 4
Type: Research Article
ISSN: 0959-0552

Book part
Publication date: 4 December 2020

K.S.S. Iyer and Madhavi Damle

This chapter presents the seminal work of Dr K.S.S. Iyer, developed over the last 56 years. The method in advanced predictive analytics…

Abstract

This chapter presents the seminal work of Dr K.S.S. Iyer, developed over the last 56 years. His approach to advanced predictive analytics grew out of several other applications of predictive modelling using the stochastic point process technique, which he collects and generalises here. Two stochastic point process techniques, known as product density and random point process, originally used to model problems in high-energy particles and cancer, are redefined to suit problems currently in demand in IoT and customer equity in marketing (Iyer, Patil, & Chetlapalli, 2014b). This formulation arises from these techniques being applied in fields ranging from the energy requirements of Internet of Things (IoT) devices, the growth of cancer cells and cosmic-ray studies to customer equity and beyond.
