Search results

1 – 3 of 3
Article
Publication date: 3 June 2019

Hongqi Han, Yongsheng Yu, Lijun Wang, Xiaorui Zhai, Yaxin Ran and Jingpeng Han

Abstract

Purpose

The aim of this study is to present a novel approach based on semantic fingerprinting and a clustering algorithm called density-based spatial clustering of applications with noise (DBSCAN), which can be used to convert inventor records into 128-bit semantic fingerprints. Inventor disambiguation is a method used to discover a unique set of underlying inventors and map a set of patents to their corresponding inventors. Resolving the ambiguities between inventors is necessary to improve the quality of the patent database and to ensure accurate entity-level analysis. Most existing methods are based on machine learning and, while they often show good performance, this comes at the cost of time, computational power and storage space.

Design/methodology/approach

Using semantic fingerprinting, the meta and textual data in inventor records are converted into 128-bit semantic fingerprints. Rather than string comparison or cosine similarity, a binary number comparison function is used to calculate the distance between pair-wise fingerprint records. DBSCAN then clusters the inventor records based on this distance to disambiguate inventor names.
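
The article's code is not reproduced here; as a rough, non-authoritative sketch of the clustering step described above, the snippet below groups simulated 128-bit fingerprints with scikit-learn's DBSCAN, using a Hamming (differing-bits) metric as one possible reading of the binary number comparison function. The fingerprint simulation and the eps and min_samples values are assumptions for illustration, not the authors' settings.

```python
# Illustrative sketch only -- not the authors' implementation.
# Clusters simulated 128-bit semantic fingerprints with DBSCAN using a
# Hamming (bit-difference) metric instead of string or cosine similarity.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(42)

# Simulate fingerprints for two underlying inventors plus one noise record.
# Each fingerprint is a 128-bit vector (one row per inventor record).
base_a = rng.integers(0, 2, 128, dtype=np.uint8)
base_b = rng.integers(0, 2, 128, dtype=np.uint8)

def perturb(base, n_flips):
    """Copy a fingerprint and flip a few random bits (records of the same
    inventor should differ only slightly)."""
    fp = base.copy()
    idx = rng.choice(128, size=n_flips, replace=False)
    fp[idx] ^= 1
    return fp

records = np.vstack([perturb(base_a, 5) for _ in range(4)] +
                    [perturb(base_b, 5) for _ in range(3)] +
                    [rng.integers(0, 2, 128, dtype=np.uint8)])  # unrelated record

# metric="hamming" measures the fraction of differing bits, so eps=0.125
# allows up to 16 of 128 bits to differ within a cluster (assumed threshold).
labels = DBSCAN(eps=0.125, min_samples=2, metric="hamming").fit_predict(records)
print(labels)  # e.g. [0 0 0 0 1 1 1 -1]: two inventors plus one noise point
```

Records labelled -1 are treated by DBSCAN as noise rather than being forced into an inventor cluster.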

Findings

Experiments conducted on the PatentsView campaign database of the United States Patent and Trademark Office show that this method disambiguates inventor names with recall greater than 99 per cent, in less time and with a substantially smaller storage requirement than existing methods.

Research limitations/implications

A better semantic fingerprint algorithm and a better distance function may improve precision. Setting different clustering parameters for each block, or using other clustering algorithms, will be considered to further improve the accuracy of the disambiguation results.

Originality/value

Compared with the existing methods, the proposed method does not rely on feature selection and complex feature comparison computation. Most importantly, running time and storage requirements are drastically reduced.

Details

The Electronic Library, vol. 37 no. 2
Type: Research Article
ISSN: 0264-0473

Article
Publication date: 16 December 2020

Yaxin Ming, Jing (Elaine) Chen and Chenxi Li

Abstract

Purpose

This study aims to investigate the effect of acquisition modes on customer behavioral loyalty to enrich our knowledge of the effectiveness of acquisition modes and how to better target customers in the service industry.

Design/methodology/approach

Using a data set from a large commercial bank in China, this study conducts a series of empirical analyses to examine the impacts of two acquisition modes (i.e. gift acquisition and customer referral) on customer behavioral loyalty.
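
No model specification is given in the abstract; purely as a hedged illustration of how such an analysis might be set up, the snippet below fits a logistic regression of dropout on a gift-acquisition indicator, a referral indicator and their interaction using statsmodels. The data frame and the column names (gift, referred, dropout) are hypothetical, not the bank's actual variables.

```python
# Hedged sketch only -- the abstract does not disclose the authors' model.
# Illustrates one way to test whether referral weakens the negative effect
# of gift acquisition on behavioral loyalty (dropout as the outcome).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# Hypothetical customer-level data: acquisition indicators and a dropout flag.
df = pd.DataFrame({
    "gift": rng.integers(0, 2, n),      # 1 = acquired via a gift promotion
    "referred": rng.integers(0, 2, n),  # 1 = referred by an existing customer
})
# Simulated outcome: gifts raise dropout; referral dampens that effect.
logit_p = -1.0 + 0.8 * df["gift"] - 0.5 * df["gift"] * df["referred"]
df["dropout"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# The gift x referral interaction captures the moderating role of referral.
model = smf.logit("dropout ~ gift * referred", data=df).fit(disp=False)
print(model.summary())
```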

Findings

Gift acquisition has a negative effect on customer behavioral loyalty, as measured by the dropout probability, consumption amount and consumption frequency. Furthermore, this negative relationship could be weakened if the customer is referred by an existing customer.

Originality/value

Although prior studies have investigated the effectiveness of some acquisition modes in terms of customer loyalty, customer acquisition through the provision of gifts, which is widely implemented in marketing practice, has not been well investigated. This study addresses this research gap and identifies the joint influence of acquisition modes on customer behavioral loyalty, further enriching our knowledge of the effectiveness of different acquisition modes.

Details

International Journal of Bank Marketing, vol. 39 no. 1
Type: Research Article
ISSN: 0265-2323

Article
Publication date: 11 October 2019

Yaxin Peng, Naiwu Wen, Chaomin Shen, Xiaohuang Zhu and Shihui Ying

Abstract

Purpose

Partial alignment of 3D point sets is a challenging problem in laser calibration and robot calibration because the data sets are unbalanced, especially when their overlap is low. Geometric features can improve the accuracy of alignment, but the corresponding feature extraction methods are time-consuming. The purpose of this paper is to develop a framework for partial alignment based on an adaptive trimmed strategy.

Design/methodology/approach

First, the authors propose an adaptive trimmed strategy based on point feature histogram (PFH) coding. Second, they obtain an initial transformation from this partition, which improves the accuracy of the normal-direction-weighted trimmed iterative closest point (ICP) method. Third, they develop a series of GPU parallel implementations to improve time efficiency.
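
No implementation accompanies the abstract; the sketch below illustrates only the trimmed ICP refinement step in plain NumPy/SciPy, without the PFH-based partition, the normal-direction weighting or the GPU parallelism described by the authors. The overlap fraction and iteration count are assumed values.

```python
# Illustration only -- a plain trimmed ICP, not the paper's full pipeline.
import numpy as np
from scipy.spatial import cKDTree

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def trimmed_icp(source, target, overlap=0.6, iters=30):
    """Align `source` to `target`, trusting only the best `overlap` fraction
    of correspondences in each iteration (assumed overlap ratio)."""
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    src = source.copy()
    k = max(3, int(overlap * len(source)))
    for _ in range(iters):
        dist, idx = tree.query(src)     # nearest target point per source point
        keep = np.argsort(dist)[:k]     # trim: keep the closest correspondences
        R, t = rigid_transform(src[keep], target[idx[keep]])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Tiny synthetic example: the source covers only part of the target.
rng = np.random.default_rng(1)
target = rng.random((500, 3))
source = target[:350] + np.array([0.05, -0.02, 0.03])   # shifted partial copy
R, t = trimmed_icp(source, target)
print(np.round(t, 3))   # should be close to the negative of the applied offset
```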

Findings

The initial partition based on PFH features significantly improves the accuracy of the partial registration. Moreover, the parallel GPU algorithms accelerate the alignment process.

Research limitations/implications

So far, this study applies only to rigid transformations; it could be extended to non-rigid transformations.

Practical implications

In practice, point set alignment for calibration is widely used in aircraft assembly, industrial inspection, simultaneous localization and mapping, and surgical navigation.

Social implications

Point set calibration is a building block in the field of intelligent manufacturing.

Originality/value

The contributions are as follows: first, the authors introduce a novel coarse alignment based on PFH descriptor similarity as an initial calibration, which can be viewed as a coarse trimming that partitions the data into a mostly overlapping part and a remaining part; second, they reduce computation time with GPU parallel coding during feature descriptor acquisition; finally, they use the weighted trimmed ICP method to refine the transformation.
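
As a companion to the trimmed ICP sketch above, the following hedged illustration shows the kind of descriptor-similarity partition described in the first contribution: each source point is matched to its nearest descriptor in the target, and points with close matches are kept as the likely-overlap part. The placeholder descriptors and the percentile cut-off are assumptions; the paper's PFH coding and adaptive rule are not reproduced.

```python
# Hedged sketch of a descriptor-similarity partition: points whose nearest
# descriptor in the other cloud is close are treated as the likely-overlap
# part; the rest are trimmed before alignment. The descriptors here are
# placeholders -- the paper's PFH coding is not reproduced.
import numpy as np
from scipy.spatial import cKDTree

def partition_by_descriptor(source_desc, target_desc, percentile=90):
    """Split source indices into a likely-overlap part and a trimmed part
    based on nearest-neighbour distance in descriptor space.
    `percentile` is an assumed cut-off, not the paper's adaptive rule."""
    tree = cKDTree(target_desc)
    dist, _ = tree.query(source_desc)          # best descriptor match per point
    threshold = np.percentile(dist, percentile)
    overlap_idx = np.flatnonzero(dist <= threshold)
    rest_idx = np.flatnonzero(dist > threshold)
    return overlap_idx, rest_idx

# Usage with placeholder descriptor arrays (dimensions chosen arbitrarily).
rng = np.random.default_rng(2)
source_desc = rng.random((400, 33))
target_desc = rng.random((600, 33))
overlap_idx, rest_idx = partition_by_descriptor(source_desc, target_desc)
print(len(overlap_idx), len(rest_idx))         # roughly a 90/10 split here
```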

Details

Assembly Automation, vol. 40 no. 2
Type: Research Article
ISSN: 0144-5154
