Search results: 1–3 of 3
The aim of this study is to present a novel approach based on semantic fingerprinting, which converts inventor records into 128-bit semantic fingerprints, and a clustering algorithm called density-based spatial clustering of applications with noise (DBSCAN). Inventor disambiguation is the task of discovering the unique set of underlying inventors and mapping a set of patents to their corresponding inventors. Resolving the ambiguities between inventors is necessary to improve the quality of the patent database and to ensure accurate entity-level analysis. Most existing methods are based on machine learning and, while they often perform well, this comes at the cost of time, computational power and storage space.
The metadata and textual data in inventor records are first converted into 128-bit semantic fingerprints. However, rather than using string comparison or cosine similarity to calculate the distance between pair-wise fingerprint records, a binary number comparison function is used in DBSCAN, which then clusters the inventor records based on this distance to disambiguate inventor names.
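The pipeline above can be sketched with scikit-learn's DBSCAN and a bitwise Hamming comparison. The fingerprints and the eps threshold below are illustrative stand-ins, not values from the paper, which derives its fingerprints from inventor metadata and text.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Illustrative 128-bit fingerprints constructed by hand (stand-ins for
# the semantic fingerprints the paper computes from inventor records).
base = np.zeros(128, dtype=int)
base[::2] = 1                      # an arbitrary bit pattern
near = base.copy()
near[[5, 40, 99]] ^= 1             # near-duplicate record: Hamming distance 3
far = 1 - base                     # unrelated record: Hamming distance 128
records = np.array([base, near, far])

def hamming_bits(a, b):
    """Binary comparison function: count of differing bits."""
    return np.count_nonzero(a != b)

# eps = maximum number of differing bits for two records to be merged;
# the value 10 is an assumed illustrative threshold.
labels = DBSCAN(eps=10, min_samples=1, metric=hamming_bits).fit_predict(records)
print(labels)   # records 0 and 1 share a cluster; record 2 is separate
```

With `min_samples=1` every record is a core point, so singleton inventors form their own clusters instead of being labelled noise; the binary comparison avoids the string-matching and cosine-similarity computations the abstract contrasts against.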
Experiments conducted on the PatentsView database of the United States Patent and Trademark Office show that this method disambiguates inventor names with recall greater than 99 per cent, in less time and with substantially smaller storage requirements.
A better semantic fingerprint algorithm and a better distance function may improve precision. Setting different clustering parameters for each block, or using other clustering algorithms, will be considered to improve the accuracy of the disambiguation results even further.
Compared with existing methods, the proposed method does not rely on feature selection or complex feature-comparison computation. Most importantly, running time and storage requirements are drastically reduced.
This study aims to investigate the effect of acquisition modes on customer behavioral loyalty to enrich our knowledge of the effectiveness of acquisition modes and how to better target customers in the service industry.
Using a data set from a large commercial bank in China, this study conducts a series of empirical analyses to examine the impacts of two types of acquisition modes (i.e. the gift acquisition mode and customer referral) on customer behavioral loyalty.
Gift acquisition has a negative effect on customer behavioral loyalty, as measured by dropout probability, consumption amount and consumption frequency. Furthermore, this negative relationship can be weakened if the customer is referred by an existing customer.
Although prior studies have investigated the effectiveness of some acquisition modes in terms of customer loyalty, customer acquisition through the provision of gifts, which is widely implemented in marketing practice, has not been well investigated. This study addresses this research gap and identifies the joint influence of acquisition modes on customer behavioral loyalty, further enriching our knowledge of the effectiveness of different acquisition modes.
Partial alignment for 3D point sets is a challenging problem in laser calibration and robot calibration due to the imbalance of the data sets, especially when the overlap between them is low. Geometric features can improve the accuracy of alignment; however, the corresponding feature extraction methods are time consuming. The purpose of this paper is to develop a framework for partial alignment based on an adaptive trimmed strategy.
First, the authors propose an adaptive trimmed strategy based on point feature histograms (PFH) coding. Second, they obtain an initial transformation based on this partition, which improves the accuracy of the normal-direction-weighted trimmed iterative closest point (ICP) method. Third, they provide a series of GPU parallel implementations for time efficiency.
The initial partition based on PFH features improves the accuracy of partial registration significantly. Moreover, the parallel GPU algorithms accelerate the alignment process.
This study is currently applicable only to rigid transformations; it could be extended to non-rigid transformations.
In practice, point set alignment for calibration is a technique widely used in aircraft assembly, industrial inspection, simultaneous localization and mapping, and surgical navigation.
Point set calibration is a building block in the field of intelligent manufacturing.
The contributions are as follows: first, the authors introduce a novel coarse alignment as an initial calibration based on PFH descriptor similarity, which can be viewed as a coarse trimming process that partitions the data into the mostly overlapping part and the remainder; second, they reduce computation time through GPU parallel coding during the acquisition of feature descriptors; finally, they use the weighted trimmed ICP method to refine the transformation.
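The trimmed ICP refinement in the final step can be sketched as follows. This is a generic trimmed ICP with plain nearest-neighbour correspondences and a fixed trim ratio; it omits the paper's PFH-based partition, normal-direction weighting and GPU parallelism, and the `trim_ratio` value is an illustrative assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def trimmed_icp(source, target, trim_ratio=0.7, iters=30):
    """Trimmed ICP: at each iteration, keep only the closest trim_ratio
    fraction of nearest-neighbour correspondences, then estimate the
    rigid transform from the kept pairs via SVD (Kabsch algorithm)."""
    tree = cKDTree(target)
    cur = source.copy()
    for _ in range(iters):
        dists, idx = tree.query(cur)
        keep = np.argsort(dists)[: int(trim_ratio * len(cur))]
        src, tgt = cur[keep], target[idx[keep]]
        src_c, tgt_c = src - src.mean(0), tgt - tgt.mean(0)
        U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:       # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt.mean(0) - R @ src.mean(0)
        cur = cur @ R.T + t
    return cur

# Demo on synthetic data: the target only partially overlaps the source
# (an extra outlier cluster sits far away), mimicking low overlap.
rng = np.random.default_rng(1)
src = rng.random((100, 3))
theta = 0.05
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
tgt_inliers = src @ R_true.T + np.array([0.1, -0.05, 0.02])
tgt = np.vstack([tgt_inliers, rng.random((30, 3)) + 3.0])  # non-overlap part
aligned = trimmed_icp(src, tgt)
err = np.linalg.norm(aligned - tgt_inliers, axis=1).mean()
```

Trimming discards the worst correspondences each iteration, which is what makes ICP robust to the non-overlapping part of the target; the paper's adaptive PFH-based partition plays the same role more selectively, before refinement begins.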