Search results

1 – 10 of over 185,000
Article
Publication date: 5 September 2023

Xinyu Zhang and Liling Ge

Abstract

Purpose

A measurement instrument based on multiple laser sensors is proposed for measuring the geometry errors of a differential body and evaluating its quality. This paper aims to discuss this idea.

Design/methodology/approach

First, the differential body is set on a rotation platform before measuring. Then one laser sensor, called the “primary sensor”, is installed in the interior of the differential body. The spherical surface and four holes on the differential body are sampled by the primary sensor as the rotation platform rotates one revolution. Another sensor, called the “secondary sensor”, is installed above to sample the external cylinder surface and the planar surface on the top of the differential body. These two surfaces are manufactured to high precision and are used as datum surfaces to compute the errors caused by the motion of the rotation platform. Finally, the points sampled by the primary sensor are compensated to improve the measurement accuracy.
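
As a rough illustration of the compensation step, the sketch below assumes hypothetical arrays of primary-sensor points and secondary-sensor datum readings (one per platform angle) and subtracts an estimated platform runout; it is a minimal sketch under those assumptions, not the paper's actual compensation model.

```python
import numpy as np

# Hypothetical illustration of datum-based compensation: the secondary
# sensor scans high-precision datum surfaces, so any apparent deviation
# it records is attributed to rotation-platform motion error.

def compensate(primary_pts, datum_radial, datum_axial):
    """Subtract estimated platform motion error from primary samples.

    primary_pts  : (N, 3) points sampled by the primary sensor
    datum_radial : (N,) secondary-sensor readings of the datum cylinder
    datum_axial  : (N,) secondary-sensor readings of the datum plane
    (one reading per platform angle in all three arrays)
    """
    # Deviations of the datum readings from their means are taken as
    # the radial and axial runout of the platform at each angle.
    radial_err = datum_radial - datum_radial.mean()
    axial_err = datum_axial - datum_axial.mean()

    corrected = primary_pts.copy()
    # Shift each sampled point back along the radial direction ...
    r = np.hypot(corrected[:, 0], corrected[:, 1])
    scale = (r - radial_err) / np.where(r == 0, 1, r)
    corrected[:, 0] *= scale
    corrected[:, 1] *= scale
    # ... and along the rotation axis.
    corrected[:, 2] -= axial_err
    return corrected
```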

Findings

A measurement instrument based on multiple laser sensors is proposed for the measurement of geometry errors of a differential body. Based on the characteristics of the measurement data, a gradient image-based method is proposed to distinguish different objects in the laser measurement data. A case study is presented to validate the measurement principle and the data processing approach.
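
The gradient image-based separation might look roughly like the following sketch, which arranges hypothetical range samples as an image and thresholds the gradient magnitude; the paper's specific operator and thresholds are not given in the abstract.

```python
import numpy as np

# Illustrative only: arrange hypothetical laser range samples as an
# image (rotation angle x scan position) and mark object boundaries
# where the gradient magnitude of the range values jumps.

def label_boundaries(range_image, edge_threshold=0.5):
    gy, gx = np.gradient(range_image)   # range gradients along each axis
    return np.hypot(gx, gy) > edge_threshold

# A flat region with a recessed "hole" yields a boundary ring:
img = np.zeros((64, 64))
img[20:40, 20:40] = 5.0                 # the hole sits 5 units deeper
edges = label_boundaries(img)
```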

Research limitations/implications

The study investigates the possibility of correcting one sensor's data with the measurement results of other sensors to improve measurement accuracy. The proposed technique enables error analysis and compensation through the geometric correlation of the various features on the measurand.

Originality/value

The proposed error compensation principle using multiple sensors proved useful for the design of new measurement devices for the inspection of special parts. The proposed approach of describing the measurement data as an image also proved useful for simplifying the processing of the measurement data.

Details

Engineering Computations, vol. 40 no. 9/10
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 6 February 2020

Jun Liu, Asad Khattak, Lee Han and Quan Yuan

Abstract

Purpose

Individuals’ driving behavior data are becoming widely available through Global Positioning System devices and on-board diagnostic systems. The incoming data can be sampled at rates ranging from one Hertz (or even lower) to hundreds of Hertz. Failing to capture substantial changes in vehicle movements over time by “undersampling” can cause loss of information and misinterpretation of the data, but “oversampling” can waste storage and processing resources. The purpose of this study is to empirically explore how micro-driving decisions to maintain speed, accelerate or decelerate can best be captured, without substantial loss of information.

Design/methodology/approach

This study creates a set of indicators to quantify the magnitude of information loss (MIL). Each indicator is calculated as a percentage to index the information lost in different situations, and an overall index named the extent of information loss (EIL) is created to combine the MIL indicators. Data from a driving simulator study collected at 20 Hertz are analyzed (N = 718,481 data points from 35,924 s of driving tests). The study quantifies the relationship between the information loss indicators and sampling rates.

Findings

The results show that marginally more information is lost as data are sampled down from 20 to 0.5 Hz, but the relationship is not linear. With four indicators of MIL, the overall EIL is 3.85 per cent for driving behavior data sampled at 1 Hz. If sampling rates are higher than 2 Hz, all MILs stay under 5 per cent of information loss.
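
A minimal sketch of the downsampling comparison, using an illustrative activity-based loss indicator rather than the paper's exact MIL/EIL definitions, might look like this:

```python
import numpy as np

# Hypothetical indicator: downsample a 20-Hz speed trace and measure
# how much acceleration/deceleration activity the coarser series
# misses. Illustrative only; not the paper's MIL/EIL formulas.

def info_loss_pct(speed_20hz, target_hz, base_hz=20):
    step = int(base_hz // target_hz)
    coarse = speed_20hz[::step]
    # Total absolute speed change serves as a proxy for the
    # micro-driving decisions (accelerate/decelerate) in the signal.
    fine_activity = np.abs(np.diff(speed_20hz)).sum()
    coarse_activity = np.abs(np.diff(coarse)).sum()
    return 100 * (1 - coarse_activity / fine_activity)

rng = np.random.default_rng(0)
speed = np.cumsum(rng.normal(0, 0.05, 72_000)) + 20  # 1 h at 20 Hz
for hz in (10, 5, 2, 1):
    print(hz, "Hz ->", round(info_loss_pct(speed, hz), 2), "% lost")
```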

Originality/value

This study contributes by developing a framework for quantifying the relationship between sampling rates and information loss. Depending on the objective of their study, researchers can choose the sampling rate necessary to achieve the right level of accuracy.

Details

Journal of Intelligent and Connected Vehicles, vol. 3 no. 1
Type: Research Article
ISSN: 2399-9802

Article
Publication date: 18 November 2019

Guanying Huo, Xin Jiang, Zhiming Zheng and Deyi Xue

Abstract

Purpose

Metamodeling is an effective method to approximate the relations between input and output parameters when significant experimental and simulation effort is required to collect the data to build these relations. This paper aims to develop a new sequential sampling method for adaptive metamodeling for data with highly nonlinear relations between input and output parameters.

Design/methodology/approach

In this method, the Latin hypercube sampling method is used to sample the initial data, and the kriging method is used to construct the metamodel. The input parameter values at which the next output data are collected to update the current metamodel are determined based on the quality of the data in both the input and output parameter spaces. Uniformity is used to evaluate data in the input parameter space, while leave-one-out errors and sensitivities are considered to evaluate data in the output parameter space.
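
A minimal sketch of such a sequential loop, assuming a stand-in black-box function and using a leave-one-out error criterion only (the paper's criterion also weighs input-space uniformity and output sensitivities), might look like this:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def f(x):                                # stand-in expensive black box
    return np.sin(8 * x[:, 0]) + x[:, 0] ** 2

rng = np.random.default_rng(0)
X = qmc.LatinHypercube(d=1, seed=0).random(8)    # initial LHS design
y = f(X)
for _ in range(10):                      # sequential refinement loop
    # Leave-one-out error of each design point under the current model
    loo = []
    for i in range(len(X)):
        m = GaussianProcessRegressor().fit(np.delete(X, i, 0),
                                           np.delete(y, i))
        loo.append(abs(m.predict(X[i:i + 1])[0] - y[i]))
    # Collect the next sample near the worst-predicted design point
    x_new = np.clip(X[int(np.argmax(loo))] + rng.normal(0, 0.05, 1), 0, 1)
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new[None, :]))
```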

Findings

This new method has been compared with existing methods to demonstrate its effectiveness both in approximation and in solving global optimization problems. Finally, an engineering case is used to verify the method further.

Originality/value

This paper provides an effective sequential sampling method for adaptive metamodeling to approximate highly nonlinear relations between input and output parameters.

Details

Engineering Computations, vol. 37 no. 3
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 21 November 2014

Eric Ghysels and J. Isaac Miller

Abstract

We analyze the sizes of standard cointegration tests applied to data subject to linear interpolation, discovering evidence of substantial size distortions induced by the interpolation. We propose modifications to these tests to effectively eliminate size distortions from such tests conducted on data interpolated from end-of-period sampled low-frequency series. Our results generally do not support linear interpolation when alternatives such as aggregation or mixed-frequency-modified tests are possible.
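
To illustrate the setting, the hedged sketch below linearly interpolates end-of-period low-frequency observations of two independent random walks and runs a standard Engle-Granger test on the result; across many such replications the test over-rejects, which is the size distortion at issue. The chapter's modified tests are not reproduced here.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# Two independent random walks (so no cointegration), observed
# end-of-quarter, then linearly interpolated back to "monthly".
rng = np.random.default_rng(1)
x, y = rng.normal(size=(2, 300)).cumsum(axis=1)   # monthly truth
t = np.arange(2, 300, 3)                          # end-of-quarter dates
xi = np.interp(np.arange(300), t, x[2::3])        # linear interpolation
yi = np.interp(np.arange(300), t, y[2::3])

# A single draw is only illustrative; repeat to estimate test size.
print("p-value on interpolated data:", coint(xi, yi)[1])
```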

Details

Essays in Honor of Peter C. B. Phillips
Type: Book
ISBN: 978-1-78441-183-1

Book part
Publication date: 29 March 2006

Elena Andreou and Eric Ghysels

Abstract

Despite the difference in information sets, we are able to compare the asymptotic distribution of volatility estimators involving data sampled at different frequencies. To do so, we propose extensions of the continuous record asymptotic analysis for rolling sample variance estimators developed by Foster and Nelson (1996, Econometrica, 64, 139–174). We focus on traditional historical volatility filters involving monthly, daily and intradaily observations. Theoretical results are complemented with Monte Carlo simulations in order to assess the validity of the asymptotics for sample sizes and filters encountered in empirical studies.
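
The objects being compared can be sketched as rolling sample variance filters applied to the same simulated return path at two sampling frequencies; this is an illustration of the setup only, not the chapter's continuous-record asymptotics.

```python
import numpy as np

# Hypothetical example: one simulated return path, volatility filters
# built from daily versus intradaily observations over the same span.
rng = np.random.default_rng(2)
r_intra = rng.normal(0, 0.01, 78 * 250)          # 78 obs/day, 250 days
r_daily = r_intra.reshape(250, 78).sum(axis=1)   # aggregate to daily

def rolling_var(r, window):
    # Classic rolling sample variance filter over `window` observations
    return np.array([r[i - window:i].var(ddof=1)
                     for i in range(window, len(r) + 1)])

vol_daily = rolling_var(r_daily, 22)             # ~one month of days
vol_intra = rolling_var(r_intra, 22 * 78)        # same span, intradaily
```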

Details

Econometric Analysis of Financial and Economic Time Series
Type: Book
ISBN: 978-0-76231-274-0

Details

Handbook of Transport Modelling
Type: Book
ISBN: 978-0-08-045376-7

Book part
Publication date: 9 May 2012

Caroline O. Ford and William R. Pasewark

Abstract

We conduct an experiment to analyze the impact of a well-established psychological construct, need for cognition, in an audit-related decision context. By simulating a basic audit sampling task, we determine whether the desire to engage in a cognitive process influences decisions made during that task. Specifically, we investigate whether an individual's need for cognition influences the quantity of data collected, the revision of a predetermined sampling plan, and the time taken to make a decision. Additionally, we examine the impact of cost constraints during the decision-making process.

Contrary to results in previous studies, we find those with a higher need for cognition sought less data than those with a lower need for cognition to make an audit sampling decision. In addition, we find that the need for cognition had no relationship to sampling plan revisions or the time needed to make an audit sampling decision. Previous studies regarding the need for cognition did not utilize incremental costs for additional decision-making information. Potentially, these costs provided cognitive challenges that influenced decision outcomes.

Details

Advances in Accounting Behavioral Research
Type: Book
ISBN: 978-1-78052-758-1

Article
Publication date: 22 February 2008

Chunxia Huang, Qixin Cao, Zhuang Fu and Chuntao Leng

Abstract

Purpose

This paper sets out to propose a wafer prealigner based on multi‐sensor integration and an effective prealignment method implemented on it.

Design/methodology/approach

The wafer and notch eccentricities, on which wafer prealignment is based, are calculated from the peripheral data of the wafer, detected by a laser displacement sensor and a transmission laser sensor, by means of a barycenter-acquiring algorithm in a one-particle system.
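
A minimal sketch of the barycenter idea, assuming hypothetical periphery points expressed in the rotation-axis frame; the paper's "one-particle system" formulation is paraphrased here as a simple centroid computation.

```python
import numpy as np

# Treat the sampled periphery points as equal-mass particles: their
# barycenter estimates the wafer center, and its offset from the
# rotation axis is the eccentricity the prealigner must correct.

def eccentricity(pts):
    """pts: (N, 2) wafer periphery points in the rotation-axis frame."""
    cx, cy = pts.mean(axis=0)             # barycenter = estimated center
    return np.hypot(cx, cy), np.arctan2(cy, cx)

# A 100 mm-radius wafer whose center is offset 0.5 mm along x:
phi = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
pts = np.column_stack([0.5 + 100 * np.cos(phi), 100 * np.sin(phi)])
magnitude, direction = eccentricity(pts)  # ~0.5 mm at ~0 rad
```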

Findings

The center and notch prealignment precisions of the system are ±1.5 μm and ±30 μrad, respectively. Experiments have demonstrated the validity and effectiveness of the system.

Practical implications

The wafer prealigner is a subsystem of lithography equipment in the semiconductor industry. The prealignment algorithm can be applied to objects of arbitrary shape.

Originality/value

The periphery of the wafer is detected by a high-precision laser displacement sensor and a low-cost transmission laser sensor instead of the CCD linear sensor used by traditional wafer prealigners, which reduces the space occupied by the structure and enhances the system's prealignment precision. Using a barycenter-acquiring algorithm in a one-particle system to calculate the wafer and notch eccentricities is effective and valid.

Details

Assembly Automation, vol. 28 no. 1
Type: Research Article
ISSN: 0144-5154

Book part
Publication date: 1 May 2012

Sarin Anantarak

Abstract

Several studies have observed that stocks tend to drop by an amount that is less than the dividend on the ex-dividend day, the so-called ex-dividend day anomaly. However, there still remains a lack of consensus for a single explanation of this anomaly. Different from other studies, this dissertation attempts to answer the primary research question: how can investors make trading profits from the ex-dividend day anomaly and how much can they earn? With this goal, I examine the economic motivations of equity investors through four main hypotheses identified in the anomaly's literature: the tax differential hypothesis, the short-term trading hypothesis, the tick size hypothesis, and the leverage hypothesis.

While the U.S. ex-dividend anomaly is well studied, I examine a long data window (1975–2010) of Thailand data. The unique structure of the Thai stock market allows me to assess all four main hypotheses proposed in the literature simultaneously. Although I extract the sample data from two data sources, I demonstrate that the combined data are consistently sampled. I further construct three trading strategies – “daily return,” “lag one daily return,” and “weekly return” – to alleviate the potential effect of irregular data observation.

I find that the ex-dividend day anomaly exists in Thailand, is governed by the tax differential, and is driven by short-term trading activities. That is, investors trade heavily around the ex-dividend day to reap the benefits of the tax differential. I find mixed results for the predictions of the tick size hypothesis and results that are inconsistent with the predictions of the leverage hypothesis.

I conclude that, on the Stock Exchange of Thailand, juristic and foreign investors can profitably buy stocks cum-dividend and sell them ex-dividend, while local investors should engage in short sale transactions. On average, investors who employ the daily return strategy earn significant abnormal returns of up to 0.15% (a 45.66% annualized rate), and up to 0.17% (a 50.99% annualized rate) with the lag one daily return strategy. Investors can also profit from the weekly return strategy, earning up to 0.59% (a 35.67% annualized rate) on average.
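
As a rough check of the annualization arithmetic, assuming daily compounding over roughly 251 trading days and weekly compounding over 52 weeks (the abstract does not state the exact day-count convention):

```python
# Assumed conventions: ~251 trading days/year, 52 trading weeks/year.
daily, weekly = 0.0015, 0.0059
print(f"daily strategy:  {(1 + daily) ** 251 - 1:.2%}")   # ~45.7%
print(f"weekly strategy: {(1 + weekly) ** 52 - 1:.2%}")   # ~35.8%
```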

Details

Research in Finance
Type: Book
ISBN: 978-1-78052-752-9

Book part
Publication date: 1 January 2008

Ivan Jeliazkov, Jennifer Graves and Mark Kutzbach

Abstract

In this paper, we consider the analysis of models for univariate and multivariate ordinal outcomes in the context of the latent variable inferential framework of Albert and Chib (1993). We review several alternative modeling and identification schemes and evaluate how each aids or hampers estimation by Markov chain Monte Carlo simulation methods. For each identification scheme we also discuss the question of model comparison by marginal likelihoods and Bayes factors. In addition, we develop a simulation-based framework for analyzing covariate effects that can provide interpretability of the results despite the nonlinearities in the model and the different identification restrictions that can be implemented. The methods are employed to analyze problems in labor economics (educational attainment), political economy (voter opinions), and health economics (consumers’ reliance on alternative sources of medical information).
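
A minimal sketch of the Albert and Chib (1993) data augmentation for an ordinal probit, under one identification scheme of the kind the chapter compares (cutpoints fixed at 0 and 1, unit latent variance) and a flat prior on the coefficients; the simulated data and sampler settings are illustrative assumptions.

```python
import numpy as np
from scipy.stats import truncnorm

# Simulate ordinal data from a latent Gaussian regression.
rng = np.random.default_rng(3)
n, beta_true = 500, np.array([0.8, -0.5])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
z_true = X @ beta_true + rng.normal(size=n)
y = np.digitize(z_true, [0.0, 1.0])              # categories 0, 1, 2

cuts = np.array([-np.inf, 0.0, 1.0, np.inf])     # fixed for identification
beta = np.zeros(2)
XtX_inv = np.linalg.inv(X.T @ X)
for it in range(2000):
    # 1) Draw latent z_i ~ N(x_i'beta, 1) truncated to y_i's interval
    lo, hi = cuts[y] - X @ beta, cuts[y + 1] - X @ beta
    z = X @ beta + truncnorm.rvs(lo, hi, random_state=rng)
    # 2) Draw beta from its Gaussian conditional given z (flat prior)
    beta = rng.multivariate_normal(XtX_inv @ X.T @ z, XtX_inv)
```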

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8
