Search results

1 – 10 of over 15,000
Open Access
Article
Publication date: 6 February 2020

Jun Liu, Asad Khattak, Lee Han and Quan Yuan

Abstract

Purpose

Individuals’ driving behavior data are becoming widely available through Global Positioning System devices and on-board diagnostic systems. The incoming data can be sampled at rates ranging from one Hertz (or even lower) to hundreds of Hertz. Failing to capture substantial changes in vehicle movements over time by “undersampling” can cause loss of information and misinterpretation of the data, but “oversampling” can waste storage and processing resources. The purpose of this study is to empirically explore how micro-driving decisions, i.e. to maintain speed, accelerate or decelerate, can best be captured without substantial loss of information.

Design/methodology/approach

This study creates a set of indicators to quantify the magnitude of information loss (MIL). Each indicator is calculated as a percentage that indexes the extent of information loss in different situations, and an overall index named the extent of information loss (EIL) combines the MIL indicators. Data from a driving simulator study collected at 20 Hz are analyzed (N = 718,481 data points from 35,924 s of driving tests). The study quantifies the relationship between the information loss indicators and sampling rates.
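
The abstract does not define the individual MIL indicators, so the following Python sketch illustrates the general idea only: it downsamples a simulated 20 Hz speed trace and reports one hypothetical loss indicator (the share of the observed speed range that disappears) as a percentage. All names and numbers here are illustrative assumptions, not the paper's method.

```python
import numpy as np

def downsample(signal: np.ndarray, native_hz: float, target_hz: float) -> np.ndarray:
    """Keep every k-th sample to emulate a lower sampling rate."""
    step = int(round(native_hz / target_hz))
    return signal[::step]

def speed_range_loss(speed: np.ndarray, native_hz: float, target_hz: float) -> float:
    """Hypothetical MIL-style indicator: percentage of the observed
    speed range that disappears after downsampling."""
    coarse = downsample(speed, native_hz, target_hz)
    full_range = speed.max() - speed.min()
    kept_range = coarse.max() - coarse.min()
    return 100.0 * (1.0 - kept_range / full_range)

# Simulated 20 Hz speed trace (random-walk speed, illustration only)
rng = np.random.default_rng(0)
speed = 15.0 + np.cumsum(rng.normal(0.0, 0.05, 20 * 60))  # 60 s at 20 Hz

for hz in (10, 2, 1, 0.5):
    print(f"{hz:>4} Hz: {speed_range_loss(speed, 20, hz):.2f}% of speed range lost")
```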

Findings

The results show that marginally more information is lost as data are sampled down from 20 to 0.5 Hz, but the relationship is not linear. With four MIL indicators, the overall EIL is 3.85 per cent for driving behavior data sampled at 1 Hz. If sampling rates are higher than 2 Hz, all MILs remain under 5 per cent of information loss.

Originality/value

This study contributes a framework for quantifying the relationship between sampling rates and information loss. Depending on the objectives of their studies, researchers can choose the sampling rate necessary to achieve the desired level of accuracy.

Details

Journal of Intelligent and Connected Vehicles, vol. 3 no. 1
Type: Research Article
ISSN: 2399-9802

Content available
Book part
Publication date: 10 April 2019

Details

The Econometrics of Complex Survey Data
Type: Book
ISBN: 978-1-78756-726-9

Open Access
Article
Publication date: 16 April 2019

Zhishuo Liu, Yao Dongxin, Zhao Kuan and Wang Chun Fang

Abstract

Purpose

Satellite positioning of vehicles carries a certain error. This error produces drift in the positioning points, which makes the recorded vehicle trajectory deviate from the real road. This paper aims to solve this problem.

Design/methodology/approach

The key technology for solving the problem is map matching (MM). At low sampling frequencies, adjacent positioning points lie far apart, which weakens the correlation between them and makes MM more difficult. In this paper, an MM algorithm based on priority rules is designed for the characteristics of vehicle trajectories at low sampling frequencies.
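
The abstract does not spell out the priority rules themselves, so the sketch below is only a minimal illustration of the general pattern, with hypothetical rules (prefer the previously matched road when distances are close, otherwise take the nearest road) and toy straight-line road segments.

```python
import math

# Toy road segments: (road_id, start_point, end_point)
ROADS = [
    ("A", (0.0, 0.0), (100.0, 0.0)),
    ("B", (0.0, 5.0), (100.0, 5.0)),
]

def point_to_segment_distance(p, a, b):
    """Euclidean distance from point p to segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def match_point(p, prev_road=None):
    """Hypothetical priority rules: (1) stay on the previously matched
    road when it is nearly as close as the best candidate, (2) otherwise
    take the geometrically nearest road."""
    scored = sorted((point_to_segment_distance(p, a, b), rid) for rid, a, b in ROADS)
    best_dist, best_road = scored[0]
    if prev_road is not None:
        prev_dist = next(d for d, rid in scored if rid == prev_road)
        if prev_dist - best_dist < 2.0:  # closeness tolerance in map units
            return prev_road
    return best_road

# Sparse low-frequency trajectory drifting between the two roads
prev = None
for p in [(10, 0.4), (40, 2.8), (70, 1.9)]:
    prev = match_point(p, prev)
    print(p, "->", prev)
```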

Findings

The experimental results show that the priority-rule-based MM algorithm can effectively match low-sampling-frequency trajectory data to the actual road. Its matching accuracy is better than that of other similar algorithms, and its processing speed reaches 73 per second.

Research limitations/implications

Although the algorithm design and experimental verification take into account the diversity of GPS data sampling frequencies, the experimental data used come from a single source.

Originality/value

Based on GPS trajectory data from the Ministry of Transport, the experimental results show that the priority-rule-based algorithm achieves higher accuracy than other similar algorithms, exceeding 98.1 per cent.

Details

International Journal of Crowd Science, vol. 3 no. 1
Type: Research Article
ISSN: 2398-7294

Content available
Article
Publication date: 23 October 2023

Adam Biggs and Joseph Hamilton

Abstract

Purpose

Evaluating warfighter lethality is a critical aspect of military performance. Raw metrics such as marksmanship speed and accuracy can provide some insight, yet interpreting subtle differences can be challenging. For example, is a speed difference of 300 milliseconds more important than a 10% accuracy difference on the same drill? Marksmanship evaluations must have objective methods to differentiate between critical factors while maintaining a holistic view of human performance.

Design/methodology/approach

Monte Carlo simulations are one method to circumvent speed/accuracy trade-offs within marksmanship evaluations. They can accommodate both speed and accuracy implications simultaneously without needing to hold one constant for the sake of the other. Moreover, Monte Carlo simulations can incorporate variability as a key element of performance. This approach thus allows analysts to determine consistency of performance expectations when projecting future outcomes.
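
As a minimal sketch of this idea (with illustrative numbers, not the paper's data or model), the following Python simulation draws shot times and hit outcomes for two shooters and estimates how often the faster-but-less-accurate shooter prevails:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # simulated engagements

def simulate(mean_time_ms, sd_time_ms, hit_prob, shots=1):
    """Draw per-engagement shot times and hit outcomes.
    Variability enters through both distributions."""
    times = rng.normal(mean_time_ms, sd_time_ms, (N, shots)).sum(axis=1)
    hits = rng.random((N, shots)) < hit_prob
    return times, hits.any(axis=1)

# Shooter 1 is 300 ms faster but 10% less accurate than shooter 2
t1, h1 = simulate(1500, 200, 0.80)
t2, h2 = simulate(1800, 200, 0.90)

# Illustrative criterion: shooter 1 prevails on a sole hit,
# or on the faster hit when both connect
win1 = (h1 & ~h2) | (h1 & h2 & (t1 < t2))
print(f"Shooter 1 prevails in {win1.mean():.1%} of simulated engagements")
```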

Findings

The review divides outcomes into theoretical overview and practical implication sections. Each aspect of the Monte Carlo simulation can be addressed separately, reviewed and then incorporated as a potential component of small arms combat modeling. This application allows new human performance practitioners to adopt the method more quickly for different applications.

Originality/value

Performance implications are often presented as inferential statistics. By using Monte Carlo simulations, practitioners can present outcomes in terms of lethality. This method should convey the impact of a marksmanship evaluation to senior leadership better than current inferential statistics, such as effect size measures.

Details

Journal of Defense Analytics and Logistics, vol. 7 no. 2
Type: Research Article
ISSN: 2399-6439

Open Access
Article
Publication date: 26 April 2022

Jingfeng Xie, Jun Huang, Lei Song, Jingcheng Fu and Xiaoqiang Lu

Abstract

Purpose

The typical approach to modeling the aerodynamics of an aircraft is to develop a complete database through testing or computational fluid dynamics (CFD). A database with reasonable resolution is huge and requires an unacceptable CFD effort during conceptual design. Therefore, this paper aims to reduce the computing effort by establishing a general aerodynamic model that needs only a small number of parameters.

Design/methodology/approach

The model structure was a preconfigured polynomial model, and the parameters were estimated with a recursive method to further reduce the calculation effort. To disperse the sample points uniformly at each step, a unique recursive sampling method based on a Voronoi diagram was presented. In addition, a multivariate orthogonal function approach was used.
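
The paper's Voronoi-based sampling and orthogonal-function steps are not reproduced here, but the recursive estimation of a polynomial model can be illustrated with a standard recursive least squares (RLS) update on a hypothetical polynomial lift-coefficient model; every coefficient and function below is an assumption for illustration.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive-least-squares step: update polynomial
    coefficients theta and covariance P with a new sample (x, y)."""
    Px = P @ x
    k = Px / (lam + x @ Px)               # gain vector
    theta = theta + k * (y - x @ theta)   # correct by prediction error
    P = (P - np.outer(k, Px)) / lam
    return theta, P

# Illustrative 'truth': lift coefficient as a polynomial in alpha (rad)
def cl_true(alpha):
    return 0.2 + 5.5 * alpha - 10.0 * alpha**3

features = lambda a: np.array([1.0, a, a**2, a**3])
theta = np.zeros(4)
P = np.eye(4) * 1e3

rng = np.random.default_rng(1)
for alpha in rng.uniform(-0.2, 0.2, 200):      # sampled flight conditions
    y = cl_true(alpha) + rng.normal(0, 0.002)  # noisy CFD/test sample
    theta, P = rls_update(theta, P, features(alpha), y)

print("estimated coefficients:", np.round(theta, 2))
```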

Findings

A case study of a flying wing aircraft demonstrated that generating a model with acceptable precision (0.01 absolute error or 5% relative error) costs only 1/54 of the effort of creating a full database. A series of six-degree-of-freedom flight simulations shows that the model’s predictions were accurate.

Originality/value

This method proposes a new way of simplifying the model and of recursive sampling. It is a low-cost way to obtain high-fidelity models during early-stage design, allowing for more precise flight dynamics analysis.

Details

Aircraft Engineering and Aerospace Technology, vol. 94 no. 11
Type: Research Article
ISSN: 1748-8842

Open Access
Article
Publication date: 7 August 2019

Markus Neumayer, Thomas Suppan and Thomas Bretterklieber

Abstract

Purpose

The application of statistical inversion theory provides a powerful approach for solving estimation problems including the ability for uncertainty quantification (UQ) by means of Markov chain Monte Carlo (MCMC) methods and Monte Carlo integration. This paper aims to analyze the application of a state reduction technique within different MCMC techniques to improve the computational efficiency and the tuning process of these algorithms.

Design/methodology/approach

A reduced state representation is constructed from a general prior distribution. For sampling, the Metropolis-Hastings (MH) algorithm and the Gibbs sampler are used. Efficient proposal generation techniques and techniques for conditional sampling are proposed and evaluated for an exemplary inverse problem.
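
As a minimal, generic sketch of the MH algorithm mentioned here (the paper's state reduction scheme and proposal kernels are not reproduced), a random-walk sampler over a toy two-parameter posterior might look like this:

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_steps, step_size):
    """Random-walk Metropolis-Hastings: propose Gaussian moves and
    accept with probability min(1, posterior ratio)."""
    rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain, accepted = [x.copy()], 0
    for _ in range(n_steps):
        proposal = x + rng.normal(0.0, step_size, x.shape)
        lp_prop = log_post(proposal)
        if np.log(rng.random()) < lp_prop - lp:  # acceptance test
            x, lp = proposal, lp_prop
            accepted += 1
        chain.append(x.copy())
    return np.array(chain), accepted / n_steps

# Toy Gaussian posterior over two reduced-state coefficients
log_post = lambda x: -0.5 * np.sum((x - np.array([1.0, -0.5])) ** 2 / 0.1)
samples, rate = metropolis_hastings(log_post, [0.0, 0.0], 5000, 0.2)
print(f"acceptance rate {rate:.2f}, posterior mean {samples[2500:].mean(axis=0)}")
```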

Findings

For the MH algorithm, high acceptance rates can be obtained with a simple proposal kernel. For the Gibbs sampler, an efficient technique for conditional sampling was found. The state reduction scheme stabilizes the ill-posed inverse problem, allowing a solution without a dedicated prior distribution. The state reduction is suitable for representing general material distributions.

Practical implications

The state reduction scheme and the MCMC techniques can be applied in different imaging problems. The stabilizing nature of the state reduction improves the solution of ill-posed problems. The tuning of the MCMC methods is simplified.

Originality/value

The paper presents a method to improve the solution process of inverse problems within the Bayesian framework. The stabilization of the inverse problem due to the state reduction improves the solution. The approach simplifies the tuning of MCMC methods.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 38 no. 5
Type: Research Article
ISSN: 0332-1649

Content available
Article
Publication date: 1 March 2005

Karl Wennberg

Abstract

This article provides an account of how databases can be effectively used in entrepreneurship research. Improved quality and access to large secondary databases offer paths to answer questions of great theoretical value. I present an overview of theoretical, methodological, and practical difficulties in working with database data, together with advice on how such difficulties can be overcome. Conclusions are given, together with suggestions of areas where databases might provide real and important contributions to entrepreneurship research.

Details

New England Journal of Entrepreneurship, vol. 8 no. 2
Type: Research Article
ISSN: 2574-8904

Open Access
Book part
Publication date: 4 May 2018

Sutrisno, Rayandra Asyhar, Wimpy Prendika, Hilda Amanda and Fachrur Razi

Abstract

Purpose – This paper aims to detect or identify the presence of hydrocarbon infiltration at sampling points in the Rambe River area, based on the obtained VOCs and the adsorbed SVOCs.

Design/Methodology/Approach – The Gore-Sorber method has been used to capture volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs) as indicators of subsurface hydrocarbon generation and entrapment. This method is usually used in environmental surveys and oil investigations as a surface-survey screening tool, designed to collect a broad range of VOCs and SVOCs at low concentrations, quickly and inexpensively. The results also indicated a general correlation between the Gore-Sorber data and reference method data. The research was conducted in Rambe River Village, Tebing Tinggi sub-district, Tanjung Jabung Barat district, Jambi Province, Indonesia. The collected Gore-Sorber modules were analyzed using thermal desorption gas chromatography-mass spectrometry (GC/MS).

Findings – The results showed that, across all sampling points in the Tebing Tinggi area, the dominant components detected were carbonyl sulfide, dimethyl sulfide, ethane, propane, butane, 2-methylbutane, pentane and carbon sulfide, with carbon chains in the range C2-C5. These hydrocarbon gases (C1-C4) may be from thermogenic or microbial processes. The highest concentrations were 392.67 ng for carbonyl sulfide and 261.90 ng for dimethyl disulfide.

Originality/Value – In addition to estimating and predicting petroleum formation, this article provides information about the presence of oil fields in the area of Sungai Rambe Village.

Details

Proceedings of MICoMS 2017
Type: Book
ISBN:

Open Access
Book part
Publication date: 4 May 2018

Rahmi Agustina, M. Ali S, Ferdinan Yulianda and Suhendrayatna

Abstract

Purpose – The purpose of this research is to investigate the relationship of lead (Pb) and zinc (Zn) contents in sediment to the population density of Faunus ater (F. ater), and to analyze the relationship between Pb and Zn accumulation in F. ater and F. ater density in the Reuleng River, Leupung, Aceh Besar.

Design/Methodology/Approach – Sampling was conducted from November 2016 until January 2017. The density of F. ater was calculated with a density formula, while its relationship to Pb and Zn in sediments and in F. ater was examined by correlation analysis.
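
The correlation analysis referred to here is presumably the standard Pearson product-moment correlation; a minimal sketch with hypothetical values (not the study's data) would be:

```python
from scipy import stats

# Hypothetical monthly measurements, not the study's data:
# Zn in sediment (mg/kg) vs F. ater population density (ind/m^2)
zn_sediment = [12.1, 15.4, 18.9, 14.2, 25.0, 20.7]
density = [8.0, 10.5, 13.9, 9.8, 19.1, 14.4]

r, p_value = stats.pearsonr(zn_sediment, density)
print(f"Pearson r = {r:.3f} (p = {p_value:.3f})")  # r near 1 = strong positive correlation
```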

Findings – The results showed that the correlation between Pb and Zn in sediments and in F. ater varied across locations in each month of sampling. Pb and Zn contents in sediments showed a fluctuating relationship with F. ater density from month to month. The correlation of Pb content in sediments with F. ater density was medium in January 2017 (r = 0.665). Zn in sediment had a very strong correlation with F. ater density in November 2016 (r = 0.891). Pb in F. ater had a medium correlation with F. ater density in January 2017 (r = 0.436). Furthermore, the accumulation of Zn in F. ater showed some apparent correlation with its density in each month of sampling.

Research Limitation/Implications – This research provides information on the relationship of Pb and Zn contents in sediment to the density of F. ater and on the correlation of Pb and Zn in F. ater with its density in the Reuleng River, Leupung, Aceh Besar district.

Originality/Value – This is the first research conducted on the correlation between lead and zinc contents and the density of F. ater.

Details

Proceedings of MICoMS 2017
Type: Book
ISBN:

Open Access
Article
Publication date: 17 October 2019

Petros Maravelakis

Abstract

Purpose

The purpose of this paper is to review some of the statistical methods used in the field of social sciences.

Design/methodology/approach

The paper reviews some of the statistical methodologies used in areas such as survey methodology, official statistics, sociology, psychology, political science, criminology, public policy, marketing research, demography, education and economics.

Findings

Several areas are presented, such as parametric modeling, nonparametric modeling and multivariate methods. Focus is also given to time series modeling, analysis of categorical data, sampling issues and other techniques useful for the analysis of data in the social sciences. Indicative references are given for all of the above methods, along with some insights into the application of these techniques.
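
As one concrete instance of the categorical-data analysis mentioned above, a chi-square test of independence on a small contingency table (hypothetical counts, for illustration only) can be run as follows:

```python
import numpy as np
from scipy import stats

# Hypothetical survey counts: education level (rows) vs opinion (columns)
table = np.array([[30, 20],
                  [25, 45]])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```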

Originality/value

This paper reviews some statistical methods that are used in the social sciences and draws researchers’ attention to less popular methods. The purpose is not to give technical details, nor to cover all existing techniques or all possible areas of statistics. The focus is mainly on the applied aspect of the techniques, with insights into which techniques can be used to answer problems in the abovementioned areas of research.

Details

Journal of Humanities and Applied Social Sciences, vol. 1 no. 2
Type: Research Article
ISSN:
