Search results

1 – 10 of over 3000
Open Access
Article
Publication date: 10 December 2021

Pingan Zhu, Chao Zhang and Jun Zou

The purpose of the work is to provide a comprehensive review of the digital image correlation (DIC) technique for those who are interested in performing the DIC technique in the…

Abstract

Purpose

The purpose of the work is to provide a comprehensive review of the digital image correlation (DIC) technique for those who are interested in performing the DIC technique in the area of manufacturing.

Design/methodology/approach

No empirical methodology was used because the paper is a review article.

Findings

No empirical findings are reported.

Originality/value

Herein, the historical development, main strengths and measurement setup of DIC are introduced. Subsequently, the basic principles of the DIC technique are outlined in detail. The analysis of measurement accuracy associated with experimental factors and correlation algorithms is discussed and some useful recommendations for reducing measurement errors are also offered. Then, the utilization of DIC in different manufacturing fields (e.g. cutting, welding, forming and additive manufacturing) is summarized. Finally, the current challenges and prospects of DIC in intelligent manufacturing are discussed.
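
As background for readers new to the technique, the following is a minimal sketch of the core DIC matching step: locating a reference subset in a deformed image by maximizing the zero-normalized cross-correlation (ZNCC). It is illustrative only; practical DIC adds subpixel interpolation and subset shape functions, and the function names and synthetic speckle data are our assumptions.

```python
# Minimal DIC sketch: integer-pixel subset matching via ZNCC.
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two equal-sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_subset(ref, cur, center, half=10, search=5):
    """Integer displacement (u, v) of a reference subset in the current image."""
    cy, cx = center
    tpl = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):          # brute-force search window
        for du in range(-search, search + 1):
            y, x = cy + dv, cx + du
            win = cur[y - half:y + half + 1, x - half:x + half + 1]
            s = zncc(tpl, win)
            if s > best:
                best, best_uv = s, (du, dv)
    return best_uv, best

# Synthetic check: a random speckle pattern displaced by u=+3, v=-2 pixels.
rng = np.random.default_rng(0)
ref = rng.random((100, 100))
cur = np.roll(np.roll(ref, -2, axis=0), 3, axis=1)
print(match_subset(ref, cur, center=(50, 50)))     # -> ((3, -2), ~1.0)
```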

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. 2 no. 2
Type: Research Article
ISSN: 2633-6596

Open Access
Article
Publication date: 23 January 2023

Junshan Hu, Jie Jin, Yueya Wu, Shanyong Xuan and Wei Tian

Aircraft structures are mainly connected by riveted joints, whose quality and mechanical performance are directly determined by the vertical accuracy of riveting holes. This paper…

Abstract

Purpose

Aircraft structures are mainly connected by riveted joints, whose quality and mechanical performance are directly determined by the vertical accuracy of riveting holes. This paper proposes a combined vertical accuracy compensation method for the drilling and riveting of aircraft panels with highly variable curvature.

Design/methodology/approach

The vertical accuracy compensation method combines online and offline compensation in a robot drilling and riveting system. The online category, based on laser ranging, corrects the vertical error between the actual and theoretical riveting positions; the offline category, based on model curvature, corrects the vertical error caused by approximate plane fitting on variable-curvature panels.
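
To make the online category concrete, here is a hedged sketch of one common way such a correction can be computed: fit a local plane to laser-ranged points around the spindle and measure the angle between the surface normal and the tool axis. The four-sensor layout, dimensions and names are illustrative assumptions, not the authors' hardware.

```python
# Estimate the tool-to-surface vertical error from laser ranging
# (assumed four-sensor layout; illustrative only).
import numpy as np

def vertical_error_deg(xy_offsets, ranges):
    """Fit a plane z = ax + by + c to the ranged points and return the angle
    (degrees) between the surface normal and the tool axis (the z-axis)."""
    A = np.column_stack([xy_offsets[:, 0], xy_offsets[:, 1], np.ones(len(ranges))])
    (a, b, c), *_ = np.linalg.lstsq(A, ranges, rcond=None)
    normal = np.array([-a, -b, 1.0])
    return np.degrees(np.arccos(normal[2] / np.linalg.norm(normal)))

# Four sensors on a 40 mm radius; surface tilted 0.3 degrees about the y-axis.
r = 40.0
xy = np.array([(r, 0.0), (0.0, r), (-r, 0.0), (0.0, -r)])
ranges = np.tan(np.radians(0.3)) * xy[:, 0] + 120.0   # distances in mm
print(vertical_error_deg(xy, ranges))                 # -> ~0.3
```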

Findings

The vertical accuracy compensation method was applied in an automatic robot drilling and riveting system. The results reveal that the vertical error of drilling and riveting is within 0.4°, which meets the vertical accuracy requirements of aircraft assembly.

Originality/value

The proposed method is suitable for improving the vertical accuracy of drilling and riveting on panels or skins of aerospace products with highly variable curvature, without introducing extra measuring sensors.

Details

Journal of Intelligent Manufacturing and Special Equipment, vol. 4 no. 1
Type: Research Article
ISSN: 2633-6596

Open Access
Article
Publication date: 2 December 2016

Taylor Boyd, Grace Docken and John Ruggiero

The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier…

Abstract

Purpose

The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. We focus on the case when outliers appear above the true frontier due to measurement error.

Design/methodology/approach

The authors use stochastic data envelopment analysis (SDEA) to allow observed points above the frontier. They supplement SDEA with distributional assumptions on efficiency and show that the true frontier can be derived in the presence of outliers.
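
For orientation, the sketch below computes the standard deterministic input-oriented DEA (CCR) efficiency score by linear programming, the baseline that SDEA generalizes; the authors' stochastic, quantile-based variant is not reproduced here, and the data are toy values.

```python
# Deterministic input-oriented DEA (CCR) via linear programming.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Efficiency of DMU o: min theta s.t. sum lam_j x_j <= theta x_o,
    sum lam_j y_j >= y_o, lam >= 0."""
    n = X.shape[0]                        # X: (n, m) inputs, Y: (n, s) outputs
    c = np.r_[1.0, np.zeros(n)]           # decision variables [theta, lam_1..lam_n]
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]         # inputs:  X lam - theta x_o <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # outputs: -Y lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun                        # theta in (0, 1]; 1 = on the frontier

X = np.array([[2.0], [4.0], [8.0]])       # single input
Y = np.array([[2.0], [4.0], [4.0]])       # single output
print([round(dea_efficiency(X, Y, o), 3) for o in range(3)])   # [1.0, 1.0, 0.5]
```

In a stochastic variant, observations above this deterministic frontier are not automatically labeled as estimation errors; the quantile chosen under the distributional assumptions decides which of them are treated as measurement-error outliers.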

Findings

The authors' maximum likelihood approach outperforms super-efficiency measures. Simulations show that SDEA is a useful model for outlier detection.

Originality/value

The model developed in this paper is original; the authors add distributional assumptions to derive the optimal quantile with SDEA for removing outliers. Because real-world data are often subject to outliers, the authors believe the paper will attract many citations.

Details

Journal of Centrum Cathedra, vol. 9 no. 2
Type: Research Article
ISSN: 1851-6599

Open Access
Article
Publication date: 8 February 2023

Edoardo Ramalli and Barbara Pernici

Experiments are the backbone of the development process of data-driven predictive models for scientific applications. The quality of the experiments directly impacts the model…

Abstract

Purpose

Experiments are the backbone of the development process of data-driven predictive models for scientific applications. The quality of the experiments directly impacts the model performance. Uncertainty inherently affects experiment measurements and is often missing in the available data sets due to its estimation cost. For similar reasons, experiments are very few compared to other data sources. Discarding experiments based on the missing uncertainty values would preclude the development of predictive models. Data profiling techniques are fundamental to assess data quality, but some data quality dimensions are challenging to evaluate without knowing the uncertainty. In this context, this paper aims to predict the missing uncertainty of the experiments.

Design/methodology/approach

This work presents a methodology to forecast the experiments’ missing uncertainty, given a data set and its ontological description. The approach is based on knowledge graph embeddings and leverages the task of link prediction over a knowledge graph representation of the experiments database. The validity of the methodology is first tested in multiple conditions using synthetic data and then applied to a large data set of experiments in the chemical kinetic domain as a case study.
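
As a rough illustration of the link prediction idea, the sketch below trains a tiny TransE-style embedding (plausibility score -||h + r - t||) on toy triples in which an experiment's uncertainty correlates with its instrument metadata, then ranks candidate uncertainty values for an experiment whose link is missing. The paper does not specify this particular embedding model; the triples, names and hyperparameters are our assumptions.

```python
# TransE-style link prediction over a toy experiments knowledge graph.
import numpy as np

rng = np.random.default_rng(0)
entities = ["exp1", "exp2", "exp3", "instA", "instB", "low_unc", "high_unc"]
relations = ["usesInstrument", "hasUncertainty"]
triples = [(0, 0, 3), (1, 0, 4), (2, 0, 3),    # instrument metadata links
           (0, 1, 5), (1, 1, 6)]               # known uncertainty links

dim, lr, margin = 24, 0.05, 1.0
E = rng.normal(scale=0.1, size=(len(entities), dim))    # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))   # relation embeddings

def score(h, r, t):                            # higher = more plausible triple
    return -np.linalg.norm(E[h] + R[r] - E[t])

for _ in range(2000):                          # margin loss, SGD, tail corruption
    h, r, t = triples[rng.integers(len(triples))]
    t_bad = int(rng.integers(len(entities)))
    if t_bad == t or score(h, r, t) - score(h, r, t_bad) >= margin:
        continue
    g = E[h] + R[r] - E[t]
    g /= np.linalg.norm(g) + 1e-9              # pull the true triple together
    E[h] -= lr * g; R[r] -= lr * g; E[t] += lr * g
    g2 = E[h] + R[r] - E[t_bad]
    g2 /= np.linalg.norm(g2) + 1e-9            # push the corrupted triple apart
    E[h] += lr * g2; R[r] += lr * g2; E[t_bad] -= lr * g2

# Rank candidate uncertainty values for exp3's missing link; sharing instA
# with exp1 should favor "low_unc" (toy data, outcome not guaranteed).
print(entities[max([5, 6], key=lambda t: score(2, 1, t))])
```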

Findings

The analysis results of different test case scenarios suggest that knowledge graph embedding can be used to predict the missing uncertainty of the experiments when there is a hidden relationship between the experiment metadata and the uncertainty values. The link prediction task is also resilient to random noise in the relationship. The knowledge graph embedding outperforms the baseline results if the uncertainty depends upon multiple metadata.

Originality/value

The employment of knowledge graph embedding to predict missing experimental uncertainty is a novel alternative to current, more costly techniques in the literature. This contribution permits better data quality profiling of scientific repositories and improves the development process of data-driven models based on scientific experiments.

Open Access
Article
Publication date: 1 December 2020

Fazlisham Binti Ghazali, Siti Nurhafizah Saleeza Ramlee, Najib Alwi and Hazuan Hizan

This study aimed to develop the construct validity for the Malay version of the Paffenbarger physical activity questionnaire (PPAQ) by adapting the original questionnaire to suit…

Abstract

Purpose

This study aimed to develop the construct validity for the Malay version of the Paffenbarger physical activity questionnaire (PPAQ) by adapting the original questionnaire to suit the local context.

Design/methodology/approach

The PPAQ was adapted, translated into the Malay language and modified until good content agreement was reached among a panel of experts. A total of 65 participants aged 22–55 years, fluent and literate in the Malay language, were selected. Principal component analysis (PCA) was used to investigate construct validity. The reliability of the adapted instrument was analyzed according to the types of variables.
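
A minimal sketch of this construct-validity step, assuming synthetic responses generated from two latent exercise domains: run PCA on the item correlation matrix and retain components with eigenvalues greater than one, the retention rule the study reports. The sample size mirrors the study; everything else is illustrative.

```python
# PCA with the eigenvalue-greater-than-one retention rule on synthetic items.
import numpy as np

rng = np.random.default_rng(1)
n = 65                                       # sample size matching the study
planned = rng.normal(size=n)                 # latent "structured exercise" domain
unplanned = rng.normal(size=n)               # latent "unplanned exercise" domain
items = np.column_stack([
    planned + 0.3 * rng.normal(size=n),      # item 1 loads on domain 1
    planned + 0.3 * rng.normal(size=n),      # item 2 loads on domain 1
    unplanned + 0.3 * rng.normal(size=n),    # item 3 loads on domain 2
    unplanned + 0.3 * rng.normal(size=n),    # item 4 loads on domain 2
])

eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
print(np.round(eigvals, 2))                  # two eigenvalues well above 1
print("components retained:", int((eigvals > 1).sum()))   # -> 2
```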

Findings

The panel of experts reached a consensus that the final four items chosen for the adapted Malay version of the PPAQ were valid, supported by a good content validity index (CVI). Two domains consonant with the operational domain definitions were identified by PCA. Based on scores for exercise intensity and duration, the study further divided the sample into those who were physically active and those who chose unstructured physical activity. Relative reliability after a 14-day interval demonstrated moderate strength of agreement with an acceptable range of measurement error.

Research limitations/implications

The PPAQ has been used worldwide but was less familiar in the local context. The Malay four-item PPAQ provides a locally validated physical activity questionnaire. In addition, the authors have improved the original PPAQ by dividing the question items into two distinct domains that effectively identify those who are physically active and those who engage in unplanned exercise. Nevertheless, further research is recommended in larger, more heterogeneous samples, along with additional reliability tests.

Practical implications

To the authors' knowledge, this is the first study to assess the internal structure of the four-item version of the PPAQ. The analysis identified two components with eigenvalues greater than one in the Malay four-item PPAQ. On this basis, the authors were able to separate the population into two groups: the physically active and those engaged in unplanned (unstructured) exercise. The ability of the validated questionnaire to divide the population by intensity of physical activity is novel and may be useful in public health studies, where higher-intensity physical activity, and hence greater energy expenditure, is associated with increased longevity, better health and improved cognitive function.

Social implications

In addition, the second domain, "unplanned exercise", was successfully grouped together. The unplanned-exercise component identifies the segment of the population that is aware of an active lifestyle and chooses unstructured exercise over vigorous, formal exercising. Even though the intensity and duration of incidental exercise fall short of public health recommendations, a preferred healthier lifestyle has been shown to be positively associated with better cognition in later life.

Originality/value

The adapted Malay version of the PPAQ has sound psychometric properties and can assist in differentiating population groups based on their physical activity.

Open Access
Article
Publication date: 26 April 2024

Marcus Gerdin, Ella Kolkowska and Åke Grönlund

Research on employee non-/compliance to information security policies suffers from inconsistent results and there is an ongoing discussion about the dominating survey research…

Abstract

Purpose

Research on employee non-/compliance to information security policies suffers from inconsistent results and there is an ongoing discussion about the dominating survey research methodology and its potential effect on these results. This study aims to add to this discussion by investigating discrepancies between what the authors claim to measure (theoretical properties of variables) and what they actually measure (respondents’ interpretations of the operationalized variables). This study asks: How well do respondents’ interpretations of variables correspond to their theoretical definitions? What are the characteristics of any discrepancies between variable definitions and respondent interpretations?

Design/methodology/approach

This study is based on in-depth interviews with 17 respondents from the Swedish public sector to understand how they interpret questionnaire measurement items operationalizing the variables Perceived Severity from Protection Motivation Theory and Attitude from Theory of Planned Behavior.

Findings

The authors found that respondents’ interpretations in many cases differ substantially from the theoretical definitions. Overall, the authors found four principal ways in which respondents interpreted measurement items – referred to as property contextualization, extension, alteration and oscillation – each implying more or less (dis)alignment with the intended theoretical properties of the two variables examined.

Originality/value

The qualitative method used proved vital to better understand respondents’ interpretations which, in turn, is key for improving self-reporting measurement instruments. To the best of the authors’ knowledge, this study is a first step toward understanding how precise and uniform definitions of variables’ theoretical properties can be operationalized into effective measurement items.

Details

Information & Computer Security, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2056-4961

Open Access
Article
Publication date: 13 May 2019

Thomas Salzberger and Monika Koller

Psychometric analyses of self-administered questionnaire data tend to focus on items and instruments as a whole. The purpose of this paper is to investigate the functioning of the…

Abstract

Purpose

Psychometric analyses of self-administered questionnaire data tend to focus on items and instruments as a whole. The purpose of this paper is to investigate the functioning of the response scale and its impact on measurement precision. In terms of the response scale direction, existing evidence is mixed and inconclusive.

Design/methodology/approach

Three experiments are conducted to examine the functioning of response scales with different directions, running from agree to disagree versus from disagree to agree. The response scale direction effect is exemplified with two different latent constructs by applying the Rasch model for measurement.
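
For readers unfamiliar with the model, the sketch below states the dichotomous Rasch model, P(X = 1) = exp(theta - b) / (1 + exp(theta - b)), and recovers an item difficulty b by maximum likelihood from simulated responses with known person parameters theta. This is a didactic simplification of the authors' analysis; the data are synthetic.

```python
# Dichotomous Rasch model: simulate responses, recover item difficulty by MLE.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
theta = rng.normal(size=2000)                # person parameters (assumed known)
b_true = 0.7                                 # item difficulty to recover
p = 1.0 / (1.0 + np.exp(-(theta - b_true)))  # Rasch endorsement probability
x = rng.random(2000) < p                     # simulated 0/1 responses

def neg_loglik(b):
    q = 1.0 / (1.0 + np.exp(-(theta - b)))
    return -(np.log(q[x]).sum() + np.log(1.0 - q[~x]).sum())

print(minimize_scalar(neg_loglik, bounds=(-4, 4), method="bounded").x)  # ~0.7
```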

Findings

The agree-to-disagree format generally performs better than the disagree-to-agree variant with spatial proximity between the statement and the agree-pole of the scale appearing to drive the effect. The difference is essentially related to the unit of measurement.

Research limitations/implications

A careful investigation of the functioning of the response scale should be part of every psychometric assessment. The framework of Rasch measurement theory offers unique opportunities in this regard.

Practical implications

Besides content, validity and reliability, academics and practitioners utilising published measurement instruments are advised to consider any available evidence on the functioning of the response scale.

Originality/value

The study exemplifies the application of the Rasch model to assess measurement precision as a function of the design of the response scale. The methodology raises the awareness for the unit of measurement, which typically remains hidden.

Details

European Journal of Marketing, vol. 53 no. 5
Type: Research Article
ISSN: 0309-0566

Open Access
Article
Publication date: 13 March 2018

Teik-Kheong Tan and Merouane Lakehal-Ayat

The impact of volatility crush can be devastating to an option buyer and results in a substantial capital loss, even with a directionally correct strategy. As a result, most…

Abstract

Purpose

The impact of volatility crush can be devastating to an option buyer, resulting in substantial capital loss even with a directionally correct strategy. As a result, most volatility plays are made by option sellers, but the profit they can achieve is limited while they carry unlimited risk. This paper aims to demonstrate the dynamics of implied volatility (IV) as influenced by the effects of persistence, leverage, market sentiment and liquidity. From the exploratory factor analysis (EFA), the authors extract four constructs, and the results of the confirmatory factor analysis (CFA) indicate a good model fit for the constructs.

Design/methodology/approach

The methodology covers the study area, study approach, sources of data, sampling technique and method of data analysis.

Findings

Although there is extensive literature on methods for estimating IV dynamics during earnings announcements, few researchers have looked at the impact of the expected market maker move, IV differential and IV Rank on the IV path after the earnings announcement. One reason for this research gap is the relatively recent introduction of weekly options for equities by the Chicago Board Options Exchange (CBOE) in late 2010. Even then, the CBOE initially released weekly options for only four individual equities – Bank of America (BAC.N), Apple (AAPL.O), Citigroup (C.N) and US-listed shares of BP (BP.L) (BP.N). The introduction of weekly options provided more trading flexibility and precision timing through shorter durations. This expanded expiration choices, which in turn offered greater access and flexibility for trading volatility around earnings announcements. This study demonstrates the impact of including market sentiment and liquidity in the forecasting model for IV during earnings. This understanding, in turn, helps traders formulate strategies that can circumvent the undefined risk associated with trading option strategies such as writing strangles.

Research limitations/implications

The first limitation of the study is that the firms included are relatively large, so the results cannot be generalized to medium-sized and small firms. The second limitation lies in the current sample size, which in many cases was not large enough to support reliable conclusions. Scaling the sample up is only a function of time and effort; this is easily overcome and should not be a limitation in the future. The third limitation concerns the measurement of the variables. Under the assumption of a normal distribution of returns (i.e. stock prices follow a random walk process), which means that the distribution of returns is symmetrical, one can estimate the probabilities of potential gains or losses associated with each amount. This means the standard deviation of security returns, called historical volatility and usually calculated as a moving average, can be used as a risk indicator. The prices used for the calculations are usually closing prices, but Parkinson (1980) suggests that the day's high and low prices would provide a better estimate of real volatility. One can also refine the analysis with high-frequency data. Such data avoid the bias stemming from the use of closing (or opening) prices, but they have only been available for a relatively short time. The length of the observation period is another topic that is still under debate. There are no criteria for concluding that volatility calculated from mean returns over 20 trading days (one month) and then annualized is any more or less representative than volatility calculated over 130 trading days (six months) and then annualized, or even than volatility measured directly over 260 trading days (one year). Nonetheless, the guidelines adopted in this study represent the best practices of researchers thus far.
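
The two estimators mentioned above can be made concrete in a few lines: close-to-close historical volatility as the annualized standard deviation of log returns, and Parkinson's (1980) high-low estimator, sigma^2 = (1 / (4 ln 2)) * E[ln(H/L)^2]. The price series below is synthetic and for illustration only.

```python
# Close-to-close historical volatility versus Parkinson's high-low estimator.
import numpy as np

rng = np.random.default_rng(3)
n, sigma_daily = 252, 0.02
close = 100 * np.exp(np.cumsum(rng.normal(0, sigma_daily, n)))
# Fake intraday ranges on the same daily scale, for illustration only.
high = close * np.exp(np.abs(rng.normal(0, sigma_daily, n)))
low = close * np.exp(-np.abs(rng.normal(0, sigma_daily, n)))

log_ret = np.diff(np.log(close))
hv_cc = log_ret.std(ddof=1) * np.sqrt(252)        # close-to-close, annualized

# Parkinson: sigma^2 = (1 / (4 ln 2)) * mean(ln(high/low)^2), then annualize.
hv_park = np.sqrt(np.mean(np.log(high / low) ** 2) / (4 * np.log(2))) * np.sqrt(252)

print(round(hv_cc, 3), round(hv_park, 3))         # both near 0.02*sqrt(252) ~ 0.32
```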

Practical implications

This study has indicated that an earnings announcement can provide a volatility mispricing opportunity to allow an investor to profit from a sudden, sharp drop in IV. More specifically, the methodology developed by Tan and Bing is now well supported both empirically and theoretically in terms of qualifying opportunities that can be profitable because of the volatility crush. Conventionally, the option strategy of shorting strangles carries unlimited theoretical risk; however, the methodology has demonstrated that this risk can be substantially reduced if followed judiciously. This profitable strategy relies on a set of qualifying parameters including liquidity, premium collection, volatility differential, expected market move and market sentiment. Building upon this framework, the understanding of the effects of persistence and leverage resulted in further reducing the risk associated with trading options during earnings announcements. As a guideline, the sentiment and liquidity variables help to qualify a trade and the effects of persistence and leverage help to close the qualified trade.

Social implications

The authors find a positive association among the effects of market sentiment, liquidity, persistence and leverage in the dynamics of IV during earnings announcements. These findings further substantiate the four factors that influence IV dynamics during earnings announcements and indicate that looking at persistence and leverage alone will not generate profitable trading opportunities.

Originality/value

The impact of volatility crush can be devastating to the option buyer, with substantial capital loss even for a directionally correct strategy. As a result, most volatility plays are made by option sellers; however, the profit is limited and the sellers carry unlimited risk. The authors demonstrate the dynamics of IV as influenced by the effects of persistence, leverage, market sentiment and liquidity. From the EFA they extracted four constructs, and the results of the CFA indicated a good model fit for the constructs. Using EFA, CFA and Bayesian analysis, the authors demonstrate how this model can help investors formulate the right strategy to achieve the best risk/reward mix. Using Bayesian estimation and the IV differential to proxy for differences of opinion about term structures in option pricing, the authors find a positive association among the effects of market sentiment, liquidity, persistence and leverage in the dynamics of IV during earnings announcements.

Details

PSU Research Review, vol. 2 no. 1
Type: Research Article
ISSN: 2399-1747

Open Access
Article
Publication date: 16 January 2020

Magda Joachimiak

In this paper, the Cauchy-type problem for the Laplace equation was solved in the rectangular domain with the use of the Chebyshev polynomials. The purpose of this paper is to…

Abstract

Purpose

In this paper, the Cauchy-type problem for the Laplace equation was solved in a rectangular domain with the use of Chebyshev polynomials. The purpose of this paper is to present an optimal choice of the regularization parameter for the inverse problem, which allows determining a stable temperature distribution on one of the boundaries of the rectangular domain with the required accuracy.

Design/methodology/approach

The Cauchy-type problem is numerically ill-posed; therefore, it was regularized with the use of the modified Tikhonov and Tikhonov–Philips regularization. The influence of the choice of regularization parameter on the solution was investigated. To choose the regularization parameter, the Morozov principle, the minimum-of-energy-integral criterion and the L-curve method were applied.
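
As a generic illustration of Tikhonov regularization and the L-curve idea (the authors' Chebyshev-polynomial discretization is not reproduced here), the sketch below regularizes an ill-conditioned linear system A x = b and sweeps the regularization parameter, printing the residual and solution norms whose trade-off the L-curve visualizes. The Hilbert-matrix test problem is our assumption.

```python
# Tikhonov regularization with a parameter sweep on an ill-conditioned system.
import numpy as np
from scipy.linalg import hilbert

n = 12
A = hilbert(n)                                  # severely ill-conditioned matrix
x_true = np.ones(n)
b = A @ x_true + 1e-6 * np.random.default_rng(4).normal(size=n)   # noisy data

def tikhonov(A, b, lam):
    """Solve min ||Ax - b||^2 + lam^2 ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(A.shape[1]), A.T @ b)

for lam in np.logspace(-10, -2, 5):             # L-curve: residual vs. ||x||
    x = tikhonov(A, b, lam)
    print(f"lam={lam:.1e}  residual={np.linalg.norm(A @ x - b):.2e}  "
          f"norm={np.linalg.norm(x):.2e}  error={np.linalg.norm(x - x_true):.2e}")
```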

Findings

Numerical examples for functions with singularities outside the domain were solved in this paper; the values of the results change significantly within the calculation domain. Next, the sought temperature distributions obtained with different methods of choosing the regularization parameter were compared. The methods of choosing the regularization parameter were evaluated by the norm N_max.

Practical implications

The calculation model described in this paper can be applied to determine the temperature distribution on the boundary of a heated wall, for instance of a boiler or a turbine body, that is, wherever temperature measurement cannot be performed on part of the boundary.

Originality/value

The paper presents a new method for solving the inverse Cauchy problem with the use of the Chebyshev polynomials. The choice of the regularization parameter was analyzed to obtain a solution with the lowest possible sensitivity to input data disturbances.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 30 no. 10
Type: Research Article
ISSN: 0961-5539

Open Access
Article
Publication date: 19 July 2021

Mario Mendocilla, Paloma Miravitlles Matamoros and Jorge Matute

The purpose of this study is to empirically develop and validate a practical, consistent and specific scale to assess perceived service quality at the service encounter at…

Abstract

Purpose

The purpose of this study is to empirically develop and validate a practical, consistent and specific scale to assess perceived service quality at the service encounter at quick-service restaurants (QSRs).

Design/methodology/approach

Development and validation of the scale involved a five-stage process. Data were collected from 430 customers of a QSR belonging to an international brand located in Barcelona. Surveys were administered face to face immediately after the service encounter. The scale development procedure involved exploratory and confirmatory factor analyses.
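
A hedged sketch of the exploratory step in such a procedure: fit a four-factor model to synthetic responses on 14 items and inspect the rotated loadings, mirroring the 14-item, four-dimension structure reported below. The confirmatory step would normally follow in dedicated SEM software; the data and loading pattern are our assumptions.

```python
# Exploratory factor analysis of a synthetic 14-item, four-dimension scale.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)
n, k = 430, 4                                  # sample size matching the study
latent = rng.normal(size=(n, k))               # four latent quality dimensions
loading = np.zeros((14, k))
for i in range(14):
    loading[i, i % k] = 0.9                    # each item loads on one dimension
items = latent @ loading.T + 0.4 * rng.normal(size=(n, 14))

fa = FactorAnalysis(n_components=k, rotation="varimax").fit(items)
print(np.round(fa.components_, 1))             # loadings cluster by dimension
```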

Findings

The results suggest a specific and parsimonious measurement scale whose structure comprises 14 items in four dimensions. In contrast to previous studies, this study identified the appropriateness of splitting the interaction quality dimension into two separate dimensions, one focusing on interaction time and the other on staff–customer interaction. Furthermore, the findings indicate that speedy service, pleasant treatment and food quality are the most valued attributes in QSRs.

Practical implications

This scale is a useful instrument to administer and assure service quality standards within QSR management systems. Its practical approach and short survey length ease data collection, considering that customers spend short amounts of time in this type of restaurant. Furthermore, it could also be used by franchisors and restaurant operators as a tool to monitor continuing compliance with service quality standards.

Originality/value

The resulting scale introduces a novel four-factor structure with high goodness of fit to effectively measure customers' perceived service quality in QSRs, where ease of use and speed of gathering client responses are key factors for successful implementation.

Details

British Food Journal, vol. 123 no. 13
Type: Research Article
ISSN: 0007-070X
